Voltmeter Gauge Question
Depends on what you are trying to achieve.
A) Alternator terminal.
The most accurate measurement of voltage at the source is at the alternator output terminal. That gets you the "high water mark" of what is feeding the system, before any voltage losses from transmitting the power to the PCM or battery.
Want to be real clever?
Measure the voltage the battery is supplying to the rotor field winding terminal.
Within the alternator there is, in most cases, a built-in regulator.
The regulator throttles the charging system by lowering the field current when your battery is fully charged and no power is needed from the alternator to run the system.
Quite often, a "bad" alternator that is giving no output is actually failing because the voltage from the battery is faulty or insufficient before it even reaches the regulator.
So measuring the voltage here has the added benefit of showing you the battery voltage that the alternator's regulator "sees".
B) Battery terminal(s)
The most accurate measurement of voltage for practical knowledge about your battery's condition is at the battery terminals.
If your batteries are hooked up in parallel, you may not be able to see the voltage differences between your 2 batteries unless there is a way to "disengage" and measure them one at a time.
This is the best deal --- it really tells you what actual voltage is seen at the battery when the vehicle is running and it is charging.
Plus, it gives you the voltage at the battery when you are cranking --- and you see how much it draws down. That is an excellent indication of battery condition.
In fact, load testing a battery basically involves hooking up a large resistive (or other) load to it and, after a certain amount of time or power drained, seeing how much the voltage drops.
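The load-test logic above can be sketched as a simple rule. This is a rough illustration only: the 12.4 V resting and 9.6 V cranking thresholds are common rules of thumb for a 12 V lead-acid battery, not manufacturer specs, and real readings shift with temperature.

```python
# Rough battery-health check from voltmeter readings, per the idea above:
# what matters is how far the voltage sags under load, not the resting number.
# Thresholds are rules of thumb for a 12 V lead-acid battery, not specs.

def battery_health(resting_v: float, cranking_v: float) -> str:
    """Classify a 12 V battery from its rest voltage and its sag while cranking."""
    sag = resting_v - cranking_v
    if resting_v < 12.4:
        return "undercharged or sulfated -- charge fully and retest"
    if cranking_v < 9.6:          # classic load-test pass/fail line
        return "fails under load -- likely bad"
    if sag > 2.5:                 # big drawdown even though it stays above 9.6 V
        return "weak -- watch it"
    return "healthy"

print(battery_health(12.7, 10.8))  # healthy
print(battery_health(12.6, 8.0))   # fails under load -- likely bad
```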
C) Connected "wherever"
The most common connection for a volt meter is whatever wiring happens to be acceptable and convenient.
This is also the worst arrangement, as many of these power circuits are now managed by the onboard PCM --- so the voltage "seen" can be vastly different from what is "seen" at the alternator or battery terminals, OR at critical devices like the starter solenoid, the FICM power terminal, etc.
If you are really into batteries, you rig up a system where each individual cell can be connected / disconnected individually, and each cell is monitored by a special chip that tracks cell condition, discharge, and temperature, and regulates the charge rate.
Then a separate bank of sensors monitors the alternator(s) and takes measurements at different critical sub-systems (e.g. FICM, starter, etc.) --- their voltage, current draw, and so on.
So tell me what you are trying to accomplish....
It is the "dumb" questions that are the hardest to answer!
Wow, thank you for the info! I want to monitor the batteries so I know when they are going bad. I let the last pair fail before I replaced them, and the FICM followed shortly afterward. I just wasn't sure if I could connect the wires directly to the battery terminals. By the way, I am using 16 gauge wire... is this the right size wire to use? Thanks again for the input!
Hook it up at the battery.
If you have batteries on either side hooked up in parallel, assuming you keep the terminals religiously clean and in good contact, only one lead is needed.
Use the battery and lead closest to the starter. This gives you the voltage that the starter "sees" when cranking.
If you want to see a lower reading, hook it up to the other battery that is further away (with more cable losses). That would give you the "worst case".
How far of a run are you doing from gauge to battery terminal?
That determines what gauge wire you need to minimize losses.
Normally it is not a big deal because there is very little current flowing on the wire.
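To see why the sense-lead current makes wire size a non-issue, here is a quick estimate. The 4.016 ohms per 1000 ft figure is the published resistance of 16 AWG copper; the 20 mA meter current is an assumed figure for an analog gauge movement, so check your gauge's spec.

```python
# Why wire gauge barely matters for a voltmeter sense lead: the meter draws
# almost no current, so the IR drop along the wire is tiny.
# 20 mA is an ASSUMED gauge current -- check your gauge's spec sheet.

OHMS_PER_1000FT_16AWG = 4.016   # published resistance of 16 AWG copper wire

def sense_lead_drop(run_ft: float, current_a: float) -> float:
    """Voltage lost along the lead (out and back) at the given current."""
    resistance = OHMS_PER_1000FT_16AWG * (2 * run_ft) / 1000.0
    return resistance * current_a

drop = sense_lead_drop(run_ft=5, current_a=0.020)
print(f"{drop * 1000:.2f} mV")   # under a millivolt -- negligible on the gauge
```

Even a 5 ft round trip at 20 mA loses well under a millivolt, which is far below what the gauge can resolve.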
Make sure your ground is first rate. Clean contacts, and don't forget a dab of dielectric grease keeps corrosion away for a long time.
If you can get one cheap, an induction ammeter gives great info that when interpreted with the voltmeter, tells all the tales you care to have.
5 ft?
16 gauge?
No problems.
American wire gauge - Wikipedia, the free encyclopedia
BTW.... that gauge is probably accurate to +/- 1 volt...
http://www.isspro.com/proddetail.php?prod=R8760
Remember --- the critical thing is not the gauge voltage when you crank, but the amount of voltage drop.
Often, a worn-out battery will still read "charged" at 12.x volts, but when you crank, it plunges to, say, 8 V or less.
The lower the voltage drop when you crank, the healthier the battery.
Here is a chart and a calculator for calculating how big a wire you need to do the job:
http://www.powerstream.com/Wire_Size.htm
What do you expect to see....
Well, the 6.0 starter is normally about 3 kW @ 12 V, and there are some 450s out there with 3.6 kW starters.
At 12 volts nominal, the starter draws 250 amps.
At 10 volts nominal, the starter is drawing 350 amps.
So.... look at the thickness of your battery-to-starter wire, and you can very quickly guess what gauge it is, how much voltage drop is in the wire alone, and how much benefit you can reap from just upgrading the wiring.
I haven't got around to it yet... but in the spring... I am upgrading the battery to vehicle wiring.
Note --- you have to upgrade the whole system to benefit --- the weakest link is the place with the most resistance.
But if you have a very short run (e.g. the negative terminal to ground) you don't need as thick a wire to lower voltage loss.
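To put numbers on the cable-upgrade idea, here is a sketch. The per-1000 ft resistances are standard published values for copper at the listed AWG sizes; the 250 A draw and 6 ft run are example figures, so measure your own vehicle.

```python
# Estimating the voltage recovered by upgrading the battery-to-starter cable.
# AWG resistances are standard published copper values; the 250 A draw and
# 6 ft run are EXAMPLE figures, not measurements from any particular truck.

OHMS_PER_1000FT = {"2": 0.1563, "1/0": 0.0983, "2/0": 0.0779, "4/0": 0.0490}

def cable_drop(gauge: str, run_ft: float, amps: float) -> float:
    """Voltage lost in one conductor of the given gauge and length."""
    return OHMS_PER_1000FT[gauge] * run_ft / 1000.0 * amps

for g in ("2", "2/0", "4/0"):
    print(f"{g:>3} AWG: {cable_drop(g, 6, 250):.2f} V lost per 6 ft at 250 A")
```

Even one short run of 2 AWG eats nearly a quarter volt at cranking current, which is why, as noted above, the whole path (including the short ground strap) has to be upgraded to see the benefit.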
I actually hooked it up, for now, with the positive going to the 12V ignition (thanks rocky1074 for the article!). Thank you Gearloose for all the info... I will think about the options you provided and then re-run the wires to either the alternator or the battery. Reps to both of you!