HELP I need to be educated
#1
HELP I need to be educated
Please explain to me how using 12 volts decreases the load on wiring. If I have a headlight that is 12 volts and 4.58 amps, I get 55 watts. Watts is heat, isn't it? So if I drop to 6 volts and increase the current to 9.17, I still get 55 watts. I understand the current went down but the voltage goes up. HELP
#2
#3
Power = Voltage X Current. So if you double the voltage, you reduce the current by one half. This is most evident on your battery cables. A lot of power (Watts) is required to crank your engine. At 12 Volts, the current (Amperes) draw is half what it is at 6 Volts.
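A quick sketch of that relationship in Python (not from the original post, just the same P = V x I arithmetic using the headlamp numbers from the question):

```python
def current_for_power(power_w, voltage_v):
    """From P = V * I, the current needed is I = P / V."""
    return power_w / voltage_v

# Same 55 W headlamp at each system voltage:
print(round(current_for_power(55, 6), 2))   # about 9.17 A
print(round(current_for_power(55, 12), 2))  # about 4.58 A
```

Doubling the voltage halves the current for the same power, which is what eases the load on the wiring.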
In other words, it's not the heat; it's the humidity.
Sorry for that last crack.
#4
Originally Posted by Christopher2
I understand the current went down but the voltage goes up. HELP
#5
#6
Actually, the higher voltage is more efficient, though you will not likely notice the savings unless the price at the pump continues its rise.
Power loss over transmission lines = Current squared X Resistance.
Since the current is squared (current X current) in this equation, if you can cut it in half, you reduce the power loss by a factor of 4.
Which is why electrical power is sent over high-voltage lines and reduced to 110 Volts AC by a transformer as close to your house as possible.
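Here is a small sketch of that squared relationship (illustrative numbers, not from the post):

```python
def wire_loss_w(current_a, resistance_ohm):
    """Power dissipated in the wire itself: P_loss = I^2 * R."""
    return current_a ** 2 * resistance_ohm

# Halving the current through the same wire quarters the loss:
print(round(wire_loss_w(10, 0.1), 3))  # 10 A -> 10.0 W lost
print(round(wire_loss_w(5, 0.1), 3))   #  5 A ->  2.5 W lost
```

That factor-of-4 saving is exactly why the grid transmits at high voltage and steps down near the house.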
#7
#8
#9
#10
#11
#12
Chris, I used to get this stuff confused all the time. Still do sometimes. Most of what you need to know has already been said - sometimes it just needs to be said in a different way for different people.
The first thing to remember is that the power you are using at your device (headlamps, starter motor, whatever) is NOT the same as the power that is dissipated in the wiring as heat. If you need 55W at your device (headlight), you can use 12V and 4.58A, or 6V and 9.17A - assuming you use the right headlamp bulb. The power lost in the wiring will not be the same, however.
Let's assume you want to wire up the headlamp and you use 10 feet of wire that has a resistance of 0.01 Ohm per foot, for a total wire resistance of 0.1 Ohm. If you use the 6V system, you'll have 9.17A running through the wire and the power dissipated in the wire will be 9.17A x 9.17A x 0.1 Ohm (I x I x R), or 8.4W. If you use the 12V system, you'll have 4.58A running through the wire and the power dissipated in the wire will be 4.58A x 4.58A x 0.1 Ohm (I x I x R), or 2.1W. Note that, since the current is squared in the wire loss calculation, cutting the current in half cuts the wire power dissipation by a factor of four (assuming you use the same size wire in both cases). All the power loss in the wire comes out as heat, and has to be dissipated to the air or material surrounding the wire.
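The worked example above can be checked in a few lines of Python (same assumed wire: 10 feet at 0.01 Ohm per foot):

```python
def wire_loss_w(current_a, resistance_ohm):
    """Power dissipated in the wiring as heat: P_loss = I^2 * R."""
    return current_a ** 2 * resistance_ohm

R_WIRE = 10 * 0.01  # 10 ft of wire at 0.01 Ohm/ft = 0.1 Ohm total

print(round(wire_loss_w(9.17, R_WIRE), 1))  # 6 V system: 8.4 W lost in the wire
print(round(wire_loss_w(4.58, R_WIRE), 1))  # 12 V system: 2.1 W lost in the wire
```

Same 55 W at the headlamp either way; the 12 V wiring just wastes a quarter as much of it as heat.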
Another way to look at it is that, if you are using a 6V system and you want to draw a certain number of Watts, you have to use wire that has twice the diameter (or four times the cross-sectional area) to get the power loss in the wire down to the same level as a 12V system. That's one reason car manufacturers are going to the 24V standard as Barry mentioned - to save all the weight and cost of that copper in the larger wiring needed at lower voltages. Of course, as you increase the voltage, there are other limits. I sure as heck don't want to be under the hood of a car designed to be using 120V for the primary electrical system. One touch of the wrong terminal and you could have a very bad day.
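The wire-sizing trade-off works out like this (a hedged sketch of the scaling argument, not a wiring table): halving the voltage doubles the current, so the wire resistance must drop by four to keep the I-squared-R loss the same, and since resistance is inversely proportional to cross-sectional area, the area must grow by four.

```python
import math

def area_ratio_for_equal_loss(voltage_ratio):
    """How much bigger the wire's cross-section must be when voltage
    drops by voltage_ratio (e.g. 12 V -> 6 V is a ratio of 2).
    Current scales up by voltage_ratio, loss by its square, and
    R is inversely proportional to area, so area scales by the square."""
    return voltage_ratio ** 2

area = area_ratio_for_equal_loss(2)
print(area)             # 4x the cross-sectional area
print(math.sqrt(area))  # 2.0x the diameter
```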
#13
#14
#15
Not to change the subject, but I ran a '57 Ford with a 312 and 12:1 pistons, and I would use a 6 volt starter - it spun that motor up right away. Ran it for years and it never burned out; I think that was due to the short period of time it cranked. Has anyone else used a 6 volt starter on a 12 volt system?