Electrical question

tet1953

Well-Known Member
I currently have two rooms, 2x 1000W in each room. Actually, one room has 1x 1000W and 2x 600W. Each 1000W, and the two 600Ws, all have their own 20A circuit, with a 20A breaker, 20A outlet, and 12-2 wire, running on 110V service.
A buddy was mentioning something about running the lights on 220 instead and saving money. Is this true, and if so, can someone point me in the right direction to learn more about it? Thanks
 

rawrfox

Active Member
Damn, I remember reading a thread about this somewhere. I won't be able to find it again, but I believe I remember someone proving it's a common misconception that 220V saves money. A kWh is a kWh, regardless of the voltage, or something to that extent. Of course, wait for someone else's opinion as well.
 

Kdn

Member
A watt is a watt; you will draw the same amount of juice either way. This kinda breaks down at the limits of a conductor, though. Take a 12 AWG conductor, for instance.

As you approach and exceed 20 amps, the conductor heats up and its resistance increases, causing a slight voltage drop, which results in drawing more amperage to keep up the wattage. At this limit is where you find breakers tripping due to 1) overcurrent and 2) the breaker's thermal element heating up.

If you were to put in a 25 amp breaker, you would cause conductor heating from increased resistance and therefore be wasting energy (you should never do this anyway, as it's not safe). 12 AWG is only good for about 2000 watts at 120V (per the 80% continuous-load rule). If you want to support more wattage than that, you have to step up to 220V or increase the conductor size.
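To put rough numbers on that for the setups in this thread, here's a quick back-of-the-envelope check in Python. It's idealized P = V × I; real ballasts draw somewhat more than their rated lamp wattage, so treat these as lower bounds:

```python
# Back-of-the-envelope circuit check: P = V * I, with the NEC-style
# 80% continuous-load derate on a 20 A breaker.
BREAKER_AMPS = 20
DERATE = 0.80          # continuous loads should stay under 80% of breaker rating
VOLTS = 120

max_continuous_watts = BREAKER_AMPS * DERATE * VOLTS   # 1920 W, i.e. "about 2000 W"

for lamp_watts in (1000, 600):
    amps = lamp_watts / VOLTS
    print(f"{lamp_watts} W lamp at {VOLTS} V draws {amps:.1f} A "
          f"(limit {BREAKER_AMPS * DERATE:.0f} A continuous)")
print(f"Max continuous load on one circuit: {max_continuous_watts:.0f} W")
```

So a single 1000W light at about 8.3 A sits comfortably under the 16 A continuous limit of its own 20 A circuit.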
 

MrEDuck

Well-Known Member
It's good for being able to run more current on fewer circuits, but that's it. You can run twice the wattage per amp at 220V, which is great for a big grow. By the time you're considering a switch to 220V because you can't run everything you want, your electric bill isn't your biggest concern anymore.
 

DiabloZoe

Member
It's about efficiency. When you run 220, like the poster said, you get charged in kWh and a watt is a watt. But when you run 220 you cut your amps in half: if you're pulling 6 amps, on 220 it would be 3 amps. This comes into play when you're running your wire: 12 gauge is good for 20 amps, 10 gauge is good for 30 amps, and so on, and copper is not cheap. Amps produce heat, too!
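Same arithmetic as a quick sketch, just I = P / V with an idealized load:

```python
# Same power at double the voltage means half the current: I = P / V.
power_watts = 660          # e.g. a load drawing 6 A at 110 V

for volts in (110, 220):
    amps = power_watts / volts
    print(f"{power_watts} W at {volts} V -> {amps:.0f} A")
# 110 V -> 6 A, 220 V -> 3 A: half the amps, so lighter wire can carry it
```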
 

DiabloZoe

Member
Conductor length plays a huge part in this too! And that's where voltage drop comes into play.
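For a feel of how length matters, here's a rough sketch; the ~1.6 Ω per 1000 ft figure for 12 AWG copper is an approximation, and the load current is just an example:

```python
# Voltage drop grows linearly with run length: V_drop = I * R_wire,
# where R_wire = ohms_per_ft * 2 * one_way_length (out and back).
OHMS_PER_FT = 1.6 / 1000   # rough resistance of 12 AWG copper
amps = 16                  # a 20 A circuit loaded to 80%

for one_way_ft in (25, 50, 100):
    r = OHMS_PER_FT * 2 * one_way_ft
    drop = amps * r
    print(f"{one_way_ft} ft run: ~{drop:.1f} V drop at {amps} A")
# 25 ft ~1.3 V, 50 ft ~2.6 V, 100 ft ~5.1 V: long runs eat real voltage
```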
 

Uncultivated

Well-Known Member
A Watt is a Watt, you will draw the same amount of juice either way.
Not exactly. If you're at double the voltage, then you'll draw half the current. Resistive heat losses are proportional to the square of the current (I²R), so 220V is certainly more efficient.

Not so much to save money, I think; it's just safer to go 220 when you're pulling a lot of power. Less current is safer IMO, and lower heat certainly wouldn't hurt.
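A small sketch of that I²R point; the wire resistance here is a made-up number purely for illustration:

```python
# Resistive loss in the supply wiring is I^2 * R. Halving the current
# (same watts at double the voltage) cuts that loss to a quarter.
WIRE_OHMS = 0.1            # hypothetical round-trip resistance of the run
power_watts = 1000

for volts in (120, 240):
    amps = power_watts / volts
    loss = amps ** 2 * WIRE_OHMS
    print(f"{volts} V: {amps:.2f} A, wiring loss ~{loss:.1f} W")
# 120 V: ~8.33 A, ~6.9 W lost; 240 V: ~4.17 A, ~1.7 W lost
```

Only the loss in the wiring itself shrinks, which is why the bill barely moves even though the higher voltage is technically more efficient.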
 