LED light power draw increase....is this a thing?

Rob Roy

Well-Known Member
A friend who uses LED lights for a tent grow asked me a question the other day that I couldn't answer with certainty. He'd noticed some increased power consumption on his bill, couldn't attribute it to any increased use, and wondered...

Do some LED lights (the diodes? the drivers?) become worn, for lack of a better term, over time and thus draw more power to stay lit? Could this happen?

Thanks in advance.
 

ChiefRunningPhist

Well-Known Member
As LEDs get hot, their forward voltage drops. So if he was using a constant-voltage (CV) driver without limiting current, then yes: as the arrays heat up they pull more current, because their forward voltage is dropping. The increased current makes them hotter, which lets still more current flow, and the cycle feeds on itself until they POP! It's called thermal runaway. You have to limit current if you're using a CV driver. If the heatsinking is good, the runaway won't reach the point of failure; the array just runs a bit hotter and draws a bit more current.

I doubt it'd be enough to notice on the power bill, though. But maybe.

Like @It's not oregano said, I'd throw a Kill-A-Watt meter on it, or on a few of the LED arrays, to see how much they're actually pulling.
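The feedback loop described above can be sketched numerically. This is a toy model with made-up numbers (the 3 V drive, the thermal resistances, the exponential I-V approximation), not values from any datasheet:

```python
import math

def settle(rth_c_per_w, v_drive=3.0, t_amb=25.0, i_cold=0.7,
           dvf_dt=-0.002, n_vt=0.05, steps=200):
    """Fixed-point iteration: junction temp -> current -> power -> temp.

    At a fixed drive voltage, current grows roughly exponentially as the
    junction heats, because forward voltage drops ~2 mV per degC.  Returns
    the settled current in amps, or None if it diverges (thermal runaway).
    """
    t = t_amb
    i = i_cold
    for _ in range(steps):
        # Vf shifts down by dvf_dt per degC; at a fixed drive voltage that
        # shows up as exponentially more current through the junction.
        i = i_cold * math.exp(-dvf_dt * (t - t_amb) / n_vt)
        if i > 100:                               # absurd current: call it runaway
            return None
        t = t_amb + rth_c_per_w * v_drive * i     # steady-state junction temp
    return i

print(settle(rth_c_per_w=1.0))    # good heatsink: settles a bit above 0.7 A
print(settle(rth_c_per_w=20.0))   # poor heatsink: None (thermal runaway)
```

With the low thermal resistance the loop converges to a slightly higher current; with the high one, each pass heats the junction faster than the last, which is exactly the "feeds on itself" behavior a current limit prevents.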
 

Randomblame

Well-Known Member
Maybe the neighbor is stealing electricity, lol! Normally there's no such thing as increased power draw with LED drivers, at least not that high. Maybe the power factor drops a bit, but that wouldn't cause such a big discrepancy.
There must be another reason. Has he used a wall watt meter / Kill-A-Watt to check the actual power draw of his lights?

Maybe the freezer door is half open all the time? Who knows?
I'm pretty sure a Meanwell driver has enough protection circuits and would just switch off if the power draw exceeded a certain value. Voltage and current would both have to rise for no reason, and that's actually impossible without thermal runaway; even with thermal runaway, the driver's maximum would limit the current flow at some point.
I'm pretty sure there's another reason!

Has he changed his lights? A "300w" Mars fixture actually draws only about 130w, and if he swapped it for 300w of COBs he would see more than twice as much power draw. Without more info it's almost impossible to answer...
 

Rob Roy

Well-Known Member

Good idea on the kilowatt meter, I had mentioned that to him. Interesting on the heatsink thing too.

I'm aware that a "1000 watt" LED grow light pulls considerably less than the sum of the diodes' maximum ratings when operating normally.

You gave me something to research: how much more current can be pulled before the light fails? I imagine that varies by brand and type, but I don't know whether a "1000 watt" light that normally pulls, say, 270 watts could start pulling 600, 700, or 800 watts before it fails.

I wonder about the drivers too. How do they know to regulate the flow of current? Do they regulate the flow at all?

He'd been vegging under the lights 24 hours a day, so I suppose that could be a variable in why the lights are failing too, as many LED light manufacturers claim they hold up better under 12/12 or something.

As far as other things that could be contributing to the increase in power consumption, yes, of course it could be something other than the lights, but I told him I'd ask here on the forum.

Thanks to everybody for the replies.
 

ChiefRunningPhist

Well-Known Member
An LED passes X amount of current at a certain Y voltage differential, or potential, across it. The temperature of the semiconductor changes how much current the LED passes when a particular potential is applied.

When they say "1000w LED grow light" it's very misleading. They could mean it's equivalent to a 1000w HPS, or that the individual chips' max wattages add up to 1000w (like you said), or that it actually draws 1000w (though I've yet to see that bad boy, lol).

How much wattage a light uses depends on the driver. The driver supplies the potential; the chips by nature pass X current at Y potential. So the only way the chips can pull more wattage is if the driver is able to supply it, which means the driver has to raise its voltage to push more current (if that current is available from the driver). A driver can't shove current through; it creates a potential and current flows accordingly, just like all electronics (though inductors create a sort of "current momentum").

The driver takes AC from the wall and runs it through a rectifier and inductor to flatten the waveform and step the voltage up or down. A PWM switching circuit then chops the rectified input at whatever rate is needed to deliver the necessary DC current on the output side. The switching duty is adjusted internally on the AC side depending on how much current is needed on the DC side.
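As a rough illustration of that regulation loop (all numbers hypothetical: a 48 V rail, a 12-LED string, a 0.5 ohm sense resistor, a simple integral-style controller), a constant-current driver effectively nudges its switching duty until the measured output current hits the setpoint:

```python
# Toy feedback loop of a constant-current driver: adjust PWM duty to hold
# the LED string at a current setpoint, whatever the string's Vf does.
def regulate(i_set, vf_per_led, n_leds, v_rail=48.0, r_sense=0.5, steps=500):
    """Very simplified: duty cycle sets the average output voltage; the
    string plus a sense resistor sets the current; the controller nudges
    the duty until measured current matches the setpoint."""
    duty = 0.5
    i = 0.0
    for _ in range(steps):
        v_out = duty * v_rail                           # buck: Vout ~ D * Vin
        i = max(0.0, (v_out - vf_per_led * n_leds) / r_sense)
        duty += 0.001 * (i_set - i)                     # integral-style correction
        duty = min(max(duty, 0.0), 1.0)
    return i

# Current settles near 0.7 A whether the string drops 3.1 V or 2.9 V per LED:
print(round(regulate(0.7, 3.1, 12), 2))
print(round(regulate(0.7, 2.9, 12), 2))
```

The point of the sketch is the direction of causality: the controller moves voltage, and current follows; it never "pushes" current directly.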
The driver can't flow more current than its components will allow. That's why you have to buy different drivers for different wattage outputs: the inductor coils and other components only support a certain power range. A failure inside a driver could allow more current to flow, but the lights would most likely fail, and you'd notice a big difference in heat and brightness. If your driver allowed overpowering, you'd most likely smell burnt components in the driver as well, from the overload-protection circuits. I'm not sure which would reach 100% failure first. There are many ways a driver can fail, and a few different possible outcomes, but if the lights are still running and not overheating, then the driver is not failing, or at least not showing symptoms of failure.

A 250w LED array could pull 500w+, depending on how hard each chip was initially driven and whether the driver can supply the extra voltage and corresponding current. But I've never seen or heard of anything like that. Even thermal runaway I've never seen reach the point of failure, knock on wood :). Some chips are really durable. There are a few videos by LEDgardner on YouTube of him trying to blow up a QB, lol, pushing it well over its maximum ratings, although I can't remember for how long...

A Kill-A-Watt meter would probably be the fastest way to check. Good luck :bigjoint:
 

1212ham

Well-Known Member
I can't see how a light's power draw could increase enough to be noticed on the electric bill... unless a dimmer was turned up.
Two questions: what are the brand and model of the light in question, and what is the price of electricity at that location? We can calculate the operating cost if the light's wattage and your electricity price are known.
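That calculation is just watts x hours x price. A minimal sketch, assuming an illustrative $0.12/kWh rate and a 30-day month:

```python
def monthly_cost(watts_at_wall, hours_per_day, price_per_kwh, days=30):
    """Energy cost for a month, given the draw measured at the wall."""
    kwh = watts_at_wall / 1000 * hours_per_day * days
    return kwh * price_per_kwh

# A "1000w" light that really pulls 270w, vegging 24 h/day at $0.12/kWh:
print(round(monthly_cost(270, 24, 0.12), 2))   # -> 23.33
```

For scale: even if that light somehow crept up to 350w at the wall, the difference would be on the order of $7/month at that rate: measurable with a meter, but easy to confuse with a fridge or AC.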
 

Rob Roy

Well-Known Member
I could try to find out the variables from my friend, and yes, I agree: with a Kill-A-Watt-type meter and a known price of electricity the calculation should be simple... if he even has the lights anymore; they may have already hit the landfill. I loaned him a 315w CMH to replace the two LED lights he was using. I'll ask him for more details on the LEDs. I know they had a few hours on them.

From our conversation, I became more interested in the properties of an LED light and whether one could ever malfunction and go from drawing an initial ~270 watts for a "1000 watt" light all the way up to (or near) the full 1000 watts. I reasoned that if all of the individual diodes had the potential to take more electricity, they might IF whatever regulates the flow of electricity somehow malfunctioned and all the diodes lit up to their full capacity.
 

whytewidow

Well-Known Member
When I changed over to DIY LEDs my electric bill was cut in half. I dropped the AC and am now pulling the wattage my old lights were supposed to. I replaced an old modded Mars Reflector Series 920, I think it was; it pulled 440w at the wall, and on top of that I was running AC, fans, and extractor fans. Now I push 475w in the same tent with the AC and 3 of the fans gone; it's just one fan and the light. My AC was eating electricity like crazy.

Maybe he has an electric leak somewhere lol.
 

whytewidow

Well-Known Member
If it's a Chinese-ish light, like an Amazon light, for example a BestVA 2000w LED grow light: it pulls 390w total. They say it has 200 10w chips, but the drivers and fans in it only pull 390w. There's no way, with the drivers in it, that it can draw anywhere near 2000w of power, even if it got super hot. The LEDs can't draw any more than the driver can put out without changing drivers. The LEDs will only drop whatever voltage matches the current pushed through them. Say you're using a 700mA driver: the diodes will only pull the voltage that matches that current. So if it's 24v at 700mA, it's not going to jump to 40v at 700mA, even super hot without any heatsinking.
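A quick back-of-the-envelope check of that point: with a constant-current driver the current is pinned, so power only moves as far as the string voltage drifts with temperature (voltages here are illustrative, not from a datasheet):

```python
i_drive = 0.7                           # 700 mA constant-current driver
for vf_string in (24.0, 23.0, 22.0):    # string voltage drifting cold -> hot
    watts = i_drive * vf_string
    print(f"{vf_string:.0f} V x {i_drive} A = {watts:.1f} W")
```

A couple of volts of thermal drift moves the draw by a watt or two, nowhere near a change you'd spot on a bill.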

 

Rob Roy

Well-Known Member

Yes, it certainly could be something other than the LED lights. It may just be a coincidence that his light bill spiked for no apparent reason at the same time his LED lights were crapping out.

I've used some LEDs in the past and found I didn't get the penetration of an HPS, but that was a few years ago and it sounds like they've gotten better. I still sometimes run an LED or two intermittently for vegging if I get jammed up or overzealous with popping seeds and/or making clones and don't want to generate a lot more heat, or the space isn't configured to take a T5 light, etc.
 