Good idea on the kilowatt meter, I had mentioned that to him. Interesting on the heatsink thing too.
I'm aware that a "1000 watt" led grow light pulls considerably less than the max sum of the diodes when operating normally.
You gave me something to research: how much more current can be pulled before the light fails? I imagine that varies by brand and type, but I don't know whether a "1000 watt" light that normally pulls, say, 270 watts could climb to 600, 700 or 800 watts before it fails.
I wonder about drivers too. How do they know to regulate the flow of current? Do they regulate the flow at all?
He'd been running the lights 24 hours a day for veg, so I suppose that could be a variable in why the lights are failing, as many LED light manufacturers claim they hold up better on a 12/12 schedule or something like that.
As far as other things that could be contributing to the increased power consumption, yes, of course it could be something other than the lights, but I told him I'd ask here on the forum.
Thanks to everybody for the replies.
An LED passes X amount of current when a certain Y voltage differential (potential) is placed across it. The temperature of the semiconductor changes how much current the LED will pass at that same potential: a hotter junction conducts more.
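Here's a toy model of that point: at a fixed potential, a hotter LED junction passes more current because the forward-voltage "knee" shifts down as temperature rises. Every number here (the nominal voltage, the tempco, the slope) is illustrative, not from any real datasheet.

```python
import math

def led_current(v_applied, t_junction,
                i_nom=1.0,      # current (A) at the nominal point (assumed)
                vf_nom=3.0,     # forward voltage (V) at 25 C (assumed)
                tempco=-0.003,  # Vf drops ~3 mV per degree C (ballpark)
                v_slope=0.05):  # steepness of the exponential I-V (assumed)
    # Forward voltage knee shifts down as the junction heats up
    vf = vf_nom + tempco * (t_junction - 25.0)
    # Exponential I-V: small voltage changes mean big current changes
    return i_nom * math.exp((v_applied - vf) / v_slope)

cool = led_current(3.0, 25)  # 1.0 A at the nominal point
hot = led_current(3.0, 85)   # same 3.0 V, junction 60 C hotter
print(round(cool, 2), round(hot, 2))  # → 1.0 36.6
```

That steep exponential is also why LEDs are driven with regulated current rather than a fixed voltage: hold the voltage constant and the current can run away as the chip heats.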
When they say 1000w LED grow light it's very misleading. They could mean it's a 1000w HPS equivalent, or that the individual chip max wattages add up to 1000w (like you said), or that it's actually 1000w (though I've yet to see that bad boy lol).
How much wattage a light actually uses comes down to the drivers. They supply the potential. The chips by nature allow X current to flow at Y potential. So the only way the chips can pull more wattage is if the drivers are able to supply it. That means the drivers have to raise their voltage to flow more current (if that current is even available from the driver). A driver can't shove current through; a driver creates a potential and then current flows accordingly, just like all electronics (though inductors create a sort of "current momentum").
The drivers take AC from the wall and run it through a rectifier and filter to flatten the waveform, then step the voltage up or down. A PWM switching circuit then switches the rectified supply at whatever rate is needed to deliver the necessary DC current on the output side. The switching on the AC side is controlled by feedback from how much current the DC side needs.
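The feedback idea in that paragraph can be sketched in a few lines: the driver senses its output current and nudges the PWM duty cycle until the current matches a setpoint. This is a toy proportional controller with a purely resistive load, not a real driver design; the rail voltage, load, and gain are all made up.

```python
def run_driver(v_rail=48.0, i_target=1.0, r_load=24.0, steps=200):
    """Toy buck-style regulator: output voltage ~= duty * rail voltage."""
    duty = 0.1  # start with a low duty cycle
    for _ in range(steps):
        v_out = duty * v_rail
        i_out = v_out / r_load
        error = i_target - i_out
        duty += 0.05 * error             # proportional correction
        duty = min(max(duty, 0.0), 1.0)  # duty cycle can't leave [0, 1]
    return duty, i_out

duty, i_out = run_driver()
print(round(i_out, 3))  # → 1.0 (settles at the setpoint)
```

The clamp on the duty cycle is the crude version of the point made below: the driver physically can't deliver more than its components and rail voltage allow, no matter what the feedback loop asks for.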
The drivers can't flow more current than their components will allow. That's why you have to buy different drivers for different wattage outputs: the inductor coils and other components used will only support a certain power range. You can have failures in a driver that allow more current to flow, but the lights would most likely fail, and you'd notice a big difference in heat and brightness. If your driver allowed overpowering you'd most likely smell burnt components in the driver as well, from the overload protection circuits. I'm not sure which would reach 100% failure first. There are many ways a driver can fail and a few different possible outcomes, but if the lights are still running and not overheating, the driver is not failing, or at least not showing symptoms of failure.
A 250w LED array could pull 500w+ depending on how hard the chips were initially driven, and whether the driver can supply the extra voltage and corresponding current to allow it. But I've never heard of or seen anything like that. Even thermal runaway I've never seen to the point of failure, knock on wood. Some chips are really durable. There are a few videos by LEDgardner on YouTube of him trying to blow up a QB lol, he pushes it well over the maximum ratings, although I can't remember for how long...
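The arithmetic behind that claim is simple: a light marketed on its nominal draw may run each chip at only a fraction of its rating, so the chips themselves have headroom the stock driver never uses. The chip count and wattages below are hypothetical, just to make the numbers land on the 250w/500w example.

```python
# Hypothetical array: chips rated higher than they're actually driven
chips = 100
max_w_per_chip = 5.0     # chip's rated maximum (assumed)
driven_w_per_chip = 2.5  # what the stock driver actually delivers (assumed)

nominal = chips * driven_w_per_chip  # what the light normally pulls
ceiling = chips * max_w_per_chip     # what the chips could take
print(nominal, ceiling)  # → 250.0 500.0
```

So doubling the draw would need a driver swap or a driver fault, not just the chips themselves; the chips can only take what the driver supplies.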
A kilowatt meter would probably be the fastest way to check. Good luck