Seeing as how their 240W models are actually still cheaper than 240W of actual draw from any other company by a long shot, I am going with Blackstar still. I'll just buy twice the units for the price, and they seem to work fine. Has anyone tested their Kessils, or their own models, outside of growledhydro? Because I'm sure we will find that a lot of companies do this. I would bet all but a small handful do.
Just like the horsepower analogy: yeah, the car is advertised at so much horsepower, actually produces this much, and that is only if you floor it all the time.
Actually, wattage draw varies, I would imagine, and therefore most companies put up the only number that doesn't: the max draw. From what I gather, LEDs are strange creatures and don't work as simply as "this many watts produces this much light."
Some good nuggets about LEDs:
"Most common LEDs have a maximum current rating of 20mA. A number of factors make it
advantageous to use less current. For instance, if you have very limited airflow around the LED it cannot
dissipate its small amount of heat and may fail. A high ambient temperature can have the same effect.
Power supply voltage variations are also a factor which should not be ignored. A circuit designed at
maximum current will exceed that value if the power supply voltage increases.
In addition, there is
usually little noticeable difference in the brightness of an LED run at the maximum current of 20mA and
one run at 10mA."
"If you want excellent reliability, then design for 10mA or less for standard LEDs. First you need to know
the voltage drop across the LED. This is a parameter associated with the specific type of LED and you
can find it on the devices data sheet. If the data sheet is not available, it is usually safe enough to assume
1.7volts for non-high-brightness red, 1.9volts for high-brightness, high-efficiency and low-current red,
2volts for orange and yellow, and 2.1volts for green. Assume 3.4volts for bright white, bright non-
yellowish green, and most blue types. Assume 4.6 volts for 430 nM bright blue types such as Everbright
and Radio Shack. Design for 10mA for the 3.4 volt types and 8mA for the 430 nM blue.
You can design for higher current if you are adventurous or you know you will have a good ventilation,
preventing heat buildup. In such a case, you might design for 25mA for the types with voltage near 2
volts, 18 mA for the 3.4 volt types, and 15 mA for the 430 nM blue. However, as stated, there will be
little additional brightness. Meet or exceed the maximum rated current of the LED only under extremely
favorable conditions which are not subject to change. Some LED current ratings assume some really
favorable test conditions, such as being surrounded by air no warmer than 25 degrees Celsius, and some
decent thermal conduction from where the leads are mounted. Running the LED at specified laboratory
conditions used for maximum current rating will make it lose half its light output after rated life
expectancy, said to be 20,000 to 100,000 hours, but this is very optimistic! You can use somewhat higher
currents if you heat-sink the leads and/or can tolerate much shorter life expectancy."
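If I'm reading that right, the resistor math behind it is just Ohm's law across the current-limiting resistor. Here's a quick sketch, not gospel: the 3.4V drop and 10mA design current come straight from the quote, but the 5V supply is just a number I picked for the example.

```python
# Minimal sketch of the resistor calculation implied above (Ohm's law).
# Example values only: a 5V supply is assumed, not from the quote.

def limiting_resistor(supply_v, led_drop_v, target_current_a):
    """Resistance needed to hold the LED at the target current."""
    return (supply_v - led_drop_v) / target_current_a

# Bright white LED (~3.4V drop per the quote) run at the "excellent
# reliability" design current of 10mA from a 5V supply:
r = limiting_resistor(5.0, 3.4, 0.010)
print(f"computed resistance: {r:.0f} ohms")  # 160 ohms
```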
"The last thing to do is to check the resistor wattage. You could multiply the resistors voltage drop by the
LED current to get the wattage being dissipated in the resistor. Example: 2.6volts times .03amp (30
milliamps) is .078Watt. For good reliability, I recommend not exceeding 50 percent of the wattage rating
of the resistor. A 1/4Watt (.25W) resistor can easily handle .078 watt. In case you need a more powerful
resistor, there are 1/2 watt resistors widely available in the popular values.
This is not necessarily the preferred method, however. Since you have to use a standard value resistor,
your LED current is not going to be the same as it would have been if the computed resistance were used
in the circuit. A better method of calculating the power rating for the resistor when the exact resulting
current is not known is to use the formula P=E2/R, where E is the voltage across the resistor and R is the
actual value of the resistor used (not the computed value). Then, as always, use a resistor rated for at least
twice this amount of power."
Sourced from:
learning.hccs.edu/faculty/david.wells/musc1323/handouts/handout-2
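The wattage check from that last part is easy to redo yourself. A quick sketch using the quote's own example numbers; the 87-ohm resistor value is made up by me so the two methods line up (it's roughly 2.6V / 30mA):

```python
# Both resistor-wattage checks from the quote, using its own example numbers.

v_across_resistor = 2.6   # volts dropped across the resistor (from the quote)
i_led = 0.030             # 30 mA LED current (from the quote)
r_actual = 87.0           # hypothetical "standard value" resistor, ohms

# Method 1: P = V * I, fine when you know the exact current.
p_vi = v_across_resistor * i_led
print(f"P = V*I   -> {p_vi:.3f} W")   # 0.078 W, matching the quote

# Method 2: P = E^2 / R, preferred when the standard resistor value
# shifts the actual current away from the computed one.
p_e2r = v_across_resistor ** 2 / r_actual
print(f"P = E^2/R -> {p_e2r:.3f} W")

# Rule of thumb from the quote: rate the resistor for at least 2x this.
print(f"pick a resistor rated >= {2 * p_e2r:.2f} W")
```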
So, umm, having your LEDs run at twice the current (while keeping the voltage the same, as you can't vary that) is the only way to increase wattage. But increasing the current yields little noticeable difference in light output, therefore LEDs are more efficient than we think if they are advertised at max wattage?? Am I wrong?
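Unless I have the electrical side wrong, the relationship is just P = V × I with the voltage pinned at the diode's forward drop, so doubling the current doubles the wattage. A tiny sketch using the quote's 3.4V white-diode figure (the panel diodes are much bigger, but the principle should be the same):

```python
# Sketch: with forward voltage fixed by the diode, power scales with current.
v_forward = 3.4            # volts, typical bright-white drop per the quote
for i_ma in (10, 20):      # halving/doubling the drive current
    p = v_forward * (i_ma / 1000.0)
    print(f"{i_ma} mA -> {p:.3f} W")   # 20 mA draws twice the watts of 10 mA
```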
So Grow LED Hydro advertising 300W is probably 600W worth of 3W diodes running at 50%? No?
So in reality I am pushing 750W worth of Blackstar that is only drawing 375W, you are telling me, and I am putting out about the same light as I would be if they were at full bore?
Works for me. I thought I was getting 750W worth of light, and, from what I gather, at full bore, I almost am. However, I am using half the electricity?
So ledbudguy yielded 1.94 grams per usable watt, if those panels are advertised the same way?? Then LED lights fucking rock, and I will be doubling down on them, as it seems I can afford the electricity.
So here is the score: 600W (max output) of 3W diodes running at 50% puts out about the same light as the same 3W diodes running at 90%? The only better product is someone advertising 600W of actual draw from 1200W of 3W diodes running at 50%, which would yield twice the light.
So bang for bang, most lights advertised at 300W are using 150W and putting up similar PAR and intensity numbers as the same diodes at 100% wattage? Most of the energy past that 50% threshold goes into making heat, not light?
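For what it's worth, here is the arithmetic I'm doing, written out. It all hinges on one big assumption, that "advertised watts" means the diodes' max rating with the panel actually driven at 50%, and the grams figure below is made up purely for illustration:

```python
# Sketch of the advertised-vs-actual arithmetic above. The 50% drive level
# is the assumption this whole thread hinges on, not a measured fact.

def actual_draw(advertised_w, drive_fraction=0.5):
    """Wall draw if 'advertised' is the diodes' max rating."""
    return advertised_w * drive_fraction

print(actual_draw(300))   # a 300W-advertised panel -> ~150W at the wall
print(actual_draw(750))   # my 750W of Blackstar    -> ~375W at the wall

# Yield per *actual* watt doubles under this assumption (grams made up):
grams, advertised_w = 500, 300
print(grams / advertised_w)               # g per advertised watt: ~1.67
print(grams / actual_draw(advertised_w))  # g per actual watt:     ~3.33
```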
Also, I heard LEDs become even more efficient over time? Is that true?