LEDs are not strictly linear in this respect. Higher per-strip drive levels lead to higher operating temperatures, which, depending on the scenarios one chooses to compare, can cause reductions in efficacy on the order of 5 to 20%.
tilopa108
May '20
Question about nominal drive current.
rick_1976
Applications Engineer
May '20
“Nominal” drive current values typically indicate the level at which the other device characteristics are measured and communicated. Manufacturers generally choose this value to reflect an operating point that offers a good balance of competing design tradeoffs under expected operating conditions.
Because different applications value different performance characteristics differently, it’s not necessarily the optimum drive level for all purposes; it’s a suggested starting point. Yes, further reductions in drive level may yield increases in luminous efficacy, but there is a point of diminishing returns.
tilopa108
May '20
Sorry, I completely changed the focus of my question by editing my post after you had already responded. I should have just left it as it was and added the new question separately. That was dumb.
Thanks for clarifying about nominal current.
Just to be clear about efficiency, since I am about to buy a driver and am trying to decide between two different ones.
So, 2 scenarios:
1: total wattage is 400w, current is 1000mA per strip for a total of 10 strips. Voltage is 40v.
2: total wattage is 400w, current is 750mA per strip for a total of 10 strips. Voltage is 53.3v.
Scenario 2 would be more efficient? In other words efficiency comes from lower drive current even if the voltage increases?
David_1528
Applications Engineer
May '20
Your scenarios don’t make sense if you are using the same LED strips for both. The higher the current passing through a strip, the higher the voltage drop across it. Here is how I would try to explain how efficacy works:
Say you run 1000mA through 10 LED strips at a Vf of 40V, and it yields a total of 70,000 Lumens. This would give you an efficacy of 70,000 Lumens / 400W, or 175 Lumens/W. If you ran 500mA through the same LED strips in the same arrangement, the Vf would probably be in the neighborhood of 38.5V, and the output would be roughly 36,000 Lumens (possibly slightly more). Under this scenario, your input power is 38.5V x 5000 mA = 192.5W, and your efficacy is 36,000 Lumens / 192.5W = 187 Lumens/W.
So, your efficacy increases from 175 Lumens/W to 187 Lumens/W, but you are getting barely over half as much total light. If you put 20 LED strips in parallel and ran them all at 500mA, you would get about 72,000 Lumens at 385W (20 strips x 0.500A x 38.5V). This would give you more light with less power, but you would use 2x the number of LED strips to accomplish it.
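If it helps, here is a quick Python sketch of the arithmetic above, using the same illustrative lumen and Vf figures (these are example numbers, not measured data):

```python
# Efficacy comparison using the illustrative numbers from the example above.

def efficacy(current_a, vf_v, strips, lumens_total):
    """Return (input power in W, efficacy in lm/W) for a parallel array of strips."""
    power_w = current_a * vf_v * strips
    return power_w, lumens_total / power_w

# 10 strips at 1000 mA, Vf ~ 40 V, ~70,000 lm total
p1, e1 = efficacy(1.0, 40.0, 10, 70_000)   # -> 400 W, 175 lm/W

# Same 10 strips at 500 mA, Vf ~ 38.5 V, ~36,000 lm total
p2, e2 = efficacy(0.5, 38.5, 10, 36_000)   # -> 192.5 W, ~187 lm/W

print(f"1000 mA: {p1:.1f} W, {e1:.0f} lm/W")
print(f" 500 mA: {p2:.1f} W, {e2:.0f} lm/W")
```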
Does that make sense?
tilopa108
May '20
Yes, it makes sense. And it opened up a greater understanding. I should be focusing on lumen output instead of wattage. I was not remembering that wattage does not simply translate to amount of light.
I’m using these 2ft strips:
https://www.bridgelux.com/sites/default/files/resource_media/DS132 Bridgelux EB Series Gen3 Data Sheet 20190617 Rev A.pdf
These have a typical forward voltage of 19.1V, a nominal current of 700mA, and a typical flux of 2490 lumens at 25°C. So the efficacy is: 19.1V × 0.700A = 13.37W, and 2490 lm / 13.37W ≈ 186 lm/W.
Now, if I want to find the lumens and wattage of an array of these strips driven by a particular driver how do I do that?
10 of these strips using this driver: meanwell HLG-240H-48
https://www.meanwell.com/Upload/PDF/HLG-240H/HLG-240H-SPEC.PDF
This driver has a total current of 5A and a voltage range of 24-48V. I would like to wire these 10 strips by putting 2 in series, 5 times, and then connecting those 5 series pairs in parallel.
I can find the current for each of the 5 sets by dividing the driver current by 5: 5A / 5 = 1A. Using the datasheet’s “current vs. forward voltage” graph, I can see that at 1 amp the voltage is about 19.6V. Doubling that for the series pair: 19.6V × 2 = 39.2V. Each series pair uses 39.2V × 1A = 39.2W, and multiplying by 5 gives 39.2 × 5 = 196W. The number of lumens is simply 10 × 2490 = 24,900. So the efficacy is 24,900 / 196 = 127 lm/W.
Clearly my calculations are incorrect. What am I doing wrong?
David_1528
Applications Engineer
May '20
What you are missing is the increased lumen output from the increased current. That is given in the Figure 4 graph.
[Figure 4: relative luminous flux vs. drive current]
The lumen output will be about 140% of what it is at the test current (700mA), so roughly 3486 lumens per strip. Ten of these will give about 34,860 lumens, for an efficacy of 34,860 lm / 196W ≈ 178 lm/W.
In a real application, this won’t quite be true, because both light output and Vf drop slightly at elevated temperatures for a given current (which will be the case here due to self-heating). The Vf drop adds slightly to efficacy, but the loss in light output more than cancels that out, so overall efficacy is reduced by about 3.5% if the case temperature rises to 60°C. This comes from Figures 7 and 8 on page 7 of the datasheet.
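Putting the corrected numbers together, here is a rough Python sketch of the whole calculation for the 10-strip array. The 140% relative-flux factor and the ~3.5% hot derating are approximate values read off the datasheet graphs, so treat the results as estimates:

```python
# Corrected efficacy estimate for 10 Bridgelux EB strips (2 in series x 5 parallel strings)
# on a 5 A driver. Graph-derived values below are approximate readings, not exact data.

STRIPS = 10
I_PER_STRING = 5.0 / 5           # 5 A driver current split across 5 parallel strings -> 1 A
VF_AT_1A = 19.6                  # V per strip at 1 A (from the current vs. Vf graph)
FLUX_AT_700MA = 2490             # lm per strip at the 700 mA test current
REL_FLUX_AT_1A = 1.40            # ~140% relative flux at 1 A (Figure 4)
HOT_DERATE = 0.965               # ~3.5% net efficacy loss at Tc = 60 C (Figures 7 and 8)

power_w = STRIPS * VF_AT_1A * I_PER_STRING          # 10 * 19.6 * 1.0 = 196 W
lumens = STRIPS * FLUX_AT_700MA * REL_FLUX_AT_1A    # ~34,860 lm at 25 C
efficacy_25c = lumens / power_w                     # ~178 lm/W
efficacy_hot = efficacy_25c * HOT_DERATE            # ~172 lm/W at elevated case temperature

print(f"Input power:     {power_w:.0f} W")
print(f"Output at 25 C:  {lumens:.0f} lm")
print(f"Efficacy (25 C): {efficacy_25c:.0f} lm/W")
print(f"Efficacy (hot):  {efficacy_hot:.0f} lm/W")
```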