LED Heat Management

MidnightSun72

Well-Known Member
OK, so I am trying to find a method to calculate how much heat an LED strip makes and how much heat sinking is needed.

So from using Google I found this link:

In the section under "Calculating Thermal Resistance", an equation is listed:

temperature change = thermal resistance × LED power (where LED power = voltage × current)

Taking the Seoul Semiconductor strip SMJD-3622120B-XXN1A as an example (datasheet below):
This strip has a thermal resistance of 0.3 °C/W (value given in the datasheet).
The nominal current is 0.65 A and the forward voltage is 33.6 V.
So the temperature rise of the LED strip should be:

ΔT = 0.3 °C/W × (0.65 A × 33.6 V) = 6.55 °C
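The datasheet calculation above can be sketched as a couple of lines of Python (the function name is just for illustration; the numbers are the Seoul strip values quoted above):

```python
def delta_t(r_th_c_per_w, current_a, voltage_v):
    """Temperature rise (deg C) = thermal resistance x electrical power."""
    return r_th_c_per_w * current_a * voltage_v

# Seoul SMJD-3622120B-XXN1A: Rth = 0.3 C/W, 0.65 A, 33.6 V
rise = delta_t(0.3, 0.65, 33.6)
print(f"{rise:.2f} C")  # ~6.55 C
```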

For the EB Gen 2 strips (BXEB-L0560Z-35E2000-C-B3):
Using the temperatures provided in this link, we can work backward and figure out the thermal resistance of the strip.

Thermal resistance = rise in temperature from ambient / strip wattage (amps × volts)

Thermal resistance = (45 °C (back-of-strip temp) − 26.7 °C (ambient)) / (19.5 V × 0.625 A) = 1.5 °C/W

This could be lower in reality, since the tester did record back-of-strip temps as low as 37 °C; if you work it out with that value you get a resistance of 0.845 °C/W.
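Working backward from measured temperatures is just the same equation rearranged. A minimal sketch, using the EB Gen 2 numbers quoted above (function name is mine, not from any datasheet):

```python
def thermal_resistance(t_case_c, t_ambient_c, voltage_v, current_a):
    """Back out thermal resistance (C/W) from a measured temperature
    rise and the electrical power driving the strip."""
    return (t_case_c - t_ambient_c) / (voltage_v * current_a)

# EB Gen 2 (BXEB-L0560Z-35E2000-C-B3) at 19.5 V, 0.625 A
print(thermal_resistance(45.0, 26.7, 19.5, 0.625))  # ~1.50 C/W
print(thermal_resistance(37.0, 26.7, 19.5, 0.625))  # ~0.845 C/W
```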

Is it possible the Seoul strip is that much better at heat management? Am I even understanding the equations correctly?
 
This post may help; if not, the site may be able to.
If neither... fuck it, I tried, lol. I'm no electrician, I'm just available.
 

Just reading through this Bridgelux document:

Page 12 of 31 references the Vero datasheet: "Rj-c value was found in the Vero Module Datasheet to be 0.28 °C/W."

So pretty similar to the Seoul Semiconductor. I can't wrap my head around how the Bridgelux strips run the temperatures they do at 0.7 A and 39 V for the 4 ft, or 0.7 A and 19.5 V for the 2 ft. Every time I run it through a calculator I get very high source temperatures that don't make sense. Any mechanical engineers in the house?

I feel like half of what's tripping me up is the relation between the thermal resistance Rth (case to base) and the resistance of the whole system to ambient.
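One way to untangle that: thermal resistances add in series, so the junction temperature depends on the whole junction-to-ambient chain, not just Rj-c. A sketch of that chain, where only the 0.28 °C/W Rj-c figure comes from the Bridgelux document quoted above; the interface and sink-to-ambient values are made-up placeholders to show why case temperatures run far higher than Rj-c alone would predict:

```python
def junction_temp(t_ambient_c, power_w, r_jc, r_interface, r_sink_amb):
    """Junction temp from the series thermal path: resistances add,
    so Tj = Ta + P x (Rj-c + R_interface + R_sink-to-ambient)."""
    total_r = r_jc + r_interface + r_sink_amb
    return t_ambient_c + power_w * total_r

p = 0.7 * 19.5  # 2 ft strip at 0.7 A, 19.5 V -> ~13.65 W
# Rj-c = 0.28 (datasheet); 0.5 and 1.5 C/W are placeholder guesses
print(junction_temp(35.0, p, 0.28, 0.5, 1.5))  # ~66 C
```

The point is that the big resistances usually sit between the case and ambient, which is exactly the part the strip datasheet doesn't specify.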
 
1 watt = 3.412142 BTU per hour.
Yes, that is factual.

What I really need to understand is how many BTU per hour a strip can shed at an ambient temperature of 35 °C. And I want it in equation form so I can work it out for any PCB.

This way I can look at any strip and figure out whether it needs a heat sink and how big. I dislike doing things just because that's how they're done, though it's great to have that as a fallback.
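That equation form might look something like this. The watt-to-BTU/h factor is the one above; the target temperature is an assumed design limit, not from any datasheet:

```python
BTU_PER_HOUR_PER_WATT = 3.412142  # 1 W = 3.412142 BTU/h

def heat_load_btu_per_hour(voltage_v, current_a):
    """Heat a strip must shed, assuming all electrical power ends up
    as heat (a conservative worst case)."""
    return voltage_v * current_a * BTU_PER_HOUR_PER_WATT

def required_r_sink(t_target_c, t_ambient_c, power_w):
    """Max case-to-ambient resistance (C/W) that keeps the strip
    at or below t_target at the given ambient."""
    return (t_target_c - t_ambient_c) / power_w

p = 19.5 * 0.7  # 2 ft EB Gen 2 at 0.7 A -> ~13.65 W
print(heat_load_btu_per_hour(19.5, 0.7))  # ~46.6 BTU/h
print(required_r_sink(60.0, 35.0, p))     # ~1.83 C/W at 35 C ambient
```

If the strip's bare-PCB resistance to ambient is worse than that required figure, it needs a sink (or a fan); if it's better, it doesn't.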

I'll just keep reading. Don't worry, I'll update this when I make a breakthrough.
 
Don't overthink it. If you run lots of strips at low current, you don't need sinks; you can just use aluminium L-shaped plates. Add a fan and you'll be good.
 