Measurement basics in LED thermal management

August 18, 2016 //By John Cafferkey, Cambridge Nanotherm
A successful LED design needs a balance of form and function to be a desirable luminaire with the right lumen output. Sounds simple enough, but these two requirements are often in conflict.

When form trumps function, LEDs, which are usually mounted onto a metal-clad PCB (MCPCB) as a module, are all too often crammed together, creating a module with high power density. If the device has not been designed to remove heat from the LEDs effectively, there is a real risk of the LEDs overheating. As with any semiconductor, when LEDs overheat, efficiency is reduced, light quality deteriorates, lifespan shortens and ultimately the LED can fail catastrophically.

Even with extremely high power density designs, this can be avoided with a basic understanding of thermal design. However, it can be difficult to unravel the claims made by thermal management suppliers regarding how their materials perform. This can lead to poor choices and, in the worst case, to overheating or failing LEDs.

Three critical factors to consider when looking at the thermal performance are conductivity, interface resistance and impedance, which combine to give the total thermal resistance of the design.
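To illustrate how these factors combine, the sketch below sums per-layer resistances (thickness divided by conductivity) and interface resistances for a hypothetical MCPCB stack. All layer and interface values here are illustrative assumptions, not measured vendor data.

```python
# Sketch: total thermal resistance of an MCPCB stack as the sum of
# layer resistances (thickness / conductivity) and interface resistances.
# All values below are illustrative assumptions, not vendor data.

# (thickness in m, thermal conductivity in W/m.K) for each layer
layers = [
    (75e-6, 1.2),     # dielectric layer
    (35e-6, 385.0),   # copper circuit layer
    (1.5e-3, 200.0),  # aluminium base plate
]

# per-unit-area interface resistances in K.m^2/W (e.g. solder, TIM)
interfaces = [2e-5, 5e-5]

r_layers = sum(t / k for t, k in layers)
r_total = r_layers + sum(interfaces)
print(f"total thermal resistance: {r_total:.2e} K.m^2/W")
```

Note that in this (hypothetical) stack the thin dielectric layer dominates the layer resistances, which is why its thickness and conductivity receive so much attention in MCPCB design.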

Thermal conductivity
Thermal conductivity is a simple constant that describes how well a particular material transfers heat by conduction. However, the thermal conductivity figures given by manufacturers cannot always be taken at face value. While there are plenty of well-known, standardized methods for testing thermal conductivity, the results for heterogeneous materials can vary widely depending on the testing method employed. And, as you might expect, manufacturers often pick the test that returns the most favourable result. Datasheet values for thermal conductivity should therefore be treated with caution, especially when using them as input for wider thermal models.

Thermal resistance and the importance of the z-axis
Thermal resistance (a measure of the temperature difference a given material sustains for a given heat flow) is a highly useful metric because it draws in the z-axis (thickness) of the material in question; a critical real-world consideration.

Layer thickness ÷ thermal conductivity = thermal resistance

Reduction in the thickness of a material can have a dramatic effect on the level of thermal resistance. As a variable, it is easier to reduce the z-axis by an order of magnitude (thus dramatically reducing thermal resistance and improving thermal performance) than it is to find an alternative material with dramatically better thermal conductivity within an appropriate price bracket.
