In the last decade, it has been proposed that the sun's IR-A wavelengths might be deleterious to human skin and that sunscreens, in addition to their intended protection against UV-B and UV-A, should also protect against IR-A (and perhaps even visible light). Several studies showed that NIR may reduce skin collagen content by increasing matrix metalloproteinase-1 (MMP-1) activity, as is known for UVR. Unfortunately, the artificial NIR light sources used in such studies were not representative of solar irradiance.
Yet little has been said about the other side of the coin. This article will focus on key information suggesting that IR-A may be more beneficial than deleterious when the skin is exposed to IR-A at an appropriate irradiance and dose, comparable to the daily sun exposure people receive in real life.
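The distinction between irradiance and dose is central to this argument. As a standard radiometric relation (the numerical values below are illustrative assumptions, not measurements from the studies discussed), the radiant exposure, or dose, $H$ (J/cm\textsuperscript{2}) is the irradiance $E$ (W/cm\textsuperscript{2}) integrated over the exposure time $t$ (s):

\[
H = E \, t
\]

For instance, an irradiance of 0.03 W/cm\textsuperscript{2} sustained for 1000 s delivers the same nominal dose of 30 J/cm\textsuperscript{2} as 3 W/cm\textsuperscript{2} delivered for 10 s, yet the two exposures may differ greatly in their biological effects. This is why an artificial source can be deleterious at a dose the sun delivers harmlessly over a longer time.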
IR-A might even precondition the skin, a process called photoprevention. From an evolutionary standpoint, exposure to the IR-A wavelengths in early morning sunlight may ready the skin for the deleterious mid-day UVR that follows.
Consequently, IR-A appears to be the solution, not the problem. It does more good than harm to the skin. It is essentially a question of intensity, and of what we can learn from the sun.