The Evolution of Artificial Illumination and Its Physiological Impact

Humans have long used artificial light to extend periods of activity beyond daylight hours. Early forms of artificial illumination, dating back to prehistoric times, relied on the controlled combustion of organic materials such as wood, animal fats, and plant-based oils. These methods produced light through incandescence, characterized by a broad emission spectrum and relatively low luminous efficacy. The intensity and spectral distribution of light from sources such as torches and oil lamps were inherently limited by the properties of the fuel and its burn rate.

By Luca Olsen
SemiPremium founder, sleep expert
Published 29.1.2026
_________________________________________________________________________________________________

The 19th century marked a significant leap with the widespread adoption of petroleum-based illuminants, offering increased brightness and portability. However, the true democratization of artificial light arrived with the advent of electrical incandescent lighting in the late 19th and early 20th centuries. Incandescent light bulbs, operating on the principle of thermal radiation from a heated filament, provided a more stable, controllable, and significantly brighter light source than their predecessors. While these bulbs emitted a broad spectrum of light, their luminous efficacy was relatively low, leading to considerable energy consumption and heat generation.

A pivotal shift in artificial lighting technology came with the mass adoption of light-emitting diodes (LEDs) in the late 20th and early 21st centuries. Driven primarily by economic incentives and the imperative to reduce energy consumption for utility cost savings, rather than by initial environmental concerns, LEDs offered a fundamentally different mechanism of light production based on electroluminescence, yielding significantly higher luminous efficacy and longer lifespans than incandescent bulbs.

The early widespread implementation of LEDs, however, presented unforeseen physiological implications. Unlike incandescent sources, early LED technology often produced a narrower, more intense spectrum, frequently with a dominant peak at blue wavelengths. This became problematic given the established role of specific wavelengths in regulating melatonin secretion by the pineal gland: exposure to blue-enriched light, particularly at higher intensities, has been shown to suppress melatonin production and thereby interfere with the body's natural sleep-wake cycle. The very efficiency that drove LED adoption meant these sources could deliver high illuminance (lux) at low power consumption, making them a potent environmental stimulus. Initial generations of LEDs also lacked sophisticated dimming capabilities or dynamic spectral control, so users were often exposed to the full luminous flux.

Subsequent incremental innovation has addressed some of these early limitations. Modern LEDs commonly include advanced dimming circuits and, in some cases, tunable white light capabilities that allow adjustment of color temperature and spectral composition. While these advancements offer improved control over light exposure, the fundamental problem of powerful artificial light sources in the domestic environment, particularly those integrated into handheld devices, persists.


Concurrent with the evolution of general illumination, the development and widespread adoption of television introduced another significant source of artificial light into the home environment. Early television sets primarily utilized Cathode Ray Tube (CRT) technology, generating images by scanning electron beams across a phosphorescent screen. In its nascent stages, television programming typically concluded at a reasonable hour, providing a de facto buffer period between content cessation and average bedtime. This temporal separation, combined with the often-dim ambient light of living rooms and the relatively low luminous output of early CRT displays compared to direct sunlight, meant that early television use was less likely to significantly disrupt natural physiological rhythms. Traditional alternatives for evening activities, such as reading, were often facilitated by nightstand lamps, typically designed to direct light onto the reading material while minimizing direct light exposure to the eyes, thereby attempting to mitigate physiological impact.


The landscape of television consumption dramatically changed with the advent of 24/7 broadcasting via cable and satellite television, followed by the proliferation of Smart TVs and streaming services. These developments eliminated fixed programming schedules, enabling continuous access to content at any hour. Modern televisions, particularly those employing LED-backlit LCD or OLED technologies, are capable of producing significantly higher luminous intensities and often feature a prominent blue light component in their spectral output. This increased light exposure, coupled with the shift in viewing habits—whereby a substantial percentage of individuals consume content on digital screens in bed—creates a direct and sustained exposure to powerful artificial light during a period when the body is naturally preparing for sleep. This places the television, along with handheld devices, as a pervasive and physiologically impactful source of artificial illumination in the nocturnal environment, contributing to the disruption of circadian rhythms and melatonin regulation. The inherent portability and constant proximity of handheld devices to the user's eyes further magnify the potential for their emitted light to influence biological processes.


One practical way to reduce these disruptions, especially when using a smartphone or tablet for passive nighttime content such as audiobooks, podcasts, or calming videos, is to minimize direct screen interaction and blue light exposure during the vulnerable sleep onset window. A dedicated smartphone remote controller such as SemiPremium lets users control volume, pause, skip tracks, or manage playback with physical buttons from under the covers, without ever touching the screen or lighting it up. Read more about SemiPremium here. Keeping the device face-down, dark, and at a distance while still accessing its audio or queued content preserves melatonin production, prevents accidental bright flashes from notifications or ads, and avoids the cognitive and physical arousal of handling the device. It turns a potentially sleep-disrupting habit into a low-stimulation one, helping maintain the natural downward progression of arousal needed for smooth sleep onset.

Explore more in the Sleep Onset Toolbox for strategies to protect your evenings from artificial light and digital overstimulation. Small changes to how we use technology at night can make a meaningful difference in how easily we drift off.