True, but that's luminous efficiency. Luminous efficiency indicates what portion of the emitted electromagnetic radiation is usable by human eyes. We can't see the infrared and UV light emitted by most light sources, so that light is "wasted" on us. Most light sources have poor luminous efficiency, and a lot of the light created by the current run through the bulb is wasted. An interesting note on luminous efficiency: our Sun is only about 12% luminously efficient, yet we've come up with lighting that's more efficient than the sun at making light we can see.
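(If anyone wants the definition in concrete terms, here's a rough sketch. I'm assuming the standard convention where efficiency is the source's luminous efficacy divided by the theoretical maximum of 683 lm/W, and plugging in a commonly quoted ~93 lm/W figure for sunlight, so treat the numbers as ballpark, not gospel.)

```python
# Rough sketch: luminous efficiency as efficacy divided by the theoretical max.
# Assumed convention: 683 lm/W is the ceiling (all radiation at 555 nm green,
# where human eyes are most sensitive).

MAX_EFFICACY_LM_PER_W = 683.0

def luminous_efficiency(efficacy_lm_per_w):
    """Fraction of radiated power that falls where human eyes can actually see it."""
    return efficacy_lm_per_w / MAX_EFFICACY_LM_PER_W

# Sunlight is often quoted around 93 lm/W of radiated power, which works out
# to roughly 13-14% -- the same ballpark as the 12% figure above.
print(f"Sunlight: {luminous_efficiency(93):.0%}")
```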
I'm talkin' thermal efficiency though: how much electrical energy is wasted as heat versus how much of the total electrical energy flowing through the light source becomes electromagnetic radiation (visible or not). So if 100 watts of electrical energy goes into the light fixture and 20 watts comes back out as heat, it's 80% efficient at turning electrical energy into light.
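(To put that arithmetic in one place, here's a quick sketch of the 100 W / 20 W example, just restating the definition above as code.)

```python
# Quick sanity check of the 100 W in / 20 W lost-as-heat example.

def thermal_efficiency(watts_in, watts_lost_as_heat):
    """Fraction of input electrical power that leaves as radiation (visible or not)."""
    return (watts_in - watts_lost_as_heat) / watts_in

print(f"{thermal_efficiency(100, 20):.0%}")  # -> 80%
```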
While I may have the wrong thermal efficiencies here, the bottom line is that T5s, PCs, and halides create enough heat to burn your skin and raise the temperature of a closed room by a few degrees; meanwhile LEDs, even the new high-powered ones, cannot burn you and will not warm up a commonly sized room.