Most CRT monitors are quoted as having a phosphor life of 12,000 to 15,000 hours to half brightness. Compare that with the latest 7-series Panasonic plasma displays (60,000 hours), and today's CRT monitors look pretty poor. Why, then, do people worry more about image retention on a plasma than on a CRT? If a CRT reaches half brightness more quickly than even a budget plasma panel (20,000-30,000 hours), surely a heavily used CRT should show signs of uneven phosphor aging (such as a Start button burned into the screen) well before a plasma would. But that never seems to happen on a CRT; they seem to age fairly evenly despite very static images. Anyone have any ideas?
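
For some rough numbers (taking the quoted ratings at face value, and assuming 8 hours of use a day, which is just my guess at heavy use):

CRT: 15,000 h / (8 h/day x 365 days/yr) = roughly 5 years to half brightness
Plasma: 60,000 h / (8 h/day x 365 days/yr) = roughly 20 years

So on paper the CRT phosphors fade about four times as fast, which is what makes the lack of visible burn-in on CRTs so puzzling to me.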