100Hz processing, as others have mentioned, was designed to reduce the "large area flicker" that 50Hz refresh-rate displays suffer from. Large area flicker is usually more noticeable in peripheral vision than in the central area, so it becomes more of a problem with larger screens.
Some people are more sensitive to it than others, and US/Japanese viewers used to 60Hz refresh-rate displays notice it far more - at least initially.
By doubling the screen refresh rate to 100Hz the flicker is massively reduced - and becomes invisible to almost everyone at typical TV viewing distances, phosphor decay times and brightness levels. (CRT flicker on a PC monitor at 60Hz is more noticeable because PC monitors are usually much closer to you than a TV, so they occupy more of your field of view - and thus more of your peripheral vision - and they often have brighter pictures, as most PC displays use bright backgrounds these days, which obviously flicker more than black!)
However, converting 50Hz interlaced material to 100Hz interlaced material (100Hz displays still use interlacing - they aren't progressive) is actually pretty difficult to do cleanly. You can't just repeat every 50Hz field twice, nor can you simply repeat each 25Hz frame: with fluid motion you'd be jumping forward and back in time and get awful judder, as the sketch below shows.
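To make that concrete, here's a minimal sketch - purely my own illustration, with the field names and 20ms/10ms timings assumed for a generic 50i source and 100Hz display - that prints which source field would be shown at each 100Hz display instant under the two naive schemes:

```python
# A minimal sketch, purely my own illustration: field names and the 20ms/10ms
# timings are assumptions about a generic 50i source and a 100Hz display.
FIELD_PERIOD_100HZ_MS = 10  # the 100Hz display wants a new field every 10 ms

# Capture times of the first few 50i fields (T = top field, B = bottom field),
# one field every 20 ms.
source_fields = [("T0", 0), ("B0", 20), ("T1", 40), ("B1", 60)]

def repeat_each_field(fields):
    """Naive scheme 1: show every 50Hz field twice (T0 T0 B0 B0 ...)."""
    out = []
    for name, captured in fields:
        out.extend([(name, captured), (name, captured)])
    return out

def repeat_each_frame(fields):
    """Naive scheme 2: show every 25Hz field pair twice (T0 B0 T0 B0 ...)."""
    out = []
    for i in range(0, len(fields), 2):
        pair = fields[i:i + 2]
        out.extend(pair + pair)
    return out

for scheme in (repeat_each_field, repeat_each_frame):
    print(scheme.__name__)
    for n, (name, captured) in enumerate(scheme(source_fields)):
        display_time = n * FIELD_PERIOD_100HZ_MS
        print(f"  display t={display_time:3d} ms shows {name}, captured at t={captured:3d} ms")
```

Run it and the frame-repeat scheme shows capture times of 0, 20, 0, 20, 40, 60, 40, 60 ms - exactly the forward-and-back time jumping described above - while the field-repeat scheme has motion stalling and then lurching every 20ms.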
Instead, 100Hz displays have to convert the analogue signal to digital, do a form of de-interlacing from 50i to 50p, frame-double to 100p and then re-interlace to 100i (there are other ways of converting 50i to 100i, but they aren't great), and finally convert back to analogue for display - a chain sketched below. As everyone involved in HD knows, that is almost impossible to do totally cleanly, and very difficult to do at a low-cost consumer price point. The early 100Hz displays used very poor A/D and D/A converters (often only 6 bit) and very low sampling rates (much lower than the 13.5MHz used for pro gear), giving very soft results, but at the time the cost of high-quality A/D and D/A conversion, frame storage and high-speed processing was prohibitive. On top of that, any noise in the source (and compression artefacts can look like noise) will confuse a de-interlacer, so relatively high levels of noise reduction were added, causing smearing - and if the noise reduction is switched off, the motion processing often falls over.
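As a rough idea of that chain, here's a minimal sketch under my own assumptions - toy 8x8 frames, the crudest "bob" line-averaging de-interlacer and plain frame repetition, standing in for whatever motion-adaptive processing a real set actually uses:

```python
# A minimal sketch, under my own assumptions: toy 8x8 frames, the crudest
# "bob" line-averaging de-interlacer and plain frame repetition, standing in
# for whatever motion-adaptive processing a real 100Hz set actually uses.
import numpy as np

HEIGHT, WIDTH = 8, 8  # toy frame size

def deinterlace_bob(field, is_top):
    """50i -> 50p: expand one field to a full frame, averaging the missing lines."""
    frame = np.zeros((HEIGHT, WIDTH), dtype=field.dtype)
    present = range(0, HEIGHT, 2) if is_top else range(1, HEIGHT, 2)
    for src_line, dst_row in zip(field, present):
        frame[dst_row] = src_line
    missing = range(1, HEIGHT, 2) if is_top else range(0, HEIGHT, 2)
    for r in missing:
        above = frame[r - 1] if r - 1 >= 0 else frame[r + 1]
        below = frame[r + 1] if r + 1 < HEIGHT else frame[r - 1]
        frame[r] = (above.astype(np.float32) + below) / 2
    return frame

def frame_double(frames_50p):
    """50p -> 100p: simply repeat each progressive frame."""
    out = []
    for f in frames_50p:
        out.extend([f, f])
    return out

def interlace(frames_100p):
    """100p -> 100i: take alternating line sets from successive frames."""
    return [frame[0::2] if n % 2 == 0 else frame[1::2]
            for n, frame in enumerate(frames_100p)]

# Toy 50i input: two fields (top then bottom) of random 8-bit "video".
rng = np.random.default_rng(0)
fields_50i = [rng.integers(0, 256, (HEIGHT // 2, WIDTH), dtype=np.uint8)
              for _ in range(2)]

frames_50p = [deinterlace_bob(f, is_top=(i % 2 == 0))
              for i, f in enumerate(fields_50i)]
fields_100i = interlace(frame_double(frames_50p))
print(len(fields_50i), "input 50i fields ->", len(fields_100i), "output 100i fields")
```

Even in this toy form you can see where the quality hinges: everything downstream depends on how well the 50i to 50p de-interlace guesses the missing lines, which is exactly the step that source noise and compression artefacts upset.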
Additionally, some manufacturers have added motion interpolation algorithms which, if switched in, will convert 25p-originated film material to a 50i/p "video" motion - making films look like they were shot on interlaced video cameras! (There's a crude sketch of the idea below.)
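For illustration only, here's my own assumption of the simplest possible interpolator - a plain cross-fade between film frames. Real sets estimate motion vectors, and this doesn't represent any particular manufacturer's algorithm; it just shows the basic idea of synthesising in-between frames so 25p material moves at 50 frames per second:

```python
# A minimal sketch, my own assumption of the simplest possible interpolator
# (a plain cross-fade) - real sets estimate motion vectors, and this does not
# represent any particular manufacturer's algorithm.
import numpy as np

def interpolate_25p_to_50p(frames_25p):
    """Insert a synthetic frame halfway between each pair of 25p film frames."""
    out = []
    for a, b in zip(frames_25p, frames_25p[1:]):
        midpoint = ((a.astype(np.float32) + b) / 2).astype(a.dtype)
        out.extend([a, midpoint])
    out.append(frames_25p[-1])  # the last frame has no partner, so keep it as-is
    return out

# Toy example: three flat 25p "film" frames at increasing brightness.
frames = [np.full((4, 4), v, dtype=np.uint8) for v in (0, 100, 200)]
frames_50p = interpolate_25p_to_50p(frames)
print(len(frames), "film frames ->", len(frames_50p), "display frames")
```

Those synthetic in-between frames are what replace film's 25Hz cadence with the smooth video-camera motion described above.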
I've yet to see a 100Hz CRT I would choose to watch full time - and of course broadcasters would never use them for picture monitoring in critical situations, as the picture is hugely altered by the processing. They have got better in recent years at the high end - but the artefacts are still clearly visible.