In "non-pure" 100Hz sets (the vast majority of the market), additional digital processing is undertaken on the image, such as adding extra scan lines to produce a less interlaced-looking image. This makes motion blurring more pronounced and also often produces the artefacts that most people are capable of noticing and which are often the source of discussion on the forums.
So all that digital processing we pay for only makes the image worse? Right-o
I think it's important to note that this is mostly a matter of taste. On big screens, especially up close, 50Hz will seem unstable while 100Hz is solid. Considering some people coming from 60Hz NTSC can't stand 50Hz PAL, a difference of only 10Hz, I wouldn't dismiss the effect of doubling the refresh rate.
However, any form of video processing will deteriorate an otherwise perfect source. Especially with video games this can be a nuisance (not to mention that lightgun games won't always work at 100Hz; it depends on the game, the gun and the TV).
Regarding the sometimes-criticised image processing: a lot of showrooms have pretty poor video feeds. Post-processing on a bad feed looks extraordinarily ugly because the TV treats the interference as part of the source. With a good but less-than-perfect source, however, a bit of interference usually gets washed out completely, which is a big plus in this age of analogue inputs.
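A crude sketch of why that happens (assumed and heavily simplified; the actual noise-reduction algorithms in these sets are proprietary): a recursive temporal filter can only tell noise from motion by how much a pixel changes between frames.

```python
import numpy as np

def temporal_denoise(frames, strength=0.7, motion_threshold=30):
    """Minimal recursive temporal filter, a stand-in for whatever the
    set really does. Pixels that barely change between frames are
    blended with the previous output, so random interference averages
    away. Pixels that change a lot are passed through as motion.
    On a very noisy feed the interference exceeds the threshold almost
    everywhere, so the filter treats it as motion, i.e. as part of the
    source, and the result can look worse than no filtering at all."""
    prev = frames[0].astype(np.float32)
    out = [frames[0]]
    for f in frames[1:]:
        cur = f.astype(np.float32)
        static = np.abs(cur - prev) < motion_threshold
        blended = np.where(static, strength * prev + (1.0 - strength) * cur, cur)
        out.append(blended.astype(np.uint8))
        prev = blended
    return out
```

So the showroom impression is misleading in both directions: the filter shines on a decent source and falls apart on a terrible one.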
Personally, I love my Philips 28PW8807. Yes, there are processing artefacts sometimes (some objects develop halos, and motion occasionally looks artificially sped up), but the improved stability and fluidity more than make up for it.
Until 100Hz HDTV is a standard (in like half a century), there will always be a compromise to make.