I've just searched this section of the Forum for threads about the bitrates used on Freeview channels. Although there are some references to bitrates, they are scattered, so there seems to be a need for a concerted effort to pull this together. Here goes!

Assuming a good signal strength, there is clear evidence that the bitrate varies considerably between channels. The BBC channels appear OK, as does ITV-1, but others vary, with Five and Sky-3 being the worst I've observed so far.

The effect of a low bitrate is what I call "rubber face". In a close-up of a face, large areas of the screen (e.g. the cheeks) have a similar texture, and presumably a starved encoder can get away with coding such an area very coarsely. If the face then moves slightly (as it always will!) the cheek area stays still while the rest of the face moves around it, giving a "Wallace and Gromit" look to the face, which is why I call it "rubber face". At its worst, a mole in the centre of a cheek will be coded once and then repeated in successive frames in the same position. As the face shifts slightly, the mole can appear to travel all round the cheek, because it is actually pinned to the same pixels on the screen.

I believe it is not acceptable to foist such appalling artefacts on the public. It is arrogant in the extreme for the providers of these channels to assume the public won't notice - we will! With the analogue switch-off on the horizon and HD in the foreground, it's time to "get it right" with regard to standard-definition TV. Who agrees?

Incidentally, why does Channel 4 digital from the Rowridge transmitter suffer from appalling drop-outs on Sunday afternoons? I've had to resort to analogue to watch "Enterprise" - fortunately it's presented in 16:9 letterbox on that platform. Oh, and who at Five thought it was acceptable to present "Forbidden Planet" in 4:3 the other day?
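For anyone curious about the mechanics, the "travelling mole" effect described above can be sketched in a few lines. This is a toy illustration only, not a real MPEG-2 decoder: a 3x3 grid stands in for a grid of macroblocks, and any block the (hypothetical) bit-starved encoder doesn't re-code is a "skip" that simply copies the previous frame's pixels, so the mole stays glued to its old screen position while the rest of the face moves.

```python
# Toy model of macroblock skipping ('.' = cheek, 'E' = eye, 'M' = mole).
# Not real codec behaviour - just an illustration of the artefact.

def decode_frame(prev, coded_blocks):
    """coded_blocks maps (row, col) -> new 1x1 'macroblock'.
    Any position not listed is a skip: copied verbatim from prev."""
    frame = [row[:] for row in prev]
    for (r, c), pix in coded_blocks.items():
        frame[r][c] = pix
    return frame

# Frame 1: face with an eye at (0,1) and a mole at (2,2).
frame1 = [list(".E."), list("..."), list("..M")]

# Frame 2: the whole face shifts one block to the left. With enough
# bits the encoder would re-code the mole's block too; starved of
# bits it only re-codes the high-contrast eye and skips the flat
# cheek blocks, so the mole stays put on the same screen pixels.
frame2 = decode_frame(frame1, {(0, 0): "E", (0, 1): "."})

for row in frame2:
    print("".join(row))
```

Running it shows the eye has moved left with the face, but the mole is still at its original (bottom-right) screen position rather than shifting with it, which is exactly the "mole travelling round the cheek" effect relative to the face.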