Until now I've always accepted that transcoding my Blu-ray MKV rips to .m4v (MP4) will cause video quality loss (and of course the loss of DTS-HD), but I finally decided to do a proper comparison and find out how perceptible that loss actually is.

Let's use Star Wars Episode III as an example. I ripped the Blu-ray using MakeMKV and ended up with an MKV file containing three tracks: the video track, a DTS-HD (lossless) audio track, and a PGS subtitle track. This file is 35.45GB in size. Using HandBrake (High Profile, x264 tune: Film, x264 preset: Slow, CQ 21), the transcoded .m4v came out at 5.65GB.

Each file was played in VLC, paused at a precise moment (using the go-to seconds counter), and a snapshot was taken. I opened the snapshots side by side on my iMac and scanned for differences between the two images (including zooming in), but I couldn't see any. Thinking my 27" iMac screen might be too small, I also viewed the images on my Samsung UE55F8000 TV; again, I couldn't see any differences. The colours are exactly the same, the detail looks exactly the same, and the colour gradation looks exactly the same (no banding or stepping).

Not content, I did the same test with The Lord of the Rings: The Fellowship of the Ring, this time choosing a meadow scene, and again I couldn't see any differences. Every blade of grass, meadow flower, etc. was just as detailed in the .m4v as it was in the original MKV.

So I'd like to ask the community: is my testing method flawed? There has to be a difference, I know there does, but why doesn't my snapshot comparison show it? I want to justify to myself why it's worth keeping the original MKVs, and I'm looking forward to your replies and perhaps some guidance on where I'm going wrong.
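For what it's worth, one way to go beyond eyeballing snapshots is an objective metric. Below is a minimal, hypothetical sketch (plain Python, made-up pixel values, and the function name `psnr` is mine) of computing the peak signal-to-noise ratio between two frames: identical frames give infinite PSNR, while subtle encoder loss shows up as a finite number even when the eye sees nothing. ffmpeg can run the same kind of comparison (its `psnr` and `ssim` filters) across an entire video rather than a single frame.

```python
import math

def psnr(frame_a, frame_b, max_value=255):
    """PSNR in dB between two equal-length sequences of 0-255 pixel values."""
    if len(frame_a) != len(frame_b):
        raise ValueError("frames must have the same number of samples")
    # Mean squared error between corresponding pixels
    mse = sum((a - b) ** 2 for a, b in zip(frame_a, frame_b)) / len(frame_a)
    if mse == 0:
        return float("inf")  # bit-identical frames
    return 10 * math.log10(max_value ** 2 / mse)

# Toy 4-pixel "frames": the transcode differs by only 1-2 levels per
# pixel, which is invisible to the eye but still measurable.
original  = [120, 121, 119, 200]
transcode = [121, 121, 118, 199]
print(round(psnr(original, transcode), 1))  # finite, despite looking identical
```

On real snapshots you'd feed in the decoded pixel data of both images; a high but finite PSNR would confirm the transcode really is lossy, just not visibly so.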