Why are some DVD/BD players better than others?

Vintage

I wonder if anyone has a definitive answer to this!

The way I understand it, a DVD player contains a laser for reading the disc, and then some kind of software to decode the data into a picture that can be understood by the display. As we know, DVD players, and now Blu-ray players, vary a lot in price, and it's clear that some players produce better (or at least different) pictures. What I would like to know is which part of this process affects picture quality.

Because we're dealing with digital data, shouldn't the output be the same on any player? I realise that there can be some loss of quality through the cables, and of course there can be limitations with a particular display. But I would like to understand exactly why some players are able to produce better raw data from the same source.

Thanks!


 
If you assume the likelihood that all manufacturers reverse engineer their competitors' products, then they all know how the circuit boards work. I can't imagine that each machine is unique compared to the others; they are likely to share similar if not the same parts. So I'd venture to say that it's all down to components.
How can you manufacture the same product for less? By using inferior components which are cheaper. They will work with the raw data, but the end result will not be as good.
The big guys make the better product using the finest materials (well, you'd hope they did!!) whereas the others make the same thing, lay it out a bit differently, use cheaper parts and knock it out cheaper. The end result: a player which works, but not as well as the more expensive one built with top-end parts!

There's a saying somewhere about being the sum of all parts; can't quite remember it, but I'm sure you get the drift!
 
With BD players, and with the HDMI interface being so regulated, I think you will find with a bit of research that, with regard to playing content from Blu-ray discs, they are all pretty much identical.

There are some tiny differences in data, but these are usually not visible; for those that are, you need a huge screen to see them.

Because the data is digital from disc to output, all the player has to do is not mess with it, and the output will be as good as it can be.
Also, HDMI cables cannot affect image or sound quality in any way; they either work or they don't. Subtle differences in picture or sound between any two HDMI cables are completely and totally impossible.

When it comes to BD players, the more expensive models are usually universal players: they have quality stereo audio outputs, quality multichannel audio outputs, they play just about any disc in existence, and they also act as network streamers and file-based players.
That is what you are paying for; there is no appreciable difference in digital output from Blu-ray, for either sound or picture, that makes it worth paying huge amounts of money.

DVD, for most of its life, has had huge variability in how it handles content; there are at least 20 content tests a player has to pass based on the types of content available, and if you wanted a player that could handle all of this you had to spend some money.

Also, the vast majority of DVD players had analogue outputs (SCART, S-Video, component video and so on). The content was digital, but D/A conversion was done in the player, and this will always make players differ depending on how it's implemented.

So, in short, BD players are all pretty much the same when it comes to playing BD discs.
It's what else they can do that differentiates them.

DVD players are an older technology relying on too many analogue stages, and this is where the variability in playback came from.

These days, a BD player is best for playing both types of disc.
 
Just to add: a difference between DVD & BD playback is that the latter proceeds much as the OP imagines, and therefore BD image quality is largely consistent across all players, whereas DVD content must first be interpreted by the player to produce the image, which is one reason why DVD picture quality and price vary so much from player to player.
 
With BD players, and with the HDMI interface being so regulated, I think you will find with a bit of research that, with regard to playing content from Blu-ray discs, they are all pretty much identical.

I strongly disagree. There are differences, most noticeable on tracking shots. For example, in the last chapter of "Star Trek" on Blu-ray, the tracking shot on most mainstream players is jerky, while on a high-quality player it is very smooth. I am only using that disc as an example; it applies to most discs that I have tried.

There are different processors used across the board at different prices and, as a generalisation, the higher the price the better. Picture quality is only part of the issue. Load times can also be cited (one of the best, oddly enough, is the Sony PlayStation). The same goes for audio outputs, which can depend on how you use the player: direct to a TV using HDMI you will never see or hear a difference between an expensive and a cheap cable, but the gain a good HDMI cable brings when feeding a high-end A/V amp most certainly is noticeable. Should you use the analogue output of your player into a hi-fi amp, then again the quality of the chipset doing the digital-to-analogue conversion very much comes into play. Have you ever tried playing a CD in an inexpensive DVD/Blu-ray player? It sounds appalling.

Clive
 
.....the same goes for audio outputs, which can depend on how you use the player: direct to a TV using HDMI you will never see or hear a difference between an expensive and a cheap cable, but the gain a good HDMI cable brings when feeding a high-end A/V amp most certainly is noticeable.

Complete nonsense. See here, and about a million other sites:

http://www.avforums.com/forums/hdmi-cables-switches/831330-hdmi-cables-just-facts.html

There is never, ever a case where an HDMI cable makes a difference to sound or picture, no matter what kind of magic gear you think you have; it is not technically possible.

This is my job, my field; I know I'm right.

Jerky playback is caused by a number of issues, most of them setup issues. It is rarely, if ever, down to the player.

As for expert player reviews, take a look here:
AVForums Hardware Reviews | AVForums.com - UK Online

And here:

Blu-ray Players - Secrets of Home Theater and High Fidelity

As with all of the reviews for Blu-ray players, both here and on other sites, this or something like it will be said:

As with 3D, any player should be capable of an equally impressive performance when delivering 1080p24 over HDMI and needless to say, the BDT500 displayed 1080p24 encoded Blu-rays without introducing any issues. The player also had no problems handling 720p Blu-rays encoded at 50Hz or 60Hz. As long as you selected the Cinema picture mode, there appeared to be no unwanted processing going on with 1080p content and as a result, the suitably unadulterated 1080p output looked great.

That quote is from the review of the Panasonic DMP-BDT500 on here.

Show me one "expert" review where it says there is a noticeable difference in picture between players when playing Blu-ray content.
"Expert" reviews, by the way, are ones where the opinion is not subjective but based on actual data measurements.
 
Thank you all for these replies!

I realise that a higher-priced player might use better quality materials, or have more features/options/connections etc. But I am only curious about picture and sound quality, and the 'weak link' in the chain that might affect the results.

What this is all leading to is that I plan to rip my prerecorded DVD collection (~800 discs :suicide:) to HDDs. I want to ensure that I get a 'full quality' backup onto my HDDs, because I can worry later about which software/hardware I will use to play back the files. The ultimate plan is to use a PC to replicate the performance of a high-end DVD player, without the inconvenience of having to take DVDs out of their boxes and insert them into the drive. :)

I was worried that perhaps the high-end DVD players have a better laser, or a more accurate way of reading the DVD (without errors). If this were the case, then my ripping of files, and the resulting quality of those files, would be dependent on the DVD-ROM drive in my PC. But from your advice so far, it seems that the only loss would occur during processing, rather than reading.
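In case it's useful to anyone planning the same thing, here's a minimal sketch of how I intend to sanity-check my rips (the .iso file names are just placeholders): rip the same disc twice, or once each on two drives, and compare checksums.

```python
import hashlib

def sha256_of(path, chunk_size=1 << 20):
    """Hash a file in 1 MB chunks so multi-GB rips don't need to fit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# Rip the same disc twice (or once each on two different drives),
# then compare: identical hashes mean the drive read every bit correctly.
rip_a = sha256_of("disc_rip_attempt1.iso")  # placeholder file names
rip_b = sha256_of("disc_rip_attempt2.iso")
print("bit-identical rips" if rip_a == rip_b else "read errors somewhere")
```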

Any further advice is appreciated! :thumbsup:
 
There is never, ever a case where an HDMI cable makes a difference to sound or picture, no matter what kind of magic gear you think you have; it is not technically possible.
This is my job, my field; I know I'm right.

According to Andy, the free HDMI cable supplied with a Sky+HD box (usually referred to as a piece of wet string) is as good as, for example, a £200 HDMI cable from such sources as Russ Andrews, Chord Co., Black Rhodium etc. Why then do we, the vast majority of users of this forum, spend in total thousands of pounds on high-quality cables? We must, according to Andy, be complete dimwits with money to waste.

Such arrogance.
 
You don't speak for me. I for one don't waste money on cables. £1 per HDMI link is enough.
 
According to Andy, the free HDMI cable supplied with a Sky+HD box (usually referred to as a piece of wet string) is as good as, for example, a £200 HDMI cable from such sources as Russ Andrews, Chord Co., Black Rhodium etc. Why then do we, the vast majority of users of this forum, spend in total thousands of pounds on high-quality cables? We must, according to Andy, be complete dimwits with money to waste.

Such arrogance.

As has already been stated in a previous post, speak for yourself, because Andy is correct.

I may spend a modest amount on buying better analogue cables for hi-fi use, but even then only £10-£15 max.

However, as far as HDMI cables are concerned, just use those supplied with the source component. :cool:
 
According to Andy, the free HDMI cable supplied with a Sky+HD box (usually referred to as a piece of wet string) is as good as, for example, a £200 HDMI cable from such sources as Russ Andrews, Chord Co., Black Rhodium etc. Why then do we, the vast majority of users of this forum, spend in total thousands of pounds on high-quality cables? We must, according to Andy, be complete dimwits with money to waste.

Such arrogance.

The data is in the links (not my data, you understand), and when it comes to HDMI cables, the vast majority around here know the expensive cables make no difference; that is a fraud that is now quite well known. Watch the videos, read the data: it's impossible for them to make a difference.

The likes of Chord, RA, and anyone else who sells expensive HDMI cables are lying if they say there is a picture or sound difference, and everywhere you look you will find proof.

Here's a little something I wrote up a couple of years ago as a quick explanation of why it's impossible for any HDMI cable to be better than any other.

Why is it impossible for an HDMI cable to be responsible for incremental changes in picture quality?

Video information is transmitted as a series of 24-bit pixels, 8 bits for each of the primary colours; these are encoded using the TMDS protocol into three 10-bit words per pixel clock period (i.e. each pixel is carried as 30 bits).

They are also supplied to the screen at a rate equal to:

Pixel rate = Resolution x Refresh rate x [1 + Blanking fraction], in pixels/sec

where the blanking fraction accounts for the horizontal and vertical blanking intervals.

For 1080p @ 60 Hz this would be:

1920 x 1080 x 60 x [1 + 0.16] ≈ 144.3 million pixels/sec

You can work out how many pixels per second for any particular content by substituting resolution figures and refresh rates into the equation above. I'm picking 1080p @ 60 Hz as this is the worst case in terms of content actually possible at the moment.
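To put numbers on it, here's a quick sketch of the arithmetic, using the 0.16 blanking fraction from the example above (the exact CEA-861 timing for 1080p60 is 2200 x 1125 total pixels, i.e. a 148.5 MHz pixel clock, so 0.16 slightly understates the blanking):

```python
# Pixel rate using the formula above, with the 0.16 blanking
# fraction from the worked example. Each pixel is then carried
# as 3 x 10 TMDS bits across the three data channels.

def pixel_rate(width, height, refresh_hz, blanking=0.16):
    """Pixels per second delivered over the link, including blanking."""
    return width * height * refresh_hz * (1 + blanking)

rate = pixel_rate(1920, 1080, 60)
print(f"{rate / 1e6:.1f} million pixels/sec")     # ~144.3
print(f"{rate * 30 / 1e9:.2f} Gbit/s TMDS data")  # ~4.33

# For comparison, the exact CEA-861 timing for 1080p60 is
# 2200 x 1125 total pixels x 60 Hz = a 148.5 MHz pixel clock.
```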

Now let's start by assuming a 0% bit error rate on a particular cable.
That means no errors whatsoever in the stream, meaning all pixel information, that's 3 x 10-bit words for each pixel, gets through as it was transmitted, meaning a perfect result.

A cable is a passive collection of wires; there is no way for a cable to manipulate the data in the 3 x 10-bit words, so a cable can in no way improve on the data. For a cable with a zero bit error rate, this is as good as it can be.

That means the only possible way for a cable to change the data is if the cable in some way corrupts the data.

Now consider what the nature of this corruption would have to be if the cable is somehow to be responsible for "deeper blacks", more vibrant colours, and so on.
The type of thing claimed by some of the more disreputable magazine reviewers and cable sellers!

The corruption would have to take the form of a fault which somehow changes the 3 x 10-bit words for each pixel so that all the pixel information for blacks was changed to a deeper value of black, and all the pixel information for the other colours was changed to values that encoded more vibrant colours!

So you would have to believe that random errors introduced into the bitstream could somehow cause all pixels to encode values that produced a better picture, and that it could do this 144 million times per second.

Clearly such a thing is beyond any probability. In fact, any errors introduced would most likely produce a 3 x 10-bit pixel value that made no sense or was totally different to the original value, showing nonsense on the screen if indeed it showed anything at all. (In the vast majority of cases this is exactly what happens on data corruption: the interface just stops working.)
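If you want to convince yourself, here's a toy simulation of that point; it ignores TMDS coding entirely and just flips random bits in raw 8-bit channel values:

```python
import random

random.seed(1)

def flip_random_bit(value, bits=8):
    """Corrupt one 8-bit colour channel value by flipping a random bit."""
    return value ^ (1 << random.randrange(bits))

dark_grey = 16  # a near-black channel value
corrupted = sorted(flip_random_bit(dark_grey) for _ in range(10))
print(corrupted)
# e.g. [0, 17, 18, 20, 24, 48, 48, 80, 144, 144] -- the values scatter
# in both directions; random errors never nudge every pixel uniformly
# toward "deeper black" or "more vibrant colour".
```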

In the case of long cables there is an attenuation problem, but this again does not affect sound or picture quality, because the electrical signal present on the cable is not an analogue signal; only the levels representing the data are important, and when these become indistinct you lose data. Meaning, in the case of an HDMI cable, which carries so much data, that in all probability the cable just stops working.

As you can see from this, it is clearly impossible for an HDMI cable to be responsible for incremental picture changes: either the data gets through with a very, very low or zero bit error rate, or most of the data gets corrupted. For most cables it really is a case of they work or they don't.

You cannot have cables that are slightly better than others in terms of picture quality; the technology simply doesn't allow it. It's completely impossible.

Regarding audio quality with HDMI: audio data is inserted into the video blanking periods. It is not a continuous bitstream; it is recovered and reclocked at the sink (receiving device) using a formula based on the video clock.

The audio clock is not transmitted over HDMI; rather, it is derived at the sink end from the video clock.

The source computes integers N and CTS such that:

128 × fs = fTMDS_CLK × N / CTS

N is fixed for a given video and audio rate (table lookup). The source counts TMDS clocks per audio clock to determine CTS, and N and CTS are transmitted in the audio clock regeneration packet.

The sink regenerates the audio sample clock from the received fTMDS_CLK, N and CTS values.
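As a worked example of that formula, using the standard HDMI table value of N = 6144 for 48 kHz audio and the 148.5 MHz pixel clock of 1080p60:

```python
# Worked example of the audio clock regeneration formula above,
# 128 * fs = f_TMDS * N / CTS, using the standard table value
# N = 6144 for 48 kHz audio on a 148.5 MHz (1080p60) pixel clock.

f_tmds = 148_500_000  # TMDS (video pixel) clock in Hz
fs = 48_000           # audio sample rate in Hz
N = 6144              # table lookup for 48 kHz

CTS = f_tmds * N // (128 * fs)
print(CTS)  # 148500 -- sent to the sink in the regeneration packet

# The sink rebuilds the audio clock purely from the video clock:
fs_recovered = f_tmds * N / (128 * CTS)
print(fs_recovered)  # 48000.0
```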

Asynchronous video and audio clocks, or audio clock jitter, can cause CTS to change over time, but it's important to realise that such jitter cannot happen due to the cable; it can only happen due to problems in either the source or sink silicon/clock combination. So, regardless, the cable cannot affect audio quality.

In addition, you should note that typical HDMI chipsets, such as the Silicon Image 9134/9135 transmitters and receivers, have jitter performance better than 1 ps.

Summing it all up: in the case of an HDMI cable, no amount of spending will improve picture or sound quality. That's impossible. If it works, it works. That's it!!
 
My own personal observation is that under 10m, all HDMI cables I've tried produce the same image. Over that I've had some issues, but more along the lines of it not working correctly than the image lacking 'pop' or any other WHF nonsense...

I've tried £3-£5 and £40-£60 HDMI cables at 1-1.5m on both the LCD and the PJ and couldn't see any difference.

But if you see an improvement with more expensive cables then go for it and enjoy. :thumbsup:
 
Expensive mains cables are as big a fraud as expensive HDMI cables.

Anyone who takes the time to learn how equipment power supplies work (which is simple stuff, by the way) will see straight away that they couldn't possibly do anything useful and can in no way be worth the money they cost.

Russ Andrews have a string of ASA adjudications against them for making unsubstantiated claims about expensive cables, so buyer beware: regardless of cable, they have a history and a very dubious reputation.
 
Just to add: a difference between DVD & BD playback is that the latter proceeds much as the OP imagines, and therefore BD image quality is largely consistent across all players, whereas DVD content must first be interpreted by the player to produce the image, which is one reason why DVD picture quality and price vary so much from player to player.

I don't get this. DVD is compressed using a lossy compression algorithm, namely MPEG-2. A DVD player rebuilds the original content as far as possible; the higher the bitrate, the better objects in motion are reproduced. Most Blu-rays use H.264/AVC, a newer and more efficient compression codec that can give equivalent quality to MPEG-2 at a lower bitrate. Decoding this is an identical process. In fact some Blu-rays carry MPEG-2 compressed content anyway. Virgin Media use MPEG-2 for HD broadcasting; the picture is just as good as satellite using H.264/AVC, but needs a much higher bitrate.

Even Blu-ray can't support uncompressed 1920 x 1080 24fps content.

The maths:

Each on-screen pixel requires 24 bits of data for every frame (8 each for red, green and blue).

For a full frame that's 24 x 1920 x 1080 = 49,766,400 bits. The screen requires this 24 times a second = 1,194,393,600 bits/second, or about 1,194 Mbps. A 50GB Blu-ray disc holds 50 x 8 x 1000 = 400,000 megabits. Divide this by 1,194 and you get only about 335 seconds of programme content.
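The same sums in code, assuming 8 bits per colour channel and treating 50 GB as 50,000 megabytes, as above:

```python
# The arithmetic above, spelled out (assuming 8 bits per colour
# channel and treating 50 GB as 50,000 megabytes, as the post does).

bits_per_pixel = 24                   # 8 each for red, green, blue
width, height, fps = 1920, 1080, 24

bits_per_frame = bits_per_pixel * width * height    # 49,766,400
bits_per_second = bits_per_frame * fps              # 1,194,393,600

disc_megabits = 50 * 8 * 1000                       # 400,000 Mbit on a 50GB disc
seconds = disc_megabits * 1_000_000 / bits_per_second
print(f"{seconds:.0f} seconds of uncompressed video")  # ~335
```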

Most Blu-ray players will play back DVDs so that they look significantly better than on most DVD players. The reason: they have better scalers, to upscale 720 x 576 to 1920 x 1080.
 
I don't get this. DVD is compressed using a lossy compression algorithm, namely MPEG-2. A DVD player rebuilds the original content as far as possible; the higher the bitrate, the better objects in motion are reproduced. Most Blu-rays use H.264/AVC, a newer and more efficient compression codec that can give equivalent quality to MPEG-2 at a lower bitrate. Decoding this is an identical process. In fact some Blu-rays carry MPEG-2 compressed content anyway. Virgin Media use MPEG-2 for HD broadcasting; the picture is just as good as satellite using H.264/AVC, but needs a much higher bitrate.

Even Blu-ray can't support uncompressed 1920 x 1080 24fps content.

The maths:

Each on-screen pixel requires 24 bits of data for every frame (8 each for red, green and blue).

For a full frame that's 24 x 1920 x 1080 = 49,766,400 bits. The screen requires this 24 times a second = 1,194,393,600 bits/second, or about 1,194 Mbps. A 50GB Blu-ray disc holds 50 x 8 x 1000 = 400,000 megabits. Divide this by 1,194 and you get only about 335 seconds of programme content.

Most Blu-ray players will play back DVDs so that they look significantly better than on most DVD players. The reason: they have better scalers, to upscale 720 x 576 to 1920 x 1080.

Indeed, you are clearly a deluded boffin.
 
Indeed, you are clearly a deluded boffin.

Explain :confused:

Why deluded? Whoever said cheap Blu-ray players must be identical to more expensive ones, because of the way the picture is created compared to DVD players, was the deluded one.

I made no comment as to whether an expensive Blu-ray player produces better pictures than a cheaper one.

Merely that the explanation that DVD players work differently to Blu-ray players makes no sense at all.

They don't. Blu-ray uses a blue laser capable of a narrower track focus, so it can pack more data into the same space than a DVD. This means the player can read data faster than a DVD player needs to; it has to, as it is producing 1920 x 1080 pixels as opposed to 720 x 576 for a UK DVD. They otherwise work identically to DVD in that the data is compressed in a form where one full frame (an I-frame) is sent, followed by subframes carrying only difference data. The decoder has to recreate, as far as possible, the full frame data for each subframe, basically by intelligent guesswork.
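A toy sketch of that I-frame plus difference-frame idea (grossly simplified: real MPEG-2/H.264 uses motion-compensated DCT blocks, not raw per-pixel deltas, but the reconstruction principle is the same):

```python
# Toy model of I-frame plus difference-frame decoding. Real MPEG-2
# and H.264 use motion-compensated DCT blocks rather than raw
# per-pixel deltas; this only shows the reconstruction principle.

i_frame = [16, 16, 200, 200]     # one complete frame of pixel values
difference_frames = [
    [0, 0, -5, 5],               # subframe 1: only what changed
    [1, 0, 0, -3],               # subframe 2
]

frames = [i_frame]
for delta in difference_frames:
    previous = frames[-1]
    frames.append([p + d for p, d in zip(previous, delta)])

for frame in frames:
    print(frame)
# [16, 16, 200, 200]
# [16, 16, 195, 205]
# [17, 16, 195, 202]
```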

I would have thought it likely that a Blu-ray box with a better decoder would produce better pictures. The same argument applies to DVD players. If coming up with arguments why this isn't the case, at least come up with a reason that makes some sort of sense.

The same applies to a TV: watching the same HD channel, on the same size and type of screen, so getting the same digital data, by the same argument they should all have the same picture quality. They clearly don't; the better sets have superior video processing. Compare a 40" Panasonic with a Vestel el cheapo from Tesco.
 
Why deluded? Whoever said cheap Blu-ray players must be identical to more expensive ones, because of the way the picture is created compared to DVD players, was the deluded one.

I made no comment as to whether an expensive Blu-ray player produces better pictures than a cheaper one.

Merely that the explanation that DVD players work differently to Blu-ray players makes no sense at all.

I didn't respond to this earlier because I wasn't entirely sure what you meant, and I'd already commented on the nature of DVD players and why they were different earlier, in post 3.

Only when Blu-ray came along did you have DVD players with pure digital outs for picture, so for most of the life of this format a key differentiator was the quality of the digital-to-analogue conversion onboard. This was mentioned earlier in toodeep's post as the player having to interpret the digital content.

Blu-ray, from its inception, was intended as a digital delivery system from disc to output. The standards for the format say exactly what the player should be outputting in terms of data, and players can easily be tested for that. Hence the reason why reviews on here, and just about anywhere else, say most players are the same: the same data out means no difference, and as long as the player is not messing with the data and outputting it as it should be, that is usually the case.

DVD, on the other hand, if we're talking about an analogue output player, must be decoded, processed and output as analogue onboard, and there are huge variations in circuit design within that process, therefore huge variations in output.

Even with digital-output DVD via HDMI, differences exist due to the variable nature of scaling chipsets, even while playing on BD players.

Panasonic's UniPhier chipset, the Oppo 83's Anchor Bay chipset, the Oppo 9x models' Marvell Qdeo chipset, and the PS3's edge-adaptive routine all produce markedly different results when playing back/upscaling DVD or SD content, but all of these players are so close as to make no difference when playing back BD content. That is as it should be, because all BD players should have at least one mode, if not more, where they output the data as-is. If they don't, they usually get a poor review!
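To illustrate why scalers differ so much on SD content, here's a toy example upscaling a single row of pixels two different ways; real chipsets like Qdeo and Anchor Bay's use far more sophisticated multi-tap, edge-adaptive filters than either of these:

```python
# Toy 1-D upscale of 4 pixels to 8, two ways. Real scaler chipsets
# (Qdeo, Anchor Bay, UniPhier...) use multi-tap, edge-adaptive
# filters; this just shows why the choice of algorithm is visible.

row = [10, 200, 50, 90]

def nearest_neighbour(src, out_len):
    """Repeat the closest source pixel -- cheap and blocky."""
    return [src[i * len(src) // out_len] for i in range(out_len)]

def linear(src, out_len):
    """Blend the two closest source pixels -- smoother edges."""
    out = []
    for i in range(out_len):
        pos = i * (len(src) - 1) / (out_len - 1)
        lo = int(pos)
        hi = min(lo + 1, len(src) - 1)
        t = pos - lo
        out.append(round(src[lo] * (1 - t) + src[hi] * t))
    return out

print(nearest_neighbour(row, 8))  # [10, 10, 200, 200, 50, 50, 90, 90]
print(linear(row, 8))             # [10, 91, 173, 157, 93, 56, 73, 90]
```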

As such, with BD content, variability happens at the screen only: with identical data from the disc to the HDMI input on the screen, if the data is the same, there is no visible difference.

The only variable left in the home video chain is the screen; with calibration this can be kept to an absolute minimum and your setup will be as accurate as it can be.
 
Only when Blu-ray came along did you have DVD players with pure digital outs for picture, so for most of the life of this format a key differentiator was the quality of the digital-to-analogue conversion onboard. This was mentioned earlier in toodeep's post as the player having to interpret the digital content.

Strange then that my rather old Denon DVD player has HDMI out (it was new long before Blu-ray). You have the option of the original 576i or of letting the internal scaler produce 1080i. There's not much difference, presumably because the TV has an equivalent quality scaler. The analogue RGB outputs look identical to the 576i digital output.
 
grahamlthompson said:
Strange then that my rather old Denon DVD player has HDMI out (it was new long before Blu-ray). You have the option of the original 576i or of letting the internal scaler produce 1080i. There's not much difference, presumably because the TV has an equivalent quality scaler. The analogue RGB outputs look identical to the 576i digital output.

HDMI devices first went into production in 2003, the same year as the first Blu-ray hardware; 4 million HDMI devices were sold in 2004, and I suspect most of those would have been DVD players.

BD players, very expensive ones for early adopters, had been available since April 2003, and continued to be available right up until the format's "official" global release in 2006. But without a doubt, the advent of HDMI and HD content players coincided.

DVD has been around since 1995 or thereabouts.
 
So, in short, BD players are all pretty much the same when it comes to playing BD discs.

Andy - not quite.
The vast majority are pretty much the same; the remainder are different.
For example, with MPEG-2 you get two-dimensional artefacts called 'ringing', and there is digital filtering to try to deal with this; the more complex the filter, the better the result. There are various filters useful for MPEG-2 and H.264, and in the past couple of years I have seen player chips with filters of varying performance.
In addition, some of the other functions, e.g. motion adaptation, can be tweaked depending on a player manufacturer's preferences and architecture, e.g. QDEO output processing.
Since the new Google TV uses the Marvell Armada 1500, I am hoping the effort has been put into improving decode and PQ performance. You may have guessed I am picky about this. :blush:
 
Trollslayer said:
Andy - not quite.
The vast majority are pretty much the same; the remainder are different.
For example, with MPEG-2 you get two-dimensional artefacts called 'ringing', and there is digital filtering to try to deal with this; the more complex the filter, the better the result. There are various filters useful for MPEG-2 and H.264, and in the past couple of years I have seen player chips with filters of varying performance.
In addition, some of the other functions, e.g. motion adaptation, can be tweaked depending on a player manufacturer's preferences and architecture, e.g. QDEO output processing.
Since the new Google TV uses the Marvell Armada 1500, I am hoping the effort has been put into improving decode and PQ performance. You may have guessed I am picky about this. :blush:

As said, the vast majority output the data as it should be. There have been one or two that don't, and these have been called out (Sony and LG are examples); specific measurements are provided both on here and on the Secrets of Home Theater site I linked to above.

Most do have at least one mode where data is output unmolested. That is as good and as accurate an output as you get.

A taste for extra processing or manipulation of this data (Qdeo etc.) is quite rightly frowned upon by most reviewers, especially in the case of a player where it cannot be turned off!

If it's to your taste, fine, but if you want an accurate, calibrated video chain, this type of thing should be off.
 