Are we about to hit a long-term peak for AV Receivers?

dante01

Distinguished Member
A peak?

Why? When did they start addressing the elephant in the room and spending money on the actual analogue amplification, as opposed to pouring money into licensing gimmicks? Do AV receivers actually sound any better for having all those digital bells and whistles bolted onto them, or are they in fact less well built in order to facilitate getting those bells and whistles in at a targeted price?


Heavily into Morris dancing? Great, you'll be into bells and whistles then :smashin:



Denon practicing for the unveiling of their new models:

[Image: Morris Dancers, York]
 

NewbieAudiophile

Active Member
I feel that there is a lot of unnecessary criticism and negativity in this thread, and I get it - how different is 'IMAX Enhanced' from a regular DTS:X soundtrack? Is it really worth a bunch of money on top of what you're paying? Probably not, but the good thing is that there is continual improvement and innovation.

Sure, most Atmos and DTS:X soundtracks don't utilise the height channels to full capacity, but when I switch my sound decoder to Dolby Atmos or DTS Neural:X, it definitely makes a difference to the viewing experience of regular DTS or Dolby audio tracks.

Sometimes I feel as if it's a gimmick though, like the hi-res music these days - I can't tell the difference between 44.1/16 and 192/24 at all really - I guess we just have to discern carefully when making purchases in this space of audiophilia, etc.
 

Barney Gumble

Well-known Member
I feel that there is a lot of unnecessary criticism and negativity in this thread, and I get it - how different is 'IMAX Enhanced' from a regular DTS:X soundtrack? Is it really worth a bunch of money on top of what you're paying? Probably not, but the good thing is that there is continual improvement and innovation.

Sure, most Atmos and DTS:X soundtracks don't utilise the height channels to full capacity, but when I switch my sound decoder to Dolby Atmos or DTS Neural:X, it definitely makes a difference to the viewing experience of regular DTS or Dolby audio tracks.

Sometimes I feel as if it's a gimmick though, like the hi-res music these days - I can't tell the difference between 44.1/16 and 192/24 at all really - I guess we just have to discern carefully when making purchases in this space of audiophilia, etc.
'Audiophilia'?! I don't like that word. Conjures up all sorts of wrong images in my head... maybe it's just me :D.

I haven't read any unnecessary criticism. I believe it to be just.

Anyone who's paid for a new component, or even a TV, had it delivered and set up, and then had to go through the arduous task of packing it all back up again and arranging a return over the past twelve to eighteen months due to the HDMI 2.1 debacle has good grounds to be royally peed off.

There's no justification for rushing components to market that aren't capable of what they claim other than greed.
 

ShanePJ

Distinguished Member
AVForums Sponsor
The only real thing that I can see lurking is that Intel and AMD have dropped the protected pathway for playing 4K movies via PCs on their newer CPUs, to stop piracy. This could be one area which gets incorporated into both Blu-ray players and AV Receivers, as I cannot see these tech giants restoring their ability to play 4K or 8K footage in the future without another system being designed to accommodate it.

Apart from that, and the fact that the above will only affect a very small minority of people (including me, which is why I've noticed it), I feel it's a really good time to purchase, with the one exception of the semiconductor shortage holding prices high; otherwise I suspect nobody would hesitate in picking one up.
 
Last edited:

unground

Active Member
I disagree with all those saying that there's not much left to happen - we already know what's next and have a pretty good idea of when.

We didn't need UHD - almost nobody sits so close to the TV that s/he can tell UHD from Full HD. Yet when UHD came, everybody scrambled to upgrade because 1) UHD was seen as the must-have improvement that everybody imagined gave them a better, clearer picture and 2) of fear of missing out.

8K is coming. Like 4K nobody needs it, but it will generate massive sales as everybody rushes to get it for the reasons given above. Since 8K places demands on multiple products, there's a lot of money to be spent on this must-have improvement.
This is plainly nonsense. I've seen the charts that indicate this but I wouldn't be surprised if they're taken out of context. Granted, some of the improvement might come from HDR but the difference is still very clear even from a couple of metres away.
 

Deleted member 901590

Guest
Well, I have a visual acuity of 20-15 right eye and 20-12 left eye, and at 4.9m from a 65" TV I cannot tell the definition difference between 1080p and 2160p.
For me it's about the HDR.
 

unground

Active Member
Well, I have a visual acuity of 20-15 right eye and 20-12 left eye, and at 4.9m from a 65" TV I cannot tell the definition difference between 1080p and 2160p.
For me it's about the HDR.
Fair enough. And to balance my earlier statement, I'm well-prepared (in the absence of blind testing) to concede that the benefit comes from HDR. It's a big benefit though, so I would like to see some proper up-to-date testing.

Of course testing can go both ways. For example, no way on earth I can tell the difference between Spotify and FLAC when volumes are matched.
 

CarMad

Member
Maybe there is more innovation required on the amp side. For years it's been tiny incremental changes, and many of us have had to resort to power amps to get what we need.

Maybe if there were some genuine Class D amplifiers from the Denon et al. stable, instead of what we have today, it could move the game on: much lower power consumption most of the time, and more power to each channel when you need it.

If the processing and everything else is starting to steady and become ubiquitous (in 25+ years of buying hi-fi I don't think so), then maybe that is where they can innovate?
 

Mark.Yudkin

Distinguished Member
This is plainly nonsense. I've seen the charts that indicate this but I wouldn't be surprised if they're taken out of context. Granted, some of the improvement might come from HDR but the difference is still very clear even from a couple of metres away.
You might wish to Google ocular resolution and take a look at Visual acuity - Wikipedia for a basic explanation of how humans see fine detail. As can be seen, this works out to about one arc minute.

Here's a popular chart that does the trigonometry for you, converting into units of TV size and viewing distance.

[Chart: resolution benefit by screen size and viewing distance]

So for my 55" UHD TV screen, I'd have to sit ca. 2m from the screen to be able to see any difference (if my vision were perfect). I sit around 5m from the TV, but I still own a UHD TV and buy UHD discs.
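The chart's arithmetic is easy to check yourself. Here's a minimal sketch in Python, assuming the one-arc-minute acuity figure, a 16:9 panel, and a simple one-pixel-per-arc-minute criterion (the function name and that criterion are my own framing, not taken from the chart):

```python
import math

def max_resolving_distance_m(diagonal_in, horizontal_px,
                             aspect=16 / 9, acuity_arcmin=1.0):
    """Farthest distance (metres) at which an eye with the given
    acuity can still separate individual pixels."""
    # Screen width from the diagonal and aspect ratio.
    width_in = diagonal_in * aspect / math.sqrt(aspect ** 2 + 1)
    pixel_pitch_m = (width_in / horizontal_px) * 0.0254
    # Distance at which one pixel subtends the acuity angle.
    return pixel_pitch_m / math.tan(math.radians(acuity_arcmin / 60))

# 55" panel: Full HD pixels blur together beyond roughly 2.2 m,
# so UHD's extra detail is only resolvable closer than that.
print(round(max_resolving_distance_m(55, 1920), 2))  # 2.18
print(round(max_resolving_distance_m(55, 3840), 2))  # 1.09
```

Under these assumptions the numbers land right where the chart puts them: roughly 2 m for a 55" screen before UHD buys you anything.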
 

Deleted member 901590

Guest
You might wish to Google ocular resolution and take a look at Visual acuity - Wikipedia for a basic explanation of how humans see fine detail. As can be seen, this works out to about one arc minute.

Here's a popular chart that does the trigonometry for you, converting into units of TV size and viewing distance.

[Chart: resolution benefit by screen size and viewing distance]

So for my 55" UHD TV screen, I'd have to sit ca. 2m from the screen to be able to see any difference (if my vision were perfect). I sit around 5m from the TV, but I still own a UHD TV and buy UHD discs.

Is that based on 20-20 vision?
 

Krobar

Well-known Member
I think it is telling that nearly all of the discussion in this thread is about HDMI video and the receiver passing through that video.
 

Mark.Yudkin

Distinguished Member
Is that based on 20-20 vision?
Please review the Wikipedia article on visual acuity that I referenced above. "In the ideal eye, the image of a diffraction grating can subtend 0.5 micrometre on the retina."

20-20 vision is what your optometrist uses to compare your vision against a reference standard at 20 feet using standard charts. It represents a target for corrective glasses.
 
Last edited:

Cevolution

Banned
You might wish to Google ocular resolution and take a look at Visual acuity - Wikipedia for a basic explanation of how humans see fine detail. As can be seen, this works out to about one arc minute.

Here's a popular chart that does the trigonometry for you, converting into units of TV size and viewing distance.

[Chart: resolution benefit by screen size and viewing distance]

So for my 55" UHD TV screen, I'd have to sit ca. 2m from the screen to be able to see any difference (if my vision were perfect). I sit around 5m from the TV, but I still own a UHD TV and buy UHD discs.

So are you simply an armchair-expert Google monkey, or are your comments based on practical experience which you can back up with specifics from first-hand evidence, putting your own name and credibility on the line? That particular “popular” chart you have used has been around for quite some time (it's certainly dated - the first time I saw it was more than 7 years ago, maybe even closer to 10+), and over the years I have witnessed it being proven inaccurate numerous times on other AV sites, such as AVSforum and Blu-ray.com.
 
Last edited:

eliotcole

Active Member
I mean ... simply putting decent Kodi hardware in, like an N2+, should be the bare minimum amp manufacturers are doing, to my mind.

Imagine how much better their interfaces would be running on a fuller Linux build like that?
- Built in video streaming Apps for the major streaming services
 

Deleted member 901590

Guest
Please review the Wikipedia article on visual acuity that I referenced above. "In the ideal eye, the image of a diffraction grating can subtend 0.5 micrometre on the retina."

20-20 vision is what your optometrist uses to compare your vision against a reference standard at 20 feet using standard charts. It represents a target for corrective glasses.

I know how the system works, hence how I've quoted my visual acuity above. 😊
 

unground

Active Member
You might wish to Google ocular resolution and take a look at Visual acuity - Wikipedia for a basic explanation of how humans see fine detail. As can be seen, this works out to about one arc minute.

Here's a popular chart that does the trigonometry for you, converting into units of TV size and viewing distance.

[Chart: resolution benefit by screen size and viewing distance]

So for my 55" UHD TV screen, I'd have to sit ca. 2m from the screen to be able to see any difference (if my vision were perfect). I sit around 5m from the TV, but I still own a UHD TV and buy UHD discs.

Yes, I'm sure like many people I've seen this chart a few times and always thought it didn't seem right. I thought I'd go back to the source and look at how it was compiled.

I've learned, very slowly, that data presented by lay authors and/or read by lay readers (of which I am one for this topic) is often mis-represented/misunderstood, sometimes innocently, often not. And sometimes there are problems with the source data itself, e.g. error in methodology, data interpreted incorrectly, small sample size.

In this case several commenters say the data has been interpreted incorrectly leading to a calculation error - so the chart is wrong.

Quoting one of several comments on the article by readers claiming expert knowledge (and citing relevant articles) "...the viewing distance diagram above is wrong by a factor of two. This means that the optimum distance for matching the resolutions are twice as large."

If I understand the comments correctly, it means you'd need to sit 4m or closer to see a difference. Seems more like it to me.

Input welcome from a suitably qualified forum member.
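For what it's worth, the factor-of-two claim those commenters make is easy to model: if the eye needs one arc minute per resolvable line pair (two pixels, the Nyquist criterion) rather than per pixel, every distance on the chart simply doubles. A quick sketch under that assumption (the function and parameter names are mine, for illustration only):

```python
import math

def max_useful_distance_m(diagonal_in, horizontal_px,
                          aspect=16 / 9, px_per_cycle=2,
                          acuity_arcmin=1.0):
    """Distance beyond which finer pixels stop being resolvable,
    counting px_per_cycle pixels per resolvable line pair."""
    width_in = diagonal_in * aspect / math.sqrt(aspect ** 2 + 1)
    cycle_m = px_per_cycle * (width_in / horizontal_px) * 0.0254
    return cycle_m / math.tan(math.radians(acuity_arcmin / 60))

# 55" Full HD: ~2.2 m under the per-pixel criterion,
# ~4.4 m under the per-line-pair (Nyquist) criterion.
print(round(max_useful_distance_m(55, 1920, px_per_cycle=1), 1))  # 2.2
print(round(max_useful_distance_m(55, 1920), 1))                  # 4.4
```

That doubling reproduces the "4m or closer" figure quoted above; which criterion actually matches human perception is exactly the question still being argued.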
 

Nutty667

Active Member
You might wish to Google ocular resolution and take a look at Visual acuity - Wikipedia for a basic explanation of how humans see fine detail. As can be seen, this works out to about one arc minute.

Here's a popular chart that does the trigonometry for you, converting into units of TV size and viewing distance.

[Chart: resolution benefit by screen size and viewing distance]

So for my 55" UHD TV screen, I'd have to sit ca. 2m from the screen to be able to see any difference (if my vision were perfect). I sit around 5m from the TV, but I still own a UHD TV and buy UHD discs.
Whilst I don't dispute the principle, I dispute the values here. My viewing distance is 3m (measured with a laser measure held to my head) and I can clearly see a noticeable increase in detail on 4k stuff on my 55" TV.
I don't wear glasses, but my eyesight is nowhere near perfect.
 

RMP888

Member
I do actually remember that story of the guy getting his own supply.

Inspired by that link, I just rang British Gas and asked for my own pole... the woman put the phone down.
Don't laugh, but I am currently living in the Philippines and some years ago had to consider my own dedicated transformer, power poles and power lines for clean power (not cheap). To cut a long story short, the power company added one much-improved transformer and I use a dedicated line from that. It certainly improved the lighting / air-con stability a lot - even the electric fan speed used to vary in the past, but not any more.
 
