Receiver output power

Crisco

If a receiver has a power rating of, say, 100 watts per channel in 7.1 (specifically the STR-DG820), and you only use 5.1 channels, are the missing 200 watts divided by 5 among the other channels, i.e. 40 watts added to each channel, making it 140 watts at 5.1 with 8 ohm speakers, or do they just go to waste? Thanks in advance.
 

I'm disappointed no one replied to your post, as I'd hoped for a nice clear explanation of the inner workings of the amp myself. I've picked up lots of bits and pieces from other threads, but it would be nice to have them pulled together.

For what it's worth, the rating I believe you are referring to is the minimum RMS power per channel into 8 ohms over a frequency range of 20 Hz to 20 kHz, probably stating the level of THD (total harmonic distortion) at that output. On my Yamaha this is stated as 95W driving all 7 channels. Some manufacturers' claims I have seen quote this figure for just the front two channels, so the figure would be lower if quoted for 5 (or 6 or 7) channels, since the amp has a finite amount of power to distribute.

Anyway, back to your original question: if the figure is for all channels and you are only using two, I don't believe all the 'spare' power would be diverted to the fronts (giving 350W per channel!). I think it more likely the amp puts out the same RMS power to each speaker but is working well within its operating parameters, as opposed to flat out.

Hopefully someone can confirm (or correct) this and give a better explanation - anyway, have a free 'bump' on me.
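
If it helps to see why the "how many channels driven" small print matters, here's a rough Python sketch comparing the totals implied by different headline claims; the figures are just illustrative examples along the lines of the ones above, not measured specs:

```python
# Crude comparison of headline power claims quoted with different numbers
# of channels driven. The entries are illustrative, not real measurements.

specs = [
    {"label": "95 W/ch, all 7 driven",  "w_per_ch": 95,  "channels_driven": 7},
    {"label": "100 W/ch, 2 driven",     "w_per_ch": 100, "channels_driven": 2},
    {"label": "140 W/ch, 1 driven (!)", "w_per_ch": 140, "channels_driven": 1},
]

for spec in specs:
    # The implied total is what the power supply actually has to deliver.
    total = spec["w_per_ch"] * spec["channels_driven"]
    print(f"{spec['label']}: implied total of {total} W")
```

The point being that a big per-channel number quoted with fewer channels driven can actually imply less total power than a modest all-channels-driven figure.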

Chris
 
The specs for that amp are very interesting, as it happens.

It's rated at 85W with a decent THD figure.
Stereo is rated at 100W with a higher, but still decent, THD level.
And it's rated at 140W with a rubbish, but probably acceptable, THD level.

But if you read carefully, it states:

"reference power output for front, centre, surround and surround back speakers.
Depending on the sound field settings and the source, there may be no sound output".

What use is that? :confused:

All pretty meaningless and I'm digressing a little, sorry.

In this case, I'd look at the stereo rating (100W) as a reference.
If that's what it puts out in stereo, you can expect 5.1 surround to be significantly lower and 7.1 to be lower still. The limiting factor is usually the power supply, and if it's nearly flat out in stereo, it makes sense that 5.1 will be nearer 40W per channel in reality (although it's not quite that simple). This applies to most amps, so hopefully it sort of answers the question, in that conversely, if you use two fewer channels, you will get a bit more power to each channel in use. All this assumes each channel is going flat out, but in reality that doesn't happen (just to confuse you a bit more :D).
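
To put some entirely made-up numbers on that, here's a little Python sketch of the shared power supply acting as the bottleneck; the 200W budget and 100W per-channel rating are assumptions for illustration, not Sony's actual figures:

```python
# Toy model of several amp channels sharing one power supply.
# Both numbers below are assumptions for illustration, not Sony's figures.

PSU_BUDGET_W = 200      # assumed total continuous power the supply can deliver
CHANNEL_RATING_W = 100  # assumed limit of each channel's own output stage

def continuous_watts_per_channel(channels_driven: int) -> float:
    """Per-channel power with every driven channel flat out at once.

    Each channel is capped by its own output stage, but the shared
    supply caps the total; whichever limit bites first wins.
    """
    return min(CHANNEL_RATING_W, PSU_BUDGET_W / channels_driven)

for n in (2, 5, 7):
    print(f"{n} channels driven: ~{continuous_watts_per_channel(n):.0f} W each")
```

With those guessed numbers you get 100W each in stereo but only about 40W each in 5.1, which is roughly the picture described above.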

Very misleading, but Sony are not the only ones who do that sort of thing.
Onkyo, for example, quote power with only one channel driven!

A much better indicator is to look at the power going in.
In this case it's 230W, which in all honesty isn't brilliant.
Even then, it's only an indicator though.
I'm not saying it's a poor amp, it's probably about right for the money.
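
For what it's worth, here's the sort of sanity check you can do from that power input figure; the 60% efficiency is my guess at a typical class AB output stage, not anything from the spec sheet:

```python
# Sanity check from the power-input figure: an amp can't put out more
# audio power than it draws from the wall times its efficiency.
# The 60% efficiency is a rough guess for a class AB stage, not a spec.

power_in_w = 230   # quoted power consumption
efficiency = 0.6   # assumed class AB efficiency when driven hard

total_audio_w = power_in_w * efficiency  # ~138 W shared across ALL channels
for channels in (2, 5, 7):
    print(f"{channels} channels flat out: at most ~{total_audio_w / channels:.0f} W each")
```

That's in continuous sine-wave terms; music is dynamic, so short peaks can be higher, which is partly where the bigger headline figures come from.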
 
The short answer is no...

Power rating figures are usually inflated, and just about any amp these days claims to be 100 watts. You have to look carefully at the spec to see whether they mean 100W with just 2 channels driven or whether it can drive more channels at that rating.

The problem is usually with the power supply, i.e. the transformer. It can only provide a certain amount of current, and if all the channels are being driven hard it craters!
The real-world power could be something like 100W with 2 channels driven and only 40 watts with all 7!
This is continuous power, by the way.

That is why a pair of scales to weigh the amplifier is usually a better indication of its power than anything written on paper!
 
Badger/Cliff, thanks for that - can you just confirm my assumption that if power is quoted across all channels (as it is for my DSP-AX763), then running in stereo puts out the same level to the fronts as it does in full surround (broadly speaking, allowing for all other factors), and that the amp is just running at a much less stressed level? I assume that's the case, as if the power increased I'd get a noticeable jump in volume when changing modes.

Chris
 

You're asking too many questions, that have loads of answers :D
I can't answer, because there are too many variables.
It depends what the individual amps can cope with, for one.
Who knows whether the amps themselves are limited or the power supply is the limiting factor?
You would expect stereo mode to be the most efficient, to answer your question.
But you won't necessarily get a jump in volume, because the amp will be set up to give the highest fidelity in stereo mode (bear in mind the THD figures).

It's all about cost and the saying "you get what you pay for" definitely applies with amps, imo, simple as :(
 
That all correlates, and explains the relative increase in cost in going from a claimed 100W per channel 7.1 receiver to a 130W per channel 7.1 receiver.
 
"You're asking too many questions, that have loads of answers :D" - oops, sorry, maybe I should try asking on the AV forum then... :D

"You would expect stereo mode to be the most efficient, to answer your question." - Thanks!


Thanks again Badger, sorry to be a pain in the ass (again) :)

Chris
 
"oops, sorry, maybe I should try asking on the AV forum then..."

Ask away mate. It never hurts, and I could be talking total rubbish :) :smashin:

But you're still looking for an impossible answer.
It's really not clear cut ;)
 

At least I'm not alone then :D

Chris
 
A couple of points that should be cleared up. Firstly, even if there is a power increase when you are in stereo, the normal volume setting does not change, so you would not notice any difference in changing from 2 channels to 7.

At high volumes, distortion will set in sooner when 7 channels are being driven, because in the case of the Sony and most receivers, each channel is powered from the same power supply and transformer.
Different manufacturers specify amplifier power in different ways, and some are more honest than others. As I mentioned, the Sony may be able to deliver 100 watts in stereo but less when the other channels are demanding current from the power supply. Each amplifier channel is designed to deliver 100 watts, but in fact may be able to deliver more if the other channels are silent.
Onkyo are a bit cheeky specifying power for one channel driven - yes, the number is large, but who listens to only one channel?
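
To put a number on why you wouldn't notice a change: perceived level goes roughly as ten times the log of the power ratio, so even sizeable power differences amount to only a few dB (illustrative figures below):

```python
import math

# Why a measurable power difference isn't an obvious volume jump:
# perceived level scales roughly as 10*log10 of the power ratio,
# and the volume control sets the level, not the available power.

def headroom_db(p_low_w: float, p_high_w: float) -> float:
    """Difference in dB between two power levels."""
    return 10 * math.log10(p_high_w / p_low_w)

print(f"100 W vs 40 W: {headroom_db(40, 100):.1f} dB")  # about 4 dB
print(f"100 W vs 29 W: {headroom_db(29, 100):.1f} dB")  # about 5.4 dB
```

So even if stereo had more than double the power available per channel, the difference in maximum level is only about 4 dB, and the volume knob setting itself doesn't change.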
 