Signal Attenuation and Sound Resolution



BAS-H
08-04-2011, 11:12 AM
I picked this topic up from another thread and was invited to start a new one on it.

Apparently, attenuating a signal between source and amp reduces sound resolution to the tune of 1 bit of resolution per 6dB of attenuation. I've never come across this before and am interested in any explanations or further information people can offer.

For info, I attenuate my CD player's signal (because the old amp's input sensitivity is too low).

Thanks.

kittykat
08-04-2011, 02:01 PM
Hi Ben

You mention attenuating the CD signal because the amp's sensitivity is low. I would have thought it might be the other way around, i.e. because the amp's sensitivity is high, you turn down the CD player's output.

In any case, the 1 bit per 6dB thing is only relevant in the digital domain as I understand it, e.g. when you use the attenuation capability built into DACs. Even then, you won't miss "what was there" in the first place if you had turned the volume down that much: you can't hear what you wouldn't have heard anyway.
In the analogue domain you mention, i.e. CD player into amp, you might be up against the circuit's noise floor. With too much attenuation, the noise might rise above the signal. It is extremely unlikely, though, that you would attenuate (turn down) your CD player's signal so much that you start hearing hiss (e.g. from the amp). I'd say it's a non-issue.
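
If it helps to picture what that digital-domain attenuation does, here is a rough sketch in Python (my own toy example, not how any particular DAC actually implements it):

samples = list(range(8))                          # pretend 3-bit samples: codes 0..7
attenuated = [int(s * 0.5) for s in samples]      # -6 dB (halve the value), then re-quantise to integers
print(samples)       # [0, 1, 2, 3, 4, 5, 6, 7]
print(attenuated)    # [0, 0, 1, 1, 2, 2, 3, 3]  <- pairs of input codes collapse into one

Halving every sample merges adjacent codes, which is exactly "one bit gone" for each 6dB step.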

kk

STHLS5
08-04-2011, 02:36 PM
Hi Zanko,

I am not sure of the sensitivity of the other speakers, but given your Densen amp's digital volume control, it is good that these speakers need less attenuation than the others, because for every 6dB of attenuation you lose about 1 bit of resolution...

It is known that a digital volume control loses about 1 bit of resolution for every 6dB of attenuation. Many articles on this can be found on the net, such as this one (http://www.tomshardware.com/forum/40637-6-digital-volume-control). However, before you become too concerned about losing resolution through the need to attenuate by 10dB, you should read Alan's post (http://www.harbeth.co.uk/usergroup/showthread.php?825-Resolution-our-senses-and-loudspeakers-...) where he said that speakers are "probably somewhere about 11/12 bit". I was just trying to make the point to the previous member that he shouldn't be too worried about the need to push his volume control higher to get the same loudness as with his previous speakers.
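
Just to put rough numbers on it, here is a small back-of-the-envelope sketch (my own, simply using the usual 20*log10 voltage definition of a dB) showing what 10dB of attenuation costs next to the ~11/12 bits Alan mentions for speakers:

import math

db_per_bit = 20 * math.log10(2)            # ~6.02 dB of voltage range per bit
for atten_db in (6, 10, 12):
    bits_lost = atten_db / db_per_bit
    print(f"{atten_db} dB attenuation costs ~{bits_lost:.1f} bits, "
          f"leaving ~{16 - bits_lost:.1f} of a 16-bit word")

Even after 10dB you still have roughly 14 bits, comfortably above the 11/12 bits the speaker can resolve.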

ST

A.S.
09-04-2011, 10:46 AM
STHLS5 wrote: "It is known that digital volume control reduces about 1 bit resolution for every 6dB attenuation..." Just to add to this.

When we say +6dB we mean a doubling of voltage, and -6dB a halving of voltage.

The simplest binary system has just two possible states: on and off. We call this a 'one bit system'. If we add another bit to make it a two-bit system, there are four possible states, from off/off to on/on. Each time we add another bit we double the number of possible binary states available between all-off and all-on. So there is a direct correspondence between the doubling of voltage that the 6dB shorthand describes and the addition of each extra bit in a digital system.
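
To put figures on that correspondence, a quick sketch (just an illustration, nothing more):

import math

print(f"doubling a voltage adds {20 * math.log10(2):.2f} dB")    # the '6dB' shorthand
for bits in range(1, 5):
    states = 2 ** bits                      # each extra bit doubles the number of states
    print(f"{bits} bit(s): {states:2d} states, spanning "
          f"{20 * math.log10(states):5.1f} dB of voltage range")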

Pluto, who I'm pleased to report is now out of hospital and ticking with the precision of a Harrison, can confirm the finer details, as he has to contend with this situation every day, but this voltage-versus-bits comparison is highly pertinent in professional audio mastering and broadcasting. To give one specific example:

When the BBC made the decision to move from analogue tape storage to digital (i.e. DAT at the time), they purchased DAT machines and hid them in the equipment racks, out of sight of the sound engineers, something they couldn't do with wardrobe-sized analogue tape machines. With the front panels of the DAT machines out of sight, so were their fancy bar-graph signal display meters, which was in fact deliberate. Since about 1940 the BBC - and most, if not all, European broadcasters, but not North American ones - had mixing desks fitted with peak programme meters, not VU meters. The printed scales, and the relationship they had with the actual voltages passing through the desk, varied a little from country to country in Europe. This didn't matter provided that, when recordings were passed from one European Broadcasting Union country to another, there was an unambiguous method of calibrating the signal levels. And as you know, in a digital system there is a very definite and fixed maximum signal level beyond which there is total distortion. You can read about loudness metering here (http://en.wikipedia.org/wiki/Peak_programme_meter). It's a very important issue in pro audio.

The issue was how to calibrate the analogue PPMs that generations of (analogue-era) BBC engineers had relied upon to tell them the signal level, now that this new-fangled digital recorder was to be used.

(More later, but to get you thinking... record at a nominal level of -18dB on a 16-bit DAT machine and what resolution are you recording at?)

A.S.
10-04-2011, 11:23 AM
STHLS5 wrote: "My answer would be 13 bits, but then I am maybe wrong because 1 bit resolution affects the dynamic range and therefore, it may depend on the recording dynamic range." Correct. Note that I said "record at a nominal level...".
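
For anyone who wants the arithmetic behind that answer spelled out, a quick sketch (using the usual 20*log10 voltage convention):

import math

word_bits = 16
headroom_db = 18                                         # nominal level sits 18dB below full scale
bits_in_reserve = headroom_db / (20 * math.log10(2))     # ~3 bits kept as headroom
print(f"~{bits_in_reserve:.1f} bits held in reserve, "
      f"~{word_bits - bits_in_reserve:.1f} bits left for the nominal level")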

We know that digital systems, unlike analogue tape, are completely incapable of recording any signal above full scale. Analogue tape has an inbuilt ferro-magnetic compressor, and the signal can be rammed onto the tape significantly beyond its optimum design capabilities. Yes, the distortion will rise (dramatically), but as the signal is louder, the distortion tends to be swamped. That recording approach is entirely inappropriate for digital recording, where every precaution must be taken to avoid even a hint of clipping and the resulting total distortion.

There are two compounding problems in broadcasting that wouldn't be an issue for commercial recording (for CD) engineers. First, broadcasting is significantly about capturing live, one-off events that cannot be repeated. Yes, an orchestra could be re-booked, but at what enormous cost? Second, unlike the CD recording engineer, who can assume that his customers are listening on fancy hifi gear, the broadcast sound engineer is aware that very few of his listeners are listening on hifi separates, and almost none on high-quality systems. His or her listeners are likely to be listening on transistor radios, in a noisy environment. This means that there is an awareness that the signal level must not drop too low (it becomes inaudible under the listener's ambient noise) nor rise too high (it could clip the digital recorder and/or damage the loudspeaker in the tinny tranny radio). So managing the dynamic range for broadcast is a constant, second-by-second battle for the sound engineer - and somewhat automated now (http://www.youtube.com/watch?v=6lLSOftvzZA&feature=player_embedded#at=42). Most FM stations in the UK (probably all of the pop music ones) automate signal dynamics to 'punch' as loud a signal as they can - here is an example of a station manager's dream punch (http://www.youtube.com/watch?v=ReC2IAcns1Y&feature=related).

We can look at dynamic compression another time. I just want to focus on the recording engineer's need to guarantee, 100%, for every live orchestral broadcast (and digital recording for archival/CD release purposes), no matter how dynamic the performance or score, that a digital clip will never occur. This means that there must be enough headroom (safety margin) so that even in the worst case (the analogue PPMs under-reading the true loudness peak, plus the performers and/or audience being especially energetic, plus the demands of the score itself, plus another little margin for error) the digital full-scale recording threshold will not be reached, ever. And that implies that we have to set our 'nominal' level far below the full potential of the digital recording medium, or in simple language, we cannot use the full (16) bit resolution we theoretically have available.
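
Purely as an illustration of how such a headroom budget might stack up (the individual figures below are invented for the example, not real BBC allocations):

import math

db_per_bit = 20 * math.log10(2)     # ~6.02 dB per bit

margins_db = {                      # invented, purely illustrative figures
    "PPM under-reading of true peaks": 4,
    "energetic performers/audience":   8,
    "demands of the score":            4,
    "extra margin for error":          2,
}
headroom_db = sum(margins_db.values())
print(f"total headroom: {headroom_db} dB, "
      f"i.e. ~{headroom_db / db_per_bit:.1f} of the 16 bits held in reserve")

Invented or not, the point is that the margins add up quickly, and every ~6dB of them is a bit you can no longer use.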

In the BBC's case, I believe the digital 'nominal' level is set 18dB below 100% full scale, and this level would cause a BBC PPM needle to touch the PPM4 marker. Here is an example of a BBC PPM where the sound engineer has set the loudness to peak at PPM5 (http://www.youtube.com/watch?v=4xxKB6p8RMs). Note carefully how the peaks cause the white needle to accelerate fast - that's because this is a peak programme meter, not an average level meter (VU meter*). Strange, isn't it, how non-intuitive the peak loudness is: would you be able to predict that certain words had such high-level energy? Look out for 'horse' and 'oil'. In the last few seconds you can see the well-controlled decay to zero and the hardware-accelerated flick-up, all tightly and precisely defined in the PPM specification.

Metering is a really important subject (http://www.chromatec.com/pdf/WorkingWithAudioMetersV1_0.pdf). If the sound engineer is not certain about the signal level, he is recording blind. More on metering in digital audio workstations here (http://www.youtube.com/watch?v=lQTxSyK-ATI). And to show you that levels are not absolute, here is the alignment of a tape recorder's replay levels (http://www.youtube.com/watch?v=JW5kifLh598&feature=related).

*The VU meter is widely used in consumer audio, and also in broadcasting outside Europe (for example, the USA), where the PPM is unheard of. There is a technical specification for a true VU average loudness meter, but consumer-grade equipment doesn't adhere to the spec. So if you record music using a 'VU' meter, you have no idea whatsoever of the true peak signals. Not a problem for analogue tape, but a clipping nightmare for digital recording.
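
As a crude illustration (my own toy signal, nothing to do with either meter specification) of why an average-reading meter can hide a peak that would worry a digital recorder:

import math

signal = [0.1] * 990 + [0.95] * 10                     # mostly quiet, with one brief transient
average = sum(abs(x) for x in signal) / len(signal)    # roughly what a VU-style meter follows
peak = max(abs(x) for x in signal)                     # what actually threatens digital clipping

print(f"average level: {20 * math.log10(average):6.1f} dBFS")
print(f"true peak:     {20 * math.log10(peak):6.1f} dBFS")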

Compare the expensive, electronics-enabled PPM display (http://www.youtube.com/watch?v=4xxKB6p8RMs), with its defined peak and well-controlled fallback, against a bog-standard consumer VU meter (http://www.youtube.com/watch?v=U2M8U9vISOQ&feature=fvst).

(More later)