#### Stereophile

##### New member

- Joined
- Apr 19, 2013

- Messages
- 442

- Thread Author
- #1

<blockquote>

The most straightforward way of encoding an analog signal as a Pulse-Code-Modulated (PCM) digital datastream is to use an A/D converter operating at the sampling frequency that puts out digital words of the desired length. If you are talking about the CD's 16-bit/44.1kHz data, an ADC samples the analog signal 44,100 times every second, each time describing the instantaneous signal amplitude to the nearest one of 65,536 voltage levels (2 to the 16th power). This is, in fact, how digital audio recordings were made up to the mid-'80s. But the complexity of the ADC increases almost exponentially with the number of bits required, and the critical demands made on the analog antialiasing filter needed to eliminate every trace of signal above half the sample rate are extreme. A different A/D paradigm was required to achieve resolution greater than 16 bits and to achieve more accurate 16-bit resolution at lower cost and circuit complexity. . .
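The 16-bit quantization described above can be sketched in a few lines of Python. This is an illustrative toy, not an ADC implementation: the function name, the scaling to the signed 16-bit range, and the 1kHz test tone are all assumptions chosen for the example.

```python
import math

def quantize_16bit(x):
    """Map an analog sample in [-1.0, 1.0] to one of 2**16 = 65,536
    integer levels, as a 16-bit PCM ADC would (illustrative sketch)."""
    level = int(round(x * 32767))          # scale to the signed 16-bit range
    return max(-32768, min(32767, level))  # clamp to representable levels

# Sample a 1kHz sine at the CD rate of 44,100 samples per second.
FS = 44_100
samples = [quantize_16bit(math.sin(2 * math.pi * 1000 * n / FS))
           for n in range(FS // 1000)]    # one cycle of the tone
```

Each entry of `samples` is one of the 65,536 levels, produced 44,100 times per second, which is exactly the CD datastream the article describes.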

<p>

Instead of trying to attain higher resolution by increasing the number of bits, it was thought: why not increase the sample rate instead? In the limit, if you increase the sample rate to a sufficiently high frequency, you can use a 1-bit quantizer: a simple voltage comparator that outputs a "1" if the analog signal level is higher than it was at the previous sampling moment, or a "0" if it is lower. Because this "delta modulation" technique uses a sample rate very much higher than the baseband audio signal, the requirements for a "brickwall" analog antialiasing filter on the ADC's input can be relaxed. You can then either feed the high-rate pulse stream to a simple low-pass filter to reconstruct the analog original, or you can use a low-pass digital filter to "decimate" the low-resolution, high-sample-rate data to derive the desired multi-bit, low-sample-rate data. . .
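The delta-modulation idea in the paragraph above can be sketched as a pair of Python functions. This is a minimal sketch under assumed parameters (the fixed step size and function names are not from the article): the encoder tracks the signal with a running estimate and emits one bit per sample, and the decoder simply integrates the bit stream, which is the digital analogue of feeding the pulse train to a simple low-pass filter.

```python
def delta_modulate(signal, step=0.05):
    """1-bit delta modulation: emit 1 when the input is above the
    tracked estimate, 0 when below; the estimate moves by a fixed step.
    Step size is an assumption for illustration."""
    bits, estimate = [], 0.0
    for x in signal:
        bit = 1 if x > estimate else 0
        estimate += step if bit else -step
        bits.append(bit)
    return bits

def delta_demodulate(bits, step=0.05):
    """Reconstruct by integrating the bit stream; each 1 steps the
    output up, each 0 steps it down."""
    out, estimate = [], 0.0
    for bit in bits:
        estimate += step if bit else -step
        out.append(estimate)
    return out
```

As long as the signal changes slowly relative to the step size times the sample rate, the reconstruction stays within about one step of the input, which is why the technique needs a sample rate very much higher than the audio baseband.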

</p><p>

The elegance of the idea behind DSD is that this decimation filter can be eliminated. Why not, Sony's engineers thought, just store the output of a 7th-order noise-shaped delta-sigma modulator running at a very high frequency (in DSD's case, 2.8224MHz, or 64 x 44.1kHz) on an appropriate medium? For playback, this datastream could be fed, in theory at least, to a D/A converter consisting of just a simple low-pass filter.
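To make the noise-shaping idea concrete, here is a first-order delta-sigma modulator in Python. Note the hedge: DSD uses a 7th-order modulator as the article says; this first-order sketch (with assumed names and a ±1.0 feedback level) only illustrates the principle that an integrator in a feedback loop forces the average of the 1-bit output to track the input.

```python
def sigma_delta_1bit(signal):
    """First-order delta-sigma modulator (illustrative sketch only;
    DSD's actual modulator is 7th-order, running at 2.8224MHz).
    The integrator accumulates the error between the input and the
    fed-back 1-bit output, pushing quantization noise to high
    frequencies where a low-pass filter can remove it."""
    bits, integrator, feedback = [], 0.0, 0.0
    for x in signal:
        integrator += x - feedback            # accumulate the error
        bit = 1 if integrator >= 0.0 else 0   # 1-bit quantizer
        feedback = 1.0 if bit else -1.0       # feed the decision back
        bits.append(bit)
    return bits
```

For a constant input of 0.5, the loop settles into a pattern whose density of 1s averages to the input level, so a simple low-pass filter over the bit stream recovers the signal, which is exactly the playback scheme the article describes.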

</p><p>

The use of such a high sampling frequency would mean the ADC's analog antialiasing filter needn't be a brickwall type but could instead be a sonically benign low-order type; linearity would inherently be excellent; there would be no digital decimation filter, whose necessary mathematical approximations in either the A/D or D/A conversion could reintroduce PCM quantization noise or time-domain dispersion problems; and there would be no multi-bit DAC, with its possible performance compromises. . .

</p>

</blockquote>

[Source: http://www.stereophile.com/content/news-about-dsd]