Post by welder on Jul 25, 2014 14:07:43 GMT
There is a lot of information around about the reported benefits of higher bit-depth DACs, along with, of course, the endless debates about the audible improvements from higher bit-depth source material.
When I moved from vinyl to file-based audio some years ago, I put some effort into trying to understand the various DAC chips available and what their specifications meant. Testing by measurement has largely fallen out of favour with the audiophile community, so if you want to know anything beyond what the manufacturers advertise, a bit of research is needed.
One of the most often quoted specifications is the bit depth of the DAC chip; this, however, only states what the chip can receive. In my more cynical moments I'd say that what is not so well publicised, and far more important, is the effective resolution of the chip.
Just because a particular DAC can accept, say, 24-bit input doesn't mean that's what comes out the other end. In fact, there are a great many chips that will happily accept 24 bits but output less than half that; my old HRT 11, for example: 24 bits in and only 11 bits out, I believe.
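To make that "24 bits in, 11 bits out" idea concrete, here's a rough sketch (my own illustration, not how any real chip actually works internally): if a converter accepts 24-bit words but only resolves the top 11 bits, everything living in the lower 13 bits is effectively thrown away.

```python
# Hypothetical illustration: a DAC that accepts 24-bit words but only
# resolves 11 bits effectively discards the bottom 13 bits of each sample.
def effective_output(sample_24bit, resolved_bits=11):
    discard = 24 - resolved_bits           # 13 bits of detail lost
    return (sample_24bit >> discard) << discard

loud = 0x7FFFFF    # loudest positive 24-bit sample
quiet = 0x000FFF   # low-level detail living entirely in the bottom bits

print(hex(effective_output(loud)))    # 0x7fe000 -> top bits survive
print(hex(effective_output(quiet)))   # 0x0      -> quiet detail vanishes
```

The loud sample comes through nearly intact, but the quiet detail (the sort of thing extra bit depth is supposed to buy you) disappears entirely.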
Strangely, lots of people report vast sonic improvements when moving from 16-bit to 24-bit files, even though their DAC chip can only output (resolve), say, 10 bits. What's going on here then?
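For anyone who wants numbers: the usual rule of thumb for an ideal converter is about 6.02 dB of dynamic range per bit (plus 1.76 dB), so you can sketch what different resolutions actually deliver:

```python
# Rule-of-thumb theoretical dynamic range of an ideal N-bit converter:
# roughly 6.02 * N + 1.76 dB (quantisation noise limit, ideal case only).
def dynamic_range_db(bits):
    return 6.02 * bits + 1.76

for bits in (10, 11, 16, 20, 24):
    print(f"{bits:2d} bits -> ~{dynamic_range_db(bits):.1f} dB")
```

On that basis a chip resolving 10 or 11 bits tops out well under 70 dB, nowhere near the roughly 98 dB a 16-bit file can carry, let alone 24-bit material, which is why the bits-in figure alone tells you so little.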
I bought my Benchmark, which apparently resolves 20 bits, pretty much entirely on its specifications (okay, I did get some heavy endorsements from a couple of my friends), and I believe this extra resolution is audible.
I’ve read with interest where some kit-building enthusiasts have written that it’s the output stage that makes all the difference, but the best output stage in the world can’t add to the DAC’s resolution.
So, what can your DAC resolve, and do you think some of the advertising of high bit-depth DACs is essentially misleading when they can’t output what goes in?!