NullDevice wrote: Bitrate does make a difference for noise floor and reproducible dynamic range.
I'm sure you meant bit depth rather than rate. I did some googling to get my facts straight. Music stored on the very best analog tape had a dynamic range of about 60 dB. Did anybody ever complain about the sound quality of music on quality reel-to-reel tape?
You get about 6 dB of dynamic range for each bit. So storing music in 16 bits gives you about 96 dB of dynamic range. The maximum dynamic range ever found in music is around 90 dB - and that's really pushing it.
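If anyone wants to check the arithmetic, here's a quick sketch. The 6 dB/bit figure is really 20*log10(2), about 6.02 dB, since each extra bit doubles the number of quantization levels:

```python
import math

# Ideal linear PCM gains ~6.02 dB of dynamic range per bit: 20 * log10(2).
DB_PER_BIT = 20 * math.log10(2)

def dynamic_range_db(bits):
    """Theoretical dynamic range of an ideal n-bit quantizer, in dB."""
    return bits * DB_PER_BIT

for bits in (12, 16, 20, 24):
    print(f"{bits} bits -> {dynamic_range_db(bits):.1f} dB")
# 12 bits -> 72.2 dB
# 16 bits -> 96.3 dB
# 20 bits -> 120.4 dB
# 24 bits -> 144.5 dB
```

(That's the idealized quantization-noise figure; real converters come in a bit below it.)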
I used to be a software programmer in an audiology research lab. The standard for representing sound was 12 bits. 72 dB was plenty for all the experiments. Then as digital hardware got better and cheaper, 16 bits became the standard, with 96 dB being overkill.
The reason why 20-bit or 24-bit digital systems might offer something is in the sound processing. Every processing step rounds, and extra bits keep that accumulated rounding error buried below the noise floor.
When you finally present the sound, it's inconceivable that more bits get you anything. 20 bits takes you up to the threshold of pain (120 dB) for people with the most excellent hearing.
NullDevice wrote: All sampling rate increases do is shove the frequency of aliasing/interpolation noise to a higher frequency. Given the human ear is limited at best at 20 kHz, it's pretty ludicrous.
Right. And an older fart like me barely hears anything above 10 kHz, frankly.
Higher sample rates than the CD standard are useful for cat hearing. But human hearing - no way. (I guess somebody might consider the sharpness/quality of the analog anti-alias filters at the output, but 44.1 kHz allows plenty of fudge.)
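To put a number on that "fudge": the filter only has to roll off between the top of human hearing and the Nyquist frequency (half the sample rate). A rough sketch, taking 20 kHz as the audible limit:

```python
def transition_band_hz(sample_rate_hz, audible_limit_hz=20_000):
    """Room between the audible band and Nyquist for the anti-alias
    filter to roll off, in Hz (ideal-sampling view)."""
    nyquist = sample_rate_hz / 2
    return nyquist - audible_limit_hz

print(transition_band_hz(44_100))  # 2050.0 Hz of filter headroom at CD rate
print(transition_band_hz(96_000))  # 28000.0 Hz at 96 kHz
```

So 96 kHz buys you a much gentler filter slope, which matters for filter design, not for what anybody can hear.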