– Guest blog by Audio Dev Academy
The resolution of a sampled value
Getting proper advice on choosing the best audio bit depth for recording music can be tricky, just like getting advice on choosing the best samplerate. But when producing music in the 21st century, you can’t really do without them. Both bit depth and samplerate are inherently ‘digital’ concepts and somewhat obscure once you get into the fine details. In my last short blog, Create Your Own VST Plugins, Part 2: Samplerate Explained, I defined the samplerate of an audio file as its resolution over time, because it gives you the number of sampled values used to represent one second of audio. Now, because we are in the digital domain, these values themselves also have limited resolution, determined by the bit depth of the file. Therefore, the bit depth of an audio file can be defined as the resolution of the sample values that make up the audio file.
Tiny little steps
As a little thought experiment, imagine a slider you can move up and down. In the real world you can move the slider in a continuous fashion. However, in the digital domain you are forced to move it in tiny little steps. If the steps are small enough you might not notice them at all, but if they get bigger you will start to feel the slider snapping from one step to the next. In the digital domain, all numbers snap to fixed values predetermined by the number of bits used to represent them. This also goes for the values that are sampled when recording audio. The higher the bit depth of your recording, the more accurate the values that determine the amplitude of your sound wave. For perspective, a sample recorded at 16-bit resolution can contain any one of 65,536 unique values: 2 to the power of 16. And a sample recorded at 24-bit resolution can contain any one of 16,777,216 unique values: 2 to the power of 24.
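To make the slider analogy concrete, here is a small sketch of that snapping in Python. The function name and the assumption that samples live in the range -1.0 to +1.0 are mine, not from the blog – it just shows how a continuous value gets rounded to the nearest of the 2-to-the-power-of-bit-depth steps:

```python
def quantize(sample: float, bit_depth: int) -> float:
    """Snap a sample in [-1.0, 1.0] to the nearest step of the given bit depth."""
    levels = 2 ** bit_depth   # e.g. 65,536 levels at 16-bit
    step = 2.0 / levels       # the size of one 'tiny little step'
    return round(sample / step) * step

print(2 ** 16)                # 65536 unique values at 16-bit
print(2 ** 24)                # 16777216 unique values at 24-bit
print(quantize(0.3, 16))     # 0.3 snapped to the nearest 16-bit step
print(quantize(0.3, 24))     # a slightly different, more accurate snap
```

Notice that the same input value snaps to slightly different results at 16-bit and 24-bit: the 24-bit step is 256 times smaller, so the snapped value lands closer to the original.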
Bit depth and dynamic range
If you apply a bitcrusher plugin to your audio track, you can hear the degradation that occurs when gradually lowering the bit depth. At first you might not hear the difference. After a while though, you will start to notice your track becoming more noisy and gritty, until it starts to distort and finally falls apart sonically. An interesting fact is that you will get the same degradation by gradually lowering your track’s volume – even though the signal might get too low in volume for you to notice it. This happens because by lowering your track’s volume, you are not utilising the full dynamic range represented by the bit depth, which causes you to use fewer bits to represent the same signal. I’ll show you an experiment to prove this to you soon!
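You can already sketch the idea behind that experiment with a few lines of Python. The function below (my own illustration, not the blog’s experiment) counts how many quantization steps a signal actually spans, assuming full scale is -1.0 to +1.0. A quiet signal spans fewer steps, which is exactly what ‘using fewer bits’ means:

```python
def effective_levels(peak: float, bit_depth: int) -> int:
    """How many quantization steps a signal peaking at `peak` (0..1) can span."""
    step = 2.0 / (2 ** bit_depth)     # size of one quantization step
    return int((2 * peak) / step)     # steps covered between -peak and +peak

full = effective_levels(1.0, 16)        # full-scale 16-bit: all 65,536 steps
quiet = effective_levels(1.0 / 256, 16) # 256 times quieter: only 256 steps left
print(full, quiet)
```

A signal that is 256 times quieter spans only 256 of the 65,536 available steps – the same as a full-scale signal at 8-bit. In other words, turning the volume down by a factor of 256 (roughly 48 dB, at about 6 dB per bit) throws away about 8 bits of resolution.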
Audio Dev Academy is an exciting online environment for like-minded programmers, musicians and audio engineers who want to learn how to code audio plug-ins and virtual instruments – to be launched in 2019. In preparation for the launch, Audio Dev Academy will publish a series of blogs and ebooks about programming and the inner workings of audio plug-ins and virtual instruments. If you want to know more, find Audio Dev Academy on Facebook.