So I took the first six seconds of “So Emotional” and looped it eleven times, each time at a lower bit rate. I call it “The Degradation of Whitney in Eleven Stages,” and you can listen to it on SoundCloud. (My apologies to Whitney Houston. This is just intended to demonstrate how lowering the bit rate affects sound quality.) I can’t embed the clip without bringing the whole thing down to a bit rate of 128 kbps, which would kind of defeat the purpose.
Here are some technical notes: The initial sample was extracted from an Apple Lossless MPEG-4 audio file with a bit rate of 787 kbps. The subsequent downsamples were generated with Audacity, then joined together in GarageBand and exported to MPEG-4. Since the export maxed at 320 kbps, the high rate of the original sample is not preserved, and the first two samples should be indistinguishable. The bit rates in sequence are: 787, 320, 250, 200, 180, 128, 80, 40, 24, 16, 8.
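For anyone who wants to reproduce the experiment without clicking through Audacity and GarageBand by hand, here is a hypothetical sketch of the same staged downsampling using ffmpeg. The input filename `so-emotional.m4a` is my assumption, not the actual file, and this echoes the commands as a dry run (drop the `echo` to actually encode):

```shell
# Sketch: generate one re-encoded file per bit-rate stage from the post.
# The 787 kbps lossless original can't be recreated by re-encoding, so
# the loop starts at 320. Remove "echo" to run the commands for real.
for rate in 320 250 200 180 128 80 40 24 16 8; do
  echo ffmpeg -i so-emotional.m4a -c:a aac -b:a "${rate}k" "stage-${rate}.m4a"
done
```

The resulting `stage-*.m4a` files could then be concatenated in any editor, much as the post describes doing in GarageBand.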
The degradation doesn’t register to my ears until the bit rate hits 80 kbps. What’s truly bizarre is that at the lowest bit rate I find I actually like the sound better than the original.
And so here’s the full “So Emotional” downsampled to eight kilobits per second. Again, apologies to Whitney. This is merely intended to demonstrate the audio effects of an extremely low bit rate. I think it sounds cool. Your mileage may vary.
I realized in retrospect that the extreme effect of this process isn’t solely a result of exporting at a low bit rate. It also has to do with sample rate. The lower bit rates don’t support the 44.1 kHz sample rate of the original, so I had to resample the music. I chose a rate of 22.05 kHz and got the results posted above. But when I did it again and chose the lowest sample rate of 8 kHz, the results weren’t nearly so dramatic. It was definitely low-fidelity, but the song was much more recognizable. That’s right, the lower sample rate produced a higher-fidelity result. So clearly there’s some sort of cross-effect between sample rate and bit rate that I’m not understanding.
To understand the difference between sample and bit rate, I found the following explanation on Helium:
It is easiest to think of sample rate as how often the audio signal is sampled and bit rate as the amount of information recorded for a unit of time.
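One back-of-envelope way to see the cross-effect is to divide the bit rate by the sample rate, which gives the average bits the encoder can spend on each sample. Real codecs like AAC don’t allocate bits this literally, but the ratio hints at why 8 kbps sounds rougher at 22.05 kHz than at 8 kHz; this is my own rough sketch, not an explanation from the Helium article:

```python
# Rough arithmetic: average encoded bits available per audio sample.
# A higher sample rate spreads the same bit budget over more samples,
# leaving fewer bits for each one.

def bits_per_sample(bit_rate_bps: float, sample_rate_hz: float) -> float:
    """Average bits of encoded data available per sample."""
    return bit_rate_bps / sample_rate_hz

# The two 8 kbps configurations tried in the post:
at_22k = bits_per_sample(8_000, 22_050)  # ~0.36 bits per sample
at_8k = bits_per_sample(8_000, 8_000)    # 1.0 bit per sample
print(f"{at_22k:.2f} vs {at_8k:.2f}")    # prints "0.36 vs 1.00"
```

By this crude measure, resampling down to 8 kHz nearly triples the bits available per sample, which may be why that version came out more recognizable even though it captures less high-frequency content.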
So perhaps interference patterns can emerge between the two — kind of like audio moiré? I’m just guessing and probably wrong.