dwightlooi (new member, joined Oct 4, 2004)
Can anyone shed some light on the iPod's equalizer implementation? I have the following questions:
(1) Does the equalizer rely on digital signal processing -- changing the waveform of the data stream -- or does it act on the analog output?
(2) I have noticed that using the equalizer frequently results in "clipping" that sounds similar to CDs which have exceeded the dynamic range of their 16-bit audio format. Sort of like a blown speaker, except that it seems to be volume- and headphone-independent: the same distortion is heard whether the volume is at 25% or 75%.
(3) I have noticed that the aforementioned distortion can be mitigated or eliminated in wav files by "normalizing" to a low peak level (say 60%). This suggests that the EQ is digitally causing the peak dynamic range to be exceeded. Unfortunately, it seems that MP3 encoders artificially level the volume during encoding, so feeding them 50%-normalized wavs and un-normalized wavs seems to result in tracks that are identical in loudness. I have also noticed inconsistent volume levels throughout the track if I feed the encoder wavs that have been normalized to a very low peak level (say 20%). LAME 3.90.3 is used within EAC for the experiments, with a command line input of "--vbr-old -q 0 -V 0 -m j -b 192 -c %s %d "
(4) It seems moronic that whoever implemented the EQ would even allow something like that to happen. We all know that CD tracks -- especially today's CD tracks -- have peak levels very close to or at 100%. In fact, there are a lot of moronic recording engineers who accept a considerable amount of "clipping" just so the CD will play "louder". Nonetheless, if the iPod is performing the EQ digitally, shouldn't it only CUT levels at particular frequencies but never EVER add levels? The same EQ shape can be achieved via a reduction-only implementation, and such a scheme will never blow through the dynamic range. Any volume reduction can be compensated for simply by the user upping the volume.
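The reduction-only idea in (4) can be sketched in Python. This is not the iPod's actual EQ code (which is unknown here), just an illustration of the principle: a naive +6 dB digital boost pushes a near-full-scale signal past the 16-bit ceiling and clips, whereas pre-attenuating the whole signal by the maximum boost amount gives the same EQ shape with no sample ever exceeding full scale. The 1 kHz test tone and the 6 dB figure are assumptions chosen for the example.

```python
import math

def make_sine(freq=1000.0, sr=44100, n=1024):
    # Full-scale 1 kHz test tone; samples normalized to [-1.0, 1.0]
    return [math.sin(2 * math.pi * freq * i / sr) for i in range(n)]

def apply_gain(samples, gain_db):
    # Simple broadband gain stage (a real EQ would be per-band)
    gain = 10 ** (gain_db / 20.0)
    return [s * gain for s in samples]

def count_clipped(samples):
    # Samples outside [-1.0, 1.0] would clip when quantized to 16-bit PCM
    return sum(1 for s in samples if abs(s) > 1.0)

tone = make_sine()

# Naive boost: +6 dB on an already near-full-scale signal -> clipping
boosted = apply_gain(tone, 6.0)
print(count_clipped(boosted))   # many samples exceed full scale

# Reduction-only: pre-attenuate by the maximum boost, then boost.
# Net gain is unity, so no sample can blow through the dynamic range;
# the listener compensates for the lower overall level with the volume knob.
safe = apply_gain(apply_gain(tone, -6.0), 6.0)
print(count_clipped(safe))      # 0
```

The same reasoning is why a cut-only EQ (or one that pre-attenuates by its largest boost) cannot clip regardless of how hot the source material is mastered.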