question about the origins of the iTunes aac files...


SkippyMcHaggis

New member
Joined
Oct 24, 2003
Messages
65
Points
0
i seem to recall having read somewhere that iTunes goes back to the original masters to make their 128kbps aac files. is this true? if it is, it might explain something i noticed today.

i've been playing around with comparing mp3 and aac files to see which one i like best, and to finally determine for myself if there is a difference. i'll spare you the details of my comparison; my methods might not be the best, but they work fine for me.

i've always ripped my files as 320kbps mp3's because quality has always been more important to me than overall size. a 9 meg file that is nearly indistinguishable from cd audio beats the pants off a distorted 128kbps file that i can cram a million of onto a cd.
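(for what it's worth, here's the back-of-envelope math behind that 9 meg figure, assuming a roughly 4 minute song - just my own rough sketch:)

Code:
# rough size of a constant-bitrate file: bits per second * seconds / 8
def file_size_mb(bitrate_kbps, seconds):
    return bitrate_kbps * 1000 * seconds / 8 / 1_000_000

print(file_size_mb(320, 240))  # ~9.6 MB for a 4 minute 320kbps mp3
print(file_size_mb(128, 240))  # ~3.8 MB for the same song at 128kbps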

i'm probably the perfect HD audio player customer.

so i sat around comparing various songs in 320 and 256 mp3, just to hear the difference between them. there is a noticeable difference. i've never understood how people think 192 is cd quality...

i then took the songs and ripped them (from cd sources) to 256 aac. i then compared the 320 mp3 file and the 256 aac file. there are differences but for the sake of sanity i'd say they are "equivalent" in quality, not identical. i decided i'd re-rip everything i have to 256aac and save about 10% HD space. (at 500+ albums, that's a chore, but it's also a fair chunk of space).

then a friend of mine shared an iTunes song with me (128 aac). i then compared it to a 320kbps mp3 i had of the same song. while not identical, the aac file was really quite similar; in some ways it was better. (if you get a chance to try it with a song, do it and let me know if you agree.)

this puzzled me. so i sat and thought about it a bit and remembered the comments about the original master sourcing.

if it's true, i can see why an itunes 128aac file is really very high quality compared to a cd-sourced rip. studio masters are huge, raw, uncompressed data files. (if you've got John Mayer's Heavier Things album, look at the album art and there's a chart that shows the song file sizes from the studio masters.)

a compression made from that source is gonna be fairly close to a lossy compression rip of an aiff or wav file, for the simple reason that the cd rip is a compression of a compressed file.

i may be wrong. my methodology may be flawed. but that's what i've noticed today. feel free to comment.

Skip
 

Adam

Brushed Steel Guy
Joined
Oct 14, 2003
Messages
2,698
Points
0
Age
38
Location
Sydney, Australia
What do you mean, original master sourcing? Because regardless of the source quality, the destination bitrate is 128kbps, which is like cutting the top off a snow cone versus cutting the top off Everest; you still get the same result :p
 

SkippyMcHaggis

New member
Joined
Oct 24, 2003
Messages
65
Points
0
again, it's a memory, and i can't cite a source for it, but i seem to recall mention of Apple using the original masters.

which means:

when you record in a studio you can record digital or analog...which i'm sure you know.

when you record digital, you record raw uncompressed data for each track. they're really large files.

so when you have a song with a vocal track, a guitar track, and say a percussion track, you've got 3 tracks, and each track for a 3 minute song could be 1 gig of data. so you have a 3 gig song.

(john mayer's song "daughters" is a good example of a small master file. it's only got vocal, guitar and percussion and it's about 1 tenth the size of hmm..."home life" which has a gazillion layers.)

when you use that 3 gig source file as the original source for a 128kbps compression you have a much larger pool of sound data to filter down. if your compression algorithm is good, you'll have a much better sounding file at 128kbps than a file made from a ~1411kbps aiff/cd audio file.

at least that's how i think about it.
statistically you have a much larger sample group so your end result will be a better sample than if you only used an original group that's one tenth the size.
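to put some rough numbers on it (these are just my assumptions - i don't know what rates the studios actually use, and multiple takes and higher rates push the totals way up):

Code:
# back-of-envelope: raw multitrack session vs. cd-quality stereo
# assumption (mine): 24-bit / 96kHz mono stems
def track_mb(seconds, sample_rate=96_000, bytes_per_sample=3):
    return sample_rate * bytes_per_sample * seconds / 1_000_000

song = 180  # a 3 minute song
print(track_mb(song))              # ~51.8 MB per stem
print(track_mb(song) * 32 / 1000)  # ~1.66 GB for a 32-track session
print(44_100 * 2 * 2 * song / 1_000_000)  # ~31.8 MB of cd-quality stereo

either way, the master has a whole lot more data in it than the cd does.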

am i wacko? or does that make sense?

skip
 

SkippyMcHaggis

New member
Joined
Oct 24, 2003
Messages
65
Points
0
to try it with the snowcone/everest metaphor:

i'm not sure it's really cutting the tip off the mountain.

it's more like sampling from the entire mass, but ending with identically sized lumps.

again, i'm not an expert, and this is just how it seems to me. i'll gladly be proved wrong, and be forced to wonder why a 128kbps file sounds nearly as good as a 320kbps file, when i can normally pick out noticeable quality variations between neighboring bitrates when i look at the spectrum.

skip
 

..i

New member
Joined
Oct 10, 2003
Messages
34
Points
0
Location
Ireland
Yes, you are right, Skippy: where possible, Apple does encode its ITMS files from the original source (if the record label can provide it) and, as you say, this can have a positive effect on the sound quality of the end file. By the way, neither Napster 2 nor Musicmatch employs this method, which is one of the reasons I use ITMS.
 

Adam

Brushed Steel Guy
Joined
Oct 14, 2003
Messages
2,698
Points
0
Age
38
Location
Sydney, Australia
What I was trying to say is that 128kbps is a set height on the mountain, and that regardless of how high the mountain is, you're going to cut the top off it. But then again, that is an oversimplified way of looking at it.

There may be a difference, but I don't think it would be anywhere near noticeable.
 

saronian

New member
Joined
Sep 29, 2003
Messages
20
Points
1
Location
San Francisco
The difference is better illustrated using digital photography as an example.

Photograph some newsprint with a 5 megapixel camera and then use a 3 megapixel camera. Now resize each photograph to the same small dimensions, say 400x300 pixels.

While both images now consist of the same number of pixels, the 5MP source will have clearer text than the 3MP source. This is the result of having more source information from which to choose while the algorithm decides which pixels to eliminate.

The amount of difference it makes in the final image (or song) also depends on the information being compressed.
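If you want to try it yourself, here is a quick Python (Pillow) sketch. The filenames are just placeholders for your own two shots:

Code:
# downscale two captures of the same newsprint to the same small size
# (filenames are placeholders - use your own 5MP and 3MP photos)
from PIL import Image

for name in ("newsprint_5mp.jpg", "newsprint_3mp.jpg"):
    img = Image.open(name)
    small = img.resize((400, 300), Image.LANCZOS)  # high-quality downscale
    small.save(name.replace(".jpg", "_400x300.png"))

# both outputs are 400x300, but the one from the 5MP source keeps crisper text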

Scott
 

SkippyMcHaggis

New member
Joined
Oct 24, 2003
Messages
65
Points
0
i should clarify, now that i know the lingo used around here a bit better. when i say "itunes file" i mean an ITMS file (iTunes Music Store file).

it's probably pretty important to clarify that, as an iTunes file can be any file ripped/created in iTunes.

sorry for the lack of clarity.

and saronian, in your example, isn't it also extremely important to know what your compression algorithm is going to do? because like you mentioned, while the source is important, it's still important how you compress it.

that probably mucks things up a bit, but i think it has to be included.

here's another attempt to analogize (is that a word?):

your source file is a swimming pool.
your cd audio file is a 1 gallon bucket filled with a sample from the swimming pool.
your 128kbps file is a pop can filled with a sample from one of the above.

if it's the swimming pool, you will have a nice sample from the entire volume.

if it's the bucket, you're sampling a sample; regardless of quality, it's still less data to work with.

does that work?

Skip
 

Adam

Brushed Steel Guy
Joined
Oct 14, 2003
Messages
2,698
Points
0
Age
38
Location
Sydney, Australia
This thread should be called the metaphoric thread :p

SkippyMcHaggis, that's actually a good way of looking at it, you get a better 'average'. I still think that all of this means squat if you rip a song from a CD at >192kbps AAC.
 

..i

New member
Joined
Oct 10, 2003
Messages
34
Points
0
Location
Ireland
Adam said:
What I was trying to say is that 128kbps is a set height on the mountain, and that regardless of how high the mountain is, you're going to cut the top off it. But then again, that is an oversimplified way of looking at it.

There may be a difference, but I don't think it would be anywhere near noticeable.
The difference is actually quite substantial.
I will try to explain it. I will start with a statement that sounds totally wrong, but stay with me.

Technically CD audio is considered inferior quality to a 256kbps MP3 (and equal to approx 244kbps - this varies depending on the song). This does not mean that if you rip a CD to 256kbps, you will have the same quality as is on the CD.

This is because CD audio and MP3 are completely different types of compression, just like JPEG compression is completely different to GIF compression. I will use these two image compression standards in an analogy.
There are two images: one of them is an uncompressed bitmap (the original source) and the other is a JPEG created from that bitmap, which already contains some compression (CD audio). You decide that you want to convert both of these images to GIF. The GIF created from the original source is going to be higher quality because it has only been compressed by GIF, whereas the second image will have undergone compression from both of these standards.
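If you want to see the two paths side by side, here is a rough Python (Pillow) sketch of the analogy. The filenames are placeholders:

Code:
# the two paths in the analogy (filenames are placeholders)
from PIL import Image

# path 1: original source straight to GIF (like master -> ITMS AAC)
Image.open("original.bmp").save("from_source.gif")

# path 2: through a lossy JPEG first (like master -> CD -> ripped AAC)
Image.open("original.bmp").save("intermediate.jpg", quality=60)
Image.open("intermediate.jpg").save("from_jpeg.gif")

# both GIFs have the same palette limits, but the second one also
# carries the JPEG artefacts it picked up along the way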

That is my attempt at explaining it. I know that it is probably not going to make any sense to most people, but it's very hard to explain. Try reading it step by step. If I can think of a better analogy I'll post it later.
 

Adam

Brushed Steel Guy
Joined
Oct 14, 2003
Messages
2,698
Points
0
Age
38
Location
Sydney, Australia
Wouldn't it be good if all of us knew the actual technical terms, we suck :)

So you're kind of saying you have the source, and the source is downgraded for CD, then downgraded again for MP3, in three stages. But with the ITMS, it only has two stages? That's understandable.
 

saronian

New member
Joined
Sep 29, 2003
Messages
20
Points
1
Location
San Francisco
This is fun! There are a bunch of ways to illustrate the same thing. But in the mountain of snow and swimming pool examples you are assuming the final song is constructed from a sample equal in size to the final song file.

What is really happening is the whole mountain, or the whole swimming pool is available to be squeezed into the final file. Every bit of the "big" source is being used by the compressor, and the increased amount of data presents a more detailed mass to make the reduction "smoother."

The compressor is like a knife with a fixed blade thickness. It will make a cut that appears more precise through a big object versus a small object, even though you still end up with the same size object. The small one will have rougher edges because of the fixed thickness of the blade.

OK. I may have just lost my mind. :confused:
 

..i

New member
Joined
Oct 10, 2003
Messages
34
Points
0
Location
Ireland
Adam said:
Wouldn't it be good if all of us knew the actual technical terms, we suck :)

So you're kind of saying you have the source, and the source is downgraded for CD, then downgraded again for MP3, in three stages. But with the ITMS, it only has two stages? That's understandable.
Sorry, I finished the post before I read saronian's post and before I realised all the analogies being thrown around.

The main point is that MP3 compression is a completely different type of compression to CD audio compression; that is why I used JPEG and GIF as an example. I still feel that I am not explaining it properly, but it's the best I can do.
 

..i

New member
Joined
Oct 10, 2003
Messages
34
Points
0
Location
Ireland
saronian said:
This is fun! There are a bunch of ways to illustrate the same thing. But in the mountain of snow and swimming pool examples you are assuming the final song is constructed from a sample equal in size to the final song file.

What is really happening is the whole mountain, or the whole swimming pool is available to be squeezed into the final file. Every bit of the "big" source is being used by the compressor, and the increased amount of data presents a more detailed mass to make the reduction "smoother."
Yeah, but you are missing the point about the two formats using completely different ways to compress the music; that is why I use the JPEG and GIF analogy and not the swimming pool one.

While the analogy about the swimming pool is correct, it is not the only factor. The biggest factor is that:

MP3 and CD audio use completely different compression methods, and therefore the difference between an MP3 file ripped from a CD and an MP3 file ripped from the original audio is quite large. This is a link to a working example that I made; hopefully it will explain it better.
 

seinman

Stern but Groovy Master
Joined
Sep 7, 2003
Messages
76
Points
0
Age
40
Location
Williamsburg, VA
Website
www.seinman.net
Also, don't forget about equipment used. I'm sure Apple uses studio-quality playback devices and extremely high end encoders to make the iTunes Music Store songs. So of course they'll sound better than using a $15 CD-ROM and your home computer to encode. Add in the fact that they have the master recordings in most cases, and you can easily figure out why they sound better than ripped songs.
 

SkippyMcHaggis

New member
Joined
Oct 24, 2003
Messages
65
Points
0
that's a great little page ..i.

i like the photo, too. did you take it or does it come from somewhere else?

the photo analogy is probably the closest. i'm not sure you're really missing too much in the example ..i.

a more metaphysical example could be to go to Plato's Cave.

our source file is the people behind us in the cave casting shadows on the wall we're facing. our CD audio file is the shadows that we're watching. and the cd rip would be the shadows that we can see being cast by some guy standing there with a flashlight and sock puppets, imitating the original shadows.

the ITMS file could be a bunch of shadows cast by trained actors on a stage with a nice big spotlight, in a little private annex to the cave.

we've got hardware, and compression quality and method, all taken care of (i think). yay for Plato. feel free to critique this, it's late and i may be missing something in the fun.

Skip
 

..i

New member
Joined
Oct 10, 2003
Messages
34
Points
0
Location
Ireland
seinman said:
Also, don't forget about equipment used. I'm sure Apple uses studio-quality playback devices and extremely high end encoders to make the iTunes Music Store songs. So of course they'll sound better than using a $15 CD-ROM and your home computer to encode. Add in the fact that they have the master recordings in most cases, and you can easily figure out why they sound better than ripped songs.
Exactly, that is another point I forgot to mention. As far as I know, Apple uses an enhanced version of QuickTime's 'best' encoder for the ITMS; when we rip a CD through iTunes the encoder used is 'better', which is inferior to 'best'. The reason iTunes uses 'better' when it rips CDs is that the 'best' encoder does not provide any increase in sound quality when the source is a CD.

SkippyMcHaggis said:
that's a great little page ..i.

i like the photo, too. did you take it or does it come from somewhere else?

the photo analogy is probably the closest. i'm not sure you're really missing too much in the example ..i.

a more metaphysical example could be to go to Plato's Cave.

our source file is the people behind us in the cave casting shadows on the wall we're facing. our CD audio file is the shadows that we're watching. and the cd rip would be the shadows that we can see being cast by some guy standing there with a flashlight and sock puppets, imitating the original shadows.

the ITMS file could be a bunch of shadows cast by trained actors on a stage with a nice big spotlight, in a little private annex to the cave.

we've got hardware, and compression quality and method, all taken care of (i think). yay for Plato. feel free to critique this, it's late and i may be missing something in the fun.

Skip
I think that's right (all these analogies are getting confusing).

Quick summary of my main point (Sorry):

CD audio has a different type of lossy compression to MP3

So an MP3 ripped from a CD has the compression artefacts of both CD compression and MP3 compression.

The reason I keep emphasizing that they use different compression methods is that if MP3 used the same compression method as CD audio (but compressed it more), then very little would be gained from going back to the original, which is why the swimming pool analogy is not the only factor.

When I say MP3, I include AAC, WMA, Ogg etc., as they are all optimised versions of MP3.

phew! That's it, I won't bore you anymore.

By the way, a full-size version of the picture is available here and a different one here
 

HiRez

New member
Joined
Oct 17, 2003
Messages
284
Points
0
Location
San Francisco, CA
I'll just throw one more analogy in: most "high-end" primetime tv shows (West Wing, 24, Frasier, etc.) are shot on 35mm film, and you can tell they look better. They could shoot the same show on consumer camcorders; after all, it is all going to be transmitted through the same cable and end up on the same tv set (yours). But starting with the better source still matters. A lot.
 

lechonlubber

New member
Joined
Oct 10, 2003
Messages
111
Points
0
Location
Bucks County, PA
CD audio has a different type of lossy compression to MP3

So an MP3 ripped from a CD has the compression artefacts of both CD compression and MP3 compression.

I agree with you that AAC files made directly from the masters can sound better than CDs ripped by users. But this may have to do with the higher quality control and superior equipment.

CD audio is not really compressed as in MP3 and AAC. The data on an audio CD is a digital representation of the actual music; no algorithms are used to remove data as in MPEG compression. The limiting factors in CD audio are sampling rate and bit width. These limits are well beyond our normal hearing limits and the limits of current consumer-quality equipment.
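For reference, the raw numbers for standard CD audio work out like this:

Code:
# raw data rate of cd audio: 44.1kHz, 16 bits per sample, 2 channels
sample_rate = 44_100   # samples per second
bit_depth = 16         # bits per sample
channels = 2           # stereo

cd_kbps = sample_rate * bit_depth * channels / 1000
print(cd_kbps)         # 1411.2 kbps of uncompressed PCM
print(cd_kbps / 128)   # ~11x the data rate of a 128kbps AAC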
 

monkeyking84

MonkeyPodder
Joined
Jun 9, 2003
Messages
69
Points
0
Well, if it matters, I noticed that the iTMS versions of Aqua (both albums) were better quality than the 192kbps MP3 files I previously had. Needless to say, I bought the two albums. =)
 