Music Encoding -- what to believe?
Feb 11, 2006 at 5:22 AM Thread Starter Post #1 of 8

Cyrilix · 500+ Head-Fier · Joined May 31, 2005 · Posts: 715 · Likes: 10
Someone tells me that when you buy a CD, the tracks on the CD are encoded at 128 kbps, and that you can decompress them to higher bitrates to make them sound better.

Now, in all my experience and discussions on any music forum or the like, I've never heard of decompressing music to a higher bitrate to make it sound better. To me, the concept is clear: if it's originally 128 kbps, it only has 128 kbps (kilobits per second) of data to feed to your music player (e.g. foobar, Winamp). What he said was along the lines of:

"I meant enchancing it by increasing how much info is unpacked per second by re encoding"

Now, that logic sounds inherently false to me. Can someone confirm whether or not I'm correct? You only have so much info at 128 kbps; how do you "increase how much info is unpacked"?
 
Feb 11, 2006 at 11:04 AM Post #2 of 8
I've never heard about the 128 kbps claim; I think that's inaccurate. When I rip things, I can tell the difference between a 128 rip and a 256+ rip. I don't know the exact quality of the original data, but I'm sure it's higher than 128.
 
Feb 11, 2006 at 11:08 AM Post #3 of 8
Quote:

Originally Posted by Cyrilix
Now, in all my experience and discussions on any music forum or likewise, I've never heard of decompressing the music to a higher bitrate to make it sound better. To me, the concept is clear: if it's originally 128 kbps, it only has 128 kbps (kilobits or kilobytes per second) of data that it feeds to your music player (eg: foobar, winamp).


With lossy encoding, it's an absolute rule that transcoding always loses sound quality. It doesn't matter if you transcode the MP3 from 128k to 1,000,000k; it will sound equal to or worse than the original. Most likely worse, because encoders are designed to deal with data that has never been compressed.

If you want the benefits of higher bitrates, you have to re-encode from the (noncompressed or losslessly compressed) source to get them. There's no doubt about this whatsoever; your friend is simply wrong.
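The point can be illustrated with a toy sketch. This is plain Python with coarse rounding standing in for lossy compression, not real MP3 code; the numbers are made up for illustration. Once a coarse pass has discarded detail, re-encoding at a finer step gives back exactly the coarse values:

```python
def quantize(samples, step):
    """Lossy step: round each sample to the nearest multiple of `step`."""
    return [round(s / step) * step for s in samples]

original = [0.12, 0.37, 0.55, 0.91]

lossy = quantize(original, 0.25)    # coarse pass, like a 128 kbps rip
transcoded = quantize(lossy, 0.05)  # finer re-encode, like "upping" to 320 kbps

print(lossy)       # the fine detail is already gone
print(transcoded)  # identical to lossy: the finer pass recovers nothing
```

The finer second pass has more resolution available, but the only input it ever sees is the already-degraded data, which is exactly why transcoding from a lossy file can never beat re-ripping from the CD.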
 
Feb 11, 2006 at 1:34 PM Post #4 of 8
Stop buying pirate CDs!
 
Feb 11, 2006 at 1:42 PM Post #5 of 8
Quote:

Originally Posted by Cyrilix
Someone tells me that when you buy a CD, the tracks on the CD are encoded at 128 kbps, and that you decompress it to higher bitrates to make it sound better.


The audio tracks on a Red Book CD are encoded at approximately 1411 kbps.

Decompressing 128 kbps files (mp3, wma, aac and the like) to higher bitrates actually makes the files sound worse.

Please correct me if I'm wrong.

More info:
http://www.teamcombooks.com/mp3handbook/11.htm
 
Feb 11, 2006 at 8:42 PM Post #6 of 8
Thanks for the links, guys! I'm assuming my friend was talking about audio remastering and adding data to a low-bitrate encoded music file, but in terms of CDs, 1411 kbps is exactly what I always thought it was.
 
Feb 12, 2006 at 8:13 AM Post #7 of 8
Wikipedia shouldn't really be used as a "reference", but this is a good article regardless: http://en.wikipedia.org/wiki/Red_Boo...CD_standard%29

The article gives this equation:

Bit rate = 44100 samples/s × 16 bit/sample × 2 channels = 1411.2 kbit/s (more than 10 MB per minute)

44.1 kHz / 16-bit mastering was the standard for all audio CDs produced for a long time. Only recently have studios improved sound quality by recording the original master at rates up to 96 kHz with 24-bit samples (4608 kbit/s), then downsampling to 16-bit for CD production. The result is better sound quality you can really hear on high-end equipment. Some artists now use 24-bit mastering exclusively.
 
Feb 12, 2006 at 8:30 AM Post #8 of 8
Quote:

Originally Posted by Asr
Wikipedia shouldn't really be used as a "reference", but this is a good article regardless: http://en.wikipedia.org/wiki/Red_Boo...CD_standard%29

The article gives this equation:

Bit rate = 44100 samples/s × 16 bit/sample × 2 channels = 1411.2 kbit/s (more than 10 MB per minute)

44.1 kHz / 16-bit mastering was the standard for all audio CDs produced for a long time. Only recently have studios improved sound quality by recording the original master at rates up to 96 kHz with 24-bit samples (4608 kbit/s), then downsampling to 16-bit for CD production. The result is better sound quality you can really hear on high-end equipment. Some artists now use 24-bit mastering exclusively.




Yes, this is correct.
 
