Originally Posted by hockeyb213
Can someone briefly explain to me how v0, v1, v2 compare? I never used Ogg, but I need to have the knowledge for a friend, and I can't find information on which is better than the other and whatnot. I know Ogg uses VBR instead of CBR, which is a more optimal way of compressing the file.
Well... Ogg Vorbis uses quality settings (-q0, -q1, -q2, etc.), not V settings. They're very similar to LAME's VBR -V settings, except that for Vorbis the higher number is the better quality. So q5 is better than q2.
To give you an idea, q4 averages about 128 kbps, and q6 about 192 kbps. The main difference I find with Vorbis is that lower-bitrate files sound better than MP3s at the same bitrate. It seems mostly to do with how MP3 drops off the higher-frequency sounds; Vorbis handles that differently and gives a much better listening experience for lower-bitrate files, in my opinion.
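If it helps to see the two scales side by side, here's a tiny sketch. Only the q4 ≈ 128 kbps and q6 ≈ 192 kbps figures come from my experience above; the other bitrate numbers are just rough round figures for illustration, and since both codecs are VBR the real bitrate always depends on the material:

```python
# Rough mapping from Vorbis -q settings to the average bitrates they tend
# to produce. q4 and q6 match the figures in the post; the rest are
# hypothetical round numbers for illustration only, not measurements.
APPROX_VORBIS_KBPS = {0: 64, 2: 96, 4: 128, 5: 160, 6: 192, 8: 256}

def better_quality(a, b, codec="vorbis"):
    """Return the higher-quality of two settings.

    Vorbis -q: higher number = better quality.
    LAME -V:   lower number = better quality (the scales run opposite ways).
    """
    if codec == "vorbis":
        return max(a, b)
    return min(a, b)

print(better_quality(5, 2))           # Vorbis: q5 beats q2 -> 5
print(better_quality(0, 2, "lame"))   # LAME: -V0 beats -V2 -> 0
```

The only point the sketch makes is the direction of each scale: a bigger q is better for Vorbis, while a smaller V is better for LAME, which is the part that trips people up when comparing the two.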
I find a 192 kbps Vorbis file (q6) to be transparent compared to the CD, while it takes a -V0 MP3 (averaging about 230-270 kbps) to do the same. I do have extremely sensitive high-frequency hearing, so low-bitrate MP3s are immediately obvious to me and very annoying. Most people probably can't tell... but I can hear the 18 kHz+ crap it drops out and tries to fill in randomly.
If you google around you can find graphs and stuff showing how the different formats alter the audio. It's interesting stuff.