1 CD, 2 Rips: Bitrate Differences in Extracted FLACs?
Nov 18, 2011 at 12:40 AM Thread Starter Post #1 of 14

mrksgrn

As the title says, I have one CD that I ripped twice in EAC to extract two copies of the WAV CD image. After splitting and converting these two WAV files, the resulting FLAC files of the same track show small (<10 kbps) bitrate differences.
 
For the record, EAC reported a rip quality of 100.0% both times.
 
Can anyone explain this bitrate difference and its implications, if any? I had assumed that a 100% quality rip meant a bit-perfect copy with no variation.
 
 
 
Nov 18, 2011 at 12:57 AM Post #2 of 14
I can't speak to whether the two rips differed at all, though I doubt any difference that does exist would be significant (i.e., a bit or two out of a great many).
 
However, the FLAC encodes can differ. Perhaps your antivirus software kicked in while one encode was running, or your processor was running warmer than normal. With the CPU not working as efficiently, the encoder could, at the same compression level, be slightly less effective at finding good patterns.
 
Just a guess, but it seems logically consistent.
 
Nov 18, 2011 at 2:50 AM Post #4 of 14
Interesting. The bit comparator says no differences were found despite a 6 kbps bitrate difference. I know this amount may be insignificant, but I find it strange that it translates to no difference at all. What does this say, then, about FLAC bitrate in general?
 
Quote:
If you use foobar, try using the bit comparator to see if there are truly any differences.
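
Following up on that suggestion: the same bit-for-bit comparison can be sketched outside foobar. A minimal Python example, assuming the soundfile package (libsndfile) is installed; the file names are placeholders:

    import numpy as np
    import soundfile as sf  # assumed dependency: pip install soundfile

    # Decode both FLACs back to raw samples and compare them bit for bit.
    # "rip1.flac" and "rip2.flac" are placeholder names for the two encodes.
    a, rate_a = sf.read("rip1.flac", dtype="int16")
    b, rate_b = sf.read("rip2.flac", dtype="int16")

    if rate_a == rate_b and a.shape == b.shape and np.array_equal(a, b):
        print("Decoded audio is bit-identical")
    else:
        print("Decoded audio differs")

If the decoded samples match, the bitrate gap lives entirely in the compressed representation, not in the audio itself.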



 
 
Nov 18, 2011 at 5:18 PM Post #7 of 14
As El_Doug suggested, it might be related to CPU load.
As I understand it, the compression level in FLAC sets how much effort the encoder spends searching for the best possible compression.
If that effort were bounded by time, the efficiency of the compression could depend on system load.
 
An interesting experiment would be (see the sketch after this list):
  1. Convert a WAV to FLAC on a nearly inactive system
  2. Convert the same WAV on a heavily loaded system (play a video, run a backup, do some gaming) so there is lots of CPU and I/O activity
I wonder whether this will make a difference.
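
A rough sketch of that test, assuming the reference flac command-line encoder is on the PATH; "input.wav" is a placeholder for the test file:

    import hashlib
    import subprocess

    def encode(out_name):
        # Assumes the reference `flac` encoder is on the PATH; -8 is the
        # highest standard compression level, -f overwrites, -o names output.
        subprocess.run(["flac", "-8", "-f", "-o", out_name, "input.wav"],
                       check=True)

    encode("idle.flac")                       # run on a quiet system
    input("Start the heavy load, then press Enter...")
    encode("loaded.flac")                     # run while the system is busy

    digests = [hashlib.md5(open(n, "rb").read()).hexdigest()
               for n in ("idle.flac", "loaded.flac")]
    print("identical" if digests[0] == digests[1] else "different")

If the encoder is deterministic, the digests will match no matter what else the machine is doing.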
 
Nov 19, 2011 at 12:07 AM Post #8 of 14
Quote:
As El_Doug suggested, it might be related to CPU load.
As I understand it, the compression level in FLAC sets how much effort the encoder spends searching for the best possible compression.
If that effort were bounded by time, the efficiency of the compression could depend on system load.
 
An interesting experiment would be:
  1. Convert a WAV to FLAC on a nearly inactive system
  2. Convert the same WAV on a heavily loaded system (play a video, run a backup, do some gaming) so there is lots of CPU and I/O activity
I wonder whether this will make a difference.


That's absurd. It will just take longer to encode if the CPU is busy doing something else. FLAC encoding is deterministic: the bitrate won't change if the encoding settings are the same and the input is identical.
 
The OP should use the AccurateRip feature of EAC plus a Test & Copy pass to ensure the rips are correct.
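
One quick way to verify this: FLAC's STREAMINFO header stores an MD5 digest of the raw, unencoded PCM, so two files that decode to identical audio share that digest even when their compressed sizes (and displayed bitrates) differ. A minimal sketch, assuming the mutagen package is installed; file names are placeholders:

    from mutagen.flac import FLAC  # assumed dependency: pip install mutagen

    # STREAMINFO carries an MD5 of the decoded PCM, written at encode time,
    # so it identifies the audio regardless of compression level.
    md5_a = FLAC("rip1.flac").info.md5_signature
    md5_b = FLAC("rip2.flac").info.md5_signature

    print("same audio" if md5_a == md5_b else "audio differs")

Matching digests mean the two rips carry the same samples, whatever their bitrates.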
 
Nov 19, 2011 at 12:50 AM Post #9 of 14
I ripped Journey's "Don't Stop Believin'" to ALAC, once with iTunes and once with dBpoweramp. The rip encoded with iTunes is 900 kbps; the rip encoded with the dBpoweramp converter is 901 kbps. Each rip sounds identical whether played in iTunes or in foobar. I don't know what the addition or absence of one kbps means.
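
For scale, the bitrate a player displays is usually just file size divided by playing time, so metadata or padding differences alone can shift it by a kilobit per second. A back-of-the-envelope in Python, using a made-up 4-minute duration:

    # Displayed bitrate ~= file size in bits / duration in seconds.
    # Hypothetical 4-minute track at 900 vs. 901 kbps.
    duration_s = 240
    bytes_at_900 = 900_000 * duration_s // 8   # 27,000,000 bytes
    bytes_at_901 = 901_000 * duration_s // 8   # 27,030,000 bytes
    print(bytes_at_901 - bytes_at_900, "bytes")  # 30,000 bytes, about 30 KB

Thirty kilobytes is well within what differing tags, padding, or encoder tuning between iTunes and dBpoweramp can account for; the audio itself can still be identical.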
 
Nov 19, 2011 at 1:12 AM Post #10 of 14
Quote:
That's absurd. It will just take longer to encode if the CPU is busy doing something else. FLAC encoding is deterministic: the bitrate won't change if the encoding settings are the same and the input is identical.


I agree.
 
My guess is that the encoder was having a bad day, which led to a less-than-optimal compression ratio. :p
And as the bit comparison showed, this difference in compression had absolutely no effect on the sound quality.
 
Perhaps one was encoded at level 7 and the other at level 8, or something like that.
 
Nov 19, 2011 at 4:03 AM Post #11 of 14

In my case, both were encoded at FLAC level 8, but I will try the test again.
Quote:
I agree.
 
My guess is that the encoder was having a bad day, which led to a less-than-optimal compression ratio. :p
And as the bit comparison showed, this difference in compression had absolutely no effect on the sound quality.
 
Perhaps one was encoded at level 7 and the other at level 8, or something like that.



 
 
Nov 19, 2011 at 5:12 AM Post #12 of 14
Maybe it's about the silent groove, that is to say, the spacing between tracks.
 
