Is it a good idea to convert FLAC files to AAC files for listening with AAC bluetooth headphones?
Apr 18, 2021 at 2:41 PM Post #2 of 12
Apr 18, 2021 at 4:50 PM Post #3 of 12
Or just stick to FLAC, or even MP3?
So most devices sending audio over BT will have a workflow similar to this: source file -> decoding -> PCM -> mixing -> encoding -> BT codec (AAC/aptX/MP3*/SBC/etc.). From there the audio is sent to the headphone, which has the following workflow: decoding -> PCM -> additional DSP -> DAC -> amp -> driver.
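To make that chain concrete, here's a very rough Python sketch of the source-side flow. The function names are hypothetical placeholders for illustration only, not any real Bluetooth stack API:

```python
# Purely illustrative sketch of the source-side Bluetooth audio chain.
# All functions here are hypothetical placeholders, not a real Bluetooth stack API.

def decode_to_pcm(source_file: str) -> bytes:
    # A real player would run a FLAC/AAC/MP3 decoder here; every format ends up as raw PCM.
    return b"\x00" * 4  # placeholder PCM

def mix_with_system_audio(pcm: bytes) -> bytes:
    # Volume, resampling, notification sounds, etc. get applied to the PCM stream.
    return pcm

def encode_for_link(pcm: bytes, bt_codec: str) -> bytes:
    # The mixed PCM is re-encoded with whatever codec the handshake negotiated (SBC/AAC/aptX/...).
    return pcm  # placeholder "encoded" payload

def play_over_bluetooth(source_file: str, bt_codec: str = "AAC") -> bytes:
    pcm = decode_to_pcm(source_file)      # happens whether the file is FLAC, AAC, or MP3
    pcm = mix_with_system_audio(pcm)
    return encode_for_link(pcm, bt_codec)

# The headphone then runs the mirror image: decode -> PCM -> DSP -> DAC -> amp -> driver.
```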

Note how the first step decodes the file whether it's FLAC, AAC, MP3, etc., and the result is then re-encoded into whatever codec the headphone will be using. So your FLAC file ends up encoded to AAC/MP3/aptX/whatever, depending on which codec the headphone supports, before being sent over. The only theoretical gain from keeping FLAC is that you avoid one extra lossy -> lossy generation (converting to AAC yourself and then having the BT link re-encode that AAC again). Whether you'll actually hear a difference is up for debate; you likely won't, as AAC 256 and MP3 320 are generally very transparent, even compared to a lossless source. The quality of the decoder and encoder on the source, as well as the decoder on the BT headphone, will likely make the bigger difference to overall audio quality. All that being said, if you're tight on space on your portable device, it may be worth using a lossy format. If you're not (or you plan to keep both copies of the music on the same device), there isn't any gain in encoding the files yourself.
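If you do go the lossy route to save space, a rough sketch of a batch conversion is below. It assumes ffmpeg is installed and on PATH; the folder names and the 256 kbps target are just example values, and the FLAC originals are kept:

```python
# Minimal sketch: batch-convert FLAC files to 256 kbps AAC with ffmpeg.
# Assumes ffmpeg is installed; paths and bitrate are example values. Keep the FLAC originals.
import subprocess
from pathlib import Path

SRC = Path("music_flac")   # hypothetical folder of FLAC originals
DST = Path("music_aac")    # hypothetical output folder for the portable copies
DST.mkdir(exist_ok=True)

for flac in SRC.rglob("*.flac"):
    out = DST / flac.relative_to(SRC).with_suffix(".m4a")
    out.parent.mkdir(parents=True, exist_ok=True)
    subprocess.run(
        ["ffmpeg", "-n", "-i", str(flac), "-c:a", "aac", "-b:a", "256k", str(out)],
        check=True,  # -n means never overwrite an existing output file
    )
```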

*I should note that an MP3 codec for BT did exist in the past, but it has largely gone away, with aptX becoming the norm for Android devices.
 
Apr 18, 2021 at 4:55 PM Post #5 of 12
Surely the question is whether you can tell the difference between the formats?
I don't think it is. I think it has to do with how much processing power, space, etc. it takes to decode something that isn't going to come out of the earbuds at full resolution anyway.
 
Apr 18, 2021 at 5:01 PM Post #6 of 12
I don't think it is. I think it has to do with how much processing power, space, etc. it takes to decode something that isn't going to come out of the earbuds at full resolution anyway.
But - how does it sound? Storage space won't affect how it sounds, will it? Will 'processing power' and 'time to decode' really make so much difference these days? A difference that you can hear?
 
Apr 18, 2021 at 5:03 PM Post #7 of 12
But - how does it sound? Storage space won't affect how it sounds, will it? Will 'processing power' and 'time to decode' really make so much difference these days? A difference that you can hear?
I meant that this is all a lot of calculation and guesswork, but at the end of the day AAC over Bluetooth is lossy and degrades the source anyway. If you want hi-fi, you don't use Bluetooth. If you use Bluetooth, don't worry about picking the best codec...
 
Apr 18, 2021 at 5:06 PM Post #8 of 12
I meant that this is all a lot of calculation and guesswork, but at the end of the day AAC over Bluetooth is lossy and degrades the source anyway. If you want hi-fi, you don't use Bluetooth. If you use Bluetooth, don't worry about picking the best codec...
Well I guess I'm lucky enough not to hear any difference at all between Bluetooth codecs, or indeed between Bluetooth and the wired headphones I have.
 
Apr 18, 2021 at 5:31 PM Post #9 of 12
But - how does it sound? Storage space won't affect how it sounds, will it? Will 'processing power' and 'time to decode' really make so much difference these days? A difference that you can hear?
The implementation of the codec on the source and headphone devices will definitely make a difference in how things sound. For example, Android devices have historically had mixed-quality implementations of AAC, while Apple tends to have a very good one. For the most part, differences between SBC and the higher-quality codecs can typically be heard as well.

However, as you move to higher-quality codecs, the audible differences between them become smaller and smaller. Granted, this is a different debate, but the majority of listeners may not be able to discern differences between 256 kbps AAC, 320 kbps MP3, and lossless. Keep in mind, though, that this idea is heavily debated on its own.

Regarding processing power and time to decode: processing power depends on the audio processor itself, while decode time depends on both that processing power and the implementation of the codecs being used. Just as you can write code that runs efficiently and fast, you can also write code that is very inefficient and slow; to make the latter fast you'd have to cut corners. This sort of shoddy programming is unlikely, but wouldn't be unheard of.

I'll also add that, although the effect is very small, different ways of computing things can influence how accurate the result is. This is unlikely to affect encoding and decoding audibly, though for more advanced DSP it could definitely crop up. There's an entire field of mathematics (numerical analysis) devoted to studying computation error.
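As a toy illustration of that kind of error (nothing to do with any specific codec, just repeated floating-point additions of the sort a long filter or mixing loop performs):

```python
# Toy illustration of accumulated rounding error, the kind of thing numerical analysis studies.
import math

samples = [0.1] * 1_000_000          # stand-in for a long stream of sample values

naive = 0.0
for x in samples:
    naive += x                        # every addition rounds to the nearest representable float

accurate = math.fsum(samples)         # compensated summation keeps track of the lost low-order bits

print(naive)                          # drifts away from 100000.0 in the trailing digits
print(accurate)                       # 100000.0
```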
 
Apr 18, 2021 at 5:40 PM Post #10 of 12
Thank you for your reply. No matter how hard I listen, I can't tell the difference between codecs. I don't disagree that some people may be able to; my point is, has one determined that one can? If one can, then get the 'best'. If not...
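(As an aside, "has one determined that one can?" has a fairly simple answer: run a blind A/B comparison enough times and check whether the score beats guessing. A rough sketch of that check, where the trial counts are just example numbers:)

```python
# Minimal sketch of scoring a blind A/B (ABX-style) listening test.
# 'correct' and 'trials' are example numbers; plug in your own results.
from math import comb

def p_value_guessing(correct: int, trials: int) -> float:
    """Probability of getting at least this many right by pure 50/50 guessing."""
    return sum(comb(trials, k) for k in range(correct, trials + 1)) / 2 ** trials

print(p_value_guessing(13, 16))  # ~0.011 -- unlikely to be luck, so a real audible difference
print(p_value_guessing(9, 16))   # ~0.40  -- entirely consistent with guessing
```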
 
Apr 19, 2021 at 3:34 AM Post #11 of 12
So you're saying that, theoretically, it's better to use FLAC files, and I should keep using them unless I'm tight on space, in which case I should use AAC, right?
 
Apr 19, 2021 at 10:12 AM Post #12 of 12
So you're saying that, theoretically, it's better to use FLAC files, and I should keep using them unless I'm tight on space, in which case I should use AAC, right?
Theoretically FLAC is better. You should keep the FLAC files somewhere regardless (if you're tight on space, get an external HDD and store them there). But don't delete them, as you can't recreate them from a lossy copy.
 
