iPhone AAC vs. aptX and aptX HD real world
Jun 17, 2018 at 11:26 AM Post #121 of 315
Look, I really do not want this plague of a discussion to ruin yet another topic. Please take it to a topic about human hearing, bit depth, dynamic range, or whatever it is you're trying to say. We can probably find a dozen of those topics still open on Head-Fi, so pick one if the subject really interests you, and maybe link your answer in here so that the few who care can go there and follow up on the discussion.
If you find a BT headphone with "110 dB in precision" (whatever that means) at your usual listening level, maybe I'll reconsider all this as being relevant to this topic. Then I'll probably go and buy that headphone. ^_^

Having fun yet? I generally disagree with not being allowed to flesh out answers with relevant info from surrounding fields; I believe a brief discussion about dynamic range was relevant to understanding precision when comparing different audio codecs.

If you are genuinely interested in quality Bluetooth headphones, consider modularising... Have a look at the Radsone Ear Studio ES100, a portable unit that you can plug unbalanced or balanced headphones into, with better-than-average sound quality. It uses two of these internally:

https://www.akm.com/akm/en/file/datasheet/AK4375AECB.pdf

A straight-out S/N ratio of 110 dB, and low distortion too.

It has support for AAC if you're on Apple, or aptX HD if you have a device that supports transmitting it. Plain aptX too, but I personally wouldn't use that unless I had to.

It sells itself so I don't have to, and the designer/producer/manufacturer is active on Head-Fi with respect to firmware updates etc. @wslee
 
Jun 17, 2018 at 11:51 AM Post #122 of 315
Welcome to Head-Fi Sound Science, and thanks for joining us.

Opus is pretty cool. You can grab the encoder pack for Foobar2000 and play around with it there if you are interested. My understanding, and I am no expert, is that it is state of the art, even beyond Apple's AAC. However, for some mainstream uses, it has major compatibility issues.

https://opus-codec.org/

I'm just going to leave this quote from Wikipedia about dynamic range:

"In 1981, researchers at Ampex determined that a dynamic range of 118 dB on a dithered digital audio stream was necessary for subjective noise-free playback of music in quiet listening environments."

That number concurs well with my understanding, despite coming from 1981.
There does seem to be some conflict about it in general, I guess because the terms need to be clearly defined regarding exactly what's being measured.
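To relate that Ampex figure to bit depth, the standard rule of thumb for an ideal N-bit quantizer is a dynamic range of about 6.02·N + 1.76 dB (sine-wave SNR, before dither tricks). A quick sketch of the arithmetic, nothing more:

```python
import math

def dynamic_range_db(bits: int) -> float:
    """Theoretical dynamic range of an ideal N-bit quantizer (sine-wave SNR)."""
    return 6.02 * bits + 1.76

def bits_needed(target_db: float) -> int:
    """Smallest bit depth whose theoretical dynamic range meets target_db."""
    return math.ceil((target_db - 1.76) / 6.02)

print(dynamic_range_db(16))  # ~98.1 dB for 16-bit CD audio
print(bits_needed(118))      # 20 bits to cover the 118 dB Ampex figure
```

So by this rule of thumb, the quoted 118 dB target implies roughly 20-bit precision, which is why the conclusions people draw from it depend so heavily on how "dynamic range" is being measured.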
 
Jun 17, 2018 at 12:30 PM Post #123 of 315
Having fun yet? I generally disagree with not being allowed to flesh out answers with relevant info from surrounding fields…
To contest some of your points requires specific examples, or really deep digging through the available research, if we wish to get anywhere. Both of those will drag this thread into a long off-topic and inevitably into heated argument, because the limits of human hearing are something people love to fight over. It's not my first dance, which is why I simply suggested continuing this specific discussion (not stopping it!) in a proper thread. I'm not against having that conversation at all; I would just like this topic to survive it.
Now, if this is going to turn into "we should go somewhere else" vs. "we shouldn't" for 5 pages, it really defeats the purpose of me trying to limit the off-topic. :sweat_smile:

a chipset's spec is not the fidelity of a device. I truly wish it was.
 
Jun 17, 2018 at 12:50 PM Post #124 of 315
to contest some of your points requires specific examples or really deep digging about the available research if we wish to get anywhere…

a chipset's spec is not the fidelity of a device. I truly wish it was.

I defended my position regarding being on topic, and I stand by it: codecs in the real world.

A chipset's spec is not the fidelity of the device... I couldn't agree more, but it is where the chain starts, unless it is a resistor ladder or something else with no integrated circuits.
 
Jun 17, 2018 at 9:25 PM Post #125 of 315
If this is off-topic please let me know. I don't even know enough to know what is off-topic, but I am very curious about all of this. I am afraid I am going to have to drag this down to my level because that is about where I am on the learning curve.

I have Bluetooth 4.2 in my computer via an M.2 card. Is there anything special about that compared to prior versions of Bluetooth? It seems to give me more reliable connections than I have had before, and to remember my preferences better than earlier versions. I don't know if that's due to advances in Bluetooth or Windows itself, or some combination.

I could buy a Bluetooth 5.0 M.2 card for the slot, to put in place of the 4.2, for not very much money at all (they are both WiFi-plus-Bluetooth M.2 cards). Would this be of practical benefit?

My M.2 WiFi + Bluetooth card is an Intel card and I regularly get updates for it right off of the Intel site, so I feel pretty good about that.

I truly don't understand how AAC versus aptX versus aptX HD comes into play with what I am doing, or if there is anything I can do at my computer to choose the best technology. I suppose I am as interested in reliability of connectivity as sound quality. I'm pretty happy with the sound quality, and I'm really not too picky about it. Things that are obviously wrong, I take care of, but if everything is running well I am pretty pleased.

FWIW, I turn off my graphics card for anything but photography because my CPU can handle anything else that comes its way with aplomb. I also use SSD drives to the extent it is practical. Those steps greatly reduce the background noise from my computer to almost nil, so I can enjoy the music more, in a very practical way.

For audio from my computer I mainly use Bluetooth to listen to two Bluetooth speakers, one or the other. I've got a really rare top-flight Samsung speaker that I bought at Best Buy at a fraction of the original price, and a Marshall Bluetooth speaker that is less hi-fi but more convenient, which I got for a big discount at a holiday sale. (Yes, I am kind of a deal-hunter at times.)

The Samsung will also do AirPlay, WiFi, Ethernet, and line-in, and take a USB stick, but I'm not using those options from day to day. Those are probably technically better hi-fi options than Bluetooth, I would guess. Here is Samsung's product information page for it: https://www.samsung.com/uk/audio-video/audio-dock-e750/ I think it was more of a concept piece for Samsung that never took off; I saw it sitting on a shelf at Best Buy a few years ago for about a third or a fourth of the original price, so I went home, researched it, came back, and snapped it up.

It's sitting right in front of me about five feet away. It's quite a nice visual showpiece as well as a pretty darn good speaker, but nothing close to a nice home stereo setup. I know, the tube on the Samsung is a bit much, but honestly the look is cool, it does give off a glow and a warmth (visually and temperature-wise, not audibly), and the sound is quite fulfilling to me. I am thinking the version of Bluetooth it connects with might be the limiting factor, though, as the speaker is some years old, but I don't have any kind of grasp of what takes place in the audio chain where Bluetooth is involved.

The Marshall is about five feet behind me: https://www.amazon.com/Marshall-Sta...099&sr=1-3&keywords=marshall+stanmore+speaker I don't think it comes as close to hi-fi as the Samsung, but for ease of use, and thanks to the bass and treble knobs, it really hits a sweet spot for me subjectively; having those knobs readily accessible is really nice. It also has two line-ins and an optical in if I want to max out audio quality. I've experimented over time and gradually arrived at bass and treble settings that I enjoy the most, don't fiddle with them much anymore, and almost always use the Bluetooth.

I don't think either speaker approaches my home stereo sound, but for my home stereo I generally use Apple music with Apple TV.

I am just looking for practical words of wisdom and some learning about what I am doing and what I could do better, and whether bluetooth 5.0 would give me some practical benefit over bluetooth 4.2. And also, if it's not too much over my head (and it may be), a good general picture (for a layman) of what goes on in the chain from computer to speaker over bluetooth, especially as it relates to audio CODECs. It's a very interesting subject for me. I find the whole technology kind of amazing.

Thanks, everyone.
 
Jun 18, 2018 at 12:23 AM Post #126 of 315
If this is off-topic please let me know. I don't even know enough to know what is off-topic, but I am very curious about all of this…

I would say your setup with Bluetooth 4.2 is fine. I read that Bluetooth 5.0 has twice the data rate and four times the range of 4.2, but I would take the range increase with a grain of salt; I think that depends on the implementation. I have seen Bluetooth devices that claim to use "Class 1 Bluetooth", which just boosts the transmit power as far as I can tell. Like this home audio device:

https://www.aliexpress.com/item/Ava...tter-and-Receiver-Bypass-and/32861590670.html

I nearly pulled the trigger on that one, because it has a digital bypass for your TV: your TV audio keeps working as normal, then when you choose some music from your phone or computer to play over Bluetooth, it switches over automatically.

In terms of aptX HD (finally relevant to this thread), I think it can run over either 4.2 or 5.0, but it needs to be supported on both ends to be used at all, so probably not from your computer without a lot of fiddling; there are standalone transmitters, as well as some phones, that can act as an aptX HD source.

If you aren't too worried about achieving a high degree of sound fidelity, then stick with what you have; the M.2 Bluetooth 4.2 card is no doubt quite useful, and upgrading would only mean buying more components to match.
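For a rough sense of the numbers involved, both aptX and aptX HD are commonly described as fixed roughly-4:1 compression of the PCM stream, which is where the published figures of about 352 kbps (16-bit/44.1 kHz stereo) and 576 kbps (24-bit/48 kHz stereo) come from. A back-of-the-envelope sketch:

```python
def codec_bitrate_kbps(sample_rate_hz: int, bits_per_sample: int,
                       channels: int, compression_ratio: float) -> float:
    """Compressed bitrate for a fixed-ratio codec like aptX (ADPCM-based)."""
    raw_bps = sample_rate_hz * bits_per_sample * channels
    return raw_bps / compression_ratio / 1000

# aptX: 44.1 kHz, 16-bit stereo, 4:1 compression
print(codec_bitrate_kbps(44_100, 16, 2, 4))  # 352.8 kbps
# aptX HD: 48 kHz, 24-bit stereo, 4:1 compression
print(codec_bitrate_kbps(48_000, 24, 2, 4))  # 576.0 kbps
```

The point being that aptX HD's extra bitrate comes straight from the larger input word length and sample rate, not from a different compression ratio.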
 
Jun 18, 2018 at 12:26 AM Post #127 of 315
Thanks. :)

I would say your setup with Bluetooth 4.2 is fine, I read that Bluetooth 5.0 has twice the data rate with 4 times the range over 4.2…
 
Jun 18, 2018 at 3:11 AM Post #128 of 315
I think with Opus the only potential downside is the resampling, as it runs at 48 kHz, not 44.1 kHz. I am honestly not sure, though, how much of an issue this may be.
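To make the resampling point concrete: 44.1 kHz material has to be converted to 48 kHz before Opus encodes it, and since 48000/44100 reduces to 160/147, every 147 input samples become 160 output samples. Real resamplers use proper polyphase filtering; the linear interpolation below is only a toy sketch of the sample-count relationship:

```python
def resample_linear(samples, src_rate, dst_rate):
    """Toy resampler using linear interpolation (real codecs use proper filters)."""
    n_out = int(len(samples) * dst_rate / src_rate)
    out = []
    for i in range(n_out):
        # Position of output sample i on the input timeline
        pos = i * src_rate / dst_rate
        j = int(pos)
        frac = pos - j
        nxt = samples[min(j + 1, len(samples) - 1)]
        out.append(samples[j] * (1 - frac) + nxt * frac)
    return out

one_second = [0.0] * 44_100  # one second of 44.1 kHz silence
print(len(resample_linear(one_second, 44_100, 48_000)))  # 48000
```

Whether this conversion is audible is a separate question; done with a good filter, the resampling itself is generally considered benign compared to the lossy encoding that follows it.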

Back to the codec debate, I am still unsure how the combination of file codecs vs. Bluetooth codecs stacks up in the real world. E.g., on an iPhone that only has AAC over Bluetooth, is there any advantage to using an AAC-based service such as Apple Music, Amazon, or Tidal vs., say, Spotify? Many far more learned people than me say that AAC files are still transcoded over AAC Bluetooth anyway. My perception was that Spotify sounded worse over BT, but now I think this may just be down to inconsistencies with its master files rather than anything to do with Vorbis vs. AAC.

I am starting to think that there is no 'real world' advantage at all, but I also do not fully understand how, say, LDAC behaves; Android with LDAC may have the edge if your files are 320 kbps, as no transcoding down to 256 kbps is needed? As you can probably tell, I am not an expert! I am just hoping it will not be long before this argument can be put to bed.
 
Jun 18, 2018 at 4:03 AM Post #129 of 315
Apparently you can transcode AAC to AAC many times without much further degradation, so I suspect you'd get better overall results with Apple Music vs. Spotify, assuming the original quality before transcoding was equivalent. I haven't tested it myself, though.
 
Jun 18, 2018 at 5:47 AM Post #130 of 315
Apparently you can transcode AAC to AAC many times without further degradation so I suspect you’d get better overall results with Apple Music vs Spotify assuming the original quality before transcoding was equivalent. Not tested it myself though.

Somebody said that AAC was completely transparent after 100 recursive encodings. It isn't, but it is quite good at maintaining nearly all of the quality over repeated encodings, certainly compared to other encoders; there is a video on YouTube demonstrating just how bad other encoders can be.
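A loose analogy for why generational loss can level off rather than compound forever: if a lossy step is a deterministic quantizer, the first pass loses information, but every later pass reproduces its own output exactly. Real AAC is not exactly idempotent (the transform analysis and rate control shift slightly each pass), which is why quality still drifts slowly instead of staying fixed. A toy sketch of the idempotence idea only:

```python
def quantize(samples, step=0.1):
    """Deterministic lossy step: snap every sample to a coarse grid."""
    return [round(s / step) * step for s in samples]

signal = [0.013, 0.517, -0.288, 0.949]
gen1 = quantize(signal)  # first pass: information is lost
gen2 = quantize(gen1)    # second pass: nothing further changes
print(gen1 == gen2)      # True: the quantizer is idempotent
print(gen1 == signal)    # False: the loss happened in pass one
```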



There are minor differences between encoders even just for AAC. I used the Fraunhofer encoder (libfdk_aac) in FFmpeg, believed to be one of the best.

Having now been informed of Opus, which is claimed to be a better codec than AAC, I suspect it might well be a very similar story.

There is no way for any of these encoders to be certain that the PCM data they are presented with was already encoded once before, so over Bluetooth it will be encoded again. Maybe Apple could gain control over this with end-to-end Apple products, because they have a closed development system and could do a pass-through from the app to the decoder at the Bluetooth sink, but that would be restricted to the Apple ecosystem.

I had my time using AAC; now I'm moving forward, focusing on FLAC for storage and aptX HD for wireless until Bluetooth can be lossless.

I really don't understand why people would say aptX HD is just a gimmick; mathematically, it's not. I suspect they might be motivated by brand loyalty, and we all know Apple has very strong brand loyalty.
 
Jun 18, 2018 at 6:01 AM Post #131 of 315
The merits of aptX HD or LDAC may be up for debate, but what is for sure is that over Bluetooth on an Apple device you are capped at roughly 250 kbps AAC.

Whether this is really an issue in the real world is also up for debate. On paper, though, right now Android seems the better option for Bluetooth audio?
 
Jun 18, 2018 at 6:06 AM Post #132 of 315
The merits of aptx-hd or LDAC may be up for debate but what is for sure is that over Bluetooth on an Apple device you are capped at 250 kbps…

No one wants to experience the wrath of a Fanboi, Apple or Android!

Sometime in the future, I might have the equipment to analyse the results of comparing them purely digitally and statistically. That's probably the only way to settle this.
 
Jul 14, 2018 at 12:35 AM Post #133 of 315
These codecs [AAC/aptX] work differently from one another, so they can't be compared directly like that with blanket statements. aptX splits the audio into 4 sub-bands and applies data reduction to each independently; AAC works differently, much closer to (and basically an improved version of) MPEG audio and MP3. All this reminds me of the old DD/DTS debate, where people just made up a bunch of incorrect theories about why they preferred DTS over Dolby.

aptX is superior to AAC because it uses time-domain ADPCM instead of the perceptual encoding (based on psychoacoustic models) commonly used by MP3, AAC, and WMA. This makes aptX more efficient than other lossy codecs.

aptX sounds more like WavPack Hybrid Lossy, since the compression artifacts sound identical at low frequencies (mild distortion plus cassette-like hiss).

So if an AAC file has been transcoded into aptX, there will be minimal change to SQ, as aptX does not "shave off" very high or low frequencies like MP3 does; rather, it varies the size of the quantization step to further reduce the required data bandwidth for a given S/N ratio.

So a 16-bit AAC file will be transcoded by the BT device into 6- or 8-bit ADPCM, which is 99% indistinguishable from the original file, and that is what you are going to hear when using an aptX device.
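The "varies the size of the quantization step" idea can be sketched as a toy one-band ADPCM loop: code each sample as a few-bit number for the difference from a running prediction, then grow or shrink the step depending on how large the codes are. This is only an illustration of the principle, not the actual aptX algorithm (which works on four sub-bands with its own prediction and adaptation rules):

```python
import math

def adpcm_roundtrip(samples, step=0.05):
    """Toy 1-band ADPCM: small integer difference codes with an adaptive step."""
    decoded, predicted = [], 0.0
    for s in samples:
        diff = s - predicted
        code = max(-3, min(3, int(round(diff / step))))  # crude few-level code
        predicted += code * step                         # decoder's reconstruction
        # Adapt: saturated codes grow the step, small codes shrink it
        step = min(1.0, step * 1.5) if abs(code) >= 3 else max(1e-4, step * 0.9)
        decoded.append(predicted)
    return decoded

tone = [math.sin(2 * math.pi * 440 * n / 44_100) for n in range(200)]
out = adpcm_roundtrip(tone)
print(len(out) == len(tone))  # True: one code per input sample, lossy output
```

Note how the data rate is fixed (a few bits per sample, regardless of content); what adapts is the step size, which is exactly the trade the quoted post describes.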
 
Jul 14, 2018 at 5:34 AM Post #134 of 315
Except that aptX audibly degrades the sound in certain frequencies and AAC is transparent.
aptX is superior to AAC because it uses time domain ADPCM instead of perceptual encoding…
 
