iPhone AAC vs. aptX and aptX-HD real world
Feb 18, 2018 at 2:37 PM Post #46 of 315
Again one listener's opinion won't cut it one way or another.

It will certainly answer the question conclusively for the person doing the test!

This is the thing I don't understand about people who claim to be scientific... They cite published papers in internet forums, then they complain that the papers don't exactly fit their situation. Other people offer real world experiences based on their own testing, and they wave them away saying "anecdotal". If you really want to know, just do the test yourself! What kind of scientist doesn't do any testing and relies only on other people's published papers? What could be more applicable to your situation and use than doing a controlled listening test yourself on your own equipment using your own recordings? I tested AAC and determined the precise point of transparency. I did the same for Fraunhofer and LAME MP3. I do comparison tests with every piece of equipment I buy. I'm not taking anecdotal impressions at face value. I'm not even relying on published papers. I found out for myself. It isn't a matter of rhetoric for me. I know.

That would require extensive DBTs to determine it objectively.

No, it would take you sitting down and finding out for yourself. Get going buster. If you don't care one way or the other and you don't know the answer, why are you posting on the topic? Your last three posts crossed over from actual content to rhetorical obfuscation. Dotting every i and crossing every t doesn't help people understand how audio works. It can actually make it even more difficult to understand. Is aptX a lossy format? Is it audibly transparent? Answer those questions with some degree of knowledge and I'll be happy with your answer. I'm not going to require peer review to determine something that everyone can determine at home with their own stereo. I don't have aptX, so I can't. But I'm interested in hearing from people who have heard artifacting.

I don't mean to be laying into you. I apologize if I come off that way. It's just that we all seem to lose sight of the reason we're here. It isn't to put on a lab coat and demand academic perfection. It's to use scientific principles to make our home stereos sound better. Helpfulness is a virtue.
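The controlled self-testing advocated above is easy to score at home. Below is a minimal Python sketch of the bookkeeping side of a blind ABX-style trial; the function names are hypothetical, and actual audio playback is left to real tools (foobar2000's ABX comparator, for example), so this only models the hidden answer key and the scoring.

```python
import random

def abx_session(trials=16):
    """Build the hidden answer key for a blind session: for each trial,
    X is randomly assigned to be source A or source B."""
    return [random.choice(["A", "B"]) for _ in range(trials)]

def score(key, answers):
    """Count correct identifications; scores near 50% are consistent
    with guessing, i.e. no audible difference was detected."""
    return sum(k == a for k, a in zip(key, answers))

key = abx_session(16)
blind_guesses = [random.choice(["A", "B"]) for _ in key]  # a listener guessing
print(f"{score(key, blind_guesses)}/{len(key)} correct")
```

The point of the hidden key is that neither the listener nor the scorekeeper knows which codec is playing until the session is over, which is what removes sighted bias from the comparison.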
 
Feb 18, 2018 at 3:13 PM Post #47 of 315
It will certainly answer the question conclusively for the person doing the test!

This is the thing I don't understand about people who claim to be scientific... They cite published papers in internet forums, then they complain that the papers don't exactly fit their situation. Other people offer real world experiences based on their own testing, and they wave them away saying "anecdotal". If you really want to know, just do the test yourself! What kind of scientist doesn't do any testing and relies only on other people's published papers? What could be more applicable to your situation and use than doing a controlled listening test yourself on your own equipment using your own recordings? I tested AAC and determined the precise point of transparency. I did the same for Fraunhofer and LAME MP3. I do comparison tests with every piece of equipment I buy. I'm not taking anecdotal impressions at face value. I'm not even relying on published papers. I found out for myself. It isn't a matter of rhetoric for me. I know.

And what gave you the idea that I haven't done my objective listening evaluation on aptX? I simply don't confuse that with objective proof of anything. Sure, I have my opinions too.

No, it would take you sitting down and finding out for yourself. Get going buster. If you don't care one way or the other and you don't know the answer, why are you posting on the topic? Your last three posts crossed over from actual content to rhetorical obfuscation. Dotting every i and crossing every t doesn't help people understand how audio works. It can actually make it even more difficult to understand. Is aptX a lossy format? Is it audibly transparent? Answer those questions with some degree of knowledge and I'll be happy with your answer. I'm not going to require peer review to determine something that everyone can determine at home with their own stereo. I don't have aptX, so I can't. But I'm interested in hearing from people who have heard artifacting.
No, I haven't heard any artifacts at all. However, as mentioned, aptX comes in one bit rate only, and the codec is not comparable to others because it does data reduction in a different manner than the typical Dolby/AAC/MP3 codecs, so its rate is not really relevant; hence my original posts.
I don't mean to be laying into you. I apologize if I come off that way. It's just that we all seem to lose sight of the reason we're here. It isn't to put on a lab coat and demand academic perfection. It's to use scientific principles to make our home stereos sound better. Helpfulness is a virtue.

No lab coat needed, and this isn't about helping or not; it's about making statements that a poster simply can't back up with facts. Simple as that.
 
Feb 18, 2018 at 3:18 PM Post #48 of 315
Testing both codecs with the same file on the same Bluetooth receiver is sufficiently accurate, given the degree of audible difference between the codecs. Just like you don’t need the sensors from the Large Hadron Collider to detect something as simple as light in a dark room. Even if the receiver introduced the artefact, it’s an aptX failure because aptX is a hardware codec.

Anything more is just mental masturbation.
your thesis sounds more PE... or worse, ED. Mentally speaking, of course... :D
 
Feb 18, 2018 at 5:01 PM Post #49 of 315
We don't know what we don't know. Sure, it's easy to think that all is fine and that any time I press a setting, I get what I ordered. Except that we're talking BT here:
- To get a special codec, both the source and the receiver need to support it. Any issue and the signal is likely to revert to a more compressed version, or just to good old SBC, since that format is known to always be available. Most headphones won't tell us anything about the format and resolution they're receiving, and many sources will still let us click on something incompatible and pretend that all is well. So unless you know the specs and behavior of your devices very well, a little caution seems like a good idea before drawing big fat conclusions about codecs from an experience as limited as a sighted test.
- The initial format might affect the final format and resolution picked for streaming, so maybe some audible artifacts wouldn't occur if the original file were in a different format/resolution. Let's say I have low-bitrate MP3 files and pick aptX in my settings: is my signal always converted because I picked aptX and I can? Is the device "clever" enough to realize that it's a waste of processing and probably of fidelity? Even if one device is known to behave a given way, do we know that all devices will? In this instance, if the initial files are in AAC and both devices handle AAC at that sample rate, AAC is obviously the best logical option, as it won't require any conversion at all. Even if a better codec were available, it's not like it would magically add fidelity to the original lossy file. So the very example given was biased from the get-go while claiming to be objective evidence of AAC's superiority.


And that's just stuff off the top of my head; I'm far from being a BT expert. There are probably other playful events on some gear, like sub-par connectivity, different generations of BT, a fixed sample rate for most formats (usually 44.1 or, more often, 48 kHz), and stuff I don't know about that would result in otherwise apparently identical settings being handled differently. So the good old idea that if I only change the codec setting on a particular piece of gear, then I'm objectively testing only the codecs' variations: well, IMO that's a little naive when it comes to testing BT on one or two combos.
But most of all, I wonder why I end up having to make three posts to explain that an anecdotal sighted test is not supposed to directly yield a global objective claim. Isn't it obvious?
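The silent-fallback behavior described above can be pictured as a codec negotiation. This is only an illustrative sketch, not actual A2DP signalling: the preference ranking is an assumption made up for the example (the thread itself disputes which codec is "better"), and the one part the A2DP spec does guarantee is that SBC is the mandatory baseline every device supports.

```python
# Illustrative ranking only; real sources use their own vendor-defined order.
PREFERENCE = ["LDAC", "aptX HD", "aptX", "AAC", "SBC"]

def negotiate(source_codecs, sink_codecs):
    """Pick the highest-ranked codec both ends advertise, falling back
    to SBC, which A2DP makes mandatory on every device."""
    common = set(source_codecs) & set(sink_codecs)
    for codec in PREFERENCE:
        if codec in common:
            return codec
    return "SBC"  # mandatory baseline, always available

# A phone set to aptX still ends up elsewhere if the headphone lacks aptX:
print(negotiate(["aptX", "AAC", "SBC"], ["AAC", "SBC"]))  # -> AAC
print(negotiate(["aptX", "SBC"], ["SBC"]))                # -> SBC
```

This is exactly why a sighted "I selected aptX in the menu" test proves little: the link may have quietly negotiated something else.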
 
Feb 18, 2018 at 6:33 PM Post #50 of 315
And what gave you the idea that I haven't done my objective listening evaluation on aptX? I simply don't confuse that with objective proof of anything.

The only proof I need is "can I tell a difference?" If I can arrive at an answer objectively, then for the purposes of my stereo and my ears, my job is done. If someone else says they hear artifacts in a careful listening test, I'll tend to believe them. You can feel free to organize a test with a broad sample of test subjects and prove it universally for yourself if you'd like though. Let me know how it comes out. I'll probably tend to believe you too.

sounds more PE... or worse, ED. Mentally speaking, of course... :D

Physical Education? Erectile Dysfunction?
 
Feb 19, 2018 at 9:17 AM Post #51 of 315
We don't know what we don't know. Sure, it's easy to think that all is fine and that any time I press a setting, I get what I ordered. Except that we're talking BT here:
- To get a special codec, both the source and the receiver need to support it. Any issue and the signal is likely to revert to a more compressed version, or just to good old SBC, since that format is known to always be available. Most headphones won't tell us anything about the format and resolution they're receiving, and many sources will still let us click on something incompatible and pretend that all is well. So unless you know the specs and behavior of your devices very well, a little caution seems like a good idea before drawing big fat conclusions about codecs from an experience as limited as a sighted test.
- The initial format might affect the final format and resolution picked for streaming, so maybe some audible artifacts wouldn't occur if the original file were in a different format/resolution. Let's say I have low-bitrate MP3 files and pick aptX in my settings: is my signal always converted because I picked aptX and I can? Is the device "clever" enough to realize that it's a waste of processing and probably of fidelity? Even if one device is known to behave a given way, do we know that all devices will? In this instance, if the initial files are in AAC and both devices handle AAC at that sample rate, AAC is obviously the best logical option, as it won't require any conversion at all. Even if a better codec were available, it's not like it would magically add fidelity to the original lossy file. So the very example given was biased from the get-go while claiming to be objective evidence of AAC's superiority.
You can see the codec being used on a MacBook, which is where I noticed the aptX artefacts. You can force aptX, AAC or SBC and adjust the bitrate of AAC. The global system audio is recompressed into AAC / aptX - it does not attempt to bitstream the original file if it's already AAC. Fortunately AAC 256 has been proven to be transparent even after being transcoded 100 times.
 
Feb 28, 2018 at 6:38 AM Post #53 of 315
This is a topic that continues to occupy me, as on an iPhone I have this feeling that you are capped at 256 kbps, so the likes of Tidal or anything higher is just pointless?

It could just be placebo, but I have found Google Play Music on my iPhone to be inferior to Apple Music, while on my Pixel 2 XL it IMO sounded slightly better (using Sony WH-1000XM2s and LDAC). Is it a case that until Apple improves their adopted BT standard it will always be a bottleneck? Mind you, as long as Apple Music is 256 AAC they probably do not care!

So for now, the best course for iPhone users who want to use Bluetooth would seem to be sticking to an AAC service, so either Apple Music, Amazon, or Tidal Premium?
 
Feb 28, 2018 at 6:50 AM Post #54 of 315
This is a topic that continues to occupy me, as on an iPhone I have this feeling that you are capped at 256 kbps, so the likes of Tidal or anything higher is just pointless?

It could just be placebo, but I have found Google Play Music on my iPhone to be inferior to Apple Music, while on my Pixel 2 XL it IMO sounded slightly better (using Sony WH-1000XM2s and LDAC). Is it a case that until Apple improves their adopted BT standard it will always be a bottleneck? Mind you, as long as Apple Music is 256 AAC they probably do not care!

So for now, the best course for iPhone users who want to use Bluetooth would seem to be sticking to an AAC service, so either Apple Music, Amazon, or Tidal Premium?
The phone always re-transcodes the sound to AAC / aptX / LDAC, so it's potentially altered right at the source. Headphones like the WH-1000XM2 also alter the sound with their ultrasonic upsampling gimmicks. The WH-1000XM2 is a bad-sounding headphone that alters some frequencies audibly (listen to the iPhone keyboard clicks), so pursuing high-fidelity audio is pointless when you're listening on such a device; the sound has already been degraded far more than the minute differences between audio codecs.
 
Feb 28, 2018 at 9:58 AM Post #55 of 315
Was wondering that, as in theory the bitrates for AAC over BT (250) and Apple AAC files (256) do not match, so there must be some transcoding going on somewhere!

In theory then the iPhone is currently significantly limited for bluetooth audio vs. android. Again the question as to whether you can actually hear this difference is valid but I'd say on really good systems you probably can.
 
Mar 1, 2018 at 7:09 AM Post #56 of 315
Was wondering that, as in theory the bitrates for AAC over BT (250) and Apple AAC files (256) do not match, so there must be some transcoding going on somewhere!

In theory then the iPhone is currently significantly limited for bluetooth audio vs. android. Again the question as to whether you can actually hear this difference is valid but I'd say on really good systems you probably can.
Android's entire audio stack is inferior (latency, resampling). The presence of alternative codecs like aptX (inferior to AAC) or LDAC (gimmick inaudible to humans) does not make it better.
 
Mar 1, 2018 at 8:31 AM Post #57 of 315
Android's entire audio stack is inferior (latency, resampling). The presence of alternative codecs like aptX (inferior to AAC) or LDAC (gimmick inaudible to humans) does not make it better.

This is interesting, but is this really the case? The amount of conflicting info on this subject is quite baffling, and if true it totally undermines aptX and LDAC, with a lot of reviews still giving Android and the increased bitrates of aptX-HD and LDAC the advantage over AAC on an iDevice.

If what you say is the case, then any service north of 256 kbps is overkill for Bluetooth, and Apple Music is as good as it currently gets?
 
Mar 1, 2018 at 8:59 AM Post #58 of 315
At this point, I don't know what I have to do.
Android's entire audio stack is inferior (latency, resampling)....
Evidence of that? Please don't come back with another one-shot subjective anecdote for such a broad claim, or I'm going to get mad.
As far as I know, the lowest latency available for audio comes with one of the aptX modes. If you have evidence to the contrary, please share it with us. As for resampling, I just don't know what you're talking about.

... The presence of alternative codecs like aptX (inferior to AAC)...
Prove that it's inferior or stop making that sort of claim.
aptX has a superior max sample rate, and that at least is a fact, so it can come closer to lossless if that's the aim for the user. It might not be necessary, but it certainly doesn't define "inferior" in my book.


... or LDAC (gimmick inaudible to humans) does not make it better.
If we're going that way, all compression codecs aim to be gimmicks inaudible to humans. Where is this coming from? LDAC isn't Apple, so it has to be a gimmick? LDAC also offers a higher bitrate than AAC, for those who care.
 
Mar 1, 2018 at 9:43 AM Post #59 of 315
At this point, I don't know what I have to do.

Evidence of that? Please don't come back with another one-shot subjective anecdote for such a broad claim, or I'm going to get mad.
As far as I know, the lowest latency available for audio comes with one of the aptX modes. If you have evidence to the contrary, please share it with us. As for resampling, I just don't know what you're talking about.


Prove that it's inferior or stop making that sort of claim.
aptX has a superior max sample rate, and that at least is a fact, so it can come closer to lossless if that's the aim for the user. It might not be necessary, but it certainly doesn't define "inferior" in my book.



If we're going that way, all compression codecs aim to be gimmicks inaudible to humans. Where is this coming from? LDAC isn't Apple, so it has to be a gimmick? LDAC also offers a higher bitrate than AAC, for those who care.

http://superpowered.com/android-audio-latency-problem-just-got-worse
There is significant variance across devices, but in general the latency even to the wired headphone jack on Android is much higher than on Windows / macOS / iOS. This latency occurs before the audio even reaches the Bluetooth stack, and is only compounded by the latency of the AAC / aptX codec. It used to be much worse a few years ago (~200 ms just to the wired headphone jack) but has improved significantly. It's still unacceptably high at over ~50 ms for wired headphones.

https://www.rtings.com/headphones/tests/active-features/latency
I prefer aptX Low Latency in general, since it only adds ~40 ms, which in apps and games far outweighs the minor quality improvement of AAC, which has ~200 ms of latency. Plain aptX is also ~180 ms on most devices, so AAC is preferable if the device does not support aptX Low Latency.
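Using the rough per-codec figures above, a quick budget shows why the codec choice dominates for video and games. The `OS_STACK_MS` value and the ~100 ms lip-sync threshold are assumptions for illustration; actual pre-Bluetooth output latency varies widely between devices.

```python
# Rough end-to-end latency budget using the approximate figures quoted above.
OS_STACK_MS = 20  # assumed placeholder; can exceed 50 ms on some Android devices
CODEC_MS = {"aptX Low Latency": 40, "aptX": 180, "AAC": 200}

for codec, added in CODEC_MS.items():
    total = OS_STACK_MS + added
    # ~100 ms is a commonly cited rough threshold for noticeable lip-sync error
    flag = "OK for video/games" if total <= 100 else "noticeable lip-sync lag"
    print(f"{codec}: ~{total} ms total ({flag})")
```

For music-only listening, of course, none of these totals matter; latency only becomes an issue when the audio has to line up with something on screen.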

Resampling is required whenever the audio output is in shared mode (it usually is, so that you can hear notifications and sounds from other apps). If the hardware runs at 48000 Hz / 24-bit (most devices do), music stored at 44100 Hz must be resampled in order to be mixed into the audio output. This can be done either by the app or automatically by the OS.
Most decoders decode AAC / MP3 to 16-bit rather than 24-bit (lossy compression does not have a bit depth; that's a PCM thing). Converting between these is not as simple as padding / truncating zeroes: it needs to be anti-aliased / dithered, again either by the app or by the OS. Both are black boxes of unknown quality.
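The two steps described here (sample-rate conversion, then re-quantization with dither) can be sketched as follows. This is a deliberately naive illustration, not what any real OS mixer does: production resamplers use polyphase or windowed-sinc filters with proper anti-aliasing rather than linear interpolation.

```python
import math
import random

def resample_linear(samples, src_rate=44100, dst_rate=48000):
    """Naive linear-interpolation resampler; illustrates only the
    44.1 kHz -> 48 kHz rate conversion itself."""
    ratio = src_rate / dst_rate
    out_len = int(len(samples) * dst_rate / src_rate)
    out = []
    for i in range(out_len):
        pos = i * ratio
        j = int(pos)
        frac = pos - j
        a = samples[j]
        b = samples[min(j + 1, len(samples) - 1)]
        out.append(a + (b - a) * frac)
    return out

def quantize_tpdf(samples):
    """Interpolation produces fractional values, so a re-quantization
    step is needed; TPDF dither decorrelates the rounding error."""
    return [round(s + random.random() - random.random()) for s in samples]

# 10 ms of a 1 kHz tone at 44.1 kHz, 16-bit scale
tone = [20000 * math.sin(2 * math.pi * 1000 * n / 44100) for n in range(441)]
resampled = quantize_tpdf(resample_linear(tone))
print(len(tone), "->", len(resampled))  # 441 -> 480 samples
```

The quality difference between implementations lives entirely in how the interpolation filter and dither are done, which is exactly the "black box" concern raised above.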

aptX, aptX Low Latency, AAC, MP3, etc. are legitimate innovations. aptX HD, LDAC, DSD, MQA etc. are gimmicks - they have no benefit for music listening, and studios don't archive masters in these formats either. They were invented for the sole purpose of milking audiophiles.
 
Mar 2, 2018 at 11:35 AM Post #60 of 315
Is there any information on how the masking works with the aptX encoder? Is there no single demo file available that has been encoded by Qualcomm (and maybe reconverted back to WAV)? I would really love to do a very critical ABX test. I know how MP3 works and why AAC is more efficient (removes more data in higher freqs and even cuts off completely at 18 kHz, storing a lot more resolution than MP3 up to 4 kHz), but I know nothing about aptX.

I have a very hard time distinguishing the AAC from the CD (or even HD, for that matter). BTW, I use True VBR settings at q110. That is a tip I found years ago on Hydrogenaud.io, and it is supposed to be slightly superior to "iTunes Plus", which uses a different VBR method; both average around 256 kbps.

Now regarding Bluetooth, I have some unanswered questions.
1. Between two AAC-enabled devices (for example, an iPhone and Bose headphones), is the transfer bit-perfect? Or is the AAC re-encoded and trimmed yet again? If it usually isn't re-encoded, could it be that my files, whose settings don't perfectly match the iTunes standard (though the same codec), still need to be re-encoded, whereas Apple Music files would not?
2. Is the bitrate fixed? I don't think so. I am almost sure the stream might drop to 128 kbps or similar in some cases when the connection is not the best. Sometimes I believe I hear artifacts, but sometimes it also cuts the sound out completely. This question concerns all codecs (SBC, MP3, AAC, ...).
3. How come MP3 is rarely supported? It is possible to use it as a codec for Bluetooth. At the very least it should be better than SBC/MP2/MUSICAM.
4. How can I tell which connection my iPhone uses? My iMac 2017 prefers SBC; I had to download a developer app to force it to use aptX (with a Hugo 2) instead. I noticed how the bass in a classical recording sounded off, so I started to investigate. In this particular case, switching to aptX actually seemed beneficial. Anyway, how can I tell whether my Bose headphones use SBC or AAC?

So I think it is best to only use lossless audio (well, if you have the space on your device). That way the audio is encoded only once, for transmission, and you prevent further generational loss. For example, Spotify is always re-encoded, because OGG Vorbis is not a Bluetooth codec.
I hope future devices will be able to use ALAC or FLAC over Bluetooth. If the bandwidth is there for LDAC, it should be there for ALAC too.

aptX HD, LDAC, DSD, MQA etc. are gimmicks - they have no benefit for music listening
Fully agree. I did several blind tests comparing HD with 16-bit and never succeeded. No controlled blind test has ever managed to prove superior sound from HD material unless the volume was raised so high that quantization errors became audible in a silent track. (I think one study used an acoustically treated room to push the ambient noise down to 19 dB, and they still had to use painful volume levels to pick out the CD; with normal music, 49% of 500 people guessed correctly, which is exactly a coin flip, and only one contestant managed 8/10 correct guesses, which is still below the significance threshold.)
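The 8/10 figure can be checked with a one-sided binomial test: under pure guessing, 8 or more correct out of 10 happens about 5.5% of the time, just above the usual 5% cutoff, so it indeed falls short of significance.

```python
from math import comb

def p_at_least(k, n, p=0.5):
    """One-sided binomial tail: probability of k or more successes in
    n trials under chance-level guessing."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

print(round(p_at_least(8, 10), 4))   # 0.0547: 8/10 is not significant at 5%
print(round(p_at_least(9, 10), 4))   # 0.0107: 9/10 would have been
```

This is also why longer sessions matter: with 16 trials instead of 10, the same hit rate would clear the threshold, so a serious test uses as many trials as the listener can tolerate.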
Of course, there will always be people who claim they can tell the difference. That happened on Archimago's blog too, but ironically, those who claimed to hear the difference (the same group that used equipment above $1,000 and even over $5,000) actually mistook the 16-bit file for the 24-bit file. I can't remember the exact number (I think over 60%), which makes it even worse than guessing by chance. Wow, talk about correlation.
Luckily, this is a good way to filter out bad content: if a review tries to sell me on how a headphone opens up with HD files, I know I've found a bullshitter and stop reading immediately.
 
