iPhone AAC vs. aptX and aptX HD in the real world
May 16, 2018 at 3:11 PM Post #76 of 315
OK, I made an interesting discovery. While working on a full in-depth beyerdynamic Aventho Wireless review, I was sketching the frequency response with REW and its tone generator. The connection defaulted to aptX. (My iMac 2017 does not have aptX HD.) Starting at 6.5 kHz, artifacts become very clearly audible, and above 14 kHz the treble sounds like complete ****. The compression is WAY too strong in the treble. You can hear how 6,000 Hz sounds fine, and as you move up past 6,500 Hz it starts crackling.
I switched to SBC (with the developer tool Bluetooth Explorer) and all tones sounded fine again. I jumped back and forth to confirm this.
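
If anyone wants to try reproducing this without REW, here is a minimal Python sketch that writes stepped test tones to a WAV file you can play over the Bluetooth link (assumes numpy; the frequencies are just illustrative steps around where I heard the artifacts):

```python
# Generate a WAV of stepped sine tones (1 s each) around the range where the
# aptX artifacts showed up; play it over Bluetooth and listen for crackling.
import wave

import numpy as np

RATE = 44_100
FREQS = [1000, 4000, 6000, 6500, 8000, 10_000, 14_000]  # Hz, illustrative

t = np.arange(RATE) / RATE                    # one second of sample times
chunks = [0.5 * np.sin(2 * np.pi * f * t) for f in FREQS]
pcm = (np.concatenate(chunks) * 32767).astype(np.int16)

with wave.open("tones.wav", "wb") as w:
    w.setnchannels(1)                         # mono
    w.setsampwidth(2)                         # 16-bit samples
    w.setframerate(RATE)
    w.writeframes(pcm.tobytes())
```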

Now I am even more interested in aptX HD.

I discovered this issue way back with the Sennheiser Momentum. I even filed a bug report with Apple (and Sennheiser) about it. Sennheiser told me they couldn't reproduce it; Apple did nothing. The same thing happened later with the B&W P7 and B&W PX. I believe the implementation of aptX in Apple's Bluetooth stack is simply faulty; the artifacts are very audible with test tones.
However, IRL I have not had a problem with quality, be it aptX or AAC (even SBC sounds good when the bitpool is high enough).

One problem I do have with my new Amiron Wireless is Bluetooth range. With aptX it is vastly inferior (unusable): I literally take two steps from my MacBook and it starts dropping out. AAC is better, but still not that good. My P7 with aptX reached as far as the Amiron does with AAC, and the P7 has much greater range with AAC (50% more would be my guesstimate).
 
Jun 10, 2018 at 9:01 AM Post #77 of 315
I recently got the B&W PX and use them via aptX HD on my Android Oreo phone.
They work great, the SQ is nice, but I wonder about one thing:

Does aptX HD automatically change the codec's sample rate/bit depth to match the source material, or is it fixed at 24-bit/48 kHz?
In the Android Oreo developer settings it always defaults to 24-bit/48 kHz, even when playing back 16-bit/44.1 kHz content.

I double checked via Logcat and the logs also show 24/48.
Anybody know more about this?
 
Jun 10, 2018 at 10:38 AM Post #78 of 315
I recently got the B&W PX and use them via aptX HD on my Android Oreo phone.
They work great, the SQ is nice, but I wonder about one thing:

Does aptX HD automatically change the codec's sample rate/bit depth to match the source material, or is it fixed at 24-bit/48 kHz?
In the Android Oreo developer settings it always defaults to 24-bit/48 kHz, even when playing back 16-bit/44.1 kHz content.

I double checked via Logcat and the logs also show 24/48.
Anybody know more about this?
The OS usually runs the mixer at 24/48 in shared mode and will resample the music. aptX HD, being a lossy codec, does not have a bit depth when transmitted; the decoding process reproduces the equivalent of a 24-bit dynamic range.
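
To illustrate what that resampling step amounts to, here's a minimal sketch using scipy's polyphase resampler (which resampler Android actually uses internally is an implementation detail I don't know):

```python
# Minimal sketch of the mixer's 44.1 kHz -> 48 kHz resampling step.
import numpy as np
from scipy.signal import resample_poly

pcm_44k1 = np.random.randn(44_100).astype(np.float32)  # 1 s of audio at 44.1 kHz

# 48000 / 44100 reduces to 160 / 147: upsample by 160, downsample by 147.
pcm_48k = resample_poly(pcm_44k1, up=160, down=147)
print(len(pcm_48k))  # ~48,000 samples
```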
 
Jun 10, 2018 at 11:54 AM Post #79 of 315
This is the only post in this thread I am competent to reply to with real-world data, so I am very excited about that.

I have a nice portion of my library recorded in iTunes Plus (which is Apple AAC 256 VBR). To my ear it is transparent, full stop. The chance that I am guessing on ABX testing is 100 percent (sorry about that Bigshot, I am also practicing listening for the gurgling aquariums at 96 kbps in Frau CBR on the Sammy Davis, Jr., CD now, I promise, although my codec has not yet seized up. I do look forward to that though.).

As to whether Apple AAC 256 VBR caps out at 256 kbps, the answer is no: in my library, the average bitrate is considerably higher than 256 kbps for most songs ripped in Apple AAC 256 VBR. The lowest is 233 kbps, for only two songs (a song from the Duke Ellington Blanton-Webster band, and a Pokémon song; it's one of my kid's, but I rip everything in the family and we share an Apple Music account). The Apple AAC 256 VBR rips max out at an average bitrate of 304 kbps (of course some passages in the file will be higher), interestingly for a few 1950s and 1960s jazz recordings (Oscar Peterson, Grant Green, Eddie Harris); and at 303 kbps I have a lot of modern jazz recordings. The encoder doesn't seem to push the bitrate that high even for the better classical recordings; the highest classical track is a Dvořák Slavonic dance at 290 kbps.

The range from 233 kbps to 304 kbps is a pretty smooth progression, literally including multiple instances of every single bitrate in between. The median is definitely 256 kbps on the button; I don't even need to count, there are a ton of tracks at that bitrate. The mean is higher; just looking at the distribution I'd ballpark it at 275 kbps.
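
If anyone wants to run the same tally on their own library without Foobar2000, here's a rough Python sketch using the mutagen library (the folder path is made up; mutagen reports the MP4 bitrate in bits per second):

```python
# Tally average bitrates of AAC rips under a music folder (path is hypothetical).
from pathlib import Path
from statistics import mean, median

from mutagen.mp4 import MP4

rates = [MP4(str(p)).info.bitrate / 1000 for p in Path("Music").rglob("*.m4a")]
print(f"tracks: {len(rates)}")
print(f"min {min(rates):.0f} / median {median(rates):.0f} / "
      f"mean {mean(rates):.0f} / max {max(rates):.0f} kbps")
```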

For me iTunes Plus is transparent and set-it-and-forget-it, but I do agree with Bigshot that some of the space it takes up at these higher bitrates is a waste of resources. It's just super easy.
Now you and this thread have me paranoid about my signal chain in terms of Bluetooth streaming. :anguished: How the heck am I going to ABX that for peace of mind? I always just thought to myself, that sounds pretty good! Ignorance can be bliss, I guess. Until some huckster rips you off. I moved over from OS X to Windows 10 recently because I like Windows 10 better than OS X (gasp!) (we have both in the house, between me, my wife and three kids). Apple is doing some increasingly weird stuff with its 27" iMac line, making it very hard to DIY-upgrade most things, and with 20/20 hindsight the iMac Bluetooth was a lot less stable.

I believe I have an Intel wireless/Bluetooth device in the Windows computer I am using now. Wait, let me check Speccy--yes, it's an Intel Dual Band Wireless-AC 3165 with Bluetooth transmitter and receiver. I can always get the latest driver for it and automatically install it from the Intel site, so that much of the chain I feel pretty good about. My setup also remembers and automatically connects to my previously connected Bluetooth devices much more reliably than my iMac did. . . sometimes better than I want it to. I decided I had had enough of the folks at Apple and gave my 27" iMac to my mom. I would use a non-Apple codec if Apple AAC weren't a no-brainer.

Windows 10 and third-party monitors have finally gotten to the point where I trust them for photography, and I was personally able to get more performance for less money than Apple would charge, with a machine more focused on my needs than Apple would allow. Now I am shooting 99th percentile on Passmark, tuned to my preferences and needs, for way less than I could get through Apple--by thousands of dollars. Photoshop and Lightroom scream. Gaming performance is beyond great. Sound stuff is a piece of cake. Our portable environments include lots of both iOS and Android devices, plus we have both Apple TV and Roku, so I have every single variable mentioned in this thread at play. Everything has to work smoothly and well because we usually have something in the area of 30 devices connected to the network at any one time (3 teenagers!) and I don't pay the cable man for my TV media anymore.

The info on the interweb is both plentiful and vague on the comparative bitrates of aptX, AAC and SBC, but as I understand it, AAC is capped at around 256 kbps and aptX at around 350 kbps?

I regularly switch between iOS and Android, so I am curious about the real-world limitations of wireless headphones on an iPhone vs. aptX-equipped droids. E.g., using Apple Music and its 256 kbps AAC should mean no loss, but I'm unsure how higher-bitrate stuff like Spotify's 320 kbps streams or Tidal's standard 320 kbps AAC would compare, and how much of a bottleneck the AAC codec would be on an iPhone?

With aptX HD now available on both phones and headphones, I am wondering how much of a real-world quality advantage Android phones now have with wireless headphones?
 
Last edited:
Jun 10, 2018 at 12:52 PM Post #80 of 315
I have a nice portion of my library recorded in iTunes Plus (which is Apple AAC 256 VBR).
I'm not sure what the iTunes Plus preset within iTunes uses, but both Apple Music and the iTunes Store use constrained VBR, which does not exceed 256 kbps but can dip below it. True VBR can exceed 256 kbps. If you encode using qAAC you can choose between CVBR and TVBR. Even 256 kbps CBR has been shown to be completely transparent.
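
For anyone ripping outside iTunes, here's a hedged sketch of driving qaac's two VBR modes from Python (assumes qaac is on your PATH and the file names are made up; --cvbr and --tvbr are qaac's own flags, and TVBR quality 109 lands roughly around 256 kbps on most material):

```python
# Encode the same file with qaac's constrained VBR vs. true VBR.
import subprocess

# Constrained VBR targeting 256 kbps (reportedly what the iTunes Store uses):
subprocess.run(["qaac", "--cvbr", "256", "input.wav", "-o", "cvbr.m4a"], check=True)

# True VBR at quality step 109 (~256 kbps average, free to exceed it):
subprocess.run(["qaac", "--tvbr", "109", "input.wav", "-o", "tvbr.m4a"], check=True)
```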
 
Jun 10, 2018 at 1:31 PM Post #81 of 315
The iTunes Plus preset is the default iTunes rip, at least in my Windows iTunes--256 kbps Apple AAC VBR, with most tracks encoding above that on average and a few below it, unless Foobar2000 is giving me totally bogus data, which I guess is possible. So I guess you are more likely to get a better rip from a CD using iTunes Plus than from the Apple Music store. But I think both would be transparent, and the file from the store would probably (but not always) be smaller, which would arguably be better in some ways.

Edit: I did get a Sun Ra recording off the iTunes Store once for better sound quality--I had digitized the original LP. Apple has some weird thing lately where they remaster some Sun Ra music and are the only ones distributing it. Since I like Sun Ra in my more open-minded moments, I noticed it. Anyway, there was harsh distortion at peaks on the Apple version that was not on the LP. I don't know where in the chain the distortion was introduced--the remastering, the encoding, or somewhere else.

Just joking around, I like to call iTunes Plus "Apple AAC 256 VBR, with A1 sauce," but now I guess there may really be something to it.

I'm not sure what the iTunes Plus preset within iTunes uses, but both Apple Music and the iTunes Store use constrained VBR, which does not exceed 256 kbps but can dip below it. True VBR can exceed 256 kbps. If you encode using qAAC you can choose between CVBR and TVBR. Even 256 kbps CBR has been shown to be completely transparent.
 
Last edited:
Jun 10, 2018 at 2:43 PM Post #82 of 315
The OS usually runs the mixer at 24/48 in shared mode and will resample the music. aptX HD, being a lossy codec, does not have a bit depth when transmitted; the decoding process reproduces the equivalent of a 24-bit dynamic range.
That makes sense, I didn't think about that.
Is it possible to switch to an exclusive mode?
I know that USB Audio Player Pro has a Direct mode that pushes the stream directly to the built-in DAC, but I guess this doesn't work with Bluetooth.

So what do I do?
Set USB Audio Player Pro to do the resampling to 48 kHz, or leave it in Direct mode and let Android do it?

I have only FLAC on my phone.
 
Last edited:
Jun 10, 2018 at 3:00 PM Post #83 of 315
Resampling recordings to a higher sample rate is not meaningfully lossy since the original signal was analog anyway.
 
Last edited:
Jun 10, 2018 at 11:55 PM Post #84 of 315
Here I go again quoting myself.

I just installed the Columns UI interface for Foobar2000. It gives you a real-time display of the bitrate as a song plays. For albums ripped with iTunes Plus, Columns UI tells me that the encoder can go well over 320 kbps at a given moment for a track encoded at 256 VBR. The median average bitrate for a track ripped at this setting is definitely 256 kbps on the button, but it can go much, much higher. For example, on Band Call from the CD release of Oscar Peterson's album Night Train, Foobar says that for a 256 iTunes Plus rip the average bitrate is 304 kbps, and the real-time indicator in Columns UI (I am watching right now) gets as high as 362 kbps, well over 320 kbps. For the opening of the song, the encoder spends most of its time over 320 kbps. And this is at the 256 kbps setting for iTunes Plus. That is one of the three most demanding tracks in my library according to the iTunes AAC encoder, so for most tracks you won't get that result; but I have seen the bitrate go over 320 kbps on other iTunes Plus 256 kbps rips as well. I just chose one of my most demanding tracks as a case study. So I do think iTunes Plus is definitely Apple AAC VBR with A1 sauce.
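
If you want the same kind of readout outside of Foobar2000, here's a rough Python sketch that buckets packet sizes per second of audio using ffprobe (assumes ffmpeg/ffprobe is installed; the file name is made up):

```python
# Per-second "instantaneous" bitrate curve for an AAC file via ffprobe.
import csv
import io
import subprocess
from collections import defaultdict

out = subprocess.run(
    ["ffprobe", "-v", "quiet", "-select_streams", "a:0",
     "-show_entries", "packet=pts_time,size", "-of", "csv=p=0", "band_call.m4a"],
    capture_output=True, text=True, check=True).stdout

bits_per_second = defaultdict(int)
for row in csv.reader(io.StringIO(out)):
    if row and row[0] != "N/A":
        bits_per_second[int(float(row[0]))] += int(row[1]) * 8

for sec in sorted(bits_per_second):
    print(f"{sec:4d}s  {bits_per_second[sec] / 1000:6.1f} kbps")
```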


The iTunes Plus preset is the default iTunes rip, at least in my Windows iTunes--256 kbps Apple AAC VBR, with most tracks encoding above that on average and a few below it, unless Foobar2000 is giving me totally bogus data, which I guess is possible. So I guess you are more likely to get a better rip from a CD using iTunes Plus than from the Apple Music store. But I think both would be transparent, and the file from the store would probably (but not always) be smaller, which would arguably be better in some ways.

Edit: I did get a Sun Ra recording off the iTunes Store once for better sound quality--I had digitized the original LP. Apple has some weird thing lately where they remaster some Sun Ra music and are the only ones distributing it. Since I like Sun Ra in my more open-minded moments, I noticed it. Anyway, there was harsh distortion at peaks on the Apple version that was not on the LP. I don't know where in the chain the distortion was introduced--the remastering, the encoding, or somewhere else.

Just joking around, I like to call iTunes Plus "Apple AAC 256 VBR, with A1 sauce," but now I guess there may really be something to it.
 
Last edited:
Jun 11, 2018 at 5:56 AM Post #85 of 315
VBR encodings can be "constrained" to a given maximum, but often when you set a value in VBR you are asking for an "average" bitrate instead of a maximum. It depends on what you used, maybe which year you used it, and whether you went and messed around with plenty of mysterious magical letters as encoding parameters in apps that allow it.


Edit: I'm talking about typical encoding codecs used when we convert our music, not BT streaming, which happens to use its own codecs.
 
Last edited:
Jun 11, 2018 at 6:23 AM Post #86 of 315
I think you are confusing two different things - the source bitrate (and codec) and the Bluetooth A2DP bitrate and codec.

All Bluetooth devices I've seen use AAC VBR at 256 kbit/s (forcing CBR causes a failure), and I've never seen them exceed 256 kbit/s (but I've seen them transfer less).
I wasn't able to find any definitive specs on this, so it's possible that more than 256 kbit/s can be supported, or that some devices support CBR.

In any case, whatever source you use will first be decoded internally to PCM, then transmitted using a codec. How this is handled might depend on the smartness of the source device, i.e. which bitrates show as supported on the Bluetooth audio device and can be used (I'd be very surprised if it was anything other than 44.1 kHz/32-bit float in the case of A2DP).
aptX might claim some marketing mumbo jumbo about 48 kHz/24-bit, but that's likely just the sample rate coming out of the Bluetooth chip in the receiver, or some "marketing equivalent quality" they paired with it; in reality it will always come from a 44.1 kHz/32-bit source and nothing else.
Theoretically it should be possible to simply forward the AAC frames from the source (like Apple Music), but I don't think it's possible in practice due to the 256 kbit/s limit, so you always end up decoding and re-encoding. And what would the system do about mixing sounds together? One might be AAC music, the other might be a ringtone stored as MP3 - do you suddenly start encoding both to AAC and forwarding them? No, you mix them into the PCM stream...
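
Conceptually, that mixing step looks something like this (a toy sketch, assuming both sources are already decoded to float PCM at the same sample rate; a real mixer also handles channel layouts, gain, and timing):

```python
# Toy OS-mixer step: sum two decoded PCM streams into one output buffer.
import numpy as np

def mix(music: np.ndarray, ringtone: np.ndarray) -> np.ndarray:
    n = max(len(music), len(ringtone))
    out = np.zeros(n, dtype=np.float32)
    out[:len(music)] += music
    out[:len(ringtone)] += ringtone
    return np.clip(out, -1.0, 1.0)    # keep the sum within full scale
```
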
Whether that is transparent or not - I don't think you'd know the difference anyway.
 
Jun 11, 2018 at 6:55 AM Post #87 of 315
I was answering @Steve999, who clearly wasn't talking about BT. Sorry if I confused anybody by going a bit off topic. ^_^
 
Jun 12, 2018 at 5:52 AM Post #88 of 315
aptX is really quite basic in its approach, and I'd struggle to believe that the results are better than AAC in practice. All aptX is doing is throwing away increasing numbers of sample bits in different bands and applying ADPCM encoding to each band - the fact that it sounds as good as it does is surprising to me.
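
To make that concrete, here's a toy sketch of the ADPCM idea (emphatically not real aptX: the real codec splits the signal into four subbands with a QMF filterbank and adapts its quantizer step sizes, while this fixed-step, single-band version only shows the shape of the technique):

```python
# Toy fixed-step ADPCM: quantize each sample's prediction error to a few bits.
import numpy as np

def adpcm_encode(x: np.ndarray, bits: int, step: float) -> np.ndarray:
    lo, hi = -(2 ** (bits - 1)), 2 ** (bits - 1) - 1
    codes = np.empty(len(x), dtype=np.int32)
    pred = 0.0
    for i, s in enumerate(x):
        err = s - pred                            # predict "same as last sample"
        codes[i] = int(np.clip(np.rint(err / step), lo, hi))
        pred += codes[i] * step                   # track the decoder's state
    return codes

def adpcm_decode(codes: np.ndarray, step: float) -> np.ndarray:
    return np.cumsum(codes.astype(np.float32) * step)

# A 1 kHz tone coded at 4 bits per sample (a 4:1 reduction from 16-bit PCM).
t = np.arange(4410) / 44_100
tone = np.sin(2 * np.pi * 1000 * t).astype(np.float32)
codes = adpcm_encode(tone, bits=4, step=0.05)
print("max error:", np.abs(tone - adpcm_decode(codes, 0.05)).max())
```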

Is it acceptable to expand the topic to discussion of LDAC too? Now that Sony have opened it up to other vendors, is anyone aware of any material that goes into technical detail about how it works? I believe it's part of the Android Open Source Project, but I don't know if the LDAC source code makes up part of that.
 
Last edited:
Jun 12, 2018 at 5:57 AM Post #89 of 315
aptX produces clearly audible artefacts when presented with a steadily rising/falling tone, or more than one tone. That by itself should be a red flag.
The only good thing about it is latency - usually much lower than AAC.

LDAC might be better than AAC or aptX (if only due to the massive bitrate jump), but "better than transparent" is still transparent. And higher bitrate means much higher battery drain.
 
Jun 12, 2018 at 9:46 AM Post #90 of 315
It's tempting to think that LDAC at ~990 kbps *could* be lossless for 16/44 material, but the CPU power involved in getting ALAC or FLAC encoding down to the same size means that doesn't really stack up, despite what the Sony marketing machine would have you believe. Regardless, that's a lot of bits to play with.
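
For scale, the back-of-the-envelope arithmetic (assuming stereo 16-bit/44.1 kHz):

```python
# LDAC's top rate vs. raw CD-quality PCM.
raw_bps = 16 * 44_100 * 2       # 1,411,200 bit/s
ldac_bps = 990_000
print(f"{ldac_bps / raw_bps:.0%} of the raw PCM rate")  # ~70%
```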
 
