
iOS vs Android for audio 2018

Discussion in 'Sound Science' started by neil74, Dec 6, 2018.
  1. neil74
So what's the verdict in 2018? On paper, Android has the edge for both wired and wireless.

    For wired, iOS is dependent on the DAC in the dongle, whereas on Android the LG phones have the Quad DAC and even the Samsungs have decent output.

    For wireless it is less clear. On paper Android has a significant edge, in that all current droids support LDAC and aptX HD, whereas iOS is capped at roughly 250 kbps AAC. Real-world experience seems more mixed though, with some saying that Bluetooth is still flawed on Android and that there is more at play than just the codecs.
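    For reference, here is a quick sketch lining up the advertised maximum bitrates of the common Bluetooth codecs against the raw CD PCM rate. The figures are the published maxima; real links often negotiate lower, so treat this as nominal, not what your phone actually delivers:

```python
# Nominal maximum bitrates (kbps) of common Bluetooth audio codecs.
# These are the advertised maxima; real links often negotiate lower rates.
CODECS = {
    "SBC": 328,       # typical high-quality SBC bitpool
    "AAC": 250,       # Apple's approximate cap over Bluetooth
    "aptX": 352,
    "aptX HD": 576,
    "LDAC": 990,      # LDAC also has 660 and 330 kbps modes
}

CD_BITRATE = 1411  # 16-bit / 44.1 kHz stereo PCM, in kbps

for name, kbps in CODECS.items():
    print(f"{name:8s} {kbps:4d} kbps  ({kbps / CD_BITRATE:.0%} of CD PCM)")
```

    Note that even LDAC's best mode is below the CD PCM rate, so every Bluetooth codec here is lossy; the question in the thread is whether that loss is audible.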

    What does everyone on here think?
     
  2. bigshot
    Apple has always used DACs that are audibly transparent. I owned the first AV Mac, the 8500, and it was capable of capturing and playing back sound perfectly for human ears. I think the answer is that for sound quality, it probably doesn't matter. Choose based on convenience. And understand that wireless technology like Bluetooth is a compromise. It's fine for casual use, but it isn't a replacement for wired for the most serious listening.
     
    Last edited: Dec 6, 2018
  3. jagwap
    Apple used to use Wolfson DACs in the old iPods before the Classic, which needed capacitors on the outputs. These could not be made large enough to be transparent: they added frequency-dependent distortion and phase errors. So they haven't always been transparent.

    In 2018 this is not a problem. The $10 iPhone dongle is pretty good if you buy it outside of Europe (where its output is limited because of EU law).

    However, Android gives you a huge variety, with pluses and minuses. Most Android units feed the audio through the Qualcomm ASRC on the SoC and convert it to 48 kHz, and there is an opinion (which I share) that this is not transparent. This can be fixed on some phones, but takes some knowledge (all here on Head-Fi). Some Android platforms have fancy DACs and headphone amps, and their chief advantage is often bypassing this ASRC, and sometimes better headphone drive in terms of voltage and/or output impedance.
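    To illustrate why a 44.1 kHz → 48 kHz conversion can't be bit-exact, here is a toy resampler. It uses plain linear interpolation, which is far cruder than a real polyphase ASRC (and is certainly not the Qualcomm algorithm), but the underlying point holds either way: the non-integer ratio forces interpolation, so the round trip never returns the original samples:

```python
import math

def resample_linear(x, sr_in, sr_out):
    """Crude linear-interpolation resampler, for illustration only.
    Real ASRCs use polyphase filters with far less error, but no
    resampler is bit-exact for a non-integer ratio like 44.1k -> 48k."""
    n_out = int(len(x) * sr_out / sr_in)
    out = []
    for i in range(n_out):
        t = i * sr_in / sr_out          # fractional position in input
        k = int(t)
        frac = t - k
        a = x[k]
        b = x[min(k + 1, len(x) - 1)]
        out.append(a * (1 - frac) + b * frac)
    return out

# A 1 kHz sine at 44.1 kHz, resampled to 48 kHz and back again
sr = 44100
sine = [math.sin(2 * math.pi * 1000 * n / sr) for n in range(4410)]
round_trip = resample_linear(resample_linear(sine, sr, 48000), 48000, sr)

err = max(abs(a - b) for a, b in zip(sine, round_trip))
print(f"max round-trip error: {err:.6f}")  # nonzero: not bit-transparent
```

    Whether that residual error is audible is a separate argument, which is exactly the disagreement in this thread.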

    e.g. the LG Vxx and Gx series have impedance sensing, so higher-impedance headphones (>50 Ω) get up to 2 Vrms while lower-impedance IEMs get much less gain, and so less noise. If the audio is handled in Neutron or UAPP the ASRC is bypassed (or convert the files to 24-bit with no added dither). So these are virtually the only phones that drive planar headphones to decent levels with decent quality.
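    The impedance-sensing behaviour described above can be sketched as a simple rule. The threshold and voltages here are illustrative guesses based on the description, not LG's actual firmware logic:

```python
def lg_style_output_voltage(impedance_ohms):
    """Hypothetical sketch of impedance-sensing gain switching:
    high-impedance loads (>50 ohm) get the ~2 Vrms high-gain mode,
    low-impedance IEMs get a lower-gain mode (and a lower noise floor).
    The 0.5 Vrms low-gain figure is an assumption, not a spec."""
    return 2.0 if impedance_ohms > 50 else 0.5

def power_mw(v_rms, impedance_ohms):
    """Power delivered into the load: P = V^2 / R, in milliwatts."""
    return v_rms ** 2 / impedance_ohms * 1000

for z in (16, 32, 300):
    v = lg_style_output_voltage(z)
    print(f"{z:3d} ohm -> {v} Vrms, {power_mw(v, z):.1f} mW")
```

    The payoff of the high-gain mode shows in the numbers: 2 Vrms into a 300 Ω load still delivers around 13 mW, where a typical dongle's ~1 Vrms would manage only a quarter of that.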

    So for certain headphones android can beat apple when it matters.
     
  4. ksorota
    Is there a consensus, or does it matter, on which OS is better for feeding Tidal to a HiFi setup?

    Tidal stream on (iOS or ChromeOS or Android) Tablet to DAC to Amp to Transducers
     
  5. neil74
    I think I'm especially interested in how Bluetooth stacks up; on paper a droid should be better here due to the codecs on offer. I have seen more than one person claim, though, that Bluetooth on Android is still flawed?
     
  6. Brooko Contributor
    Can't believe I read this from you of all people.

    Agree with the first part. What do you base the part I highlighted red upon? As you have so often said, AAC 256 is transparent (as far as what we can actually hear). iOS devices are more than capable of transmitting AAC natively - so no resampling / transcoding. If your source material is good (good mastering), then transmission should be flawless on modern good-quality devices. Newer devices like FiiO's Q5 even use Bluetooth transmission plus a high-quality DAC rather than the SoC. End result - Bluetooth quality you won't be able to tell from wired.

    Suggest you actually go out and try some modern Bluetooth devices, before making a blanket statement which in the modern world is false (at least in terms of audibility).
     
  7. Brooko Contributor
    Suggest blind testing the codecs rather than making assumptions based on bigger numbers. That's where the whole hi-res myth took wings. The different codecs offer different levels of latency, and most of the extra bandwidth looks great on paper - but means nothing when actual audibility is involved.
     
  8. bigshot
    I did a listening test comparing bluetooth to lossless using my most difficult to encode tracks a while back and I ended up with a small bit of artifacting. I was using the bluetooth in my iPhone to connect to my Mac I think. Or was it iPhone to AVR? Can't remember. It was back when I had my iPhone 3. I'm pretty sure it wasn't just transmitting an AAC file directly for playback. I do that with wifi with no problem. I understand that there are different codecs for wireless connections. Other ones might be better. I use bluetooth with my beater headphones at work and it sounds fine, but I'm not listening critically there.
     
    Last edited: Dec 12, 2018
  9. neil74
    People put a lot of faith in AAC as a codec for the files themselves, and I would agree: to my untrained ears Apple Music sounds better (on iOS or Android) than Spotify. Tidal (non-hifi) also sounds excellent and is probably, imo, the best of the £9.99 services for audio (shame the service/app aren't really that good).


    For Bluetooth, though, AAC is almost looked down on when compared to LDAC or aptX HD. To my ears, Tidal premium on my Note 9 over LDAC does sound better than anything I get out of my iPhone. The caveat is that I am no expert at all and it could well be placebo.
     
  10. Brooko Contributor
    How long ago was the test, Bigshot? Was it actually artifacting - or perhaps the Bluetooth glitching (a micro-drop of signal)? There seems to be no reason (with native AAC transmission) why there would be artifacting.

    Pity we weren't closer - I'd love to send you the Q5 - the quality of that device (over Bluetooth) is incredible.
     
  11. jagwap
    I have the Audeze Mobius, and you can switch the Bluetooth codec between SBC, AAC and LDAC (no aptX on the Mobius, as they use Microchip rather than CSR/Qualcomm). The LDAC sounds significantly better than the others. However this is far from definitive even as a sighted test, as there is no visibility of the data rate except in LDAC. Bluetooth has a habit of reducing the data rate on the fly without telling you when you add distance, go into a lift, go outside, stand near moving traffic and cross your legs. Also, LDAC could be optimised in a multitude of other ways, and without knowing the full system details it cannot be confirmed that the codec alone is responsible. But it does seem to get priority in either the headphone or Android, so it's worth checking out if you can. aptX HD I haven't tried. However, despite the marketing, please note that over Bluetooth it is not lossless.
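    Both the silent rate-dropping and the "not lossless" point can be made concrete. LDAC has three fixed modes (330/660/990 kbps), and the stack steps down as the link degrades; even the top mode is well below the raw PCM rate of hi-res material. A hypothetical sketch - the link-quality metric and thresholds are invented for illustration, not Sony's actual adaptive algorithm:

```python
LDAC_MODES_KBPS = (990, 660, 330)  # LDAC's three fixed bitrate modes

def pick_ldac_rate(link_quality):
    """Step down through LDAC's modes as the link degrades.
    link_quality is a made-up 0.0-1.0 metric; real Bluetooth stacks
    use their own RF statistics, but the stepping behaviour is similar,
    and it happens without telling the listener."""
    if link_quality > 0.8:
        return 990
    if link_quality > 0.5:
        return 660
    return 330

# Even the 990 kbps mode is lossy for hi-res material:
hires_pcm_kbps = 24 * 96000 * 2 / 1000  # 24-bit / 96 kHz stereo PCM
print(pick_ldac_rate(0.9), pick_ldac_rate(0.3))
print(f"24/96 PCM needs {hires_pcm_kbps:.0f} kbps vs 990 kbps LDAC max")
```

    So a "hi-res over LDAC" stream is always compressed by at least a factor of four or so, and possibly much more if the link has quietly stepped down to 330 kbps.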
     
  12. bigshot
    It was quite a long while ago, when I was setting up my media server. I was testing Airport wifi streaming at the time. I used my ALAC reference files to send sound from my phone to my Mac (or perhaps my AVR?) wirelessly using both bluetooth and over wifi to an Airport through the analogue output. I was under the impression that bluetooth had its own codec and I was testing it to see if it was transparent. I doubt if I would have bothered to test it if I thought it was just handing AAC across. Has bluetooth always used AAC?

    It could have been a testing error on my part. I wasn't really focused on Bluetooth so much, because it made sense to just install a dock on my systems so the phone could charge while it played. I was mostly checking to see if streaming to Airports over wifi was as good as keeping mirrors of the library on each end of the house. Airport came out transparent, so I went with that. I haven't played with Bluetooth since... just my beater cans at the studio. So it probably is different now.
     
    Last edited: Dec 12, 2018
  13. castleofargh Contributor
    My biggest issue with Bluetooth headphones is usually how hissy the amp section turns out to be. Second on my list is how easy it is to lose the signal (on a few devices, putting them in my back pocket and sitting down was the end of the streaming), and even if it keeps going, you never know what codec is actually being used, as historically BT tends to revert to the lowest settings when it has connection issues.
    I have to confess that I never had something so clean that I got concerned with which highest codec setting was best. To me BT is one of those things where it either works fine or it pisses me off. No middle ground.
     
