If true, Apple is proving that digital cables do make a difference!
This is a bit crazy if true, though. It will create customer confusion unless there's some indicator that hi-res audio is actually getting through the cable. There's usually no way to tell what sampling rate the playback device is receiving, although I suppose that could be shown in software.
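For the curious, here's roughly how an app could surface that on iOS today. This is just a minimal sketch using AVAudioSession; whether the reported rate would actually change over some new hi-res Lightning cable is pure speculation on my part, since nothing has been announced:

import AVFoundation

// Minimal sketch: read the sample rate the audio hardware is currently
// running at, and which output the audio is routed to. Whether iOS would
// ever report a higher rate over a hi-res Lightning cable is an
// assumption on my part, not anything Apple has confirmed.
let session = AVAudioSession.sharedInstance()
do {
    try session.setActive(true)
    // sampleRate is the current hardware sample rate in Hz (e.g. 44100.0)
    print("Output sample rate: \(session.sampleRate) Hz")
    // currentRoute lists the output port(s) the audio is going to
    for output in session.currentRoute.outputs {
        print("Output: \(output.portType) - \(output.portName)")
    }
} catch {
    print("Couldn't activate audio session: \(error)")
}

Of course this only reports what iOS negotiated with the output, so it's only useful if Apple actually exposes the higher rate to apps.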
If the cable is different internally but the connector stays backward-compatible, they should at least make the cable look physically different (as with other digital cabling standards that have been upgraded: standard USB, HDMI, Ethernet). If they change the connector itself, that would mean buying a new device too.
Finally, do I have to buy new cables to get this to work? I am holding off until WWDC.
On the short-Lightning-cable front, I whittled down the plastic around my Amazon Lightning cable so it now fits through my iPhone 5 case, which means I can pair it with the M8. Let's hope hi-res works through it...
… or else I'm going with Expat's solution and getting a short USB dongle and a Lightning CCK. I know that works already, since I have the CCK. I tested the sample rates by running the optical digital out from the M8 into another DAC that displays the incoming sampling rate. With the iDevice USB A connection I got low-res at the digital out, but with the CCK and the USB B connection I got hi-res at the digital out. I used the FiiO E17's optical input to show this. Interestingly, the E17 reads 16-bit/48kHz for low-res (though I don't think it can display 44.1kHz), and 24-bit/192kHz for hi-res. I'll have to try a more reliable input device, like a home theater receiver, to see what it reports for bit depths and sample rates.
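One more sanity check worth doing before blaming the cable or the chain: confirm the source file really is hi-res. A quick sketch (the file path is just a placeholder; bit depth only reads correctly for PCM files like WAV/AIFF, as compressed formats may report 0):

import AVFoundation

// Quick sanity check: print a file's sample rate and bit depth so you
// know the source really is 24/192 before testing the playback chain.
// The path below is a placeholder -- point it at your own file.
let url = URL(fileURLWithPath: "/path/to/track.wav")
do {
    let file = try AVAudioFile(forReading: url)
    let format = file.fileFormat
    let asbd = format.streamDescription.pointee
    print("Sample rate: \(format.sampleRate) Hz")
    // mBitsPerChannel is only meaningful for linear PCM; lossless
    // compressed formats (ALAC, etc.) may show 0 here.
    print("Bit depth:   \(asbd.mBitsPerChannel)-bit")
    print("Channels:    \(format.channelCount)")
} catch {
    print("Couldn't open file: \(error)")
}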