I don't entirely (yet?) need "evidence/proof" as miketlse mentions, but I can't even imagine a mechanism that could cause any change to the bitstream. As I understand it (and maybe I'm wrong!), UPnP and DLNA exist to _route_ stuff from point A to point B: the phone app is just a control point, and the 2go is the renderer. Using mConnect (that's what I use), I choose a file and tell the 2go to 'play' it. The 2go takes that command, reads the bits off the SD card, and ships them out its USB port. As far as I can tell, my phone is in no way involved in the bitstream of the content at all. If I could somehow send the 2go the right commands via a stone tablet and an old mule I found in a field of corn, the process of moving the bits from SD card into the 2go and out its USB port would be identical to what mConnect does.
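For the curious, here's roughly what "telling the 2go to play" looks like on the wire. This is a minimal sketch, not mConnect's actual code - the control URL below is made up (a real control point discovers it via SSDP) - but the shape of a standard UPnP AVTransport call is publicly specified:

```python
# Hypothetical sketch of a UPnP control point sending "Play" to a renderer.
# CONTROL_URL is invented for illustration; real ones are discovered via SSDP.
# Note: only this tiny SOAP command crosses the network -- no audio bits do.
import urllib.request

CONTROL_URL = "http://192.168.1.50:8080/AVTransport/control"  # hypothetical

def play(instance_id: int = 0, speed: str = "1") -> None:
    """Ask the renderer (the 2go) to start playback of its current track."""
    body = f"""<?xml version="1.0" encoding="utf-8"?>
<s:Envelope xmlns:s="http://schemas.xmlsoap.org/soap/envelope/"
            s:encodingStyle="http://schemas.xmlsoap.org/soap/encoding/">
  <s:Body>
    <u:Play xmlns:u="urn:schemas-upnp-org:service:AVTransport:1">
      <InstanceID>{instance_id}</InstanceID>
      <Speed>{speed}</Speed>
    </u:Play>
  </s:Body>
</s:Envelope>"""
    req = urllib.request.Request(
        CONTROL_URL,
        data=body.encode("utf-8"),
        headers={
            "Content-Type": 'text/xml; charset="utf-8"',
            "SOAPAction": '"urn:schemas-upnp-org:service:AVTransport:1#Play"',
        },
    )
    urllib.request.urlopen(req)  # the renderer does the rest on its own

play()
```

Whatever app (or mule) delivers that command, the renderer's job afterward is identical.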
Now if the bits were somehow moving from the SD card to your phone and then back to the 2go and out its USB port - well, things could DEFINITELY be impacted, but something would also be broken, no? And if you're connected via Bluetooth, that link doesn't even have the bandwidth for certain hi-res formats - yet I play DSD256 and 24/352 all the time this way, so the audio can't be passing through the phone. I've also done tests where I start playback and then power off my phone, and the music keeps going (as one would expect).
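The bandwidth point is easy to sanity-check with arithmetic (the format rates are standard; the Bluetooth ceiling is LDAC's advertised maximum):

```python
# Back-of-the-envelope: could Bluetooth even carry these bitstreams?
# DSD256 is 1-bit at 256 x 44.1 kHz; 24/352.8 is 24-bit PCM; both stereo.
dsd256_bps = 1 * 44_100 * 256 * 2    # = 22,579,200 bit/s (~22.6 Mbit/s)
pcm_3528_bps = 24 * 352_800 * 2      # = 16,934,400 bit/s (~16.9 Mbit/s)
print(f"DSD256:   {dsd256_bps / 1e6:.1f} Mbit/s")
print(f"24/352.8: {pcm_3528_bps / 1e6:.1f} Mbit/s")
# LDAC, the highest-bandwidth common Bluetooth audio codec, tops out
# around 0.99 Mbit/s -- more than an order of magnitude short of either.
```

So if those formats play without a hiccup while the phone is only a control connection, the audio data itself must be coming off the SD card locally.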
So I honestly don't have any idea how to even come up with a THEORY here (and I'm open to considering one!). I used to think "bits are bits," but I've learned a ton about RF noise in systems over the last couple of years. In this case, though, the control link is wireless - there's no physical connection to carry RF noise into the 2go, so there shouldn't be any RF problems (?).
The only thing I can think of is that the protocol implementation in app X is somehow 'cleaner' than in app Y - say, app Y constantly hammers the 2go with status commands, and all that radio traffic creates some sort of noise????? Even that seems unlikely: UPnP and DLNA are open standards (publicly specified, if not open-source per se), so most apps are probably built on the same handful of libraries.
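For concreteness, here's what a 'chatty' control point could look like versus a quiet one. This is purely a hypothetical illustration of the theory - the app behavior and control URL are invented - but polling versus event subscription is a real difference between UPnP control-point implementations:

```python
# Hypothetical "app Y": polls the renderer for track position every second
# instead of subscribing once to UPnP events (GENA) as a quieter app might.
# CONTROL_URL is invented for illustration, as before.
import time
import urllib.request

CONTROL_URL = "http://192.168.1.50:8080/AVTransport/control"  # hypothetical

def get_position_info() -> None:
    """One GetPositionInfo SOAP call -- one more radio burst at the 2go."""
    body = """<?xml version="1.0" encoding="utf-8"?>
<s:Envelope xmlns:s="http://schemas.xmlsoap.org/soap/envelope/"
            s:encodingStyle="http://schemas.xmlsoap.org/soap/encoding/">
  <s:Body>
    <u:GetPositionInfo xmlns:u="urn:schemas-upnp-org:service:AVTransport:1">
      <InstanceID>0</InstanceID>
    </u:GetPositionInfo>
  </s:Body>
</s:Envelope>"""
    req = urllib.request.Request(
        CONTROL_URL,
        data=body.encode("utf-8"),
        headers={
            "Content-Type": 'text/xml; charset="utf-8"',
            "SOAPAction": '"urn:schemas-upnp-org:service:AVTransport:1#GetPositionInfo"',
        },
    )
    urllib.request.urlopen(req)

# 3,600 requests an hour just to move a progress bar:
while True:
    get_position_info()
    time.sleep(1)
```

Whether that extra traffic could audibly affect anything is exactly the open question.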
Anybody have a THEORY?????
Doody
EDIT: I've tried a dozen apps for control and never noticed a difference - though I admit I went into that evaluation with the above mindset, so it never even occurred to me to listen for audio-quality changes.