Before you post: this is an open discussion, and any supported argument is welcome. But if all you're here to do is say "No." or post something unintelligible, no thanks.
Question #1 - Where does packet loss happen most: across a network, such as through AirPlay, or over a physical interconnect, such as USB or optical?
Question #2 - If packet loss is nothing more than a packet of information failing to reach its destination (i.e., AirPlay to receiver), does that in turn lead to jitter?
If jitter is viewed as nothing but the timing disruption that packet loss causes across a network, then it seems clear that packet loss leads to jitter, which in turn harms sound quality.
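To pin down what "jitter" even means numerically: RTP-style streaming measures interarrival jitter with a running formula from RFC 3550. Here's a rough Python sketch of that formula; the timestamps are invented for illustration, but it shows how one late packet (say, delayed after a loss and retransmit) bumps the jitter number.

```python
# Sketch: interarrival jitter per RFC 3550, section 6.4.1.
# J(i) = J(i-1) + (|D(i-1,i)| - J(i-1)) / 16, where D compares the
# spacing of arrival times against the spacing of send times.
# All timestamps below are invented for illustration.

def interarrival_jitter(send_times, arrival_times):
    """Return the running jitter estimate after the last packet."""
    jitter = 0.0
    for i in range(1, len(send_times)):
        # D: how much the transit time changed between packets i-1 and i.
        d = (arrival_times[i] - arrival_times[i - 1]) - (
            send_times[i] - send_times[i - 1]
        )
        jitter += (abs(d) - jitter) / 16.0
    return jitter

# Packets sent every 10 ms; in the second run, one arrives 7 ms late.
send = [0, 10, 20, 30, 40]
steady = [5, 15, 25, 35, 45]   # perfectly even arrivals
late = [5, 15, 32, 35, 45]     # packet 3 delayed by 7 ms

print(interarrival_jitter(send, steady))  # 0.0 -- no timing variation
print(interarrival_jitter(send, late))    # > 0 -- the late packet shows up as jitter
```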
Here comes the big one:
If there were "packet loss" across a cable, be it USB or optical (any digital interconnect), wouldn't that in turn lead to jitter and therefore harm sound quality?
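It's worth a toy model of what "loss" over a wire would even look like: on a digital link, an error shows up as flipped bits, and links that frame their data with a checksum can detect the damage (USB packets carry CRCs, for instance; the framing in this sketch is made up for illustration).

```python
import random
import zlib

# Toy model: a "cable error" as a single flipped bit in a data frame,
# caught by a CRC check. The frame contents and flip probability are
# invented for illustration only.

def send_over_noisy_cable(frame: bytes, flip_probability: float) -> bytes:
    """Maybe flip one random bit, simulating electrical noise on the wire."""
    if random.random() < flip_probability:
        data = bytearray(frame)
        bit = random.randrange(len(data) * 8)
        data[bit // 8] ^= 1 << (bit % 8)
        return bytes(data)
    return frame

frame = bytes(range(32))          # stand-in for a chunk of audio samples
crc = zlib.crc32(frame)

received = send_over_noisy_cable(frame, flip_probability=0.5)
if zlib.crc32(received) != crc:
    print("corrupted frame detected -- would be dropped or retried")
else:
    print("frame intact -- bits arrived unchanged")
```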
And lastly, going back to question #1: if we assume that timing errors (jitter) and, for the purposes of this post, packet loss harm sound quality, where do we truly lose less information, across a network such as AirPlay, or through a cable itself? My assumption would be a cable; nothing beats a physical interconnect. Then again, what is the average packet loss when streaming a standard FLAC file across a network, compared to a physical digital interconnect?
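For the network half of that comparison, loss is at least measurable. Here's a minimal sketch that numbers UDP datagrams and counts the gaps on the receiving end; the host, port, and packet count are arbitrary, and a real test of an AirPlay stream would need the actual protocol or a tool like iperf. Run against loopback you'll see roughly 0%; pointed at another machine across Wi-Fi the number gets more interesting.

```python
import socket
import threading
import time

# Minimal loss counter: send numbered UDP datagrams and count which
# sequence numbers never arrive. HOST, PORT, and COUNT are arbitrary
# values chosen for illustration.

HOST, PORT, COUNT = "127.0.0.1", 50007, 1000

def receiver(seen: set) -> None:
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind((HOST, PORT))
    sock.settimeout(1.0)  # give up after 1 s of silence
    try:
        while True:
            data, _ = sock.recvfrom(64)
            seen.add(int(data))
    except socket.timeout:
        pass
    finally:
        sock.close()

seen: set = set()
thread = threading.Thread(target=receiver, args=(seen,))
thread.start()
time.sleep(0.1)  # let the receiver bind first

sender = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
for seq in range(COUNT):
    sender.sendto(str(seq).encode(), (HOST, PORT))
    time.sleep(0.001)  # pace the sends so we don't overflow buffers
sender.close()

thread.join()
lost = COUNT - len(seen)
print(f"lost {lost} of {COUNT} packets ({100 * lost / COUNT:.2f}%)")
```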