Yes, the inherent communication protocol - the total absence of an analog-like waveform - or should I say 'signal integrity' issues that require AGC schemes. You may be right - but then why do STP Ethernet cables exist at all? Since you're in the tech field, maybe you can help me understand a little about this.
One major attraction of AOIP was doing away with all the USB gizmos and gadgets, so I'm not super keen on adding more to the AOIP chain - but if it does improve the SQ, then by all means. Coolness can mean snake oil to some folks:
http://www.ebay.com/itm/111931217341?_trksid=p2060353.m1438.l2649&ssPageName=STRK%3AMEBIDX%3AIT
I'll know Friday what effect optical Ethernet has on the SQ in my systems!
http://www.techrepublic.com/blog/data-center/some-interesting-twists-about-ethernet-cabling/
STP is for extreme cases. If you have that much potential for interference in a home environment, you're going to have much bigger issues with interference and noise in other parts of the system.
From a data standpoint:
Every packet is checksummed and re-transmitted if there are failures. If a packet starts off with X data, and the checksum passes at the other end, then the *exact* data got there, period. If there are packet re-transmits and packet loss, that will show up - you can monitor the interfaces for that, though in practice if you're just running a 2-channel audio stream you'll probably never even notice since you're using such a small portion of the available bandwidth.
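To make the "exact data or nothing" point concrete, here's a rough sketch of how a frame checksum catches corruption. Ethernet's FCS uses the CRC-32 polynomial, which is the same one Python's `zlib.crc32` implements; the payload bytes here are just a made-up stand-in:

```python
import zlib

# Sender computes a CRC-32 over the payload and appends it to the frame
payload = b"two channels of PCM audio"
fcs = zlib.crc32(payload)

# Receiver recomputes the CRC; a match means the bytes arrived
# bit-for-bit identical to what was sent
assert zlib.crc32(payload) == fcs

# Flip a single bit "in transit" and the check fails,
# so the frame is discarded rather than passed up corrupted
corrupted = bytes([payload[0] ^ 0x01]) + payload[1:]
print(zlib.crc32(corrupted) == fcs)  # False: corruption detected
```

A failed check never delivers altered audio samples to the application - the frame is dropped (and, for TCP streams, retransmitted), which is why cable-induced "noise" can't subtly change the data.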
We run 10Gb to full saturation over copper and fiber, and some of our systems will happily saturate 40Gb LACP trunks. In all cases those cables are run through racks and under floors with literally thousands upon thousands of other cables, both data and power. If we see even a small amount of packet loss, it is immediately noticeable. In almost every case the issue is at one of the endpoints, not the cable itself: GBICs go bad, switch ports or line cards have issues, NICs fail. Rarely, if ever, is it due to the cabling, and when it is, it's most likely a connector, physical damage along the run, or a cable that fails spec.
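If you want to check this on your own streamer or server, the error and drop counters are right there in the kernel. A minimal sketch for Linux (on a live box you'd feed it `open("/proc/net/dev").read()`; the sample text below is invented for illustration, and the field positions follow the documented `/proc/net/dev` layout):

```python
def read_counters(text):
    """Parse /proc/net/dev text into per-interface error/drop counts."""
    stats = {}
    for line in text.splitlines()[2:]:        # first two lines are headers
        name, data = line.split(":", 1)
        fields = data.split()
        # RX fields 0-7, TX fields 8-15; errs/drop are what we care about
        stats[name.strip()] = {
            "rx_errs": int(fields[2]),
            "rx_drop": int(fields[3]),
            "tx_errs": int(fields[10]),
        }
    return stats

# Made-up sample in /proc/net/dev format (real use: open("/proc/net/dev").read())
sample = """Inter-|   Receive                  |  Transmit
 face |bytes packets errs drop fifo frame compressed multicast|bytes packets errs drop fifo colls carrier compressed
    lo: 1000 10 0 0 0 0 0 0 1000 10 0 0 0 0 0 0
  eth0: 5000 50 2 1 0 0 0 0 4000 40 3 0 0 0 0 0"""

print(read_counters(sample)["eth0"])  # {'rx_errs': 2, 'rx_drop': 1, 'tx_errs': 3}
```

If those counters sit at zero while your audio is streaming, the cable is delivering every frame intact; `ethtool -S <iface>` gives the same story in more detail.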
I'll say it again, in a home network, running tiny amounts of data across cable that has specs which far exceed anything you'll put through it, interference is not going to be an issue.
From an analog "noise" standpoint:
If by some chance noise is picked up in the cable without impacting the integrity of the packets, it is electrically isolated from the endpoint by design - Ethernet ports are transformer-coupled. Even if you're using the fibre converters, the remote end converts back to copper and is then isolated at the NIC port.
Most likely the change in sound is coming from the 2 additional steps of conversion to/from copper. I haven't heard that myself, so I won't make any claim to whether that is better, worse or just different, but most likely it's something in the conversion.
-Mike