bfreedma
The Hornet!
This is due to the different noise pollution levels these devices have. Fundamentally, noise reduction is what it all comes down to.
Please explain how “noise pollution” can impact a digital signal across galvanically isolated devices. Measured evidence of a change to the bits would be very helpful in assessing this claim - I assume you must have such evidence as a manufacturer proposing this “solution”.
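For anyone wondering what “measured evidence of change to the bits” would even look like: the standard test is a simple hash comparison of the data before and after it crosses the network. Here’s a minimal sketch (my own illustration, not anything a manufacturer has published) that pushes a ~1 MB payload through a localhost TCP socket and checks that the received bytes hash identically to what was sent:

```python
import hashlib
import socket
import threading

def sha256(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

# ~1 MB of arbitrary test "audio" data
payload = bytes(range(256)) * 4096
received = []

def serve(srv: socket.socket) -> None:
    # Accept one connection and read until we have the full payload
    conn, _ = srv.accept()
    with conn:
        buf = b""
        while len(buf) < len(payload):
            chunk = conn.recv(65536)
            if not chunk:
                break
            buf += chunk
        received.append(buf)

srv = socket.socket()
srv.bind(("127.0.0.1", 0))
srv.listen(1)
t = threading.Thread(target=serve, args=(srv,))
t.start()

cli = socket.socket()
cli.connect(("127.0.0.1", srv.getsockname()[1]))
cli.sendall(payload)
cli.close()
t.join()
srv.close()

# Bit-perfect delivery: the hashes match
print(sha256(received[0]) == sha256(payload))
```

Run the same comparison across any “audiophile” switch and an ordinary one: if both transfers are bit perfect (and they will be), whatever difference people claim to hear isn’t in the data.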
As it’s technically impossible for an in-spec router/switch/cable to alter packetized data in a manner that provides consistent noise shaping or noise reduction (unless the device is, for some reason, designed to intentionally alter the data), is the claim being made that the file itself remains bit perfect, but that some kind of mechanical noise is being transmitted outside of the actual data stream? And that this “noise”, despite passing through several points of isolation, is still somehow reaching and impacting the network endpoint?
Can anyone provide a technical explanation supporting these claims which doesn’t directly conflict with the IEEE 802 standards - standards which clearly work, based on the trillions of bits of data that successfully make it through our “noise polluting network infrastructure” hourly?
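And to be clear about why those standards “clearly work”: every 802.3 Ethernet frame carries a CRC-32 frame check sequence, so a frame whose bits were corrupted in transit is detected and dropped rather than delivered. A quick sketch of the principle, using Python’s zlib.crc32 (which, as I understand it, uses the same CRC-32 polynomial as the Ethernet FCS):

```python
import zlib

# Pretend frame payload
frame = b"example audio packet payload"
fcs = zlib.crc32(frame)  # checksum computed by the sender

# Flip a single bit somewhere in transit
corrupted = bytearray(frame)
corrupted[3] ^= 0x01

# The receiver's recomputed CRC no longer matches,
# so the corrupted frame is discarded, not played
print(zlib.crc32(bytes(corrupted)) != fcs)
```

So even in the rare case where electrical noise does flip a bit on the wire, the result is a retransmission at the TCP layer, not a “harsher-sounding” file arriving at the endpoint.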