The Ethernet cables, Switches and Network related sound thread. Share your listening experience only.
Sep 12, 2021 at 2:47 PM Post #541 of 2,212
Depending on the quality of the DAC and other components and its handling of incoming jitter, this has nothing to do with lost packets but with timing accuracy.

The brain's neuroreceptors work at a threshold of around 4 microseconds, so any error greater than that will move what should be a dead-centre stereo sound roughly 2 degrees; that's the level of sensitivity. Once that same timing error appears in transients, it affects instrument timbre as well as bass resolution and depth perception.

I'm just looking for reasons why, in some cases, both sides can be correct. From an electrical engineer's perspective, as long as the hardware meets the relevant specs there is no valid reason that something exceeding those specs can sound different. From an acoustic-science perspective, timing errors (jitter) can and do affect the analogue audio quality coming out of a DAC.

There are some very good DACs out there lately with negligible jitter and some that are relatively poor, so anything in the network chain that minimises the jitter going to a DAC already struggling to control it may sound different in the converted analogue.

Your opening line about jitter means, then, that the level is largely irrelevant in network data transmission, but minimising it is vital when it's a digital signal that needs to be accurately converted back to analogue.

The issue I see with your view of this is that you’re not accounting for buffering and packet replacement and realignment. If there is a jitter problem with a DAC (haven’t seen that in over a decade), a cable won’t solve that problem. Ethernet is asynchronous, so unless someone intentionally sets buffers far too low, it can’t be an issue.

If the question is now whether some boutique DACs are so poorly designed that their analog output suffers from jitter, there could conceivably be problems. But that would occur after the Ethernet cable.

Ethernet buffering is very resilient, which can easily be seen in a simple test. Set your software's network buffer to 1 minute (or whatever its max buffer is). Start playing a song and, in the middle, pull the Ethernet cable, wait a few seconds, and reconnect it. You should hear the song play with no interruption or audio degradation despite a multi-second loss of connectivity. The buffer provides the data, and error correction ensures that the packets that weren't received while the cable was disconnected are resent and properly ordered.

Given that, I need an explanation from any vendor claiming that jitter, with its subsequent packet replacement and reordering involving orders of magnitude less time lost than the cable pull, would impact sound.

TLDR: Ethernet is almost never affected by jitter as it is asynchronous: buffering allows problematic packets to be resent and replaced in the correct sequence before the audio device needs the data.
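The cable-pull test above can be modeled with a little arithmetic (a toy sketch; the buffer and outage durations are hypothetical, chosen to match the test described):

```python
# Toy model of the cable-pull test: a playback buffer absorbing a
# multi-second network outage. All numbers are illustrative.

BUFFER_SECONDS = 60   # software network buffer set to 1 minute
OUTAGE_SECONDS = 5    # cable pulled for roughly 5 seconds
PLAYBACK_RATE = 1.0   # buffer drains in real time (1 s of audio per 1 s)

def remaining_buffer(buffer_s, outage_s, rate=PLAYBACK_RATE):
    """Seconds of audio left in the buffer once the outage ends."""
    return buffer_s - outage_s * rate

left = remaining_buffer(BUFFER_SECONDS, OUTAGE_SECONDS)
print(f"Buffer remaining after outage: {left:.0f} s")  # 55 s
print(f"Dropout? {left <= 0}")                         # False: playback continues
```

As long as the outage is shorter than the buffered audio, playback never sees the disconnect; the missing packets are resent and reordered while the buffer drains.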
 
Sep 12, 2021 at 3:01 PM Post #542 of 2,212
Not ignoring it, just asking what specified timing tolerances, if any, are involved: "less than 1 µs … less than 4 µs"?
A little off topic but would you say Wireless could be better, worse or just the same as Ethernet for digital audio ?
 
Sep 12, 2021 at 3:16 PM Post #543 of 2,212
Not ignoring it, just asking what specified timing tolerances, if any, are involved: "less than 1 µs … less than 4 µs"?
A little off topic but would you say Wireless could be better, worse or just the same as Ethernet for digital audio ?

Thanks for the clarification.

With a few caveats, wireless should be the same. It leverages similar buffering and ECC to address packet loss/replacement, so from a technical level, no difference.

The challenge with wireless is the variability of the operating environment. While multiple wired networks in close proximity will have no impact on each other, multiple overlapping wireless networks do have potential problems: lack of appropriate channel separation between neighbouring networks leading to congestion; physical impediments reducing signal strength, sometimes low enough to constrict data throughput; the list goes on. Almost all of these can be easily rectified, but unlike wired networking, which is very much plug and play, some network knowledge and tuning can be helpful to resolve issues on wireless networks.

Most people won't see these issues, and the majority can be resolved by increasing signal strength, adding a network bridge for better Wi-Fi coverage, or finding a channel that isn't being used by the neighbors. It can happen though - I live near a large university, and when they rolled out their meshed wireless solution, they grabbed all of the common channels for their traffic. My Wi-Fi network performance was definitely impacted until I moved my SSIDs to less commonly used 5 GHz channels the school didn't adopt.
 
Sep 12, 2021 at 3:42 PM Post #544 of 2,212
With wireless you then have an active WiFi radio in your component. I prefer to use wired.
 
Sep 12, 2021 at 3:54 PM Post #545 of 2,212
With wireless you then have an active WiFi radio in your component. I prefer to use wired.

Wired definitely delivers plenty of performance with fewer potential pitfalls, and it would be my first choice if installation cost weren't an issue - I'd run in-wall Ethernet in any new construction given the choice. It might not be cost-effective for most folks to retrofit into an existing house, though, and when properly configured, wireless works equally well.
 
Sep 12, 2021 at 3:59 PM Post #546 of 2,212
Thanks for the clarification.

With a few caveats, wireless should be the same. It leverages similar buffering and ECC to address packet loss/replacement, so from a technical level, no difference.

The challenge with wireless is the variability of the operating environment. While multiple wired networks in close proximity will have no impact on each other, multiple overlapping wireless networks do have potential problems: lack of appropriate channel separation between neighbouring networks leading to congestion; physical impediments reducing signal strength, sometimes low enough to constrict data throughput; the list goes on. Almost all of these can be easily rectified, but unlike wired networking, which is very much plug and play, some network knowledge and tuning can be helpful to resolve issues on wireless networks.

Most people won't see these issues, and the majority can be resolved by increasing signal strength, adding a network bridge for better Wi-Fi coverage, or finding a channel that isn't being used by the neighbors. It can happen though - I live near a large university, and when they rolled out their meshed wireless solution, they grabbed all of the common channels for their traffic. My Wi-Fi network performance was definitely impacted until I moved my SSIDs to less commonly used 5 GHz channels the school didn't adopt.
But the issues you are describing (lost packets) will result in dropouts. I had that problem originally and resolved it with a repeater. Packet loss is not what we should be discussing here. The issue is: how does an Ethernet cable or network gear affect sound quality? Saying it doesn't, or can't, is not a satisfactory answer for the legions of audiophiles whose experience says it does.

I've heard the same counter for years, starting with amps, then speaker cables and interconnects, then digital cables, now network gear. My current streamer and DAC are made by exaSound and are technically proficient and highly transparent. The transparency allows me to hear clear differences when I swap ethernet cables or upgrade a power supply. (I always clear the buffer after a change and before continuing).
 
Sep 12, 2021 at 4:16 PM Post #547 of 2,212
With wireless you then have an active WiFi radio in your component. I prefer to use wired.
There's some truth to that. I heard a Sonore designer say that they would never have a wi-fi receiver in their renderers. I know people who added a Wi-Fi dongle to their SOtM renderer and reported diminished sound quality.

It's a topic of interest for me, as I cannot reach my main system with a cable, even though it's only 15 feet from my router (concrete walls, floor and ceiling). My Playpoint streamer has a built-in wi-fi receiver, which I was using initially. After reading about wi-fi radios causing negative sonic effects, I added a TP-Link 580D range extender, configured to receive, but not broadcast. I connect via ethernet cable to the Playpoint, which automatically disables the Playpoint's internal radio. I did hear a significant improvement in sound quality.

A blanket statement that "wired sounds better" however, is controversial at best. Depending on implementation, wired vs. wireless can sound equal, better or worse. If there's any consensus in the threads I've read on various audio sites, it's that a long ethernet cable is sonically inferior to a robust wi-fi signal.
 
Sep 12, 2021 at 4:21 PM Post #548 of 2,212
But the issues you are describing (lost packets) will result in dropouts. I had that problem originally and resolved it with a repeater. Packet loss is not what we should be discussing here. The issue is: how does an Ethernet cable or network gear affect sound quality? Saying it doesn't, or can't, is not a satisfactory answer for the legions of audiophiles whose experience says it does.

I've heard the same counter for years, starting with amps, then speaker cables and interconnects, then digital cables, now network gear. My current streamer and DAC are made by exaSound and are technically proficient and highly transparent. The transparency allows me to hear clear differences when I swap ethernet cables or upgrade a power supply. (I always clear the buffer after a change and before continuing).

I know this won’t be popular here, but sighted subjective evaluations are too prone to biases no human can control to serve as viable evidence. I believe that you believe there are differences. We simply differ on the reason for those.

An Ethernet cable and/or networking gear working as a reliable, repeatable EQ system is impossible within the published Ethernet specifications. That doesn't preclude alteration of those standards, but the burden is on the one making off-standard claims to prove them sufficiently, which in turn would enable the standards to be amended to reflect that proof.

I wasn't kidding about helping someone with sufficient evidence get in front of the correct 802 subcommittee, but I would need hard data in hand to get on a future meeting agenda. Unfortunately, zero hard evidence has been presented that would enable even a rudimentary discussion of a gap in the current standards. Any vendor who had that evidence and could show a need to alter the 802 standards would get great press in the industry and would surely increase sales. Yet this never happens…
 
Sep 12, 2021 at 4:38 PM Post #549 of 2,212
I know this won’t be popular here, but sighted subjective evaluations are too prone to biases no human can control to serve as viable evidence. I believe that you believe there are differences. We simply differ on the reason for those.

An Ethernet cable and/or networking gear working as a reliable, repeatable EQ system is impossible within the published Ethernet specifications. That doesn't preclude alteration of those standards, but the burden is on the one making off-standard claims to prove them sufficiently, which in turn would enable the standards to be amended to reflect that proof.

I wasn't kidding about helping someone with sufficient evidence get in front of the correct 802 subcommittee, but I would need hard data in hand to get on a future meeting agenda. Unfortunately, zero hard evidence has been presented that would enable even a rudimentary discussion of a gap in the current standards. Any vendor who had that evidence and could show a need to alter the 802 standards would get great press in the industry and would surely increase sales. Yet this never happens…
I don't see that this has to do with standards. The ethernet cables I've trialed all indicate on the jacket that they meet 802 standards.

I'm not aware of anyone doing what, to me, would be a viable blind test of anything in audio, ever. These panel tests where random listeners sit in front of a strange system and listen to music samples have proven useless over and over through the years, no matter how many trials are conducted. I've explained my procedure: long-term listening in my own setting, not some artificial test.
 
Sep 12, 2021 at 4:45 PM Post #550 of 2,212
I don't see that this has to do with standards. The ethernet cables I've trialed all indicate on the jacket that they meet 802 standards.

I'm not aware of anyone doing what, to me, would be a viable blind test of anything in audio, ever. These panel tests where random listeners sit in front of a strange system and listen to music samples have proven useless over and over through the years, no matter how many trials are conducted. I've explained my procedure: long-term listening in my own setting, not some artificial test.

That “artificial test” is the gold standard within the medical community. Would you be comfortable if they vetted drugs based only on the subjective opinions of patients who knew they had taken a drug and were told in advance what the expected outcome was? I know I wouldn’t be comfortable with that.

I’ll dig up some examples later, but there certainly have been blind tests in audio. Meyer and Moran would be a good place to start as it was a large, long term test with good controls and enough data collected to be statistically viable.

I won’t go further here on blind testing as it’s expressly forbidden in this forum. Happy to continue in a thread in Sound Science if you’d like to discuss DBT in detail.
 
Sep 12, 2021 at 4:56 PM Post #551 of 2,212
I hope this explanation helps.
 

Attachments

  • 87A06E29-CD2E-4C19-9C4C-C875DD88F957.jpeg (302.8 KB)
  • 4591FE42-B165-4503-881C-E1BD3F2E99A0.jpeg (352.2 KB)
  • D647F577-FE55-4E89-9144-A9558FA38F59.jpeg (315.3 KB)
Sep 12, 2021 at 5:13 PM Post #552 of 2,212
Products either follow the 802 standards (error correction, noise rejection, galvanic isolation) or they don’t work. If that weren’t the case, the digital world as we know it today simply wouldn’t exist. Subjective reviews of a product following these standards demonstrate how powerful the placebo effect can be.
It's not a flaw in the design of the products. This seems to be a physical rule of how electricity can impact audio: the cleaner the better, somehow even through Ethernet cables. Blaming it on placebo seems like an easy way out.
 
Sep 12, 2021 at 5:18 PM Post #553 of 2,212
I hope this explanation helps.

Not really. The part where he discusses Ethernet conveniently ignores that Ethernet is asynchronous and error corrected. The worn-out strawman about the waveform decision tree is irrelevant, as a CRC check would identify the incorrectly read bit and request a resend of the containing packet. For some perspective, we tested the real-world error rate across approximately 250,000 Ethernet-connected devices for a substantial period of time: the error rate due to single misread bits (what the OP is describing) was less than 1 in 10 billion packets. Given that, I wouldn’t worry about this issue; it’s simply not a problem in the real world.
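The CRC mechanism described above can be sketched like this (a toy illustration using CRC-32, the same check used for the Ethernet Frame Check Sequence; the payload here is made up):

```python
# A single flipped bit in a frame's payload changes its CRC-32 checksum,
# so the receiver discards the frame and the transport layer retransmits
# the data. The payload below is an arbitrary illustrative byte string.
import zlib

payload = bytes(range(64)) * 16          # 1024-byte dummy payload
good_crc = zlib.crc32(payload)

corrupted = bytearray(payload)
corrupted[100] ^= 0x01                   # flip exactly one bit
bad_crc = zlib.crc32(bytes(corrupted))

print(good_crc != bad_crc)               # True: the misread bit is caught
```

The flipped bit never reaches the audio application; the frame fails the check and the packet is resent, which is why single-bit read errors don't alter the decoded audio.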

USB audio, due to being isochronous, is, as described in your link, not able to use retransmission-based error correction, but it does leverage robust interpolation to remediate misread data. A much smaller sample size shows similarly vanishingly low rates of actual errors, but the test wasn’t broad enough for me to be comfortable stating the data confirms this isn’t a significant issue in the real world. It does point to that conclusion, though.
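The interpolation idea can be sketched like this (a simplified illustration, not any specific receiver's algorithm; linear interpolation over neighbours is just one common concealment approach):

```python
# Simplified error concealment: when an isochronous transfer delivers a
# corrupted sample that cannot be retransmitted, a receiver may replace
# it with the average of its neighbours. Values here are illustrative.

def interpolate_bad_sample(samples, bad_index):
    """Replace samples[bad_index] with the mean of its two neighbours."""
    fixed = list(samples)
    fixed[bad_index] = (samples[bad_index - 1] + samples[bad_index + 1]) / 2
    return fixed

# The 9.9 sample is a glitch between otherwise smooth values.
print(interpolate_bad_sample([0.0, 0.5, 9.9, 0.7, 0.8], 2))
# [0.0, 0.5, 0.6, 0.7, 0.8]
```

A single concealed sample at 44.1 kHz lasts about 23 microseconds, which is why these vanishingly rare events are so hard to attribute audible differences to.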
 
Sep 12, 2021 at 5:25 PM Post #554 of 2,212
It's not a flaw in the design of the products. This seems to be a physical rule of how electricity can impact audio: the cleaner the better, somehow even through Ethernet cables. Blaming it on placebo seems like an easy way out.

But every Ethernet connector following spec is galvanically isolated. And twisted pair has low susceptibility to electrical noise.

So how is this electrical noise end-running the galvanically isolated connector?

For some perspective: in large electrical power plants, where electrical noise, RFI, and EMI are at levels far above a normal home, normal Ethernet cables are used and connected to highly sensitive analog data-gathering devices. You would think we would see issues there, and that regulation would be in place insisting on “better” Ethernet. Yet there is not, because it isn’t needed.
 
Sep 12, 2021 at 5:57 PM Post #555 of 2,212
That “artificial test” is a gold standard within the medical community. Would you be comfortable if they vetted drugs based only on the subjective opinions of patients who knew they had taken a drug and where they were told in advance what the expected outcome was? I know I wouldn’t be comfortable with that.

I’ll dig up some examples later, but there certainly have been blind tests in audio. Meyer and Moran would be a good place to start as it was a large, long term test with good controls and enough data collected to be statistically viable.
Drug trials and audio DBTs are not comparable. Drug trials work.

The first audio blind test I read about was in the 1970's, where The Absolute Sound reviewers were unable to distinguish between a Pioneer receiver and an expensive Audio Research (IIRC) tube amp. Similar failures have been repeated, ad infinitum. To you that means there is no difference and sighted listening is useless. To me, it demonstrates that DBT's are useless for judging sound quality. On a related topic, I won't say that all measurements are useless, but static tests such as those performed at Audio "Science" Review are useless for determining sound quality.

I won’t go further here on blind testing as it’s expressly forbidden in this forum. Happy to continue in a thread in Sound Science if you’d like to discuss DBT in detail.

Thanks for the invitation, but to quote an old expression: "I'd rather stick pins in my eyes." I've participated on the internet since its inception. I've seen and heard just about everything. Audio blind tests have been the cause of a million death spirals, and talking about them again will change no one's mind.
 
