Do Audiophile Network Switches Make a Difference?
Sep 25, 2021 at 11:07 AM Post #121 of 144
No one ever went broke underestimating the intelligence of the public.
 
Sep 25, 2021 at 7:20 PM Post #122 of 144
You don’t have to be an auto mechanic to tell someone they can’t run their car on water instead of gasoline.
Very true, but what if they asked you whether their Kia Soul would run better or be faster with Turbo Blue 110-octane racing fuel or some fancy triple-tipped spark plug? Or whether they should get that cool-looking electric turbo off eBay because the ad says it adds 10-20 horsepower, or some plug-in performance computer chip that does absolutely nothing? Or whether my 15-year-old daughter, with a 25-inch draw length and 45 lbs of draw weight, should use a 350-grain arrow with a mechanical hunting tip because it will be faster, flatter, and easier to tune? Those are things I can answer, and I can tell you exactly why each either won't work or is a horrible idea.

I am not even sure how a network switch works or why one would be better than another. I do have a basic knowledge of how digital media works, though, so that is why I say it is an opinion. I will leave it up to someone who actually knows how this stuff works to say "No, that is stupid, and this is why."

I have a habit of being fairly opinionated and coming off as a know-it-all at times, so I like to temper that by making the point that it is merely my opinion and I could be wrong. And I am always happy to admit when I am wrong if someone who knows more can correct me and explain why. We all learn something then.
 
Sep 25, 2021 at 7:45 PM Post #123 of 144
Well if anyone takes offense to anything I say, they can assume for themselves that it's IMHO. I don't want to preface everything I say with that. But it's fine if you want to.
 
Sep 25, 2021 at 7:55 PM Post #124 of 144
I get it, and I can be that way too. The older I get the less I give a *&*^ what others think lol. But I am fairly new here and don't know a ton about headphone audio so I am trying to tread lightly. Honestly I don't know why I keep coming back and commenting on things. I should just keep my mouth shut, lurk and search for topics that interest me. I guess I am just bored and lonely lol. The most fun and entertainment I got out of a thread ended up getting deleted, go figure.

I honestly don't do it so much because I am afraid of getting in trouble or whatever. I just don't like to mislead people and give poor information if I can help it.
 
Sep 27, 2021 at 4:13 PM Post #125 of 144
If I’m streaming music, say from Tidal, those 1’s and 0’s have been through an untold number of hubs, switches, routers, the local exchange, and long lengths of copper cable. I’m assuming that all this equipment was designed to carry data, and is therefore not of 'audiophile' quality.

Are people saying that none of this matters, as long as you use an ‘audiophile’ network switch in your home?
All those switches and routers are in the realm of networking: binary signalling at the physical layer, framing at the data link layer, packet routing at the network layer, and message transport at the transport layer via TCP and its three-way handshake. There are no streaming DACs in these network segments.
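To make that concrete, here is a minimal sketch (Python, run entirely on localhost, my own illustration rather than anything from a real streamer) of why nothing in those segments can alter the bits: TCP's checksums and retransmissions make delivery bit-perfect, which you can verify by hashing both ends.

```python
# Minimal sketch: TCP delivers bytes bit-perfectly, verified by hashing
# both ends. Runs entirely on localhost; no audiophile switch required.
import hashlib
import os
import socket
import threading

PAYLOAD = os.urandom(1_000_000)  # 1 MB of random "audio" data

def server(listener: socket.socket) -> None:
    conn, _ = listener.accept()
    received = bytearray()
    while chunk := conn.recv(65536):
        received.extend(chunk)
    conn.close()
    print("received:", hashlib.sha256(received).hexdigest())

listener = socket.create_server(("127.0.0.1", 0))
port = listener.getsockname()[1]
t = threading.Thread(target=server, args=(listener,))
t.start()

with socket.create_connection(("127.0.0.1", port)) as c:
    c.sendall(PAYLOAD)
t.join()
print("sent:    ", hashlib.sha256(PAYLOAD).hexdigest())  # identical hash
```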
Which brings us to the last-mile integration: delivery into the audiophile network. All of a sudden we care about LF/HF noise; up to this point we didn't have to. And here we have two schools of thought.
One, the proponents of audiophile switches. Most of these switches contain filters, among other things. While these filters remove noise, they also constrain bandwidth, one of the three factors affecting jitter (the other two being the clock and the noise itself).
Two, the proponents of no-filters-in-the-signal-path. The concept of jitter misleads people into thinking that all you need in a digital signal is the correct bits (which are relatively trivial to transmit) with great timing (low jitter), so that all you need is a great clock. This simplistic view is highly misleading. At least three things matter: the clock, noise, and bandwidth.

Picture a perfect square wave, with time on the horizontal axis and voltage on the vertical axis. We will assume the clock is perfect, i.e. the vertical signal edges occur at perfectly spaced intervals (the bit rate). When the signal represents a binary 0, it is at 0v; when it represents a binary 1, it is at 1v. And we will assume the receiver decides that a 0-to-1 transition has occurred when the signal rises through the 0.5v level, and that a 1-to-0 transition has occurred when the signal falls through the 0.5v level.

Now imagine noise added to the signal. If the frequency of the noise is below the bit rate, the perfect square wave swims on top of a longer, smoother wave. The interesting point is that the timing between the data transitions (where those vertical edges pass through 0.5v) is unchanged. So no problem, yet. If the frequency of the noise is above the bit rate, the horizontal lines get fuzzy. And if we combine low-frequency noise with high-frequency noise, the effects combine. Again, the interesting point is that the timing between the data transitions is unchanged, provided the noise is not extremely high. So, again, no problem. Noise on its own (as long as the deviations it causes stay materially below 0.5v) is not a problem. The reason is those vertical edges: noise does not change the spacing between them.
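You don't have to take the geometry on faith; here is a minimal numpy sketch (my own toy model) of that point: with vertical edges, additive noise smaller than 0.5v never changes which samples sit above or below the threshold, so the detected transition times are identical.

```python
# Sketch: additive noise (amplitude < 0.5v) does not shift the threshold
# crossings of an ideal square wave, because the edges are "vertical".
import numpy as np

rng = np.random.default_rng(0)
bits = rng.integers(0, 2, size=64)                 # random bitstream
clean = np.repeat(bits, 100).astype(float)         # 0v/1v levels, instant edges

def crossings(signal, threshold=0.5):
    above = signal > threshold
    return np.flatnonzero(above[1:] != above[:-1])  # sample indices of transitions

noise = 0.3 * rng.uniform(-1, 1, size=clean.size)  # any mix of LF/HF noise, under 0.5v
print(np.array_equal(crossings(clean), crossings(clean + noise)))  # -> True
```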
Now imagine there is no noise. Zero noise is impossible, but something else that is impossible is the vertical edge on the square wave, since it requires infinite bandwidth: a vertical edge implies the signal can move between 0v and 1v in more or less the same instant. Whatever tools we have to transmit a signal, the demands of high-bit-rate signals are way beyond what those tools can deliver. Think about how your analog cables can mess with sound up to around 20kHz, then think about the enormously wider frequency range required of a digital cable (optical cables just have a different set of problems, mainly related to reflections). The higher the bit rate, the harder it gets.

When we allow for constrained bandwidth, instead of the transitions being instantaneous, the signal goes up a slope when transitioning from 0v to 1v and down a slope when transitioning from 1v to 0v. If the bandwidth were the same as the bit rate, the signal would be a sine wave. As you add harmonics of the bit rate, the sine wave begins to square out; to reasonably square out the signal you need several harmonics (say 7 or more) above the bit rate, and that is a lot of bandwidth, even more for higher-bit-rate signals. Interestingly, in both of these constrained-bandwidth examples the transitions through 0.5v are still perfectly spaced, even with the sine wave. So still no problem.
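For the curious, here is a small numpy sketch (again my own toy model) of the "squaring out": build the wave from odd harmonics of the bit rate. One harmonic gives the pure sine, more harmonics steepen the edges, and in every case the 0.5v crossings stay evenly spaced.

```python
# Sketch: build a 0v-1v square wave from odd harmonics of the bit rate.
# More harmonics -> steeper edges, but the 0.5v crossings stay evenly spaced.
import numpy as np

f = 1.0                                   # "bit rate": one transition every 0.5 s
t = np.linspace(0, 4, 40_000, endpoint=False)

def bandlimited_square(n_harmonics):
    s = sum(np.sin(2 * np.pi * k * f * t) / k
            for k in range(1, 2 * n_harmonics, 2))   # odd harmonics only
    return 0.5 + (2 / np.pi) * s                      # scaled into the 0v..1v range

for n in (1, 3, 7):                       # 1 harmonic = the pure sine case
    sig = bandlimited_square(n)
    above = sig > 0.5
    edges = t[1:][above[1:] != above[:-1]]
    print(n, "harmonic(s), edge spacing:", np.round(np.diff(edges), 3))
```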
But as I mentioned, a higher-bit-rate signal (if you think high-bit-rate files must always sound better) requires even more bandwidth to square out the wave, so in a system with a finite limit on bandwidth, a lower-bit-rate signal will be more accurately represented than a high-bit-rate one. On top of that, if you ask anything in a music server to work faster, it will work with less precision; this is a key trade-off to be aware of when you assume higher bit rates must be better just because the numbers are bigger.

These examples only allow us to conclude that there is no problem if we can achieve zero noise or infinite bandwidth. But each of those goals is unattainable, and the problem becomes apparent when there is both noise and constrained bandwidth. So what happens if we add a low-frequency noise component to a bandwidth-constrained digital audio signal? All of a sudden, the 0.5v points are shifted left or right, because the low-frequency noise lifts or drops the signal between bits, and shifting the slopes up or down shifts the 0.5v crossings left or right. The greater the amplitude of the noise, and the greater the bandwidth constraint, the greater the effect on timing (jitter).

Now if we add high-frequency noise to a bandwidth-constrained signal, the transition timing at precisely 0.5v becomes hard for any digital receiver to discern. If the signal is vertical at the transition, noise does not affect it; but as soon as the transition is not vertical, noise changes the transition point. It is the combination of constrained bandwidth and noise that inevitably creates jitter (variation in data transition timing), regardless of how great the clock is. The proponents of this school of thought hold that the noise can effectively be dealt with only at the upstream boundary, e.g. the music server; once it propagates midstream and downstream, it is impossible to get rid of.
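And the combined effect is easy to reproduce numerically; here is a last toy sketch (mine, not a measurement): give the edges the worst-case finite slope, add a low-frequency ripple, and the 0.5v crossing times start to wander even though the clock generating the signal is perfect.

```python
# Sketch: finite-slope edges + low-frequency noise = jitter in the 0.5v
# crossing times, even with a perfect clock generating the signal.
import numpy as np

t = np.linspace(0, 4, 40_000, endpoint=False)
signal = 0.5 + 0.5 * np.sin(2 * np.pi * t)     # worst case: bandwidth equals the bit rate
lf_noise = 0.1 * np.sin(2 * np.pi * 0.3 * t)   # slow ripple riding under the bits

def edge_spacing(sig):
    above = sig > 0.5
    edges = t[1:][above[1:] != above[:-1]]
    return np.diff(edges)

print("clean spacing:", np.round(edge_spacing(signal), 4))             # uniform 0.5 s
print("noisy spacing:", np.round(edge_spacing(signal + lf_noise), 4))  # wanders: jitter
```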
 
Sep 27, 2021 at 9:04 PM Post #126 of 144
Audiophiles are experts at thinking up solutions to problems that don't exist. All that really matters is if it's audible... which it isn't.
 
Sep 28, 2021 at 3:59 AM Post #127 of 144
lol all that netsplaining about signal voltages and bits might sound a little bit convincing if the transmitted data were directly translated, raw, into the audio signal BUT - the data streamed to a music streamer from services like Tidal, Spotify, et al. is in a compressed format that still needs to be decompressed by a codec.

So none of that mumbo jumbo applies, because the data will be buffered by the music streamer and then assembled and decompressed into the actual audio format afterwards. Whatever jitter, timing errors, or any other anomaly for that matter will be down solely to the music streamer, its firmware/codec/software, and whatever clock it's using.
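The buffering point is easy to sketch (my own toy illustration, with made-up frame sizes and timings): the network side fills a buffer in irregular bursts, while the playback side drains it at a fixed rate from its own clock, so network timing never reaches the DAC.

```python
# Sketch: a streamer's buffer decouples bursty network arrival from
# steady playback. The output timing depends only on the local clock.
import queue
import random
import threading
import time

buf: "queue.Queue[bytes]" = queue.Queue(maxsize=64)

def network_side():
    # Packets arrive in irregular bursts: jittery network timing.
    for i in range(32):
        time.sleep(random.uniform(0.0, 0.02))
        buf.put(f"frame{i}".encode())

def playback_side():
    # Frames are clocked out at a fixed rate set by the local clock;
    # nothing about the arrival timing survives the buffer.
    for _ in range(32):
        buf.get()                # blocks only if the buffer underruns
        time.sleep(0.01)         # fixed output rate: one frame per 10 ms

threading.Thread(target=network_side).start()
playback_side()
print("played 32 frames at a steady rate despite bursty arrivals")
```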
 
Sep 28, 2021 at 4:05 AM Post #128 of 144
I'm going to repeat myself. Sorry about that!

Usually it's better to identify a problem and then go looking for a cure. Do a controlled listening test to see if the streamed copy sounds different than the original file. If it does, you definitely have a problem.

It is completely backwards to skip right past verifying that the problem exists and start working on what might be causing it and how to fix it. Personally, I can't wrap my head around ideas like this, but it seems that there are a lot of people who have no concept of skepticism and documentation. They keep running down rabbit holes fixing problems that don't exist. It doesn't take an EE degree to listen to two files and check to see if one is degraded or not. Why don't people do that? Why don't they even THINK of doing that?
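In fact the first check doesn't even need ears. Here's a minimal null-test sketch in Python (my own; the filenames are hypothetical, and it assumes two time-aligned 16-bit PCM WAV captures of the same track): subtract one from the other and report the peak residual.

```python
# Sketch of a null test: subtract two supposedly-identical captures and
# report the peak residual. Assumes time-aligned 16-bit PCM WAV files.
import wave
import numpy as np

def read_wav(path):
    with wave.open(path, "rb") as w:
        assert w.getsampwidth() == 2, "expects 16-bit PCM"
        frames = w.readframes(w.getnframes())
    return np.frombuffer(frames, dtype=np.int16).astype(np.float64)

a = read_wav("original_rip.wav")        # hypothetical file names
b = read_wav("streamed_capture.wav")
n = min(a.size, b.size)
residual = a[:n] - b[:n]

peak = np.max(np.abs(residual))
if peak == 0:
    print("bit-identical: nothing to hear")
else:
    print(f"peak residual: {20 * np.log10(peak / 32768):.1f} dBFS")
```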
 
Sep 28, 2021 at 7:57 AM Post #129 of 144
So this is another angle from which to look at the alleged problem. It concerns USB cables going directly to the DAC rather than Ethernet cables, which usually go to a source device, but in either case we're talking about the proposed effects of interference on a digital cable. Ironically, I found this link because someone on reddit was trying to use it to defend the practice of buying an expensive USB cable, but I digress.

https://www.audiosciencereview.com/...s/do-usb-audio-cables-make-a-difference.1887/

The conclusion was that Amir actually did find a measurable difference between USB cables when using the worst-performing DAC he'd measured at that point. However, the measured difference was far below the threshold of audibility, and this was a worst-case scenario, as the Schiit Modi DAC he used did a poor job of filtering noise on its data and power inputs.

So worst-case scenario on a DAC doing a poor filtering job...the difference was measurable but not audible.

I believe with a good degree of confidence that the conclusion can be extrapolated to Ethernet cables.
 
Sep 28, 2021 at 2:00 PM Post #130 of 144
I have done exactly what Bigshot is talking about: compared Tidal to the original or other versions of the music, as well as trying different ways of streaming it on different devices. A lot of the music I listen to on Tidal I actually own on CD, dating back to the '80s.

I hooked my DVD/CD player up to my DAC via a digital connection. This makes it very easy to switch from one source to the other: I just start the song on both devices at the same time and flip the input switch back and forth, so I can hear the two devices playing the same song back to back. I also have various devices that will stream Tidal in different ways, so I tested those too. Some use direct Ethernet cables, some Wi-Fi, and some cellular, as well as files stored on my phone. Tidal has an "offline" mode that lets you store files on a portable device so you can listen to saved albums and playlists in case you are in an area with no cell service. While I don't have a fancy expensive Ethernet cable, I do have a few that are pretty short and overbuilt for what is needed. I also have a 50-foot cable, so why not throw that into the mix? I also tried the Tidal app on my computer compared to using Tidal through a web browser.

This is what I found. Most of it sounded the same. In a few cases the Tidal versions sounded a bit better, and in a few cases the CDs sounded better. One example is Natalie Merchant's "Seven Years": the Master Quality version on Tidal is wrecked. It just sounds horrid. You can hear bleed-through from another track in some soft parts, like an old worn-out tape player, and her voice distorts in some spots. The CD version sounds fantastic. And it didn't matter what device I used for Tidal or how it streamed, via direct cable, Wi-Fi, or cellular; it sounded the same. They sounded bit-for-bit identical from one device to the other. Any differences I heard between the CDs and Tidal were purely down to how the files were transferred or remastered. The other thing I discovered is that just because a track or album is labeled "Master Quality" or "HiFi" doesn't mean it is going to sound better. Some sounded worse; some sounded better, as in you could tell something had been done to it, like a remaster. Most sounded the same, even the non-hi-fi, CD-quality versions.

So in the end it boiled down to the same thing that has always mattered in audio: the most important factor is the quality of the original source recording. If that sucks, nothing else in the chain matters. A bad recording is just a bad recording.
 
Sep 28, 2021 at 5:03 PM Post #131 of 144
Most of it sounded the same. In a few cases the Tidal versions sounded a bit better, and in a few cases the CDs sounded better.

I imagine most of the time, streaming services just rip a CD and use that. I know that is what Amazon does. I doubt that streaming services (other than perhaps Apple Music) do much in the way of mastering themselves, unless they are touting some sort of audiophile stream. The few that sound better or worse are probably just ripped from different CD releases with different mastering. I bet if you looked at used copies of the Natalie Merchant album you'd find an old CD release that has all the problems you describe. Yup. Garbage in, garbage out.
 
Sep 28, 2021 at 5:16 PM Post #132 of 144
So this is another angle from which to look at the alleged problem. It concerns USB cables going directly to the DAC rather than Ethernet cables, which usually go to a source device, but in either case we're talking about the proposed effects of interference on a digital cable. Ironically, I found this link because someone on reddit was trying to use it to defend the practice of buying an expensive USB cable, but I digress.

https://www.audiosciencereview.com/...s/do-usb-audio-cables-make-a-difference.1887/

The conclusion was that Amir actually did find a measurable difference between USB cables when using the worst-performing DAC he'd measured at that point. However, the measured difference was far below the threshold of audibility, and this was a worst-case scenario, as the Schiit Modi DAC he used did a poor job of filtering noise on its data and power inputs.

So worst-case scenario on a DAC doing a poor filtering job...the difference was measurable but not audible.

I believe with a good degree of confidence that the conclusion can be extrapolated to Ethernet cables.
Incorrect. Twisted-pair Ethernet cabling is specified to carry data with full integrity over runs of up to 100 m (about 328 ft).
 
Sep 29, 2021 at 12:36 AM Post #133 of 144
I'm amazed this thread has been going for 9 pages. 1 Gbps switches are cheap now; my ISP connection, modem, router, and switches are all 1 Gbps. I can easily stream 4K video while other devices are on Wi-Fi. Audio codecs, even lossless ones, don't come close to the bandwidth of those other sources. A 1 Gbps link has so much headroom that bit errors from signal noise are simply a non-issue for audio.
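Here's the back-of-envelope arithmetic behind that (my numbers, using CD-quality PCM as the example):

```python
# Back-of-envelope: lossless CD-quality audio vs. a 1 Gbps link.
sample_rate = 44_100          # Hz
bit_depth = 16                # bits per sample
channels = 2

audio_bps = sample_rate * bit_depth * channels   # 1,411,200 bits/s
link_bps = 1_000_000_000                         # 1 Gbps Ethernet

print(f"CD-quality PCM: {audio_bps / 1e6:.2f} Mbps")
print(f"share of a 1 Gbps link: {100 * audio_bps / link_bps:.2f}%")
# ~1.41 Mbps, about 0.14% of the link; even 24-bit/192 kHz stays under 1%.
```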
 
Sep 29, 2021 at 9:15 AM Post #134 of 144
Incorrect. Twisted-pair Ethernet cabling is specified to carry data with full integrity over runs of up to 100 m (about 328 ft).

I was speaking purely about the theory that interference riding along the Ethernet cable affects the sound. It's a ridiculous argument, to be sure; I was just pointing out that a DAC that was bad at filtering out interference did show noise on a measurement, but that noise was well below the range of human hearing.
 
Sep 29, 2021 at 9:36 AM Post #135 of 144
I was speaking purely about the theory that interference riding along the Ethernet cable affects the sound. It's a ridiculous argument, to be sure; I was just pointing out that a DAC that was bad at filtering out interference did show noise on a measurement, but that noise was well below the range of human hearing.
No, you are just making crap up. You are taking one limited test between a DAC and a source and trying to apply those same measurements to streaming music via Ethernet cable. It's completely ludicrous. First, that test showed the effect was below audible detection. Second, the data transfer rates of the two systems are completely different. Plus the data still has to be processed by the PC and the DAC, as well as the drivers.

You are a new head-fier trying to create problems where none exist to make yourself stand out. You are just making yourself look foolish.
 
