bfreedma
Depending on the quality of the DAC and other components, and on how it handles incoming jitter, this has nothing to do with lost packets but with the timing accuracy involved. The brain's auditory neurons work at a threshold of around 4 microseconds, so any timing error greater than that will shift what should be a dead-centre stereo image by roughly 2 degrees; that's the level of sensitivity we're talking about. Once that same timing error hits transients, it affects instrument timbre as well as bass resolution and depth perception.
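For a sense of scale, here's a back-of-envelope version of that localisation claim using the simple spherical-head interaural time difference (ITD) model; the ear spacing and speed of sound are assumed values on my part, not figures from any study:

```latex
% Simple ITD model: d = ear spacing (~0.18 m assumed), c = 343 m/s
\mathrm{ITD}(\theta) = \frac{d}{c}\sin\theta
\qquad\Longrightarrow\qquad
\Delta\theta \approx \frac{c\,\Delta\mathrm{ITD}}{d\cos\theta}
% near centre (theta = 0): 343 * 4e-6 / 0.18 ~ 7.6e-3 rad ~ 0.44 degrees
```

With those assumptions a 4 microsecond error moves the image a bit under half a degree; the roughly 10 microsecond just-noticeable ITD commonly cited in the psychoacoustics literature lands nearer a degree, so shifts in the low single degrees are at least the right order of magnitude.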
I'm just looking for reasons why, in some cases, both sides can be correct. From an electrical engineer's perspective, as long as the hardware meets the relevant specs there is no valid reason that something exceeding those specs can sound different; from an acoustic-science perspective, timing errors (jitter) can and do affect the analogue audio quality coming out of a DAC.
There are some very good DACs out there lately with negligible jitter, and some that are relatively poor, so anything in the network chain that reduces the jitter reaching a DAC already struggling to control it may sound different in the converted analogue.
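To put rough numbers on "negligible" versus "relatively poor", here's a small sketch using the textbook jitter-limited SNR approximation for a full-scale sine wave; the clock-quality figures below are illustrative assumptions, not measurements of any product:

```python
import math

def jitter_snr_db(signal_hz: float, rms_jitter_s: float) -> float:
    """Best achievable SNR for a full-scale sine sampled with a clock
    having the given RMS jitter: SNR = -20 * log10(2 * pi * f * tj)."""
    return -20.0 * math.log10(2.0 * math.pi * signal_hz * rms_jitter_s)

# Illustrative clock qualities (assumed values, not vendor specs)
for label, jitter_s in [("good DAC clock", 10e-12),   # 10 ps RMS
                        ("mediocre clock", 1e-9),     # 1 ns RMS
                        ("poor clock", 10e-9)]:       # 10 ns RMS
    print(f"{label:>15}: {jitter_snr_db(20_000, jitter_s):6.1f} dB at 20 kHz")
```

At 10 ps the jitter floor (about 118 dB at 20 kHz) sits comfortably below 16-bit quantisation noise; only once the clock degrades into the nanosecond range does jitter start to intrude, which fits the point that any audible difference would involve a DAC already struggling.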
Your opening line about jitter means, then, that its level is largely irrelevant in network data transmission, but minimising it is vital when a digital signal needs to be accurately converted back to analogue.
The issue I see with your view is that you're not accounting for buffering and for packet replacement and realignment. If there is a jitter problem with a DAC (I haven't seen one in over a decade), a cable won't solve that problem. Ethernet is asynchronous, so unless someone intentionally sets the buffers far too low, it can't be an issue.
If the question is now whether some boutique DACs are so poorly designed that their analog output suffers from jitter, there could conceivably be problems. But that would be occurring after the Ethernet cable.
Ethernet buffering is very resilient, which is easy to see in a simple test. Set your playback software's network buffer to one minute, or whatever its maximum is. Start playing a song, pull the Ethernet cable in the middle of it, wait a few seconds, and reconnect it. You should hear the song play on with no interruption or audio degradation despite a multi-second loss of connectivity. The buffer supplies the data, and the transport layer (TCP retransmission on top of Ethernet's frame checksums) ensures that the packets that weren't received while the cable was disconnected are resent and properly ordered.
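Here's a toy version of that cable-pull test as a producer/consumer buffer; the chunk size, buffer depth, and the simulated 3-second outage are arbitrary illustration values, not any player's real settings:

```python
import queue
import threading
import time

CHUNK_S = 0.1                        # each buffered chunk covers 100 ms of audio
BUFFER_S = 60                        # ~1 minute of lookahead, as in the test above
buf = queue.Queue(maxsize=round(BUFFER_S / CHUNK_S))

def network_reader(outage_at: float, outage_len: float) -> None:
    """Fills the buffer, but goes silent during the simulated cable pull."""
    start = time.monotonic()
    while True:
        t = time.monotonic() - start
        if outage_at <= t < outage_at + outage_len:
            time.sleep(0.05)         # cable pulled: nothing arrives
            continue
        try:
            buf.put("chunk", timeout=0.05)   # reconnect: TCP refills quickly
        except queue.Full:
            pass                     # buffer is already topped up

def player(duration_s: float) -> None:
    """Consumes one chunk per CHUNK_S; an empty buffer would be a dropout."""
    dropouts = 0
    for _ in range(round(duration_s / CHUNK_S)):
        try:
            buf.get_nowait()
        except queue.Empty:
            dropouts += 1
        time.sleep(CHUNK_S)
    print(f"dropouts: {dropouts}")   # stays at 0: the buffer rode out the outage

threading.Thread(target=network_reader, args=(5.0, 3.0), daemon=True).start()
time.sleep(1.0)                      # let the buffer fill before playback starts
player(15.0)
```

With a minute of audio buffered and only a 3-second outage, the playback loop never sees an empty queue, which is the no-interruption result described above.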
Given that, I'd need an explanation from any vendor claiming that jitter, and the subsequent packet replacement and reordering, which involves orders of magnitude less lost time than the cable pull, would impact sound.
TL;DR: Ethernet is almost never affected by jitter because it is asynchronous; buffering allows problematic packets to be resent and replaced in the correct sequence before the audio device needs the data.