New denafrips DDC (Iris and Gaia)
Feb 17, 2022 at 6:48 AM Post #391 of 760
Even though all of my music is 44.1?
I have no hi-rez music. Everything is redbook and my DAC is NOS.
No. This is a fine NOS DAC; it will play best at the native sample rate.
 
Feb 17, 2022 at 6:53 AM Post #392 of 760
I don’t think length matters for SPDIF and AES signals. I’ve heard talk about USB length. Either as short as possible or based on optimal segment length, like 1-2m or whatever the theories are.
You can do long cable runs with AES/EBU, but not with S/PDIF.
 
Feb 17, 2022 at 12:36 PM Post #393 of 760
I don’t think length matters for SPDIF and AES signals. I’ve heard talk about USB length. Either as short as possible or based on optimal segment length, like 1-2m or whatever the theories are.

I disagree. My experience using AES/EBU is that the absolute best sound is with a very short cable, just barely long enough to reach from the source to the DAC. In my case from a Denafrips Iris digital-digital converter to the DAC it is sitting on (Yggdrasil), about 7 inches. The sound is very much clearer than with a 2m cable which supposedly has a minimum of jitter due to internal reflections.
 
Feb 17, 2022 at 12:46 PM Post #394 of 760
I disagree. My experience using AES/EBU is that the absolute best sound is with a very short cable, just barely long enough to reach from the source to the DAC. In my case from a Denafrips Iris digital-digital converter to the DAC it is sitting on (Yggdrasil), about 7 inches. The sound is very much clearer than with a 2m cable which supposedly has a minimum of jitter due to internal reflections.
This makes the most sense to me. However, over the years it's been listed everywhere that digital cables need to be 1.5 m in length. I always thought this was nonsense and the person giving their reasons would give this convoluted, stretched out, word salad of reasons that it needs to be that length. However, the common consensus for i2s is that it needs to be as short as possible. Regarding other digital cable types, I've read for years that it needs to be 1.5 m because of certain physics that are happening inside the cable at shorter length. Somebody please try to defend that argument because it seems so ridiculous to me.
 
Feb 17, 2022 at 2:23 PM Post #395 of 760
Forgot pics of new Hermes.
 

Attachments

  • IMG_20220217_135613519.jpg (3.3 MB)
  • IMG_20220217_135641068.jpg (3.2 MB)
  • IMG_20220217_135652373.jpg (3.3 MB)
Feb 17, 2022 at 2:59 PM Post #396 of 760
Regarding other digital cable types, I've read for years that it needs to be 1.5 m because of certain physics that are happening inside the cable at shorter length.
The only thing that can call for a longer length is the bending factor. If the cable is too short, the bend radius can be smaller than the minimum allowed (when it must turn 180 degrees), which changes the properties of the dielectric and thus of the cable. In all cases impedance matters the most: transmitters, connectors, plugs, cable, and receivers. Other parameters are less important over short distances.
 
Feb 17, 2022 at 3:00 PM Post #397 of 760
This makes the most sense to me. However, over the years it's been listed everywhere that digital cables need to be 1.5 m in length. I always thought this was nonsense and the person giving their reasons would give this convoluted, stretched out, word salad of reasons that it needs to be that length. However, the common consensus for i2s is that it needs to be as short as possible. Regarding other digital cable types, I've read for years that it needs to be 1.5 m because of certain physics that are happening inside the cable at shorter length. Somebody please try to defend that argument because it seems so ridiculous to me.
Of course the shortest cable that fits your requirements is theoretically better than a longer one.

The specifications and standards for AES/EBU, coaxial S/PDIF, optical S/PDIF, and USB specify maximum lengths that compliant devices should expect to work with.

As far as IIS/I2S goes, I think it was designed to connect components on the same board and was never expected (hahaha) to be used to connect physically separate devices.
 
Feb 17, 2022 at 3:03 PM Post #398 of 760
The only thing that can call for a longer length is the bending factor. If the cable is too short, the bend radius can be smaller than the minimum allowed (when it must turn 180 degrees), which changes the properties of the dielectric and thus of the cable. In all cases impedance matters the most: transmitters, connectors, plugs, cable, and receivers. Other parameters are less important over short distances.
Of course the shortest cable that fits your requirements is theoretically better than a longer one.

The specifications and standards for AES/EBU, coaxial S/PDIF, optical S/PDIF, and USB specify maximum lengths that compliant devices should expect to work with.

As far as IIS/I2S goes, I think it was designed to connect components on the same board and was never expected (hahaha) to be used to connect physically separate devices.
Two good points.
 
Feb 17, 2022 at 7:53 PM Post #399 of 760
This makes the most sense to me. However, over the years it's been listed everywhere that digital cables need to be 1.5 m in length. I always thought this was nonsense and the person giving their reasons would give this convoluted, stretched out, word salad of reasons that it needs to be that length. However, the common consensus for i2s is that it needs to be as short as possible. Regarding other digital cable types, I've read for years that it needs to be 1.5 m because of certain physics that are happening inside the cable at shorter length. Somebody please try to defend that argument because it seems so ridiculous to me.

I think that the key to this is that digital cable lengths do matter a lot, but only after the first few inches. In my experience with several different brands, for SPDIF, AES and USB a 2m or 1.5m cable sounds considerably better than a 1m, and this is almost certainly because of the internal reflection-caused timing jitter issue described by a lot of experts. The phenomenon appears to be that an impedance mismatch at the destination or receiver end of the cable (the DAC) causes a reflection of the digital pulse to propagate back along the cable to the source end, where, similarly, an impedance mismatch causes another, delayed, reflection back to the destination DAC. At the DAC's receiver, the detected leading edge of the pulse becomes in effect blurred, smeared, or stretched out in time by these delayed, lower-amplitude reflected pulses, with a resulting timing error. Cumulatively, over many pulses, this is called timing jitter. The relationship of this timing jitter to the bit-to-bit and word-to-word PCM pulse timing is a function of the impedance mismatch and of the cable length and its corresponding reflection delay time, resulting in the observed superiority of cables of 1.5m, 2m, and more.

Of course it really is a tradeoff, since longer cables inherently degrade the signal due to other phenomena. It's just that the ear is inordinately sensitive to this particular form of distortion in digital systems.

It appears that this phenomenon goes away for very short lengths (like my short AES/EBU cable), where the internal reflections are of such short delays that the issue seems not to apply, I'm sure for reasons understandable from the physics of the interface.
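For anyone who wants to put rough numbers on the reflection-timing argument above, here is a back-of-envelope sketch. The velocity factor (0.7c for typical 75-ohm coax) and the focus on 44.1 kHz S/PDIF with biphase-mark coding are my assumptions, not figures from the post:

```python
# Back-of-envelope check: round-trip reflection delay vs. the
# shortest pulse in a 44.1 kHz S/PDIF stream.
# Assumptions (not from the post): velocity factor 0.7c for typical
# coax; biphase-mark coding makes the shortest pulse half a bit cell.

C = 299_792_458          # speed of light, m/s
VELOCITY_FACTOR = 0.7    # assumed for a typical cable dielectric

def round_trip_delay_ns(length_m: float) -> float:
    """Time for a reflection to travel to the far end and back."""
    return 2 * length_m / (VELOCITY_FACTOR * C) * 1e9

# S/PDIF at 44.1 kHz carries 64 bits per frame
bit_rate = 44_100 * 64                    # 2.8224 Mbit/s
shortest_pulse_ns = 1e9 / (2 * bit_rate)  # roughly 177 ns

for length in (0.18, 1.0, 1.5, 2.0):
    delay = round_trip_delay_ns(length)
    print(f"{length:4.2f} m: round trip {delay:5.1f} ns "
          f"({delay / shortest_pulse_ns:6.1%} of shortest pulse)")
```

On these assumptions, a 7-inch cable gives a round trip under 2 ns, while a 2 m cable gives about 19 ns; whether that shift lands on a still-rising edge depends on the driver's rise time, which is the crux of the disagreement in this thread.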
 
Last edited:
Feb 17, 2022 at 8:22 PM Post #400 of 760
I think that the key to this is that digital cable lengths do matter a lot, but only after the first few inches. In my experience with several different brands, for SPDIF, AES and USB a 2m or 1.5m cable sounds considerably better than a 1m, and this is almost certainly because of the internal reflection-caused timing jitter issue described by a lot of experts. The phenomenon appears to be that an impedance mismatch at the destination or receiver end of the cable (the DAC) causes a reflection of the digital pulse to propagate back along the cable to the source end, where, similarly, an impedance mismatch causes another, delayed, reflection back to the destination DAC. At the DAC's receiver, the detected leading edge of the pulse becomes in effect blurred, smeared, or stretched out in time by these delayed, lower-amplitude reflected pulses, with a resulting timing error. Cumulatively, over many pulses, this is called timing jitter. The relationship of this timing jitter to the bit-to-bit and word-to-word PCM pulse timing results in the observed superiority of 1.5m, 2m, and longer cables.

It appears that this phenomenon goes away for very short lengths (like my short AES/EBU cable), where the internal reflections are of such short delays that the issue seems not to apply, I'm sure for reasons understandable from the physics of the interface.
That's a theory that is well above my pay grade and sounds very fancy. I'm not saying that it's not correct; I'm just saying I have no clue what you're talking about.

If that cable needs to be 1.5 m or 2 m, then why doesn't the wiring that leads to the outputs of the transport/DDC also need to be the same length? Why is it that at the outputs of a transport, all of a sudden, for some magical reason, the wire needs to become 1.5 m long before it hits the DAC?
 
Feb 17, 2022 at 9:32 PM Post #401 of 760
That's a theory that is well above my pay grade and sounds very fancy. I'm not saying that it's not correct; I'm just saying I have no clue what you're talking about.

If that cable needs to be 1.5 m or 2 m, then why doesn't the wiring that leads to the outputs of the transport/DDC also need to be the same length? Why is it that at the outputs of a transport, all of a sudden, for some magical reason, the wire needs to become 1.5 m long before it hits the DAC?

If the cable needs to be, say, at least 1m or so just because of the distance between components, then the system will sound better with a 2m or so cable, which mostly eliminates internal reflection-caused jitter. If the cable distance between source and DAC is much shorter (say 5-7 inches), then experience says the system with the very short digital cable sounds better even than with the 2m.

"....why doesn't the wiring that leads to the outputs of the transport/DDC also need to be the same (1.5-2m) length?"
Very short digital transmitter to digital receiver cable lengths within the transport or CD player or DDC don't seem to cause enough delays for reflections to matter much at all. Internal direct-wired cable connections between circuit boards avoid connector-caused impedance mismatches and therefore greatly reduce any reflections. Also, with very short cables the cable characteristic impedance doesn't matter much and therefore doesn't cause much of any reflection phenomena. Also, such short direct-wired connections usually don't involve a digital communication transmitter/receiver format and protocol.
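The internal-wiring point above can be sketched numerically. A common rule of thumb (my assumption here, not something stated in the post) is that a connection is "electrically short" when its round-trip propagation delay is small compared with the signal's rise time, so any reflections fold back into the edge itself rather than arriving as distinct delayed pulses. The 10 ns rise time and 1/3 margin below are illustrative assumptions:

```python
# Rough "electrically short" test: reflections are harmless when the
# round trip is well under the edge's rise time. The 1/3 margin and
# the example rise time are assumptions for illustration only.

C = 299_792_458           # speed of light, m/s
VELOCITY_FACTOR = 0.7     # assumed for a typical cable dielectric

def is_electrically_short(length_m: float, rise_time_ns: float) -> bool:
    """True if the round-trip delay is well under the rise time."""
    round_trip_ns = 2 * length_m / (VELOCITY_FACTOR * C) * 1e9
    return round_trip_ns < rise_time_ns / 3

# Assuming an edge rise time of ~10 ns:
print(is_electrically_short(0.05, 10))   # ~5 cm of internal wiring
print(is_electrically_short(2.0, 10))    # a 2 m interconnect
```

Under these assumptions the few centimeters of wiring inside a transport pass the test easily, while a 2 m interconnect does not, which is one way of reading the distinction the post draws.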
 
Last edited:
Feb 17, 2022 at 9:51 PM Post #402 of 760
If the cable needs to be, say, at least 1m or so just because of the distance between components, then the system will sound better with a 2m or so cable, which mostly eliminates internal reflection-caused jitter. If the cable distance between source and DAC is much shorter (say 5-7 inches), then experience says the system with the very short digital cable sounds better even than with the 2m.

"....why doesn't the wiring that leads to the outputs of the transport/DDC also need to be the same (1.5-2m) length?"
Very short digital transmitter to digital receiver cable lengths within the transport or CD player or DDC don't seem to cause enough delays for reflections to matter much at all. Internal direct-wired cable connections between circuit boards avoid connector-caused impedance mismatches and therefore greatly reduce any reflections. Also, with very short cables the cable characteristic impedance doesn't matter much and therefore doesn't cause much of any reflection phenomena. Also, such short direct-wired connections usually don't involve a digital communication transmitter/receiver format and protocol.
Is there scientific, proven data to back this up? Or is this just theory?
 
Feb 17, 2022 at 9:53 PM Post #403 of 760
Also, are there measurements to prove this? Are there blind listening tests done on a large scale?
 
Feb 17, 2022 at 9:55 PM Post #404 of 760
With all due respect, all I'm hearing is word salad.
 
Feb 18, 2022 at 3:35 AM Post #405 of 760
Also, are there measurements to prove this? Are there blind listening tests done on a large scale?
Reflections can be measured. I suggest you remove the content of your previous message; it seems offensive. The first reflection (after traveling twice the cable length) can smear a transition at the receiver if it arrives exactly while the transition is still completing, shifting the time at which the logic level is recognized. That introduces jitter. If the cable is a little longer, the transition is not smeared, because the first reflection arrives after the logic level has already been recognized. I have never done such calculations, so I can't confirm the validity of this case. Of course there is also another critical moment at the driver side; it happens at half the critical distance, and everything becomes more complicated as cable length increases. :)
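The "critical length" described above, where the first reflection lands exactly during the transition, can be sketched as the length whose round-trip delay equals the edge's rise time. The rise times and velocity factor below are illustrative assumptions, not measured values:

```python
# Sketch of the critical cable length: the first reflection returns
# after one round trip (2 * L / v). If that equals the rise time,
# the reflection arrives just as the edge finishes, which is the
# worst case the post describes. Rise times here are assumptions.

C = 299_792_458
V = 0.7 * C   # assumed propagation velocity in the cable

def critical_length_m(rise_time_ns: float) -> float:
    """Cable length whose round-trip delay equals the rise time."""
    return V * (rise_time_ns * 1e-9) / 2

for tr in (5, 10, 25):
    print(f"rise time {tr:2d} ns -> critical length "
          f"{critical_length_m(tr):.2f} m")
```

On these assumptions a 5 ns edge gives a critical length near half a meter and a 25 ns edge one past 2.5 m, which hints at why different drivers could favor different cable lengths.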
 
