What parameters should be measured to quantify differences between cable acoustic properties?
Mar 5, 2009 at 11:56 AM Post #31 of 37
Quote:

Originally Posted by JaZZ
Copied from a parallel thread:
So I'd recommend giving skin effect/phase distortion a try when it comes to measurements. Not in the form of phase measurements, but rather in the form of complex waveforms -- such as a cymbal crash or an excerpt of one -- if possible analogue or in high-resolution digital (96 or 192 kHz sampling rate), for better signal-shape discrimination/identification, since this avoids transient corruption by the anti-aliasing filter.



OK, then I will try to look for signal differences there.

The next step would be to rate the level of difference using studio listening tests. I will be the first to admit it would be best if a better-qualified person than me did this testing.

PS Can anyone lend me a recording studio and some sensitive monitoring equipment for a few weeks? :)
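
For the "look for signal differences" step, here is a minimal null-style comparison sketch, assuming NumPy/SciPy are available; the filenames cable_a.wav and cable_b.wav are hypothetical placeholders for two captures of the same passage made through different cables, recorded at the same sample rate and trimmed to the same start point:

```python
# Minimal null-test sketch: subtract two captures of the same passage made
# through different cables and see how much energy is left over.
# Filenames are placeholders; real captures would need careful level and
# time alignment before this is meaningful.
import numpy as np
from scipy.io import wavfile

rate_a, a = wavfile.read("cable_a.wav")
rate_b, b = wavfile.read("cable_b.wav")
assert rate_a == rate_b, "captures must share a sample rate"

n = min(len(a), len(b))                 # common length, in float for the maths
a = a[:n].astype(np.float64)
b = b[:n].astype(np.float64)

residual = a - b                        # the "null": zero only if the captures are identical
rms_sig = np.sqrt(np.mean(a ** 2))
rms_res = np.sqrt(np.mean(residual ** 2))
print(f"residual relative to signal: {20 * np.log10(rms_res / rms_sig):.1f} dB")
```

A very deep null (a large negative figure) would mean the two takes are essentially identical; any audible cable difference should leave residual energy well above the noise floor of the capture chain.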
 
Mar 5, 2009 at 5:27 PM Post #32 of 37
Quote:

Originally Posted by Pio2001
Actually, it just occurred to me that it might work if the DAC used for playback and the ADC used to record are slaved to the same clock.


Essentially, they already are. The clock is embedded into the digital data stream by the ADC, and your DAC uses it to sync its own clock (except in asynchronous DACs). However, this still doesn't give an absolutely identical clock source, because the PLL will "iron out" the timing to remove jitter. That is, by the way, the reason why cable-induced jitter is irrelevant.
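
As a rough illustration of that "ironing out", a toy model only (not how any particular DAC's PLL is implemented; the loop constant and jitter figure below are made up for the example):

```python
# Toy model of a PLL smoothing incoming clock jitter: the recovered clock
# tracks the long-term rate of the embedded clock but low-pass filters the
# edge-to-edge timing error. All numbers are illustrative, not from any DAC.
import numpy as np

fs = 44_100.0                          # nominal sample clock (Hz)
n_edges = 10_000
ideal = np.arange(n_edges) / fs        # perfectly regular clock edges (s)

jitter_rms = 2e-9                      # 2 ns RMS random timing error (made up)
incoming = ideal + np.random.normal(0.0, jitter_rms, n_edges)

alpha = 0.01                           # loop "bandwidth": smaller = slower but cleaner
recovered = np.empty(n_edges)
recovered[0] = incoming[0]
for i in range(1, n_edges):
    predicted = recovered[i - 1] + 1.0 / fs          # where the next edge should be
    recovered[i] = predicted + alpha * (incoming[i] - predicted)

print("incoming jitter RMS :", np.std(incoming - ideal))
print("recovered jitter RMS:", np.std(recovered - ideal))   # roughly 10-15x lower here
```

A real PLL also has to track frequency drift (a second-order loop), but the low-pass behaviour is the point: fast timing wander on the incoming stream largely disappears before the DAC chips see the clock.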

G
 
Mar 5, 2009 at 7:17 PM Post #33 of 37
Quote:

Originally Posted by gregorio
The clock is embedded into the digital data


We were talking about recording with a microphone... or at least from an analog output.
 
Mar 5, 2009 at 7:42 PM Post #34 of 37
Quote:

Originally Posted by Pio2001
We were talking about recording with a microphone... or at least from an analog output.


I'm lost, what difference does that make?

G
 
Mar 6, 2009 at 11:16 AM Post #35 of 37
Quote:

Originally Posted by gregorio
I'm lost, what difference does that make?


I'm lost myself. When you test a cable inserted in a stereo system, and record the sound with a microphone, preamplifier, and professional soundcard, the data stream produced by the ADC has no clock embedded. It goes through USB or PCI, which are asynchronous buses.
And even if it produced a clocked data stream, what effect could it have on the CD or SACD player of the system under test?
 
Mar 6, 2009 at 1:51 PM Post #36 of 37
Quote:

Originally Posted by Pio2001
I'm lost myself. When you test a cable inserted in a stereo system, and record the sound with a microphone, preamplifier, and professional soundcard, the data stream produced by the ADC has no clock embedded. It goes through USB or PCI, which are asynchronous buses.
And even if it produced a clocked data stream, what effect could it have on the CD or SACD player of the system under test?



OK, I think I see where the confusion is. The data stream produced by the ADC does have a clock embedded! Whether the data stream is later passed to the DAC via USB, PCI, S/PDIF, optical, etc. is pretty much irrelevant. It's loaded into the audio buffers of the DAC and then processed. In most cases the embedded clock is passed through a PLL, where the clock signal is smoothed and then used by the DAC chips for decoding. So it does have a direct effect on the system under test.
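
As a sketch of why the clock can be said to be "in" the data for S/PDIF/AES3: those interfaces use biphase-mark coding, which guarantees a level transition at every bit boundary, so the receiver can recover timing from the stream itself. A simplified illustration (frame preambles and channel-status bits are ignored here):

```python
# Biphase-mark coding, as used (in simplified form here) by S/PDIF/AES3.
# Every data bit occupies two half-cells and always starts with a transition;
# a 1 adds a second transition mid-bit. The guaranteed edges are what the
# receiver's PLL locks onto, i.e. the clock travels with the data.
from typing import List

def biphase_mark_encode(bits: List[int], start_level: int = 0) -> List[int]:
    level = start_level
    cells = []
    for bit in bits:
        level ^= 1            # mandatory transition at the bit boundary
        cells.append(level)
        if bit:
            level ^= 1        # extra mid-bit transition encodes a 1
        cells.append(level)
    return cells

print(biphase_mark_encode([1, 0, 1, 1, 0, 0, 1]))
# Even a long run of zeros still toggles once per bit, so the edge rate
# never drops and clock recovery never loses its reference.
```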

G
 
Mar 6, 2009 at 2:16 PM Post #37 of 37
Guys,

I think you are getting a bit bogged down with the experimental side of things here.
It is easy to measure and compare two recorded signals well beyond what the human ear is capable of resolving. Please stop thinking of test equipment in terms of hi-fi capability: the sort of lab measuring equipment you can readily purchase goes way beyond the accuracy and sampling rates of hi-fi equipment, which by its nature is optimised only for the audio range.

At the moment you are thinking along deterministic lines of comparison using a hard time reference, which, incidentally, is entirely worthwhile and straightforward to achieve. But there are many other ways to compare the two signals: time- or phase-shifting a waveform at similar points in the music until the two overlay may be all that is needed to show some differences, such as different drop-offs, reverb tails, etc.
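
A minimal sketch of that overlay idea, assuming two mono captures at the same sample rate that differ mainly by an unknown time offset; the data below is synthetic stand-in material, not a real measurement. Estimate the lag by cross-correlation, shift one capture, and inspect what remains:

```python
# Overlay comparison sketch: align two captures by cross-correlation, then
# look at the residual. Requires SciPy >= 1.6 for correlation_lags.
import numpy as np
from scipy.signal import correlate, correlation_lags

def align_and_diff(a: np.ndarray, b: np.ndarray):
    """Shift b so it best overlays a; return (lag_in_samples, residual)."""
    corr = correlate(a, b, mode="full")
    lags = correlation_lags(len(a), len(b), mode="full")
    lag = int(lags[np.argmax(corr)])         # lag < 0 means b is a delayed copy of a

    if lag >= 0:
        a_seg, b_seg = a[lag:], b[: len(a) - lag]
    else:
        a_seg, b_seg = a[: len(a) + lag], b[-lag:]
    n = min(len(a_seg), len(b_seg))
    return lag, a_seg[:n] - b_seg[:n]

# Synthetic example standing in for two real captures: same signal, 120-sample delay.
rng = np.random.default_rng(0)
a = rng.standard_normal(48_000)
b = np.concatenate([np.zeros(120), a])
lag, residual = align_and_diff(a, b)
print(lag, np.max(np.abs(residual)))         # expect -120 and a residual of 0.0
```

With real music captures the residual will never be exactly zero, but its size and spectrum (drop-offs, reverb tails, and so on) are what you would examine for a cable-related difference.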
 
