A proposed optical digital cable test
Jun 25, 2010 at 5:49 PM Post #136 of 138
I haven't read through the whole thing, but wouldn't it make more sense to first simply compare the signal going into the cable with the signal coming out? Basically, loop a cable from an output to the input of a computer (obviously, don't have it pass the input through to the output, because then you will just have an infinite loop of monkey death). Have a program monitor the input feed (by monitor, I mean make graphs, etc., not actually play it), and then play something known through the output. In theory, since it's digital and optical, the input and output should be equal, so if you run a diff test on the input against the output, it should come out all zeros. Do this for each cable, and unless the diff test ends up being non-zero on one or all of them, you know the cables are all equal. Only if the results differ from one cable to the next do you need to worry about figuring out how to compare the cables further (a rough sketch of such a diff test is at the end of this post).
 
Sorry if this doesn't make sense, or has already been addressed. I didn't read all the pages.
 
EDIT: also, due to the nature of optics, I'd imagine the quality of the actual jacks (as in the transmitter and receiver) would have more effect than the cables themselves. Maybe a test could be done on this as well.
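 
For what it's worth, here is a minimal sketch of the loopback diff test described above, assuming the known test signal and the loopback capture have both been saved as 16-bit WAV files of the same length and alignment (the file names are hypothetical placeholders):
 
```python
import wave
import numpy as np

def read_samples(path):
    """Read a WAV file and return its raw samples as a NumPy int16 array."""
    with wave.open(path, "rb") as wav:
        frames = wav.readframes(wav.getnframes())
    return np.frombuffer(frames, dtype=np.int16)

played = read_samples("test_signal.wav")         # what was sent out the optical output
captured = read_samples("loopback_capture.wav")  # what came back in the optical input

# Trim to the shorter length in case the capture started late or ran long.
n = min(len(played), len(captured))
diff = played[:n].astype(np.int32) - captured[:n].astype(np.int32)

if np.all(diff == 0):
    print("Bit-perfect: every sample matches.")
else:
    print(f"{np.count_nonzero(diff)} of {n} samples differ "
          f"(max difference {np.abs(diff).max()}).")
```
 
In practice the capture would also need to be time-aligned with the source before the subtraction, since the recording rarely starts on exactly the first sample.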
 
Jun 27, 2010 at 2:52 PM Post #138 of 138


Quote:


I genuinely wonder if bit-perfectness really matters so much. As far as I am aware, nobody has attempted to DBT a bit-perfect original against a non-bit-perfect copy; it would be interesting. My own play/record setup cannot deliver bit-perfectness, resulting in low-level added noise, but is noise that is over 100 dB down on the source humanly detectable? I do not know. One day I will try some test recordings and see what happens. But I do not lose any sleep knowing that what I get has a bit of low-level corruption; compare this to the extra grunge on vinyl and it quickly falls into proper perspective...
 
 

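As a rough sketch of how the "how many dB down is the corruption" question could be checked, under the same hypothetical file-name assumptions as the earlier example: subtract the recorded copy from the source and express the RMS level of the residual relative to the RMS level of the source.
 
```python
import wave
import numpy as np

def rms(x):
    """Root-mean-square level of a float array."""
    return np.sqrt(np.mean(np.square(x)))

def read_samples(path):
    """Read a WAV file and return its samples as a float64 array."""
    with wave.open(path, "rb") as wav:
        frames = wav.readframes(wav.getnframes())
    return np.frombuffer(frames, dtype=np.int16).astype(np.float64)

source = read_samples("test_signal.wav")
copy = read_samples("recorded_copy.wav")
n = min(len(source), len(copy))

residual = source[:n] - copy[:n]
if not np.any(residual):
    print("Bit-perfect: the copy is identical to the source.")
else:
    level_db = 20 * np.log10(rms(residual) / rms(source[:n]))
    print(f"Residual error is {level_db:.1f} dB relative to the source.")
```
 
If the reported figure really is 100 dB or more below the source, that puts the corruption at or below the level the quoted post is describing.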