bigshot
Headphoneus Supremus
I lived for 29 years in an apartment with my nice big speakers. It helps to just not care about the neighbors!
Sighhh... A good USB cable really does sound better... way better. This phenomenon isn't caused by bit errors. It goes beyond our expertise here to talk about it with any degree of authority. If the issue were simple bit errors, the difference between an average USB cable and a high-end USB cable would not manifest as a loss in sound-staging, frequency response, dynamics, pace (add your own here), and the subtle nuances that make music so enjoyable to us. If a most-significant-bit error occurred, it would result in a click/pop artifact. That's not what happens between run-of-the-mill and high-end USB cables. If bit errors were responsible, the musical integrity would come and go; it would be intermittent. That doesn't happen either.
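For what it's worth, the click/pop part of that claim is easy to illustrate: in 16-bit PCM, flipping a sample's most significant bit throws it to the opposite end of the scale. A minimal Python sketch (the sample value is made up for the example):

```python
# Illustrative only: what a single most-significant-bit error does to
# one 16-bit PCM sample.
sample = 100                     # a quiet sample, roughly -50 dBFS
corrupted = sample ^ 0x8000      # flip the MSB, as a transfer error would

# Reinterpret the 16-bit pattern as a signed value
if corrupted >= 0x8000:
    corrupted -= 0x10000

print(sample, "->", corrupted)   # 100 -> -32668: a near-full-scale spike
```

A jump from 100 to -32668 in a single sample is a broadband transient, which is exactly why real bit errors are heard as clicks rather than as a subtle loss of soundstage.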
Please move past the theory and try it on a nice system... preferably not a Yamaha receiver. Go visit a good dealer (or perhaps find a nice company with a trial period), bring your nice $20 Belkin USB cable, and ask them to swap it out with a really good USB cable. Start to finish, it should take less than 15 minutes (well, maybe an hour, because like all of us who have done this, you'll go back and forth 3 times in disbelief). If you have invested in computer-based music and you don't do this, then you're leaving musical performance that you paid for on the table. That's sad... and that's why I took the time to share my experiences with all of you. Don't let the lack of explainable theory keep you stuck in the land of mid-fi. Once you hear the difference, you can enjoy your music more than ever, and then still go back and try to understand it... but good luck... you'll need it. If any of you really do take the challenge, then report back what happens. That would make me happy. Better yet, PM me, as I don't expect to participate in this discussion any further. Bye guys. Happy listening... I tried my best...
Here's a facebook post I made just now:
"Nerdiest thing in the world: testing the difference in sound of USB cables through a semi-high-end audio system. With digital cables, the signal either passes to the next component or it doesn't; there's no in-between or "better" transmission. It's one or the other, therefore it should sound the same.
BUT IT DOESN'T"
I am being serious, with no intent to troll or stir up trouble or sow discord. I can't really accept it either. I was 95% expecting no difference and 5% expecting the slightest lifting of a veil across the audio spectrum... but what did I get? More than that. After an hour of A/B'ing across 4 songs, I could tell the same differences every time. I have great ears, but I still believe anyone would be able to hear it.
My mind is not playing tricks, I HEAR IT. I am mad, because the AudioQuest cable is inferior to the Tellurium cable I borrowed from a dealer. I can't believe the difference, because 1. it's a digital cable, and 2. it's being processed by the DAC and then the amp; after the DAC, it should sound the same.
I will elaborate on each track I tried and what exactly sounded different later on. I'm startled by my testing and don't feel like typing now. (I already feel it's a waste of time to write about this but I will do it).
Two things I need to consider, though. The AudioQuest Carbon cable uses copper conductors; the Tellurium is pure silver. Also, the Tellurium is about 1 meter long, whereas the AudioQuest is 1.5 meters.
I also used the SoX resampler to upsample to 192 kHz, even though CD rips have a 44.1 kHz sample rate. All songs are in WAV format. I noticed songs clipping more often while being upsampled when using the AudioQuest.
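Worth noting: clipping on upsample has a well-known cause that is independent of the cable, namely intersample peaks. Samples that never exceed full scale can lie on a bandlimited waveform whose true peak does exceed it, and a resampler like SoX reconstructs those in-between peaks. A rough sketch using ideal sinc interpolation (pure Python, illustrative signal):

```python
import math

def sinc(x):
    """Normalized sinc, the ideal bandlimited interpolation kernel."""
    return 1.0 if x == 0 else math.sin(math.pi * x) / (math.pi * x)

# The classic worst case: samples alternate 1, 1, -1, -1. Every sample
# sits exactly at full scale (+/-1.0), but they lie on a sine at fs/4
# whose true amplitude is sqrt(2), i.e. about +3 dB over full scale.
N = 200
samples = [1.0 if n % 4 in (0, 1) else -1.0 for n in range(N)]

# Interpolate at half-sample points near the middle of the buffer
# (away from edge-truncation effects) and find the true peak.
peak = max(
    abs(sum(samples[n] * sinc(m + 0.5 - n) for n in range(N)))
    for m in range(90, 110)
)
print(round(peak, 2))  # about 1.41 -- the upsampled signal clips
                       # unless the resampler leaves headroom
```

If the two listening sessions used different resampler settings or gain, that alone could change how often clipping occurs.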
I would actually consider paying the $350 for the Tellurium if I can sell the AudioQuest for what it cost me. And my coworkers and I at the hi-fi boutique were laughing about the ludicrous price of this cable just earlier today.
I don't want to create any hype, but I think I'm a believer now
Daniel
(more to come)
It's not placebo because there is one bird cry in a song I can literally not hear with one cable and barely hear at all with the other.
Did you set up a preamp on each and line level match? I bet that's your problem here.
If you didn't have a switcher to directly compare, it's no good. Auditory memory for similar sounds won't hold up more than a second or two.
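If anyone wants to do the matching properly: measure the RMS of each chain's output on the same passage and trim one until the difference is well under 0.1 dB, since small level offsets are reliably heard as "better" rather than "louder." A minimal sketch (synthetic signals stand in for real captures):

```python
import math

def rms(x):
    """Root-mean-square level of a list of samples."""
    return math.sqrt(sum(s * s for s in x) / len(x))

def match_gain_db(reference, test):
    """Gain in dB to apply to `test` so its RMS matches `reference`."""
    return 20.0 * math.log10(rms(reference) / rms(test))

# Stand-ins for the two chains' outputs: the same tone, one 20% quieter
# (amplitudes are made up for the example).
tone = [math.sin(2 * math.pi * 440 * n / 44100) for n in range(44100)]
chain_a = [0.50 * s for s in tone]
chain_b = [0.40 * s for s in tone]

print(round(match_gain_db(chain_a, chain_b), 2))  # 1.94 dB of trim needed
```

A mismatch of almost 2 dB, as in this example, would easily account for one chain sounding "more dynamic" than the other.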
Playing any music through my Apogee Symphony I/O is definitely a boost in sound quality, bigshot.
If something runs counter to accepted and already-proven theory, it is the responsibility of the new claim to demonstrate its existence and/or validity. Just as lack of proof doesn't necessarily imply falsehood, lack of proof to the contrary is not equivalent to truth.
If A is accepted as true and some claim B contradicts A, then proving B true would mean A is untrue. Conversely, if B cannot be proved true, we continue to treat A as true in the general case; if no such B can be produced at all, we can say that A is true and B is false.
To me (correct me if I am wrong), to prove there is a difference in sound one would have to:
A) Show quantitatively that an average (working) USB cable is unable to transport data as well as a high-end cable. This can be done by monitoring packets or otherwise capturing the transferred data. Since it is a digital signal, it should be bit-perfect, which is quite easy to verify, unlike an analog signal, where the result depends on the sensitivity of the testing instruments. If the data is verified as having no errors, that is unequivocal proof that the cables make no difference.
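Point A is straightforward in practice: capture what actually arrives on the receiving end (e.g. a digital loopback recording) and compare it byte-for-byte against the source. A hash makes the comparison trivial; the file names below are hypothetical:

```python
import hashlib

def file_digest(path, chunk_size=1 << 16):
    """SHA-256 of a file's raw bytes, streamed in chunks so large
    audio files don't need to fit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        while chunk := f.read(chunk_size):
            h.update(chunk)
    return h.hexdigest()

# Hypothetical files: the source audio and a digital capture of what
# arrived over the cable under test.
# if file_digest("source.wav") == file_digest("capture_cable_a.wav"):
#     print("bit-perfect -- the cable changed nothing")
```

If both cables yield captures with the source's digest, the data delivered to the DAC was identical in both cases.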
B) ABX testing with statistically valid sample sizes. Since no statistical test gives true certainty even as n goes to infinity, ABX testing is only useful if case A) is inconclusive. If A proves there is no data degradation but ABX testing shows statistically significant differences, the analytic result will always take precedence over statistical and numerical methods.
This argument cannot necessarily be extended to analog signals, because there you are limited by the accuracy and precision of the measuring equipment.
I've done production sound recording and editing, and I've supervised sound mixes. I had a 24 bit ProTools workstation on my desk at work. I did a lot of testing, because I was the recording and post production supervisor and was responsible for every single track.
What I found was that for recording, ProTools kicked ass. I could take an off mike vocal or a soft guitar lick and boost it as far as I wanted and there would be no noise. But once a mix was done and the track was bounced down to redbook, I could play it on ProTools or on a $150 Yamaha CD player and it sounded the same. The improved quality of the equipment only applied to raising the volume of quiet stuff. At normal listening volume it was identical.
Try a direct, line-level-matched A/B comparison yourself. With a mixing board, it's easy to do. You'll find out what I'm talking about.