
The HDMI Cable Discussion - Page 4

post #46 of 338

Digital either works or it doesn't. I can see no reason why cheap cables would work any differently from expensive snake-oiled ones.

post #47 of 338
Thread Starter 
Quote:
Originally Posted by ac500 View Post

Because headphone voice coils are powered by analog electricity :P. At some point the audio data needs to be converted into electrical power appropriate for whatever headphone technology is used. Digital (including optical) is quite useful for everything inside computers, but at some point the data needs to be delivered to the actual physical device. If you fed a digital signal to a headphone, that would just mean it has to have a DAC and amp inside the headphone itself, which, needless to say, probably isn't going to sound that good.

 

In other words, the amount of quality lost over a cable from your amp to your headphone is negligible and, as far as I know, insignificant (although I'm not entirely decided on that); however, the quality of your DAC and amp is quite significant, so it's much better to have a big desktop DAC/amp plus a long cable than a tiny wimp of an amp built into your headphone with short internal wires from the tiny amp to the drivers.



interesting!
Although there are those electro-something-or-other headphones which also require power... so it wouldn't be the strangest thing in the world, would it?
I mean, my question is:
Is it possible to have headphones run on optical only?

 

post #48 of 338
Thread Starter 
Quote:
Originally Posted by mark2410 View Post

Digital either works or it doesn't. I can see no reason why cheap cables would work any differently from expensive snake-oiled ones.

 

In that respect, the only thing would be the shielding, as in the optical cable might crack on cheaper models.
 

 

 

post #49 of 338
Quote:
Originally Posted by Totally Dubbed View Post



interesting!
Although there are those electro-something-or-other headphones which also require power... so it wouldn't be the strangest thing in the world, would it?
I mean, my question is:
Is it possible to have headphones run on optical only?

 


You must mean electrostatic headphones. Those use a different kind of driver. Instead of a voice coil making a membrane vibrate, they use an extremely thin, electrically charged membrane suspended between two perforated metal plates (stators) in a frame. When music plays, the signal is applied to those plates and creates an electrostatic field, making the membrane move. In the end these headphones work much like dynamic ones, in that it's still a membrane being moved to create sound. They simply have a different driver and require a different kind of amplifier.

 

All headphones require power. The reason headphones can't use an optical cable is that they can't use that signal. When you play a file on your computer, a program like iTunes (or hopefully something better) decodes that bunch of digital data. Since it's a computer, it's still a digital signal, i.e. just 1's and 0's. Then it reaches a DAC, where, by ways mysterious and magical to me, that digital signal made up of 1's and 0's gets turned into an analog signal. Then it goes into an amp, which basically increases the amplitude of that signal, and then into your headphones. If you plugged an optical cable straight into the headphones, it wouldn't work, because they're expected to work only with analog signals. You would need a DAC inside and, as if that weren't enough, a gain stage to amplify the signal. It's like throwing gasoline at car wheels and expecting them to move: it's impossible, because you need a way to 'decode' the chemical energy in the fuel into mechanical energy, which is what the engine does.
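[Editor's note] The chain described above (digital data, then DAC, then amp, then driver) can be sketched as a toy calculation. This is purely illustrative, with made-up sample values; a real DAC and amp are analog hardware, not software functions.

```python
# Toy sketch of the playback chain: digital PCM samples -> "DAC" -> "amp".
# Real DACs and amps are analog hardware; this only shows the conceptual math.

def dac(samples_16bit):
    """'Decode' signed 16-bit PCM samples into normalized voltages in [-1.0, 1.0)."""
    return [s / 32768.0 for s in samples_16bit]

def amp(voltages, gain=2.0):
    """Scale the analog-style signal, the way an amplifier increases amplitude."""
    return [v * gain for v in voltages]

pcm = [0, 16384, -16384, 32767]   # the raw digital data (just numbers, i.e. 1's and 0's)
analog = amp(dac(pcm))            # only this scaled analog signal can drive a voice coil
print(analog)                     # -> [0.0, 1.0, -1.0, 1.99993896484375]
```

An optical cable carries only the `pcm` stage, which is why plugging it straight into a driver can't make sound.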

post #50 of 338
Thread Starter 
Quote:
Originally Posted by LizardKing1 View Post


You must mean electrostatic headphones. Those use a different kind of driver. Instead of a voice coil making a membrane vibrate, they use an extremely thin, electrically charged membrane suspended between two perforated metal plates (stators) in a frame. When music plays, the signal is applied to those plates and creates an electrostatic field, making the membrane move. In the end these headphones work much like dynamic ones, in that it's still a membrane being moved to create sound. They simply have a different driver and require a different kind of amplifier.

 

All headphones require power. The reason headphones can't use an optical cable is that they can't use that signal. When you play a file on your computer, a program like iTunes (or hopefully something better) decodes that bunch of digital data. Since it's a computer, it's still a digital signal, i.e. just 1's and 0's. Then it reaches a DAC, where, by ways mysterious and magical to me, that digital signal made up of 1's and 0's gets turned into an analog signal. Then it goes into an amp, which basically increases the amplitude of that signal, and then into your headphones. If you plugged an optical cable straight into the headphones, it wouldn't work, because they're expected to work only with analog signals. You would need a DAC inside and, as if that weren't enough, a gain stage to amplify the signal. It's like throwing gasoline at car wheels and expecting them to move: it's impossible, because you need a way to 'decode' the chemical energy in the fuel into mechanical energy, which is what the engine does.


that was a brilliant explanation - cheers mate :)

post #51 of 338

I hesitated to write this post, but it is the truth.

 

I have a Sony HX900 (The best LCD TV from 2010, full LED backlit) and CA751 blu-ray player (Oppo 93 equivalent). I had an ISF calibration done (made a huge difference, whites are now white instead of yellow). So you can see from my equipment I am a bit of a videophile, and I now refuse to watch anything but blu-ray.

 

I have purchased quite a few specialty HDMI cables and compared them: QED Signature, QED Reference, Chord Silver Active and Kordz EVX (for those in the USA, QED and Chord are very big UK-based high-end cable companies).

 

I found either no difference, or perhaps a slight imagined difference, between the cables. One cable, the Chord Active HDMI, even seemed to make the picture worse, creating a subtle shifting of edges. This Chord HDMI ran each signal line through some sort of filter to improve audio, but it did something to the video that created more noise.

 

So anyway, I presumed, like so many others, that HDMI is not a performance limiter. I was wrong. Recently I purchased a Mapleshade Vivlink HDMI with the Plus Upgrade. This cable is 2m long (all the others were 1m), and it has the most ridiculous hairshirt construction you have ever seen. But it improved Blu-ray output in every way. More everything, more detail. Where detail in background and pan shots was obscured before, you can now see more clearly into the image.

 

To be sure, the improvement is still subtle, but it is blatantly noticeable. So noticeable that a good video camera recording of the screen will show up the differences, such as edge pixelation of an image that simply disappeared with the Vivlink HDMI.

 

I'm not interested in debating those who read theoretical white papers and presume to understand how the real world works. I'm just telling you my actual experience. I can understand the skepticism, as the more expensive cables I tried turned out to be no better than a $20 lead.

 

FWIW, the Mapleshade Vivlink HDMI was designed by Pierre Sprey. He does have some engineering background, being known as a father of the F-16 Fighting Falcon and A-10 Thunderbolt II warplanes.

 

http://shop.mapleshadestore.com/Mapleshade-Vivlink-HDMI-Cable-with-PLUS-Upgrade/productinfo/VIVILINK2MP/


Edited by agisthos - 2/16/12 at 11:08pm
post #52 of 338

No. Just no.

 

None of the reasons listed really connect to how digital signals work, and all of the comparisons to audio cables are irrelevant, as they are most likely about analog cables.

 

Even with your previous skepticism and experience, it doesn't rule out that you're simply perceiving differences that aren't there.

 

Their viewing panel is hardly legitimate support for their argument without an understanding of the methods used. They don't even explain why they weren't able to find video of sufficient quality without shooting their own. And I have to question what device they used that was able to outperform every existing movie, with multi-million-dollar budgets and shot at resolutions far beyond any consumer display. Sure, they have to scale it down to consumer formats, but I'd be extremely surprised if these guys were actually able to shoot video of higher quality than ANYTHING available, and do it at a lower price.

 

If there truly is a difference, it would not be that hard to test. You make a source that outputs data at a desired rate. You create a receiver that measures the rate and accuracy of the received data. There's no need for all of these perceived-difference tests when we know how the technology works and can test it so easily.

 

The data gets sent as a set of 0's and 1's. You can't make those 0's or 1's any better. The data is either received or it's not. It's either correct or it's not. It's either transmitted at the proper rate, or it's not. You can take two cables, test them as I described above, and you'll know if there will be any differences in the image.
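[Editor's note] The test described above is essentially a bit-error-rate (BER) measurement. A toy software simulation of the idea looks like this; the error probabilities are made up, and a real tester would use dedicated hardware on an actual cable:

```python
import random

def transmit(bits, error_prob):
    """Simulate a digital link: each bit flips with probability error_prob."""
    return [b ^ 1 if random.random() < error_prob else b for b in bits]

def bit_error_rate(sent, received):
    """Fraction of bits that arrived wrong: the figure of merit for a digital cable."""
    errors = sum(s != r for s, r in zip(sent, received))
    return errors / len(sent)

random.seed(0)                                   # reproducible demo
sent = [random.randint(0, 1) for _ in range(100_000)]

perfect = transmit(sent, error_prob=0.0)         # an in-spec cable
noisy = transmit(sent, error_prob=1e-3)          # a hypothetical out-of-spec cable

print(bit_error_rate(sent, perfect))             # -> 0.0 (the data is either right or it's not)
print(bit_error_rate(sent, noisy))               # roughly 0.001
```

Two cables with the same measured BER at the same rate deliver, by definition, the same bits to the display.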

 

I immediately doubt them when they offer no explanation of what was changed between their two available models, not even a vague description. They don't offer any information about the specifications. Which HDMI standard is it designed for? It can't be future-proof. No resolutions mentioned. No 3D capabilities mentioned. Absolutely nothing but perceived differences. If they did as much extensive testing as they describe, they should at least offer the most basic specifications or capabilities.

 

Edit: I just wanted to add something. I am absolutely open to the idea that cables make a difference. I'm just waiting to see blind testing with consistent results. Even if it comes down to a limited number of people being able to see a difference, I'll believe there might be a difference if they can correctly judge the difference with reasonably high accuracy. And I'm not just talking about slightly above chance. This wouldn't be that difficult to do, yet no one has done it.
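[Editor's note] As a rough illustration of why "slightly above chance" isn't persuasive: under a simple binomial model of a blind ABX-style test, you can compute how likely a given score is from pure guessing. The trial counts below are hypothetical:

```python
from math import comb

def p_value(correct, trials, chance=0.5):
    """Probability of getting at least `correct` right out of `trials` by pure guessing."""
    return sum(comb(trials, k) * chance**k * (1 - chance)**(trials - k)
               for k in range(correct, trials + 1))

# 12/20 is "slightly above chance" and entirely unremarkable;
# 18/20 would be very hard to explain away as guessing.
print(round(p_value(12, 20), 3))    # -> 0.252 (about a 1-in-4 fluke: not evidence)
print(round(p_value(18, 20), 4))    # -> 0.0002 (a fluke about 2 times in 10,000)
```

This is why "reasonably high accuracy" in a blind test matters so much more than anecdotes.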


Edited by CC Lemon - 2/17/12 at 12:40am
post #53 of 338
Thread Starter 
Quote:
Originally Posted by agisthos View Post

To be sure, the improvement is still subtle, but it is blatantly noticeable. So noticeable that a good video camera recording of the screen will show up the differences, such as edge pixelation of an image that simply disappeared with the Vivlink HDMI.

 

popcorn.gif

 

Well, as long as YOU noticed the difference, that's fine... but hey, that's a hefty price to pay for "just an HDMI cable".
 

 



Quote:
Originally Posted by CC Lemon View Post

I immediately doubt them when they offer no explanation of what was changed between their two available models, not even a vague description. They don't offer any information about the specifications. Which HDMI standard is it designed for? It can't be future-proof. No resolutions mentioned. No 3D capabilities mentioned. Absolutely nothing but perceived differences. If they did as much extensive testing as they describe, they should at least offer the most basic specifications or capabilities.

 

Edit: I just wanted to add something. I am absolutely open to the idea that cables make a difference. I'm just waiting to see blind testing with consistent results. Even if it comes down to a limited number of people being able to see a difference, I'll believe there might be a difference if they can correctly judge the difference with reasonably high accuracy. And I'm not just talking about slightly above chance. This wouldn't be that difficult to do, yet no one has done it.

 

Agreed with your edit, and yes, it annoys me when they don't mention it, especially on expensive HDMI cables! It's a schoolboy error.
 

 

 

post #54 of 338

I guess it is kind of like digital coaxial cables. Many audiophiles accept that analog cables alter the sound, but because digital cables are supposed to be just 1's and 0's, it does not fit with their engineering knowledge, and they will deny that such a difference can be possible.

 

The only way I can think of that will convince some people is to do an HD video camera test and show the obvious difference by switching the cables in and out, before and after. It is the only thing that has ever made me consider buying a video camera.

post #55 of 338
Thread Starter 

I also have another question, probably more head-fi related too:

What's your take on OPTICAL cables?

As there are some stupidly expensive ones too...

post #56 of 338

I must say,

I'm a little disappointed with this thread and what it has become.

 

If I am correct in assuming that this thread was specifically designed to DEBATE the controversial issue of whether 'claimed' higher-quality HDMI cables actually make a difference compared to lower-quality HDMI cables (noting that they would be of the same grade, i.e. v1.3, v1.4, etc.)...

 

I was going to reply to some of the comments made regarding my previous posts about my own experience with HDMI and the difference I noted.

However, now I'm doubtful it's even worth it at all.

 

You guys aren't debating anything. Debate involves give and take. That clearly isn't happening here. It's a comment along the lines of 'all HDMI cables will produce the exact same picture quality, period, no ifs, ands, or buts'. Then along comes another poster who has clearly done his research on video quality, personally telling you that there is a difference. But instead of trying to discuss and dissect why this is, despite it contradicting scientific claims, it's 'NONONONONONO... science says this... you must be perceiving it this way, but in actuality it's that way... it's a placebo...'

 

Seriously, if consumer bias and 'placebo' are your only defense against obvious testimony to the fact, then it doesn't matter how many times you recite scientific doctrines. The bottom line is, people see differences, both visually and audibly. It happens, and there are far too many people reporting it for all of it to be placebo. Granted, quite a few of them will be under a placebo effect, but definitely and certainly not all of them.

 

Here's one example where your scientific argument falls short. 

Previously there was a statement made that an HDMI cable will either produce a signal which results in full HDMI-quality picture and sound, or it simply won't, as per the digital encoding of the signal... 1, 0, 1, 0... and such.

Then, when the challenge of longer cables needing extra shielding and a denser build, construction and design came up, it was explained that this is done to protect the signal from getting weak, and thus prevent distortion and other artifacts, such as stepping, shadowing, jittering, sync issues, pixelation, incorrect or laggy picture and motion, right down to an incorrectly displayed or randomly cropped picture (aspect ratio issues, yes, those were reported as well).

 

But wait, what happened to 'either there's a signal with full HDMI quality, or there isn't anything at all'? The above clearly contradicts that.

 

Now, before someone starts yelling 'this is different, because the HDMI cable is either faulty or not to spec', know this: that argument is question-begging, plain and simple. Why? Because part of the debate's core is that many of these no-name companies that produce cheaper HDMI cables, and claim that they're to spec, are actually not to spec, yet the cables are still sold as HDMI cables. And of course, since the evidence of seeing different quality with different cables is inconsistent, the scientific approach dismisses it as anecdotal and, of course, tries to explain it away as placebo or some other psychobabble reasoning.

 

 

Keep that in mind.

 

 

 

 

 

post #57 of 338
Quote:
Originally Posted by agisthos View Post

I guess it is kind of like digital coaxial cables. Many audiophiles accept that analog cables alter the sound, but because digital cables are supposed to be just 1's and 0's, it does not fit with their engineering knowledge, and they will deny that such a difference can be possible.


 

I had a long explanation of things typed out, but I kept making changes and adding information and now I've been up for way too long and just got frustrated with how it looked. I'm way too tired to think straight and explain everything I was trying to get across, so I'm going to try and keep it short. Please excuse any errors or anything that comes across as insulting.

 

It seems like you need to read up on what exactly it means to be digital and how that information is transferred and processed. It is inherently a series of bits (0's and 1's), and it really is that simple. Yes, there are errors. This thread has already covered how HDMI accounts for those errors, and any proper cable should not produce any detrimental issues if it is able to display the image at all. Basically, the only reason you should see an improvement is if you were using a cable that was below the required specifications.

 

The way you specifically mentioned the background and pan shots seeming to be better makes me question how much you understand about how the images on the screen are processed. Any differences should be consistent across the image, regardless of focus or content. Saying the HDMI cable specifically improved background images would be like saying it has specifically improved displaying lions. That's just not how it works.

 

Also, as I said earlier, it would be very easy to take a set of data transferred through one cable and a set of data transferred through another to see if there's a difference. If you don't think that's a sufficient way to compare cables, then you're basically implying that we don't actually have an understanding of how this technology works. If that's the case, we must be extremely lucky to have managed to create most digital devices. Literally the only important thing about an HDMI cable is the rate and accuracy of the transferred data. It's either right, or it's wrong. And there's correction for the instances where it's wrong. Beyond a point, a cable is no longer within the required specifications, so it's not reasonable to compare it. You won't see a huge difference between two proper cables.

 

I'm actually more willing to believe brands like AudioQuest, because they can at least provide some of the basic technical specifications. I would expect a small company that's supposedly providing the best technology, extensively researched and tested, to provide at least the most basic information on how it is better, even some statistical analysis/breakdown of the 'panel' comparing the cables. I seriously doubt they did blind testing, so that's probably useless anyway. If they can't provide that information, then I'd like them to at least admit that they have no idea how or why it performs better.

 

So yeah... I'm done. Time for sleep. I'll let you guys discuss more and figure out if I want to say more when I wake up.

post #58 of 338

ac500 killing it.

 

Also, for long runs, I would be using HDMI over cat5/6 rather than just a long HDMI cable - very long cables tend to be specialised and expensive.

post #59 of 338

Quote:
Originally Posted by CC Lemon View Post

Also, as I said earlier, it would be very easy to take a set of data transferred through one cable and a set of data transferred through another to see if there's a difference. If you don't think that's a sufficient way to compare cables, then you're basically implying that we don't actually have an understanding of how this technology works. If that's the case, we must be extremely lucky to have managed to create most digital devices.

 

Exactly. This whole notion that cable X will magically improve quality versus cable Y is seriously ignorant, and the idea that "the white papers are just theory and ignore the real world" is doubly ignorant. Do people seriously still think that errors and distortions make it through without being noticed by the receiving end? This is true for analog, but as I explained earlier, digital errors cannot with any real probability make it through without being detected. Or do you really think that some cables are better for reasons beyond the well-understood bit error rate, some magical thing nobody knows about that can only be noticed "in practice"?
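[Editor's note] As a generic illustration of how digital links catch corruption rather than letting it slip through silently: data is typically guarded by checksums or error-correcting codes. The sketch below uses a CRC-32 check; this is not HDMI's exact mechanism (HDMI uses BCH error correction for its data islands), just the general principle:

```python
import zlib

def send_packet(payload: bytes) -> bytes:
    """Append a CRC-32 checksum so the receiver can detect corruption."""
    crc = zlib.crc32(payload).to_bytes(4, "big")
    return payload + crc

def receive_packet(packet: bytes) -> bytes:
    """Recompute the CRC; refuse the data if it was corrupted in transit."""
    payload, crc = packet[:-4], packet[-4:]
    if zlib.crc32(payload).to_bytes(4, "big") != crc:
        raise ValueError("corruption detected")
    return payload

packet = send_packet(b"video frame data")
assert receive_packet(packet) == b"video frame data"   # clean link: data accepted

corrupted = bytes([packet[0] ^ 0x01]) + packet[1:]     # flip one bit "in the cable"
try:
    receive_packet(corrupted)
except ValueError:
    print("single-bit error caught by the checksum")
```

CRC-32 detects every single-bit error by construction, which is why corruption on a checked digital link cannot quietly show up as a "slightly worse" picture.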

 

The claim that "theory is not useful in the real world" proves to me conclusively that you are not an engineer and have literally no idea how engineering is done. You have clearly never designed any software, hardware, or other high-tech device or product, because if you had, you'd know that it's the theory that makes all of this possible in the first place.

 

It reminds me of arguing with people who think the theory of relativity is "false". They say "I don't believe Einstein was right" as though it's something you can just "disagree" with. That's like disagreeing with gravity; it doesn't work, sorry to say. Without such "theory" being correct, along with all its implications, your cell phone simply would not work. As CC Lemon said, do you think it's some massive "accident" that engineers were able to design all the technology you use today? Do you think it's all "magic", and that real-world experience was used in some fluffy, hand-wavy way to refine some circuit by trial and error until a TV was invented? As someone else said, "no no no no no"... because this is extremely ignorant. Once again, such arguments are arguments of pure ignorance, and helping such people understand reality is like helping someone who "disagrees that the moon orbits the earth" understand that it's not a matter of opinion.

 

Now, to address the main non-ignorance-based counter-argument, which seems to be that "$10 cables are under-spec":

 

It would be extremely easy to test the claims of ANY cable manufacturer by simply using a device to measure the error rates of each respective cable. If it were discovered that all the $10 cables are vastly under-spec (as in something like a million times worse, which would be required before "snow" would appear), then why don't we see studies showing this? A simple review from a reliable, unbiased party making said measurements of various cables would be a MASSIVELY effective marketing "trick", because it would confirm scientifically that all cheap cables are vastly under-spec (and that, for some unimaginable reason, every TV in the world fails to detect this despite being fully capable of displaying a "CABLE FAULTY" warning). And by the way, keep in mind I said "reliable, unbiased party", not the cable company itself.


Edited by ac500 - 2/17/12 at 7:36am
post #60 of 338
Thread Starter 
Quote:
Originally Posted by Shotor102 View Post

I must say,

I'm a little dissapointed with this thread and what it has become.

 

If I am correct in assuming that this thread was specifically designed to DEBATE the controversial issue of whether 'claimed' higher-quality HDMI cables actually make a difference compared to lower-quality HDMI cables (noting that they would be of the same grade, i.e. v1.3, v1.4, etc.)...

 


It's a debate, but we can all be one-sided if we choose to be.

 

 
