Type, i.e. 1.2/3/4
The difference without should be £3-5. With all those things, £20
Well, I was wrong - I thought HDMI used a buffer like USB does; I didn't know about that kind of error correction. I guess in theory you could say a cheaper, unshielded cable would pick up more interference, which would corrupt the digital signal more, making the TV have to correct it. That still wouldn't explain accounts of it actually giving you a better display.
So I've been sitting here all this time (well not all this time...but sitting yes)
And there is a question I would like some clarification on.
So, if your setup requires a longer HDMI cable, let's say 20 feet or longer (say you're running it through the wall and such).
Clearly, you'll need the higher-density, heavily shielded cables with better insulation, vacuum sealing and all that nice techno-jargon the industry proclaims. Putting the scientific terminology aside, in layman's terms, my understanding is that this is required for longer cables: the longer the distance the signal has to travel from the sender (media player, BD player or whatever) to the receiver (TV or home theatre receiver), the more prone it is to signal interference and degradation. And as one poster briefly explained to me, the cable has to adhere to the HDMI standard of so many allowed bit errors (one) per so many bits (one billion)... or such. Now, all this is fine and great.
But of course, researchers naturally tested different HDMI cables (specifically the longer-length ones) and discovered that with poor construction and shielding, distortion was noted, such as snow, artifacting, stepping, lag and sync issues.
So, can someone actually explain this to me properly to fill in the gaps? It seems there is some grey area here that is not addressed, or that I am not seeing.
My understanding is that the longer it is, the signal, as you said, might take longer to get there - but shielding, wire quality etc. won't really help it - by that I mean the signal, not the durability.
I think - and this is just my understanding of it - the shielding will protect it from interference, the cable thickness will extend the cable's life, and the gold terminations will help it last longer and not corrode.
Of course a 50m HDMI cable will be much more expensive, but it shouldn't cost more than around £50 or so.
Still cheaper than your 2m monster cable one.
I'm probably wrong, but I'll take a shot at this from my limited knowledge...
I believe part of the issue is confusing signal quality with signal strength. I don't know that signal quality can really be applied to HDMI cables, so that's not really what can be described as the problem. Over a length of cable, you lose signal strength; a longer distance will result in more signal loss. Eventually, you'll start to have some of the issues described - you'll basically only be receiving part of the signal. Eventually, there won't be a strong enough signal to produce any image at all.
I may be wrong, but I believe a better-constructed cable will be capable of carrying the signal over longer distances. I honestly don't know how much of a range we're talking about, but I don't believe it's massive unless you're talking about a really poorly constructed cable. Eventually you need an extender. For a typical length of cable, you shouldn't see any difference, because there won't be enough distance for the signal to lose strength with either a high- or low-quality cable.
I know that's an overly simplistic way of describing it and I most likely worded it poorly, but I believe it's a somewhat accurate way of describing what can be a problem with an HDMI cable and why it becomes more pronounced with length.
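The "strength, not quality" point above can be sketched with a toy model - the loss-per-metre and cutoff numbers below are invented for illustration, not real HDMI electrical figures:

```python
# Toy model: a digital signal attenuating over cable length.
# Both constants are hypothetical, chosen only to illustrate the idea
# that loss grows with distance until the link simply stops working.
LOSS_DB_PER_M = 0.5      # made-up attenuation per metre
THRESHOLD_DB = -12.0     # made-up level below which the receiver fails

def received_level_db(length_m: float) -> float:
    """Signal level (dB relative to the transmitted level) after length_m metres."""
    return -LOSS_DB_PER_M * length_m

def link_works(length_m: float) -> bool:
    """True while the received level is still above the receiver's cutoff."""
    return received_level_db(length_m) >= THRESHOLD_DB

for length in (2, 10, 20, 30):
    print(f"{length} m: {received_level_db(length):.1f} dB, works: {link_works(length)}")
```

In this model a 2 m cable has loads of margin, while a 30 m run of the same cable falls below the cutoff - which matches the "fine at short lengths, extender needed eventually" description.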
Exactly. There may be some signal loss, but it's not going to be something that will change the entire quality of the image or anything close to that. After corrections, it shouldn't even be noticeable.
I admit, I know very little and have been doing research just so that I'm a little bit more well versed on how HDMI actually works. From some of the reading I've done, it seems like the signals are constantly being sent twice and that is how the error correction is accomplished. One is the main signal and the other is an inverted signal. The receiving device then compares the received data to detect inconsistencies and corrects for differences. Is this correct?
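If that "signal plus inverted copy" description is roughly right, the receiver-side comparison could be pictured like this - a deliberately simplified bit-level sketch, not HDMI's actual TMDS coding:

```python
def detect_errors(main: list[int], inverted: list[int]) -> list[int]:
    """Return the positions where the two copies break the expected
    complement relationship, i.e. where a bit got flipped in transit."""
    return [i for i, (m, inv) in enumerate(zip(main, inverted)) if m == inv]

signal = [1, 0, 1, 1, 0]
inverse = [1 - b for b in signal]   # the inverted copy sent alongside
inverse[2] ^= 1                     # simulate interference flipping one bit
print(detect_errors(signal, inverse))
```

A clean pair produces an empty list; any flipped bit makes the two copies agree where they should differ, so the receiver knows exactly which position is suspect.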
> Whose "dornio"?
Apparently he's someone who critiques others for misspelling, while ironically misspelling "who's" :P
> But of course, researchers naturally tested different HDMI cables (specifically the longer-length ones) and discovered that with poor construction and shielding, distortion was noted, such as snow, artifacting, stepping, lag and sync issues.
A poorly constructed HDMI cable is, by definition, not to HDMI spec. If the HDMI cable is bad, the TV's decoder software is very much "aware" of the faulty connection, thanks to error detection and correction codes. How exactly it responds to this may depend on the TV, but it *should* shut off and display "not properly connected" or something of that sort before any visible distortion is possible. That said, I do know that a lot of software/firmware for some devices is known to be quite 'crappy' so some TVs may do a bad job and not display any message in the event of a faulty HDMI connection.
In any case, if a company sells you an HDMI cable that allows more than one bit error per billion, that company is not selling an HDMI-compliant cable, and by definition it is a faulty cable. Go get your money back, as the product is faulty.
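For anyone wondering how a receiver can "know" bits arrived damaged at all, a single parity bit is the textbook minimal example - HDMI's real error coding is more sophisticated than this, so treat it purely as an illustration of the principle:

```python
def add_parity(byte: int) -> tuple[int, int]:
    """Append an even-parity bit so the total number of 1 bits is even."""
    parity = bin(byte).count("1") % 2
    return byte, parity

def check_parity(byte: int, parity: int) -> bool:
    """True if the received byte is still consistent with its parity bit."""
    return (bin(byte).count("1") + parity) % 2 == 0

byte, p = add_parity(0b1011_0010)   # four 1 bits -> parity 0
corrupted = byte ^ 0b0000_1000      # one bit flipped in transit
print(check_parity(byte, p), check_parity(corrupted, p))
```

Any single flipped bit breaks the parity, so the receiver can at least flag the byte as bad - which is the mechanism that lets a TV react to a faulty connection rather than silently showing garbage.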
I think some people are exaggerating though, I have had no such issues with super cheap HDMI cables.
Why do long cables need more protection from interference?
This is more of a general education question I guess but I'll answer it anyway.
It's not really so much about signal strength as it is about electromagnetic interference. All around us are radio waves and electromagnetic field fluctuations of all kinds on pretty much every frequency you can imagine. Electromagnetic "induction" is a basic principle of physics: these EM fields induce a current on wires within the field. This is how antennas work -- they're big "wires" exposed in such a way that they induce current from the EM field. Which frequencies are induced, and how strongly, depends on the size, shape, and other physical properties of the wire. This is all very well understood, and it is how we design antennas -- they are constructed in such a way that they are "tuned" to induce current from the desired frequency bands very strongly.
The problem for signal wires is that, generally, the bigger the antenna, the more current it induces. So with all these radio waves all over the place, the longer the wire is, the more current is added to the signal from external EM fields. This isn't good, because the result is that your signal is corrupted by other, interfering signals. Wires are supposed to be self-contained channels carrying an electrical signal from source to destination, but as the wire gets longer and longer, it becomes a better and better antenna. And as it becomes a better antenna (sort of), it becomes a worse and worse channel for the signal it's supposed to be carrying, because that signal gets polluted by the radio signals being induced on the wire.
Anyway, it would seem that this is such a terrible problem that we couldn't use wires for any long distance practically, but like most anything else, scientists and engineers have long since found many very nice solutions.
There are many techniques to reduce this interference, and shielding is a fairly basic one (but it's always good to have). One of the most powerful techniques is twisting the pair of wires. In other words, you have two wires you're sending the current through: one carries the positive current of the signal, the other a negative current. The wires are twisted into a sort of "double helix". What happens is, whatever current is induced by the EM field on one wire, the opposite current is induced on the other -- the end result cancelling out the interference! It's not easy to explain exactly why this cancellation occurs without getting into EM mathematics, but it works. Another method is coaxial cable, where the two conductors share the same axis. Tricks and techniques like these, in combination with shielding and other things like error correction codes, result in very minimal actual distortion -- in the case of HDMI, at most one bit error per billion.
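The cancellation can be pictured numerically: in the usual differential-receiver view, the interference shows up (near-)identically on both wires, and the receiver takes the difference between them, so the common noise drops out. A minimal sketch with made-up numbers:

```python
# Differential signaling sketch: the same interference hits both wires,
# and subtracting one wire from the other recovers the clean signal.
# Values are dyadic fractions so floating-point arithmetic stays exact.
signal = [1.0, -1.0, 1.0, 1.0, -1.0]        # example signal levels
noise  = [0.25, -0.5, 0.5, 0.125, -0.25]    # common-mode interference

wire_pos = [s + n for s, n in zip(signal, noise)]    # carries +signal
wire_neg = [-s + n for s, n in zip(signal, noise)]   # carries -signal

# Receiver: (wire_pos - wire_neg) / 2 = ((s+n) - (-s+n)) / 2 = s
recovered = [(p - n) / 2 for p, n in zip(wire_pos, wire_neg)]
print(recovered)
```

The noise terms subtract out term-by-term, which is the arithmetic behind "the interference cancels" - twisting the pair is what keeps the induced noise on the two wires nearly identical in the first place.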
I've read your explanation...
Interesting read, and it certainly pretty much confirms what I was suspecting here, of course with more of a technical aspect to it.
So, before I offer my take on your post and explanation, indulge me by answering the following.
As you stated, those poorly constructed HDMI cables that are, by definition, not to spec, which run through components... you said that with a well-constructed TV set, some message will display and no picture or signal will be shown. Correct? But with a not-so-well-constructed interface, the TV won't filter it as well...
The question is: this has obviously happened many, many, MANY times since the evolution of HD and HDMI, and even though all HDTVs sold as HDTVs are built to a certain standard to decode and reproduce HD media, some (in fact many) still react this way.
So how is it that a built-to-spec HDTV plus a supposedly built-to-spec HDMI cable resulted in the mentioned signal troubles, where the media was obviously shown and heard, just with distortion, artifacting and syncing issues... and, at the least but still evident, degraded quality of the image itself, not to mention the audio?
This has happened, as opposed to the claimed "either there's a signal or there's not".
Please elaborate on this.