
# The HDMI Cable Discussion

So, HDMI cables: worth the extra buck, or just another marketing money-grab?

I personally would never pay over £20-30 for one.

Edited by Totally Dubbed - 3/25/12 at 6:21am

>thoughts

GODDAMN YOU, I was going to start this thread! I wanted the Thread Starter self-esteem boost ='( Seriously now, thanks.

I still can't believe I'm in a place where cables are on-topic... I get tingles just thinking about it.

Edited by LizardKing1 - 2/13/12 at 3:26pm

I'll try to explain this non-technically.

Digital is 1s and 0s -- in concept. But digital signals travel through an analog medium. Cables can suffer interference, changing 0s to 1s and 1s to 0s. Uh oh! That's bad! Fortunately, this problem was solved decades ago, and dealing with it is routine. The answer, in short? Error detection and correction.

First, let's start with a little "error detection 101". Say we want to send someone a series of numbers over a noisy channel, and we want to be sure they either arrive intact or not at all. Here's an idea: in addition to sending the numbers, add them all up and send the sum at the end, so the receiver can verify. Example:

Send: 5 2 9 3 8 2 4 (33) <-- sum

Receive: 5 2 9 4 8 2 4 (33)

Uh oh, what I received doesn't add up! But that's good... we've successfully detected an error! There's no uncertainty here at all,* because we know when errors occur! Hmm... but really, no uncertainty at all?

*What is the probability that a random distortion changes the numbers and the error goes undetected? Well, you'd have to change one number upwards and another number downwards by just the right amount so the sum still adds up. Or you'd have to change one number and change the sum too. The odds of that happening depend on the range of numbers, the type of distortion possible, etc., but it's extremely unlikely. Still, it's not unlikely enough: one in a billion can still happen. So what next?
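The sum idea above can be sketched in a few lines of Python (purely illustrative; the numbers are the ones from the example):

```python
def with_checksum(values):
    """Append the sum of the values as a trailing check value."""
    return values + [sum(values)]

def verify(received):
    """Return True if the payload matches its trailing checksum."""
    *payload, check = received
    return sum(payload) == check

sent = with_checksum([5, 2, 9, 3, 8, 2, 4])  # [5, 2, 9, 3, 8, 2, 4, 33]

corrupted = sent.copy()
corrupted[3] = 4                             # noise flips the 3 into a 4

print(verify(sent))       # True  -- arrived intact
print(verify(corrupted))  # False -- error detected
```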

Would you believe it if I told you... this is a stupidly simplified example? In reality, error detection codes are much more complex than a simple sum, and so robust that it's probably more likely the sun spontaneously explodes than that an error gets through undetected (that is, if we're actually trying to detect them -- and we almost always do, certainly in HDMI).
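For a taste of something stronger than a plain sum, here is the same check done with the Python standard library's CRC-32. This is just an illustration of the idea; it is not the actual coding scheme HDMI uses:

```python
import zlib

payload = b"5 2 9 3 8 2 4"
crc = zlib.crc32(payload)          # sender computes and transmits this too

# Receiver recomputes the CRC over what it got and compares.
print(zlib.crc32(payload) == crc)  # True  -- intact

corrupted = b"5 2 9 4 8 2 4"       # one digit flipped in transit
print(zlib.crc32(corrupted) == crc)  # False -- error detected
```

Unlike the plain sum, a CRC catches every single-byte error and all but a vanishing fraction of larger ones.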

Now my point in saying all this is simple. We can build a machine (an HDMI TV) that can literally count the errors where the signal deviates from a bit-perfect copy (and know exactly which pixels are affected). We can identify how many pixels were received incorrectly, and which ones, with astounding certainty -- certainty so high that it would be more likely to win the lottery a million times in a row than for an error to be misidentified.

Now, an exercise for the reader: for regular HDMI cable lengths, can you guess how often such errors occur? Exercise #2: when they do occur, is it possible for such errors to affect contrast or sharpness? (Hint: not a chance.)
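Counting how far a received stream deviates from bit-perfect is just a Hamming distance. A tiny Python sketch (illustrative only, not how a TV actually does it):

```python
def count_bit_errors(sent: bytes, received: bytes) -> int:
    """Hamming distance: number of bit positions where the streams differ."""
    return sum(bin(a ^ b).count("1") for a, b in zip(sent, received))

print(count_bit_errors(b"\x0f", b"\x0e"))  # 1 -- a single flipped bit
print(count_bit_errors(b"abc", b"abc"))    # 0 -- bit-perfect
```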

Edited by ac500 - 2/13/12 at 3:39pm
Quote:
Originally Posted by dorino

>thoughts

why lol :P

Quote:
Originally Posted by LizardKing1

GODDAMN YOU, I was going to start this thread! I wanted the Thread Starter self-esteem boost ='( Seriously now, thanks.

I still can't believe I'm in a place where cables are on-topic... I get tingles just thinking about it.

haha! :D

Meet spec = good to go.

I think it's just silly that this argument is still going on. One side has tons of theory behind it that supports the idea that there is no difference, while the other is a bunch of individual perceptions that suggest there is a difference that science is currently unable to explain. If that is the case, the burden of proof falls on those claiming there is a difference that can be consistently perceived. It would only take a few people to prove the point: if a few people can consistently identify which cords are "high quality" and "low quality" in a blind test, then there might be reason to believe there is a difference.

Most professional reviewers have suggested there is no difference and actually encourage purchasing from some of the quality, but cheap, cable manufacturers. I'm definitely more inclined to believe there is not a difference when you have a bunch of people who do this for a living and plenty of scientific reasons why there should not be one. That said, I'm more than open to the idea that there is a difference, pending some sort of reliable evidence.

Also, just to correct shotor (from the other thread), that was absolutely your perception. You said that the difference you saw was not your perception. Even if there was a difference, it was still something you perceived. Everything in the world is perceived. We sense things, then our brain interprets the signal from our senses. That interpretation is perception. Just because it's perceived doesn't mean it's not real. It also doesn't mean it was real or accurate.

Quote:
Originally Posted by FallenAngel

Meet spec = good to go.

By specs... you mean?

Monster Cable-esque? Because I would disagree.

One side has tons of theory behind it that supports the idea that there is no difference, while the other is a bunch of individual perceptions that suggest there is a difference that science is currently unable to explain.

Actually science can explain the other side too -- it's called the placebo effect and/or confirmation bias. Placebo works even if you're aware of it, too.

Quote:
Originally Posted by ac500

Now my point in saying all this is simple. We can build a machine (an HDMI TV) that can literally count the errors where the signal deviates from a bit-perfect copy (and know exactly which pixels are affected). We can identify how many pixels were received incorrectly, and which ones, with astounding certainty -- certainty so high that it would be more likely to win the lottery a million times in a row than for an error to be misidentified.

But doesn't this just mean an identified error will be corrected? I believe HDMI uses asynchronous transmission, so any detected error has that part of the data re-sent. Even if there is an error and it gets identified, it's really hard for it to go uncorrected and cause an image problem.

Quote:
Originally Posted by Totally Dubbed

by specs...you mean?

Monster Cable esk...as I would disagree.

I think he means it meets the specifications for being an HDMI cable.

Quote:
Originally Posted by Totally Dubbed

why lol :P

Thoughts were what made this actually a debate: people who think they hear something different versus people who know there isn't anything different to hear.

Edited by dorino - 2/13/12 at 3:45pm

I don't think HDMI has any protocol to re-send missing pixels, simply because it would be a pointless waste of time. If a pixel is missing (a very rare event), the display can fill it in from a blend of its neighbors. Blu-ray 1080p uses MPEG-2 or MPEG-4 AVC, neither of which even approaches a quality level where a single blurred pixel would be detectable, even if that blurred pixel remained blurred for a long time (it doesn't). This is fairly easily proven mathematically via entropy-based, mean-squared-error, or other psychovisual error metrics.

As for the frequency of such errors, the HDMI spec requires a bit error rate BETTER than 10^-9 at its bare minimum.

http://www.comlsi.com/Cat5_BER_EQ2.pdf

In other words, an HDMI-spec-compliant cable will lose no more than 1 bit per billion bits on average. As that article states, this amounts to one incorrect pixel displayed for 1/60th of a second occurring at MOST every ~8 seconds (and keep in mind this is the absolute worst case: the cheapest, poorest-quality standards-compliant cable you could buy).
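The arithmetic behind figures like that is straightforward. A quick sketch (the 1 Gbit/s example rate is purely illustrative, not HDMI's actual link rate, which depends on resolution and refresh):

```python
def mean_seconds_between_bit_errors(bit_rate_hz: float, ber: float) -> float:
    """Expected time between single-bit errors for a given link rate and BER."""
    return 1.0 / (bit_rate_hz * ber)

# Illustrative: a 1 Gbit/s link at the worst-case spec BER of 1e-9
# averages about one flipped bit per second.
print(mean_seconds_between_bit_errors(1e9, 1e-9))
```

Plug in the real link rate of your video mode and the spec-floor BER and you get the kind of "one bad pixel every several seconds, worst case" numbers quoted above.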

Now keep in mind that when a pixel is detected as missing, the surrounding pixels are used to predict what it would have been. It would be like choosing a random pixel on an image, erasing it, then filling it in again from a blend of its neighboring pixels' colors. It is impossible that a human could detect such an error even by staring at a 1080p still frame for hours, let alone perceive it during a 1/60th-of-a-second window. It is undetectable because the compression codec itself introduces noise and information loss FAR worse than a single interpolated pixel.
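Filling in a lost pixel from its neighbors is simple in principle. A minimal sketch with grayscale values (a toy example, not the algorithm any particular TV uses):

```python
def interpolate_pixel(frame, row, col):
    """Replace a lost pixel with the average of its 4 direct neighbors."""
    neighbors = [
        frame[row - 1][col], frame[row + 1][col],
        frame[row][col - 1], frame[row][col + 1],
    ]
    return sum(neighbors) // len(neighbors)

# A 3x3 patch of 8-bit grayscale values; the center pixel was lost.
patch = [
    [100, 102, 104],
    [101,   0, 105],  # 0 marks the corrupted pixel
    [103, 104, 106],
]
patch[1][1] = interpolate_pixel(patch, 1, 1)
print(patch[1][1])  # 103 -- blends right in with its neighbors
```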

The usual claim about premium HDMI cables is that they improve clarity, contrast, etc. There is no way HDMI errors could result in loss of contrast or clarity, even if a missing pixel for 1/60th of a second were perceptible (it's not -- once again, this can be shown mathematically via information theory). If you can detect a single missing, interpolated pixel in a still frame from an H.264-compressed source, then by all means contact your local university, because you have just transcended informational entropy.

Edited by ac500 - 2/13/12 at 4:06pm
Quote:
Originally Posted by dorino

I think he means it meets the specifications for being an HDMI cable.

Thoughts were what made this actually a debate. People who think they hear something different versus people who know there isn't.

well that's the beauty of debates :)

Quote:
Originally Posted by ac500

The usual claim about premium HDMI cables is that they improve clarity, contrast, etc. There is no way HDMI errors could result in loss of contrast or clarity, even if a missing pixel for 1/60th of a second were perceptible (it's not -- once again, this can be shown mathematically via information theory). If you can detect a single missing, interpolated pixel in a still frame from an H.264-compressed source, then by all means contact your local university, because you have just transcended informational entropy.