Testing audiophile claims and myths
Aug 1, 2019 at 8:21 PM Post #13,396 of 17,336
I have a 110" screen and the Epson 5050UB!

I have an Epson 7500UB, which is probably quite similar to yours. Epson makes great projectors, and great scanners too.
 
Aug 2, 2019 at 9:53 AM Post #13,397 of 17,336
Absolutely... and 4k is better in other ways too.

However, I would disagree that the extra resolution in 4k doesn't help.
As a few people have noted, and it agrees with my personal experience, a properly done 4k video offers much more realistic specular highlights.
(The little pinpoints of brightness that make shiny metal and glittery things look different than "just bright objects" and make metallic surfaces look "shimmery".)
Even when you can't see the specific details, metallic objects, and glittery or shiny objects like sunlight on water, or glittery confetti falling through a spotlight, tend to look more realistic in 4k.
(HDR helps with the extra brightness... but it is the extra resolution that allows the highlights themselves to be sharp enough to appear to "glitter" or "shine".)
I find this most apparent on direct panel displays - which do better with fine details than projectors - but it is noticeable on a good projector as well.
A scene with shiny metal, or sunlight on water, looks better in 4k with HDR.... even if you're sitting too far away to consciously see the extra resolution.

H.265 offers better compression ratios than H.264.
H.265 also has the added benefit that, when pushed to the point where it does visibly degrade, it does so more gracefully than H.264.
Whereas H.264 tended to show blocky artifacts when over-compressed, H.265 tends to simply get visibly a bit softer, which is usually far less visually annoying.
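(For anyone who wants to see that difference on their own material, here's a rough sketch - assuming you have ffmpeg with both libx264 and libx265 available, and swapping in your own file name - that encodes the same clip with both codecs at the same deliberately starved bitrate so you can compare the artifacts.)

```python
# Rough sketch: encode the same clip with H.264 and H.265 at the same
# deliberately low bitrate, so you can compare how each codec degrades
# when it runs out of bits. Assumes ffmpeg is on the PATH with
# libx264/libx265 enabled; "source.mp4" is a placeholder file name.
import subprocess

SOURCE = "source.mp4"   # placeholder input clip
BITRATE = "1500k"       # low on purpose, to provoke visible degradation

for codec, outfile in (("libx264", "clip_h264.mp4"), ("libx265", "clip_h265.mp4")):
    subprocess.run(
        ["ffmpeg", "-y", "-i", SOURCE,
         "-c:v", codec, "-b:v", BITRATE,
         "-an",          # drop audio; we only care about the video artifacts
         outfile],
        check=True,
    )

# Play clip_h264.mp4 and clip_h265.mp4 side by side. The claim above is that
# H.264 tends toward blocking while H.265 tends to soften overall; this lets
# you check that on your own content.
```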

I agree on the rest of what you said...
However, in the context of this discussion, the explanation of why it happens was just sort of background...
My point was merely that, even though a DVD usually looks far better than a VHS tape, in that particular movie there were clearly visible alterations produced by the encoding process.
However, if you hadn't seen a different version, and didn't know what to look for, they would have gone unnoticed.
(My point being that "a lack of visible artifacts does NOT ensure that you're seeing an ACCURATE reproduction of the original".)
(Any number of people could have viewed that DVD and said "it looks just fine".... but, in fact, when you know what to look for, it looks obviously "visibly different than it should" in a few places.)

My point there was simply this....
- even if you were to encode many CDs using a certain lossy CODEC and find that they were all audibly identical to the original
- no matter how many tracks that was true for - it still would NOT provide 100% assurance that EVERY track encoded with that CODEC would be audibly identical to the original
- all lossy CODECs rely on discarding information that has been determined to be inaudible USING A CERTAIN SPECIFIC MODEL OF WHAT IS AUDIBLE
- but all of those models make at least a few assumptions and generalizations, so none is 100% accurate, 100% of the time, with 100% of humans, and 100% of source material

- both musical content and perceptual lossy CODECs are so complex that you simply cannot safely assume that "clinkers" won't EVER occur occasionally (a simple null test, sketched below, shows that the data itself is never identical - only, hopefully, the audible result)
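(Here's a minimal sketch of the null test I mean - assuming you have the original as a 16-bit WAV and have decoded the lossy version back to a matching 16-bit WAV of the same length and sample rate; the file names are just placeholders.)

```python
# Minimal "null test" sketch: subtract a decoded lossy file from the original
# and report how big the residual is. A small residual doesn't prove the
# difference is inaudible, and a big one doesn't prove it's audible; it just
# shows the lossy copy is not a bit-exact reproduction of the original.
# Assumes two 16-bit PCM WAV files with the same length and sample rate
# (file names are placeholders), e.g. the original rip and the lossy file
# decoded back to WAV. A real comparison also needs to align for any
# encoder delay/padding, which is skipped here.
import math
import struct
import wave

def read_samples(path):
    with wave.open(path, "rb") as w:
        assert w.getsampwidth() == 2, "expects 16-bit PCM"
        frames = w.readframes(w.getnframes())
    return struct.unpack("<%dh" % (len(frames) // 2), frames)

orig = read_samples("original.wav")
lossy = read_samples("decoded_from_lossy.wav")
n = min(len(orig), len(lossy))

diff = [o - l for o, l in zip(orig[:n], lossy[:n])]
peak = max(abs(d) for d in diff)
rms = math.sqrt(sum(d * d for d in diff) / n)

print("peak difference: %d LSBs (full scale = 32767)" % peak)
print("RMS difference : %.1f LSBs" % rms)
```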


I used to take a lot of photos...
And, MOST of the time, a JPG looks just as good as a RAW file...
However, I still shoot important photos in RAW format, because the JPG doesn't ALWAYS look IDENTICAL to the RAW...
And, yes, I WOULD rather double or triple the negligible amount I spend on storage space rather than risk a single important photo being less than optimally stored.

Home video has vastly improved since whatever reference you're comparing for DVD vs videotape. DVD matured over the years with improved encoding techniques and film scanning (telecine). There also isn't really much inherent "noise" with digital authoring, since the only ADC is the scanner: there is grain from the source film, and you can have artifacting from whatever encode you're using. Early DVDs suffered more from compression artifacts or Digital Noise Reduction (during telecine, DNR algorithms try to reduce grain, which can also remove detail that is present in the film and add contrast around edges).

For a while now, movies have been scanned at 4K for 35mm and 8K for 70mm during film restorations: higher scan resolution also means better grading (and since videophiles complain to studios, studios have become less heavy-handed with DNR, even re-issuing Blu-ray titles). The main advantage of 4K for home applications isn't so much resolution as greater dynamic range: more colors and tonality in scenes, higher peak brightness, and retained shadow detail. Blu-ray and 4K UHD discs also use more efficient video codecs than DVD: DVD was MPEG-2, while BD is typically MPEG-4 AVC (H.264), which allows greater compression with minimal artifacts. As a physical medium BD can have multiple layers, but it seems most home movie releases are BD-50 for 1080p movies and BD-66 for UHD discs. UHD 4K can be compressed further still because it uses a newer codec, H.265 (HEVC, part of MPEG-H).

I get better picture quality from my calibrated OLED TV at home than at my local theaters... so for me, both my picture and sound are better at home than at the cinema (and it's great that older movies and some TV shows look and sound better than they originally did).
 
Aug 2, 2019 at 9:59 AM Post #13,398 of 17,336
Yes, the quality you get from cable often isn't very good at all.
(And you'll notice that, with at least some cable services, the quality of footage you DVR is further reduced.)

Netflix is better... and their videos often look very good... but they still have far less bandwidth available than a 4k disc.
Your data cap wouldn't last long at all if you allowed 120 GB for a single movie.

The HD remastering of Star Trek: The Next Generation was pretty impressive. The original VFX were filmed in passes on VistaVision film and then scanned for analog SD editing. For the digital HD remaster, they rescanned the original film and digitally composited the passes (making the quality as good as a movie). I find broadcast cable pretty bad by today's standards. No wonder people watch most TV shows streaming. Some services like Netflix even have original programming in Atmos and Dolby Vision. The main disadvantage with streaming a lot of 4K content is running into your internet provider's data caps.
 
Aug 2, 2019 at 11:15 AM Post #13,399 of 17,336

I meant it as a generality with the average TV. Almost all of that benefit is not primarily 4K resolution, but greater dynamic range. SD and HD formats have been stuck at 8 bits per channel of tonality (or 256 shades of tone). Some RED cameras now record RAW video files at 16bpc. Good HDR 4K TVs support Dolby Vision (which stores 12bpc, or 4096 shades of tone) and can display 10bpc (the 12-bit signal gets tone-mapped to 10). I would say one of the reasons a good OLED has more of that "shininess" than a projector is its contrast range and support for Dolby Vision (higher-end consumer projectors just support HDR10, AFAIK). Another example of dynamic range being more of a factor in "shiny highlights": many 4K releases of new movies are made from a 2K digital intermediate. Studios are still, by and large, editing movies in 2K HDR formats because of file size and rendering times (they do use better upscaling processors for the UHD master than what your TV can do). You can find a list of UHD Marvel movies and how many were done with a 2K intermediate. I found Thor: Ragnarok to be highly detailed, and it came from a 2K source. I think Black Panther was their first 4K intermediate... and I couldn't say I found it any more detailed from my TV/viewing distance. So for newer movies that have been digitally edited, a primary benefit of the UHD title is the HDR component. Older movies on film are inconsistent in their resolving power (depending on filming technique and film stock). After they're scanned in at least 4K HDR, they get filters to remove dust or scratches, color grading, and perhaps various digital restoration. So 4K is a good backup resolution for 35mm film (70mm gets scanned at 8K).
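(To put quick numbers on those "shades of tone" figures - this is only the counting arithmetic, not real tone mapping, which is perceptual and much smarter than a bit shift:)

```python
# The counting arithmetic behind the "shades of tone" figures: levels per
# channel at each bit depth, plus a crude truncation of a 12-bit ramp down to
# 10 and 8 bits to show how many distinct steps survive. Real tone mapping is
# perceptual and far smarter than a bit shift; this only illustrates the counts.
for bits in (8, 10, 12, 16):
    print(f"{bits:2d} bits per channel -> {2 ** bits:6d} levels")

ramp12 = range(4096)                      # a smooth 0..4095 (12-bit) ramp
steps10 = len({v >> 2 for v in ramp12})   # distinct values left after dropping 2 bits
steps8 = len({v >> 4 for v in ramp12})    # distinct values left after dropping 4 bits
print(f"12-bit ramp keeps {steps10} steps at 10 bits and {steps8} steps at 8 bits")
```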

Having said all that, I would agree that resolution is a factor in detail. Normally, detail is thought of as the relationship between resolution and contrast (and, as I stated before, with photography it's also whether the subject is in focus). However, human perception is also a factor (i.e. if you're sitting further away, you don't need an image with as much "detail"). I sit close to my OLED TV, so I'm almost at the cusp of the "general" recommendations for being able to see a difference between 1080 and UHD. The need for UHD (whose specs actually cover both 4K and future 8K standards) becomes more of a necessity the larger your display gets (and displays will probably keep trending larger).

And to get to your example of the VHS version of the movie showing more movement by way of analog noise: I think what's more likely with the DVD version is that in the mastering (probably using too much DNR to remove film grain), enough contrast was taken out that the details of the tornado didn't show through. Again, studios were more heavy-handed with DNR early on (and the algorithms for retaining contrast weren't as sophisticated); after a lot of complaining from videophiles, studios have eased up on DNR.


This is where I disagree. There are professionals (sports photographers especially) who only shoot JPEG so that they don't have to process the photos: they're taking a lot of pictures and need a fast turnaround to get them to the editors. I always shoot RAW, as I'm able to keep the full exposure range and adjust contrast throughout the image (a RAW works like the negative did with film). With RAW, you have a lot more leeway for adjusting exposure, reducing noise at high ISO, and fixing white balance issues. Now, JPEG is fine if you're happy with how the shot turned out and don't want to get into post-processing. But the consumer-standard JPEG is limited to 8bpc, while digital cameras can now record RAWs up to 16bpc: so you're throwing out a lot of information. You're also tied to whatever color profiles you had set in the camera (sports shooters who only shoot JPEG are very meticulous about how they set up their profiles and workflows).

It's also interesting to go back to resolution with stills cameras. 4K video resolution is a little over 8MP. My first DSLR was 12MP, and Sony has just announced a FF mirrorless that's 61MP. Why more megapixels for a stills camera than for video? I would say that with video, because of motion and viewing distance, you don't need as much displayed resolution. Also, from a practicality standpoint, it takes a lot more computing power to process high-resolution video than a single still. With a still photograph, you may want to print it large and still let people come up within a few inches to examine it closely. Lastly, photographers also like having the option to crop (sometimes heavily... say you took a picture of a bird and didn't have a long enough lens).
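(Rough numbers, if you want them - the still-camera dimensions below are derived from the megapixel count and a 3:2 aspect ratio rather than from any spec sheet, so treat them as approximate:)

```python
# Back-of-the-envelope numbers for the comparison above. The still-camera
# dimensions are derived from the megapixel count and a 3:2 aspect ratio,
# not taken from any spec sheet, so treat them as approximate.
import math

def dims_from_mp(mp, aspect_w=3, aspect_h=2):
    """Approximate pixel dimensions for `mp` megapixels at a given aspect ratio."""
    w = math.sqrt(mp * 1e6 * aspect_w / aspect_h)
    return w, w * aspect_h / aspect_w

print("UHD 4K video frame : %.1f MP" % (3840 * 2160 / 1e6))   # about 8.3 MP
for mp in (12, 61):
    w, h = dims_from_mp(mp)
    print(f"{mp}MP still (3:2)   : about {w:.0f} x {h:.0f} px")

# A ~61MP frame leaves room to crop out a 3840-pixel-wide region and still
# keep a "4K's worth" of detail: the headroom photographers like for cropping
# a distant subject.
```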
 
Last edited:
Aug 2, 2019 at 11:33 AM Post #13,400 of 17,336

blu-ray.com gives the disc size for every movie. Thanks to H.265, I've noticed a lot of 4K discs are 66GB (a few are 100GB). The largest bandwidth spec I've seen quoted for streaming 4K is 16GB/hr. My data cap is 1028GB a month... so I can't stream 4K every night. Streaming 4K is more compressed than disc, but I find the quality is still good. The main disadvantage is when your internet cuts out or your updated app doesn't work right.
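(Rough arithmetic with those figures, assuming a typical ~2-hour feature, which is why a nightly 4K streaming habit eats nearly the whole cap:)

```python
# Rough arithmetic with the figures above, assuming a typical ~2-hour feature.
cap_gb = 1028           # monthly data cap
stream_gb_per_hr = 16   # high-end 4K streaming figure
movie_hours = 2.0

per_movie = stream_gb_per_hr * movie_hours
nightly_month = per_movie * 30
print("One streamed 4K movie     : ~%.0f GB" % per_movie)       # ~32 GB
print("A movie every night (30x) : ~%.0f GB" % nightly_month)   # ~960 GB
print("Left for everything else  : ~%.0f GB" % (cap_gb - nightly_month))
print("Full 66GB disc downloads that would fit in the cap: %d" % (cap_gb // 66))
```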
 
Aug 2, 2019 at 2:31 PM Post #13,401 of 17,336
I think the transfer and projection are more important for getting a film-like presentation than 720, 1080, or 4K. I have great-looking DVDs. Resolution is overrated. If you have to stand up, cross the room, and squint at the screen to see the difference, it's overkill. I watch movies from the couch.
 
Aug 2, 2019 at 5:13 PM Post #13,402 of 17,336
I think we mostly do agree....

I don't really do much video, but, from what I've read, here are the primary differences....
Especially at the consumer level, DSLRs allow you to use a variety of lenses, including those with lower F-stop numbers when you want to limit depth of field....
Most lower-end video cameras have a smaller sensor, and less choice of lenses, and so pretty much ALWAYS deliver a long depth of field.
(They have very good low light sensitivity - which means that, in a normally lit scene, you must use a bigger F-stop... and they get noisy if you add a neutral density filter.)
This gives you a much wider range of lens options with a DSLR than with a low-end video camera.

However, I've also heard that many higher-end consumer DSLRs have an issue with how the image is "exposed".
(Remember that many DSLRs still use a mechanical shutter in front of the sensor.)
They work well if the camera and the background are stationary...
However, if you try to do a horizontal camera pan, vertical objects like buildings will exhibit "tearing"...
Because the shutter exposes the sensor sequentially instead of instantly, different rows are exposed one after the other, resulting in a horizontal skew if you move the camera too fast.
(I'm sort of remembering that one of the Canon EOS cameras had this issue... the general advice is "well, director, don't do that sort of shot with this camera".)
Presumably a mid-range video camera can deliver equivalent resolution and sensitivity while avoiding this sort of quirk.
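(Here's a toy illustration of that row-by-row readout - the numbers are invented, but it shows how a horizontal pan turns a vertical edge into a diagonal:)

```python
# Toy rolling-shutter illustration: a vertical bar is "photographed" while the
# camera pans horizontally. Because each sensor row is read out slightly later
# than the one above it, the bar comes out skewed. All numbers are invented.
ROWS, COLS = 12, 40
pan_px_per_s = 2000.0       # how fast the scene slides across the sensor
row_readout_s = 1.0 / 2400  # time between reading successive rows (hypothetical)
bar_col = 8                 # where the bar sits when readout starts

for row in range(ROWS):
    shift = round(pan_px_per_s * row_readout_s * row)  # later rows see a moved bar
    line = [" "] * COLS
    for c in (bar_col + shift, bar_col + shift + 1):
        if 0 <= c < COLS:
            line[c] = "#"
    print("".join(line))    # the "vertical" bar prints as a diagonal: that's the skew
```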

As for that movie.....
It could have been a deliberate choice of some human.
But I suspect that it could also have been the "most intelligent choice" of an automated system.
Remember that DVDs have limited bandwidth, so the encoder does a two-pass analysis/tradeoff when it encodes content.
MPEG depends on being able to re-use data from frame to frame to achieve a good compression ratio...
Therefore, since film grain, and plain old noise, are almost purely random, and change from frame to frame, they compress very poorly...
(Random noise is essentially the textbook example of "non-compressible data".)
Since an automatic system MUST fit the content into the bandwidth it's been allocated...
It's essentially going to analyze the content, then set the noise-filter threshold high enough that what's left has little enough random variation to compress well within that budget.
(So an automated system would try really hard to remove anything resembling random noise... and so would a human operator.)
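(As a sketch of that kind of pipeline - denoise first so the random grain doesn't eat the bit budget, then a two-pass encode aimed at a fixed average bitrate - assuming ffmpeg with libx264, and with made-up file names and numbers:)

```python
# Sketch of that kind of pipeline: denoise first (so random grain doesn't eat
# the bit budget), then a two-pass encode aimed at a fixed average bitrate,
# which is roughly the constraint a disc-authoring workflow has to hit.
# Assumes ffmpeg with libx264; file names and numbers are placeholders.
import subprocess

SOURCE, OUT = "master.mov", "encoded.mp4"
TARGET = "4500k"               # the "allocated bandwidth"
DENOISE = "hqdn3d=4:3:6:4.5"   # spatial/temporal denoiser; stronger settings
                               # trade fine grain for compressibility

common = ["ffmpeg", "-y", "-i", SOURCE, "-vf", DENOISE,
          "-c:v", "libx264", "-b:v", TARGET, "-an"]

# Pass 1: analyze the content so bits can be distributed where they're needed.
subprocess.run(common + ["-pass", "1", "-f", "null", "-"], check=True)
# Pass 2: the actual encode, reusing the pass-1 statistics.
subprocess.run(common + ["-pass", "2", OUT], check=True)
```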

You used to see this issue with encoded sports events.
A talking head, on a stationary background, always encoded very cleanly...
And a talking head, standing in front of a complex background, like a crowd, encoded well (although sometimes you'd see artifacts for a split second when the background shifted).
However, if the camera moved, and the crowd became a complex moving field, which compressed poorly...
You would notice all sorts of artifacts as the encoder struggled to squeeze everything into the allocated bandwidth....
Then, once the camera stopped moving, everything would "settle down again".

Yes, things have improved drastically since then.... :)

 
Aug 2, 2019 at 6:55 PM Post #13,403 of 17,336
On basic points we agree, such as that 4K does make a difference. I do see extra detail, and I can tell whether the shot is in focus or whether there's distortion from the lens used: but the biggest advantage of 4K is the extra contrast with HDR (I have a background in photography, so I always like seeing more contrast range). I also know JPEG is fine for folks who don't want to bother with post-processing. It's just that for me (doing landscapes and portraits), I like having the extra exposure range. You also have the advantage of being able to save to an HDR format for future HDR displays (the JPEG group has developed 12-bit specs, but they're not available in editing programs yet). I also have a professional background in 3D medical animation, and have had to render and encode for DVD, Blu-ray, and web. Also, with 3D, I have to work in full 32-bit HDR to get environmental light simulation. And I can geek out on film and VFX, as I'm a movie buff and have attended quite a few conventions with VFX departments. Get ready for me to geek out on visuals :)

There are a few different things you're bringing up... so let me clarify with my knowledge of photography. First, it's difficult to lump video cameras in with DSLRs. Yes, low-end video cameras have fixed lenses, but cinema cameras have specialty cinema lenses and a different type of mechanical shutter. The issue you're referring to, a skewed image during panning, is known as rolling shutter, and larger-format cameras can exhibit it. DSLRs that record video (sometimes called HDSLRs, to differentiate them from earlier models that didn't shoot video) are likely to exhibit rolling shutter because their shutters are vertical-travel shutters. Cinema cameras, by contrast, traditionally used a rotary shutter (which also correlates with fps coming in 24-frame increments). Current cinema brands use a few different shutter types, but they all keep rolling shutter pretty minimal. As for differences in cinema lenses, they're specialized to avoid focus breathing (a DSLR lens can have the framing of the image shift a bit while focusing), and their apertures are marked in T-stops instead of F-stops: an F-stop is the geometric ratio of focal length to aperture diameter, while a T-stop reflects the actual exposure (the amount of light transmitted). Studios have been using HDSLRs for some "B-roll" shots (i.e. not the main shots). This might be done for the smaller setup, and sometimes for the different framing of the FF (full frame) 3:2, 36x24mm sensor. BTW, Canon has its EOS HDSLRs that get used for B-roll, but Canon also competes with a separate cinema line.
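(On the F-stop/T-stop point, the arithmetic is simple enough to show - the lens numbers here are invented for illustration:)

```python
# The F-stop / T-stop relationship as arithmetic: the F-number is geometric
# (focal length over aperture diameter), while the T-stop also accounts for
# how much light the glass actually transmits. Lens numbers are invented.
import math

def f_number(focal_length_mm, aperture_diameter_mm):
    return focal_length_mm / aperture_diameter_mm

def t_stop(f_num, transmittance):
    """transmittance = fraction of light the lens passes (e.g. 0.85 for 85%)."""
    return f_num / math.sqrt(transmittance)

f = f_number(50, 25)   # a 50mm lens with a 25mm effective aperture -> f/2.0
print("f/%.1f" % f)
print("T%.2f at 85%% transmission" % t_stop(f, 0.85))   # about T2.2
print("T%.2f at 70%% transmission" % t_stop(f, 0.70))   # about T2.4
```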

When it comes to noise, it's basically that the camera isn't getting enough exposure (the shadows have dropped down to sensor values near the noise floor). Exposure is thought of as the relationship between aperture and shutter. The more open the aperture, the more light gets through, and the shallower the depth of field. The slower the shutter, the more time there is for exposure (and the more potential for motion blur). If you raise ISO, the camera "brightens" the image by amplifying the gain (and begins to introduce noise). Current camera phones are great for taking photos outdoors; however, they have a really tiny sensor compared to a FF DSLR. It's a matter of physics: the larger sensor has larger photosites, so more photons can hit each site (thereby getting better exposure). The advantages of a larger sensor are greater dynamic range and the ability to shoot in darker situations. One thing I've noticed about my DSLRs (even my first 5D, which I got 14 years ago) is that they have better high-ISO performance than film. I remember processing ISO 1600 film and thinking there was a lot of grain. With current DSLRs (even APS-C sized sensors), you can comfortably expose at ISO 6400.
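(To make the aperture/shutter relationship concrete, here's the standard exposure-value arithmetic - the settings are just illustrative:)

```python
# The standard exposure-value arithmetic for the aperture/shutter relationship:
# EV = log2(N^2 / t). The settings below are "equivalent" exposures (same EV to
# within the rounding of the standard stop scale) that trade depth of field for
# motion blur. ISO is left out: raising it amplifies the signal, it doesn't add light.
import math

def exposure_value(f_number, shutter_s):
    return math.log2(f_number ** 2 / shutter_s)

for f, t in ((2.8, 1 / 500), (5.6, 1 / 125), (11.0, 1 / 30)):
    print(f"f/{f:<4} at 1/{round(1 / t)} s -> EV {exposure_value(f, t):.1f}")
```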

Now, when it comes to video encoding... there are a lot of factors affecting picture quality: it's a lot more involved than how much compression was used with a certain codec. There are even differences in sharpness between camera brands. Brands put anti-alias filters on sensors to reduce moiré, so you don't get odd chroma patterns on fabric, but the image will be softer. More than anything, I found that when I was encoding with MPEG-2 and had to toe the line on applying more compression, it was clearly evident as pixelation and blocking. I noticed this most with my earliest DVDs after I upgraded to an HDTV plasma screen. With maturation, studios seemed to get better at finding the right amount of compression to keep good quality and still stay within the file size. MPEG-2 was also more primitive than MPEG-4, which offers variable bitrate, and whose algorithms don't just sample colors within a single frame but from frame to frame. It's also not an automatic process of scanning the film, having your master file, and encoding for disc. It's judged movie by movie how much color grading, dust and scratch removal, and encoding there needs to be. So, in short, it's highly speculative to examine why that scene on the DVD lacked detail.

 
Last edited:
Aug 2, 2019 at 7:19 PM Post #13,404 of 17,336
our needs for screen resolution and movie resolution depend on the viewing distance and screen size. without those variables, any conversation is bound to be full of holes.
just because a screen is 4K doesn't mean that resolution is the only difference from a smaller screen. as such, drawing conclusions about the benefits of increased resolution based on our impressions is wrong. it's no better than me sitting 6 meters away from my computer screen and deciding that I'll never need better than youtube 480P because what I'm seeing looks very sharp on my 24" screen.
and the same logic applies to movie formats, codecs, etc. without being able to isolate the variable, all we have are assumptions and rushed conclusions.

for cameras, it's been many years since I stopped bothering with the MP numbers (on my first digital camera I had like 6MP I think, and that was clearly not enough for my needs; coming from an EOS 1N, I cried for a few years before finding a proper transition to fully digital). noise in low light and my lenses matter much more to me in terms of how resolving the pic will be. I'm happy that I can crop, for sure, but that's a different matter entirely.
 
Aug 2, 2019 at 8:19 PM Post #13,405 of 17,336

And although questioned for its value, and misunderstood by most, calibration does matter. There are industry standard patterns, such as those on the Spears & Munsil disc: https://www.biaslighting.com/produc...MIuYSMmrbl4wIVAZSzCh23JwQ5EAQYASABEgJEovD_BwE

Those patterns will guide you in adjusting the basic and advanced picture settings correctly.

Think of it as 'wheel alignment' for your screen!
 
Last edited:
Aug 2, 2019 at 8:19 PM Post #13,406 of 17,336
Whether or not you can see the added resolution of 4K depends on how far you sit from the screen. And how close you sit to the screen affects the viewing angle and how much of the screen is off into your peripheral vision. The THX recommended viewing angle is 36 degrees. So if I do the math... I have a ten foot screen. I sit the THX recommended distance away from it of 15 feet. That puts me right smack in the sweet spot for blu-ray. 4K isn't necessary for me unless I get up out of my chair and stand 8-10 feet from the screen. If I do that, the viewing angle is about 65 degrees and the edges of the screen are out of my vision on the sides. In order to sit that close, THX recommends I would need a screen half that size... and it would still not be in the size/distance range for 4K to be worth it!

The truth is, if you are watching 1.85 movies at home, there isn't any practical way for 4K to make a bit of difference when it comes to resolution unless you get out of your chair and walk up to the screen. Do the math with the specs of your own setup using the calculator I've linked below, and you'll see what I mean. The only advantage to 4K is color and contrast... which is pretty much negated if you go to projection, because projectors have higher black levels.

Link to THX viewing distance calculator... https://myhometheater.homestead.com/viewingdistancecalculator.html

[Rtings.com chart relating screen size, viewing distance, and resolution]
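(If you'd rather run the numbers yourself than use the calculator, here's a rough sketch of the same math - the ~1 arcminute acuity figure is a rule of thumb, not a hard threshold:)

```python
# The viewing-angle math, plus a rough "can I resolve the pixels?" check based
# on the common ~1 arcminute (1/60 degree) figure for normal visual acuity.
# That acuity figure is a rule of thumb, not a hard threshold, and the
# arcmin-per-pixel number spreads the pixels evenly across the viewing angle.
import math

def viewing_angle_deg(screen_width_ft, distance_ft):
    return math.degrees(2 * math.atan((screen_width_ft / 2) / distance_ft))

def arcmin_per_pixel(screen_width_ft, distance_ft, horizontal_pixels):
    return viewing_angle_deg(screen_width_ft, distance_ft) * 60 / horizontal_pixels

width, seat = 10.0, 15.0    # the ten-foot screen and 15-foot seat described above
print("Viewing angle at 15 ft: %.0f deg" % viewing_angle_deg(width, seat))   # ~37
print("Viewing angle at 8 ft : %.0f deg" % viewing_angle_deg(width, 8.0))    # ~64

for pixels, label in ((1920, "1080p"), (3840, "4K")):
    a = arcmin_per_pixel(width, seat, pixels)
    verdict = "finer" if a < 1.0 else "coarser"
    print(f"{label}: {a:.2f} arcmin per pixel at 15 ft ({verdict} than the ~1 arcmin rule of thumb)")
```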


By the way, I did a Spears and Munsil calibration on my projector and it ended up falling right into every one of the detents in the settings. Epson has tight standards and I suspect their projectors are pretty much calibrated right out of the box.
 
Last edited:
Aug 2, 2019 at 8:27 PM Post #13,407 of 17,336
The other factor is the width between the speakers. If you have a ten-foot screen and you put the mains on either end of the screen, the recommended listening distance is going to be 12-14 feet from the speakers. Again, that puts you too far away from the screen to get any resolution advantage from 4K. You can sit closer, but then the mains will be at an angle that negates any kind of soundstage.

I think the only way you could arrange to benefit from 4K resolution would be to only watch films shot in the 1.33 aspect ratio and to put your mains behind the screen at either side. If you did that, you could sit closer and have everything still work. But it wouldn't work for Cinemascope!
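(Roughly, the geometry works out like this - treating the common 45-60 degree spread between the mains as a rule of thumb:)

```python
# Quick geometry for the speaker spacing: the angle the mains subtend at the
# listening position, with the mains placed at the ends of a ten-foot screen.
# Common stereo/home-theater guidance puts that spread at very roughly 45-60
# degrees; treat that range as a rule of thumb rather than a spec.
import math

def speaker_angle_deg(spacing_ft, distance_ft):
    return math.degrees(2 * math.atan((spacing_ft / 2) / distance_ft))

spacing = 10.0
for d in (8.0, 10.0, 12.0, 14.0):
    print(f"Mains 10 ft apart, seat at {d:4.1f} ft -> {speaker_angle_deg(spacing, d):.0f} deg spread")
```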
 
Last edited:
Aug 2, 2019 at 8:52 PM Post #13,408 of 17,336

One of the problems is that "enough detail" isn't just a function of your display's native resolution (that graph is a general rule of thumb, not set in stone). Heck, I had a non-1080P plasma for years because it had great contrast and color rendition (and while its native resolution wasn't 1080, it had better scaling with 1080i content than 720P). Image quality starts with the cinematographer exposing well and keeping all the important subjects in sharp focus. Then it's how well the video files are edited and encoded for the home formats. I do still have DVDs... many TV shows. A lot of them are still very watchable, but I can easily see their limitations on a big screen compared to HD. 1080P still looks good on my screen, and in given shots I can see some good detail. But let's not forget that UHD isn't just resolution; more groundbreaking is HDR. With HDR, I find I can see better gradation in landscapes (detail in the clouds and detail in the shadows). More readily, I can see greater contrast and detail in shadows. Some of my favorite movies have been released in 4K, and I do see improvements all around. Watching some of my other favorite movies in HD, they do have detail... but seem flatter because I don't see as much shadow detail. I think contrast being the bigger factor is also confirmed by UHD sources (quite a few modern movies rated as "reference" came from 2K HDR intermediates).
 
Last edited:
Aug 3, 2019 at 6:34 AM Post #13,409 of 17,336

Is your display calibrated (even just the basic controls)?
 