bigshot
Headphoneus Supremus
I have a 110" screen and the Epson 5050UB.
I have an Epson 7500UB, which is probably quite similar to yours. Epson makes great projectors, and great scanners too.
Home video has vastly improved since whatever era you're using as the reference point for DVD vs. videotape. DVD matured over the years with improved encoding techniques and film scanning (telecine). There also isn't much inherent "noise" in digital authoring, since the only ADC is the scanner: what you see is grain from the source film, plus whatever artifacting the encode introduces. Early DVDs suffered more from compression artifacts and from Digital Noise Reduction (DNR): algorithms that try to scrub out grain, which can also remove detail that was present in the film and add artificial contrast around edges.

For some time now, restorations have scanned 35mm film at 4K and 70mm at 8K: maximizing scan resolution also means better grading (and since videophiles complained to studios, studios have become less heavy-handed with DNR, even re-issuing Blu-ray titles). The main advantage of 4K for home use isn't so much resolution as greater dynamic range: more colors and more tonality in scenes with higher peak brightness, while retaining shadow detail.

Blu-ray and 4K UHD discs are also enabled by more efficient codecs than DVD's: DVD was MPEG-2, while BD uses MPEG-4 AVC/H.264 (which allows greater compression with minimal artifacts). As a physical medium BD can have multiple layers, but it seems most home movie releases are BD-50 for 1080p movies and BD-66 for UHD discs. UHD 4K can be compressed further still because it uses the newer H.265 codec (HEVC, part of MPEG-H).
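To put some rough numbers on the disc formats above, here's a quick back-of-the-envelope sketch. The capacities are the nominal disc sizes mentioned (DVD-9 is an assumption for a dual-layer DVD), the 2-hour runtime is illustrative, and real discs split the budget across video, audio, and extras:

```python
# Rough average-bitrate arithmetic for the disc formats discussed above.
# Illustrative numbers only; real titles vary widely by encode and extras.
GB = 8 * 1000**3  # bits per (decimal) gigabyte

def avg_mbps(capacity_gb, runtime_min):
    """Average bitrate if the whole disc held one movie."""
    return capacity_gb * GB / (runtime_min * 60) / 1e6

for name, cap in [("DVD-9 (MPEG-2)", 8.5),
                  ("BD-50 (H.264)", 50),
                  ("UHD BD-66 (H.265)", 66)]:
    print(f"{name}: ~{avg_mbps(cap, 120):.0f} Mbps for a 2-hour film")
```

The spread (roughly 9 vs. 55 vs. 73 Mbps) is why the newer codecs matter: UHD has about 4x the pixels of 1080p but nowhere near 4x the bitrate budget.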
I get better picture quality from my calibrated OLED TV at home than from my local theaters... so for me, I prefer both my picture and sound at home vs. the cinema (and it's great that older movies and some TV shows look and sound better than they originally did).
The HD remastering of Star Trek: The Next Generation was pretty impressive. The original VFX were filmed in passes on VistaVision film, then scanned for analog SD editing. For the digital HD remaster, they re-scanned the original film and digitally composited the passes (bringing the quality up to movie level). I find broadcast cable pretty bad by today's standards; no wonder people watch most TV shows streaming. Some services like Netflix even have original programming in Atmos and Dolby Vision. The main disadvantage of streaming a lot of 4K content is running into your internet provider's data caps.
However, I would disagree that the extra resolution in 4k doesn't help.
As a few people have noted, and it agrees with my personal experience, a properly done 4k video offers much more realistic specular highlights.
(The little pinpoints of brightness that make shiny metal and glittery things look different than "just bright objects" and make metallic surfaces look "shimmery".)
Even when you can't see the specific details, metallic objects, and glittery or shiny objects like sunlight on water, or glittery confetti falling through a spotlight, tend to look more realistic in 4k.
(HDR helps with the extra brightness... but it is the extra resolution that allows the highlights themselves to be sharp enough to appear to "glitter" or "shine".)
I find this most apparent on direct panel displays - which do better with fine details than projectors - but it is noticeable on a good projector as well.
A scene with shiny metal, or sunlight on water, looks better in 4k with HDR.... even if you're sitting too far away to consciously see the extra resolution.
I used to take a lot of photos...
And, MOST of the time, a JPG looks just as good as a RAW file...
However, I still shoot important photos in RAW format, because the JPG doesn't ALWAYS look IDENTICAL to the RAW...
And, yes, I WOULD rather double or triple the negligible amount I spend on storage space rather than risk a single important photo being less than optimally stored.
Your data cap wouldn't last long at all if you allowed 120 GB for a single movie.
I meant it as a generality with the average TV. Most of that benefit isn't primarily 4K resolution, but greater dynamic range. SD and HD formats have been stuck at 8 bits per channel of tonality (256 shades of tone). Some RED cameras now record RAW video at 16bpc. Good HDR 4K TVs support Dolby Vision (which stores 12bpc, or 4,096 shades of tone) and can display 10bpc (the 12-bit signal gets tone-mapped to 10). I'd say one reason a good OLED has more of that shininess than a projector is its contrast range and its support for Dolby Vision (higher-end consumer projectors only support HDR10, AFAIK).

Another example of dynamic range mattering more than resolution for "shiny highlights": many 4K releases of new movies are made from a 2K digital intermediate. Studios are still by and large editing movies in 2K HDR formats because of file sizes and rendering times. They do use better upscaling processors for a UHD master than what your TV can do. You can find lists of which UHD Marvel movies were done from a 2K intermediate. I found Thor Ragnarok highly detailed, and it came from a 2K source. I think Black Panther was their first 4K intermediate... and I can't say I found it any more detailed from my TV/viewing distance. So for newer movies that have been digitally edited, the primary benefit of the UHD title is the HDR component.

Older movies on film are inconsistent in their resolving power (depending on filming technique and film stock). After they're scanned at 4K or higher with HDR, they get filters to remove dust and scratches, color grading, and perhaps various digital restoration. So 4K is a good backup resolution for 35mm film; 70mm gets scanned at 8K.
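The bit-depth figures above follow directly from powers of two; a tiny sketch makes the jump from 8-bit to 10-bit and 12-bit concrete:

```python
# Tonal levels per channel for the bit depths discussed above.
# Levels double with each extra bit; total colors grow cubically
# (three channels: R, G, B).
for bits in (8, 10, 12, 16):
    shades = 2 ** bits
    colors = shades ** 3
    print(f"{bits}-bit/channel: {shades:,} shades, {colors:,} total colors")
```

So the step from 8-bit SDR (256 shades) to 12-bit Dolby Vision (4,096 shades) is a 16x finer tonal scale per channel, which is where the smoother gradients and better shadow detail come from.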
Having said all that, I would agree that resolution is a factor in detail. Normally, detail is thought of as the relationship between resolution and contrast (and as I stated before, with photography, it also depends on whether the subject is in focus). However, human perception is also a factor (i.e. if you're standing farther away, you don't need an image with as much "detail"). I sit close to my OLED TV, so I'm almost at the cusp of the "general" recommendations for being able to see a difference between 1080p and UHD. The need for UHD (whose spec actually covers both 4K and future 8K standards) becomes more of a necessity the larger your display gets (which will probably always be the trend).
As for your example of the VHS version of the movie showing more movement by way of analog noise: I think what's more likely is that in editing the DVD version (probably using too much DNR to remove film grain), enough contrast was flattened that details of the tornado didn't show through. Again, studios were more heavy-handed with DNR early on (and the algorithms for retaining contrast weren't as sophisticated), and after a lot of complaining from videophiles, studios have eased up on DNR.
This is where I disagree. There are professionals (sports photographers especially) who shoot only JPEG so they don't have to process the photos: they're taking a lot of shots and need a fast turnaround to get them to editors. I always shoot RAW so I can capture the full exposure range and adjust contrast throughout the image (a RAW works like the negative did with film). With RAW you have a lot more leeway for adjusting exposure, reducing noise at high ISO, and fixing white balance issues. Now, JPEG is fine if you're happy with how the shot turned out and don't want to get into post-processing. But the consumer-standard JPEG is limited to 8bpc, while digital cameras can now record RAWs up to 16bpc: so you're throwing out a lot of information. You're also tied to whatever color profiles you set in the camera (sports shooters who shoot only JPEG are very meticulous about how they've set up their profiles and workflows).
It's also interesting to go back to resolution with stills cameras. So 4K video resolution is a little over 8MP. My first DSLR was 12MP, and Sony has just announced a FF mirrorless that's 61MP. Why more MP with a stills camera vs video? I would say with video, because of action and viewing distance, you don't need as much displayed resolution. Also from a practicality standpoint, it takes a lot more computer power to try to process a high resolution video vs one still. With a still photograph, you may want to print it large, and still want the ability to let people come up to a few inches to closely examine. Lastly, photographers also like to have the option to crop (sometimes heavily....say you took a picture of a bird and didn't have a long enough lens).
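The megapixel comparisons above are simple multiplication; here's a sketch, where the 61 MP sensor dimensions are an assumption (approximately the Sony A7R IV class mentioned):

```python
# Megapixel arithmetic from the paragraph above: 4K video vs. stills
# sensors, and what heavy cropping does to remaining resolution.
def megapixels(width_px, height_px):
    return width_px * height_px / 1e6

print(f"4K UHD video: {megapixels(3840, 2160):.1f} MP")

# Cropping to half of each dimension keeps only a quarter of the pixels,
# which is why photographers who crop heavily want lots of MP to start.
full = megapixels(9504, 6336)        # assumed ~61 MP class sensor
crop = megapixels(9504 // 2, 6336 // 2)
print(f"~61 MP full frame: {full:.1f} MP; 2x crop: {crop:.1f} MP")
```

Even after throwing away 75% of the frame, that crop still out-resolves 4K video, which is the bird-photo scenario in a nutshell.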
I think we mostly do agree....
I don't really do much video, but, from what I've read, here are the primary differences....
Especially at the consumer level, DSLRs allow you to use a variety of lenses, including those with lower F-stop numbers when you want to limit depth of field....
Most lower-end video cameras have a smaller sensor, and less choice of lenses, and so pretty much ALWAYS deliver a long depth of field.
(They have very good low-light sensitivity, which means that, in a normally lit scene, you must stop down to a higher f-number... and they get noisy if you add a neutral density filter.)
This gives you a much wider range of lens options with a DSLR than with a low-end video camera.
However, I've also heard that many higher-end consumer DSLR's have an issue with how the image is "exposed".
(Remember that in video mode the mechanical shutter stays open; the CMOS sensor is read out electronically, row by row, the so-called "rolling shutter".)
They work well if the camera and the background are stationary...
However, if you try to do a horizontal camera pan, vertical objects like buildings will exhibit "tearing"...
Because the sensor rows are read out sequentially instead of all at once, different rows are exposed one after the other, resulting in a horizontal skew if you move the camera too fast.
(I'm sort of remembering that one of the Canon EOS cameras had this issue... the general advice is "well, director, don't do that sort of shot with this camera".)
Presumably a mid-range video camera can deliver equivalent resolution and sensitivity while avoiding these sorts of quirks.
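The skew described above is easy to estimate: it's just pan speed times readout time, converted to pixels. A minimal sketch, where every number (readout time, pan rate, field of view) is an illustrative assumption rather than a spec for any real camera:

```python
# A rough model of rolling-shutter skew: during the sensor's top-to-bottom
# readout, a horizontal pan shifts the scene sideways, so later (lower)
# rows capture the scene displaced relative to earlier (upper) rows.
def skew_pixels(readout_ms, pan_deg_per_s, horiz_fov_deg, frame_width_px):
    """Horizontal shift between top and bottom rows, in pixels."""
    px_per_deg = frame_width_px / horiz_fov_deg
    return pan_deg_per_s * (readout_ms / 1000.0) * px_per_deg

# Assumed: 30 ms readout, a brisk 60 deg/s pan, 40-degree field of view.
print(f"skew ≈ {skew_pixels(30, 60, 40, 1920):.0f} px")
```

Under those assumptions a vertical line leans by roughly 86 pixels over the frame height, which is very visible, hence the "don't do that sort of shot" advice.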
As for that movie.....
It could have been a deliberate choice of some human.
But I suspect that it could also have been the "most intelligent choice" of an automated system.
Remember that DVDs have limited bandwidth, so the encoder does a two-pass analysis/tradeoff when it encodes content.
MPEG depends on being able to re-use data from frame to frame to achieve a good compression ratio...
Therefore, since film grain and plain old noise are almost purely random and change from frame to frame, they compress very poorly...
(Random noise is essentially the textbook example of "non-compressible data".)
Since an automatic system MUST fit the content into the bandwidth it's been allocated...
It's essentially going to analyze the content, then "set the noise filter threshold high enough that what's left has little enough random variation that what's left compresses well".
(So an automated system would try really hard to remove anything resembling random noise... and so would a human operator.)
You used to see this issue with encoded sports events.
A talking head, on a stationary background, always encoded very cleanly...
And a talking head, standing in front of a complex background, like a crowd, encoded well (although sometimes you'd see artifacts for a split second when the background shifted).
However, if the camera moved, and the crowd became a complex moving field, which compressed poorly...
You would notice all sorts of artifacts as the encoder struggled to squeeze everything into the allocated bandwidth....
Then, once the camera stopped moving, everything would "settle down again".
Yes, things have improved drastically since then....
our needs for screen resolution and movie resolution depend on viewing distance and
screen size. without those variables, any conversation is bound to be full of holes.
just because a screen is 4K doesn't mean resolution is the only difference from
a smaller screen. as such, drawing conclusions about the benefits of increased
resolution from our impressions alone is unreliable. it's no better than me sitting 6
meters away from my computer screen and deciding that I'll never need better
than youtube 480P because what I'm seeing looks very sharp on my 24" screen.
and the same logic applies to movie formats, codecs, etc. without being able to
isolate the variable, all we have are assumptions and rushed conclusions.
for cameras, it's been many years since I stopped bothering with the MP numbers
(my first digital camera had 6MP I think, and that was clearly not enough
for my needs; coming from an EOS-1N I cried for a few years before finding a proper
transition to fully digital). noise in low light and my lenses matter much more
to me in terms of how well-resolved the picture will be. I'm happy that I can crop, for sure, but
that's a different matter entirely.
Whether or not you can see the added resolution of 4K depends on how far you sit from the screen. And how close you sit to the screen affects the viewing angle and how much of the screen is off into your peripheral vision. The THX recommended viewing angle is 36 degrees. So if I do the math... I have a ten foot screen. I sit the THX recommended distance away from it of 15 feet. That puts me right smack in the sweet spot for blu-ray. 4K isn't necessary for me unless I get up out of my chair and stand 8-10 feet from the screen. If I do that, the viewing angle is about 65 degrees and the edges of the screen are out of my vision on the sides. In order to sit that close, THX recommends I would need a screen half that size... and it would still not be in the size/distance range for 4K to be worth it!
The truth is, if you are watching 1.85:1 movies at home, there isn't any practical way for 4K to make a bit of difference when it comes to resolution unless you get out of your chair and walk up to the screen. Do the math with the specs on your own setup using the calculator I've linked below, and you'll see what I mean. The only advantage to 4K is color and contrast... which is pretty much negated if you go to projection, because projectors have higher black levels.
Link to THX viewing distance calculator... https://myhometheater.homestead.com/viewingdistancecalculator.html
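The acuity math behind calculators like that one can be sketched directly. A hedged version, assuming the usual 1-arcminute rule of thumb for 20/20 vision and a 120-inch-wide screen (roughly the ten-foot setup described above):

```python
import math

# Farthest distance at which one pixel still subtends a given visual
# acuity; beyond this, adjacent pixels blend together and extra
# resolution is invisible. Assumes 1 arcminute for 20/20 vision.
def max_useful_distance_in(screen_width_in, horiz_pixels, acuity_arcmin=1.0):
    pixel_in = screen_width_in / horiz_pixels
    theta = math.radians(acuity_arcmin / 60.0)  # acuity angle in radians
    return pixel_in / (2 * math.tan(theta / 2))

# Assumed 120-inch-wide screen:
d_1080 = max_useful_distance_in(120, 1920) / 12  # feet
d_4k   = max_useful_distance_in(120, 3840) / 12
print(f"1080p pixels blend together beyond ~{d_1080:.1f} ft")
print(f"4K pixels blend together beyond ~{d_4k:.1f} ft")
```

Under those assumptions, 1080p already out-resolves the eye beyond about 18 feet, and 4K stops mattering past roughly 9 feet, which lines up with the "get up and stand 8-10 feet from the screen" figure above.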
By the way, I did a Spears and Munsil calibration on my projector and it ended up falling right into every one of the detents in the settings. Epson has tight standards and I suspect their projectors are pretty much calibrated right out of the box.
One of the problems is that "enough detail" isn't just a function of your display's native resolution (that graph is a general rule of thumb, not set in stone). Heck, I kept a non-1080p plasma for years because it had great contrast and color rendition (and while its native resolution wasn't 1080, it scaled 1080i content better than a 720p set would). Image quality starts with the cinematographer exposing well and keeping all the important subjects in sharp focus. Then it's about how well the video files are edited and encoded for home formats.

I do still have DVDs... many TV shows. A lot of them are still very watchable, but I can easily see their limitations on a big screen compared to HD. 1080p still looks good on my screen, and in a given shot I can see some good detail. But let's not forget that UHD isn't just resolution; more groundbreaking is HDR. With HDR, I find I can see better gradation in landscapes (detail in clouds and detail in shadows), and more readily, greater contrast and detail in shadows. Some of my favorite movies have been released in 4K, and I do see improvements all around. Watching others of my favorites in HD, they do have detail... but seem flatter because I don't see as much shadow detail. I think contrast being the bigger factor is also confirmed by UHD sources (quite a few modern movies rated as "reference" came from 2K HDR intermediates).
Is your display calibrated (even just the basic controls)?