The problem with "standards" that attempt to quantify "what's visible" is that our ability to perceive and pick out details is actually quite complex.
We are actually able to pick out much smaller flaws and details of certain types than of others.
And we often perceive details only vaguely - as "the glitter in that scene looks more metallic on screen A than on screen B" or "the water looks more like real water".
I mentioned specular highlights in another post.
Those are the tiny bright "glints" or "points" that you see on shiny metallic objects, and metallic tinsel, and the tips of ocean waves in direct sunlight (which may actually occupy a single pixel).
You may not be able to "see" an individual pixel at a certain distance.... but the smaller pixels in a 4k image still present a more realistic impression of things with specular highlights.
Shiny metal looks more like real metal, and water with sunlight glinting off it looks more like actual water.
How this affects "perceived realism" has been addressed in a few articles in trade magazines... but, since it's difficult to measure, and somewhat subjective, it's rarely mentioned otherwise.
(A single bright 4k pixel gets averaged together with its three neighbors into a single, less bright, pixel on an HD screen... and, whatever mechanism is involved in how we perceive it, the result does look slightly different.)
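To make that concrete, here's a minimal sketch (my own toy example, assuming a simple 2x2 box-filter downscale; real scalers use fancier kernels) of what happens to one bright 4K pixel when the frame is reduced to HD:

```python
# Toy model: downscale a 4K luminance patch 2x to HD with a box filter.

def box_downscale_2x(plane):
    """Average each 2x2 block of a luminance plane into one pixel."""
    h, w = len(plane), len(plane[0])
    return [
        [(plane[2*i][2*j] + plane[2*i][2*j+1] +
          plane[2*i+1][2*j] + plane[2*i+1][2*j+1]) / 4.0
         for j in range(w // 2)]
        for i in range(h // 2)
    ]

# 4x4 patch of a 4K frame: one full-brightness specular glint on black.
patch_4k = [[0.0] * 4 for _ in range(4)]
patch_4k[1][2] = 1.0

patch_hd = box_downscale_2x(patch_4k)
print(patch_hd)  # the glint survives as one HD pixel at only 1/4 brightness
```

The glint isn't lost, but its energy gets spread across a whole HD pixel, which is part of why highlights read differently on the two screens.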
Another excellent example I've found is exterior scenes that include a chain link fence in bright daylight... where the wire in the fence is, presumably, sometimes only about a single pixel wide.
Specifically, look for scenes where there are detailed objects or people walking behind the fence.
As the camera, or the objects behind the fence, move... you see moiré patterns between the wires of the fence and the objects behind them...
(These patterns look both clearer and less obvious on a 4k screen... and they tend to be easier to ignore as being "separate from what we're looking at".)
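The fence effect is classic aliasing, and the arithmetic can be sketched in a few lines (my own toy model, with a made-up fence frequency, treating the pixel grid as a 1-D sampler):

```python
# Toy illustration: why the same fence pattern can alias into a coarse
# moire band on an HD grid but not on a 4K grid. The fence is treated as
# a 1-D spatial frequency and the pixel grid as a sampler.

def alias_frequency(f, sample_rate):
    """Apparent frequency after sampling, per classic aliasing folding."""
    f = f % sample_rate
    return min(f, sample_rate - f)

fence = 1500  # fence-wire cycles across the picture width (made-up value)

for name, samples in [("HD (1920 px wide)", 1920), ("4K (3840 px wide)", 3840)]:
    print(name, "->", alias_frequency(fence, samples), "apparent cycles")
# HD folds 1500 cycles down to a coarse 420-cycle moire band;
# 4K resolves all 1500 cycles, so any residual pattern is much finer.
```

A coarse 420-cycle beat pattern is far more visible than the fine detail that produced it, which is why the moiré stands out more on the lower-resolution screen.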
It's also worth noting that there could be several factors at work.....
- most modern 4k TVs are also HDR, while other TVs are not, which results in a significant difference in the dynamic range available on "4k sets"
- virtually all TVs and projectors implement a variety of smoothing, sharpening, and motion processing options to "improve" the picture, and these vary considerably
(for example, a typical sharpening filter operates on groups of pixels, and not on individual pixels, and so affects areas far larger than a single pixel)
- it's also reasonable to assume that any movie intended to become a 4k film or disc will be filmed using cameras with sharper optics and sensors
- (and, of course, any computer graphics created for use with a 4k version of something will be created using different settings as well)
Things get even more complicated when you consider things like chroma subsampling.
Thanks to chroma subsampling, your Blu-ray disc delivers the full 1920 x 1080 resolution for luma (brightness), but only half that resolution, in each direction, for the color information.
This looks fine on most videos, and on objects like cars and faces, but results in color fringes on small bright details against a dark background (like white text on a computer monitor).
You also get color fringes on specular highlights... (with 4:2:0 subsampling, a single bright pixel keeps its full-resolution brightness, but shares a single color sample with a 2x2 block of its neighbors... so tiny highlights pick up tints from whatever surrounds them).
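Here's a small sketch of that fringing (my own illustration using the BT.709 conversion math, with made-up "glint on blue water" values):

```python
# Toy model: 4:2:0 gives one chroma sample per 2x2 block, so a white
# specular glint on blue water shares its color with the water around it.

def rgb_to_ycbcr(r, g, b):
    # BT.709 luma and chroma-difference channels
    y = 0.2126 * r + 0.7152 * g + 0.0722 * b
    return y, (b - y) / 1.8556, (r - y) / 1.5748

def ycbcr_to_rgb(y, cb, cr):
    r = y + 1.5748 * cr
    b = y + 1.8556 * cb
    g = (y - 0.2126 * r - 0.0722 * b) / 0.7152
    return tuple(min(1.0, max(0.0, c)) for c in (r, g, b))

# 2x2 block: one white glint, three "ocean blue" pixels (made-up values).
block = [(1.0, 1.0, 1.0), (0.1, 0.3, 0.8), (0.1, 0.3, 0.8), (0.1, 0.3, 0.8)]

ycc = [rgb_to_ycbcr(*px) for px in block]
cb = sum(p[1] for p in ycc) / 4   # 4:2:0: one Cb...
cr = sum(p[2] for p in ycc) / 4   # ...and one Cr per 2x2 block

decoded = [ycbcr_to_rgb(y, cb, cr) for y, _, _ in ycc]
print("glint:", decoded[0])   # red channel drops: the glint picks up a blue-cyan tint
print("water:", decoded[1])   # blue channel drops: a desaturated fringe around the glint
```

The luma stays pin-sharp, but the white glint comes back tinted and the surrounding water loses some saturation... exactly the kind of subtle fringe described above.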
That's why, if you plan to use your TV as a computer monitor, and use it with text, you want to choose one that supports 4:4:4 chroma (i.e., no subsampling at all).
Even though Blu-ray discs use 4:2:0 subsampling, most computer graphics cards will deliver 4:4:4, with full resolution on all colors... and no color fringing.
In general, sets that support full 4:4:4 chroma subsampling cost extra (because it's "a premium feature").
However, because of its higher native resolution, any set that delivers 4k at 4:2:0 can deliver 1080p at 4:4:4...
(At 1080p, all three planes only require half the panel's native 4k resolution in each direction... which it can easily deliver.)
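The arithmetic behind that claim is easy to check (a quick sketch of my own, just counting samples per frame):

```python
# Sanity-check arithmetic: a 4K 4:2:0 frame and a 1080p 4:4:4 frame carry
# the same number of chroma samples, so a 4K panel has pixels to spare
# for full-resolution 1080p color.

def samples_per_frame(width, height, subsampling):
    # 4:4:4 -> chroma at full res; 4:2:0 -> chroma halved in both axes.
    luma = width * height
    chroma = luma if subsampling == "4:4:4" else luma // 4
    return luma, chroma  # (luma samples, samples per chroma plane)

print(samples_per_frame(3840, 2160, "4:2:0"))  # (8294400, 2073600)
print(samples_per_frame(1920, 1080, "4:4:4"))  # (2073600, 2073600)
# Same 2,073,600 samples per chroma plane in both cases.
```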
I would also advise anyone, before they make generalizations about "what is possible with 4k projectors", to find some way to look at the image on one of the latest 4k projectors from JVC, or Sony, or Panasonic. The latest LASER-illuminated models are extremely expensive, and you're more likely to encounter one at a trade show, or a very high-end store, than at Best Buy... however the sharpness and brightness they deliver are both remarkable... both from across the room... and when you walk up to the screen.

And, if you want to talk about what 4k CAN do, and whether the difference is visible or not, it's only fair to compare the best examples of the technology... And, yes, the latest "high end home models" do deliver far better performance than what you'll find in most theaters. And, of course, you can expect this level of performance to become far cheaper over time.
Again, there are different recommendations (beyond THX). Within SMPTE, my viewing distance/screen size gives an acceptable viewing angle, and is just at the cusp of perceptually seeing 4K. You can go in circles trying to convince yourself about what the optimal viewing angle should be: but when it comes to cinema, it's always varied. Growing up, my dad and I would find seats in a movie theater, and my grandfather and grandmother would find seats further back (my grandfather's rule was extending your arm all the way, then using your thumb and pinkie as the edges of the frame). I can understand the argument that you shouldn't include the periphery, as our eye's resolution is concentrated in a pretty narrow angle. However, we scan the environment (which is why the actual perceived resolution of our visual system is even higher than 4K or 8K). So I don't subscribe to it, and many people do tend to want a larger screen. It's also your own argument for your projection system (that size trumps resolution). If we were to go by your current stance on "recommended" viewing angles, IMAX and CinemaScope would never have been marketable. CinemaScope was a theater standard. The original IMAX format was made for science museums (and filmed on a horizontal 70mm format). The IMAX in cineplexes is a smaller screen with *usually* an up-scaled digital IMAX format.
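For anyone who wants to check their own seat, here's a rough geometry sketch (my own illustration, with made-up example numbers) for viewing angle and arcminutes per 4K pixel:

```python
# Rough geometry: horizontal viewing angle and arcminutes per 4K pixel
# for a given screen width and seating distance. A common rule of thumb
# is that ~1 arcminute per pixel is near the limit of 20/20 acuity.
import math

def viewing_geometry(screen_width_in, distance_in, h_pixels=3840):
    angle = 2 * math.degrees(math.atan(screen_width_in / (2 * distance_in)))
    arcmin_per_px = angle * 60 / h_pixels  # approximation: angle spread evenly
    return angle, arcmin_per_px

# Example: a 65" 16:9 set (~56.7" wide) viewed from 8 feet (96").
angle, acuity = viewing_geometry(56.7, 96)
print(f"viewing angle ~{angle:.1f} deg, ~{acuity:.2f} arcmin per 4K pixel")
```

At those example numbers the pixels subtend about half an arcminute, i.e., right around the "cusp" territory mentioned above; move closer or go bigger and the 4K detail becomes unambiguous.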
To me, the one cinema technology I've abhorred is stereoscopic 3D. When watching content, your eye wanders and scans elements besides the talking character. In a 3D movie, the subject is the only thing in focus, and so the only thing you can really watch. I've also noticed that with a good HDR display, the image pops and seems less flat (and, to me, better than a stereoscopic 3D movie).