Downsampling from a higher resolution than native increases image quality at the cost of performance. Upscaling from a lower resolution than native increases performance at the cost of image quality. The point of temporal reconstruction techniques like checkerboard rendering is to mitigate that image quality loss as much as possible. Upscaling has always been a thing in PC games that let you select arbitrary resolutions; it just looks like blurry crap at anything less than native on an LCD.

With ultrawide 1440p, 4K, and 5K monitors becoming more mainstream, the need for quality upscaling on PC has never been greater. Why spend crazy money on flagship GPUs to run at native res when you can render at a fraction of that and amortize the cost of native-res rendering over multiple sub-native frames while maintaining good image quality? Efficiency > brute force in my book. Even something as simple as nearest-neighbor integer upscaling, for non-blurry fullscreen 1080p on a 4K monitor or 1440p on a 5K, would be a godsend if implemented in Nvidia/AMD drivers. But of course the IHVs don't give a rat's ass, since they're in the business of selling you overpriced graphics cards so you can brute-force your way to native resolution.
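For the curious, here's a minimal sketch of why integer scaling stays sharp: each source pixel just becomes a factor×factor block of identical pixels, with no filtering to smear edges. (Toy code operating on a 2D list of pixel values, not an actual driver implementation.)

```python
def integer_upscale(image, factor):
    """Nearest-neighbor integer upscaling: every source pixel is
    duplicated into a factor x factor block, so a 1920x1080 frame
    maps exactly onto a 3840x2160 panel at factor=2 with no blur."""
    out = []
    for row in image:
        # Repeat each pixel 'factor' times horizontally...
        scaled_row = [px for px in row for _ in range(factor)]
        # ...then repeat the whole row 'factor' times vertically.
        out.extend(list(scaled_row) for _ in range(factor))
    return out
```

Because the scale factor divides the panel resolution evenly, every source pixel lands on whole display pixels, which is exactly why 1080p→4K and 1440p→5K are the clean cases.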
And no, a GTX 1080 is not even close to being a 4K GPU IMO, nor is a Titan XP for that matter. A 1080 already dips below 60 min FPS at 1080p in recent, demanding titles at near-max to max settings, e.g. TW3 with HairWorks, DX:MD (Ultra, no MSAA), and Forza Horizon 3. As for the console comparison: PS3 used blurry upscaling algorithms without the benefit of the 2x MSAA that was effectively free on the X360 thanks to its eDRAM, while PS4 Pro uses state-of-the-art techniques like checkerboard rendering that get near-native IQ, with HDR on top of that. So really, the two have nothing in common, and Sony made it crystal-clear this time around that the Pro is not for native 4K gaming.
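The core idea behind checkerboard is simple enough to sketch: each frame the GPU shades only half the pixels in a checkerboard pattern, and the missing half is filled in from the previous frame. This is a deliberately stripped-down toy; real implementations like Sony's add motion-vector reprojection, ID buffers, and artifact rejection on top of the basic interleave.

```python
def checkerboard_merge(current, previous, frame_parity):
    """Toy checkerboard reconstruction: 'current' holds freshly shaded
    samples only at positions where (x + y) % 2 == frame_parity; the
    other half of the image is carried over from the previous frame.
    Parity alternates each frame, so every pixel is refreshed every
    two frames at roughly half the shading cost per frame."""
    h, w = len(current), len(current[0])
    return [
        [current[y][x] if (x + y) % 2 == frame_parity else previous[y][x]
         for x in range(w)]
        for y in range(h)
    ]
```

The quality hinges entirely on how well the stale half is corrected for motion, which is where the real engineering effort goes.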