NamelessPFG
Headphoneus Supremus
Out of pure curiosity: is there such a thing as a videophile video DAC?
Not really, since most displays and their interfaces (DVI, HDMI, DisplayPort) have been digital for over a decade. Analog video only really pertains to CRTs: instead of a fixed digital pixel grid, they scan an electron beam across a phosphor surface according to whatever video signal they're fed, as long as it's within the yoke's sync limits.
Because of that, the other direction - videophile ADCs - is more of a thing, mostly in the form of upscalers like the Micomsoft XRGB-Mini Framemeister that take analog RGB video and cleanly convert and upscale it to something considerably friendlier to modern HDMI displays and capture cards. More on that here if you're interested:
https://www.youtube.com/watch?v=VtTM7nU9SMA&list=PLTNBVisVMbSR1ZDDQRgjg6S9D2YQ4rwnZ
That said, analog video quality for PC use was a selling point back in the '90s, when people who wanted clean 2D output with a 10-bit color LUT instead of just an 8-bit one had to pay up for a Matrox card rather than the cheaper S3, ATI, NVIDIA, etc. offerings.
This actually worked out well when 3dfx burst onto the scene with pure 3D accelerators that worked in tandem with an existing 2D graphics card, so there wasn't really a tradeoff between 2D graphics fidelity and blistering-fast 3D performance at first.
But of course, clean VGA output, 10-bit LUTs, and even dual VGA-outs became a standard thing as the market quickly consolidated into NVIDIA vs. ATI, and then LCDs took over.
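To put a number on why that extra LUT precision mattered, here's a quick Python sketch (purely my own illustration, not anything a particular card actually did) counting how many distinct levels survive a gamma adjustment through an 8-bit versus a 10-bit LUT:

```python
# Toy illustration: push the 256 levels of an 8-bit framebuffer through a
# gamma-correction LUT quantized to either 8 or 10 bits, then count how many
# distinct output levels survive. Fewer surviving levels means more visible
# banding in smooth gradients.
def levels_after_gamma(lut_bits, gamma=2.2):
    lut_max = (1 << lut_bits) - 1
    outputs = set()
    for v in range(256):                         # 8-bit framebuffer values
        corrected = (v / 255.0) ** (1.0 / gamma)
        outputs.add(round(corrected * lut_max))  # quantize to LUT precision
    return len(outputs)

print(levels_after_gamma(8))    # roughly 184 distinct levels left
print(levels_after_gamma(10))   # all 256 input levels stay distinct
```

The exact counts depend on the gamma value you pick, but the pattern holds: an 8-bit LUT collapses levels together after a strong adjustment, which is exactly the kind of banding the pricier cards let you avoid.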
Sure, but I don't think they call them DACs. All I know is that the Sony PS3 was considered a better Blu-ray player than most of the competition for video quality, until Oppo took the throne. And there are always exotic, super-$$$ items jockeying for the "best quality" position.
The signal decoder built into a TV is also quite important... Maybe that's the RAMDAC that Nameless is referring to? And 400 Hz is kind of like audio's 24-bit/192 kHz sampling rate?
Signal decoders wouldn't really make sense from an analog video standpoint; the closest you'd have there would be the old comb filters used to separate luma from chroma and keep composite video from looking like crap. There's nothing to decode when sending analog video to a CRT if it's already in its component RGBHV form, as it typically is with a VGA interface.
Basically, you need a faster RAMDAC to drive higher resolutions and refresh rates. For instance, 1600x1200 at 95 Hz requires a 256.5 MHz pixel clock, and 1920x1200 at the same 95 Hz (which the FW900 can reach) requires roughly a 319.8 MHz pixel clock. That's well beyond what any garden-variety DisplayPort-to-VGA adapter can handle.
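If you want to sanity-check those numbers yourself, here's a rough Python sketch of the arithmetic; the blanking fractions are illustrative assumptions rather than the exact GTF/CVT timings a card would actually generate, which is why the results land a bit under the quoted figures:

```python
# Rough pixel-clock estimate: (active pixels + assumed blanking) x refresh rate.
# Real GTF/CVT timing formulas compute the blanking differently, so these come
# out a little lower than the figures quoted above.
def pixel_clock_mhz(h_active, v_active, refresh_hz,
                    h_blank_frac=0.30, v_blank_frac=0.04):
    h_total = h_active * (1 + h_blank_frac)   # assumed ~30% horizontal blanking
    v_total = v_active * (1 + v_blank_frac)   # assumed ~4% vertical blanking
    return h_total * v_total * refresh_hz / 1e6

print(pixel_clock_mhz(1600, 1200, 95))   # ~247 MHz (quoted GTF figure: 256.5 MHz)
print(pixel_clock_mhz(1920, 1200, 95))   # ~296 MHz (quoted GTF figure: ~319.8 MHz)
```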
A better comparison would be how newer HDMI and DisplayPort revisions are needed to increase the available video bandwidth and thus the available resolutions and refresh rates.
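As a ballpark version of that digital-link math (the per-revision capacities below are rough effective data rates I'm assuming after encoding overhead, not exact spec figures):

```python
# Approximate uncompressed video data rate versus rough effective link capacities.
# Capacities are ballpark post-encoding figures, not exact spec values.
def video_gbps(h, v, refresh_hz, bits_per_pixel=24, blank_overhead=1.2):
    return h * v * refresh_hz * bits_per_pixel * blank_overhead / 1e9

links_gbps = {
    "HDMI 1.4": 8.16,          # assumed effective rate after 8b/10b encoding
    "HDMI 2.0": 14.4,
    "DisplayPort 1.2": 17.28,
}

need = video_gbps(3840, 2160, 60)   # 4K60, 8-bit RGB, with assumed blanking
for name, capacity in links_gbps.items():
    verdict = "enough" if capacity >= need else "too slow"
    print(f"{name}: {verdict} for ~{need:.1f} Gbps (4K60)")
```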
Personally I'm in no hurry to abandon 1080p. I'd rather run high settings, a good framerate, and a lower (native) resolution than lower settings, a good framerate, and a higher (native) resolution. Plus, half the time I sit or lie on my bed to play games, so that's half the time when the difference between a native 720p and 1080p screen would probably be negligible, never mind higher resolutions. Mind you, the biggest monitor I have room for is 24" (unless I were to downgrade my speakers), so I might feel differently if I had a bigger screen.
I'd actually gladly embrace 4K and resolutions even beyond that, but that would be murder on my poor GTX 980. Even the new Pascal-based Titan X (dammit, NVIDIA, Apple branding does NOT work with graphics cards!) would be stressed at 4K, I'm sure.
2560x1440 at 144 Hz seems like the best balance for now until GPUs catch up - and, yes, that higher refresh rate matters from a PC gaming perspective.