Quote:
Never questioned that, lol. It was a great description. I guess, but isn't that a moot point when the refresh rate is something insane? The pixel mapping/stretched image is no good, though. Right. Hmm, so what do you use it for, then?
Isn't it obvious? The GDM-FW900 is my primary PC monitor. Only a PC would really take advantage of its capabilities anyway. (2304x1440 at 80 Hz, and lower resolutions at up to 160 Hz, should give you an idea of why it's so sought-after.)
Getting consoles hooked up to it just happens to be a side bonus. No need to waste money on an HDTV when the FW900 could do the job.
Quote:
Nameless,
Sony "loves" the green channel so much because it carries the most luminance data in an RGB signal. That's why all "Bayer Pattern" camera sensors are GRGB, only 2 camera sensor manufacturers use something different.
Higher refresh rates are moot for consoles, honestly I prefer 24 fps for movies over the higher stuff you see in soap operas and the new 48fps movie, The Hobbit (which is otherwise a great movie IMO, even for a huge fan of the book like me. Peter Jackson drew out some of the fight/escape scenes more than necessary though). HDTVs featuring higher than 60hz are just advertising useless marketing fluff. Display Port (and the related thunderbolt) are the way to go for computer monitors though.
Anyone know a good choice for an Xbox VGA cable? I noticed that on most of the 3rd party cables, people complain about ghosting problems.
Heh, camera sensor discussion...brings to mind all those Bayer vs. Foveon arguments. Too bad that Foveon sensors are only used by Sigma (with very few exceptions), and Sigma's DSLR offerings are hideously expensive, still behind in a few areas compared to the competition, and worst of all, use a proprietary SA lens mount, as if we didn't have enough competition between Canon EF, Pentax K, and whatever Nikon's using right now.
I thought that part of the reason green was favored to begin with, including giving it an extra bit of data over the red and blue channels with current RGB standards, is that the human eye is naturally more sensitive to green than the other additive primaries. Probably an instinctual thing for finding plants and whatnot.
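Just to put rough numbers on that, here's a quick sketch. I'm assuming the Rec. 709 luma weights and the 16-bit RGB565 pixel layout as my examples of "green gets more weight / an extra bit"; the post above may have had other standards in mind.

def luma(r, g, b):
    # Approximate perceived brightness of an 8-bit RGB pixel using Rec. 709 weights;
    # green alone carries roughly 70% of it.
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def pack_rgb565(r, g, b):
    # 16-bit "high color": 5 bits red, 6 bits green, 5 bits blue -- the classic
    # case of green getting the extra bit, since our eyes notice green banding the most.
    return ((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3)

print(luma(0, 255, 0))                  # ~182 -- pure green reads far "brighter" than pure red (~54) or blue (~18)
print(hex(pack_rgb565(255, 255, 255)))  # 0xffff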
Indeed, higher refresh rates are moot for consoles that were designed around 60 Hz displays anyway. PCs can take advantage of them because their performance isn't fixed the way a console's is, and also because PC CRT monitors were generally designed to run well above 60 Hz. The best ones max out at 160 Hz (most FD Trinitrons like the FW900), or even 180 Hz (certain Diamondtron NF models I can't recall at the moment).
I have no comments on Xbox 360 VGA cables; all I recall is that while Microsoft did offer an official one at launch, people complained about the brightness being way too high. I don't know if they fixed that in the numerous firmware overhauls they've released since.
As for how good higher refresh rates look, that really depends. Movies work differently from video games; the latter ALWAYS look better at higher framerates, especially since most games don't have any form of motion blur to obscure the jarring transitions between frames. Movie cameras obviously capture motion blur, so lower framerates look more natural there.
Quote:
Two different things, yes, but they do interact in a way. I don't think a higher refresh rate is much of a boon if it's just refreshing the same frame over and over because the game's running at very low fps.
I'm admittedly not very knowledgeable about current-gen consoles since I'm mostly into PC gaming these days, but when someone on that end gets a true 120 Hz display, it's usually with the intention of running the game at well over a constant 60 fps.
Then you've got vsync and all that; no clue how consoles handle that, though I imagine it's more standardized, since most games push the hardware pretty hard and there's less variety of display configurations.
You hit the nail on the head with that point about refreshing the same frame over and over. The real benefit comes when the game in question is already running at over 60 FPS. That's pretty easy if you're a hardcore Quake or Unreal Tournament player with today's hardware, which can run those games at over 120 FPS constantly, and fast-paced shooters like that really benefit from the extra visible frames.
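To put some back-of-the-envelope numbers on that, here's a quick sketch (assuming vsync and perfectly even frame pacing, which real games rarely deliver):

def frames_shown(game_fps, refresh_hz):
    # With vsync, unique frames per second can't exceed either the game's
    # framerate or the display's refresh rate; anything beyond that is just
    # the display re-scanning the same frame.
    unique = min(game_fps, refresh_hz)
    repeats = refresh_hz / unique  # average refreshes spent showing each frame
    return unique, repeats

print(frames_shown(30, 120))   # (30, 4.0)  -> 120 Hz buys nothing; each frame sits there for 4 refreshes
print(frames_shown(120, 120))  # (120, 1.0) -> a genuinely new frame every refresh
print(frames_shown(150, 120))  # (120, 1.0) -> the display caps you at 120 unique frames per second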
People tend to think that higher refresh rates were just to eliminate flicker from CRTs; apparently, they didn't consider the smoother motion benefits, probably because they're not competitive PC gamers. Then again, there are plenty of people who foolishly believe you can't see more than 24-30 FPS, while I can clearly perceive the added smoothness between 60 and about 90-100 FPS on a display that can actually refresh that fast.
Also important to know: refresh rate, response time, and input lag are all completely separate things. A high (slow) response time on LCDs does make higher refresh rates pointless, because the frames all blur together as if someone held a camera's shutter open too long, but a low response time does not equal a high refresh rate. Meanwhile, input lag is the delay between when the display receives the signal and when it actually starts to draw it on screen; it's from that point that the response time is measured.
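Here's a crude way to see how the three figures stack up; the numbers are made-up examples, not measurements of any particular display:

input_lag_ms = 20.0  # (made-up) delay before the panel even starts drawing the received frame
for refresh_hz, response_ms in [(60, 16), (120, 16), (120, 4)]:
    interval_ms = 1000.0 / refresh_hz      # time between refreshes: 60 Hz -> ~16.7 ms, 120 Hz -> ~8.3 ms
    blurred = response_ms > interval_ms    # pixels still mid-transition when the next frame lands
    total_ms = input_lag_ms + response_ms  # signal received -> pixel fully settled
    print(f"{refresh_hz} Hz, {response_ms} ms response: smears across frames: {blurred}, "
          f"~{total_ms:.0f} ms from signal to settled pixel")

A 16 ms panel keeps up fine at 60 Hz but smears at 120 Hz, which is the "pointless higher refresh" case above, and notice that the input lag figure has no bearing on either of the other two.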