obobskivich
Headphoneus Supremus
Not to argue with you, but I would have to disagree. Let's go ahead and explain...
Frame rate for video is not fixed by any sort of "storage space" or "processing power" - film can run at whatever rate you'd like, and 24 fps was selected because it produces fluid motion. There's also something film does, called temporal blur or temporal anti-aliasing, that purely digital systems cannot do (digital video CAN carry it, because it generally comes from a film intermediary), and that blur is a big part of why 24 fps works. Essentially no film is shot at a higher rate because it doesn't make sense to, and that has been true for ages. Digital video equipment is designed to replace film, and it can more or less accomplish that (digital cinema projectors, for example), so it is designed with the current standards and limits in mind.
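To make the temporal blur idea concrete (this is just my own toy illustration, not how any camera or encoder actually works): a film frame isn't a frozen instant, it's an average of everything that happened while the shutter was open, so each of those 24 frames carries a little slice of motion with it. Something like this, in rough Python:
Code:
# Toy sketch of temporal blur: average an object's position over the time
# the shutter is open. Frame rate, shutter fraction and sample count here
# are all made-up illustration values.
def blurred_position(position_at, frame_index, fps=24.0, shutter=0.5, samples=16):
    frame_start = frame_index / fps
    exposure = shutter / fps          # a 180-degree shutter at 24 fps ~ 1/48 s
    total = 0.0
    for i in range(samples):
        t = frame_start + exposure * i / (samples - 1)
        total += position_at(t)
    return total / samples            # one frame "contains" a slice of time

# e.g. an object moving at 480 px/s - the recorded position is smeared
# across everything it did during that 1/48 s
print(blurred_position(lambda t: 480.0 * t, frame_index=10))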
There is no DBT that supports an audible difference between expensive amps or DACs and competent inexpensive ones, and you're drawing an unrealistic comparison there (you could go pick something like an SR-009 that can't plug into an iPod at all); keep it relevant.
Now on to 120 Hz and 60 Hz and 600 Hz and all of that:
You have never viewed a film that was captured at 120 fps - they are not made. What you have seen is a modern HDTV with something called Motion Interpolation, which attempts to create TAA through frame generation; it's a (mathematically/technically) better way of doing something called 3:2 pulldown (in other words, almost all TVs ever made have a field rate of 60 Hz, while most content is 24 or 30 fps - that mismatches). There is a very visible difference running MI, but it is not somehow producing the content at 120 fps; you're still viewing 24 fps content.
Regarding "you can tell if the objects aren't moving fluidly" - that's a limitation of the display technology, not the film itself (within reason; again, there is temporal blur, but 1/24 s is fairly fast). LCDs are especially terrible at this; CRTs, OLEDs, and most PDPs are more or less immune to motion blur. Display size has nothing to do with it either - do you think IMAX is a blurry mess? How about normal cinemas? Those do not run at 120 Hz.
You absolutely do not want Motion Interpolation for gaming, as it inserts quite a lot of input latency between the game and the gamer, which hurts your response time; this is documented (I will provide some links below for more information).
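If it helps, here's a rough sketch of the difference between 3:2 pulldown and frame-generating interpolation (my own illustration, not how any particular TV actually implements it): pulldown just repeats the 24 source frames in a 3, 2, 3, 2 cadence to fill 60 fields, while MI invents in-between frames - more frames on screen, but the extras are synthesized, not captured.
Code:
# Illustration only: 3:2 pulldown vs. a naive blend "interpolator".
# Real MI uses motion estimation, not a straight blend between frames.
def pulldown_3_2(frames):
    fields = []
    for i, frame in enumerate(frames):
        fields.extend([frame] * (3 if i % 2 == 0 else 2))
    return fields                      # 24 frames -> 60 fields, still only 24 unique images

def blend_interpolate(frames, steps=5):
    out = []
    for a, b in zip(frames, frames[1:]):
        out.extend(a + (b - a) * k / steps for k in range(steps))
    out.append(frames[-1])
    return out                         # the extra frames are made up by the TV

source = list(range(24))               # one second of 24 fps "content"
print(len(pulldown_3_2(source)))       # 60
print(len(blend_interpolate(source)))  # 116 - roughly a "120 Hz" worth of frames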
Here are some links:
http://en.wikipedia.org/wiki/Motion_interpolation
http://www.anandtech.com/show/2803/1
https://en.wikipedia.org/wiki/Input_lag#Input_Lag_example_for_console_gaming
http://www.beyond3d.com/content/articles/66/
Regarding how many FPS you can "see":
https://en.wikipedia.org/wiki/Frame_rate
The article does a good enough job of explaining that your eyes don't "sample" - they work in a continuous manner. Conventionally it is accepted that anything in the 30-60 fps range is going to be "fluid motion" enough to trick your eyes; anything extra is gravy. You can go "slower" than 30-60 fps if you have time data blurred into the media (film can do this), but eventually the whole thing becomes a blurry mess (it's moving, but it's blurry) - in other words, you're taking too large a "slice" of time for your sample.
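For a feel for what "too large a slice" means, here are some back-of-envelope numbers (my own example, not from the linked article): take something crossing a 1920-pixel-wide screen in two seconds and look at how far it moves during one frame at different rates.
Code:
# Quick arithmetic for the "slice of time" point - example numbers only.
speed = 1920 / 2.0                     # pixels per second across the screen
for fps in (24, 30, 60, 120):
    per_frame = speed / fps            # motion captured in one frame's slice of time
    print(f"{fps:>3} fps: {per_frame:5.1f} px between frames")
# 24 fps -> 40 px per frame. Blur that 40 px of motion into the frame and it
# reads as fluid; stretch the slice much wider and the image just smears.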
Again, I'm not trying to pitch an argument here, but we're really talking apples and oranges when dealing with HDTVs and film vs CG. Yes, there is absolutely a difference with MI turned on. No, it is not agreed to be universally better (there are MANY long-running debates in HT enthusiast circles about what the "best" option is - if you like the feature, use it!). And finally, yes, it does introduce more delay into the video chain, which is to the detriment of console games. My personal take is that MI is great for animated content (like CG movies and cartoons), but pointless for anything shot on film (non-CG movies) or for videogames (which are starting to introduce motion blur on their own).
As a final note, those plasmas that claim 600 Hz are using MI as well.
Quote:
Not to sound like an ass, but I am going to have to disagree with you here. The reason films and other mainstream media are not stored at frame rates higher than 30 fps is not because you cannot tell the difference, but because of the obscene amounts of storage space and processing power that would be required to decode and display the video. The frame rates used for television and movies are high enough that the spaces between the frames are not so large as to draw your attention to them rather than the video being shown. Also, consoles run at 60 fps, not 30 fps. Older consoles such as the Nintendo 64 ran at 30 fps; more recent ones are not limited to this frame rate, though some developers do implement their own 30 fps limit in cases where they are unable to obtain 60 fps consistently. Drops from 60 fps to 30 fps are more likely to draw your attention than if it was running at 30 fps the entire time. Battlefield 3 is an example of this, as it is locked to 30 fps for consoles.
As for your comment about human eyes being unable to register above 24/30 fps, that is comparable to saying that your ears cannot perceive the difference between powering a high-end set of headphones off an iPod or a multi-thousand-dollar DAC/amp combo. I don't mean to insult you or to sound condescending, but having viewed films and other media on a 120 Hz monitor and other high-end video devices, the differences between 120 fps, 60 fps, and 30 fps are quite vast. If you watch any film which was not interpolated or had any other kind of frame-blending trickery applied, you should be able to notice that the motion of objects moving across the screen does not look perfectly smooth at 30 fps. However, on smaller devices such as phones 30 fps is good because the distances covered aren't as far.
I will agree that the difference between 120 Hz and 60 Hz isn't blatantly obvious if you haven't used it for long; like other parts of audio/video, you grow acclimatized (not sure if this is the right word to describe it?). When I first started using Sennheiser HD600's, coming from the Sennheiser HD457's, there wasn't that much of a difference to me other than deeper bass. Now, after listening to the HD600's for over a year, if I listen to the HD457's I think "I don't remember these producing mostly mid bass, with the highs seeming like there's a piece of cotton in the phones..." Viewing games and other media at 120 Hz is kind of like that. At first only extremely fast-moving objects were noticeably different, but now if a game forces 60 fps it looks odd and no longer looks completely smooth. On the contrary, before I upgraded to a 120 Hz display I was on the bandwagon of thinking 60 Hz was completely smooth and that people who thought otherwise were experiencing the placebo effect.
For our martin, I suggest that if you do end up buying that monitor, you check whether it uses interpolation to display the frames or whether it can actually take 120 Hz input and display those same frames. Interpolation causes the monitor to create extra frames, which takes time and causes large amounts of input lag. It also creates a blurring effect which you may or may not like. I was unable to find the specifications for the monitor, so I do not know whether it interpolates or not. Also, I currently have the Asus Xonar STX, and I have noticed that in games which have a large amount of overlapping sounds it will output crackling and popping sounds, with or without GX mode activated. My previous Creative sound card did not do this, but I use the Xonar for its superior sound quality as I usually disable the in-game sounds.