I'm gonna go off topic here, because I'm interested in gsync myself.
Has the way gsync truly works been published yet? I'm no techie, but even so, I think I have it figured out. I literally thought it out in my head for like... an hour, for no reason.
A 120Hz display is necessary, and backlight-scanning tricks don't work at the same time (based on the LinusTechTips video of the Asus monitor). My deductions/hypotheses:
1. As Nvidia stated, Gsync will only go up to the display's max refresh rate.
2. In order to get smooth playback when fps dips below 60, the display is actually speeding up and doubling the image. I.e.: if fps dips to 45fps, the display refreshes at 90Hz so it can show each 45fps frame twice (I've put a rough sketch of this idea after the list). The reason I believe this is 100% true is because IF the display dropped to a native 45Hz to show 45fps without stutter, you would get some pretty unbearable flickering (try watching a movie in a 48Hz mode on a plasma... the light flicker is unbearable). Staying over the 60Hz line keeps flicker down. I took my plasma's 96Hz mode for 24Hz content as a sort of example: for 24Hz Blu-ray content my TV repeats each frame so it's shown 4 times (24 x 4 = 96), which gives the proper film cadence without the crazy flicker or the 3:2 pulldown judder that happens when 24fps content plays at 60Hz.
3. The reason backlight scanning (which effectively makes a display behave like it's refreshing at double its 120Hz refresh, reducing motion blur) won't work in conjunction with gsync is because backlight scanning (for right now) is timed to the display's fixed, innate refresh rate, so gsync's sporadic/changing refresh could at times clash with the backlight scan and mess up the image (this is my assumption). I have heard that gsync monitors have BOTH gsync and backlight scanning but you can only use one or the other, which leads me to believe my theory is accurate.
4. The reason gsync only smooths out framerates below 60fps up to a certain point is because if the framerate gets too low, the screen would end up displaying the frame before and the frame after AT THE SAME TIME. That's why there are reports of ghosting once fps drops below a certain point (I believe it was the low 40s or less).
5. (THIS IS PROBABLY WRONG, BUT I THINK IT'S POSSIBLE.) So why does slightly-below-60fps, down to the high 40s, still look like 60? This is the tricky part, but I believe they are using motion interpolation to fill in the gap between the current fps and the native signal (there's a toy example after the list). It's the sort of interpolation currently used in basically all modern TVs to make content look more live: movies end up looking like behind-the-scenes footage, and games that run at 30fps look more like 60fps. Motion interpolation is when a frame gets created between 2 actual frames, which is why 30fps can look like 60fps at times. It isn't perfect, however. Anyway, I believe this may be part of the gsync equation and why they can make 45fps (for example) appear to look like 60fps.
6. The one main thing I don't understand is how framerates ABOVE 60fps are handled. Is the display refreshing at those higher framerates, up to 120-144Hz (whatever the display natively runs at), despite most displays only accepting up to a 60Hz-72Hz signal? AFAIK, displays only accept up to 72Hz signals, regardless of how much the gpu is rendering. Whether this caps what gsync does at framerates above that, I'm not sure.
7. One thing I don't understand is how they managed to do all this and REDUCE input lag. I do understand that an input shows up on screen as soon as it happens, unlike before, where Vsync would only display the result of an input after the full frame was complete (once every 1/60th of a second, and if the input happens partway through a frame, you have to wait out that frame and then some). I'm definitely in the dark on this (I've put some rough numbers on it after the list).
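To make point 2 concrete, here's a rough sketch (Python, just to show the idea) of how a "repeat the frame to stay above a flicker floor" scheme could work. The 60Hz floor, the 144Hz cap, and the function name are all numbers/names I made up for illustration, not anything Nvidia has published.

```python
import math

FLICKER_FLOOR_HZ = 60    # assumed: below this the panel would visibly flicker
PANEL_MAX_HZ = 144       # assumed panel maximum refresh rate

def refresh_for_fps(fps):
    """How many times to scan each frame so the panel stays at/above the flicker floor."""
    if fps >= FLICKER_FLOOR_HZ:
        return 1, min(fps, PANEL_MAX_HZ)         # one scan per frame is already flicker-free
    repeats = math.ceil(FLICKER_FLOOR_HZ / fps)  # e.g. 45fps -> 2 scans per frame
    return repeats, min(fps * repeats, PANEL_MAX_HZ)

for fps in (24, 30, 45, 55):
    repeats, hz = refresh_for_fps(fps)
    print(f"{fps}fps -> scan each frame {repeats}x at {hz}Hz")
```

That reproduces the 45fps -> 90Hz case from point 2 (my plasma lands on 96Hz for 24fps content by using 4 scans per frame instead of the 3 this sketch would pick).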
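For the motion-interpolation guess in point 5, this toy example only shows what I mean by creating a frame between two real frames. Real TV interpolation uses motion-vector search rather than a plain blend, and whether gsync does anything like this at all is pure speculation on my part.

```python
def interpolate(frame_a, frame_b, t=0.5):
    """Blend two frames (here just lists of pixel values) at position t between them."""
    return [(1 - t) * a + t * b for a, b in zip(frame_a, frame_b)]

frame_a = [0, 10, 20, 30]   # toy 4-pixel "frame"
frame_b = [10, 20, 30, 40]  # the next real frame
print(interpolate(frame_a, frame_b))  # [5.0, 15.0, 25.0, 35.0] -> the made-up in-between frame
```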
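And some back-of-the-envelope numbers for the input-lag question in point 7. The 10ms render time is just an assumed example value; the point is only that a fixed 60Hz Vsync grid makes a finished frame wait for the next boundary, while a variable refresh could (in principle) scan it out right away.

```python
REFRESH_HZ = 60
refresh_interval_ms = 1000 / REFRESH_HZ   # ~16.7ms between allowed flips with Vsync
render_time_ms = 10                       # assumed: time the GPU needs for the frame

# Vsync: the frame finishes 10ms into the interval, then sits waiting for the next flip
vsync_wait_ms = refresh_interval_ms - (render_time_ms % refresh_interval_ms)
print(f"Vsync:            {render_time_ms + vsync_wait_ms:.1f}ms from input to scan-out")

# Variable refresh: scan out as soon as the frame is done, no fixed boundary to wait for
print(f"Variable refresh: {render_time_ms:.1f}ms from input to scan-out")
```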
Guys, this is just assumptions from someone who only knows about TVs, and less about computer displays. I assume that 72Hz is the HIGHEST refresh rate sold domestically, and that all the 120Hz-144Hz displays are just doubling those signals for reduced blurring, not actually displaying native 120+ signals.