Head-Fi.org › Forums › Misc.-Category Forums › Video Games Discussion › Mad Lust Envy's Headphone Gaming Guide: (Update: 10/15/2014: Beyerdynamic T51i Added)


post #25681 of 28011
Quote:
Originally Posted by IBIubbleTea View Post

144hz, Do you even? 
Wish my korean ips overclocked reliably. My new incoming G sync Asus ROG swift should fit the bill tho lol. That battlefield friends vid is pretty funny.
post #25682 of 28011
Quote:
Originally Posted by xero404 View Post


Wish my korean ips overclocked reliably. My new incoming G sync Asus ROG swift should fit the bill tho lol. That battlefield friends vid is pretty funny.

Are there more than one brand available for G sync yet?  I want it, but I'm not interested in paying more simply because Asus is the only one making it...  Oooh, maybe they'll have some to play with at the Nvidia booth at PAX!

post #25683 of 28011

Quote:

Originally Posted by xero404 View Post


Wish my korean ips overclocked reliably. My new incoming G sync Asus ROG swift should fit the bill tho lol. That battlefield friends vid is pretty funny.

Lucky! If only I had a job, I'd blow my money like that...

 

Quote:

 

Originally Posted by Stillhart View Post
 

Are there more than one brand available for G sync yet?  I want it, but I'm not interested in paying more simply because Asus is the only one making it...  Oooh, maybe they'll have some to play with at the Nvidia booth at PAX!

Of course ASUS isn't the only company partnered with NVIDIA on G-Sync technology: http://www.geforce.com/hardware/technology/g-sync/faq

 

"Q: What display companies are planning on introducing G-SYNC monitors?

A: Many of the industry’s leading monitor manufacturers have already included G-SYNC in their product roadmaps for 2014. Among the first planning to roll out the technology are Acer, AOC, ASUS, BenQ and Philips."

 

I'm really looking forward to FreeSync, as it's not supposed to cost as much as G-Sync and might be for everyone later on.

 

 

 

post #25684 of 28011

http://www.pcworld.com/article/2466180/lg-to-unveil-curved-ultrawide-monitor-at-ifa.html I really want the curved one. I just hope its refresh rate and input lag are decent. 

post #25685 of 28011
Thread Starter 
I'm gonna go off topic here, because I'm interested in gsync myself.

Has the way gsync truly works been published yet? I'm no techie, but even so, I believe I have it figured out. I literally thought it out in my head for like... an hour, for no reason.

A 120hz display is necessary, and backlight scanning tricks don't work at the same time (based off the LinusTechTips video of the Asus monitor). My deduction/hypothesis:

1. As Nvidia stated, Gsync will only go up to the display's max refresh rate.

2. In order for smooth playback when fps dips below 60, the display is actually speeding up and doubling the image. I.e.: if fps dips to 45fps, the display is refreshing at 90hz to show the 45fps image twice. The reason I believe this is 100% true is because IF the display dropped to a native 45hz to show 45fps without stutter, you would get some pretty unbearable flickering (try watching a movie in 48hz mode on a plasma... the light flicker is unbearable). Going over the 60hz line ensures that flicker is reduced. I took my plasma's 96hz mode for 24hz content as a sort of example. What my TV does for 24hz Blu-ray content is repeat each frame four times (24fps x 4 = 96hz) to get that film cadence without the crazy flicker or the 3:2 pulldown judder that happens when 24fps content plays at 60hz.

3. The reason backlight scanning (which effectively makes a display behave like it's refreshing at double its 120hz refresh rate, reducing motion blur) won't work in conjunction with gsync is that backlight scanning (for right now) is timed to the display's fixed, native refresh rate, so gsync's sporadic, varying refresh could at times clash with the backlight scan and mess up the image (this is my assumption). I have heard that gsync monitors have BOTH gsync and backlight scanning, but you can only use one or the other, leading me to believe my theory is accurate.

4. The reason gsync only smooths out lower-than-60fps down to a certain point is that if the framerate gets too low, the screen would display the frames before and after AT THE SAME TIME. That's why there are reports of ghosting once fps goes below a certain point (I believe it was low 40s or less).

5. (THIS IS PROBABLY WRONG, BUT I THINK IT'S POSSIBLE.) So why does slightly-below-60fps, down to the high 40s, still look like 60? This is the tricky part, but I believe they are using motion interpolation to fill in the gap between the current fps and the native signal. A sort of interpolation like the one currently used in basically all modern TVs that makes content appear more live-like: movies looking like behind-the-scenes footage, games that run at 30fps looking more like 60fps. Motion interpolation is when a frame gets created between 2 actual frames, which is why 30fps can look like 60fps at times. It isn't perfect, however. Anyway, I believe this may be part of the gsync equation and why they can make 45fps (for example) appear to look like 60fps.

6. The one main thing I don't understand is how framerates ABOVE 60fps are handled. Is the display refreshing at the higher framerates, up to 120-144hz (whatever the display natively runs at), despite most displays only accepting up to a 60hz-72hz signal? AFAIK, displays accept up to 72hz signals only, regardless of how much the gpu is rendering. Whether this caps what gsync does at framerates above that, I'm not sure.

7. One thing I don't understand is how they managed to do all this and REDUCE input lag. I do understand that inputs are displayed as soon as they happen, unlike before, where Vsync would only display an input command after the full frame was complete (once every 60th of a second, and if the input happens in between frames, you have to wait a frame and then some). I'm definitely in the dark on this.

Guys, these are just assumptions from someone who only knows about TVs, and less about computer displays. I assume that 72hz is the HIGHEST refresh rate signal accepted by domestically sold displays. All the 120hz-144hz displays are just doubling those signals for reduced blurring, not actually displaying native 120+ signals.
Edited by Mad Lust Envy - 8/27/14 at 1:50am
post #25686 of 28011

It's much simpler than that: they're just matching the monitor's refresh rate to the game's framerate, basically the opposite of what V-sync does.
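That contrast can be sketched in a few lines. This is a toy timing model with illustrative numbers (a 60hz panel is assumed), not NVIDIA's actual implementation:

```python
import math

def vsync_display_time(render_done, refresh_hz=60):
    """V-sync: a finished frame waits for the next fixed refresh tick."""
    tick = 1.0 / refresh_hz
    return math.ceil(render_done / tick) * tick

def gsync_display_time(render_done):
    """G-Sync (per the description above): the monitor refreshes
    as soon as the frame is ready, so there is no waiting."""
    return render_done

# A frame that finishes rendering 20 ms in waits until the next
# 60 Hz tick (~33.3 ms) under v-sync, but shows immediately with G-Sync.
print(vsync_display_time(0.020))
print(gsync_display_time(0.020))
```

The v-sync wait is where the extra latency (and, when a frame misses a tick, the stutter) comes from.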

post #25687 of 28011
Thread Starter 
I know that. However, I don't see it being that simple. You're not understanding what I'm saying. People have been saying that even framerates as low as the mid 40s are as smooth as if they played at 60fps.

And again, going off what I know about 48hz... if the display matched the mid-40s fps, you would get REALLY bad flickering (seen on projector screens and plasmas when displaying low-fps content natively at 48hz). Trust me on that. I have not heard one report of screen flicker with gsync monitors, meaning it's probably matching the signal's frequency, and if it's below 60fps, it is probably doubling it to avoid flickering.

A few examples of what I think gsync is doing...

30fps is shown twice at 60hz (this has been normal display behavior since forever).

43fps is being displayed twice at 86hz.

50fps is displayed twice at 100hz.

Only once the fps hits above 60fps/72fps does it start displaying each image once. So for example: 85fps is shown at 85hz. This is my theory. Once it hits the monitor's limit of 120/144hz, it won't show any more frames, regardless of the gpu rendering 648479fps or whatever.
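My theory, written out as a little function. To be clear, this encodes the guesses above (the 60fps doubling threshold and a 144hz panel cap), not confirmed G-Sync behavior:

```python
def theorized_refresh(fps, max_hz=144):
    """Hypothesis from this post, NOT confirmed G-Sync behavior:
    below 60 fps each frame is shown twice at double the rate to
    avoid flicker; from 60 fps up to the panel's maximum the refresh
    simply matches the framerate; beyond that it caps out."""
    if fps < 60:
        return fps * 2, 2      # (refresh rate, times each frame is shown)
    elif fps <= max_hz:
        return fps, 1
    else:
        return max_hz, 1       # extra rendered frames are never shown

print(theorized_refresh(43))      # (86, 2)
print(theorized_refresh(50))      # (100, 2)
print(theorized_refresh(85))      # (85, 1)
print(theorized_refresh(648479))  # (144, 1)
```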

As display enthusiasts know, 120hz displays have, since their inception, only accepted signals up to 60hz and just doubled that image. They have NOT accepted native 120hz signals.

Whether this has changed recently, to actually ALLOW the full 120fps to be fully displayed... I don't know.

That brings up the question of how the Hobbit at 48fps hasn't been known to flicker. I assumed that in order to avoid flickering, the image is shown twice at 96hz. I thought there was no conceivable way 48fps could be shown natively without people complaining about flicker. That is, unless native 48fps, without a 24fps signal being doubled, doesn't flicker. Then all my assumptions are wrong. I KNOW the Hobbit was shot at 48fps. That doesn't mean the projector displaying the movie itself isn't just doubling that to reduce flicker. I don't even know any more. It's new territory, no matter how you slice it.

I really wanna know what causes gsync ghosting once fps gets too low. Again, I assume it's because the display is showing the frames before and after at the same time. I would totally test gsync with Dxtory and limit the framerate to specific values to see where the ghosting starts occurring.
Edited by Mad Lust Envy - 8/27/14 at 3:17am
post #25688 of 28011
Thread Starter 
What I failed to realize is that 48hz modes are ONLY in projectors and plasmas. Also, something like the Hobbit at 48fps was shot with DIGITAL cameras, meaning it's not reliant on the same capture method as film, which was light-based. That's probably the reason why there is no flickering...

I'm an idiot.

Even so, it doesn't explain how a low framerate with gsync on an LCD/LED manages to look as smooth as 60fps.

The technology intrigues me.
Edited by Mad Lust Envy - 8/27/14 at 3:25am
post #25689 of 28011
Quote:
Originally Posted by Mad Lust Envy View Post
As display enthusiasts know, 120hz displays have, since their inception, only accepted signals up to 60hz and just doubled that image. They have NOT accepted native 120hz signals.

 

Maybe in years-behind-the-times HDTV land, that's the case due to HDMI bandwidth limitations, but PC monitors have been true 120 Hz and beyond since the days of CRTs. It's part of the reason I'm so fond of my GDM-FW900.

 

As Yethal pointed out, you're really overthinking this. G-SYNC basically lets the GPU control the monitor refresh cycles instead of having to time its frames to the monitor. It only repeats frames in the fashion you're thinking if the framerate drops below 30 FPS. Refresh rate mishmashes are no longer a problem because with G-SYNC, all refresh rates in the range of 30 Hz to the monitor's maximum are effectively native, no interpolation needed. That's the entire point.

 

While you're concerned about flicker, I'm pretty sure G-SYNC is designed with persistent display types like LCD in mind. The results on a CRT, if you couldn't maintain 80-90 FPS, would be downright headache-inducing, never mind that G-SYNC compatible CRTs will never exist.

 

Also keep in mind you're looking at this from an HDTV standpoint, where G-SYNC may very well never exist in the market if they can't already get 120 Hz video inputs right for PC use. This is something exclusively for PC gamers who are likely buying monitors to match their super-expensive graphics cards' capabilities. They already had real 120 Hz, 144 Hz, even 160 Hz, but now they don't have to worry about screen tearing (when two frames are partially displayed during a monitor refresh cycle) or the latency of V-syncing to the monitor's refresh rate.
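A sketch of the behavior described in this post: every framerate from the 30hz floor to the panel maximum is driven natively, frames only repeat below the floor, and anything above the cap is simply limited. The exact repeat logic below 30fps is my assumption based on the description above, not NVIDIA documentation:

```python
def gsync_refresh(fps, max_hz=144, min_hz=30):
    """G-Sync as described in this post: refresh rates from the
    monitor's 30 Hz floor up to its maximum are effectively native,
    no interpolation. Below the floor, each frame is repeated enough
    times to keep the panel in its working range (assumed detail)."""
    if fps >= min_hz:
        return min(fps, max_hz), 1
    repeats = -(-min_hz // fps)   # ceiling division
    return fps * repeats, repeats

print(gsync_refresh(45))   # (45, 1): 45 fps is driven natively
print(gsync_refresh(200))  # (144, 1): capped at the panel maximum
print(gsync_refresh(20))   # (40, 2): below the floor, frames repeat
```

Note how this differs from the doubling theory earlier in the thread: between 30 and 60 fps nothing is doubled at all.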

post #25690 of 28011
Thread Starter 
Yeah, I corrected myself. My main brainfart was that I was considering flicker, forgetting that's not an LCD issue. That's what I get for being so pro-plasma all the time, lol.

WOW, REAL 120HZ. So much want. I don't see how it can look better than 60fps from a motion standpoint, but I do see the benefits in terms of blur reduction (I know the benefit is clearly evident, especially with backlight scanning displays that effectively multiply the blur-reduction properties of faster refreshes). Gone are the days when LCD/LED were a blurry mess, lagging behind plasma. 120hz TVs with effective backlight scanning can display the full motion res... though so far, doing this adds SEVERE input lag.

It's too bad, since LCD in its best modes for input lag is blurry as heck at 300 lines, while the worst plasma still does at least 700.

Of course, I'm referring to televisions. I don't know much of anything about monitors and their specific properties, especially in terms of high refreshes.
Edited by Mad Lust Envy - 8/27/14 at 4:00am
post #25691 of 28011

Hey, guys.

 

I just ordered the headphone and an amp. In the end I went for the K612 Pro + FiiO E12. Since you can use this amp with an iPhone and such (I can't stand the stock earbuds with no amp), it's a great extra. I won't have the amp until next week, so I can take my time trying the headphone with no amp and seeing the difference.

 

Thank you all for the help, especially to OP for this great guide.

post #25692 of 28011

I think Gsync shows the image at whatever rate the frames are being processed at. If the PC drops fps as the action hots up, with loads of explosions happening at once, Gsync drops its refresh rate to match the PC's fps!

post #25693 of 28011
Quote:
Originally Posted by Mad Lust Envy View Post

Yeah, I corrected myself. My main brainfart was that I was considering flicker, forgetting that's not an LCD issue. That's what I get for being so pro-plasma all the time, lol.

WOW, REAL 120HZ. So much want. I don't see how it can look better than 60fps from a motion standpoint, but I do see the benefits in terms of blur reduction (I know the benefit is clearly evident, especially with backlight scanning displays that effectively multiply the blur-reduction properties of faster refreshes). Gone are the days when LCD/LED were a blurry mess, lagging behind plasma. 120hz TVs with effective backlight scanning can display the full motion res... though so far, doing this adds SEVERE input lag.

It's too bad, since LCD in its best modes for input lag is blurry as heck at 300 lines, while the worst plasma still does at least 700.

Of course, I'm referring to televisions. I don't know much of anything about monitors and their specific properties, especially in terms of high refreshes.

Most of your analysis sounds right to me. Here is a good article on "how it works":
http://www.anandtech.com/show/7582/nvidia-gsync-review

I recommend looking at the graphical representations of what is going on, but here's a quick quote that summarizes:

"G-Sync works by manipulating the display’s VBLANK (vertical blanking interval). VBLANK is the period of time between the display rasterizing the last line of the current frame and drawing the first line of the next frame. It’s called an interval because during this period of time no screen updates happen, the display remains static displaying the current frame before drawing the next one."

I believe this is pretty consistent with what MLE has written above, just with different words and graphs. Still doesn't explain what happens when FPS > refresh rate.
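The quoted VBLANK mechanism can be modeled in a few lines: the display sits in an extended blanking interval, holding the current frame until the GPU delivers the next one, limited only by how fast the panel can physically refresh. This is a toy model of the quoted description with illustrative numbers, not a real driver:

```python
def frame_display_schedule(frame_ready_times, max_hz=144):
    """Toy model of G-Sync's VBLANK manipulation: each frame is shown
    as soon as it arrives, except the panel cannot refresh faster than
    its maximum rate. Input and output times are in seconds."""
    min_interval = 1.0 / max_hz
    schedule = []
    last_refresh = float("-inf")
    for t in frame_ready_times:
        # hold (extended VBLANK) until the frame arrives, but never
        # refresh sooner than the panel's minimum refresh interval
        refresh_at = max(t, last_refresh + min_interval)
        schedule.append(refresh_at)
        last_refresh = refresh_at
    return schedule

# Irregular frame times are displayed essentially as they arrive;
# only the last one here gets pushed back slightly by the 144 Hz cap.
print(frame_display_schedule([0.000, 0.013, 0.035, 0.041]))
```

This also shows where the FPS > refresh-rate question comes in: once frames arrive faster than `min_interval`, something has to give (a cap, tearing, or v-sync-like waiting).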

A good impression from someone at TweakTown who I assume is running 2x 780 Ti (or similar):
http://www.tweaktown.com/articles/6608/hitting-the-g-spot-with-nvidia-s-g-sync-on-the-asus-rog-swift-pg278q/index3.html

A LOT of this sounds somewhat subjective. It is difficult to quantify "smoothness," but the reviewer above, who previously could only maintain 80-100 fps on a 120hz monitor, claims that G-Sync at 144hz in Battlefield 4 on whatever system he's got makes a meaningful difference in his gameplay experience.

Here's another article that appears to be claiming g-sync helps even if you are already rendering high FPS (FPS>refresh):
http://techreport.com/review/26870/asus-rog-swift-pg278q-g-sync-monitor-reviewed/3

Here you can see that Guild Wars 2 rendering at 240fps on a 144hz monitor looks "smoother" with G-Sync on a frame-by-frame basis.
post #25694 of 28011
Quote:
Originally Posted by IBIubbleTea View Post
Originally Posted by Stillhart View Post
 

Are there more than one brand available for G sync yet?  I want it, but I'm not interested in paying more simply because Asus is the only one making it...  Oooh, maybe they'll have some to play with at the Nvidia booth at PAX!

 

Of course ASUS isn't the only company partnered with NVIDIA on G-Sync technology: http://www.geforce.com/hardware/technology/g-sync/faq

 

"Q: What display companies are planning on introducing G-SYNC monitors?

A: Many of the industry’s leading monitor manufacturers have already included G-SYNC in their product roadmaps for 2014. Among the first planning to roll out the technology are Acer, AOC, ASUS, BenQ and Philips."

 

I'm really looking forward to FreeSync, as it's not supposed to cost as much as G-Sync and might be for everyone later on.

 

That's not what I was asking.  I know there are more companies partnered, but last time I checked there were only like 2 models available for purchase, and they were both from Asus and both extremely overpriced.  A little Amazon research shows that BenQ and Philips both have models out now too, and Asus prices are a lot more reasonable these days.  That's pretty good news...

 

And it looks like Asus has an IPS panel with G Sync in the works too.  Drool...

 

Now if they would start making 19x12 monitors again instead of 19x10, I wouldn't have to DOWNGRADE just to get a newer monitor.  

post #25695 of 28011
Thread Starter 
What I want to know is what happens when, say, a game runs at 32fps. I mean, I'm still not convinced that the display is matching such a low fps, since even on an LCD, that can't possibly be a comfortable refresh for our eyes. This is why I still believe it is doubling such a low framerate and running the refresh at 64hz instead.