Mad Lust Envy's Headphone Gaming Guide: (8/18/2022: iFi GO Blu Review Added)
Mar 17, 2016 at 9:05 AM Post #34,606 of 48,562
I would've agreed with you years ago on motion resolution, but now that LCD and LED sets have strobing backlight modes and/or black frame insertion, they effectively do motion just as well as CRTs.

Seriously, I tested an M-series Vizio and a Sony TV with the tech, and OMG, not even my older Panasonic ST30 plasma, which had better motion res than the VT60, looked as sharp in motion. These LCDs basically render an image in motion as sharp as a stationary image. No blurring whatsoever, with LESS input lag. I used to own an older Samsung LCD that did 240Hz without the interpolation, but it added a ridiculous amount of input lag. Still, it had equal or better motion res than plasma for non-gaming purposes.

The best TV sets do this without that horrible motion interpolation that de-judders the image.

Monitors do it too now, via ULMB (Ultra Low Motion Blur), or whatever each manufacturer's branding calls it.

I'd really like a G-Sync monitor, but I'm holding out for when both ULMB and G-Sync can be enabled at the same time (at the moment, they can't).
 
Mar 17, 2016 at 9:18 AM Post #34,607 of 48,562
Well, I had 3 G-Sync monitors here in the last month and had to send them back. Asus and Acer QA is horrible. I know a few people who went through 5-12 replacements.
 
Other than the horrible BLB and/or yellow tone, the motion was simply awesome, and even better than the already-great 144Hz experience was VRR with G-Sync. No more V-Sync judder from double/triple buffering, no more tearing.
 
Gonna try it again in autumn. Maybe there'll be less panel lottery involved in buying such a monitor by then. Going back to 60Hz and more blur was a pain for the first few hours.
 
Mar 17, 2016 at 9:27 AM Post #34,608 of 48,562
Yeah, I've been reading up a lot on G-Sync monitors lately, and I totally understand the tech. I just don't like that using G-Sync turns off ULMB, meaning you're dealing with native refreshes at 144Hz or less. Not that 144Hz is bad (that's almost as good as an average plasma in motion), but when ULMB can do better than plasma motion, you want that kind of clarity at all times. G-Sync dropping to 80Hz or less, for example, is essentially going back to a standard display's motion res.

I have no doubt that they'll find a way to run ULMB and G-Sync at the same time. It would probably need a 240Hz display, so that each frame of a 120fps G-Sync stream is followed by a black frame insertion, effectively giving 120fps the clarity of 240Hz. I figure 240Hz native displays will be a thing in a few years. 240Hz is basically blur-free, and will also help G-Sync gaming when fps fluctuates below 120.
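To put rough numbers on that idea (a back-of-the-envelope sketch of sample-and-hold persistence, not measured data from any real panel):

```python
# Rough persistence arithmetic for black frame insertion (BFI).
# Assumption: perceived motion blur on a sample-and-hold display
# scales with how long each frame stays lit.

def persistence_ms(refresh_hz: float) -> float:
    """Time each refresh slot is visible, in milliseconds."""
    return 1000.0 / refresh_hz

# Plain 120Hz sample-and-hold: each frame is lit for a full 120Hz slot.
full_hold = persistence_ms(120)   # ~8.33 ms per frame

# 120fps on a 240Hz panel with every other refresh black:
# each frame is only lit for one 240Hz slot, halving persistence.
with_bfi = persistence_ms(240)    # ~4.17 ms per frame

print(f"120Hz sample-and-hold: {full_hold:.2f} ms lit per frame")
print(f"120fps + BFI at 240Hz: {with_bfi:.2f} ms lit per frame")
```

That halving of lit time per frame is the whole trick: the content is still 120fps, but motion clarity matches a 240Hz hold time.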

My VT60 has a very awesome motion smoother without interpolation that gives it full motion res, but the input lag spikes up to something along the lines of 120ms. Absolutely worthless for playing games with that much lag. That Vizio and Sony were in the 20ms range, which is nearly imperceptible. 10ms is basically instant.

Before being an audio guy, I was a TV guy. :D
 
Mar 17, 2016 at 9:35 AM Post #34,609 of 48,562
240Hz would be awesome, and is even speculated for 2017, but the thing with all this new tech is: WHO FINALLY BUILDS A WORTHWHILE MONITOR WITH GOOD QA? Eizo monitors are my favorite, always, but they won't implement G-Sync in the near future, or later.
 
Sorting out coil whine on high-end cards is another problem. My graphics card is whine-free up to 300fps, thank God, but that's a lottery too. Some even start coil-whining at 100fps.
 
Mar 17, 2016 at 9:35 AM Post #34,610 of 48,562


Kinda hoping for BenQ to have a successor to theirs, since they were essentially the only guys at the time with both DisplayPort AND HDMI inputs. That way I can hook up my PS4 to it and take advantage of their strobing backlight mode, which is separate from the G-Sync module.

As for GFX cards, I'm holding out for Pascal to hit with HBM2 memory. The GTX 1080 is rumored to use GDDR5X, which is a letdown. HBM2 is the future, but I don't trust AMD with their driver support. Still salty that I bought my M17x R4 with 7970s THREE TIMES AND THEY WOULDN'T WORK RIGHT. I swapped for a 680M, and voila, perfect. Nvidia gets my vote.
 
Mar 17, 2016 at 9:40 AM Post #34,611 of 48,562
Yeah, HBM2 is set for 2017 anyway, as current yields would lead to $2000 Titan/Ti cards. Even fewer people would buy that. 16nm FinFET is DOUBLE THE PRICE of 28nm.
 
Probably have to skip Pascal anyway, as I use a Titan X with a custom cooler.
 
Mar 17, 2016 at 9:44 AM Post #34,612 of 48,562
So tell me how my 3-year-old TOTL Panasonic VT60 plasma's HDMI board burned out (no HDMI inputs work). This is a $200 part that is basically impossible to find now; plasmas haven't been sold in years.

I was in a panic... then I remembered that HDMI-to-component converters are sold, and my component input still works. So... yeah, $60, and I trade digital quality for analog, but hey, my TV isn't completely dead. Hopefully the picture quality loss isn't major. I don't remember component looking bad the last time I used it with my PS3 for testing.

FYI: don't try overclocking your plasmas. They don't take kindly to that, unlike LCD displays...

My 60Hz laptop display is overclocked to 80Hz, and it makes a pretty big difference. Especially when half-framerate is 40fps, which is considerably smoother than 30 without the GPU requirements of 60.

The exact same thing happened to me; mine was caused by a power surge. If you figure out a DIY fix, let me know. I'm hoping the board won't be fried, and perhaps a touch-up on the solder might fix it.
 
Mar 17, 2016 at 9:58 AM Post #34,613 of 48,562
Hmm, you may be onto something. I overclocked my plasma, and it wasn't damaged. But my TV messed up an hour or so AFTER my Creative X7 basically went nuts and almost blew out my eardrums and HE400, where 2% volume and 2% software volume (basically the weakest of the weakest normal volume possible) sounded like the equivalent of high gain + volume at 3x the max. Yeah, it was that bad. I thought my ears were a goner.

So perhaps it wasn't the X7 or TV acting up, but a sudden surge at home.

My A board is so damn hard to find, and so expensive. ;__; But $60 for an alternative fix is ok.

Feggy, I'm super jealous. I've been eyeing those custom 980 Tis for a PC build once my computer dies, but I was hoping my laptop lasts at least one more year, since next year is basically the big leap in GFX card tech.

Last time I checked on Titan Xs, they only sold reference cards. I gotta see how custom ones fare against the best 980 Ti cards. I know the X has double the memory, but I thought the 980 Ti outperformed it under normal use.
 
Mar 17, 2016 at 10:15 AM Post #34,614 of 48,562
At reference clocks the Titan X is faster. Since Tis can come with custom coolers (and thus always offer a factory overclock), they have an easier time going higher on air. Many Titan X guys use watercooling, and in a few games that extra VRAM comes in handy at higher resolutions.
 
I bought my Titan X for a Ti price, so I didn't hesitate. Otherwise I would've bought a Super Jetstream or EVGA Ti. Putting on a custom cooler was a bit involved, though. I put an Accelero Xtreme IV on it; took me more than an hour.
 
Mar 17, 2016 at 10:17 AM Post #34,615 of 48,562
I'm too scared to DIY stuff like that. AIO cooler is as far as I go, and if it's not an easy bolt on, I won't even do it. Just bad with my hands for that sort of thing.

Should've seen me butcher my Audeze vegan pads to fit on my HE400 directly. I was so frustrated to the point where I almost ripped them off in anger.
 
Mar 17, 2016 at 10:25 AM Post #34,616 of 48,562
I was kinda scared too, so I was extra careful. However, the only thing you can do wrong is cutting and aligning the protection film incorrectly. The most critical part is drawing the lines for the cutout holes that need to let the thermal pads through (which you have to position properly) and cutting them carefully. You only get one film. Cut it badly and you can forget the backplate and reorder a film, unless you want to fry your card via an electrical short.
 
As long as that film is cut out safely and is neither overlapping nor underlapping, everything is OK, and everything else is a cakewalk. It just takes a bit of time to clean everything properly before you do the main work.
 
Mar 17, 2016 at 10:39 AM Post #34,618 of 48,562

EVGA sells a Hybrid-cooled Titan X (basically the reference cooler, which is actually a pretty good cooler, with an AIO water cooler attached). It's the only non-waterblock custom card. The reference cooler handles the VRAM while the AIO cools the actual chip.
 
I used to have a Titan X, then a 980 Ti (no point in owning a Titan X when I can just overclock a 980 Ti), and then a Fury X (ultrawide FreeSync is a lot easier to get a hold of). Oddly, the Fury X is performing the best in a lot of cases, but that's only because Nvidia's drivers have been going ass-backwards lately.
 
Mar 17, 2016 at 10:44 AM Post #34,619 of 48,562
It's also because the Fury X has HBM, which is superior to GDDR5 (at least I hear it makes a difference) and which I hear works best at the upper tier of resolutions, IIRC.

I've just had bad experiences with AMD, and I feel G-Sync is superior to FreeSync, particularly when things get rough. FreeSync doesn't double refresh rates at low fps while G-Sync does, which I feel AMD needs to address. Of course, since you've got some pretty beefy gear, that won't be an issue. But then again, there are games like Rise of the Tomb Raider that have been known to humble almost everyone's gear. Lol.
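The frame-doubling trick being described (repeating frames when fps falls below the panel's minimum variable-refresh rate, what AMD later branded Low Framerate Compensation) can be sketched roughly like this; the 40-144Hz VRR window here is a made-up example, not any real panel's spec:

```python
# Sketch of low-framerate frame doubling: when the game's fps drops
# below the panel's minimum VRR rate, repeat each frame enough times
# that the effective refresh lands back inside the VRR window.
# VRR_MIN/VRR_MAX are hypothetical values for illustration only.

VRR_MIN, VRR_MAX = 40, 144

def compensated_refresh(game_fps: float) -> float:
    """Refresh rate the panel actually runs at after frame repetition."""
    if game_fps >= VRR_MIN:
        return min(game_fps, VRR_MAX)  # already inside the window
    multiplier = 2
    while game_fps * multiplier < VRR_MIN:
        multiplier += 1
    return game_fps * multiplier

print(compensated_refresh(100))  # inside the window: unchanged, 100
print(compensated_refresh(25))   # doubled to 50 Hz
print(compensated_refresh(12))   # quadrupled to 48 Hz
```

Without this, a panel whose VRR floor is 40Hz simply falls back to fixed-refresh behavior (tearing or stutter) whenever the game dips below 40fps, which is exactly when you'd want VRR most.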
 
Mar 17, 2016 at 10:57 AM Post #34,620 of 48,562

 
I completely understand. I had a horrid experience with my 295X2 when I tried that, but it was mostly due to the company selling the card, not AMD itself. Their drivers have improved tremendously; a ROTR driver update brought it entirely back up to snuff.
 
FreeSync's advantage is that it's built on a standard part of the VESA DisplayPort spec (Adaptive-Sync), so any manufacturer can implement it for free(ish). It's still new, though, and has the same problems G-Sync has already solved. G-Sync is the superior service, but then you have to wonder if the extra $200+ premium is worth the cost.
 
Oddly, going with the LG 21:9 that was only ~$300 has given me better QC than when I tried the panel lottery with the more expensive G-sync (and Freesync) panels.
 
