Mad Lust Envy's Headphone Gaming Guide: (8/18/2022: iFi GO Blu Review Added)
Jul 21, 2016 at 6:32 PM Post #36,016 of 48,579
Sure, but I don't think they call them DACs. All I know is the Sony PS3 was considered a superior Blu-ray player to most of the competition for video quality, until Oppo took the throne. And there's always exotic super-$$$ items that jockey for the "best quality" position.

The signal decoder built into a TV is also quite important... Maybe that's the RAMDAC that Nameless is referring to? And is 400 MHz kind of like audio's 24-bit/192 kHz sampling rate?
Speaking of which, would you say getting an Oppo Blu-ray player would be a noticeable upgrade over my PS3 for the price?
 
Jul 21, 2016 at 8:10 PM Post #36,017 of 48,579
Speaking of which, would you say getting an Oppo Blu-ray player would be a noticeable upgrade over my PS3 for the price?

Ahhh, pbbbbt, that's one of those subjective value things. Yes, the Oppo Blu-ray player is better in a technical and measurable way, but it's up to you to decide if that will increase your enjoyment. I'm not a videophile; I loved going straight from a DVD collection to HD movie downloads (iTunes and Amazon).
 
Jul 21, 2016 at 10:04 PM Post #36,018 of 48,579
I agree, being the first to discover a good value (and write a review) is really fun. Maybe get the headphone instead of paying $700 for a 9% GPU power increase? That GTX 980 should last you for YEARS.

Where the hell would you find a GTX 980 for less than $200, besides secondhand from your brother or something? Lol, I'll buy yours for $190 if you don't want it!

I have a 980 Ti, not a regular 980 (which I did have before and sold for a good price to get the Ti). But $200 GTX 980s roam eBay, used of course.
 
Q8200 clocked at 2.94 GHz (can't do more), GTX 660 Ti aaaand... Windows Vista.
 
Luckily I have some money put aside, so I'll soon have a very respectable rig.

Man, at least that's better than what I had about 4 years ago: a P4 at 3.0 GHz in a pre-built computer, a Sony Vaio lol. It had 2 GB of DDR, a 250 GB HDD, an Antec 380W PSU, and the first video card for it was a PNY 7600GS with 512 MB of GDDR2, I think. Then I upgraded to a BFG (yes, BFG! They were great) 7950GT with 256 MB of GDDR3, I think, which did pretty well running COD MW when it released. Then I upgraded to a PNY GeForce GTS 250 (aka the 9800 GTX or something like that). I was still a noob in terms of PC components and what was the latest.
I was still running Windows XP at the time as well.
 
 
   
Well, when you have 21" to 24" FD Trinitrons that were top-of-the-line professional graphics monitors in their day, acquired for rather low prices, it tends to spoil you rotten. No input lag, perfect viewing angles, black levels that are BLACK, high refresh rates, no fixed native resolution for when you still like to fire up the classics... these things wipe the floor with LCDs, and I'm just waiting for OLED monitors to become more widespread.
 
My only issue is that the flyback transformer board or some other high-voltage component in my prized GDM-FW900 blew out over a year ago, and I need to get it fixed. Having to go back to 21" 4:3 in this day and age is rather painful with a lot of games.
 
But then I realize that the cost of fixing up the FW900 might be prohibitive when, to keep using it to its maximum potential with future graphics cards, I'd have to start shopping for a DisplayPort-to-VGA adapter that doesn't completely suck compared to the RAMDACs built into graphics cards up through NVIDIA's Maxwell generation. That most likely isn't going to happen until the HDFury 5 releases, the most prestigious in a line of video DACs that are NOT cheap.
 
Most DisplayPort-to-VGA adapters have a RAMDAC clocked at only about 230-260 MHz, you see. 400 MHz RAMDACs have been standard features in graphics cards for years. The FW900 could push resolutions and refresh rates that take advantage of such fast clocks. 'Nuff said.
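
To put rough numbers on that: the pixel clock you need is roughly active pixels times refresh rate times a blanking-overhead factor. A quick back-of-the-envelope sketch; the ~25% blanking figure and the two FW900-style modes here are just illustrative assumptions, since real GTF/CVT timings vary:

```python
# Rough pixel-clock estimate: active pixels x refresh rate x blanking overhead.
# The 1.25 blanking factor is an assumption; actual GTF/CVT timings vary.
def pixel_clock_mhz(width, height, refresh_hz, blanking_factor=1.25):
    return width * height * refresh_hz * blanking_factor / 1e6

# Example FW900-class modes (illustrative, not exact factory timings)
for w, h, hz in [(1920, 1200, 96), (2304, 1440, 80)]:
    print(f"{w}x{h}@{hz} Hz needs ~{pixel_clock_mhz(w, h, hz):.0f} MHz")
# ~276 MHz and ~332 MHz respectively: out of reach for a 230-260 MHz adapter,
# but comfortably within a 400 MHz RAMDAC.
```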
 
Meanwhile, you can get refurb Acer XB270HU monitors for $400 now. The quality's going to suck by comparison, but 2560x1440, 144 Hz and G-SYNC in particular make it a potential holdover 'til we get widespread OLED.
 
Oh, and sub-$200 GTX 980s? Maybe I should pick up a second one at that kinda pricing just for the hell of it, though I know full well that a lot of recent releases don't support multi-GPU at all due to incompatible rendering methods. I'd probably be better off saving the money for Volta.

Don't know much about monitors, but all I know is that CRTs were the schiit back then. The sub-$200 GTX 980s are used though. Plenty of them on eBay, might even find a few on Craigslist... especially since the GTX 1060 has been released and it's slightly faster in some games than the GTX 980. I would still go with the 980 for the extra CUDA cores if you do some video editing. Plus, who cares about saving energy when you are playing on your PC? We love powa!
 
Jul 21, 2016 at 11:48 PM Post #36,019 of 48,579
Sure, but I don't think they call them DACs. All I know is the Sony PS3 was considered a superior Blu-ray player to most of the competition for video quality, until Oppo took the throne. And there's always exotic super-$$$ items that jockey for the "best quality" position.

The signal decoder built into a TV is also quite important... Maybe that's the RAMDAC that Nameless is referring to? And is 400 MHz kind of like audio's 24-bit/192 kHz sampling rate?


No no, not a Blu-ray. Specifically a video DAC that takes a digital video signal such as HDMI and converts it to an analog one, such as VGA. I believe Nameless was referring to exactly such a device built into an adapter plugged into the GPU.
 
Jul 22, 2016 at 1:57 AM Post #36,020 of 48,579
   
Man, at least that's better than what I had about 4 years ago: a P4 at 3.0 GHz in a pre-built computer, a Sony Vaio lol. It had 2 GB of DDR, a 250 GB HDD, an Antec 380W PSU, and the first video card for it was a PNY 7600GS with 512 MB of GDDR2, I think. Then I upgraded to a BFG (yes, BFG! They were great) 7950GT with 256 MB of GDDR3, I think, which did pretty well running COD MW when it released. Then I upgraded to a PNY GeForce GTS 250 (aka the 9800 GTX or something like that). I was still a noob in terms of PC components and what was the latest.
I was still running Windows XP at the time as well.

When I bought my PC there were quite a few people saying that a quad core CPU was overkill. I'm glad I didn't listen because going quad core is one of the reasons I've been able to keep going this long (that and a couple of graphics card upgrades). It was a pretty decent machine back in 2008.
 
It's only really the last year or two that my PC hasn't been cutting it, and that's mostly because more and more games have been dropping support for Vista. Stuff like Battlefield 3 and Far Cry 3 I could run with a mix of high and medium settings and get 40-70 fps.
 
For my next machine I'm going i7 K series, 16 GB RAM and a GTX 1070 or 1080. Some people say either of those is overkill for a 1080p monitor, but a) it will do 144 Hz, so taking advantage of that when I can will be nice, and b) it should allow me to max out games for a decent amount of time before I have to upgrade.
 
Jul 22, 2016 at 3:02 AM Post #36,021 of 48,579
When I bought my PC there were quite a few people saying that a quad core CPU was overkill. I'm glad I didn't listen because going quad core is one of the reasons I've been able to keep going this long (that and a couple of graphics card upgrades). It was a pretty decent machine back in 2008.

It's only really the last year or two that my PC hasn't been cutting it, and that's mostly because more and more games have been dropping support for Vista. Stuff like Battlefield 3 and Far Cry 3 I could run with a mix of high and medium settings and get 40-70 fps.

For my next machine I'm going i7 K series, 16 GB RAM and a GTX 1070 or 1080. Some people say either of those is overkill for a 1080p monitor, but a) it will do 144 Hz, so taking advantage of that when I can will be nice, and b) it should allow me to max out games for a decent amount of time before I have to upgrade.

I have pretty much the same specs in my work machine: i7-6700, 16 GB of RAM and a GTX 1070. CryEngine 3 games maxed out go as high as 100 fps at 1080p.
 
Jul 22, 2016 at 3:49 AM Post #36,022 of 48,579
I have pretty much the same specs in my work machine: i7-6700, 16 GB of RAM and a GTX 1070. CryEngine 3 games maxed out go as high as 100 fps at 1080p.


It will be nice to be able to run new/newish games with authority again. It's sad when you start having to do the balancing act between performance and visuals.
 
You mention CryEngine. I'm currently playing Crysis 2 on the PS3 (I have a PS4, but I'm currently working through some of the backlog of games I have). I bought it in a sale a year or two back despite having the PC version, as sometimes I like to check out multiple ports of games (I'm one of the few people that check out comparison vids and Digital Foundry articles purely because I find it interesting rather than for bragging rights). It's pretty impressive for a PS3 title, though I personally think they should have stripped the graphics back a little bit more for the sake of the framerate. When things get hectic, the controls suffer.
 
Jul 22, 2016 at 4:35 AM Post #36,023 of 48,579
Underdeveloped ports are the price we pay for publishers' greed. And underoptimized console releases are the price we pay for developers' laziness.
 
Jul 22, 2016 at 7:30 AM Post #36,024 of 48,579
Underdeveloped ports are the price we pay for publishers' greed. And underoptimized console releases are the price we pay for developers' laziness.


Multi-platform development will likely always have its problems. I wouldn't call Crysis 2 a particularly bad port considering the lack of horsepower the PS3 has. It just seems like one of those cases where they prioritised graphics over performance, which is detrimental to an FPS.
 
I started out on Atari, but I was too young to be critical back then. It was when I got an Amiga that I started noticing differences between platforms and ports. The Amiga was very much a mixed bag when it came to ports. After the Amiga I moved to the Sega Mega Drive (Genesis) and stuck mostly with consoles for quite a long time. My brother bought a PC in 1998 but I never really got into it. I seriously regret that now, because the late 90s and early 00s were an awesome time for PC games.
 
From the mid-90s to the early 00s, console and PC had very different games to suit very different audiences. In a way that was great because it allowed each to play to its strengths. Nowadays the vast majority of games are, or easily could be, multi-platform, which kind of gives less reason to own one over another. The clear answer nowadays SHOULD be: if you want the best experience, get yourself a PC. But there are still cases of botched ports (Batman: Arkham Knight being particularly notable) and cases of content disparity (Mortal Kombat not getting all the content packs), so if you want to be sure of having the best experience, really you need to own more than one system. There's also the matter of exclusives, which is a whole other topic.
 
Jul 22, 2016 at 8:05 AM Post #36,025 of 48,579
Personally I'm against simultaneous multiplatform development. It's a waste of resources to have the development team split into different engineering teams for each platform, or for people to write code for several of them at once. For me the best approach is what Warframe and Rocket League did. First PC, then one console, then the other. Not all at once.
 
Jul 22, 2016 at 8:58 AM Post #36,026 of 48,579
Never listen to people that say X card is overkill for X resolution. What you're doing with 'overkill' is ensuring the card you get stays competitive for longer. It might be doing 140-200 fps now, but in 2-3 years it may be 40 fps (extreme case, I know, but still).

What is overkill now will be 'fine' next year.

Not to mention, 240 Hz monitors are coming quite soon.

I just want that new Titan X coming in less than 2 weeks. But that price is about as much as I wanna spend on my next PC sans video card. >.<

I've been EXTREMELY patient in waiting for new monitors to come out. It has to be at LEAST 1440p/HDR/144 Hz+/G-Sync.

But really, I've been waiting to hear about OLED monitors, which by the looks of it are 2017 territory, sadly.

As for me, I'm still on a 3630QM mobile i7 and a 680M, with 12 GB of DDR3/12800. :frowning2:
 
Jul 22, 2016 at 9:39 AM Post #36,027 of 48,579
Acer has a 2560x1080 200 Hz monitor already.
 
Jul 22, 2016 at 11:29 AM Post #36,029 of 48,579
   
Yea, but 1080p is so ancient now. I wouldn't buy any 1080p monitor even if it had 1000+ Hz.


Depends on the size. Perceived pixel density increases the farther you are from the screen.
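
For what it's worth, that "looks denser from farther away" effect can be put in numbers as pixels per degree of your field of view. A small sketch, using a hypothetical 24" 1080p panel and two made-up viewing distances:

```python
import math

# Angular pixel density (pixels per degree) of a screen at a given distance.
def pixels_per_degree(h_pixels, screen_width_in, distance_in):
    half_angle_deg = math.degrees(math.atan((screen_width_in / 2) / distance_in))
    return h_pixels / (2 * half_angle_deg)

panel_width = 24 * 16 / math.hypot(16, 9)  # ~20.9" wide for a 24" 16:9 panel
for dist in (24, 48):                      # desk distance vs. lying-on-the-bed distance
    print(f'{dist}" away: ~{pixels_per_degree(1920, panel_width, dist):.0f} ppd')
# ~41 ppd at 24" vs. ~78 ppd at 48": the same 1080p screen is perceptibly
# "denser" the farther back you sit.
```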
 
Jul 22, 2016 at 12:05 PM Post #36,030 of 48,579
   
Yea, but 1080p is so ancient now. I wouldn't buy any 1080p monitor even if it had 1000+ Hz.


Personally I'm in no hurry to abandon 1080p. I'd rather run high settings, a good framerate and a lower native resolution than lower settings, a good framerate and a higher native resolution. Plus, half the time I sit/lie on my bed to play games, so that's half the time when the difference between a native 720p and 1080p screen would probably be negligible, never mind higher resolutions. Mind you, the biggest monitor I have room for is 24" (unless I were to downgrade my speakers), so I might feel differently if I had a bigger screen.
 
