PC Enthusiast-Fi (PC Gaming/Hardware/Software/Overclocking)
Nov 29, 2015 at 5:48 PM Post #8,508 of 9,120
Now I know I'm happy with my setup...

Been ripping Blu-rays to put onto my Z5 Premium. I was using VLC to convert the MKVs to MP4s, and it was giving my CPU (i7 5930K, modestly overclocked to 4.1GHz on air) a bit of a hard time. Spotted a Black Friday sale on iSkysoft iMedia Converter Deluxe (no, not affiliated - bear with me!), and it converts using the GPU (my 980 Ti) as well - super fast (Terminator Genisys in 12 minutes) and no CPU stress - happy happy days :)
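
For anyone who'd rather not pay for a converter, the same GPU-accelerated idea can be done with ffmpeg's NVENC encoder (a different tool than the ones above, mentioned only to show the technique). A rough sketch, assuming an ffmpeg build with NVENC support, an NVIDIA card, and placeholder file names:

# Rough sketch: MKV -> MP4 with the video encoded on the GPU (NVENC)
# instead of the CPU. Assumes ffmpeg is built with NVENC support and an
# NVIDIA GPU is present; file names are placeholders.
import subprocess

def transcode_nvenc(src: str, dst: str) -> None:
    cmd = [
        "ffmpeg",
        "-i", src,             # input MKV
        "-c:v", "h264_nvenc",  # H.264 encode on the GPU via NVENC
        "-preset", "fast",     # NVENC preset: trade quality for speed
        "-c:a", "copy",        # pass the audio through untouched
        dst,
    ]
    subprocess.run(cmd, check=True)

if __name__ == "__main__":
    transcode_nvenc("movie.mkv", "movie.mp4")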
 
Dec 7, 2015 at 12:03 AM Post #8,509 of 9,120
ehm, so here's some fun news
 
I built a PC with some help from Bowei and Hybrid Core back in the day, maybe early 2013, and roughly 2-3 years later it's still running strong ^^ Well, I've replaced my GPU about once a year every year, lol - aside from that it's still working hard! So thanks guys :D
 
Dec 7, 2015 at 5:31 PM Post #8,511 of 9,120
  ehm, so here's some fun news
 
...... I've replaced my GPU about once a year every year ........ 

:eek: :eek: :eek:
 
 
Dec 7, 2015 at 8:59 PM Post #8,512 of 9,120
Speaking of new GPUs, I'm dying for an upgrade to my GTX 760, but can't justify $650 for a 980ti when it's 22nm and doesn't fully support DX12. I don't want to buy something I'll want to replace every year either, and want to be able to play The Witcher 3 and other modern games with good graphics. Also a lot of my money is going into my car and home theatre speaker setup, so I'm torn lol.
 
Dec 7, 2015 at 9:49 PM Post #8,513 of 9,120
IMO it's not worth it to get a 28nm GPU anymore, as the next fabrication process is going to halve that down to 14nm. Massive performance difference to be seen there. Though at the moment I'm working out a deal with someone to trade my GTX 780 SLI for his R9 290 Crossfire.
 
Dec 8, 2015 at 12:32 AM Post #8,514 of 9,120
IMO it's not worth it to get a 28nm GPU anymore, as the next fabrication process is going to halve that down to 14nm. Massive performance difference to be seen there. Though at the moment I'm working out a deal with someone to trade my GTX 780 SLI for his R9 290 Crossfire.
Yeah, it's 28nm now, my bad. Especially with HBM gen2 coming out, it seems like the best of both worlds: more capacity to keep up with today's higher-res textures (the most important thing to me in graphics, honestly - fake textures ruin it) and more bandwidth. I'm curious how the latency is, though; I haven't looked into it too much. I'm not completely obsessed with latency, but some friends have got me thinking about it more. I still regret selling that CRT. :frowning2:
 
Dec 8, 2015 at 2:18 PM Post #8,516 of 9,120
The latency is less than you would notice, just like it is now. I don't think the human brain and eye can detect a difference of a few nanoseconds of latency.

Most displays have a latency of 60Hz. The ones that can do 240Hz are 240Hz gray to gray, not black to white, so it might not make a difference in real-life experience, even less if you consider that it could introduce ghosting.
 
Theoretically the human eye can make out something like 1000 fps, according to some studies, but that was with a trained eye. I can't wait for technology to advance enough to make displays above 120Hz mainstream! (There still aren't many laptops with these kinds of panels on the market, and the existing displays with such good properties are very expensive.)
 
 
  IMO it's not worth it to get a 28nm GPU anymore, as the next fabrication process is going to halve that down to 14nm. Massive performance difference to be seen there. Though at the moment I'm working out a deal with someone to trade my GTX 780 SLI for his R9 290 Crossfire.

You should see a small difference; I mean both GTX 780 SLI and R9 290 Crossfire are extremely good, and Radeon is known to make less of a difference in certain applications, though its raw power output is very high.
 
I am curious why you don't wait for the new Pascal architecture, as from a technical point of view it should really change some things.
 
Dec 8, 2015 at 3:25 PM Post #8,517 of 9,120
Ummmmmmm, latency is not based on "60Hz"...? Hz is a measure of frequency? Are you sure you know what you are talking about there? Latency is measured very differently for monitors...
 
Dec 8, 2015 at 3:48 PM Post #8,518 of 9,120
  Ummmmmmm, latency is not based on "60Hz"...? Hz is a measure of frequency? Are you sure you know what you are talking about there? Latency is measured very differently for monitors...

Yes. Latency is the time a frame takes from being ready to being shown on the screen.
 
But as the refresh rate itself is 60Hz (60fps) for most monitors, this means you want the latency to be lower than about 16.7ms (1000ms / 60fps).
 
Also, this is the reason G-Sync was created: to solve the problem of differences between latency and refresh rate. The GPU never outputs images at an exact rate, so G-Sync is a controller inside the monitor that adapts the panel's refresh rate to the GPU's output rate. This solves micro-stuttering.
 
The precursor to G-Sync was Vertical Sync, which would sync the output frame rate to exactly 60fps. Worked great for some games.
 
For the time being 60Hz = 60fps, because 60Hz means there are 60 different images on the screen per second, so that is the max fps that can be shown on a 60Hz monitor. Anything higher or lower (rates that don't divide evenly) will stutter.
 
I was conducting a study on this for my project, and found it most intriguing.
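
If it helps to see the numbers behind that, the ~16ms figure is just the frame period: 1000ms divided by the refresh rate. A quick sketch of the arithmetic (the refresh rates below are just examples):

# Frame period (ms) = 1000 / refresh rate (Hz), the math behind the
# "lower than ~16.7 ms" target mentioned above. Refresh rates are examples.
def frame_period_ms(refresh_hz: float) -> float:
    return 1000.0 / refresh_hz

for hz in (60, 120, 144, 240):
    print(f"{hz:>3} Hz -> {frame_period_ms(hz):.2f} ms per refresh")

# 60 Hz works out to ~16.67 ms per frame, 240 Hz to ~4.17 ms.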
 
Dec 8, 2015 at 4:04 PM Post #8,519 of 9,120

Well, my original 6950 was overclocked far too high and I really ran that thing dead, my 7870 OC edition was a "refurb" and it arrived DOA [it always crashed after an hour], and my R9 280X OC edition runs nicely. I'm hoping this NEW card will last me three years. Happy to say my processor held strong through years of OCing and bad shutdowns!
 
Dec 8, 2015 at 4:30 PM Post #8,520 of 9,120
  Yes. Latency is the time a frame takes from being ready to being shown on the screen.
 
But as the refresh rate itself is 60Hz (60fps) for most monitors, this means you want the latency to be lower than about 16.7ms (1000ms / 60fps).
 
Also, this is the reason G-Sync was created: to solve the problem of differences between latency and refresh rate. The GPU never outputs images at an exact rate, so G-Sync is a controller inside the monitor that adapts the panel's refresh rate to the GPU's output rate. This solves micro-stuttering.
 
The precursor to G-Sync was Vertical Sync, which would sync the output frame rate to exactly 60fps. Worked great for some games.
 
For the time being 60Hz = 60fps, because 60Hz means there are 60 different images on the screen per second, so that is the max fps that can be shown on a 60Hz monitor. Anything higher or lower (rates that don't divide evenly) will stutter.
 
I was conducting a study on this for my project, and found it most intriguing.

The latency of a monitor is not measured directly by its refresh rate. The actual latency of a monitor is pretty much based on electronic response time, refresh rate and GTG time. For a good monitor, the GTG time has to be below the period of each frame; ghosting is caused by a GTG that is too close to that period (or frametime). A GTG matching a 240Hz refresh would be about 4ms, which is not good enough for anything and nowhere near the handful of nanoseconds for RAM latency and electronic latency.
 
Stuttering is a separate issue: it is caused by the framerate/frametime being lower/higher than the monitor's, and by a difference in timing between the displayed images and the eye's frames. G-Sync does not really fix microstutter, though it can help with it to a minor degree. If the eye and the video are still not in sync, or the gaps in between aren't being filled, there will be stuttering to your eyes. G-Sync's main feature is the elimination of screen tearing and supposed input lag. Microstutter is a harder issue to combat, and syncing the monitor's refresh rate to the output will not fix it.
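
To put the microstutter point in numbers: even if the panel refreshes exactly when each frame arrives, uneven frame delivery still shows up as uneven frame times. A toy sketch (the frame times below are made up purely for illustration):

# Toy illustration of microstutter: the average frame rate can look fine
# while individual frame times swing around, and syncing the panel's
# refresh to the output does not make the delivery itself any more even.
# The frame times below are invented for illustration only.
frame_times_ms = [16.7, 16.7, 33.4, 8.3, 16.7, 25.0, 16.7, 10.0]

avg = sum(frame_times_ms) / len(frame_times_ms)
avg_fps = 1000.0 / avg
worst = max(frame_times_ms)

print(f"average frame time: {avg:.1f} ms (~{avg_fps:.0f} fps)")
print(f"worst single frame: {worst:.1f} ms")
# A decent-looking average with 25-33 ms spikes in it is exactly what
# reads as microstutter, synced refresh or not.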
 
