「Official」Asian Anime, Manga, and Music Lounge
Dec 13, 2016 at 10:14 PM Post #170,536 of 177,745
Speaking of computer stuff though, how do games even work? XD

[video]https://www.youtube.com/watch?v=JgQUigcY4O8[/video]


My VRAM usage is only like 60% of what my graphics card has, it has a 5th(?) gen Intel Core i7m, and 8 GB RAM. All of that seems like it should be plenty but I can barely run the game at a solid 40 FPS and my fans are roaring. U wat m8?
 
Dec 13, 2016 at 10:18 PM Post #170,537 of 177,745
Speaking of computer stuff though, how do games even work? XD


My VRAM usage is only like 60% of what my graphics card has, it has a 5th(?) gen Intel Core i7m, and 8 GB RAM. All of that seems like it should be plenty but I can barely run the game at a solid 40 FPS and my fans are roaring. U wat m8?

GPU clock speed? (don't quote me on that I'm just guessing)
 
Dec 13, 2016 at 11:15 PM Post #170,540 of 177,745
  edit: nvm I'm dum, it seems like the 6900k can all core boost to 3.5GHz. 
 This is getting real confusing ahahahahah.
 
All hardware-accelerated encoding lowers quality compared to CPU-only encoding, so it is a compromise of performance vs quality. You don't see anyone serious using a Skylake chip's Quick Sync for video encoding; they usually rely on the much larger GPU encoding engines to deliver higher performance without sacrificing too much quality. Going by all of the damn Quick Sync info slides and such, it's meant for "casuals" converting videos for their phones; Quick Sync is basically useless for professional applications. That's probably why no reviewer really talks about Quick Sync past a Handbrake benchmark or two. Skylake only added some extra bit depth and h265/HEVC encoding/decoding support.
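Just to make the speed vs quality tradeoff concrete, here's a rough sketch of how you'd compare the two paths with ffmpeg (assuming a build with QSV and libx264 support; the flags and quality values are just illustrative):

[code]
# Rough sketch: CPU (libx264) vs Quick Sync (h264_qsv) encode of the same clip.
# Assumes ffmpeg is on PATH and built with QSV support; flags/values are illustrative.
import subprocess
import time

SOURCE = "input.mp4"  # hypothetical source clip

def encode(output, codec_args):
    start = time.perf_counter()
    subprocess.run(["ffmpeg", "-y", "-i", SOURCE, *codec_args, output], check=True)
    return time.perf_counter() - start

# CPU-only software encode: slower, but generally better quality per bit.
cpu_time = encode("out_cpu.mp4", ["-c:v", "libx264", "-preset", "slow", "-crf", "18"])

# Quick Sync fixed-function encode: much faster, at some quality cost.
qsv_time = encode("out_qsv.mp4", ["-c:v", "h264_qsv", "-global_quality", "18"])

print(f"libx264: {cpu_time:.1f}s, h264_qsv: {qsv_time:.1f}s")
[/code]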
 

 
Yeah it does seem like some sort of auto-OC, good for the peasants I guess. 
 

I was going to say, why would they reserve turbo only for lighter tasks when you want those highest speeds with the heaviest tasks?
 
Hm... Isn't it also possible to get no quality loss through heterogeneous compute? Because, from a quick look at the Wiki page for hardware acceleration, all the hardware acceleration specifications sacrifice some amount of compression for speed gains. With heterogeneous compute, the GPU shaders just have to run through more instructions to cover the more complex instructions built into the CPU ISA that aren't in the GPU ISA, but the end result is that you have more resources available to do the encoding with.
 
  http://www.anandtech.com/show/10907/amd-gives-more-zen-details-ryzen-34-ghz-nvme-neural-net-prediction-25-mhz-boost-steps

Seems like their boosting tech can do increments of 25 MHz as long as thermals and power allow it.
 
Kinda like modern gpu boost?

More or less.
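Very roughly, the 25 MHz granularity just means the boost ladder has a lot more rungs. A toy sketch, with all the numbers made up:

[code]
# Toy illustration of 25 MHz boost bins: the clock walks up one step at a time
# until the (made-up) thermal/power check says stop. All numbers are hypothetical.
BASE_MHZ = 3400
STEP_MHZ = 25
MAX_BOOST_MHZ = 3900

def boost_clock(headroom_ok):
    clock = BASE_MHZ
    while clock + STEP_MHZ <= MAX_BOOST_MHZ and headroom_ok(clock + STEP_MHZ):
        clock += STEP_MHZ
    return clock

# Pretend we run out of thermal headroom above 3650 MHz:
print(boost_clock(lambda mhz: mhz <= 3650))  # 3650
[/code]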
 
  Excited for Ryzen even though I am not gonna buy it
 
I hope AMD does good

We should all hope, Intel as well (kind of?). More competition gives consumers more to choose from and pushes the tech forward faster.
 
For Intel it means they can now start letting loose slightly so they can improve raw performance without having to worry that they're going to run AMD into the dirt and get slapped by some antitrust/monopoly-destroying act. It's kind of worrying for them as well because it's not like they were sitting around with no threat; ARM has been destroying Intel in the IoT and mobile space so they've spent a lot of time trying to make x86 power efficient enough to be considered an alternative to ARM (we saw how that went). So I guess now they can start appeasing the performance crowd a bit more but also have the extra pressure of mobile and IoT eating at them.
 
Dec 14, 2016 at 12:09 AM Post #170,541 of 177,745
  For Intel it means they can now start letting loose slightly so they can improve raw performance without having to worry that they're going to run AMD into the dirt and get slapped by some antitrust/monopoly-destroying act. It's kind of worrying for them as well because it's not like they were sitting around with no threat; ARM has been destroying Intel in the IoT and mobile space so they've spent a lot of time trying to make x86 power efficient enough to be considered an alternative to ARM (we saw how that went). So I guess now they can start appeasing the performance crowd a bit more but also have the extra pressure of mobile and IoT eating at them.

 
Bahahahah I'm just showing my age, I always thought turbo boost was kinda like rerouting power to just one core and "overclocking" it, but when all cores are utilized it has to bring things down to base clock due to TDP and all that jazz. Looks like turbo boost has evolved a great deal since then because apparently on the 6950X at full tilt, two cores operate at 3.5GHz, a further two at 3.4GHz and so on until 3.0GHz. Intel marks which cores performed best at the factory, and they decide which cores will turbo the highest.
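If you pictured it as a table, it'd look something like this. The frequencies are the ones mentioned above, but which physical cores get which bin is fused in at the factory, so the assignment here is purely a guess:

[code]
# Sketch of per-core turbo bins on a 10-core chip, using the frequencies above.
# Which physical cores land in which bin is decided at the factory; the mapping
# below is entirely made up.
MAX_TURBO_GHZ = {
    0: 3.5, 1: 3.5,   # the two "best" cores
    2: 3.4, 3: 3.4,
    4: 3.3, 5: 3.3,
    6: 3.2, 7: 3.2,
    8: 3.0, 9: 3.0,
}

def best_cores(n):
    """Pick the n cores that can turbo the highest (the ones you'd want loaded first)."""
    return sorted(MAX_TURBO_GHZ, key=MAX_TURBO_GHZ.get, reverse=True)[:n]

print(best_cores(2))  # [0, 1]
[/code]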
 
Right from the horse's mouth, it looks like the GPU has no effect on the actual h264 encoding process, but it does help with rendering effects/colour correction and such, taking that load off the CPU so it can focus purely on encoding. Since a normal video production export workflow involves rendering and then encoding, it all makes sense now.
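In other words the export is basically a producer/consumer pipeline: GPU renders, CPU encodes, and the two overlap. A toy sketch with both stages as placeholders:

[code]
# Toy model of the export pipeline: a "GPU" stage renders/grades frames and feeds
# them through a queue to a "CPU" stage that encodes. Both stages are placeholders;
# the point is only that they run at the same time.
import queue
import threading

frames = queue.Queue(maxsize=8)
DONE = object()

def render_stage(n_frames):
    for i in range(n_frames):
        frames.put(f"rendered frame {i}")  # pretend the GPU did effects/colour here
    frames.put(DONE)

def encode_stage():
    while frames.get() is not DONE:
        pass  # pretend the CPU encoded the frame here

t = threading.Thread(target=render_stage, args=(100,))
t.start()
encode_stage()
t.join()
[/code]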
 
Since the majority of mainstream computing now lies in mobile computing, I am looking forward to Zen+whatever iGPU they can shove in there. Intel Iris graphics are pretty decent nowadays but I'm hopeful that AMD can destroy them in that department.
 
Dec 14, 2016 at 3:41 AM Post #170,542 of 177,745
   
Bahahahah I'm just showing my age, I always thought turbo boost was kinda like rerouting power to just one core and "overclocking" it, but when all cores are utilized it has to bring things down to base clock due to TDP and all that jazz. Looks like turbo boost has evolved a great deal since then because apparently on the 6950X at full tilt, two cores operate at 3.5GHz, a further two at 3.4GHz and so on until 3.0GHz. Intel marks which cores performed best at the factory, and they decide which cores will turbo the highest.
 
Right from the horse's mouth, it looks like the GPU has no effect on the actual h264 encoding process, but it does help with rendering effects/colour correction and such, taking that load off the CPU so it can focus purely on encoding. Since a normal video production export workflow involves rendering and then encoding, it all makes sense now.
 
Since the majority of mainstream computing now lies in mobile computing, I am looking forward to Zen+whatever iGPU they can shove in there. Intel Iris graphics are pretty decent nowadays but I'm hopeful that AMD can destroy them in that department.


huh, I guess it's just not really worth the trouble to leverage Nvidia and AMD's proprietary h.264 hardware.
 
Dec 14, 2016 at 3:53 AM Post #170,543 of 177,745
Hm... Isn't it also possible to get no quality loss through heterogeneous compute? Because, from a quick look at the Wiki page for hardware acceleration, all the hardware acceleration specifications sacrifice some amount of compression for speed gains. With heterogeneous compute, the GPU shaders just have to run through more instructions to cover the more complex instructions built into the CPU ISA that aren't in the GPU ISA, but the end result is that you have more resources available to do the encoding with.

technically yes? but someone needs to be in charge of instruction translation and GPU offloading (which the OS doesn't do, because GPUs aren't registered as processors).
then there's the PCIe overhead; the CPU's internal bus is pretty darn fast compared to PCIe, and CPUs have their own internal instruction caches.
 
heterogeneous compute is actually closer to distributed cloud compute IMO, but I could be completely wrong...
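To put a rough number on the PCIe overhead point, you could time the round trip yourself. Something like this, assuming an NVIDIA card with CuPy installed (actual numbers will vary wildly):

[code]
# Rough way to see the host<->device copy cost that GPU offload pays over PCIe.
# Assumes an NVIDIA GPU with CuPy installed; the actual numbers vary wildly.
import time

import numpy as np
import cupy as cp

a = np.random.rand(4096, 4096).astype(np.float32)

t0 = time.perf_counter()
np.matmul(a, a)                # stays entirely on the CPU
t1 = time.perf_counter()

t2 = time.perf_counter()
d = cp.asarray(a)              # host -> device copy over PCIe
out = cp.matmul(d, d)          # runs on the GPU
back = cp.asnumpy(out)         # device -> host copy (also forces a sync)
t3 = time.perf_counter()

print(f"CPU matmul: {t1 - t0:.3f}s, copy + GPU matmul + copy back: {t3 - t2:.3f}s")
[/code]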
 
Dec 14, 2016 at 11:20 AM Post #170,545 of 177,745
So I haven't seen Flip Flappers yet but there seems to be a lot of good talk about it.

The summary and genre didn't seem overly interesting to me so I'm wondering what it could be doing to get such acclaim.

Can anyone explain it a bit?

Just watch it.
Summary and genre won't tell you anything about it.
I would say it is sort of like Space Dandy. It is pretty episodic (but also not) and it feels like every episode has a different director. The art is fantastic and so diverse and the animations are pure sakuga.
Lots of interesting things going on that one could overanalyze for hours and hours (the episode discussions on /r/anime are crazy...).
My favorite show of this season for sure.
 
Dec 14, 2016 at 12:16 PM Post #170,547 of 177,745
  http://gatebox.ai/

Not sure if someone has already posted this, but that trailer was hilarious.

saw this quite a while ago somewhere else.

#forever(not)alone ?
 
I want one actually, preferably an imouto version... that handles all my chores.
 
Dec 14, 2016 at 6:18 PM Post #170,550 of 177,745
  Just watch it.
Summary and genre won't tell you anything about it.
I would say it is sort of like Space Dandy. It is pretty episodic (but also not) and it feels like every episode has a different director. The art is fantastic and so diverse and the animations are pure sakuga.
Lots of interesting things going on that one could overanalyze for hours and hours (the episode discussions on /r/anime are crazy...).
My favorite show of this season for sure.

 
FripFrap flipflap flifla. Good stuff, got some yuri but not enough!
 
