「Official」Asian Anime, Manga, and Music Lounge
Aug 22, 2018 at 1:13 AM Post #176,177 of 177,742
I, for one, welcome our Nvidia overlords.

Same.

Was deciding whether to go with a CPU (and thus an entire PC) upgrade, or a GPU upgrade based on what Nvidia is offering. The CPU is the main thing slowing me down at the moment; every once in a while I have to wait a few minutes before I can continue working, since most of what I do is CPU-based. However, I'm willing to switch to a GPU-based renderer and dish out the money for just a GPU if Nvidia's ray tracing tech is indeed that promising.

It is, and more. The demo at SIGGRAPH absolutely blew my mind. As Jensen said, for what that $10k Quadro RTX 8000 does, "It's a steal."

Nvidia has always dominated that market, offering many things that AMD can't match. That's obvious. But AMD products still had a place. The GPU I chose for my current workstation was an R9 390, which was very capable, even matching a 1080 Ti under specific workloads, and the 390X has outrun even the Titan X in very specific tasks, tasks that I'm personally interested in. The CPU rendering program I'm using already costs about as much as two 2080 Ti graphics cards to begin with. It's fast and powerful for what it does, but the GPU renderer I'm looking at is changing its pricing model, with a free tier that allows the use of two GPUs with no significant limitations, including commercial use. Turing is offering about 8x the ray tracing rendering performance of Pascal on the highest end for this specific program. Investing in a GPU makes a lot more sense for me right now.

We shall see...
 
Last edited:
Aug 22, 2018 at 2:40 AM Post #176,178 of 177,742
Do you use a special rice to dry it up?

Like many problems in life, this one can be solved with copious amounts of IPA. (Unfortunately not India Pale Ale, but rather isopropyl alcohol.)

It takes a few days to dry out all the water because it pools underneath the capacitors, so it's a bit of a pain having to squirt some IPA at the caps every now and then. I don't have an air compressor, so I used a leaf blower as my 'air knife'. Let's just say things were a bit dicey; the boards almost flew away a few times.

Looking at the pricing for the next-gen cards, it looks like we're getting an increase in performance without much improvement in the performance-per-price ratio. That's what happens when there's no competition...

An old Toshiba SSD from 2010 is seriously bottlenecking one of my ThinkPads. Only ~12MB/s random reads means any random application update brings the whole system to a crawl. Now, should I spend $40 on a new SSD for double the random read performance? Hmmmmm, I don't really use it often enough to justify the expense.
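
Quick back-of-the-envelope on what that means in IOPS, assuming the usual 4 KiB request size for "random reads" (the actual block size behind that ~12MB/s figure is my assumption):

```cpp
#include <cstdio>

int main() {
    // Convert random-read throughput to IOPS. The 4 KiB request size is
    // an assumption (the standard "4K random" benchmark size); the post
    // doesn't say what block size the ~12 MB/s was measured at.
    const double block_kib = 4.0;
    const double old_mb_s  = 12.0;             // the 2010 Toshiba SSD
    const double new_mb_s  = 2.0 * old_mb_s;   // "double the random read performance"

    std::printf("old: ~%.0f IOPS\n", old_mb_s * 1024.0 / block_kib);  // ~3072
    std::printf("new: ~%.0f IOPS\n", new_mb_s * 1024.0 / block_kib);  // ~6144
    return 0;
}
```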

A friend got a free ThinkCentre M92p (I think); it's an Ivy Bridge era SFF PC and we shoved a 1050 Ti in there to use as a cheapo gaming system. Problem is, a damn 1050 Ti costs like $230 here, which is nutso because that's 570/1060 3GB money when prices were more reasonable. Ehh, what can ya do, we're limited by the 75W PCIe slot power delivery. It can 'run' Monster Hunter at 1080p, all low, at about 45fps.
This is for someone who played all of Fallout 4 on Haswell integrated graphics... I don't think they're gonna notice that it's not at 60fps. XD
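
For context, here's the 75W slot constraint in rough numbers; a sketch using the reference-card TDP figures (board-partner cards vary):

```cpp
#include <cstdio>

int main() {
    // PCIe x16 slot power budget vs. typical reference TDPs. Anything
    // above the slot limit needs 6/8-pin aux connectors, which an SFF
    // machine's PSU generally can't supply.
    const int slot_w = 75;
    struct { const char* card; int tdp_w; } gpus[] = {
        {"GTX 1050 Ti", 75}, {"GTX 1060 3GB", 120}, {"RX 570", 150},
    };
    for (const auto& g : gpus)
        std::printf("%-13s %3dW  %s\n", g.card, g.tdp_w,
                    g.tdp_w <= slot_w ? "slot-powered OK" : "needs aux power");
    return 0;
}
```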

I have a ThinkPad X240 that used to crash when I picked it up on the left side; I had to take it apart, reseat the RAM/eDP cable, and give the screws a bit of a tighten. Seems okay for now.

Overlord is okay, Cells at Work is nice.
 
Aug 22, 2018 at 8:10 AM Post #176,179 of 177,742
If you're going GPGPU then you'd at least want to make sure whatever software you're using has Microsoft DXR support. I'm not sure how splurging on the raytracing-specific hardware is going to benefit you if the software can't even utilize that API.
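
For what it's worth, a minimal (untested) sketch of what that support check looks like on the application side, assuming a Windows 10 SDK new enough to ship the DXR headers:

```cpp
#include <d3d12.h>

// Minimal sketch: ask the D3D12 driver what raytracing tier it exposes.
// Software that never makes this query (or the equivalent through a
// vendor API) can't touch the RT hardware no matter what card you buy.
bool SupportsDxr(ID3D12Device* device) {
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                           &opts5, sizeof(opts5)))) {
        return false;  // runtime too old to even know about the query
    }
    return opts5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0;
}
```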

CUDA basically has a stranglehold on GPGPU in the creative space. OpenCL support exists in a handful of creative apps, most notably Apple's stuff like Final Cut Pro, because Apple hates NVIDIA with a burning passion. Unless Intel's new GPU efforts turn out to be trash (Raja has an enormous lump of money fueling him, though) and AMD falls unimaginably far behind, Apple will probably never add CUDA support to their software. I'm wondering when Apple will start bringing GPU architecture design in-house for their x86 machines as well.

Back to NVIDIA, I'm expecting Turing not to get a refresh for another 2 years outside of the occasional Ti release and identically named SKUs with lower performance. Unless Navi is that good, there's a high probability NVIDIA will make people wait 2+ years for another card to milk the market dry. Using a 106 chip in a ##70 product is sad enough. Within the next few generations I'd almost expect the ##80 products to also get 106 chips instead of 104 chips at the rate we're going.

Anyways, given the current glimpses of RTX 2080 Ti raytracing performance, it's not looking good.

https://www.techpowerup.com/247007/...cing-sotr-barely-manages-30-60-fps-at-full-hd

The Star Wars demo frame times show the RTX card in that machine running at what...22 fps? I wouldn't expect it to get much better, and by the time it does they'll probably have the next set of cards out with even stronger raytracing performance to make you feel like your hardware is outdated.
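
For reference, the frame-time/fps conversion; the ~45 ms figure below is worked back from the quoted ~22 fps, not a measured number:

```cpp
#include <cstdio>

int main() {
    // fps = 1000 / frame time in ms. The 45 ms value is inferred from
    // the ~22 fps quoted above.
    const double demo_frame_ms = 45.0;
    std::printf("demo: %.1f fps\n", 1000.0 / demo_frame_ms);  // ~22.2 fps
    std::printf("60 fps budget: %.1f ms\n", 1000.0 / 60.0);   // ~16.7 ms
    std::printf("30 fps budget: %.1f ms\n", 1000.0 / 30.0);   // ~33.3 ms
    return 0;
}
```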

It's not particularly surprising, given that Turing is still on 12nm (essentially a refined version of Pascal's 16nm process) and is trying to trade some shader space for raytracing modules.

Edit:



Tons of fanboys (from both camps) hate on this guy. Not sure why; his predictions have been almost always spot-on for the past few years, and his stuff is well researched.
 
Last edited:
Aug 23, 2018 at 12:41 AM Post #176,180 of 177,742
Aug 23, 2018 at 8:16 AM Post #176,181 of 177,742
Yeah, sure, as a demo to show progress, but it's nowhere near ready for consumer market production. It's somewhat impressive, but not even close to very impressive. This kind of stuff looks more like something you show at a developers conference rather than at a consumer product launch. If you want to write that level of "performance" off as something worth spending the cash on a 2070, 2080, or 2080 Ti, you must be smoking something.

I won't go so far as to call this poor performance in a mainstream card, because these cards are clearly too expensive for many people to buy (despite being mid-range silicon, i.e. cut-down TU104 variants), and as far as we know the 2060 and below will be GTX cards, not RTX cards (very smart and dirty marketing on NVIDIA's part). That, however, only makes it worse: they're effectively selling you a card that will get outdated extremely fast while making the GTX cards feel even more outdated, and that's pretty unacceptable for the people who are going to shell out the cash for this. You can expect Turing's successor to have somewhere near acceptable performance at a probably much higher price (slap on another $100-$300 depending on SKU), but we probably won't see that for what...2+ years again?

The genius here mostly comes from the fact that they were able to hype up these relatively poorly performing cards. With RTX, NVIDIA has effectively created their own semi-proprietary way of doing things (it's still built strictly on Microsoft DXR functions for the raytracing calls, but layers on stuff like NVIDIA's proprietary AI denoising tech) and then gone ahead and compared it to previous cards, which simply don't have the capability to do any of these things. They've gotten people to buy into the idea that the Turing cards greatly improve performance on workloads that previous cards were never designed for in the first place. Of course you're going to see an enormous inter-generation improvement in ray tracing: Pascal was never made to ray trace, so even a small amount of dedicated raytracing hardware gives you enormous gains relative to Pascal. RTX, NVIDIA's comments, and the demos made people feel like their GTX cards were suddenly outdated because they lack the dedicated hardware to do ray tracing computations quickly, while also making people desire what's effectively a pig wearing makeup.

Was I expecting them to fully drop rasterization for raytracing to get maximum raytracing performance out of the limited die space they have? Absolutely not; that makes zero sense from any perspective. At the same time, though, people calling that kind of performance acceptable for products that cost this much is something that needs to change. You also need to raise your expectations, because this kind of progress is still very slow, and it's slow because NVIDIA can move the market at their fingertips. AMD has always been the one to push newer technologies (DX12, async compute, GPGPU, HBM, primitive shaders, etc.), but NVIDIA knew how to weave the words to sell sand as if it were gold (quite literally). The previously consistent 40-50% year-to-year (i.e. generation-to-generation) gains for GPUs have now been annihilated to a mere 20-30% thanks to NVIDIA destroying their competition through marketing and reducing the pace to fill their coffers. Pascal was one of the most blatant abuses of that, with no new card released for more than 2 years until this week.
 
Last edited:
Aug 23, 2018 at 11:25 PM Post #176,183 of 177,742
The previously consistent 40-50% year-to-year (i.e. generation-to-generation) gains for GPUs have now been annihilated to a mere 20-30% thanks to NVIDIA destroying their competition through marketing and reducing the pace to fill their coffers. Pascal was one of the most blatant abuses of that, with no new card released for more than 2 years until this week.

Yeah and to add insult to injury, it basically took 2 years for the higher end cards to be readily available at MSRP.

It's just a sad state of affairs in the GPU market. Process node improvements have slowed down, memory prices are sky high, and there's basically no competition from AMD. AFAIK PC sales have gone down the gutter too with longer and longer upgrade cycles, so I guess it's inevitable that the market moves towards low volume high margin products.

On the flip side at least the CPU market has been quite exciting the past few years....except that for gaming it feels like a 6 year old processor still isn't much of a bottleneck especially at higher resolutions.
 
Aug 24, 2018 at 3:53 AM Post #176,184 of 177,742
I think you guys are hard on Nvidia. You can "gain" up to 3 times the number of fans on one card. :imp:

Will their ray tracing thingy be significant for 3D artists at least? Because for gaming, I'm the sort of guy who would 10 billion percent (Dr. Stone style) say something like: "Wow, it's really super pretty. How do I turn it off?"
 
Aug 24, 2018 at 8:24 AM Post #176,185 of 177,742
Yeah and to add insult to injury, it basically took 2 years for the higher end cards to be readily available at MSRP.

It's just a sad state of affairs in the GPU market. Process node improvements have slowed down, memory prices are sky high, and there's basically no competition from AMD. AFAIK PC sales have gone down the gutter too with longer and longer upgrade cycles, so I guess it's inevitable that the market moves towards low volume high margin products.

On the flip side at least the CPU market has been quite exciting the past few years....except that for gaming it feels like a 6 year old processor still isn't much of a bottleneck especially at higher resolutions.

Mmm, well, the process node improvements are mostly on the fab guys, and I personally think they're doing fine, unless you're Intel...in which case you're probably screwed. Their 10nm++ process is still being tested (for reference, their 10nm++ is about the same size as the other fabs' 7nm processes; it's like 4nm wider in one direction, but Intel can supposedly still get higher transistor density because of tech like quad patterning, hyperscaling, and a few other things they have).
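
Rough math on why that's plausible: logic density scales roughly with the inverse product of the two key pitches. The pitch values below are the commonly reported figures for each process, so treat the result as an estimate, not a spec:

```cpp
#include <cstdio>

int main() {
    // Logic density scales roughly with 1 / (contacted poly pitch x
    // minimum metal pitch). Pitches are the commonly reported figures
    // for each node (approximations, not official density specs).
    const double intel10_cpp = 54.0, intel10_mmp = 36.0;  // Intel 10nm (nm)
    const double tsmc7_cpp   = 57.0, tsmc7_mmp   = 40.0;  // TSMC 7nm (nm)

    double ratio = (tsmc7_cpp * tsmc7_mmp) / (intel10_cpp * intel10_mmp);
    std::printf("Intel 10nm vs TSMC 7nm density: ~%.2fx\n", ratio);  // ~1.17x
    return 0;
}
```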

The 7nm node from TSMC, Samsung (LPE, though that's geared more towards SoCs and storage anyway), and GloFo is basically ready for mass production, and supposedly Vega 20 (the Vega successor; supposedly not just a die shrink but with some performance improvements) and Navi will be on it. Turing is on 12nm, which is actually beyond stupid, because they probably won't have another card for another 2 years, so we're stuck on 12nm right on the cusp of 7nm availability.

Supposedly memory prices should be dropping to very low levels after that China memory price-fixing investigation, and also something about forcing production in China? I don't remember, take what I said with a grain of salt. I'd need to look for a source, but I do remember hearing they will drop. That, or NAND flash.

I personally have very high hopes for Navi. I saw a lot of people thinking "wonderful, AMD is going to put out a crap architecture because it's geared for Sony and the PS5," but I'm actually expecting pretty great performance if Sony is involved in this. I doubt the architecture will scale to larger dies, just like Polaris isn't particularly good at that (after all, this is the Polaris successor), but at the smaller die sizes most midrange cards sit at, I'd expect it to perform quite admirably.

I expect a lot of the forward-facing stuff we've seen in Vega to be put into Navi for efficiency purposes, like their primitive geometry shaders (NVIDIA is using a variant of that idea in Turing to get their claimed "50%" performance boost, which will probably only benefit newer titles, since developers have to explicitly exploit the float and int32 parallelism and the driver stack probably won't do it for them).
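
To illustrate what that float/int parallelism means, here's a CPU-side sketch of the pattern (real shader code would be HLSL or CUDA; this is illustrative only). Turing added a separate INT32 pipe so the integer index/address arithmetic that shaders constantly do can issue alongside the float math instead of competing with it for the same units, as on Pascal:

```cpp
// Illustrative sketch only: the mix of work where a separate INT32 pipe
// is supposed to help. In a real shader the compiler/driver schedules
// this; the point is that typical shading interleaves integer addressing
// with float arithmetic.
void shade(const float* tex, float* out, const int* indices, int n) {
    for (int i = 0; i < n; ++i) {
        int addr = indices[i] * 4 + 2;     // integer address math (INT32 pipe)
        out[i] = tex[addr] * 0.7f + 0.3f;  // float shading math (FP32 pipe)
    }
}
```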

There are actually quite a few games that benefit much more from multicore now. There are games choking on quad cores (check one of the AdoredTV Coffee Lake videos, I think? I believe a few other sites covered it too).

Edit:



I think you guys are hard on Nvidia. You can "gain" up to 3 times the number of fans on one card. :imp:

Will their ray tracing thingy be significant for 3D artists at least? Because for gaming, I'm the sort of guy who would 10 billion percent (Dr. Stone style) say something like: "Wow, it's really super pretty. How do I turn it off?"

It could? It depends on whether the software has Microsoft DXR support, which is the raytracing API NVIDIA's RTX is built on top of.

This kind of stuff would matter more for CGI in film, where they have giant render farms that use ray tracing a lot, but they would use Quadro RTX cards, which have much stronger performance due to being full-die or near-full-die cards rather than the severely cut-down ones we're seeing for the 2070, 2080, and 2080 Ti.

Again, so long as the software uses Microsoft DXR and/or NVIDIA RTX, you will see big gains moving from previous-generation cards to Turing, because you went from zero dedicated raytracing logic to some. But in absolute terms it's still weak raytracing performance. The sub-30fps at 1080p we're seeing in things like the Tomb Raider demo isn't even pure raytracing; it's hybrid raytracing, still mostly rasterization with some raytracing for certain effects.
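
Some napkin math on why "some RT hardware" still isn't much: take NVIDIA's "10 Gigarays/s" figure for the 2080 Ti at face value and see what it buys at 1080p/30fps (where the budget actually goes is my interpretation, not a measured breakdown):

```cpp
#include <cstdio>

int main() {
    // NVIDIA's marketing figure for the 2080 Ti, taken at face value.
    const double claimed_rays_per_sec = 10e9;
    const double pixels = 1920.0 * 1080.0;
    const double fps    = 30.0;

    // On-paper budget: rays available per pixel per frame.
    double budget = claimed_rays_per_sec / (pixels * fps);
    std::printf("budget: ~%.0f rays/pixel/frame\n", budget);  // ~160

    // Sounds like plenty, yet the demos hover near 30 fps at 1080p:
    // traversal is only part of the cost. Shading each hit, denoising,
    // and divergent secondary rays eat that budget fast, which is why
    // hybrid rendering traces only a few rays per pixel per effect.
    return 0;
}
```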
 
Last edited:
Aug 24, 2018 at 9:10 AM Post #176,186 of 177,742
I, for one, welcome our Nvidia overlords.

 
Aug 24, 2018 at 10:14 AM Post #176,187 of 177,742
Will their ray tracing thingy be significant for 3D artists at least? Because for gaming, I'm the sort of guy who would 10 billion percent (Dr. Stone style) say something like: "Wow, it's really super pretty. How do I turn it off?"



There are actually quite a few games that benefit much more from multicore now. There are games choking on quad cores (check one of the AdoredTV Coffee Lake videos, I think? I believe a few other sites covered it too).

Once we get to the point where 1440p/60fps/high isn't possible any more, I'll finally retire the ol' girl. My days of chasing ultra are long gone, and to be honest it's not like I can reaaaally tell the difference in the heat of the moment. Maybe Zen 3/4 will be out by then. XD

I hope something does come out of that memory price-fixing lawsuit in China, but I think it might be a slap on the wrist.

Oh, there were also rumours that Intel is getting into the discrete GPU game, so maybe things will get spicy in ~5 years' time.


I put a diode bridge in backwards; press F to pay respects. There goes $5 in parts (2 nice caps and a 7815 regulator), and that's why you buy three times as many parts as you normally need, cuz the show must go on.
 
Aug 24, 2018 at 11:33 AM Post #176,188 of 177,742
Once we get to the point where 1440p/60fps/high isn't possible any more, I'll finally retire the ol' girl. My days of chasing ultra are long gone, and to be honest it's not like I can reaaaally tell the difference in the heat of the moment. Maybe Zen 3/4 will be out by then. XD

I hope something does come out of that memory price-fixing lawsuit in China, but I think it might be a slap on the wrist.

Oh, there were also rumours that Intel is getting into the discrete GPU game, so maybe things will get spicy in ~5 years' time.

Honestly, a lot of the visual effects at higher graphical settings, like bloom or ambient occlusion, actually make the game less playable from a visual standpoint, since the effects can be distracting or can cover visual cues that would otherwise be more apparent and more useful in making decisions.

China is quite keen to move production in-house, which is outlined in their "Made in China 2025" plan, so I believe there were some measures forcing companies to allocate a certain amount of production capacity in China and/or share IP. Given China's economic prowess, a lot of companies have to give in a bit.

It's not a rumor anymore. IIRC Intel has a teaser video for graphics in 2020, so their reasons for hiring Raja Koduri went from speculation to basically confirmed.

My best guess for why: (1) Intel wants a piece of the graphics pie that NVIDIA is hogging, and (2) Intel wants to mirror AMD's semi-custom strategy, because it's extremely smart and it's keeping AMD afloat. Kaby Lake G (an Intel mobile CPU + Vega GPU on a single package, with EMIB linking the GPU to its HBM and the CPU and GPU still connected via PCIe) was the first attempt to squeeze NVIDIA out: the total package of a Kaby Lake G chip was likely still less expensive than an Intel mobile CPU + NVIDIA mobile GPU, which would entice manufacturers to use Kaby Lake G instead. This works as a win-win for Intel and AMD: Intel gets more control and cash, AMD is willing to hand over a GPU for a pittance because it gains them market share where there was originally none, and NVIDIA just loses. Intel wants full control now, though, because RTG is no longer producing competitive enough GPUs for this to work (at least not right now), and AMD's APUs have become an enormous threat with Zen being a thing. Going discrete also gives Intel GPU market share and lets them pocket more money without relying on others.
 
Last edited:
Aug 24, 2018 at 5:20 PM Post #176,189 of 177,742
I'm pumped about the next episode of Hataraku Saibou. The last ep ended on a cliffhanger!
Is that show actually good? It seems super meme-able, which usually isn't a good sign for a show actually being good.

Probably the last thing you want but sure. Have fun with price gouging.
Eh, a lot of other industries price gouge all the time, so this really shouldn't be anything new to anyone. It sucks but what can you do about it other than avoid buying a new GPU?

Yeah and to add insult to injury, it basically took 2 years for the higher end cards to be readily available at MSRP.

It's just a sad state of affairs in the GPU market. Process node improvements have slowed down, memory prices are sky high, and there's basically no competition from AMD. AFAIK PC sales have gone down the gutter too with longer and longer upgrade cycles, so I guess it's inevitable that the market moves towards low volume high margin products.

On the flip side at least the CPU market has been quite exciting the past few years....except that for gaming it feels like a 6 year old processor still isn't much of a bottleneck especially at higher resolutions.
Hasn't PC gaming been dying in general along with all non-mobile gaming though? So it makes sense they'd have to raise profit margins somehow.


Wew I just found this song and really like it. Anyone have some similar sounding songs they've found?
 
Aug 24, 2018 at 7:53 PM Post #176,190 of 177,742
Is that show actually good? It seems super meme-able, which usually isn't a good sign for a show actually being good.


Eh, a lot of other industries price gouge all the time, so this really shouldn't be anything new to anyone. It sucks but what can you do about it other than avoid buying a new GPU?


Hasn't PC gaming been dying in general along with all non-mobile gaming though? So it makes sense they'd have to raise profit margins somehow.

The show is fine. If you watched Hakumei and Mikochi (Tiny Little Life in the Woods) last season, then it's like that: fun to watch for the first handful of episodes, but it gets a little old by maybe the 5th episode (although personally I felt Hakumei and Mikochi had a lot more potential to explore, and the show slumped pretty hard in the middle).

This isn't any ordinary price gouging, though. NVIDIA is now charging $3k and up for what used to be a $600-$700 full-blown GPU die (__100 chips), while charging that same $700 for a severely cut-down midrange chip (__104 chips). That's one thing, but now they deliberately sell the __104 chips as high-end when they clearly aren't. Even Intel, which is a scummy company, doesn't pull crap like this. The problem is that NVIDIA's marketing department is genius and AMD's is awful and too honest. The "hot and loud" meme still follows RTG, and NVIDIA can deliberately tell convenient truths to manipulate public opinion (they aren't lying, but the numbers they show and the way they show them are beyond deceptive). I don't think there's another company in the tech space that can abuse its position so badly.

Just buy an old AMD GCN card? I don't know why you need the latest and greatest. Probably 90% of the cards sold are just cards that can do 1080p 60fps fine. I generally advocate for GCN cards because the architecture was designed in a way that ages quite well. NVIDIA, on the other hand, being the dominant GPU maker and basically forcing everybody to use GameWorks, is free to make their older cards unusable as quickly as possible. For example, with Turing they're pushing RTX and selling it by comparing Pascal, which had zero ray tracing hardware, to Turing, which has some raytracing hardware but very mediocre performance with it. Two years down the road, when we get a successor with much stronger raytracing hardware, they'll have RTX-supporting games run even heavier raytracing loads that the already meager Turing can't handle.

Er...no? Not even close. PC gaming is still growing; other platforms are just growing much faster, particularly mobile. PC gaming never made much of a profit anyway; what PC gamers generate is a pittance. Even console gaming only has decent profit margins. The GPU market is primarily pushed by AI, datacenter, and the creative fields, which benefit massively from GPGPU. The only games that drive any sort of massive revenue are mobile games. There isn't a really good reason to care about desktop gaming; it just happens to be along for the ride thanks to consoles and datacenter.
 
Last edited:
