"Happy birthday" just sounds like "congratulations for being one year older today than you were one year ago!" to me. It's pointlessly tautological.
Quote:
It could be. I don't know, since I never played the game, and as I said, I've been away from PC hardware for a good amount of time, so my tech lingo is very off. I'm trying to jumpstart my brain, since I've actually been wrapped up in audio for the past couple of years, so I could be wrong here and don't mind being corrected.
From my understanding, the marketing of PhysX is new, but the way it's used is old tech, which is why CPUs shouldn't have such difficulty rendering physics at such a low level. If they still do, that's known as bad optimization. A lot of games are badly optimized, so it's up to drivers and patches to optimize them better because of the whole GPU war thing. But you're right about the consoles with ATI GPUs being able to render PhysX; that's because consoles are optimized like crazy compared to PCs. Here's an old Tom's Hardware article:
http://www.tomshardware.com/news/phyx-ageia-x87-sse-physics,10826.html
I don't see the reason to use PhysX anymore and don't understand why nVIDIA is still pushing it, especially since we have OpenCL and DirectCompute. nVIDIA did make some mistakes in the past, though, like Metro 2033, where PhysX could be rendered on both ATI and nVIDIA because it was software-based instead of hardware-based, but I believe that's fixed now. I believe consoles use software PhysX, unlike the hardware-accelerated PhysX on PC, so that could be why it renders better on consoles and ports as well, maybe.
Ergh. No, stop being wrong.
PhysX computations are performed on the GPU specifically so that the CPU isn't involved, and can instead spend its cycles computing other things.
PhysX does this by taking advantage of the CUDA cores on a GPU (which is why there's a minimum of 32 CUDA cores if you want your GPU to be able to perform PhysX calculations).
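To make that concrete, here's a toy sketch of what GPU physics generally looks like - not PhysX's actual code, just the general shape: one lightweight thread per particle, launched in 32-thread warps across the CUDA cores. The names here (integrate, step) are made up for illustration.

Code:
#include <cuda_runtime.h>

// Toy example only -- NOT PhysX source, just the general shape of GPU
// physics: one thread integrates one particle, and the GPU runs
// thousands of these threads across its CUDA cores at once.
__global__ void integrate(float3* pos, float3* vel, float dt, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= n) return;

    const float3 gravity = make_float3(0.0f, -9.81f, 0.0f);

    // Simple Euler step: v += g*dt; p += v*dt
    vel[i].x += gravity.x * dt;
    vel[i].y += gravity.y * dt;
    vel[i].z += gravity.z * dt;

    pos[i].x += vel[i].x * dt;
    pos[i].y += vel[i].y * dt;
    pos[i].z += vel[i].z * dt;
}

// Host side: launch enough blocks to cover every particle. Threads are
// scheduled in warps of 32 -- the same granularity as that 32-CUDA-core
// minimum mentioned above.
void step(float3* d_pos, float3* d_vel, float dt, int n)
{
    int threads = 128;
    int blocks  = (n + threads - 1) / threads;
    integrate<<<blocks, threads>>>(d_pos, d_vel, dt, n);
}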
CUDA - Compute Unified Device Architecture - is entirely different from x86 architecture. This is why an x86 CPU would be bad at PhysX (aside from the fact that the GPU PhysX code isn't written for CPUs, so the CPU can't perform those calculations in the first place). (It's the same reason why the PlayStation 2, despite being over a decade old, still doesn't have a "perfect" software emulator - very different hardware and architecture. Meanwhile, the PS1 is comparatively easy to emulate purely in software.)
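Just to illustrate how un-CPU-like the programming model is: when you ask the CUDA runtime what the hardware looks like, it answers in terms of multiprocessors, warps, and compute capability - there's no "x86-ness" anywhere in the description. A quick toy sketch using real CUDA runtime calls:

Code:
#include <cstdio>
#include <cuda_runtime.h>

// Sketch: ask the CUDA runtime to describe the GPU. Note the vocabulary:
// multiprocessors, warps, compute capability -- a completely different
// hardware model from an x86 CPU.
int main()
{
    cudaDeviceProp prop;
    if (cudaGetDeviceProperties(&prop, 0) != cudaSuccess) {
        printf("No CUDA device found.\n");
        return 1;
    }

    printf("Device:             %s\n", prop.name);
    printf("Compute capability: %d.%d\n", prop.major, prop.minor);
    printf("Multiprocessors:    %d\n", prop.multiProcessorCount);
    printf("Warp size:          %d\n", prop.warpSize);  // 32 threads
    return 0;
}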
PhysX's CPU code path does use an old instruction set (x87, per that article), yes. But if the devs of a piece of software are doing PhysX on the GPU (i.e., hardware PhysX), then that's not even relevant: GPUs aren't constructed and programmed the same way as CPUs, and don't execute SSE/SSE2/SSE3/SSE4 instructions, since those are x86 CPU instruction-set extensions.
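To make the contrast concrete, here's the same trivial bit of math written both ways - once with SSE intrinsics, which compile to x86 instructions a GPU cannot execute, and once as a CUDA kernel, which compiles to the GPU's own instruction stream (PTX/SASS) that a CPU cannot execute. Toy sketch, obviously, not anything out of PhysX:

Code:
#include <xmmintrin.h>   // SSE intrinsics -- x86 CPU only

// CPU path: SSE processes 4 floats per instruction, in one thread.
// These intrinsics compile to x86 instructions; a GPU cannot run them.
void scale_cpu_sse(float* out, const float* in, float k, int n)
{
    __m128 kv = _mm_set1_ps(k);
    for (int i = 0; i + 4 <= n; i += 4)
        _mm_storeu_ps(out + i, _mm_mul_ps(_mm_loadu_ps(in + i), kv));
    for (int i = n & ~3; i < n; ++i)  // scalar tail
        out[i] = in[i] * k;
}

// GPU path: same math, but one lightweight thread per element,
// compiled to the GPU's own instruction set, not x86.
__global__ void scale_gpu(float* out, const float* in, float k, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) out[i] = in[i] * k;
}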
If you're talking about software PhysX, then yeah, it's entirely possible that the devs did it wrong or that nVIDIA has poor support for it. I wouldn't know; the most significant third-party middleware I've ever worked with has been OGRE and OpenGL, whose support consists mainly of having to Google instructions on performing ancient demon magic and blood-sacrifice rituals.
PhysX is still used in the industry.
Metro: Last Light is going to be using it.
Borderlands 2 is going to be using it.
ArmA 3 is going to be using it.
-- Griffinhart