Quote:
Originally Posted by iriverdude
I heard latest Photoshop uses 3D cards? And what about newer video encoding/decoding, may be GPU accelerated although probably require latest software.
|
CS4 can use CUDA/Brook+ acceleration (forgive me if Brook+ isn't its current name; it's AMD's stream-computing API, and they rename it roughly quarterly), but it requires a driver package that supports it, for example the one shipped with the Quadro CX, and it also requires using software features which are GPGPU-enabled
GPGPU is a very limited-scope thing, because it requires explicitly parallel algorithms in order to leverage any advantage: for example HD video encoding/decoding, or various filters in CS4 (usually image enhancement/restoration, stuff that would classify as "digital image processing", like facial recognition/enhancement, color/contrast optimization, etc.). The thing you have to remember is that it requires the developer to spend the money to rewrite the software to be compatible with this, and a VERY small fraction of users have this hardware capability
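to make the "explicitly parallel" point concrete, here's a minimal sketch (plain NumPy, nothing to do with CS4's actual internals — the filter and its parameters are made up for illustration): a contrast stretch where every output pixel depends only on its own input pixel, which is exactly the shape of work that splits cleanly across thousands of GPU threads, one pixel each

```python
import numpy as np

# Toy "image": 4x4 grayscale, values 0-255.
img = np.array([[ 10, 200,  30,  40],
                [ 50,  60,  70,  80],
                [ 90, 100, 110, 120],
                [130, 140, 150, 255]], dtype=np.uint8)

def contrast_stretch(image, gain=1.5, mid=128.0):
    """Per-pixel map: out = clip(gain * (in - mid) + mid).

    No pixel reads any neighbor, so the loop over pixels has zero
    data dependencies -- the textbook case for GPGPU offload.
    """
    out = gain * (image.astype(np.float32) - mid) + mid
    return np.clip(out, 0, 255).astype(np.uint8)

result = contrast_stretch(img)
```

contrast that with something like video *encoding*'s motion search, where blocks depend on previously decoded frames — that's why only some stages of a codec offload well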
for Adobe it makes sense: they have the time, money, and expertise, and Adobe is used professionally, so having a h/w offload feature for the top-end software package is reasonable (top-end users can likely buy systems through their employer, or otherwise have the money for a $4000+ workstation devoted just to their work)
a few years ago it was more commonplace to have single-purpose DSPs handle tasks like this, and you still see it on DAWs (look at Avid/Digidesign, SSL, RME, etc.), where many cheap little DSPs are paired together for h/w offload to improve the capabilities of the workstation (at massive $ investment)
now consider the general advantage of this h/w offload is "more precision, faster": basically Quadro/Tesla or FireGL/FireStream can process high-precision FP values much faster than your CPU can. This is huge for massive projects (Pixar is a huge-scale example, and also a terrible one, because Pixar develops all of its rendering solutions in-house and runs everything on its own software API, but most people can relate to Pixar-type projects). But for the average home user doing red-eye reduction on their digital photos in Photoshop Elements or Corel software, it doesn't really matter (who cares if it takes 0.03 seconds or 0.3 seconds? the user doesn't see the difference)
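to put rough numbers on the "who cares" point, here's a quick back-of-envelope — every figure below is an assumed, illustrative value, not a benchmark of any real CPU or GPU

```python
# Back-of-envelope only: all throughput numbers are assumptions
# picked to illustrate the argument, not measurements.
pixels = 12_000_000        # a 12 MP digital photo
ops_per_pixel = 50         # assumed FP ops for a red-eye-style filter
cpu_flops = 2e9            # assumed sustained CPU FP throughput
gpu_flops = 20e9           # assumed sustained GPU throughput (10x faster)

total_ops = pixels * ops_per_pixel   # 6e8 FP operations
cpu_time = total_ops / cpu_flops     # 0.3 s on the CPU
gpu_time = total_ops / gpu_flops     # 0.03 s on the GPU

# A 10x speedup, and yet both finish faster than the user can blink --
# the offload only pays off when the workload is thousands of times bigger.
```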
@ the video encoding/decoding (which I sort of touched on): all three major GPU vendors offer this as h/w offload, and the performance gains between processors are trivial; you either have the feature or you don't. For example, Intel's GMA X4500 offers this feature, and my HD 4870 X2 won't do the calculation 10-15x faster; it may only gain 5-10% (AVIVO does offer some benefits for color processing etc., but AVIVO is available on any modern AMD GPU, and even the "cheap seats" will handle it just fine)
so, yes, it is a valid point, but he's already going to be seeing the maximum hardware acceleration/advantage with the chipset he's got currently (the X1300), so he won't gain by going to a newer GPU (as the entire package is handled in software, and even a "cheap" GPU will outright smoke any CPU on the planet at this kind of math)
hence why I'm saying: if gaming or professional rendering (read: you render or you don't eat) is the motivation, then this is a valid upgrade; otherwise, it's wasted cash
Quote:
Originally Posted by Zorander
I believe the x1300 is cooled passively (i.e. no sound). The OP's situation differs markedly to yours. Where you are going from a noisy card to quieter ones, the OP is essentially thinking of upgrading a silent card to another that will possibly be fan-cooled (and add some noise).
|
good point.
Quote:
Originally Posted by mark2410
oh i know, its a pain in the arse, ms really really needs to dump all the legacy code and a switch to a less crap file system might be nice, like why they hell dont they natively support reiser, ext3 ext4 zfs etc etc. grrrrr dont get me started
but untill everything plays nice with debian then its still easyer to just stick with windows, and windows is okay really if you just keep throwing ever more powerful cpu's at it
|
I don't support any file system written by someone who slaughters their family
zfs for life, but we'll never see a Sun product take center stage
and LOL @ the "if you just keep throwing ever more powerful CPUs at it" comment (that's more or less true of any OS, as far as modern software is concerned)