Gamewise, Creative's got just about every 3D audio codec locked up under their belt, including A3D if I'm not mistaken (via acquisition of Aureal's assets or whatnot). That means in most gaming rigs you'll see Creative's Audigy 2 as the default sound card. And you get all the pluses (fairly low CPU overhead, its 3D/surround processing, etc.) and minuses (crummy interpolation, somewhat questionable drivers, etc.) that come with it.
You may also want to take a look at the VIA Envy chipset. While it works like most AC'97 codecs, certain Envy chipset/soundcard implementations offer 24-bit/96kHz audio. Since it's CPU host-based, it'll sap more CPU cycles, but with modern CPUs that shouldn't be much of an issue. And its sound quality has generally been agreeable to most "professional" reviewers' ears.
Speaking of SoundStorm, SoundStorm isn't the onboard chip itself; it refers to a particular output specification for Nvidia's nForce2-based mainboards. You can find more at nvidia.com (do a search for SoundStorm), but basically it specifies the outputs a mainboard must provide: a certain number of digital outs (TOSLINK or coax) at certain output levels. From what I know, there's some uncertainty over whether the digital out on the NF1/NF2 (MCP-D/MCP-T respectively) passes through the Realtek/C-Media AC'97 codecs or not, though I believe it does not. Finally, SoundStorm is the only PC solution capable of encoding any and all digital streams to Dolby Digital in real time. Driver quality, however, varies wildly from version to version, introducing bugs and noises here and there. Personally, I've found no problems with its digital output or its drivers in general.
Finally, the new Intel chipsets/mainboards coming later this year will have an HD Audio codec on the mainboard itself. It's supposed to revolutionize the PC audio scene as we know it, but since it's not released yet, no one really knows. Right now, though, it doesn't have the hype SoundStorm had, which may be a good thing after all.