X-RAM support in games

We all hear quad-core CPUs bashed for being "not needed", but at the same time we're seeing a rash of add-in boards to do things usually done on a CPU. Sound cards are far from a new invention, but the X-Fi chip is far more of a "processor" than we've ever had on a sound board. We've also seen the Ageia PhysX cards and the Bigfoot NIC. There was also talk a couple of months back about an AI processor in development.

So at least some people think that putting a bunch of extra cards with their own specialized processors and RAM into your gaming box is the way to give the PC more power. I'm not so sure, though. Surely one core of a quad-core CPU can do anything an X-Fi can, perhaps at much lower precision but good enough to be transparent. The same goes for physics, AI, etc.

Of course designing such an engine isn't easy, but would the resources you spend exceed what would surely be a nightmare of dealing with multiple add-in board types, companies, cards, and driver combinations? I doubt it. And it would just work, since it would be pure software.
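To put a rough number on the "one core can do it" claim, here's a toy sketch (my own example, not any shipping engine) of the core of a software mixer. The raw per-sample mixing is trivially cheap; it's the effects, reverb and voice management layered on top that cost the real engineering effort:

```cpp
#include <algorithm>
#include <cstddef>
#include <vector>

// Toy software mixer: sum N voices into one output block on the CPU.
// Each voice is a cursor into already-decoded PCM plus a gain; a real
// engine would add resampling, 3D panning, reverb, occlusion, etc.
struct Voice {
    const float* samples = nullptr;  // decoded mono PCM
    std::size_t  length  = 0;
    std::size_t  cursor  = 0;
    float        gain    = 1.0f;
};

void mixBlock(std::vector<Voice>& voices, float* out, std::size_t frames)
{
    std::fill(out, out + frames, 0.0f);

    for (Voice& v : voices) {
        const std::size_t n = std::min(frames, v.length - v.cursor);
        for (std::size_t i = 0; i < n; ++i)
            out[i] += v.gain * v.samples[v.cursor + i];  // one multiply-add per voice per sample
        v.cursor += n;
    }
    // 128 voices at 48 kHz is roughly 6 million multiply-adds per second
    // for the raw mix -- a tiny fraction of one modern core, even before SIMD.
}
```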

Hmm, good points. It would be nice to be able to get the same gaming quality out of my headphone amp/DAC as I do when listening to music.
 
Yeah, the problem with add-on cards is they have to deal with PC interconnects. PCI sucks for this sort of thing, apparently. :) All DSP audio cards have bus transfer issues, whether caused by a bad BIOS, a poor PCI hardware implementation, bad drivers, or evil video card latency doings. Even NVIDIA's old southbridge APU had occasional hiccups like that. The PhysX card slows things down in some cases.

3D cards pretty much get their own customized connection. AGP isn't used for anything else, and PCIe x16 apparently only works with video cards (at the chipset level). Other add-on cards don't get such luxuries. I've read that PCIe has some of the same latency issues as PCI.

So I think our favorite platform is at fault at the hardware level in these cases of more dedicated hardware.
 
No. Essentially all user interface stacks are now contained in user mode; this includes audio, video, network and all interface devices (keyboards, mice, joysticks, yokes, pedals, fingerprint scanners, security badge scanners, whatever.) This is what allows the Vista kernel to survive just about any driver crash you throw at it.

The only things truly running at kernel level are the things that are core to the kernel's operation, such as memory management, non-removable drive management, plug-n-play core functions, BIOS / boot functions, the HAL, and privileged system access passthrough from signed user-mode driver hooks.

Sorry for the OT...

Yeah, you are right; sorry, I should have clarified. Video drivers can still "talk" to the kernel through a dedicated API/Direct3D (so if the driver fails, it doesn't bring the whole system down with it), and companies can still provide hardware acceleration through the HAL (hardware abstraction layer). However, there is no more HAL for DirectSound and DirectSound3D, which is Creative's problem.
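You can actually see this from application code, by the way: DirectSound reports its hardware mixing/3D capabilities through GetCaps, and as far as I know those hardware buffer counts come back as zero under Vista since the HAL path is gone. Rough sketch, error handling omitted:

```cpp
#include <windows.h>
#include <dsound.h>   // link against dsound.lib
#include <cstdio>

// Ask DirectSound how many hardware-mixed / hardware-3D buffers it offers.
// On XP with an X-Fi these are non-zero; on Vista the DirectSound HAL is
// gone, so everything falls back to the software mixer.
int main()
{
    LPDIRECTSOUND8 ds = nullptr;
    if (FAILED(DirectSoundCreate8(nullptr, &ds, nullptr)))
        return 1;

    DSCAPS caps = {};
    caps.dwSize = sizeof(caps);
    ds->GetCaps(&caps);

    std::printf("HW mixing buffers: %lu\n", caps.dwMaxHwMixingAllBuffers);
    std::printf("HW 3D buffers:     %lu\n", caps.dwMaxHw3DAllBuffers);

    ds->Release();
    return 0;
}
```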
 
Surely one core of a quad-core CPU can do anything an X-Fi can, perhaps at much lower precision but good enough to be transparent. The same goes for physics, AI, etc.

Are you advocating lower precision? :???:

As for X-RAM games, here are the ones I can remember so far:
DOOM3 ver. 1.3 & DOOM3:ROE
Quake 4
PREY
UT2004
BF2, BF2:SF & BF2142

As for why X-RAM, Creative says this:
http://www.soundblaster.com/products/X-Fi/technology/x-ram/gamingXram.asp?page=3
  • Slow memory access due to paging
  • Limited memory access due to OS restrictions
  • Processor cycles lost to decompression
  • Sound quality lost to size restrictions
  • Performance lost to streaming
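On the application side, by the way, X-RAM is just an OpenAL extension: a game tags a buffer so its sample data lives in the card's onboard RAM instead of system memory. Roughly like this (I'm going from memory of Creative's "EAX-RAM" extension spec, so treat the exact names as approximate):

```cpp
#include <AL/al.h>
#include <AL/alc.h>
#include <cstdio>

// Sketch of opting a sound into X-RAM via the "EAX-RAM" OpenAL extension.
// Assumes an OpenAL context is already current on an X-Fi device.
typedef ALboolean (AL_APIENTRY* EAXSetBufferModeFn)(ALsizei n, ALuint* buffers, ALint value);

void placeBufferInXram(ALuint buffer)
{
    if (!alIsExtensionPresent("EAX-RAM"))
        return;  // no X-RAM: just use normal (system RAM) buffers

    EAXSetBufferModeFn eaxSetBufferMode =
        (EAXSetBufferModeFn)alGetProcAddress("EAXSetBufferMode");
    const ALenum storageHardware = alGetEnumValue("AL_STORAGE_HARDWARE");
    const ALenum ramFree         = alGetEnumValue("AL_EAX_RAM_FREE");

    std::printf("X-RAM free: %d bytes\n", alGetInteger(ramFree));

    // Tag the buffer so the following alBufferData() upload goes to the
    // card's onboard RAM -- which is what the bullet points above are about.
    eaxSetBufferMode(1, &buffer, storageHardware);
}
```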
 
I'm not necessarily advocating anything. The key word in that quote is "transparent". In other words, only if it is good enough that you can't tell the difference between the two in a blind test.

I'd much prefer it if game developers used high-end, Creative-like audio engines in software, utilizing extra CPU cores, instead of just deciding to skip it altogether because the majority of people aren't using dedicated audio hardware. Which seems to be the way the industry is heading.
 
Thanks for all the replies and discussion.

I've decided to go for a standard X-Fi without the X-RAM.

Regarding OpenAL, can any of the other chipsets (ALi, Realtek, etc.) use OpenAL? In games such as BF / Q4 / D3, do they fall back to a DirectSound path, or do they just use software audio?

Also, if the other chipsets can use OpenAL, is it likely ALi or Realtek will also write a Vista wrapper (like Creative's) to forward DirectSound calls to OpenAL?

Cheers :)

Kristin
 
I'd much prefer it if game developers used high-end, Creative-like audio engines in software, utilizing extra CPU cores, instead of just deciding to skip it altogether because the majority of people aren't using dedicated audio hardware. Which seems to be the way the industry is heading.

Look at games that utilize dual-core CPUs.
Going from dual to quad-core brings even more trouble (look at "Alan Wake", which keeps getting pushed back), and quad-core will be very high-end only for a long period of time.

Looking at Valve's Steam survey, we can see some funny facts:
http://www.steampowered.com/status/survey.html

Top DX9.0C GPU: Geforce 6600
Top DX9.0B GPU: Radeon 9600
Top audio processor: Realtek AC97
SLI/CrossFire = <2% of the market
Dual-core or more CPUs: ~10%

If you make the mistake of thinking that most gamers have a dual-core SLI/CrossFire rig with a 16:9 LCD and a dedicated sound card, you are way off.

/rant -on
In fact the number of users running 16:9 is smaller than the number of people who own a Creative card, but they sure make a lot of noise when a game doesn't support their niche 16:9 resolutions... I'm tired of the 16:9 people and their bickering... get over it.
/rant -off

With dual-core CPUs today only amounting to a staggering ~10%, no sane developer would make a game that couldn't run (decently) on a single-core CPU.

And I think that Crysis is going to "surprise" a lot of people expecting high IQ even on a dual-core with a 7900/X1900 GPU...
(In fact the recommended specs are a dual-core (~3 GHz) and DX10... think what the optimal specs are then... :oops:)

Quad-core as standard is going to take years and years... and the games that utilize the technology will be few and far between.

But let's look at Doom 3.
When it got EAX sound, I started playing it all over again.
The sound got so much better compared to the old sound engine.
One thing that also plagued HL2... the stu-stu-stu-stu-stuttering sounds... software sound engine... please move away from that.

I have an X-Fi Elite Pro myself.
Why?
Because it's the card that frees up the most CPU cycles when running 128 voices in games, it has the best DACs of any X-Fi card (including the Fatal1ty), and the sound and 3D are perfect... especially when played on 7.1 speakers.

Anything I can get that frees up CPU cycles is a good thing in my book.
Why?
Because it's the one thing (free CPU cycles) you can never have too much of.

And so far all the CPU sound engines I have seen have left a lot to be desired... or been lag-lag-lag-lagging ;)
 
I've decided to go for a standard X-Fi without the X-RAM.

AFAIK all X-Fis have X-RAM; the lower-end ones only have 2 MB, though:
http://www.xtremesystems.org/forums/showthread.php?t=93548

As for OpenAL, I haven't heard anyone other than Creative talking about a wrapper.
In fact, I haven't even seen normal OpenAL drivers (that actually work) from anyone besides Creative.

Not that an onboard sound chip is any good in DirectSound anyway... a wrapper would only make it worse.
And unless you use OpenAL (regarding Vista), all sounds go through the Vista software sound mixer...
 
As for OpenAL, I haven't heard anyone other than Creative talking about a wrapper. In fact, I haven't even seen normal OpenAL drivers (that actually work) from anyone besides Creative.

Nvidia has OpenAL drivers. Everyone else uses an OpenAL -> DirectSound3D wrapper. No sound degradation, but more CPU cycles.
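If you want to check what you're actually getting, OpenAL will tell you which implementation the default device maps to; a quick sketch (standard alc* calls, nothing exotic):

```cpp
#include <AL/al.h>
#include <AL/alc.h>
#include <cstdio>

// Print which OpenAL implementation a game lands on. On a Creative card
// the default device is the native driver; on most onboard chips you get
// "Generic Software" (OpenAL layered on DirectSound and mixed on the CPU)
// or "Generic Hardware".
int main()
{
    ALCdevice* device = alcOpenDevice(nullptr);  // nullptr = default device
    if (!device)
        return 1;

    std::printf("OpenAL device: %s\n", alcGetString(device, ALC_DEVICE_SPECIFIER));

    alcCloseDevice(device);
    return 0;
}
```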

DOOM3 ver. 1.3 & DOOM3:ROE
Quake 4
PREY
UT2004
BF2, BF2:SF & BF2142

Some of the games on that list have OpenAL support but do not have X-RAM support.


Back to the OP: you made the right choice. No one should get anything higher than the XtremeMusic unless they have higher-quality speakers (i.e., not computer speakers); then they should invest in the Elite Pro.
 
Some Quake 3 engine games have OpenAL too. Jedi Knight 2 and Jedi Academy, for example.

X-RAM is never going to go anywhere. They have a hard enough time getting decent EAX support, let alone X-RAM. This is especially true when there aren't really any tangible benefits to X-RAM.

DAC quality doesn't impress most people either. To sell sound cards, Creative will need to develop some amazingly noticeable audio, and unfortunately I don't really think that's possible. I mean, just how many people out there are perfectly happy with an ultra-cheap CD player for their music needs? How many gamers have $30 Logitech speakers? An X-Fi Elite Pro isn't going to impress them, especially at its price.

Sound cards have offered rather intangible improvements since the end of ISA, and more so with the end of MIDI. MIDI quality used to be such an easily improvable area for companies to develop.

The difference just isn't there anymore. People just don't care. I see people retiring SBLive! cards in favor of their onboard mobo audio. Some of them even think mobo audio sounds better. That tells you how subjective this stuff is too.

I personally have never relied on mobo audio. It is noticeably awful to me. But I'm spoiled. I got addicted to sound cards back in the MIDI days. Hearing Doom and TIE Fighter with wavetable was awesome compared to FM. SBLive was a lot better for Unreal than those old ISA cards. My Audigy 2 that I got for cheap is a lot better than Live! But, right now, I don't see a worthy reason to move to X-Fi.
 
I think your post is a bit misdirected, Ateo. I own a Fatal1ty myself, and I would love it if every developer supported it to the fullest extent, but I don't see that ever happening unless an X-Fi finds its way into the majority of gaming rigs. As great as the chip is, I wouldn't be surprised if it is the last of its breed. If EAX were still an open standard and there were competition in the audio board market, things might be different.

With CPUs I see dual- and eventually quad-core becoming more or less standard for gaming rigs, so developers will be able to utilize them. You don't need a sound board, but you do need a CPU, and if the CPU at the sweet spot in your price range just happens to be a quad-core, people will buy it. Considering how recently dual cores were introduced, 10% is very impressive when it comes to PC hardware.

Yes, an MT game engine is hard to develop, but if it can give us high-quality sound in most games that "just works", even if you're using that onboard Realtek, it'll be better for the average PC gamer. Developers will eventually be using a lot of that extra power for something, and I'd much prefer they not continue to shove audio into its darker-by-the-year corner.
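To be concrete about what I mean by using an extra core for audio, nothing exotic is required: just the mixer running on its own thread, feeding the output device while the game thread only posts events. A toy sketch (all the names here are made up, not from any real engine):

```cpp
#include <algorithm>
#include <atomic>
#include <chrono>
#include <thread>
#include <vector>

// Toy "audio on its own core": a dedicated worker thread mixes voices and
// hands finished blocks to the OS while the game thread does everything else.
class AudioEngine {
public:
    AudioEngine() : running_(true), worker_([this] { run(); }) {}
    ~AudioEngine() { running_ = false; worker_.join(); }

private:
    void run()
    {
        std::vector<float> block(512);
        while (running_) {
            mixVoices(block);       // reverb, occlusion, HRTF... all in software
            submitToDevice(block);  // in a real engine this paces the loop
        }
    }

    void mixVoices(std::vector<float>& block)
    {
        std::fill(block.begin(), block.end(), 0.0f);  // sum active voices here
    }

    void submitToDevice(const std::vector<float>& block)
    {
        (void)block;  // hand off to WASAPI/OpenAL/etc.; here we just sleep
        std::this_thread::sleep_for(std::chrono::milliseconds(10));
    }

    std::atomic<bool> running_;
    std::thread       worker_;
};

int main()
{
    AudioEngine engine;                                   // mixer occupies its own core
    std::this_thread::sleep_for(std::chrono::seconds(1)); // the game loop would go here
}
```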

Valve's sound stutter issue is caused by a stupidly overconservative caching setup that should have been rewritten two years ago. It's not the fault of the sound engine per se; it's just another one of those issues that exists because not everybody has a kick-ass PC with loads of RAM, so we all suffer. Regarding Doom 3, I do try to write my posts carefully, but for some reason people always seem to see things to argue about that simply aren't there. I said the engine would have to be equivalent to high-end EAX for me to be satisfied. Doom 3's original sound engine was not. Nor is Valve's engine, or any other software engine I have experience with.
 
I'm not sure Doom 3 even used DirectSound initially. It had a VERY basic audio engine. Then Creative came around and twisted id's arm with the threat of a lawsuit (over shadowing techniques), and id allowed them to implement EAX 4.
 
I hope, though, that we start seeing better audio engines in future games, no matter what method developers have to use (EAX, X-RAM, or just DS3D/OpenAL/another API). For a while there (almost a decade ago) it looked like audio engines would explode the way graphics have, but it died off pretty quickly. I'm not sure whether Creative is to blame or the market itself, but here's hoping for a turnaround.
 