SwiftShader 2.0: A DX9 Software Rasterizer that runs Crysis

Zero-area textures? Are you sure?
It creates a one-dimensional texture, as far as I recall, but that is perfectly legal within D3D. You can have 1D, 2D and 3D textures, if the hardware supports it. In your case you should be able to support everything.

Did you read the docs?
http://msdn2.microsoft.com/en-us/library/bb174363.aspx

"Width
[in] Width of the top-level of the texture, in pixels. The pixel dimensions of subsequent levels will be the truncated value of half of the previous level's pixel dimension (independently). Each dimension clamps at a size of 1 pixel. Thus, if the division by 2 results in 0, 1 will be taken instead.
Height
[in] Height of the top-level of the texture, in pixels. The pixel dimensions of subsequent levels will be the truncated value of half of the previous level's pixel dimension (independently). Each dimension clamps at a size of 1 pixel. Thus, if the division by 2 results in 0, 1 will be taken instead."

Zero-area textures should not exist.
 
Could you also explain how you fixed the problem in Trials 2 Second Edition? Was there a bug in our texture management code, or did my code use some corner-case functionality that was not supported by SwiftShader?
 
Nice work, and so quickly!
You're welcome.
Is the evaluation version on the website updated now, or could you give us an estimated time for the next release? We are currently evaluating SwiftShader for the game. Many of our customers have pretty fast business laptops (good CPU but bad GPU), and the game needs SM2.0 support with MRT rendering support (basically fully featured DX9 hardware).
There's no updated version online yet, but we're working hard on it. When it will be released depends mainly on which bugs we're still going to address and how long the testing takes. It's totally not my decision, but I guess an update could be online by the end of the week or so.

Anyway, if you want a development build for testing as soon as possible, feel free to contact swiftshader@transgaming.com.
 
Well I've always wondered about that. I think it's pretty safe to assume that you'll at least need a Core2 Duo system to get any kind of reasonable SM2.0-performance out of SwiftShader, no matter how simple the graphics (even that DolphinVS demo doesn't run all that fast, and it doesn't get much simpler than that). The worst you could have in such a system is an Intel IGP, I suppose.
I don't want to talk bad about any particular hardware, but there are still lots of GeForce 4, GeForce FX 5200, Radeon 8500, Radeon 9000, Radeon 9200, Intel Extreme Graphics and Intel Extreme Graphics 2 graphics cards out there (plus some S3, SiS and Matrox cards and probably some others I'm forgetting). These don't support Shader Model 2.x and some even don't support Shader Model 1.x.

I don't think you need a Core 2 Duo either. You need one (or equivalent) for something like Half-Life 2, but casual games are typically a lot less demanding. In fact, most are happy with a very modest framerate. I can also assure you that when we optimize for our clients' applications the performance can be significantly higher than what is reached by SwiftShader 2.0. This release marks the beginning of a new generation, not the end of it.
I happen to have one of those myself, in my laptop with a 1.5 GHz Core2 Duo processor, an X3100. Not only is the X3100 significantly faster in most applications than SwiftShader is, it also seems to have much better DX8/DX9 driver support. Pretty much all DX8/DX9 demos that I threw at SwiftShader bugged at one point or another, from rendering wrongly to just crashing altogether. Everything works flawlessly on the Intel. Now surely, the Intel drivers are far from perfect themselves, but my first impression is that they're still leaps and bounds ahead of SwiftShader.
Granted, it isn't flawless, but look at how quickly I was able to address some issues. The turnaround time is extremely small compared to hardware driver bugs. For many casual game development studios it's just not an option to work closely together with the IHVs. And they can never really be sure that some combination of new or old drivers and new or old hardware isn't going to fail.

In that respect using SwiftShader is kind of like console development. The only thing that can vary is performance, but if it runs adequately on the minimum specification system targeted, it's as easy as dropping in the DLL, testing once, and you can rest easy.
Note also that the X3100 is technically an SM4.0 part. There are no DX10 drivers yet, but they were scheduled for Q1'08, so they should arrive any day now. That will expand the installed base of DX10-capable computers tremendously, as Intel is one of the largest players in the GPU world. It may also have a positive effect on performance and compatibility, as DX10 has a much cleaner and simpler driver model, which leaves less room for bugs and suboptimal implementations. I can't wait to install those drivers and try Crysis... and my own DX10 stuff of course.
In my opinion it's still going to take years before the average game is going to have a Direct3D 10 path. The whole Vista-only thing actually ensures that anyone who bought an XP system not too long ago is going to demand Direct3D 9 support. And judging purely by the graphical effects possible I also don't think that Direct3D 10 is that much of a leap forward. So Direct3D 9 is going to stick around for quite some time. Especially for casual games this pretty much guarantees a solid future for SwiftShader.

And you're right, Intel has a huge installed base. But don't let that mislead you to think that SwiftShader needs a market of the same scale to be worth its salt. We keep aiming high and the casual games market alone is in fact huge but there's no problem with starting small. As you know, the project has come a long way, and I'm absolutely sure that with every step the potential grows.
So am I correct that with my Intel-powered laptop and 'fast' dual core I should be in your target market? In which case, why do I not feel like SwiftShader offers me anything over my poor cheap X3100? Well, it does generate more heat, and I get to switch the batteries more often, so in wintertime it's nice, and the forced pauses reduce RSI, I suppose.
No, your system is not part of the target market. That X3100 might look cheap to you but for someone who only sees computers as a means to communicate with friends and play a game of poker or whatnot, an upgrade or a new system is only an option if something actually breaks apart or it becomes too slow to finish a simple task within the time of a coffee break. Don't mistake your average computer use with that of the average computer user.

And there's not only (casual) gaming. For some professional fields SwiftShader offers possibilities that would otherwise require expensive custom hardware solutions or at least specialized drivers. There are also many desktop applications that are making the move to 3D. For example Ashampoo Burning Studio 7 uses SwiftShader to create interfaces for DVD movies. It doesn't necessarily have to be real-time but it offers more than adequate performance for these applications.
 
Impressive work Nick, I am really impressed.
Is it still a solo effort or were you joined by other programmers?
 
Zero-area textures? Are you sure?
It creates a one-dimensional texture, as far as I recall, but that is perfectly legal within D3D. You can have 1D, 2D and 3D textures, if the hardware supports it. In your case you should be able to support everything.
Yes I'm sure. To be exact it calls CreateTexture(UINT Width = 0, UINT Height = 0, UINT Levels = 1, DWORD Usage = D3DUSAGE_RENDERTARGET, D3DFORMAT Format = D3DFMT_X8R8G8B8, D3DPOOL Pool = D3DPOOL_DEFAULT, IDirect3DTexture9** ppTexture = 0x004B4F60, HANDLE* pSharedHandle = 0).
Did you read the docs?
http://msdn2.microsoft.com/en-us/library/bb174363.aspx

"Width
[in] Width of the top-level of the texture, in pixels. The pixel dimensions of subsequent levels will be the truncated value of half of the previous level's pixel dimension (independently). Each dimension clamps at a size of 1 pixel. Thus, if the division by 2 results in 0, 1 will be taken instead.
Height
[in] Height of the top-level of the texture, in pixels. The pixel dimensions of subsequent levels will be the truncated value of half of the previous level's pixel dimension (independently). Each dimension clamps at a size of 1 pixel. Thus, if the division by 2 results in 0, 1 will be taken instead."
Yes, of course I've read the documentation. I also checked REF behavior and it returns D3DERR_INVALIDCALL. But that doesn't keep the application from crashing. The clamp-to-1 rule only applies when generating mipmap levels, not to the top level.
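For what it's worth, a minimal sketch of the distinction (plain D3D9 C++; the helper name is just illustrative):

#include <d3d9.h>

// The clamp-to-1 rule governs the generated mip levels:
//   level N width  = max(1, Width  >> N)
//   level N height = max(1, Height >> N)
// It never applies to the top level, so a zero width or height has to be rejected.
HRESULT CreateCheckedRenderTarget(IDirect3DDevice9 *device, UINT width, UINT height,
                                  IDirect3DTexture9 **texture)
{
    if (width == 0 || height == 0)
        return D3DERR_INVALIDCALL;   // matches what REF returns for this call

    return device->CreateTexture(width, height, 1, D3DUSAGE_RENDERTARGET,
                                 D3DFMT_X8R8G8B8, D3DPOOL_DEFAULT, texture, NULL);
}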

Anyway, I might have found the actual cause. The documentation says the following about CreateDevice:
If BackBufferCount, BackBufferWidth, and BackBufferHeight are 0 before the method is called, they will be changed when the method returns.
It doesn't explicitly say that they will be changed to the actual window resolution, but that seems logical and I assume your application blindly relies on it. Should be a quick fix...

Given how many advanced games actually run flawlessly with SwiftShader, I'm surprised how many corner cases your demos bump into. I'm really thankful for that as it may fix harder-to-track bugs later. ;)
 
I don't want to talk bad about any particular hardware, but there are still lots of GeForce 4, GeForce FX 5200, Radeon 8500, Radeon 9000, Radeon 9200, Intel Extreme Graphics and Intel Extreme Graphics 2 graphics cards out there (plus some S3, SiS and Matrox cards and probably some others I'm forgetting). These don't support Shader Model 2.x and some even don't support Shader Model 1.x.

Yes, but these can only be found in really old PCs, with a CPU that by default won't be fast enough for decent SwiftShader performance. People who have such poor videocards are not the kind of people who will upgrade a motherboard or CPU. They just buy a new prefab computer, which, aside from a CPU fast enough for SwiftShader, will also have an IGP with more features and better performance than SwiftShader.

I don't think you need a Core 2 Duo either. You need one (or equivalent) for something like Half-Life 2, but casual games are typically a lot less demanding.

Contrary to what you might think, Half-Life 2 is not actually a heavy game to run. It is designed to run nearly everything in a single pass, which is also why it runs at hundreds of FPS on a modern GPU, while games from that era like Far Cry and Doom 3 get only a fraction of that framerate.
A casual game probably renders everything in just one or two passes as well, and therefore its performance should be dictated mostly by the resolution and texture quality, just like Half-Life 2's is.

In my opinion it's still going to take years before the average game is going to have a Direct3D 10 path. The whole Vista-only thing actually ensures that anyone who bought an XP system not too long ago is going to demand Direct3D 9 support. And judging purely by the graphical effects possible I also don't think that Direct3D 10 is that much of a leap forward. So Direct3D 9 is going to stick around for quite some time. Especially for casual games this pretty much guarantees a solid future for SwiftShader.

Regardless, I think you should support DX10 sooner rather than later. It may be more efficient and less error-prone to implement, and it might lead to more efficient rendering (more things in a single pass, focusing more on arithmetic than texturing). At least be ready when the time comes.
I think games with a DX10 path are already pretty common; a lot of games currently in development include one. They might not be DX10-exclusive yet, and that may take years, but if you can take advantage of DX10, why not?

No, your system is not part of the target market. That X3100 might look cheap to you but for someone who only sees computers as a means to communicate with friends and play a game of poker or whatnot, an upgrade or a new system is only an option if something actually breaks apart or it becomes too slow to finish a simple task within the time of a coffee break. Don't mistake your average computer use with that of the average computer user.

Well, it was the cheapest graphics option I could get, so anyone who buys a computer now should get the same IGP or better, I suppose. And this chip has been on the market for well over a year, if I'm not mistaken.
 
It doesn't explicitly say that they will be changed to the actual window resolution, but that seems logical and I assume your application blindly relies on it. Should be a quick fix...

Ah yes, a render target, not just a texture. Indeed, I always rely on that in windowed mode. Most games don't run in windowed mode anyway, so perhaps you've never encountered it.
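Roughly this pattern, for reference (a sketch of my windowed-mode setup; assumes <d3d9.h>, with d3d (IDirect3D9*) and hWnd created elsewhere):

D3DPRESENT_PARAMETERS pp = {};
pp.Windowed         = TRUE;
pp.SwapEffect       = D3DSWAPEFFECT_DISCARD;
pp.BackBufferFormat = D3DFMT_UNKNOWN;  // take the current display format
pp.BackBufferWidth  = 0;               // 0 = use the window's client size
pp.BackBufferHeight = 0;
pp.hDeviceWindow    = hWnd;

IDirect3DDevice9 *device = NULL;
HRESULT hr = d3d->CreateDevice(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, hWnd,
                               D3DCREATE_HARDWARE_VERTEXPROCESSING, &pp, &device);

// Per the documentation quoted above, on return the runtime has overwritten
// pp.BackBufferWidth/Height with the actual client-area dimensions, and the
// rest of my code simply reads them back from pp.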

Given how many advanced games actually run flawlessly with SwiftShader, I'm surprised how many corner cases your demos bump into. I'm really thankful for that as it may fix harder-to-track bugs later. ;)

Perhaps those games aren't that advanced after all...
 
It would be interesting to know how this application behaves on a Phenom processor.

Alright, here you go:

cubemap.exe @ 1280x960 (fps)

default: 13
1 thread: 11.5
2 threads: 16
3 threads: 13
4 threads: 15
IGP: 130

spheremap.exe @ 1280x960 (fps)

default: 23
1 thread: 16
2 threads: 23.5
3 threads: 22
4 threads: 22.5
IGP: 235

Phenom X4 9600 (2.3 GHz) with HD3200 IGP (780 chipset)
4 GB RAM
Vista 64

Not sure if this CPU has the TLB bug. Any simple way to tell?
 
Could you also explain how you fixed the problem in Trials 2 Second Edition? Was there a bug in our texture management code, or did my code use some corner-case functionality that was not supported by SwiftShader?
It was a bug in SwiftShader's UpdateSurface implementation. Indeed a corner case that apparently was never hit by any other application, with a hazy description in the SDK documentation. But no worries, the bug has been addressed entirely and Trials 2 Second Edition works very nicely.

Fun game too! It's a bit hard to get used to the controls though. Is this an evolution of a Flash game using the same concept?
 
Not sure if this CPU has the TLB bug. Any simple way to tell?

Phenom 9600 -> bug
Phenom 9650 -> fixed stepping

It's probably performing so poorly because the TLB fix is applied. This greatly reduces memory access speed.
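If you'd rather check programmatically, here's a quick sketch (C++ with MSVC's __cpuid intrinsic; as far as I know the affected B2 parts report family 10h with stepping 2, while the fixed B3 parts report stepping 3):

#include <intrin.h>
#include <stdio.h>

int main()
{
    int regs[4];
    __cpuid(regs, 1);                       // leaf 1: family/model/stepping in EAX

    unsigned eax      = (unsigned)regs[0];
    unsigned stepping =  eax        & 0xF;
    unsigned family   = (eax >> 8)  & 0xF;
    if (family == 0xF)
        family += (eax >> 20) & 0xFF;       // extended family: Phenom reports 10h

    printf("Family %Xh, stepping %u\n", family, stepping);
    if (family == 0x10 && stepping == 2)
        printf("B2 stepping -> TLB erratum present (the workaround is what costs performance)\n");
    return 0;
}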
 
Funny how people assume that Larrabee is anything like a Core2 or such, as soon as the term 'x86' is dropped.
Larrabee has a *subset* of the x86 ISA, with special extensions for more efficient geometry processing and shading, does not incorporate out-of-order execution, and is coupled with special fixed-function hardware for (GP)GPU tasks.
Funny how journalists seem to think "This is like a Core2 <ubermanycoreshere>" instead of what it actually is: nothing like what we currently know as x86.

The reason why Intel used a subset of x86? Probably because they already have all that technology on the shelf, and there's no point in reinventing the wheel. A trimmed-down x86 instruction set will work just about as well as whatever other instruction set they could have come up with.

I don't see a point in running SwiftShader on something like that, at least not in its current form. The fixed-function hardware can rasterize and divide the workload more efficiently, and the added instructions would mean that SSE-like code would be inadequate.
Yes, the basic idea of compiling shaders to x86-like code on the fly is the same... but then again, all GPUs (well, their drivers) compile shaders on the fly, they just don't use an x86-like instruction set... as if that matters anyway.
 
Impressive work Nick, I am really impressed.
Is it still a solo effort or were you joined by other programmers?
Thanks nAo! I still do the great bulk of research, design and implementation. But I regularly exchange valuable ideas with TransGaming's CTO, Gavriel State. And there's obviously overlap in some of the technology used by the Cider and Cedega teams. It's an agile company, so if necessary we can probably have more people working on SwiftShader in a very short time.
 
Yes, but these can only be found in really old PCs, with a CPU that by default won't be fast enough for decent SwiftShader performance.
Not necessarily. One of my friends has a Pentium 4 2.6 GHz with a GeForce 4200 Ti, my girlfriend has a laptop with a Pentium M 1.7 GHz and an Intel 865G, a nephew has an Athlon XP 3000+ and a Radeon 380 IGP, etc. Their systems are not ready for the junk yard yet, but they already run into situations where applications simply don't run. Those CPUs are quite capable of running casual games and 3D desktop applications though.

Ok, my personal surrounding is not much of a reference, but it's one of the things that keep me motivated. Most of us on this forum upgrade our hardware yearly but I'm convinced that others plan on using the same hardware for five years or so.

Again, don't misjudge the scale of the market. It doesn't have to be for everybody and his dog to make the product viable. But at the same time the people who play games only casually are the most likely to have somewhat outdated hardware.
Contrary to what you might think, Half-Life2 is not actually a heavy game to run. It is designed to run nearly everything in a single pass, which is also why it runs at hundreds of FPS on a modern GPU, while games from that era like Far Cry and Doom3 get only a fraction of that framerate.
Your point being? Half-Life 2 is one of my all-time favorite games. I don't really care if it's heavy or not. It's a fun game and it's playable with software rendering on a recent CPU. It tells me that things are right on track to extend the market from just casual games to something a little more serious in the not too distant future. Sure, by that time there will be new "heavy" games, but there will also be more fun games that are not cutting edge that can run on the CPU. Max Payne used to be a "heavy" game...

You might find some interesting statistics here: Valve Survey - November 2007. 11.42% defaults to the DirectX 8 path, and still even 4.14% use the DirectX 7 path. This pretty much means that every game that requires DirectX 9 features is missing 15% of the market. Luckily Source supports it, but game engines with DX8/7 paths are a dying breed. For casual game developers supporting all that hardware is a serious issue. SwiftShader allows developers to write just one DirectX 9 path (saving time and money), use more effects (beating the competition at visuals), increase the target audience by 15%, reduce QA costs, and reduce support calls. I'm not responsible for marketing, but the advantages are readily clear to me.

I'm not blind to the limitations, trust me, but it doesn't help anybody to complain about them. I'm actively improving things and I see the opportunities growing, and that's what really matters.
 
Funny how people assume that Larrabee is anything like a Core2 or such, as soon as the term 'x86' is dropped.
Larrabee has a *subset* of the x86 ISA, with special extensions for more efficient geometry processing and shading, does not incorporate out-of-order execution, and is coupled with special fixed-function hardware for (GP)GPU tasks.
Funny how journalists seem to think "This is like a Core2 <ubermanycoreshere>" instead of what it actually is: nothing like what we currently know as x86.

The reason why Intel used a subset of x86? Probably because they already have all that technology on the shelf, and there's no point in reinventing the wheel. A trimmed-down x86 instruction set will work just about as well as whatever other instruction set they could have come up with.

I don't see a point in running SwiftShader on something like that, at least not in its current form. The fixed-function hardware can rasterize and divide the workload more efficiently, and the added instructions would mean that SSE-like code would be inadequate.
Yes, the basic idea of compiling shaders to x86-like code on the fly is the same... but then again, all GPUs (well, their drivers) compile shaders on the fly, they just don't use an x86-like instruction set... as if that matters anyway.

I know that people would hate it, and after reading a bit more about writing asm for it I can see one source of irritation (manually checking for dependency/stall conditions between the three instructions in a bundle and between two bundles), but I would like to hear more about why a single-issue (one 128-bit instruction bundle at a time instead of two) IA-64-based solution, plus maybe some vector extensions, would have been so bad for Larrabee's cores (with x86 hardware compatibility handled fully in software, as is now standard practice in the IA-64 world)...
 
Not necessarily. One of my friends has a Pentium 4 2.6 GHz with a GeForce 4200 Ti, my girlfriend has a laptop with a Pentium M 1.7 GHz and an Intel 865G, a nephew has an Athlon XP 3000+ and a Radeon 380 IGP, etc. Their systems are not ready for the junk yard yet, but they already run into situations where applications simply don't run. Those CPUs are quite capable of running casual games and 3D desktop applications though.

No offense, but these ARE very old computers, and MUCH less powerful than a Core2 Duo.
They can run games and 3d stuff, but ONLY when they have a 3d accelerator.

Ok, my personal surrounding is not much of a reference, but it's one of the things that keep me motivated. Most of us on this forum upgrade our hardware yearly but I'm convinced that others plan on using the same hardware for five years or so.

Well, yes, I have an old GPU around myself: a Radeon 9600XT. It will be 5 years old this year (although it was preceded by the 9500-series with pretty much the same performance level).
It's in an Athlon XP 1800+. You don't want to know how the Radeon compares to SwiftShader on that box. In fact, even with the fastest CPU available today, SwiftShader can't get anywhere near the performance of that old beast (ironically it was supposed to be the direct competitor to the FX5600/5700). It only cost me about €120 back in the day, so it wasn't exactly an expensive high-end card either. This type of card could also be found in many OEM machines, which were more or less mid-range.

I also have a laptop from that era, with a Celeron 1.6 GHz CPU and a Radeon IGP 340M. That one is even slower with software rendering, and although the IGP isn't all that fancy, it helps to relieve the CPU and enable some modest gaming that way. It cannot run Half-Life 2 very well though, and I doubt SwiftShader would be an improvement.
The system barely has enough processing power to play a DVD without acceleration, leaving the rest of the system helpless and unresponsive. Luckily the IGP has DVD acceleration though.
Oh, and it actually renders the reflecting spheres and shadow volumes demo properly, at a reasonable framerate of about 30 fps. Care to fix SwiftShader so we can compare it?

Bottom line is, even WITH a GPU these systems aren't all that game-worthy, even with simple, old games. With the CPU doing software rendering... well, you'll be lucky if you can just clear the framebuffer and z-buffer at 60 fps in 1024x768, if you know what I mean. And that means there is already NO time left for any game logic.

What you're really looking at is having today's high-end CPUs running SwiftShader in 5 years, compared to the IGPs we have then. Because any older CPU is just inadequate.
Unless you can name me some nice 'casual games' that would be great to try on these systems with SwiftShader?

Your point being? Half-Life 2 is one of my all-time favorite games. I don't really care if it's heavy or not.

I think you should, if you want to use it as an indication of how well SwiftShader can handle games. Far Cry will be considerably heavier on SwiftShader, even though it's actually an older game.

You might find some interesting statistics here: Valve Survey - November 2007. 11.42% defaults to the DirectX 8 path, and still even 4.14% use the DirectX 7 path. This pretty much means that every game that requires DirectX 9 features is missing 15% of the market. Luckily Source supports it, but game engines with DX8/7 paths are a dying breed. For casual game developers supporting all that hardware is a serious issue. SwiftShader allows developers to write just one DirectX 9 path (saving time and money), use more effects (beating the competition at visuals), increase the target audience by 15%, reduce QA costs, and reduce support calls. I'm not responsible for marketing, but the advantages are readily clear to me.

You don't have to convince me, you'll have to convince all those game developers. Any luck there yet? Do you have more than 15% of the market yet?
 
No offense, but these ARE very old computers, and MUCH less powerful than a Core2 Duo.
That's very relative. My girlfriend's Dell Inspiron 510m is merely three years old and its Pentium M is the direct ancestor of the Core Solo. She has no plans to replace it any time soon and I'm sure lots of other people don't consider a three year old system very old yet.
They can run games and 3d stuff, but ONLY when they have a 3d accelerator.
We had 3D games long before 3D accelerators, using CPUs far less powerful. And for desktop applications like Burning Studio 7 and such a CPU several generations older than a Pentium M would suffice.
It only cost me about €120 back in the day, so it wasn't exactly an expensive high-end card either.
That might be cheap to you, but you have to realize that for the main market SwiftShader addresses, people are only willing to spend the minimum on gaming hardware. If I look around in local computer stores I see piles of graphics cards for €50 and less. There's clearly a market for these cards, and they're sold to people who simply ask for the cheapest thing that can run Direct3D 9. For casual games and 3D desktop applications they could even save that cost.
I think you should, if you want to use it as an indication of how well SwiftShader can handle games. Far Cry will be considerably heavier on SwiftShader, even though it's actually an older game.
They are not casual games. Running Half-Life 2 and the like is merely to illustrate full Shader Model 2.0 support, and indicate the potential for the future.

So I really think you're looking at the wrong games. A good candidate in my opinion would be a game like Chess Titans, which comes with Vista Premium but not Vista Basic. It requires DirectX 9 and looks visually appealing while not demanding a high framerate.
You don't have to convince me, you'll have to convince all those game developers. Any luck there yet? Do you have more than 15% of the market yet?
It's going absolutely great actually. And you don't even have to look very far. Trials 2 Second Edition doesn't run on my girlfriend's laptop without SwiftShader. With it, it runs flawlessly. I have to admit it's a bit sluggish but with some tuning it can likely be made to run smoothly. On my Mac mini (Core Duo 1.66 GHz, GMA 950) it hangs during loading, but with SwiftShader it runs perfectly.
 
That's very relative. My girlfriend's Dell Inspiron 510m is merely three years old and its Pentium M is the direct ancestor of the Core Solo. She has no plans to replace it any time soon and I'm sure lots of other people don't consider a three year old system very old yet.

There's a difference between 'old' and 'outdated' though. My brother's Pentium D is only a few months older than my Core2 Duo, but it surely is outdated. Even though his is clocked at 4.2 GHz and mine at 3 GHz, and he also has 2 cores and 4 MB of L2 cache in total, mine is WAY, WAY faster; it's just a whole new dimension of performance.
I meant 'outdated' more than 'old' in terms of when it was bought and how long it has been used.

We had 3D games long before 3D accelerators, using CPUs far less powerful. And for desktop applications like Burning Studio 7 and such a CPU several generations older than a Pentium M would suffice.

I think I know more about that than you. Back in the day I was actually spinning them donuts and cubes on Amiga, 486 and Pentium, remember?
I lived through that whole revolution of accelerated graphics, and I am well aware of the fact that 3d games changed forever, and there is no way back. Michael Abrash also provided ample evidence with PixoMatic.

That might be cheap to you, but you have to realize that for the main market SwiftShader addresses, people are only willing to spend the minimum on gaming hardware. If I look around in local computer stores I see piles of graphics cards for €50 and less. There's clearly a market for these cards, and they're sold to people who simply ask for the cheapest thing that can run Direct3D 9. For casual games and 3D desktop applications they could even save that cost.

You missed my point that these cards were standard in many Dells, Compaqs, HPs, etc. So these people don't have to pay extra. Only the ones with the REALLY cheap boxes do (there's often a difference between business machines and consumer machines there, by the way).
Even so, an investment of a few euros would greatly improve their gaming capabilities and overall experience.

So I really think you're looking at the wrong games. A good candidate in my opinion would be a game like Chess Titans, which comes with Vista Premium but not Vista Basic. It requires DirectX 9 and looks visually appealing while not demanding a high framerate.

Since Vista Aero already requires DX9, I think it's not very surprising that their chess game does as well. Any other examples?
Because technically I can't imagine why you'd need SM2.0 for a chess game... Why wouldn't DX7 graphics be just as good? In Chess Titans I see little more than some Gouraud shading and a planar reflection. That doesn't need shaders.

It's going absolutely great actually. And you don't even have to look very far. Trials 2 Second Edition doesn't run on my girlfriend's laptop without SwiftShader. With it, it runs flawlessly. I have to admit it's a bit sluggish but with some tuning it can likely be made to run smoothly. On my Mac mini (Core Duo 1.66 GHz, GMA 950) it hangs during loading, but with SwiftShader it runs perfectly.

So you managed to sell a copy to your girlfriend?
 
Nick and Scali, your discussion is really interesting; sorry to interrupt.

I have a little question, speaking of casual gaming: what kind of CPU would be needed to match the Wii's capabilities?

EDIT
I know that Hollywood is more of a DirectX 7 GPU, but my question is more about what it would take to achieve results in the same ballpark ;)
 