Not necessarily. One of my friends has a Pentium 4 2.6 GHz with a GeForce 4200 Ti, my girlfriend has a laptop with a Pentium M 1.7 GHz and an Intel 865G, a nephew has an Athlon XP 3000+ and a Radeon 380 IGP, etc. Their systems aren't ready for the junkyard yet, but they already run into situations where applications simply don't run. Those CPUs are quite capable of running casual games and 3D desktop applications though.
No offense, but these ARE very old computers, and MUCH less powerful than a Core 2 Duo.
They can run games and 3D stuff, but ONLY when they have a 3D accelerator.
OK, the people around me aren't much of a reference, but they're one of the things that keeps me motivated. Most of us on this forum upgrade our hardware yearly, but I'm convinced that others plan on using the same hardware for five years or so.
Well, yes, I have an old GPU around myself: a Radeon 9600XT. It will be 5 years old this year (although it was preceded by the 9500-series with pretty much the same performance level).
It's in an Athlon XP 1800+. You don't want to know how the Radeon compares to SwiftShader on that box. In fact, even with the fastest CPU available today, SwiftShader can't get anywhere near the performance of that old beast (ironically, it was supposed to be the direct competitor to the FX5600/5700). It only cost me about €120 back in the day, so it wasn't exactly an expensive high-end card either. This type of card could also be found in many OEM machines, which were more or less mid-range.
I also have a laptop from that era, with a Celeron 1.6 GHz CPU and a Radeon IGP 340M. That one is even slower with software rendering, and although the IGP isn't all that fancy, it helps relieve the CPU and enables some modest gaming that way. It can't run Half-Life 2 very well though, and I doubt SwiftShader would be an improvement.
The system barely has enough processing power to play a DVD without acceleration, and doing so leaves the rest of the system helpless and unresponsive. Luckily the IGP provides DVD acceleration.
Oh, and it actually renders the reflecting spheres and shadow volumes demo properly, at a reasonable framerate of about 30 fps. Care to fix SwiftShader so we can compare it?
Bottom line is, even WITH a GPU these systems aren't all that game-worthy, even for simple, old games. With the CPU doing software rendering... well, you'll be lucky if you can just clear the framebuffer and z-buffer at 60 fps in 1024x768, if you know what I mean. And that means there's already NO time left for any game logic.
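To put a rough number on that, here's a back-of-envelope sketch in C++ (assuming a 32-bit color buffer and a 32-bit z-buffer; the figures are illustrative, not measurements of SwiftShader):

```cpp
#include <cstdio>

int main()
{
    // Rough estimate of the memory traffic needed just to clear the
    // color buffer and z-buffer every frame at 1024x768, 60 fps,
    // assuming 4 bytes of color and 4 bytes of depth per pixel.
    const double width = 1024.0, height = 768.0, fps = 60.0;
    const double bytesPerPixel = 4.0 /*color*/ + 4.0 /*depth*/;
    const double bytesPerSecond = width * height * bytesPerPixel * fps;

    std::printf("Clears alone: %.0f MB/s of writes, before a single "
                "triangle is shaded\n", bytesPerSecond / (1024.0 * 1024.0));
    return 0;
}
```

That works out to roughly 360 MB/s of writes for the clears alone, which is already a noticeable slice of the memory bandwidth a DDR266-era Athlon XP has to work with, before any actual rasterization or game logic runs.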
What you're really looking at is today's high-end CPUs running SwiftShader five years from now, compared against the IGPs we'll have by then, because any older CPU is simply inadequate.
Unless you can name me some nice 'casual games' that would be great to try on these systems with SwiftShader?
Your point being? Half-Life 2 is one of my all-time favorite games. I don't really care if it's heavy or not.
I think you should, if you want to use it as an indication of how well SwiftShader can handle games. Far Cry will be considerably heavier on SwiftShader, even though it's actually an older game.
You might find some interesting statistics here:
Valve Survey - November 2007. 11.42% of users default to the DirectX 8 path, and 4.14% still use the DirectX 7 path. That pretty much means every game that requires DirectX 9 features is missing roughly 15% of the market. Luckily Source still supports those paths, but game engines with DX8/DX7 fallbacks are a dying breed. For casual game developers, supporting all that hardware is a serious issue. SwiftShader makes it possible to write just one DirectX 9 path (saving time and money), use more effects (beating the competition on visuals), increase the target audience by 15%, reduce QA costs, and reduce support calls. I'm not responsible for marketing, but the advantages are clear enough to me.
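For what it's worth, here's a minimal sketch of what that single DirectX 9 path with a software fallback could look like, assuming SwiftShader is shipped as a d3d9.dll alongside the game. The DLL path and the shader-model-2.0 caps threshold below are illustrative choices, not anything mandated by SwiftShader:

```cpp
#include <windows.h>
#include <d3d9.h>

// Create a hardware Direct3D 9 object if the GPU can run the game's single
// SM2.0 path; otherwise fall back to a software renderer shipped with the
// game (e.g. SwiftShader's d3d9.dll in a subfolder -- path is illustrative).
IDirect3D9* CreateD3D9WithFallback()
{
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (d3d)
    {
        D3DCAPS9 caps;
        if (SUCCEEDED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps)) &&
            caps.PixelShaderVersion >= D3DPS_VERSION(2, 0))
        {
            return d3d;   // The GPU can handle the DirectX 9 path directly.
        }
        d3d->Release();   // Hardware too old: fall back to software rendering.
    }

    // Load the software renderer's d3d9.dll explicitly and use its factory.
    HMODULE swiftShader = LoadLibraryA("SwiftShader\\d3d9.dll");
    if (!swiftShader)
        return NULL;

    typedef IDirect3D9* (WINAPI *Direct3DCreate9Func)(UINT);
    Direct3DCreate9Func create =
        (Direct3DCreate9Func)GetProcAddress(swiftShader, "Direct3DCreate9");
    return create ? create(D3D_SDK_VERSION) : NULL;
}
```

The rest of the renderer then talks to the returned IDirect3D9 interface the same way whether it came from the hardware driver or the software DLL, which is where the single-code-path saving comes from.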
You don't have to convince me; you'll have to convince all those game developers. Any luck there yet? Do you have more than 15% of the market yet?