Stagnation in the GPU and CPU market (Come on, AMD, we need you!)

I remember the GF->GF2 transition being more impressive than the 9700->9800 transition.
IIRC, that's mostly because of the strange quirk where the GF1 had "free trilinear" but couldn't do single-cycle "dual texturing". The GF2 changed this, which I always thought of as an indication that there was a flaw in the original GF1 rather than as a technological advancement in the GF2.
 
You forgot the Voodoos? ;) What was that, 2-3 years each?

GF2 was not more of an update than G92 is, and R350 even less so.

I did specify this century ;)

And the second part makes no sense, at least to me. GF2 was 40% faster than GF1 DDR at high settings. 9800 was ~30% faster than 9700. Both came 7 months after the part they beat. Twelve months after G80 launched, the 8800 GT is an across-the-board downgrade for GTX owners.
 
I did specify this century ;)

And the second part makes no sense, at least to me. GF2 was 40% faster than GF1 DDR at high settings. 9800 was ~30% faster than 9700. Both came 7 months after the part they beat. Twelve months after G80 launched, the 8800 GT is an across-the-board downgrade for GTX owners.

The 8800 GT is not the replacement for the 8800 GTX, though. That part won't be out until January. I'm not excusing the lack of a high-end replacement from NV, just re-aligning expectations.
 
9800 was ~30% faster than 9700.

Actually, initially there was a tweak in the drivers that only assisted R350 when doing AA. I'm positive that R300 was included in this later on. So, I don't think that performance improvement is quite correct.
 
Actually, initially there was a tweak in the drivers that only assisted R350 when doing AA. I'm positive that R300 was included in this later on.

IIRC the 9800 had some AA performance enhancements in silicon. At least, that's how it was marketed at launch.
 
Voodoo Rush was, more or less, 3dfx's first (failed) attempt at winning the OEM market.

Too complex and too expensive for the target audience. If Banshee had been released in that timeframe instead, I expect things might have turned out differently.
 
Actually, as a Voodoo Rush owner I can say it sucked in both 3D and 2D. 2D image quality was HORRIBLE. And while it could sometimes match a single Voodoo1 in 3D speed, it wasn't uncommon to see it run at half that speed in many 3D titles.

It was "supposed" to be as fast as a single Voodoo1, but that wasn't generally the case.

Actually, initially there was a tweak in the drivers that only assisted R350 when doing AA. I'm positive that R300 was included in this later on. So, I don't think that performance improvement is quite correct.

Either way, the 9800 Pro (and XT) performance increase wasn't enough to make me want to upgrade from a 9700 Pro. Same went for the GF2. I went Voodoo1 -> Riva TNT1 (1 week, then into a drawer to gather dust) -> Voodoo2 -> GeForce 256 (1 month, then into a drawer to gather dust) -> GeForce 3 (1 month, then into a drawer) -> V5 5500 -> GeForce 4 (used simultaneously with the V5)...

All of those were relatively minor performance upgrades compared to what we've seen in the past few cycles with NV and ATI. The only big performance upgrade during that time was going from a single Voodoo1/2 to SLI Voodoo1/2.

Regards,
SB
 
Your list confuses me. So you basically used Voodoo2s all the way up until the Voodoo5 came out, except for two months where you used a GeForce and a GeForce 3?
 
I am sure that I am not the only one who looked at today's CPU and GPU launches and is somewhat perturbed by the fact that, barring an unexpected RV670 breakthrough, the GTX might remain the fastest card on the market for a year and a half. Even when R300's unchallenged superiority allowed ATI to dominate the market, they had enough decency to update the high end from time to time. (No, the 1 beelion dollar Ultra does not count.) Value is fine and all, but as a 3D enthusiast, I can't help but be disappointed by the current state of affairs.

We have a similar situation in the CPU market. Just like G92, Penryn is a solid incremental update of an excellent architecture. However, when was the last time Intel actually raised their frequencies? They are perfectly content selling the QX9650 clocked at ~3GHz for $1000+, while the chip is perfectly capable of hitting 3.6GHz and more.

I think the current state of affairs clearly demonstrates how vital a competitive AMD is to the enthusiast landscape, provided we want our CPUs and GPUs to actually get faster year to year. Hopefully RV670 and Phenom will arrive in a timely fashion and be a step in the right direction.

This is why the merger was bad: now if one company is in the dumps, we lose CPU and GPU competition in one fell swoop.
 
This is why the merger was bad: now if one company is in the dumps, we lose CPU and GPU competition in one fell swoop.

True. Which is why everyone that is aware of AMD products should buy them to keep the company afloat :p

Seriously though, I wish more of the "informed" crowd would buy AMD products instead of Intel/NV parts, given the state of the market ATM. What's an extra 5-20% performance now if it means no competition very shortly?
 
7800 GTX, June 22, 2005
8800 GTX, November 8, 2006

Funny how some forget the recent past.

You seem to have forgotten the 7900 series, including the 7900 GTX and 7950 GX2, both of which were faster than the 7800 GTX 512, and released later (March '06 for the 7900 GTX and June '06 for the 7950 GX2).

And now your post is gone :p
 
Your list confuses me. So you basically used Voodoo2s all the way up until the Voodoo5 came out, except for two months where you used a GeForce and a GeForce 3?

Pretty much. The speed of a GeForce wasn't even close to a match for SLI Voodoo2s, especially if you weren't using a Direct3D title. Not to mention the 2D quality was significantly worse than the Matrox card I was using for that purpose.

The GeForce 3, while faster than the Voodoo5 5500, was absolutely horrible in image quality compared to the V5. Its 16-bit mode was faster than the V5's dithered 16-bit (22-bit effective) mode but didn't look even remotely as good. With 32-bit color, the huge performance hit made it roughly the same speed or slower, still with worse image quality in AA modes and only marginally better color quality, especially when AA was enabled on both cards. Add to that, again, absolutely horrible 2D quality, and yeah, it didn't last very long.

The GeForce 4, however, was brilliant. It finally had decent 2D quality (still not up to Matrox standards at the time, but close enough) and was significantly faster than the V5, enough that the worse image quality didn't prevent me from using it in games that required speed (twitch FPS gaming, competition Unreal Tournament for example), while I still used the V5 5500 for games that didn't absolutely require ultra-fast response times (System Shock 2, for example).

Remember, these are my opinions. :) I realize that many other people aren't as sensitive to aliasing as I may be. Many of my friends didn't notice the quality differences in games unless I pointed them out, and even then it didn't bother them nearly as much as it bothered me. However, the 2D quality of the cards at high resolutions (1600x1200 and even 1280x1024) was universally noticeable by EVERYONE I knew.

Regards,
SB
 
True. Which is why everyone that is aware of AMD products should buy them to keep the company afloat :p

Bah. I don't see a need to go on some quest to keep the same two around. For all we know, they have been colluding for most of this decade. :cool: It would be interesting to see what would happen to the market if AMD died off. I highly doubt Intel would just reign supreme for the rest of time. Out of the ashes comes VIA-XXL!
 