New Steam survey results

I honestly think you'd struggle to find another GPU architecture that has fared so poorly with age, especially considering how competitive it was at launch.
Actually, GeForce FX is the same way. Go look up some early 2003 GFFX 5800 Ultra reviews that test DX8 and OpenGL games: you'll see old NV30 matching and sometimes beating R350. We know how that went once SM2.0 became popular, however. ;)

Besides, there are many games that don't push hardware as hard as the latest shooters, and those run fine on old high-end G7x cards. It is also worth considering how a 7800/7900 compares to today's gimpy IGPs and low-end "value" cards. An old 7900 GTX will dominate something like an AMD 785G IGP, a Radeon 5450 or a GeForce 9400 (or whatever they are calling them now). I know you will bring up power efficiency here, but frankly very few people care about that, especially if they are saving ~$60 or more by getting a free hand-me-down graphics card.

I gave away an X850 to my boss's kids years ago and they still use it for games. A friend's sister who mainly plays Zoo Tycoon games and Civ4 is still on an Athlon XP 2500+ with a 6800 GT. The range of PC gaming hardware out there is MUCH wider than the tech sites suggest.

edit:
It also occurred to me to mention something I found on my notebook with the 7800 Go GTX: 3D performance is a lot better in XP. I was bored, as usual, one evening and ran 3DMark2001SE in both 7 and XP and found a 40% (!!) difference. I also gave Killing Floor a go and the difference was apparent. I have no idea what the deal is there, but it could be shoddy drivers, even though the driver revision was the same in both OSes (179.48).

I see little reason to run these GPUs in Vista/7, though.
 
OT: I thought G7x GPUs were best used with pre-95 series drivers (pre-90 in general); after that, performance was just horrid (notably after COD4 released)

*coughs*
 
NV's marketing department thinks you need to. Remember the uproar they made about ATI not supporting DX9 cards in the monthly Catalysts?
 
They used to have really aggressive trilinear & aniso "optimizations" in their drivers, and G7x isn't very fast at high-quality filtering. I wouldn't be surprised if the reason the newer drivers are slower is that they removed some of those cheats. In some games I used to have to set the control panel to high-quality filtering to get rid of some really ugly mipmap boundaries or bad texture aliasing.
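As a rough illustration of how that kind of "optimization" pays off (a toy model with made-up numbers, not NVIDIA's actual thresholds): textbook trilinear blends two mip levels across the whole fractional-LOD range, while the "brilinear" trick snaps to the nearest level outside a narrow blend window, so most pixels get away with a single cheap bilinear fetch.

```python
# Toy model of "brilinear" filtering. band=1.0 is textbook trilinear;
# band<1.0 snaps to the nearest mip level outside a narrow blend
# window around the crossover point. The 0.25 band below is
# illustrative, not an actual driver threshold.
def blend_weight(lod_frac, band=1.0):
    """Weight of the smaller mip level for a fractional LOD in [0, 1)."""
    lo = 0.5 - band / 2.0
    hi = 0.5 + band / 2.0
    if lod_frac <= lo:
        return 0.0                    # fetch only the larger mip level
    if lod_frac >= hi:
        return 1.0                    # fetch only the smaller mip level
    return (lod_frac - lo) / band     # blend inside the window

for band in (1.0, 0.25):
    weights = [blend_weight(i / 1000.0, band) for i in range(1000)]
    mixed = sum(1 for w in weights if 0.0 < w < 1.0) / len(weights)
    print(f"blend band {band:.2f}: {mixed:.0%} of the LOD range needs two mip fetches")
```

Cut the blend window to a quarter and roughly 75% of the range drops to single-level fetches, which is the performance win; the visible cost is exactly the ugly mipmap boundaries described above.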

I just know that in Vista/7 my notebook's 7800 Go GTX performs considerably worse than in XP. I wouldn't be surprised if this were caused by some hardware issue with that notebook that only surfaces in NT6: messed-up power management, an ACPI issue or something else.
 
Exactly! It is rather odd, since G71 and R580 have similar peak MADD pixel shader throughput, while G71 enjoys a 50% higher texture bilerp rate on top of that.
It all comes down to architectural wisdom, sort of. :p
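Back-of-the-envelope, using reference clocks and the usual unit counts: 650 MHz for both the 7900 GTX and the X1900 XTX, two MADD-capable vec4 ALUs per G71 pipe, and each of R580's 48 shader processors counted as one vec4 MADD (a simplification that glosses over its extra ADD units):

```python
# Peak MADD rate and bilinear texel rate from clock and unit counts;
# one 4-wide MADD counts as 8 flops per clock.
def peak(name, clock_mhz, vec4_madd_alus, tmus):
    gflops = clock_mhz * 1e6 * vec4_madd_alus * 4 * 2 / 1e9
    gtexels = clock_mhz * 1e6 * tmus / 1e9
    print(f"{name}: {gflops:.1f} GFLOPS MADD, {gtexels:.1f} GTexels/s bilinear")
    return gflops, gtexels

g71  = peak("7900 GTX  (G71) ", 650, 24 * 2, 24)  # 24 pipes x 2 vec4 MADDs, 24 TMUs
r580 = peak("X1900 XTX (R580)", 650, 48, 16)      # 48 shader processors, 16 TMUs

print(f"G71 bilerp advantage: {g71[1] / r580[1]:.2f}x")
```

Both land on the same 249.6 GFLOPS peak, with G71 holding exactly the 1.5x bilerp advantage.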

I believe that in the NV40/G70 architectures texture address calculations took place in the pixel shader ALUs, while all ATI architectures used fixed-function hardware for them.
 
Interesting: if you add up the numbers, the GT200 series (295, 285, 280, 275, 260) comes in at 7.84%, which is still behind the 4800 series at 8.70%. I think it's safe to say that RV770 has done better than GT200 + GT200b, although not by as much as I'd expected.

GeForce FX 5200 still holds a fair chunk. :D I laugh every time I see that on there.

Nice to see XP continuing to accelerate down the charts. With almost 60% on Vista/Win7, hopefully more devs will start dropping DX9 entirely (like Just Cause 2).

Regards,
SB
The 8800 series had a nice ride; top dog is now the 4800.

The king is dead, long live the king!
 
I think you'll end up disappointed. 78xx/79xx cards have aged horribly; X19xx cards now outperform them 2:1 in modern games. They're wholly inadequate for modern gaming.
One thing to note is that at the lower end, with similarly sized chips (e.g. 7600 GT vs. X1600 XT), the 7600 GT started with a huge performance advantage, so it's not that far back now. Recall that, bandwidth aside, G73 was half a G71 whereas RV530 was a quarter of R580.
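For reference, here are the commonly quoted unit counts behind that scaling argument (clocks and memory bandwidth ignored, and of course those differ too):

```python
# Commonly quoted configurations: (pixel shader units, TMUs).
chips = {
    "G71   (7900 GTX)": (24, 24),
    "G73   (7600 GT) ": (12, 12),
    "R580  (X1900 XT)": (48, 16),
    "RV530 (X1600 XT)": (12, 4),
}
for name, (ps, tmus) in chips.items():
    print(f"{name}: {ps:2d} pixel shader units, {tmus:2d} TMUs")

print("G73 vs G71   :", 12 / 24, "of the shaders,", 12 / 24, "of the TMUs")
print("RV530 vs R580:", 12 / 48, "of the shaders,", 4 / 16, "of the TMUs")
```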

I think one of the reasons for this aging is dynamic branching. NV4x and G7x basically had PS3.0 support tacked onto a lesser architecture, and it didn't hurt them at the time. What made them seem like a step up from R3xx/R4xx was the FP blending in the ROPs, which enabled FP16-based HDR. Now even simple branching, like that used in ubershaders, can hurt G7x a lot.
 
I remember F.E.A.R. was one of the first titles to make extensive use of dynamic branching, for its parallax mapping effects. At some point NV did manage, via a driver update, to bring G70 performance (not sure about NV40) almost up to par with the X1000 series. The one thing G70 improved architecturally over NV40 in this regard was its fixed (and smaller) batch size of 1024 fragments, though that is still far too large.
AFAIK, there was even a discussion about this here on B3D.
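Here's a toy model of why that granularity matters: take a circular screen-space region (say, a light volume) where fragments inside follow the expensive branch path; any batch that straddles the edge has to execute both paths. The screen size and geometry are made up, the batch sizes follow the commonly cited figures, and the batch shapes are my assumption:

```python
# Fragments inside a circular region take the expensive branch; a
# batch containing BOTH inside and outside fragments executes both
# paths. Smaller batches hug the edge more tightly and waste less.
W, H = 1600, 1200                  # illustrative screen size
CX, CY, R = 800.0, 600.0, 300.0    # illustrative circular region

def divergent_fraction(tile_w, tile_h):
    """Fraction of all fragments that land in mixed (divergent) batches."""
    wasted = 0
    for ty in range(0, H, tile_h):
        for tx in range(0, W, tile_w):
            total = inside = 0
            for y in range(ty, min(ty + tile_h, H)):
                for x in range(tx, min(tx + tile_w, W)):
                    total += 1
                    if (x - CX) ** 2 + (y - CY) ** 2 <= R * R:
                        inside += 1
            if 0 < inside < total:     # batch straddles the edge
                wasted += total
    return wasted / (W * H)

batches = {"NV40 (~4096 frags)": (64, 64), "G70 (1024 frags)": (32, 32),
           "R580 (48 frags)": (8, 6), "R520 (16 frags)": (4, 4)}
for name, (tw, th) in batches.items():
    print(f"{name}: {divergent_fraction(tw, th):.1%} of fragments in mixed batches")
```

The absolute numbers are artifacts of the made-up geometry; the trend is the point: 1024-fragment batches waste several times more work along a realistic branch edge than R5xx-sized ones do.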
 
Would be fun to see a G71/R580 face-off now, though.

Tom's, perhaps?

In general, the X1950 XTX is 25% faster than the 7900 GTX, a straight knockout, and this goes for the regular XT and the X1900 XT as well; they're all a good 20% faster.

Fielding the X1800 XT against the venerable 7800 GTX 512 nets a more even result, though R520 wins on points (if only by single-digit percentages).

What should be more disturbing to nVidia owners is the number of blanks in some of the high-res benchmark results; basically everything over 1680x1050 causes issues on G7x. Going by those benches, the most common scenario is EndWar being a no-go at high resolutions.

At non-widescreen resolutions (1280x1024) the X1950 XTX is about 10% faster than the 7900 GX2!

Just select the cards and check compare for an FPS comparison.
 
I giggle when I see the Radeon 4670 beat up the Radeon 2900 XT in tests. The chips are similar and run at similar clock speeds, but the 2900 XT has ~4x the memory bandwidth. The 4670 also happens to be one of the least power-hungry cards out there, which you can't exactly say about R600. ;)
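For the record, the gap follows straight from bus width times effective data rate. The clocks below are the usual reference values; retail 4670s shipped with several different memory speeds, so the ratio lands somewhere between roughly 3x and 4x depending on the board:

```python
# Bandwidth (GB/s) = bus width in bytes * effective transfer rate.
def bandwidth_gbps(bus_bits, effective_mts):
    return bus_bits / 8 * effective_mts * 1e6 / 1e9

hd2900xt = bandwidth_gbps(512, 1656)  # 828 MHz GDDR3 -> 1656 MT/s
hd4670   = bandwidth_gbps(128, 2000)  # 1000 MHz GDDR3 -> 2000 MT/s

print(f"HD 2900 XT: {hd2900xt:5.1f} GB/s")
print(f"HD 4670   : {hd4670:5.1f} GB/s")
print(f"ratio     : {hd2900xt / hd4670:.1f}x")
```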
 
tbh I expected more than 3% penetration for DX11 systems by now, even if this is just Steam users.
 
Why do you think that? There's a lack of incentive for most people; there aren't that many reasons to upgrade right now IMO.
 
Well, I figured a lot of users would be like me and wait a few generations, jumping straight to DX11 instead of the latest and greatest DX10.1 card. I guess a lot are waiting to see what Fermi is like as well.
 
I think that when Fermi comes out, DX11 awareness is going to go up a few notches. It will also push those who have been waiting for Fermi to make a decision.
 
What I find intriguing is that, according to the Steam data, Cypress is outselling Juniper! Who would have thought that a part selling for double the price would sell more?

Among DX11 GPUs the breakdown is as follows:

5800 series 49.9%
5700 series 43.85%
5900 series 2.56%
5600 series 0.99%

Heck, the 5900 series has sold more than the $100 5670. I thought the general consensus, and the many studies from Jon Peddie et al., showed that the majority of sales are below the $200 price point.
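Just to restate the arithmetic on the figures quoted above:

```python
# Shares within Steam's DX11 population, as listed above.
shares = {"5800": 49.90, "5700": 43.85, "5900": 2.56, "5600": 0.99}

print(f"Cypress (5800) vs Juniper (5700): "
      f"{shares['5800'] / shares['5700']:.2f}x")
print(f"the four listed series cover {sum(shares.values()):.1f}% of DX11 GPUs")
```

So Cypress leads Juniper by about 1.14x, and the listed series cover ~97% of the DX11 sample.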
 
Bear in mind that the Steam hardware survey isn't sampling a majority of the computers out there. These are also "old" numbers already; they typically lag by a month, so this is data from February. And DiRT 2 being bundled only with Cypress and Juniper cards helps their numbers.
 