New Steam survey results

Interesting: if you add up the numbers, the GT 200 series (295, 285, 280, 275, 260) comes in at 7.84%, which is still behind the 4800 series at 8.70%. I think it's safe to say that RV770 has done better than GT200 + GT200b, although not by as much as I'd expected.
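The series total above is just a sum of the per-model survey shares. A quick sketch of that bookkeeping (the per-model percentages below are hypothetical, since only the series totals are quoted here):

```python
# Hypothetical per-model Steam survey shares (percent); only the 7.84% and
# 8.70% series totals come from the discussion above, the split is made up.
gt200_series = {
    "GTX 295": 0.30,
    "GTX 285": 1.20,
    "GTX 280": 1.50,
    "GTX 275": 1.34,
    "GTX 260": 3.50,
}

gt200_total = sum(gt200_series.values())
hd4800_total = 8.70  # quoted series total

print(f"GT 200 series: {gt200_total:.2f}%")                       # 7.84%
print(f"4800 series lead: {hd4800_total - gt200_total:.2f} pts")  # 0.86 pts
```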

GeForce FX 5200 still with a fair chunk. :D I laugh every time I see that on there.

Nice to see XP continues to accelerate down the charts. With almost 60% on Vista/Win7, hopefully more devs will start dropping DX9 entirely (like Just Cause 2).

Regards,
SB

Honestly, when you look back at just how terribly ATI's preceding cards performed, I think it would have been silly to have ever expected performance better than that. The launch of the 4800 series sent them from damn near irrelevant to market leaders; the volume of high-end cards they've shipped since the 4800 series launched must have increased several times over compared to what they were pumping out before.

Looks like both the 5800 series and 5700 series are off to a really good start. They're already the 2nd and 3rd most popular ATI GPUs among Steam users, which is insane considering they've had serious issues with supply and retailer price gouging since launch.
 
Interesting: if you add up the numbers, the GT 200 series (295, 285, 280, 275, 260) comes in at 7.84%, which is still behind the 4800 series at 8.70%. I think it's safe to say that RV770 has done better than GT200 + GT200b, although not by as much as I'd expected.

NVIDIA stayed price-competitive with ATI through most of it with the GTX 260, GTX 260 216, and GTX 275, though at great cost, it would seem, especially to AIB partners.

As for devs ditching DX9, it's about time. DX9 hardware is generally too slow to run modern games anyway, and the number of gamers with powerful DX10 hardware still running XP is rapidly diminishing.
 
Last edited by a moderator:
As for devs ditching DX9, it's about time. DX9 hardware is generally too slow to run modern games anyway
Yup, let's hope that PC game devs cater to an ever-narrower audience. Along with modern cards drawing enough power to disqualify themselves from the vast majority of stationary computers, and just about all laptops, it helps accelerate critical trends.

It's time for big title PC gaming to die, and hopefully let something more healthy rise from the ashes.
 
Yup, let's hope that PC game devs cater to an ever-narrower audience. Along with modern cards drawing enough power to disqualify themselves from the vast majority of stationary computers, and just about all laptops, it helps accelerate critical trends.

It's time for big title PC gaming to die, and hopefully let something more healthy rise from the ashes.

DX10 cards have been around for 3 years already. In fact, I don't think you can buy a PC without a DX10-capable IGP or GPU inside it.

I think the problem is supporting cards from 2002 and limiting what's done in the game.

You can get a card like a 5450 that will play modern games just fine at 720p, which console gamers seem to love.
 
Along with modern cards drawing enough power to disqualify themselves from the vast majority of stationary computers, and just about all laptops, it helps accelerate critical trends.
Oh come on... even DX11 cards span the entire range of hardware now and DX10 cards have for *years* now. DX9 cards are just old and less power efficient than newer hardware. Modern doesn't imply "high-end"...
 
Yup, let's hope that PC game devs cater to an ever-narrower audience. Along with modern cards drawing enough power to disqualify themselves from the vast majority of stationary computers, and just about all laptops, it helps accelerate critical trends.

It's time for big title PC gaming to die, and hopefully let something more healthy rise from the ashes.

You mean like a passively cooled 5770 without additional power connectors?
 
Yup, let's hope that PC game devs cater to an ever-narrower audience. Along with modern cards drawing enough power to disqualify themselves from the vast majority of stationary computers, and just about all laptops, it helps accelerate critical trends.

It's time for big title PC gaming to die, and hopefully let something more healthy rise from the ashes.

Cutting DX9 systems cuts at most about 20% of the theoretical maximum market (sure, on Steam it's 40% for DX9 systems, but the fact is Steam is filled with machines that wouldn't run any new game from the past two years anyway).
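The gap between 40% on Steam and roughly 20% of the real market is just two fractions multiplied together. A back-of-envelope sketch, with both fractions assumed purely for illustration:

```python
steam_dx9_share = 0.40    # DX9 systems in the Steam survey
capable_fraction = 0.50   # assumed: fraction of those that could run a new game at all

effective_loss = steam_dx9_share * capable_fraction
print(f"Effective market lost by dropping DX9: {effective_loss:.0%}")  # 20%
```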
 
Cutting DX9 systems cuts at most about 20% of the theoretical maximum market (sure, on Steam it's 40% for DX9 systems, but the fact is Steam is filled with machines that wouldn't run any new game from the past two years anyway).

Take a look at some of the results here:

http://techreport.com/articles.x/18521/6

79xx/78xx class GPUs have aged horribly, and they are all simply incapable of playing any of the games tested at 1680x1050. Sure, ATI's cards may have fared a little better, but only the very fastest variants are capable of playing modern games at anything approaching respectable quality settings.

If you're running hardware from that era then chances are you're not playing modern games, and if you are, you'd see your framerates increase two or three times over by investing in something like a $90 5670, a card which will reduce your power consumption and heat output by a tremendous amount, as well as offer extensive hardware video acceleration and much better A/V options. As such, sticking with that sort of hardware just makes no sense at all.
 
Yup, let's hope that PC game devs cater to an ever-narrower audience. Along with modern cards drawing enough power to disqualify themselves from the vast majority of stationary computers, and just about all laptops, it helps accelerate critical trends.

It's time for big title PC gaming to die, and hopefully let something more healthy rise from the ashes.

Gee, I wonder if DX7-class card users were crying onto their computers when all games switched to DX8. Or when virtually all games switched to DX9 within 3 years.

I suppose we should also still be releasing games that work in DOS and Win9x?

Wait hold on. CGA users want to be able to run games also. :) Damn, and Black and White users just called and they want to do gaming still also. :p

Regards,
SB
 
Gee, I wonder if DX7-class card users were crying onto their computers when all games switched to DX8. Or when virtually all games switched to DX9 within 3 years.

I suppose we should also still be releasing games that work in DOS and Win9x?

Wait hold on. CGA users want to be able to run games also. :) Damn, and Black and White users just called and they want to do gaming still also. :p

Regards,
SB

0.00001% of people cannot see colour at all. I want you to spare a second for those people forced to buy features and pay performance penalties for colour they cannot see!
 

For instance.
Actually, I've seen the amount of money reported that was spent on small-scale gaming on computers in Europe alone, and it was in the billions of euros. I found the figure remarkable, but it really opened my eyes to the size of the market that isn't catered to as "gamers". A lot of people play games, but it isn't the main reason for their choice of computing systems.

And that ties into, for instance, eastmen's remark.
Core iX products have been selling for a year and a half, and they still don't represent more than 1% of current CPU sales, and are a small fraction of a percentage of the installed base as a whole. The moral of the story is that the overwhelming majority of PC users are using systems that are pathetic from the perspective of the PC tech enthusiast. But those users are still interested in playing games now and then.

So when I read the gazillionth thread lamenting the slow adoption of the latest hardware, and/or the latest graphics feature I can't help feeling that
- Yes, let's make those 48 people (all of them in the business) who have Core i7 systems with 2560x1600 displays and CrossFire or SLI setups the sole target of game development. You know, that tiny splinter of the population who actually own the rigs and settings that the tech sites use to justify the latest and greatest. Because they can't sustain a gaming industry, and the move would kill off the gaming-justified tech-envy charade once and for all. It hasn't really served gaming much for a long time; rather, gaming has been used to try to foist new hardware on people. And while this may have benefitted hardware vendors (and the tech sites), the impression that PC gaming needs the latest and greatest has been destructive for the attractiveness of the PC as a platform for consumers.

Be careful what you wish for. The emperor is scantily clad.
PC gaming doesn't really need the tech industry.
 
Take a look at some of the results here:

http://techreport.com/articles.x/18521/6

79xx/78xx class GPUs have aged horribly, and they are all simply incapable of playing any of the games tested at 1680x1050.
I see 4X AA and 16X AF enabled. Those aren't realistic settings for those cards at this point, and you can turn the other settings down too. 1680x1050 is a rather high resolution as well.

Actually I think most gamers are on consoles now anyway. Those machines are the benchmark that games get developed for at this point. You don't have to look hard to see that's how it is. And the PS3 has what amounts to a sort of 7900 GT / GTX hybrid.

And what Entropy said above me.
 
you mean like a passively cooled 5770 without additional power connectors?
Actually, I have a passively cooled 5750, with an additional power connector.
PowerColor makes it. They will ship a version without the power connector as well, presumably at a slightly lower voltage; is that the card you refer to? It draws as much power as everything else in that box put together: disks, drives, memory, CPU, sound card with breakout box, et cetera. Justifiable for a dedicated gaming system, to the extent that a dedicated gaming PC is justified at all.
 
The pre-G80 hardware has been hopelessly out of driver support from NV for quite some time now, while ATi is still maintaining some quasi driver updates for its X1000 installed base. PCGH had some recent gaming benchmarks, including both 7900GTX and X1950 SKUs, and the latter just wiped the floor with its arch-rival, though still a good measure behind all modern mid-range offerings.
 
For instance.
Actually, I've seen the amount of money reported that was spent on small-scale gaming on computers in Europe alone, and it was in the billions of euros. I found the figure remarkable, but it really opened my eyes to the size of the market that isn't catered to as "gamers". A lot of people play games, but it isn't the main reason for their choice of computing systems.

And that ties into, for instance, eastmen's remark.
Core iX products have been selling for a year and a half, and they still don't represent more than 1% of current CPU sales, and are a small fraction of a percentage of the installed base as a whole. The moral of the story is that the overwhelming majority of PC users are using systems that are pathetic from the perspective of the PC tech enthusiast. But those users are still interested in playing games now and then.

There are still 150 million PS2s out in the world. I don't see next-gen devs programming for PS2 and up-ressing for next-gen systems.

It's the same here. Core iX products are the minority, but at this point dual-core CPUs have been on the market for 4 or 5 years now. It's time for dual core to be the standard for gaming, with quad core and higher supported better than they are now. Just like DX10 should be the standard now: it's been around for 3 years and the hardware for it is dirt cheap. You can get a great-performing DX10 card for $50. Heck, you can get a great DX11 card.




So when I read the gazillionth thread lamenting the slow adoption of the latest hardware, and/or the latest graphics feature I can't help feeling that
- Yes, let's make those 48 people (all of them in the business) who have Core i7 systems with 2560x1600 displays and CrossFire or SLI setups the sole target of game development. You know, that tiny splinter of the population who actually own the rigs and settings that the tech sites use to justify the latest and greatest. Because they can't sustain a gaming industry, and the move would kill off the gaming-justified tech-envy charade once and for all. It hasn't really served gaming much for a long time; rather, gaming has been used to try to foist new hardware on people. And while this may have benefitted hardware vendors (and the tech sites), the impression that PC gaming needs the latest and greatest has been destructive for the attractiveness of the PC as a platform for consumers.

I think devs need to target a 3-year window, maybe 4 years. If your system is older than that, why bother continuing to support it gaming-wise? Consoles last 5-6 years, so a PC can have a slightly faster turnaround rate, especially considering that video cards are cheap and easy to upgrade.

You don't need a 5870 to play the latest games. Not all gamers want 8x FSAA, 16x anisotropic filtering and 1920x1200. Many gamers are fine with the 1680x1050 or 1080p res that their monitors are running.

Be careful what you wish for. The emperor is scantily clad.
PC gaming doesn't really need the tech industry.

Developers can always make a profit when targeting the tech industry.
 
... including both 7900GTX and X1950 SKUs, and the latter just wiped the floor with its arch-rival, though still a good measure behind all modern mid-range offerings.
The X1950 also might be better equipped for today's games than the 7900. R580 has quite the beefy shader resources compared to G71.

I have a notebook with a 7800 Go GTX. Those are essentially identical to the desktop cards, but with a small downclock. It is actually stable at desktop clocks. I still use that notebook for gaming. I've played Trine, Defense Grid, Killing Floor, C&C3, World in Conflict, SupCom and some STALKER on it. Out of those SupCom is too slow once the single core Pentium M can no longer deal with the game, and STALKER is too much overall really. STALKER has a lot of ways to drop the graphics load down though.

This talk makes me want to try more recent stuff on it.

If the Mobility 5870s are ever in good volume, I'll probably buy a notebook with one. Been running this Inspiron for 5 years now.
 
Cool! By the way, I recently got my hands on an old Asus 7800GTX as a spare video card (in case I sell my 5870 for a GF100 board), and I've yet to find the time to put the old champ back on duty and see how it fares in a modern environment. ;)
 
Cool! By the way, I recently got my hands on an old Asus 7800GTX as a spare video card (in case I sell my 5870 for a GF100 board), and I've yet to find the time to put the old champ back on duty and see how it fares in a modern environment. ;)

I think you'll end up disappointed. 78xx/79xx cards have aged horribly; X19xx cards now outperform them 2:1 in modern games. They're wholly inadequate for modern gaming:

[attached benchmark chart]


A lowly 9600GT will net you more than 3x the framerate in modern games. I honestly think you'd struggle to find any GPU architecture that has fared so poorly with age, especially considering they were very competitive at launch. Really, this was the point some of us were trying to make earlier. If you're on "high end" DX9 hardware you're using some of the least efficient GPUs out there; it just makes no sense to use these parts nowadays when an $80 upgrade can net you something like 3-4x the framerate, cut your power consumption in half and net you a bunch of nice features. Asking developers to waste time optimising for inefficient hardware that is so ill-equipped to deal with modern game engines just doesn't seem like a terribly good idea. They'd be much better served getting performance and quality up on awesome little cards like the 4770, 5670, GT 240 and 4670: efficient, modern GPUs which consumers should be (and for the most part are) moving on to. No one is asking developers to target a GTX 280 as the minimum spec.
 
I think you'll end up disappointed. 78xx/79xx cards have aged horribly; X19xx cards now outperform them 2:1 in modern games. They're wholly inadequate for modern gaming:
Exactly! It's really odd by some degree, as both G71 and R580 have similar MADD pixel shader computational throughput, while G71 enjoys 50% more texture bilerp rate on top of this.
It all comes down to architectural wisdom, sort of. :p
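The "similar MADD throughput, 50% more bilerp" comparison can be sanity-checked from rough unit counts and clocks. All figures below are approximate and from memory, so treat this strictly as a sketch:

```python
# Theoretical MADD GFLOPS = ALUs x FLOPs-per-ALU-per-clock x clock (MHz) / 1000
def madd_gflops(alus, flops_per_alu_per_clock, mhz):
    return alus * flops_per_alu_per_clock * mhz / 1000.0

# G71 (7900 GTX): ~24 pixel pipes, two vec4 MADD units each -> 2*4*2 = 16 FLOPs/clock
g71 = madd_gflops(24, 16, 650)
# R580 (X1950): ~48 pixel ALUs, one vec3+scalar MADD each -> 4*2 = 8 FLOPs/clock
r580 = madd_gflops(48, 8, 650)

# Texture bilerp rate scales with TMU count at the same core clock
bilerp_ratio = 24 / 16   # G71's 24 TMUs vs R580's 16

print(g71, r580)     # both ~250 GFLOPS -> "similar" MADD throughput
print(bilerp_ratio)  # 1.5 -> G71's "50% more" bilerp rate
```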
 