Rumor: R350 out before GFFX and is 10% faster......

I just got finished playing the C&C generals beta (awesome BTW), now here is a game that pushes your hardware:

Min Req: 2GHz CPU, 512MB RAM, GF4 or better for acceptable performance. Of course, it's a debug build, but the game has a very detailed physics engine and the best in-game explosions I've ever seen.

It runs locked at 800x600. If I turn on 6x FSAA and 16x AF, the game slows down noticeably. I have to scale back to 2x FSAA before it's acceptable (i.e., non-jerky; I don't get "silky smooth" until I turn off AA).

And if you think that 4-6x sampling is "perfect" anti-aliasing, I think you need to look again. If I look closely, I can still see some edge crawling and shimmering, especially at 800x600 resolution. When you look at pre-rendered graphics, you are looking at 16-64x sampling. There is a noticeable difference when you go up that high.
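To see why 16-64x looks so much smoother, here's a toy sketch (my own illustration, not any vendor's actual AA algorithm): with N coverage samples per pixel, an edge pixel can only take one of N+1 distinct blend levels, so low sample counts quantize the edge gradient coarsely and the steps crawl as the edge moves.

```python
def edge_blend_levels(samples):
    """Distinct intensity steps an edge pixel can show with N coverage samples.

    A pixel crossed by an edge is blended by (covered samples / total samples),
    so only samples+1 shades are possible. Toy model: ignores gamma, sample
    patterns, and texture aliasing.
    """
    return samples + 1

for n in (2, 4, 6, 16, 64):
    print(f"{n:2d}x sampling -> {edge_blend_levels(n)} possible edge shades")
```

So 4x gives only 5 shades per edge pixel while 64x gives 65, which is roughly why pre-rendered output reads as "perfect" and realtime 4-6x doesn't.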


Man, calling in A-10 airstrikes, special forces raids, and Daisy Cutter bombs is so freakin' cool!
 
Doomtrooper said:
I said:
They are also being bought to be able to play a 2004/2005 game at 1024x768 with medium details and no AA/AF. Not everyone who buys a high end card plans on buying a new one a year later.
Hogwash...that high end card is now a budget card in 2004.

Hogwash yourself. The high end card 3 years ago was the GeForce DDR. It can run most modern games at a moderate framerate at 1024x768 with medium details and no AA/AF. The high end card 2.5 years ago was the GeForce2 GTS. It can easily run just about any modern game at a pretty good framerate at 1024x768 with medium details and no AA/AF.

The add-in card with the largest installed base on reasonably current computers is the GF2 MX (or some variant thereof). I happen to have one (GF2 MX 400) in the box I'm posting from. Because it's so widespread, almost every modern game is targeted to run at a moderate framerate at 1024x768 with medium details and no AA/AF on that card. And they do.

Indeed, I just downloaded the UT2k3 demo to prove this. Remember, all the benchmarks you see on the web are with everything set to max quality, and they are, indeed, pretty low at 1024x768. Put in reasonable medium settings, though (I used texture and character detail: "lower"; world and physics detail: "normal"; bilinear, no shadows, no dynamic lighting) and it's playable at 1024. Well, Antalus is rough but playable, averaging ~22 fps in a standard match against bots, and Asbestos is almost approaching smooth at ~37 fps. I'd certainly play at 800x600 instead on this machine (if forced to play UT2k3), but many people barely mind 22 fps, as awful as it sounds to us. It's certainly enough to get plenty of enjoyment out of the game.

Thing is, the GF2 MX and GF2 MX 400 offer performance quite a bit behind that 3 year old GF DDR, and maybe half that of the 2.5 year old GF2 GTS. So we have a 3 year old card that will play the prettiest game of the moment adequately once you dial down the settings, and a 2.5 year old card that will play it pretty well.

Meanwhile, the new mainstream card du jour (and no, in the general taxonomy of video controllers these do *not* deserve the name "budget"), the GF4 MX 420, is *still* much slower than a GTS, and still has barely half the memory bandwidth of a GF DDR (although with bandwidth-saving features). You are kidding yourself if you think any game released today can't be run in reasonably playable fashion on a GF4 MX 420.

To give yet another example: by the time Doom3 comes out, the GF3 will be over two years old. But it will very likely get decent framerates at 1024x768 with medium settings. Carmack has specifically said that GF3 will have playable performance at high quality settings (although probably only at 640x480). And this is on the most graphically ahead-of-its-time game around. It's a no-brainer that the 9700 Pro and GFFX 5800 will still turn in decent framerates on the average game released 2-2.5 years after they are.

Now, you might be wondering who would spend $300 on a video card but not bother to buy a new one when, 2.5 years later, it performs no better than the mainstream ("budget") cards of the day. Doesn't he know he could get better overall performance and save money by buying a $100 card and replacing it with another $100 card 12-18 months later?
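The arithmetic behind that question is easy to sketch (hypothetical prices, not a market survey, and ignoring resale value):

```python
def cost_per_year(price, lifespan_years, span_years=3.0):
    """Average yearly spend if you rebuy a `price` card every `lifespan_years`
    over a `span_years` window. Toy model: flat prices, no resale."""
    purchases = span_years / lifespan_years
    return price * purchases / span_years

high_end = cost_per_year(300, 3.0)      # one $300 card kept 3 years
cheap_cycle = cost_per_year(100, 1.5)   # $100 card replaced every 18 months
print(f"${high_end:.0f}/yr vs ${cheap_cycle:.0f}/yr")  # $100/yr vs $67/yr
```

The cheap-and-often cycle is both less money per year and keeps a fresher card in the box on average; the question is whether the savings are worth the hassle.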

The answer, I think, is a lot of people. First off, remember that even the highest end cards are options on major build-to-order OEM PCs, so there are a decent number of people who purchase a high end card but who may not be comfortable (or just don't want to bother with) installing a video card. Second, remember that $300 is not a lot of money to many people, especially if they're already shelling out $2000+ for a reasonably high-end box from Dell.

Many people are willing to pay extra to have a system which is acceptable for 3+ years. The fact that their system will be "kickass" for the first 6 months is incidental to them. It's not that these people don't understand that it's a better deal in the computer industry to buy 2 cheap things every 18 months instead of one expensive one every 3 years. It's that the savings aren't worth the bother. Whether it's upgrading a current system or migrating to a new one, the "buy cheap, buy often" system takes time, attention, and increases the risk of having something go wrong.

It may save money. But for many people, their time is worth money. This may be a slightly alien concept to us. We spend our time arguing on computer forums: our time is self-evidently not worth money...
 
Your point holds water for those who are buying low end boards, but for those who are buying the high end I don't think it does. The upgrade frequency for those buying the high end is vastly different from that of those buying the mid to lower end.
 
I think both Daves have a valid point. I'm currently using a Shuttle SS51G which has integrated graphics (I'd estimate it at GF2 MX-GF4 MX performance) but I'm totally happy with it. I'm not a big gamer so I have no need to waste money on a new card. The only reason I would even upgrade to a DX9.0 card would be to see all the cool demos Humus puts out. :) (BTW, where the hell is he?)

later,
 
The upgrade pattern for high end board purchases may be different, but the number of people who indulge in that is pretty limited. I don't think that there are enough 3D enthusiasts, who like to have the cutting edge tech in their machines, to make any great difference to what is perceived as being the average or mainstream card.

We dedicated followers of fashion are a mere drop in the ocean. ;)
 
mboeller said:
working @ ATi ??

If he is, then he's doing a damn good job of promoting their hardware. If not, they should hire him. :)

@board: although we are a drop in the ocean, do you think we have much influence on the regular Joe/Jane out there?

later,
 
Your point holds water for those who are buying low end boards, but for those who are buying the high end I don't think it does. The upgrade frequency for those buying the high end is vastly different from that of those buying the mid to lower end.

Again, I think that a number of those buying the high end (particularly those who buy the high end as an option on a built-to-order OEM PC) are not really hardware enthusiasts. Instead they're people for whom the extra couple hundred bucks is worth it for the convenience of having a PC which is adequate for as long as possible.

Would such a person really be willing to hang on to their once-top-end card even when it was down to ~25fps in the latest games? Yeah, I think they might. But then I'm pretty much talking out my ass here so who knows? 8)

(If you happen to have stats on upgrade frequency at different ends of the spectrum that'd be mighty interesting...)
 
I think anyone who plays PC games wants to have the fastest computer possible to play them on. I also think most of them are willing to trade off a little of that performance in order to save a little money. Some of them are even willing to trade a lot of that performance to save a lot of money.

I would agree with you that you don't have to buy the most expensive, fastest things available to be an enthusiast. I view most enthusiasts as people who would buy an Athlon instead of the latest P4, a Radeon 9500 Pro instead of a 9700 Pro, and 7200 RPM IDE drives instead of 10k RPM SCSI drives. I also don't think they upgrade more than every 12-18 months. They don't waste their money on overpriced things, and they likewise won't waste their money on crap like GeForce MX cards and 5400 RPM hard drives. People who do the former are rich, and people who do the latter are the "mainstream market".

I also think the people who buy the most expensive things as soon as they're available aren't doing it because they want it to last longer, they're doing it because they want the fastest thing they can get their hands on. I think they'll still upgrade at the same rate as the other enthusiasts, they'll just always have slightly faster computers, and will always be paying top dollar to get it.

In my experience, the only people who are thinking of spending more money for a computer in hopes that it will last them a little longer, are the people who weren't thinking of spending much money on one in the first place.
 
epicstruggle said:
@board: although we are a drop in the ocean, do you think we have much influence on the regular Joe/Jane out there?

Well, I believe we have some effect at a certain level. In my job I can influence hardware purchase decisions on a medium to large scale. People then get to play with stuff they normally wouldn't consider, and when (if) they make their own purchases they may be primed toward a certain brand or level of ability.

As for the average Joe/Jane on the street: word of mouth can have a deep impact on people's choices, and since we may be considered the elite/informed and people ask us for advice, I think we can have a small-scale impact on what people view as the best choice for them. But at the end of the day it's the economics that ultimately make the choice for most people. And that's not something we can change.

I consider myself to be very lucky, I can upgrade as and when I like and to whatever level I choose. But that's not really an option for the average consumer.
 
It really all depends. I enjoy having a cutting edge graphics card, but it's also not something I need. I used to put a lot of importance in it, when I was into online gaming. Now that I've left online gaming behind, it no longer really matters since framerate isn't life. :p

Honestly, if you actively look for jaggies, aliasing, etc, you can see them at any resolution (even 16*12 w/max AA and AF). But, on the other side of the coin, if you choose just to ignore them and play the game, you can get by fine at 1024*768 with nothing else. It's all about mind over matter: if you don't mind, it doesn't matter. ;)

I like a better looking image, definitely, but I still refuse to upgrade more than every 18 months, and then only if I can get a good deal in the $200-$300 price range. The point being, there are many different ranges of buyers. The ones that upgrade yearly like clockwork are probably pretty rare. Those who upgrade every time a new card comes out are rarer still.
 
Too many variables to generalise on upgrading.

Many people are perfectly happy with many old bits of their system and just upgrade CPUs and graphics cards from time to time, keeping sound cards, modems, RAM and HDDs.

I upgrade a little bit at a time - my 9700 Pro was so I could have decent 4xAA at 1024x768, whereas my 8500 wasn't cutting it. Most enthusiasts think my CPU is too slow, and it is my next upgrade - but an Athlon XP @ 1533 is still fast enough for me and I'll only upgrade when the XP2600s are around £100.
 
Five people here at work who just got new home computers with an R9700 Pro / R9700 TX, replacing their old computers (all with TNT/GF2 MX range cards), say that Dave H has a point. They are non-gamers with kids that might play some games, or ...uhm... "softcore" gamers. They just wanted a computer that would last as long as possible without further care.
 
Indeed - tomorrow I am installing a Radeon 9000 Pro in my friend's machine, a PIII 500/V3 combo bought 3 years ago. The only reason he is upgrading is because he wants to play MOH:AA and NWN, which obviously he can't do with a V3 now. He has played 4 games since he bought the machine - Resident Evil, Severance, Max Payne and Rollercoaster Tycoon.

A classic 'softcore' gamer who doesn't give tuppence about 60+ fps or AA/AF - he just wants it to work with the minimum of fuss, and he certainly hasn't got a CPU upgrade on his agenda - even though IMO he should have.
 
Speculation: the NV35 will be a very impressive card. It most certainly won't have the NV30's main problem (memory bandwidth).


Uttar
 
Uttar, I'll make you a deal: I won't mention the R400 till the R350 is released if you don't mention the NV35 till the NV30 is launched. We have no clue what problems the FX has. We definitely have no clue what problems the rehash will fix. Who knows, Nvidia may have found that this card is a POS and is going to jump to the next chip design as soon as possible.
 
jvd said:
Uttar, I'll make you a deal: I won't mention the R400 till the R350 is released if you don't mention the NV35 till the NV30 is launched. We have no clue what problems the FX has. We definitely have no clue what problems the rehash will fix. Who knows, Nvidia may have found that this card is a POS and is going to jump to the next chip design as soon as possible.

Actually, I've got a better deal.
I'm not going to mention the NV35 in the next 3 months unless I'm asked by someone to leak information about it. I just don't feel like speculating with so little information anymore. And we probably won't get any interesting NV35 info for a few months.
And feel free to mention the R400 as much as you want :)

And what I meant by "problem" is "bottleneck". I indeed have no clue what problems the FX has, but we all know its main bottleneck at 4x FSAA or higher is memory bandwidth.
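A back-of-envelope sketch of why FSAA hits bandwidth so hard (a toy model of my own - it ignores the framebuffer compression and caching these chips actually have, and counts writes only): multisampled color+Z traffic scales roughly linearly with the sample count.

```python
def fb_traffic_gb_s(width, height, fps, samples, bytes_per_sample=8):
    """Rough framebuffer write traffic in GB/s: 4 bytes color + 4 bytes Z
    per sample. Toy model: no compression, no overdraw, no texture reads."""
    pixels_per_second = width * height * fps
    return pixels_per_second * samples * bytes_per_sample / 1e9

no_aa = fb_traffic_gb_s(1024, 768, 60, 1)
aa_4x = fb_traffic_gb_s(1024, 768, 60, 4)
print(f"no AA: {no_aa:.2f} GB/s, 4x FSAA: {aa_4x:.2f} GB/s")
```

Even this crude estimate quadruples the framebuffer traffic at 4x, which is why a card that is fine without AA can fall off a cliff with it once the memory bus saturates.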


Uttar
 
Dave H said:
Now, you might be wondering who would spend $300 on a video card but not bother to buy a new one when, 2.5 years later, it performs no better than the mainstream ("budget") cards of the day. Doesn't he know he could get better overall performance and save money by buying a $100 card and replacing it with another $100 card 12-18 months later?

This upgrade cycle has been happening for years; why all of a sudden is it not 'OK'? When buying a video card today, since the feature set is way ahead of developers, one should look at what the card will deliver over its lifespan, i.e. the 9700 can deliver very high frame rates in current popular titles with FSAA and AF.
The selling point for high end cards has always been the 'hardware improvements' - bus speed, memory timings - and that is what consumers benefit from immediately... the feature set is just an added bonus.

FYI, I have the Doom 3 Alpha and I get 55 fps @ 1024x768 with 2x FSAA and 4x AF - this on a very highly clocked 9500 NP. There are dips down to the 30s, but my system is not overly fast with an Athlon XP 1800, and this is an Alpha build with no optimizations.
FSAA is a reality with Doom 3 on a 9700/9500 - and I'm sure a Ti 4600 too.

My old card was a Radeon 8500 which I bought for $400 Canadian... it's now $100 one year later. That is what I'm talking about: high end is now low end.

IMO graphics cards have exceeded motherboard technology and it's time to start seeing some bottlenecks removed from the rest of the system.
 
Doomtrooper said:
Much as I don't like these games, I think the following is more accurate:

Game of the Year: The Sims (not limited by anything)
Most popular online game: Counter Strike (not limited by anything)

IIRC the most popular online game of all time was StarCraft, not sure if it still ranks ahead of Counter-Strike. As for the most popular offline game, that would be either Solitaire or Minesweeper.

And you wonder why R300s sell so few compared to integrated Radeon 7000s and GeForce2 MXs. You don't need 3D acceleration at all to play 3 out of 5 of the most popular computer games. :p
 