Where are the GT200 mainstream GPUs?

Was it the plan all along for NVidia to exclude GT2xx features/improvements for all GPUs below GT200, instead making them entirely G9x cost-reduction refreshes?

This would seem to imply the strategy is that GT3xx chips will all be feature-aligned.

Jawed
 
Was it the plan all along for NVidia to exclude GT2xx features/improvements for all GPUs below GT200, instead making them entirely G9x cost-reduction refreshes?
There might be some GT2XX features/improvements, but I would not expect too much. That goal is very much secondary to getting the 40nm products out and replacing all their lower-end chips with cheaper versions.

The GT2XX lineup would roughly be:
GT210 - 40nm G98
GT220 - 40nm G96
GT225 - 40nm G92/192bit
GTS240 - 55nm G92
GTS250 - 55nm G92
GTX275 - 55nm G200
GTX285 - 55nm G200

If they add too many features to the lower-end SKUs, the middle GTS240 and GTS250 would be kind of awkward. They may be able to clock the 40nm G92 up enough to cover this gap.

(I thought there might originally be a GDDR5 SKU, but I'm now thinking it will be held off till the GT3XX chips arrive.)

This would seem to imply the strategy is that GT3xx chips will all be feature-aligned.

Hopefully. I have not seen anything about GT3XX chips at all. With the Windows release coming up, it will be interesting to see if they pursue a strategy similar to G80 and Vista, which seemed to work well for them: high-end part first to capture all the early adopters, then slowly phase in the lower parts over the following 6 months to capture the other groups as they migrate/upgrade.


(Obligatory warning that all of the above is based on rumors.)
 
rjc, they already dropped the idea of renaming 8800GT/9800GT to GTS240 thanks to pressure from board partners, so that card will continue to sell under 3 different 9800GT forms ("green edition" underclocked, normal and overclocked)
 
Damn, that's like backwards progress. I know all card companies used to do it, but it was nice when ATI started doing feature-aligned launches, at least with the 4xxx series. I was hoping both vendors would adopt this going forward...

Well, here's hoping the GT3xx parts are all feature-aligned. I've always hated explaining to more casual friends how, even though one chip had the same generation and naming scheme as another chip, it couldn't do the same things.

At least with the 4xxx series, it's easy to say: yes, they are all basically the same, just some are slower and some are faster.

Regards,
SB
 
rjc, they already dropped the idea of renaming 8800GT/9800GT to GTS240 thanks to pressure from board partners, so that card will continue to sell under 3 different 9800GT forms ("green edition" underclocked, normal and overclocked)

Yeah, I saw that story, but I was not so sure it was completely true. I thought Nvidia might be holding back the GTS240 till they knew where the RV740 was going to launch, so they could quickly put out a performance-matching product.

Even if it never appeared, it would still be a bit of a problem adding GT2XX features to the lower-end cards, as the GTS250 is definitely there. Partners/consumers might be a little upset that their more expensive GTS250 didn't have a feature that the very cheap GT210 had.
 
At least with the 4xxx series, it's easy to say: yes, they are all basically the same, just some are slower and some are faster.
I'm suspicious that RV7xx chips lower than RV770 have no LDS capability, which would hurt the performance of any Stream apps that are written specifically to take advantage of LDS (or make them simply not run). Of course, it could be a very long time before such apps appear, since CUDA-based stuff is generally far more advanced in market penetration. I suppose there's a chance of some non-consumer apps working this way, and I suppose they're unlikely to use anything other than RV770 or better...

Jawed
 
I'm suspicious that RV7xx chips lower than RV770 have no LDS capability, which would hurt the performance of any Stream apps that are written specifically to take advantage of LDS (or make them simply not run). Of course, it could be a very long time before such apps appear, since CUDA-based stuff is generally far more advanced in market penetration. I suppose there's a chance of some non-consumer apps working this way, and I suppose they're unlikely to use anything other than RV770 or better...

Jawed

FWIW, some of my codes which use LDS are running *very very* slowly on a 4670, but I am still investigating whether it's the LDS at fault or not.
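
For reference, here is a minimal sketch of the kind of kernel that lives or dies by the on-chip scratchpad, written in CUDA since __shared__ memory is the closest analogue to AMD's LDS (the names and sizes are made up for illustration; this is not my actual LDS code):

// Hypothetical illustration: a block-wide sum reduction staged through on-chip
// __shared__ memory (CUDA's analogue of AMD's LDS). The whole point of the
// pattern is that the tree reduction runs in fast scratchpad memory with no
// off-chip traffic, which is exactly what a part without LDS cannot offer.
#include <cstdio>
#include <cuda_runtime.h>

__global__ void blockSum(const float *in, float *out, int n)
{
    __shared__ float tile[256];                 // on-chip scratchpad (LDS analogue)
    int tid = threadIdx.x;
    int idx = blockIdx.x * blockDim.x + tid;

    tile[tid] = (idx < n) ? in[idx] : 0.0f;     // stage data in fast local memory
    __syncthreads();

    for (int stride = blockDim.x / 2; stride > 0; stride >>= 1) {
        if (tid < stride)
            tile[tid] += tile[tid + stride];    // reduce entirely in the scratchpad
        __syncthreads();
    }
    if (tid == 0)
        out[blockIdx.x] = tile[0];              // one partial sum per block
}

int main()
{
    const int n = 1 << 20, threads = 256, blocks = n / threads;
    float *h_in = new float[n], *h_out = new float[blocks];
    for (int i = 0; i < n; ++i) h_in[i] = 1.0f;

    float *d_in, *d_out;
    cudaMalloc(&d_in, n * sizeof(float));
    cudaMalloc(&d_out, blocks * sizeof(float));
    cudaMemcpy(d_in, h_in, n * sizeof(float), cudaMemcpyHostToDevice);

    blockSum<<<blocks, threads>>>(d_in, d_out, n);
    cudaMemcpy(h_out, d_out, blocks * sizeof(float), cudaMemcpyDeviceToHost);

    float total = 0.0f;
    for (int i = 0; i < blocks; ++i) total += h_out[i];
    printf("sum = %.0f (expected %d)\n", total, n);

    cudaFree(d_in); cudaFree(d_out);
    delete[] h_in; delete[] h_out;
    return 0;
}

If the smaller RV7xx parts really do lack an LDS, a kernel built around this pattern would have to spill to much slower off-chip memory or fail outright, which would be consistent with the slowdown I'm seeing, but as I said, I'm still investigating.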
 
I am slowly getting into PC DIY, and if my research is correct, the GT2xx parts also have really good power management. Idle draw is low. This is a factor that has held me back from acquiring the HD48xx family. I know they are VFM, but the power draw is not very good.

I hope AMD's new HD49xx improves on power usage, or else I'll have to continue praying for a GT200 mainstream GPU. I mean, who games 24/7?

That's a backward view of power consumption, and quite obviously not relevant anymore. Power draw is equal on HD4 cards versus the GTXs, and temperatures are cooler.
 
Temperature is mostly the result of power wattage (and leakage). If their power draw/usage is so similar, then the amount of heat they output is very similar. The only difference would come down to the coolers in use: some dissipate heat better than others. As long as both companies keep their cards within their thermal limits, I fail to see how it's even relevant. Power usage and how hot/cool the chips run are totally different things.
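
To put the cooler point into rough numbers (every figure below is made up purely for illustration): at steady state the die temperature is roughly ambient plus dissipated power times the cooler's thermal resistance, so two boards pulling the same wattage can still run at very different temperatures.

// Illustrative only: hypothetical thermal resistances, same board power.
// Steady-state die temperature ~ T_ambient + P * R_thermal(cooler).
#include <cstdio>

int main()
{
    const float ambient = 25.0f;    // deg C, assumed room temperature
    const float power   = 190.0f;   // watts dissipated under load, assumed
    const float r_big   = 0.25f;    // deg C per watt, large dual-slot cooler (assumed)
    const float r_small = 0.40f;    // deg C per watt, small single-slot cooler (assumed)

    printf("big cooler:   %.0f C\n", ambient + power * r_big);    // ~73 C
    printf("small cooler: %.0f C\n", ambient + power * r_small);  // ~101 C
    return 0;
}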

Low power = good for everyone when the heat starts getting dumped into your PC's area. For example, my 9800GTX 2-way/3-way setup may use less power while gaming than a quad GTX 295 setup, but whenever I am doing desktop work, the quad GTX 295 setup dumps a lot less heat into my room and the PSU heats up a lot less, thus decreasing heat all around.
 
Last time I checked, Furmark was running significantly faster on the 4870 compared to the 280... is it any real surprise that the 4870 uses more power there? Obviously, Furmark isn't the "power virus" for the 280. Of course, most sites only publish the power consumption numbers, not the actual performance.

Here are some results for Furmark (1280x1024 no AA):
4870 - 96 fps
280 - 71 fps

This was with stock clocks for both boards.

I can't speak for Crysis Warhead performance; I have not looked at it.

-FUDie

I had both running just this morning at about 46ish to 50ish Fps in extreme burning mode at 12x10 - and this is where we're doing our power measurements. A stock GTX285 draws about 208 watts while a stock HD 4870/1G draws about 192 watts. Crysis Warhead is about 58 watts lower on GTX285 and almost 70 watts lower on HD 4870/1G.

GTX260/216 (55nm) and HD 4850/512 are significantly below, though.


WRT "the power virus": I have yet to see an application which causes either RV770 or GT200 boards to draw a higher amount of current. Feel free to point me to some. :)
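
For what it's worth, normalising those readings the way FUDie suggests, and assuming both cards really were sitting in that 46-50 fps band at 12x10, puts the two boards in roughly the same Furmark fps-per-watt ballpark; a quick sketch:

// Rough fps-per-watt arithmetic from the numbers quoted above
// (48 fps is just the midpoint of the reported 46-50 fps range).
#include <cstdio>

int main()
{
    const float fps = 48.0f;
    const float watts_gtx285 = 208.0f;
    const float watts_hd4870 = 192.0f;

    printf("GTX 285: %.3f fps/W\n", fps / watts_gtx285);  // ~0.231
    printf("HD 4870: %.3f fps/W\n", fps / watts_hd4870);  // ~0.250
    return 0;
}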
 
I had both running just this morning at about 46ish to 50ish Fps in extreme burning mode at 12x10 - and this is where we're doing our power measurements. A stock GTX285 draws about 208 watts while a stock HD 4870/1G draws about 192 watts. Crysis Warhead is about 58 watts lower on GTX285 and almost 70 watts lower on HD 4870/1G.

GTX260/216 (55nm) and HD 4850/512 are significantly below, though.


WRT "the power virus": I have yet to see an application which causes either RV770 or GT200 boards to draw a higher amount of current. Feel free to point me to some. :)

Have you renamed Fur.exe to something else to avoid the artificial performance penalty on the HD4870?


Regarding idle power consumption of the HD4870: my solution is to create a CCC profile, with a hotkey assigned to it, that downclocks the memory to its minimum (e.g. 465MHz). Every time I want to play a demanding game I switch to the Performance profile (825/955), and for everything else I run the Power Save profile (625/465).
The difference at the socket is roughly 40W less in Power Save mode. Mind you, the GPU clock affects idle consumption very little: only 1-2W of difference between an 825MHz GPU and a 625MHz GPU!
All that by just pressing a combination of keys. (I know the card should do this automatically, but I think there is a good reason why it's not doing it: every time you change the memory speed, the screen flickers.)
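
Out of curiosity, here is roughly what that 40W at the wall adds up to, assuming (purely as an example) the machine spends eight hours a day at the desktop:

// Back-of-the-envelope savings from the ~40W idle reduction reported above.
#include <cstdio>

int main()
{
    const float watts_saved   = 40.0f;
    const float hours_per_day = 8.0f;    // assumed desktop/idle time
    const float kwh_per_month = watts_saved * hours_per_day * 30.0f / 1000.0f;

    printf("%.1f kWh saved per month\n", kwh_per_month);  // ~9.6 kWh
    return 0;
}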
 