ATI RV740 review/preview

I'm with DegustatoR on this one. If someone thinks that, then it's their own fault for not researching the product they intend to spend money on; it can't be blamed on NV.

For well over a decade now, both NV and ATI have released new low- and mid-range GPUs with higher model numbers than the high-end GPUs of their previous generations. Examples:

GFTi 4600 > GFFX 5200u
5800 Ultra > 6200GT
7900GTX > 8600GT
8800GTX > 9500GT
9800XT > X600Pro
X1900XT > HD2600
X850XPE > X1600XT
HD3870 > HD4650
Radeon 8500 > Radeon 9200

In all of the above cases the lower-numbered GPU from the older family was faster, so I don't see why people would only now start believing that every GPU from a new family generation is automatically faster than every GPU from the previous generation.

That's a bit like comparing apples to oranges isn't it?

Both companies always released new generations on a higher number and it was usually pretty obvious that an x8xx card from the previous generation would be faster than an x3xx or x6xx from the new generation.

However... in all these cases they used new GPUs. In the current case with the GTS250 and GTS240, they have been using the same old G92 GPU for nearly three "generations", all the way from the 8800 to the 9800 to the GTS200 series.

Anandtech put it nicely here:

NVIDIA's take on this is also flawed in that it treats customers like idiots and underlines the fundamental issue we have. Do I need a card with a new name on it to believe that it is worthy of my purchase, or can I go read reviews comparing the hardware and learn for myself whether or not any card (regardless of the name) fills my need? Maybe this name change is for people who don't know anything about graphics hardware then. In that case the thing that "sells" the card is the simple fact that NVIDIA has convinced someone that this part is an affordable version of a card from their latest line of products. Saying they need a name change to maintain current naming is essentially admitting that the only reason the name needs to be changed is to mislead uninformed people.
 
That's a bit like comparing apples to oranges isn't it?

Both companies always released new generations on a higher number and it was usually pretty obvious that an x8xx card from the previous generation would be faster than an x3xx or x6xx from the new generation.

However... in all these cases they used new GPUs. In the current case with the GTS250 and GTS240, they have been using the same old G92 GPU for nearly 3 "generations", all the way from the 8800 to the 9800 to the GTS200 series.

But I see no issue with using the old architecture if both the feature set and performance are near enough identical to what they would have been on a mid range version of the new architecture (gamers don't give a damn what version of CUDA the GPU supports).

In comparison to a true GT2xx based mid range part, the price is right, the performance is right, and the feature set is right. So what is the consumer losing?

Yes, it was obvious in past generations to anyone with a basic level of GPU knowledge that an x8xx card from the previous generation would be faster than an x3xx or x6xx from the new generation, but exactly the same is true here. I don't see anyone who knows how GPU naming schemes and generational performance jumps work automatically assuming the GTS 250 will be faster than the 9800GTX+.

Anandtech put it nicely here:

Anandtech complains that NV is treating its customers like idiots, and yet in this very thread we have talked about how prevalent that attitude is amongst average-Joe gamers.

The fact is that from a consumer perspective, there is virtually no difference between G92b and a mid range GT2xx so if consumers are shying away from G92b because it has an old name, then I don't see an issue in renaming it to something appropriate to its performance and feature set.

Note that some of the other things they are doing, i.e. the fast/slow versions and specifying which games to use in reviews, are actions I strongly disagree with.
 
However... in all these cases they used new GPUs.

Yet that's entirely irrelevant. Performance, features and price are the only things that matter. We all share the frustration vented in the Anand article but their logic is flawed IMO. If Nvidia had taped out a "new GPU" with the exact same performance as G92, called it the GTS 250 and some dolt upgraded from his 9800GTX is that somehow more acceptable because Nvidia wasted money on taping out a new chip?

The whole "new chip" argument is completely ridiculous in the context of G92 and GT200. There isn't a big enough architectural or feature difference to make the venture worthwhile. Especially for the consumer.
 
Yet that's entirely irrelevant. Performance, features and price are the only things that matter. We all share the frustration vented in the Anand article but their logic is flawed IMO. If Nvidia had taped out a "new GPU" with the exact same performance as G92, called it the GTS 250 and some dolt upgraded from his 9800GTX is that somehow more acceptable because Nvidia wasted money on taping out a new chip?

The whole "new chip" argument is completely ridiculous in the context of G92 and GT200. There isn't a big enough architectural or feature difference to make the venture worthwhile. Especially for the consumer.

At least in this case, someone who upgraded their 9800GTX to a GTS 250 would have the option to SLI them together. I know, I know, a lot of people here don't like SLI. But these cards can SLI together, and it's not immediately obvious.
 
All the same, there's no difference between RV770 and mobile RV770 as far as I know; heck, even the benchmarks on game.amd clearly say that the benchmark for the mobile 4870 was done with a 1GB 4870 card.

Is this chip really an RV770? I'm not 100% sure, because the RV770 has a size of 260mm² according to all the old articles, while the chip in the jpg has a size of 264.5mm². That could be a simple mismatch, but it could also mean that the chip is not an RV770 but an RV790, which according to the rumours is a slightly improved RV770. So maybe AMD needed a new layout for the RV790 to improve clock speeds etc., and therefore the chip grew a little in size.
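For what it's worth, the gap being wondered about here is tiny; a quick back-of-the-envelope check (using the 260mm² and 264.5mm² figures from the post) puts it under two percent:

```python
# Back-of-the-envelope check of the die-size discrepancy discussed above.
# 260 mm^2 is the commonly quoted RV770 die size; 264.5 mm^2 is the figure
# printed on AMD's slide. The difference is small enough to be measurement
# or rounding error, but could also hint at a respun chip (RV790).
rv770_mm2 = 260.0
slide_mm2 = 264.5
diff_pct = (slide_mm2 - rv770_mm2) / rv770_mm2 * 100
print(f"Difference: {diff_pct:.1f}%")  # → Difference: 1.7%
```

A sub-2% delta on a printed slide could easily be rounding, so it's weak evidence either way.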

I also found it a little strange that AMD gave the exact size of both (new?) chips on the jpg. Why would they do that? The size of the RV770 is well known.

Also, why haven't we seen an M4850 since Q3/Q4 2008? The RV770 has been available since Q2/08, so I would have expected to see notebooks with the M4850 much earlier than now. So maybe AMD needed a new revision of the RV770 to improve its power consumption, and in the end the RV770 grew into the RV790 with all the improvements necessary for the notebook market.

I know very well that this is all hypothetical, but well, it's my hypothesis. :)
 
At least in this case, someone who upgraded their 9800GTX to a GTS 250 would have the option to SLI them together. I know, I know, a lot of people here don't like SLI. But these cards can SLI together, and it's not immediately obvious.

Thanks for pointing that out, Chris. A lot of people were wondering if you need a BIOS flash or something for that. Thanks for clearing that up!
 
Is this chip really an RV770? I'm not 100% sure, because the RV770 has a size of 260mm² according to all the old articles, while the chip in the jpg has a size of 264.5mm². That could be a simple mismatch, but it could also mean that the chip is not an RV770 but an RV790, which according to the rumours is a slightly improved RV770. So maybe AMD needed a new layout for the RV790 to improve clock speeds etc., and therefore the chip grew a little in size.

I also found it a little strange that AMD gave the exact size of both (new?) chips on the jpg. Why would they do that? The size of the RV770 is well known.

Also, why haven't we seen an M4850 since Q3/Q4 2008? The RV770 has been available since Q2/08, so I would have expected to see notebooks with the M4850 much earlier than now. So maybe AMD needed a new revision of the RV770 to improve its power consumption, and in the end the RV770 grew into the RV790 with all the improvements necessary for the notebook market.

I know very well that this is all hypothetical, but well, it's my hypothesis. :)

Well, it's a photoshopped marketing slide, but according to AMD that's a Mobility HD4850.
There is more than one laptop with the 4850 inside. Besides the MSI one there's also This one

Why are there no reviews? Dunno.
 
Let me get this straight: some people are bitter about a rebranding, even though the new "rebranded" card is 25% cheaper than the previous card of same/similar performance, has much lower power consumption, is physically much smaller, and has the option for more on-board memory (which comes in handy at higher resolutions and/or with SLI-based systems)???

Ummm, ok

So what, everything would be fine and dandy and no complaints if NVIDIA had simply created a totally cut-down version of GT200 with half the performance of a GTX 280 and named it GTS 250? Like what has been done with midrange video cards in some prior generations?

I don't get that.

Rebranding was probably the quickest and least expensive route for NVIDIA to go right now in the short run to combat AMD's new strategy. Clearly NVIDIA could have come out with a GT200-based midrange card, but at what extra performance, at what extra cost, and at what extra delay in time to market?

Frankly, there is nothing wrong with minimizing extra R&D expenditure, and nothing wrong with bringing as good or better performance and lower power consumption to lower price points. What matters is the end result, not the means to get there. It's not like anyone with even half a brain could confuse GTS 250 as a performance upgrade to a 9800 GTX+ based on the name alone.

NVIDIA ultimately must have deemed that they would be better off pouring more resources into GT300 development (and beyond).
 
We've already seen examples of people wanting this "new GTX160M" because it's obviously much better than that old 8800GTX.
Code:
GeForce 8800M GTX
Stream Processors 	96
Core Clock (MHz) 	500
Shader Clock (MHz) 	1250
Memory Clock (MHz) 	800
Maximum Memory 	512MB
Memory Interface 	256-bit
Code:
GeForce GTX 260M
Processor Cores	112
Gigaflops	462
Processor Clock (MHz)	1375 MHz
Texture Fill Rate (billion/sec)	31 
Memory Specs:
Memory Clock (MHz)	Up to 950 MHz
Standard Memory Config	1  GB GDDR3
Memory Interface Width	256-bit 
Memory Bandwidth (GB/sec)	61
So the second one is worse than the first one, yes?
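A quick sketch may help turn those spec-sheet numbers into comparable figures. This assumes NVIDIA's usual marketing count of 3 FLOPs per shader clock per SP for G8x/G9x-class parts and DDR-signalled GDDR3; the clocks and bus widths are taken straight from the spec blocks above:

```python
# Peak shader throughput and memory bandwidth from the spec sheets above.
# Assumes 3 FLOPs/clock per SP (MADD + MUL, NVIDIA's marketing count for
# G8x/G9x shaders) and double-data-rate memory signalling.

def peak_gflops(sps, shader_mhz, flops_per_clock=3):
    """Theoretical peak shader throughput in GFLOPS."""
    return sps * shader_mhz * flops_per_clock / 1000.0

def peak_bandwidth_gbs(mem_mhz, bus_bits):
    """Theoretical peak memory bandwidth in GB/s (DDR signalling)."""
    return mem_mhz * 2 * bus_bits / 8 / 1000.0

# GeForce 8800M GTX: 96 SPs @ 1250 MHz, 800 MHz GDDR3 on a 256-bit bus
print(peak_gflops(96, 1250), peak_bandwidth_gbs(800, 256))    # 360.0 51.2

# GeForce GTX 260M: 112 SPs @ 1375 MHz, 950 MHz GDDR3 on a 256-bit bus
print(peak_gflops(112, 1375), peak_bandwidth_gbs(950, 256))   # 462.0 60.8
```

Note that 462 GFLOPS and ~61 GB/s match the figures NVIDIA lists for the GTX 260M, so on paper the "second one" is roughly 25-30% faster, not worse; the point of the sarcasm is that the spec sheets use different fields, not that the newer part is slower.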

We know most sites used Cat 8.12 as drivers for the 4800s
We know the review cards were cherry picked and the reviewers fail to mention that the cards you see there are not the $149 they talk about.
We know there were (strict?) guidelines given on what to test, how to test and what to mention.
I know that we don't have any problems with doing whatever we want in our review of the GTS 250. That's enough for me to know. You may believe Charlie (who somehow thought NV was announcing an x86 CPU today).

Hell, if it weren't for unplayable framerates at 1920x1200 with 8xAA and 16xAF in Crysis (9.6 for the 4850-512 vs. 13.6 for the 1GB *OC* 250), Tom's Hardware wouldn't have anything to be happy about, yet they have orgasmic screams
I've said this numerous times: it's POINTLESS to use 8x MSAA on G8x+ GPUs. You have nice CSAA modes which are generally quite a bit faster. So the only unbiased comparison of G8x+ and RV7x0, in my opinion, is in 4x MSAA modes.

http://media.bestofmicro.com/9/P/181789/original/image011.png

It's hilarious: at 1280x1024 you're talking about 0.2fps and at 1680x1050 about 1.8fps, yet it's THOSE kinds of remarks that the tools base their conclusion on. It gets demolished by the 4870 at the same price point, and in other games it is only faster than a 4850 at unplayable framerates.
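To put the Crysis numbers quoted earlier in perspective (9.6 vs. 13.6 fps, from the Tom's Hardware comparison above), the relative gap sounds large even though both results stay unplayable:

```python
# Relative lead implied by the Crysis figures quoted above:
# 4850-512 at 9.6 fps vs. the 1GB OC GTS 250 at 13.6 fps.
fps_4850 = 9.6
fps_250oc = 13.6
gain_pct = (fps_250oc - fps_4850) / fps_4850 * 100
print(f"{gain_pct:.0f}% faster")  # a ~42% lead, but neither result is playable
```

This is the usual trap with percentage leads at low framerates: a 40%+ margin on paper is meaningless when neither card clears 15 fps.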
The new 48x0 pricing is quite bad for NVIDIA, that's a fact.
But I've seen reviews of the GTS 250, yeah. Have you? If you don't use 8x MSAA (I've said why that's a biased comparison) then the GTS 250 is even with the 4850 in ATI-favoring benchmarks and faster than the 4850 in NV-favoring benchmarks. And that's faster overall from where I'm standing.
And it's quite clear that GTS 250 can't compete with 4870 on the same price level at all.

1- HD 3870 is faster than HD 2900.
You need to refresh your memory. 3870 isn't and simply can't be faster than 2900.

2- HD 3xxx introduced DX_10.1, tessellation, a 50% reduction in power consumption, and half the price.
It introduced DX10.1 support only which is useless right now even on RV770x2 (the only game which uses it somewhat intensively -- Clear Sky 1.5.07 -- is unplayable with DX10.1 features enabled in any sane resolution: http://www.ixbt.com/video/itogi-video/test/0902_scsbench.shtml)
As for half the price and reduction in power consumption -- that's funny because GTS 250 is doing exactly the same compared to 9800GTX+. But what NV's doing is bad and what AMD's done is good, right?

3- 8800GT -> 9800GT introduced what? Nothing besides a few MHz on core/mem and a new name.
They've kept the performance level, which is already ten times more important to the end user than what AMD did with the 2900->3870 transition.

Are you sure? We have lots of rumors that NV's 40nm parts are in trouble (they are later than ATI for sure) and GT212 (the GT200b replacement and most important chip) is in the toilet.
Lots of rumours? From where? From Charlie? -)
What you've heard right now originated from AMD, not NV. It's RV740 which is late and needs another spin before it can come to market. Yesterday's announcement from AMD was a paper launch intended to spoil the hard launch of NV's new mobile offerings -- strange that no one mentioned that.
GT212 isn't the RV740 competitor. It may well be cancelled but certainly not because of the problems with 40G. All the other GT21x chips (one of which should be the direct RV740 competitor) are still coming.
 
So the second one is worse than the first one, yes?

Dude, I wrote a 1, not a 2.

The new 48x0 pricing is quite bad for NVIDIA, that's a fact.
But I've seen reviews of the GTS 250, yeah. Have you? If you don't use 8x MSAA (I've said why that's a biased comparison) then the GTS 250 is even with the 4850 in ATI-favoring benchmarks and faster than the 4850 in NV-favoring benchmarks. And that's faster overall from where I'm standing.
And it's quite clear that GTS 250 can't compete with 4870 on the same price level at all.

No, they're reviewing a $179 GTS-250 OC 1GB model against a $129 stock HD4850

You need to refresh your memory. 3870 isn't and simply can't be faster than 2900.

I've had an HD2900XT and would've loved to trade it in for an HD3870. There are only a couple of games in which the 2900XT is actually faster. Heck, the 2900XT is barely faster than a 1900XTX at times. Your view is very askew.

As for half the price and reduction in power consumption -- that's funny because GTS 250 is doing exactly the same compared to 9800GTX+. But what NV's doing is bad and what AMD's done is good, right?
Assassins Creed, Far Cry.... No, the GTS250 does not reduce power draw by half, and no, it doesn't reduce the price by half. Heck, it doesn't even go down $20, and it certainly does not introduce new features.

What you've heard right now originated from AMD, not NV. It's RV740 which is late and needs another spin before it can come to market. Yesterday's announcement from AMD was a paper launch intended to spoil the hard launch of NV's new mobile offerings -- strange that no one mentioned that.
Maybe because the GTS250 isn't a hard launch either?
 
Yesterday's announcement from AMD was a paper launch intended to spoil the hard launch of NV's new mobile offerings -- strange that no one mentioned that.
How can NVidia hard launch something that's not new?

Cherry picking G92b for a SKU that at best could have legitimately been called GTS250M, not GTX280M, is not a hard launch.

Where can I buy a laptop with GTX280M?

http://www.techreport.com/discussions.x/16507

Nvidia was a lot less secretive than AMD about its launch schedule, telling us plainly that GeForce GTX 200M-based laptops will come out next month.

GTS250 isn't even a hard launch - there's literally no excuse there - it's not as if the opening day of CeBIT was announced last week and GTS250 was put together in a hurry.

Jawed
 
Dude, I wrote a 1, not a 2.
So you're saying that naming your mobile GPUs the same as descreet while they are slower is bad? Then you should blame ATI for that -- it's their doing, they were doing this before there was any mobile NVIDIA GPU. NV is simply doing the same what ATI/AMD's doing here.

No, they're reviewing a $179 GTS-250 OC 1GB model against a $129 stock HD4850
From the reviews I saw, everyone was reviewing either the GTS 250 alone or the GTS 250 together with a GTS 250 OC Edition.

I've had a HD2900XT and would've loved to trade it in against a HD3870. There are only a couple of games at which the 2900XT is actually faster. Heck, the 2900XT is barely faster than a 1900XTX at some times. Your view is very askew.
I have a 2900 right now as well as a 3870 X2. The 2900 is way faster than one of the 3870 X2's GPUs nearly everywhere -- especially with any kind of AA. My view is what I see on my screen, sorry.

Assassins Creed, Far Cry.... No, the GTS250 does not reduce power draw by half, and no, it doesn't reduce the price by half. Heck, it doesn't even go down $20, and it certainly does not introduce new features.
So "half" is good, -20% is bad?

Maybe because the GTS250 isn't a hard launch either?
The GTS 250 launch is as hard as it gets, because it's mostly the same 9800GTX+ -- you can go and get one right now.
And I was talking about NV's new mobile offerings, not the GTS 250. AMD didn't announce any new desktop cards yesterday; they announced new mobile GPUs. And while NV's new G-line is available right now, you'll have to wait until Q2 for AMD's offering.
 
You need to refresh your memory. 3870 isn't and simply can't be faster than 2900.

Yes it can. In some cases it's faster, in some it's slower. Depends on the game...



DegustatoR said:
It introduced DX10.1 support only which is useless right now even on RV770x2 (the only game which uses it somewhat intensively -- Clear Sky 1.5.07 -- is unplayable with DX10.1 features enabled in any sane resolution: http://www.ixbt.com/video/itogi-video/test/0902_scsbench.shtml)
As for half the price and reduction in power consumption -- that's funny because GTS 250 is doing exactly the same compared to 9800GTX+. But what NV's doing is bad and what AMD's done is good, right?
The GTS250 is reducing power consumption by 50% at idle and load? :LOL::LOL:
Dude, you must be dreaming.

As far as DX_10.1 goes, I remember ATI had 4 partnerships for new games with DX_10.1 support. They have a deal with Blizzard too, and Win 7 takes advantage of DX_10.1 in Aero Peek.
Also, DX_11 will incorporate DX_10.1 and tessellation, so by DX_11's time this ATI GPU can enable some DX_11 features.



DegustatoR said:
Lots of rumours? From where? From Charlie? -)
What you've heard right now originated from AMD, not NV. It's RV740 which is late and needs another spin before it can come to market.

RV740 is up and running on desktop and mobile. Is it late? That depends on the definition of late. Compared to Nvidia it is very early.

DegustatoR said:
Yesterdays announcement from AMD was a paper launch intended to spoil NVs new mobile offerings hard launch -- strange that noone mentioned that.
Nvidia + New = :LOL:
These days, be careful about what you buy from Nvidia. In case you don't know, the GTX280M and GTX260M mobile parts are G92-based, not GT200-based.
That's the equivalent of eating the same crap again.

As for AMD, it showed notebooks with the HD 4850, HD 4870 and HD 4870X2 (this is really new, not a fake name on RV6xx architecture). It didn't spoil anything with a paper launch; it only showed plans for the first 40nm mobile GPU.

DegustatoR said:
GT212 isn't the RV740 competitor. It may well be cancelled but certainly not because of the problems with 40G. All the other GT21x chips (one of which should be the direct RV740 competitor) are still coming.
The only 40nm chip taped out so far is GT218, which is the very low end.
The other chips are in an unknown state.

RV740 taped out at the end of 2008, and yields are so good that the launch was pulled in from May to April, with desktop and mobile arriving at the same time. It looks to me like a very successful and early entry.
 
Let me get this straight: some people are bitter about a rebranding, even though the new "rebranded" card is 25% cheaper than the previous card of same/similar performance, has much lower power consumption, is physically much smaller, and has the option for more on-board memory (which comes in handy at higher resolutions and/or with SLI-based systems)???
Well, the older one had the option for more memory too (though it wasn't the reference design).
Apart from that, I personally don't really mind a new name; it's just that the name suggests it's something it isn't. Something like 9810 GTX would have been more appropriate (or, if nvidia wanted that GTS moniker, maybe even GTS 1xx something). Viewed from a different perspective, a rename is actually good since it lets you easily buy the newer card, which is usually what you want (certainly if the newer one isn't more expensive) -- in this case if only because of the lower power consumption (granted, some of the later 9800 GTX+ cards might be close, but it still seems to be a bit better). (WD, for instance, does the opposite: they sell completely different hard disks under the exact same model number, and it's a royal pain to make sure you get the better one.)

So what, everything would be fine and dandy and no complaints if NVIDIA had simply created a totally cut-down version of GT200 with half the performance of a GTX 280 and named it GTS 250? Like what has been done with midrange video cards in some prior generations?
You're right that there isn't exactly a huge difference in the capabilities of GT200 and G92 in the areas that matter, but yes, I think everybody would prefer it if names suggesting the same generation were actually from the same generation.

Rebranding was probably the quickest and least expensive route for NVIDIA to go right now in the short run to combat AMD's new strategy. Clearly NVIDIA could have come out with a GT200-based midrange card, but at what extra performance, at what extra cost, and at what extra delay in time to market?
I dunno if they could have come out with something competitive based on GT200 actually (you can't ignore time to market and cost).

Frankly, there is nothing wrong with minimizing extra R&D expenditure, and nothing wrong with bringing as good or better performance and lower power consumption to lower price points. What matters is the end result, not the means to get there. It's not like anyone with even half a brain could confuse GTS 250 as a performance upgrade to a 9800 GTX+ based on the name alone.
I dunno, but I think that yes, some less informed people will think that.

NVIDIA ultimately must have deemed that they would be better off pouring more resources into GT300 development (and beyond).
I really wonder how those GT2xx derivatives look like though...
 
It introduced DX10.1 support only which is useless right now even on RV770x2 ...
It is as useful as PhysX at the moment.

3- 8800GT -> 9800GT introduced what? Nothing besides few Mhz core/mem and a new name.
They've kept the performance level which is already ten times more important to the end user then what AMD's done with 2900->3870 transition.
OK, the step from 8800GT to 9800GT was made with just a simple sticker change and no price change (the 9800GT was even priced a bit higher than the 8800GT here in Poland). For you that's "keeping the performance level" (please keep in mind that a lot of 9800GT boards had the 65nm G92 on them). On the other hand, AMD's RV670 decreased die size, price and power consumption, and added DX 10.1 and UVD. But for you what NV did is ten times more important for the end user than what AMD did. Can you explain how you came to that conclusion?

What most people are trying to say is that review sites should inform users about what is good/bad/new/old/worth buying. And what nV is doing right now is just plain bad for the customer. They are trying to use misinformation to boost their sales. They are not offering anything new or better than their previous parts; they just try to present their old parts as something new. When reviewers see this they should point it out, not just present information handed to them by the manufacturer. I am aware that both sides have done something like this in the past, but that doesn't mean it is OK to do it once more.
Clients base their purchase decisions on reviews they find on the internet. They have the right to know everything about the product, and it is the reviewer's duty to present all the information necessary to make a good purchase decision. And please remember that it is very hard to change a first impression in the GPU industry. Really good marketing can do miracles for a manufacturer (Nvidia's FX lineup) but can also be very bad for the end user.
 
Let me get this straight: some people are bitter about a rebranding, even though the new "rebranded" card is 25% cheaper than the previous card of same/similar performance, has much lower power consumption, is physically much smaller, and has the option for more on-board memory (which comes in handy at higher resolutions and/or with SLI-based systems)???

Ummm, ok

So what, everything would be fine and dandy and no complaints if NVIDIA had simply created a totally cut-down version of GT200 with half the performance of a GTX 280 and named it GTS 250? Like what has been done with midrange video cards in some prior generations?

I don't get that.

Rebranding was probably the quickest and least expensive route for NVIDIA to go right now in the short run to combat AMD's new strategy. Clearly NVIDIA could have come out with a GT200-based midrange card, but at what extra performance, at what extra cost, and at what extra delay in time to market?

Frankly, there is nothing wrong with minimizing extra R&D expenditure, and nothing wrong with bringing as good or better performance and lower power consumption to lower price points. What matters is the end result, not the means to get there. It's not like anyone with even half a brain could confuse GTS 250 as a performance upgrade to a 9800 GTX+ based on the name alone.

NVIDIA ultimately must have deemed that they would be better off pouring more resources into GT300 development (and beyond).

They should have kept the name 9800GTX and lowered the price to compete. Renaming it has just one effect: confusing consumers into buying the "new video card".
 
I dunno but I think that yes some less informed people will think that.

Those same people would think that regardless of whether it was G92b or some brand new GT2xx variant under the hood. That's the point many are missing. The chip under the heatsink is irrelevant to everybody but us.
 
Another:

According to our sources, NVIDIA's attempts to produce a die-shrunk 40nm GT200 chip were "disastrous at best". Design problems became evident, since the GT200 was originally designed for the 65nm node. Two shrinks in a row without a major redesign was just too much for NVIDIA, and our most recent information from Taiwan is that the first 40nm chips from NVIDIA will be in the GeForce 300 series.
http://www.dailytech.com/article.aspx?newsid=14480

I rest my case about who is early or late to 40nm when ATI is shipping RV740 within a month on desktop + mobile.
 