ATI RV740 review/preview

Yes, so where did the price war leave AMD? Not much change, still not making a shift, but in the end it just hurt nV and put both of them in the same boat. Isn't that just bad for the market overall? That's exactly what that article was talking about too. A 1-2% change is not what AMD expected; that kind of change doesn't warrant the price cuts, because they cut prices by 10% or more. How is that good business? Cut margins by 10% and then gain 1 to 2 percent market share? Talk about being really good at balancing a scale. Now add in the recession, where both companies lose sales (not because of price wars) because of a general lack of consumer confidence, and it was just a stupid move.

AMD has been doing price cuts on their CPUs too; has that helped them gain market share or get back to profitability? No, it hasn't. Similar situation.

Yeah, I really wish AMD would raise their prices. I just don't feel like I'm sufficiently stimulating the economy with my graphics card purchases anymore.
 
It left AMD's Graphics division (ATI) with only an 8% revenue loss.

It left Nvidia's Graphics division with a 47% revenue loss.

I think that's pretty significant. And even with those massive price/margin cuts, they still ended up losing 1-2% marketshare year on year to AMD/ATI.

Regards,
SB


That isn't significant when you look at overall performance against debt, man. What does it matter if AMD is going into a situation where it can't bring money in without outside investors? It's pretty bad right now for AMD. If they hadn't gone with a price war and had priced accordingly, they would have had the same loss of sales but higher margins, and possibly they wouldn't have had that loss at all. So far, every single quarter for the past 3 years, possibly even 4, ATI has not been able to meet quarterly expectations when it comes to revenue.

The last 2 quarters have been bad for nV, you state, especially the last quarter, but all of this is due to the price cuts AMD has been making (compounded by the recession), which didn't do sh*t for them when you look at market share. They have higher margins than nV, but this isn't about margins; it's about making money and gaining market share, because that's the only way price cuts work. AMD failed at it, period, because they misread the market and didn't expect nV to cut into their margins, which is not just an oversight.

Yeah, I really wish AMD would raise their prices. I just don't feel like I'm sufficiently stimulating the economy with my graphics card purchases anymore.

Sarcasm is noted and dismissed, nothing more ;).
 
Interesting. I mentioned 40% with the 10% margin loss from the write-off, or did you misunderstand me?

1- I didn't misunderstand you. You misunderstood Nvidia's data:
Burkett ascribed 10 percentage points of the gross margin decline to inventory write-offs in the fourth quarter, while a shift in demand to lower-margin products accounted for the rest of the drop.

Write-off + lower-margin products = 10% drop

While the write-off was only 50M, the GPU division lost almost 250M. Do the math... that 40% was never reachable, with or without the write-off. It's a combination of two factors and you have to account for both. Nvidia's margins for Q4 2008 were 29%, and they expect 35% for Q1 2009, which they will probably miss if the market continues this way.

2- We are in a world with credit problems.
ATI solution: Drop prices and introduce two more new chips (RV740 and RV790). One of them in 40nm.
Nvidia solution: Massive renaming.

OK, you don't mind eating G92 crap all over again. I see that for you the GTS250, which is equal to the 9800GTX+, can eat the HD 4870. That was hilarious.
You still say that the HD 4870 is less competitive than the GTX260 core 216. You don't accept the fact that the HD 4870 is equal to the GTX260 core 216 @ 55nm.

Review from today:
The NVIDIA GeForce GTX 260 and the AMD Radeon HD 4870 1GB have both matured. They are amazing video cards and even more unbelievably they both seem to be within a hairs width of each other performance wise. While one may take a superficial lead in one game or the other when looking at the gameplay experience they are both dead even.
http://enthusiast.hardocp.com/article.html?art=MTYyNiwxLCxoZW50aHVzaWFzdA==

You are misleading on too many points. Misrepresenting ATI's performance GPUs and putting G92 at a level it hasn't been at for a long, long time.

ATI is getting very good feedback on forums for the RV740. Nvidia is getting very, very bad feedback for this massive renaming.
ATI is attacking the crisis with credibility and new 40nm products, less expensive and fast. Nvidia is attacking with renamings. You will see tons of bad comments about Nvidia everywhere. Nvidia is destroying its reputation, everyone sees that, and this loss of reputation is much worse than having poorly performing cards on the market.
No matter what you say, no one agrees with you on those renamings.

And you keep pushing the button on the Q4 2008 market share numbers. We don't know the discrete numbers for Q4 2008.
We all know that Nvidia recovered a bit in global share, yet it lost year on year. We all know that AMD will never surpass Nvidia in global share, for the simple fact that Nvidia makes IGPs for Intel. We also all know that since the RV770 launch Nvidia has never made a profit again, and mixed with failed mobile parts + the crisis, the sum of 2008 resulted in a loss.
The Q1 and Q2 2009 numbers should be so bad that they will destroy all of 2009's finances.
Nvidia is trailing AMD:
- Starting to make losses that get bigger each time.
- Reputation down from failed mobile GPUs + massive renamings.
- Lack of competitiveness in price/performance.
 

LOL, you misread what Burkett stated. Can someone step in and explain the sentence to this guy? "10% + more" is how that reads. Why not take a look at their margins last quarter? Then you will understand why your reading of that statement is really off :) He was comparing to the quarter before.


You are going by forums to base your ideas about sales? That's a good one... I don't care what Kyle states; those are his feelings, and last time I checked he is only one guy. He stated he might not like the renaming, but then tell me about the x600 renames from ATI. Dude, it's a stupid argument made to get publicity. Kyle isn't always right. Although I like HardOCP's articles and reviews, his "I'm always correct" attitude has to drop a bit, and I told him that before, when he stated his reviews are perfect and the only way to do it.

You don't know the market share numbers, keep that in mind. I already pointed out ways to get a general idea of that; no more on that topic, please.
 
LOL, you misread what Burkett stated. Can someone step in and explain the sentence to this guy? "10% + more" is how that reads. Why not take a look at their margins last quarter? Then you will understand why your reading of that statement is really off :) He was comparing to the quarter before.

He is NOT misreading it.

Burkett ascribed 10 percentage points of the gross margin decline to inventory write-offs in the fourth quarter, while a shift in demand to lower-margin products accounted for the rest of the drop.

10 percentage points of the gross margin decline were ascribed to the inventory write-offs. And a shift in demand to lower-margin products (for Nvidia) accounted for the rest.

In other words, 10% of the total gross margin decline is actually due to the write-off.

So if gross margin declined by 10%, then 1% of that is actually ascribed to the inventory write-off. But nowhere is it mentioned how much gross margins declined. So it could have been 10% of a 5% drop or any other number.
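The two readings being argued over here can be sketched with some quick arithmetic. A minimal sketch, assuming the 12.9-point decline (41.9% to 29%) quoted later in this thread; these are the posters' figures, not numbers taken directly from Nvidia's filings:

```python
# Two readings of "10 percentage points of the gross margin decline
# ascribed to inventory write-offs". Figures are illustrative.

def reading_points(total_decline_pts):
    """Write-offs account for 10 percentage points of the decline;
    the shift to lower-margin products accounts for the remainder."""
    writeoff_pts = 10.0
    mix_pts = total_decline_pts - writeoff_pts
    return writeoff_pts, mix_pts

def reading_fraction(total_decline_pts):
    """Write-offs account for 10% *of* the decline, however big it was."""
    writeoff_pts = 0.10 * total_decline_pts
    mix_pts = total_decline_pts - writeoff_pts
    return writeoff_pts, mix_pts

# With a 12.9-point decline (41.9% -> 29%), the two readings attribute
# very different amounts to the write-off:
w1, m1 = reading_points(12.9)    # ~10.0 points write-off, ~2.9 mix shift
w2, m2 = reading_fraction(12.9)  # ~1.29 points write-off, ~11.6 mix shift
```

Under the percentage-points reading the write-off dominates the decline; under the fraction reading it is a minor contributor, which is exactly the gap between the two positions here.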

Meanwhile ATI still maintained a relatively healthy margin in comparison to Nvidia. In other words Nvidia could drop to 0% margin and ATI could still price their competing products lower and maintain a positive margin.

This is made even more relevant when you consider that G92 will be competing with a much smaller RV740. The renaming scheme will help reduce inventory by trying to convince unknowing consumers that this is the latest and greatest from Nvidia in the midrange; however, it's not going to help them achieve their forecast of returning to 35% margins.

Regards,
SB
 
Gross margin in the previous quarter was 41.9%; if their margins are now at 29%, what is the difference? The decline in gross margins has to be compared to something, either the previous quarter or Q4 of last year. That is more than a 10% decline; it's 12.9 points. If they were comparing to the previous Q4 ('07) it would be 17.5 points. Either way you look at it, it's more than 10%.

http://www.anandtech.com/weblog/showpost.aspx?i=518

One interesting item from NVIDIA’s statement was that their gross margin is up, nearly 3%, from 39.1% to 41.9%. As NVIDIA has continued to take a soaking on the GTX 200 series, a more rational outcome would have been for those GPUs to drag the gross margin down; instead and in spite of that it’s up. Clearly NVIDIA is still finding a way to make money on what’s an expensive chip to make, and barring further price cuts things should further improve as they finish transitioning their GPUs to 55nm. Their “performance segment “ (which we take to mean the sub-$200 GF9 parts) is now entirely at 55nm, for example.
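The point-decline arithmetic above can be checked directly. A minimal sketch using the percentages quoted in this thread; treat them as the posters' figures rather than audited financials:

```python
# Gross-margin decline in percentage points, using the figures quoted
# in this thread: 41.9% in the prior quarter, 46.5% a year earlier,
# 29% now.

def decline_points(before_pct, after_pct):
    """Decline expressed in percentage points, not as a relative change."""
    return before_pct - after_pct

vs_prior_quarter = decline_points(41.9, 29.0)  # ~12.9 points
vs_year_ago = decline_points(46.5, 29.0)       # ~17.5 points

# Either baseline gives a decline of more than 10 points.
assert vs_prior_quarter > 10 and vs_year_ago > 10
```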
 
gross margins from previous quarter was 46.5%, now if their margins are now at 29%, what is the difference?

The difference? You just accused someone of misreading when they were in fact not.

You try to blame margins being so low on everything but the fact that Nvidia had to reduce them under competitive pressure from an ATI chip that is cheaper to manufacture.

Or did you already forget trying to attribute almost the entire margin loss to inventory write-offs?

All evidence so far points to the price war being an overall positive thing for ATI/AMD, especially when you factor in how much it has cost Nvidia.

Everyone is suffering through the recession. Had AMD not dropped prices (assuming there was no Nvidia to compete with) then they likely would have sold fewer units and could quite possibly have suffered a far larger drop in revenue than just 8%.

Faced with the same situation Nvidia's graphics unit suffered a 47% drop in revenue.

This isn't to say that AMD was some master of planning and knew ahead of time they would need a small efficient chip to manufacture to reduce losses for an upcoming recession. They just got lucky.

But to claim that AMD's graphics unit (ATI) did not benefit from this and didn't come out of the Q4 in better shape than Nvidia is just ludicrous.

And then to compound that by implying that somehow Nvidia is going to return to >35% margins when they have to further reduce the cost of G92 based parts is, well, mind boggling.

Even if sales of those parts won't be reflected in future financials, at the very least it will cannibalize sales of parts that will be sold.

Likewise, RV790 will be coming out soon, with the prospect that Nvidia will yet again have to cut prices and margins.

None of this is to say that Nvidia is in trouble yet. Nothing that has happened is unrecoverable. And over the rest of 2009 we'll see how they adjust to the situation and how they respond to renewed pressure from ATI. Likewise over the rest of 2009 we'll have to see if ATI can continue to execute well.

However, Nvidia's Q4, other than roughly maintaining marketshare year over year, has been a disaster.

Regards,
SB
 
He did misread it, as you did. I edited my post with more info; it is 10+%, not 10% in whole.

If you measure benefit by not helping yourself while hurting others, that's not really a benefit at all! Because the downside is that AMD is still in the same position. That's exactly what the article I linked stated: it doesn't help a competitor to go into a price war trying to gain market share in a recession (the big R is very important here, and it seems like people aren't factoring that in) when it can't compete in any other way.

Recessions aren't bound by price, they are bound by consumer confidence. It doesn't matter what the price of an object is; if consumers aren't willing to buy because of their pocketbook or the outlook for their job, they will not be persuaded into buying a product of entertainment value. The only thing that would change that is the intrinsic value of the object, something other than price that would give a reason for the purchase.

If there were no recession I would agree the price war would have been great, because AMD would be at a profit and nV would have been at a loss or flat. But that isn't what happened, is it? Factor in those 8 million lost sales and it would have been rosy.
 
There must be more to it than Fudzilla's letting on. That would surely be bait and switch.
 
That's RV770 isn't it?

Jawed

All the same, there's no difference between RV770 and RV770 mobile as far as I know; heck, even the benchmarks on game.amd clearly say that the benchmark for the 4870 mobile was done with a 1GB 4870 card.

Besides, it's my birthday today, I'm entitled to a little bit of win.
So with design win in hand, and chips ready at manufacturing I'm pretty sure that in 33 days we'll see desktop 40nm parts that just won't be different from mobile 40nm parts. They're taking the G92b "one size fits all" approach on both markets.
 
I'm sorry but this I just find hilarious.

A Ferrari you buy for €200.000 is still a Ferrari when the prices are dropped and people can buy it at €100.000. Not even a name change to Lamborghini will change the fact that it IS and will always BE a Ferrari.
That's good. The GTS 250 isn't NVIDIA now? Last I checked it was.
Ferrari isn't the name of the car. It's the name of the company.
There are many cars which are essentially the same model with a different name, different pricing and a different time to market. I don't see anything wrong or hilarious in that, sorry.

The thing is that it's deceiving people. It's already happening with the current renaming of the 8800GT to 9800GT. I am seeing unsuspecting people (who do not know much about video cards, unlike the average B3D visitor) who bought an 8800GT 2 years ago buying a new 9800GT simply because the name is "1 generation" higher, so they think it's faster. You should see their faces when I tell them they just bought the same card. When stuff like that happens, and when the in-store clerks fail to tell the customers they're about to buy the same card... That IS bad imo.
It's bad that people have so much money that they don't use their brains when they spend it. Here's a Russian point of view for you -)
Every commercial company out there will use whatever method is necessary and legal to gain income. What's legal is right until proven otherwise.
AMD is naming their Phenom IIs right now the way Intel does with Core i7, while PhIIs are way slower than their Intel "counterparts".
No one is a saint on the market; everyone's using some kind of shady tactic to sell their goods when those goods aren't good at all.
When AMD named their RV670 parts 38x0 while they were slower than the 2900 parts, I didn't see you being this irritated by that move, even though it was worse than what NV did with the 8800GT -> 9800GT (they at least kept the performance level) or is doing now with the GTS 250.

You might know it's the same card, but a lot of (average Joe) people do not. It's simply misleading to put a higher name tag on an older card, no matter who does it, AMD or nV. But I guess we'll never agree on that one. So let's just agree to disagree. ;)
Oh believe me, I understand that. But I tend to blame those people, not NVIDIA or AMD or anyone else; it's their own fault that they don't know what they're buying. We're doing independent tests and benchmarks. For whom? For them, so that they may know how what compares to what. If they're so lazy that they can't read our reviews, that's not NV's/AMD's/Intel's/etc.'s problem; that's their problem.
Any commercial company will always try to be more profitable than it is. There are no "good" or "bad" commercial companies; they're all the same.

There's nothing wrong with trying to clean up the naming mess you've got in your lineup. But doing it like this just seems to me like trying to get rid of that nasty nVentory in a very devious way.
I still don't get it. How is the "GTS 250" moniker "bigger" or "better" than 9800GTX+? I mean, it's not even a GTX anymore, and it's what, 40 times less than it was?
You should be more thorough in your claims -)

So what happens when you even the odds and do some overclocking on the HD4870?
I'm not saying that G92b is competitive with RV770, it clearly isn't and can't be.

That just depends on which reviews you take, which games they use and which settings they use. And it also depends on which GTS250 you take.. the one with 1GB or the 512MB version. I've seen plenty of benchmarks where the HD4850 beats the 9800GTX+ (in 2 days to be called GTS250).
Sorry but no, it doesn't. Whatever review you take GTX+/GTS250 will be faster than 4850. It's just the way it is.

Not really. At that time the NV drivers were already optimized for multi-core CPUs in DX10, while the Radeons got their multi-core boost in the 8.12 hotfix. So it wasn't really apples to apples, and some applications got a relatively big boost on Radeon cards after the hotfix was applied.
http://www.xbitlabs.com/articles/video/display/geforce-driver-182-06.html
Don't think that only AMD can improve performance with their drivers. My own experience tells me that they're approximately even in this regard.

Let me get this straight:

Basically some people here think the RV740 does not benefit ATI's competitive position compared to nVidia.

I though it was obvious this chip was good news for ATI and bad news for nVidia. Am I missing something?
The only thing that everyone here seems to be missing is that it's pretty pointless to judge NVIDIA for trying to stay in the game by doing some cost cutting and line-up rearranging.
Another point that everybody should be thinking about is that comparing new AMD GPUs with one-and-a-half-year-old NV GPUs is fun and all, but pretty pointless.
Remember: it's not the first to market who wins, it's the one with the best product. NV will answer RV740/790 in the coming months, and only then will anyone be able to tell who has the better line-up for the next 6-9 months.
What NV's doing now is just some very low-cost damage control, nothing more.
 
Degustator, after today I'd have thought you'd soften up. We've already seen examples of people wanting this "new GTX160M" because it's obviously much better than that old 8800GTX.
We know most sites used Cat 8.12 as drivers for the 4800's
We know the review cards were cherry picked and the reviewers fail to mention that the cards you see there are not the $149 they talk about.
We know there were (strict?) guidelines given on what to test, how to test and what to mention.

Hell, if it weren't for unplayable framerates at 1920x1200 with 8xAA and 16xAF in Crysis (9.6 for the 4850-512 vs. 13.6 for the 1GB *OC* 250), Tom's Hardware wouldn't have anything to be happy about, yet they have orgasmic screams:

TomsHardware said:
Finally, the extra 512 MB on BFG’s GeForce GTS 250 demonstrates its value. At all three tested resolutions—but especially at 1920x1200—1 GB of onboard GDDR3 helps the GTS 250 almost double the frame rate of the GTX 9800+.
http://media.bestofmicro.com/9/P/181789/original/image011.png

It's hilarious: at 1280x1024 you're talking about 0.2 fps, and at 1680x1050 about 1.8 fps, yet it's THOSE kinds of remarks that the tools base their conclusion on. It gets demolished by the 4870 at the same price point, and in other games it is only faster than a 4850 at unplayable framerates.

I'm happy they're sorting it out in the conclusion, which is best quoted:
But it seems fairly certain, given our benchmark results, that the small overclock and extra 512 MB of GDDR3 memory don't really affect the card's standing against its predecessor until the resolution/detail settings are taxing beyond the point of playable frame rates anyway. For the most part, it isn't worth paying more money for the extra 512 MB–in which case, the 512 MB GeForce GTS 250 might be a better buy at $129.
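For what it's worth, the gap in the quoted Crysis numbers is easy to put in relative terms. A minimal sketch using only the frame rates quoted above; what counts as "playable" is obviously a judgment call:

```python
# Relative gap between the two Crysis results quoted above:
# 9.6 fps (HD 4850 512MB) vs 13.6 fps (1GB OC GTS 250) at 1920x1200.

def relative_gain_pct(baseline_fps, other_fps):
    """How much faster `other` is than `baseline`, in percent."""
    return (other_fps - baseline_fps) / baseline_fps * 100.0

gap = relative_gain_pct(9.6, 13.6)  # ~41.7% faster

# A sizeable relative gap, but both cards are far below playable frame
# rates at these settings, which is the point being made above.
assert 41 < gap < 43
```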
 
When AMD named their RV670 parts 38x0 while they were slower than the 2900 parts, I didn't see you being this irritated by that move, even though it was worse than what NV did with the 8800GT -> 9800GT (they at least kept the performance level) or is doing now with the GTS 250.

1- The HD 3870 is faster than the HD 2900.
2- HD 3xxx introduced DX10.1, tessellation, a 50% reduction in power consumption, and half the price.
3- The 8800GT -> 9800GT introduced what? Nothing besides a few MHz on core/mem and a new name.

Do you want to discuss deeper?
8800GTS.........G92 core, 128sp, 256bit
9800GTX.........G92 core, 128sp, 256bit
9800GTX+.......G92 core, 128sp, 256bit
GTS250..........G92 core, 128sp, 256bit


DegustatoR said:
Sorry but no, it doesn't. Whatever review you take GTX+/GTS250 will be faster than 4850. It's just the way it is.
Wrong. The HD 4850 is a perfect match for the GTS 250; proving that is the Anandtech review:
http://www.anandtech.com/video/showdoc.aspx?i=3523

Don't make me put all the images here. They are at the link, and the GTS250 and HD 4850 are perfectly matched, just as the 9800GTX+ was.


DegustatoR said:
NV will answer to RV740/790 in the coming months and only then anyone will be able to tell who will have the better line-up for the next 6-9 months.
What NV's doing now is just some very low cost damage control, nothing more.
Are you sure? We have lots of rumors that NV's 40nm parts are in trouble (they are later than ATI's for sure) and that GT212 (the GT200b replacement and most important card) is in the toilet.
 
I'm not gonna reply to your complete message, because basically I don't think we will ever agree. Ever.

But to this I just had to respond:

Sorry but no, it doesn't. Whatever review you take GTX+/GTS250 will be faster than 4850. It's just the way it is.

So under which rock have you been living today? Or did you miss all the GTS250 reviews, most of which showed the HD4850 and GTS250 to be on par?
 
So under which rock have you been living today? Or did you miss all the GTS250 reviews, most of which showed the HD4850 and GTS250 to be on par?

Don't try to reason; he's been nBlinded. For him, if nVidia is as fast as ATI, it means nVidia is faster. If it's slower than ATI, then the reviewers are biased or the drivers are bad.

I repeat: reasoning is futile.
 
2- HD 3xxx introduced DX10.1, tessellation, a 50% reduction in power consumption, and half the price.
Tessellation was already present across the HD 2000 series GPUs; the feature differentiators were DX10.1 (as you say), compliant PCI Express Gen 2, UVD across the range, and DisplayPort.
 
Degustator, after today I'd have thought you'd soften up. We've already seen examples of people wanting this "new GTX160M" because it's obviously much better than that old 8800GTX.

I'm with Degustator on this one. If someone thinks that, then it's their own fault for not researching the product they intend to spend money on; it can't be blamed on NV.

For well over a decade now, both NV and ATI have released new low- and mid-range GPUs with higher overall numbers than the high-end GPUs of their previous generations. Examples:

GFTi 4600 > GFFX 5200u
5800 Ultra > 6200GT
7900GTX > 8600GT
8800GTX > 9500GT
9800XT > X600Pro
X1900XT > HD2600
X850XPE > X1600XT
HD3870 > HD4650
Radeon 8500 > Radeon 9200

In all of the above cases the lower-numbered GPU from the older family was faster, so I don't see why people would only now start believing that every GPU from a new generation is automatically faster than every GPU from the previous generation.

If anything, the GTS250 is a little better than most previous mid-range GPUs in that it's at least as fast as, or faster than, the very fastest GPU of the previous generation. And that doesn't happen too often.
 