AMD: R7xx Speculation

Status
Not open for further replies.
http://enthusiast.hardocp.com/article.html?art=MTQ4MCwxLCxoZW50aHVzaWFzdA==

They are in this article. I guess it's unfair to say trounced; more like slightly better at DX9. I thought I remembered a recent H article showing the 3870 X2 doing better than in this article, but I can't find it offhand, if it exists.

There are also recent articles at H showing the 3870 X2 performing on par with or a bit above the 9800GTX in Assassin's Creed (though I guess you could attribute that to DX10.1 if you want to downplay the 3870, it's always something) and Rainbow Six Vegas 2.

Personally, I like real gameplay benches, but I hate the way H does it. It muddles things up so much that it always tends to look like there's little difference between the various cards.

They should do their "real world gameplay" testing, except with apples-to-apples benchmarks and clearly readable graphs, and ditch the "best playable" nonsense, imo. Using different settings on different cards really flies in the face of true benchmarking, imo.
 
From the benches I've seen, I would conclude that the HD 3870 X2 is generally a faster card than nVidia's single-chip offerings (at least the G92-based ones). There are games where CF doesn't scale very well; in those cases it's either a driver problem that may or may not get fixed, or a game-engine (optimization?) problem that affects all multi-GPU setups.

And by the way, you can't objectively benchmark Oblivion while fighting enemies. Remember, the point is to compare the cards to each other, not to determine which one provides a playable framerate at the given settings. You can get comparable numbers by running around peacefully, not by fighting enemies.


Ah, but if you do the former and not the latter, then you are selling the viewer a bad set of goods, as they will look at the review and go "WOW, 80 FPS with everything cranked", only to find that when they ACTUALLY play the game, they are really getting 50-60 FPS. Not to suggest that either is bad FPS-wise, just that the first is very misleading compared to actual gameplay.
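To make the flyby-vs-gameplay gap concrete, here's a minimal sketch. The frame-time figures are invented purely for illustration (not measured on any card): a smooth scripted run averages near 80 FPS while a combat-heavy run on the same hardware averages much lower, and a "1% low" metric diverges even further.

```python
# Hypothetical per-frame render times (milliseconds) for the same card:
# a scripted flyby vs. a combat-heavy gameplay run.
flyby = [12.5] * 90 + [14.0] * 10       # smooth scripted sequence
gameplay = [14.0] * 70 + [25.0] * 30    # dips during fights

def avg_fps(frame_times_ms):
    """Average FPS over the run: total frames / total seconds."""
    return len(frame_times_ms) / (sum(frame_times_ms) / 1000.0)

def low_fps(frame_times_ms, pct=0.99):
    """FPS at the slowest (1 - pct) tail of frames, a common 'minimum' metric."""
    worst = sorted(frame_times_ms)[int(pct * len(frame_times_ms)) - 1]
    return 1000.0 / worst

print(f"flyby:    avg {avg_fps(flyby):.0f} fps, 1% low {low_fps(flyby):.0f} fps")
print(f"gameplay: avg {avg_fps(gameplay):.0f} fps, 1% low {low_fps(gameplay):.0f} fps")
```

With these made-up numbers, the flyby averages about 79 FPS while actual gameplay averages about 58 with 1% lows around 40, which is exactly the "80 on the graph, 50-60 in practice" gap being described.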
 
http://enthusiast.hardocp.com/article.html?art=MTQ4MCwxLCxoZW50aHVzaWFzdA==

They are in this article. I guess it's unfair to say trounced; more like slightly better at DX9. I thought I remembered a recent H article showing the 3870 X2 doing better than in this article, but I can't find it offhand, if it exists.

There are also recent articles at H showing the 3870 X2 performing on par with or a bit above the 9800GTX in Assassin's Creed (though I guess you could attribute that to DX10.1 if you want to downplay the 3870, it's always something) and Rainbow Six Vegas 2.

While I will grant the DX10.1 thing: if it works and gives them a boost, then it should be shown. We as gamers want the best possible experience playing our games, period. But at the same time, it should not take two cards, or one card with two GPUs, to equal a single-GPU card.

For the longest time now, since the R300, Nvidia has been the slower single-card solution, and in a good number of cases the slower multi-card (multi-GPU) solution as well. I have cards from both companies in my house, though none of my ATI cards are in use, as they are AGP and I have the fastest AGP card I own sitting in that slot now. I stayed away from SLI/CrossFire from its first appearance until last year, when I finally took the dive. I want ATI to come out with a competitive single-GPU solution, not a hacked X2 solution, as that really doesn't do anything to push Nvidia.

I will be honest with you: I thought ATI was gonna have something with the R600, just like everyone else did, because of the specs. Then the G80 launched, shocked everyone (including ATI, it would seem), and ATI just hasn't rebounded like everyone hoped they would. But they are making Nvidia fight for the area that brings in the most cash, and that is the $150-350 segment (yes, I know this actually spans multiple performance tiers).

Personally, I like real gameplay benches, but I hate the way H does it. It muddles things up so much that it always tends to look like there's little difference between the various cards.

They should do their "real world gameplay" testing, except with apples-to-apples benchmarks and clearly readable graphs, and ditch the "best playable" nonsense, imo. Using different settings on different cards really flies in the face of true benchmarking, imo.


I'm addressing this separately. I can understand your point of view: make all the settings the same and give me the numbers. But I kinda like the way they do it, as it gives you an idea of what is playable without you having to muddle around with settings trying to find that playable zone.

http://www.hardocp.com/article.html?art=MTQ3MiwzLCxoZW50aHVzaWFzdA==

The above link shows 9600GT SLI vs. a single 8800GTX. Pretty interesting, but for whatever reason (maybe I'm not at the same map/load area they used yet) I'm seeing better numbers: I have 2x XFX XXX 9600GTs in SLI, a C2Q 6600 OC'd to 3.0 GHz, 4GB DDR800, Vista Ultimate 64, the multi-GPU update for Vista, and the 1.1 patch for Crysis, with all settings at High and no AA, and I'm getting a solid 30-35 FPS all the time at 1680x1050.
 
Returning to the original question: how well will the upcoming single-chip (RV770) Radeon HD 4870 perform against the GF 9800GX2?


For competition's sake, I hope it performs better than the GX2, but if the rumors of it equalling the X2/GTX/Ultra are true, then ATI needs to be shot for giving up.
 
Ah, but if you do the former and not the latter, then you are selling the viewer a bad set of goods, as they will look at the review and go "WOW, 80 FPS with everything cranked", only to find that when they ACTUALLY play the game, they are really getting 50-60 FPS. Not to suggest that either is bad FPS-wise, just that the first is very misleading compared to actual gameplay.
Usually, you don't choose a card that would play your favorite game at the highest settings regardless of the cost. Usually, you have a sum of money you can spend on a graphics card and you want to get the best bang for your buck. The main focus is to compare the cards; that's why the settings are usually set so high that it's unplayable on some cards. It's then up to the reviewer to remind you that in actual gameplay, FPS will dip at times, but it's impossible to compare that objectively. So you have to look at games other than Oblivion.
Bloody shame... I had sooooo much hope for the R700 following the R600's failed attempt at retaking the crown.
You want the R700 to fail? :???:
 
FUDzilla reckons that RV770XT will sell for just $229, falling to $200: http://www.fudzilla.com/index.php?option=com_content&task=view&id=7363&Itemid=1

ATI's Radeon RV770XT based cards with GDDR5 memory will end up with a $229 suggested etail price, and it's likely to drop quickly under the $200 mark. This is the price for the 512MB card that should be the top of the mainstream/performance line of the ATI RV770 generation.

Naturally, the RV770PRO card with 512MB of GDDR3 memory will end up cheaper, and it will launch at an under-$200 suggested etail price.

RV770XT will replace Radeon 3870 512MB cards on the market, while RV770PRO should replace 3850 cards or live on top of them.

We believe that ATI won’t kill its affordable Radeon 3870 and 3850 cards and will make them co-exist with RV770 based products, at least for the time being.

Fudo also reckons that RV770 will launch 2 days before GT200: http://www.fudzilla.com/index.php?option=com_content&task=view&id=7362&Itemid=1

RV770XT with GDDR5 memory support and RV770PRO with GDDR3 support based cards will launch on June 16th, just days before Nvidia's Geforce GTX 280 product codenamed GT200.

ATI wants to spoil Nvidia's party, but in the end it will all come down to which is faster. From what we know, it looks like GT200-based products should end up faster than the RV770XT, but we believe that the R700, with its two RV770 chips, should be the real GT200 competitor.

As we said in a separate post, the R700 dual-chip card will launch at some point in August, which gives Nvidia quite a lead.

(See also http://www.fudzilla.com/index.php?option=com_content&task=view&id=7364&Itemid=1 for claimed GT200 launch date of June 18th).
 
:LOL:

So ATI has really given up then....or should I say AMD ? :cry:
Err, what?

It's RV770, not R700. Also, if, e.g., the XT really is as fast as some sources claim, it would be faster than even I expected it to be. You also have to take into account that the XT is dual-slot, which means there's quite some OC headroom left for AIBs (customers). At the currently rumoured prices, this is gonna be one hell of a SKU (4870). Hopefully GDDR5 supply isn't gonna limit them too much.
 
RV770Pro ~1.25x 88GT and RV770XT ~1.25x 98GTX seems more likely or? ;)

If this is what CJ is saying "yes" to, then that is pretty good imo, and about what we'd kind of expected. And if it's really priced at $229 and less, as the other new posts suggest, that's pretty nice. 1.25x a 9800GTX for $100 less?

As for the 9800GX2, well, that's going to be an up-and-down card like any SLI card, so comparisons to its top end don't mean much to me. When "on", it's twice as fast as a G92, which obviously isn't most of the time.
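The "1.25x for $100 less" claim works out to a sizeable performance-per-dollar edge. A quick sketch, using the rumored $229 price and ~1.25x figure from the thread; the ~$300 street price assumed for the 9800GTX is my own illustrative guess, not from the thread:

```python
def perf_per_dollar(relative_perf, price_usd):
    """Relative performance units per dollar spent."""
    return relative_perf / price_usd

# Baseline: 9800GTX = 1.00x performance at an assumed ~$300 street price.
# Challenger: RV770XT at the rumored ~1.25x relative performance and $229.
gtx = perf_per_dollar(1.00, 300)
xt = perf_per_dollar(1.25, 229)
print(f"RV770XT perf/dollar advantage: {xt / gtx:.2f}x")  # ~1.64x
```

If those rumored numbers held, the XT would deliver roughly 64% more performance per dollar than the 9800GTX baseline.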
 
Usually, you don't choose a card that would play your favorite game at the highest settings regardless of the cost. Usually, you have a sum of money you can spend on a graphics card and you want to get the best bang for your buck. The main focus is to compare the cards; that's why the settings are usually set so high that it's unplayable on some cards. It's then up to the reviewer to remind you that in actual gameplay, FPS will dip at times, but it's impossible to compare that objectively. So you have to look at games other than Oblivion.

Cost has everything to do with it. Let's say you have $200 to spend on a card, and you are limited to 1280x1024 because of the monitor you have. You read three sites, all of which used walkthroughs, flybys and cutscenes for their benching, and going by those sites, they all say the card performs great at your resolution with max settings and was faster than the competition at the same settings. So you buy your card based on that info, take it home, install it in your machine, go to play several games you have, and find out it isn't performing as you thought it would, even in some of the games used in the reviews. Now you start to wonder why. You go back and check the sites: your CPU is comparable to theirs, you have a little less RAM, so what is the issue? During your search for the cause, you discover another review that shows results more in line with what you have experienced, because they actually play the game.

So again, which review does the end user more credit: the ones that use cutscenes, flybys, walkthroughs and in-game benches that in no way reflect actual gameplay, and can thus mislead the prospective buyer about what to expect, or the review that is done by actually playing the game?
 
If this is what CJ is saying "yes" to, then that is pretty good imo, and about what we'd kind of expected. And if it's really priced at $229 and less, as the other new posts suggest, that's pretty nice. 1.25x a 9800GTX for $100 less?

As for the 9800GX2, well, that's going to be an up-and-down card like any SLI card, so comparisons to its top end don't mean much to me. When "on", it's twice as fast as a G92, which obviously isn't most of the time.


I'll believe it when I see it. Given there is not a lot of performance difference between the 8800GT and the 9800GTX, that must mean either A) GDDR5 isn't doing a damn thing for it other than helping AMD lose money, or B) the chip itself still has the issues that were in the R600. Why even bother with GDDR5 and all that bandwidth if it isn't going to be used? Because that's what it looks like here: unless I've missed something, the only two things different between the Pro and the XT are clock speeds and the type of memory used.
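The GDDR3-vs-GDDR5 bandwidth gap being argued about is easy to put rough numbers on. A sketch with assumed figures: the 1000 MHz GDDR3 clock, 900 MHz GDDR5 clock, and 256-bit bus below are illustrative guesses for the RV770 parts, not confirmed specs.

```python
def bandwidth_gb_s(effective_mhz, bus_width_bits):
    """Peak memory bandwidth in GB/s: effective data rate times bus width in bytes."""
    return effective_mhz * 1e6 * (bus_width_bits / 8) / 1e9

# GDDR3 is double-pumped (2 transfers per clock); GDDR5 moves 4 bits per pin
# per command clock, so its effective data rate is 4x the base clock.
gddr3 = bandwidth_gb_s(1000 * 2, 256)  # assumed 1000 MHz GDDR3, 256-bit bus
gddr5 = bandwidth_gb_s(900 * 4, 256)   # assumed 900 MHz GDDR5, 256-bit bus
print(f"GDDR3: {gddr3:.1f} GB/s  GDDR5: {gddr5:.1f} GB/s")
```

Under these assumptions the GDDR5 part would have roughly 1.8x the raw bandwidth of the GDDR3 part on the same bus, which is exactly why it would be wasteful if the chip couldn't use it.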
 