What do you expect for R650

What do you expect for HD Radeon X2950XTX

  • Faster than G80Ultra by about 25-35% overall

    Votes: 23 16.4%
  • Faster than G80Ultra by about 15-20% overall

    Votes: 18 12.9%
  • Faster than G80Ultra by about 5-10% overall

    Votes: 18 12.9%
  • About the same as G80Ultra

    Votes: 16 11.4%
  • Slower than G80Ultra by about 5-10% overall

    Votes: 10 7.1%
  • Slower than G80Ultra by about 15-25% overall

    Votes: 9 6.4%
  • I cannot guess right now

    Votes: 46 32.9%

  • Total voters
    140
I remember the pipeline count (vertex and pixel shaders) being similar, just the TMU count being higher for the 8500...
 
There were deficiencies in the 8500 too: it was AFAIK a 2x3 (pipes/TMUs per pipe) setup while GF3 was a 4x1, and in pure fill rate GF3 was faster, while in multitexturing things were different.
You're thinking of Radeon. 8500 was 4x2 (with 2 VS engines as well).
 
G80 is the clock-friendly architecture. Squeezing a 50% performance increase out of R600 is hard for the R6XX line to do. For G9X, however, a 500MHz clock increase with 32 TCPs would still come in at around the die size of an R580 under 65nm.
 
You're right!
My mistake (getting old, I think) :???:


Specification | Core Clock | Memory Clock (DDR) | Pipelines | TMUs
RADEON 8500 | 275 MHz | 275 MHz (550 MHz) | 4 | 2
GeForce3 Ti 500 | 240 MHz | 250 MHz (500 MHz) | 4 | 2
GeForce3 | 200 MHz | 230 MHz (460 MHz) | 4 | 2
GeForce3 Ti 200 | 175 MHz | 200 MHz (400 MHz) | 4 | 2
RADEON 7500 | 290 MHz | 230 MHz (460 MHz) | 2 | 3
GeForce2 ULTRA | 250 MHz | 230 MHz (460 MHz) | 4 | 2
GeForce2 Ti | 250 MHz | 200 MHz (400 MHz) | 4 | 2
GeForce2 Pro | 200 MHz | 200 MHz (400 MHz) | 4 | 2
 
I'm just shocked at how ridiculously poorly the 2900XT performs with AA/AF. Check the nvnews front page; there is an 8800GTX review link. The 2900XT apparently loses something like 45% of its performance in some games just by going up to 2xAA/4xAF. That's ridiculous. The GTX loses around 8% in the same situation.

Here is the link : http://www.thetechlounge.com/article/418-5/PNY+8800GTX+768MB+XLR8+Edition/

In FEAR at 1600x1200, it loses 37% going to 2x/4x. It seems like in most games 2x/4x brings around a 30% drop in frames, which is much more than either the GTS or the GTX. I'm just disappointed that the company that released cards like the R300 and the R580 (both of which scaled really well with AA and AF) would release something with such "broken" AA performance. I've seen Xbit Labs' recent benchmarks with the Cat 7.5s, and they seem to sing a different song, but I would like to see more sources confirm it.
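The percentage drops being quoted are just the frame-rate loss relative to the no-AA baseline. A quick sketch of the arithmetic, with hypothetical FPS numbers (not from any of the linked reviews):

```python
def percent_drop(fps_base: float, fps_aa: float) -> float:
    """Performance lost when enabling AA/AF, as a percentage of the baseline."""
    return (fps_base - fps_aa) / fps_base * 100

# Hypothetical FEAR-style numbers: 100 fps without AA, 63 fps at 2xAA/4xAF
print(round(percent_drop(100.0, 63.0)))  # -> 37
```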
 
Don't say that out loud. The lads over at Rage are convinced that nVidia is doing Transparency MSAA whilst ATi is doing Quality ADAA (meaning SSAA), and that it was actually the driver fairy that brought the performance improvements, not the implementation of EATM. :D
 
Interestingly, I totalled up this Xbit Labs review with the newer 7.5 Cats. Most game tests were run at 3 resolutions, so it was easy to declare a winner as whoever won 2 or more. For 3DMark05/06 I only included the overall score, and did not break out wins and losses by the individual game tests; in other words, 3DMark05 and 06 were counted as 1 test each (overall score).

Compared to stock 8800 GTX, HD2900XT won 5, lost 17, and tied 2.

Compared to 8800GTS, HD2900XT Won 14, lost 9, tied 1.
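The tallying method described above (winner of a game = whoever takes 2 or more of the 3 resolutions; 3DMark counted once via its overall score) can be sketched like this. All the numbers here are made up for illustration, not X-bit's:

```python
from collections import Counter

# Per-test results as (card A fps, card B fps) at each tested resolution.
# Hypothetical numbers; 3DMark contributes only its one overall score.
tests = {
    "Game A":   [(55, 60), (41, 47), (28, 33)],
    "Game B":   [(70, 65), (52, 50), (35, 36)],
    "3DMark06": [(9800, 10400)],
}

def verdict(runs):
    """Win/loss/tie for card A on one test: majority over its resolutions."""
    wins_a = sum(a > b for a, b in runs)
    wins_b = sum(b > a for a, b in runs)
    if wins_a > wins_b:
        return "won"
    if wins_b > wins_a:
        return "lost"
    return "tied"

tally = Counter(verdict(runs) for runs in tests.values())
print(dict(tally))  # -> {'lost': 2, 'won': 1}
```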

So at least the HD2900XT proved clearly superior to the GTS, with the caveats of decreased IQ and increased power consumption -- and also the fact that in some games the HD2900 drivers are clearly completely borked and it barely outperforms the X1950XTX, whereas the GTS provides much more consistent performance.

On the HD2900's side, you get the feeling that, due to its superior "computing power", it is more future-proof.

The HD2900XT actually performs similarly to, or worse than, the X1950XTX in no fewer than 5 of 24 games.
 
The 7.5's don't officially offer EATM yet -- when I investigated the performance and quality modes of Adaptive AA, both modes were indeed super-sampled with the 2900 XT.

Also, you can only go by what the reviewer offers; the X-bit reviewer claimed that they were 7.5 and adaptive was enabled.

However, I find it hard to believe that the 7.5's can offer identical scores to the 7.4's in every test and resolution.

Thanks goes to Blacklash for the heads-up:

2900 with 7.4 driver set

http://www.xbitlabs.com/articles/video/display/radeon-hd-2900-games.html

2900 which claims the 7.5 driver set -- identical benchmarks to the 7.4's across the board.

http://www.xbitlabs.com/articles/video/display/xfx-gf8800ultra-extreme.html
 
What is so odd about a driver revision producing identical/similar benchmark results? They might have concentrated on bug fixes or improved compatibility and stability. From the reviews shown, it seems a wise move for ATI to fix all the graphical glitches they exhibit.

There's a site out there (Driver Heaven?) that tracks driver revisions and compares them to previous drivers. The Radeon X19xx series has been relatively flat across driver updates for a long while.
 
What is so odd about a driver revision producing identical/similar benchmark results? They might have concentrated on bug fixes or improved compatibility and stability. From the reviews shown, it seems a wise move for ATI to fix all the graphical glitches they exhibit.

Not similar.

I would understand this point more if it were one test, but how many? Around 30 benchmark tests, with three resolutions for each one, for around 90 in total?

Identical benchmarks across the board in every single title at every resolution, including 3DMark, too?

If I bench using the same drivers, I may sometimes get different results from the same test -- that is why some people average their results over multiple runs.

Usually there are at least some slight variations in performance losses or gains somewhere, but identical results in all these tests with two different drivers?
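The point about averaging is just taking the mean over repeated runs so that run-to-run noise doesn't masquerade as a driver difference. A minimal sketch, with made-up FPS numbers:

```python
from statistics import mean, stdev

# Hypothetical fps from three runs of the same benchmark on the same driver;
# the spread is the run-to-run noise you'd expect even with nothing changed.
runs = [61.8, 63.1, 62.4]
print(f"avg {mean(runs):.1f} fps, spread +/- {stdev(runs):.1f}")
```

Given noise on that order, three different drivers producing bit-identical averages across ~90 tests would indeed be surprising.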
 
Folks that own the HD 2900XT are reporting performance increases with the 7.5 Cats in S.T.A.L.K.E.R. X-bit's numbers do not reflect this. The numbers they report in the Ultra review are all identical to the first review they did of the HD 2900XT, in every bench, down to fractions.

I might add that if you look at EB's recent review of the HD 2900XT with the 7.5 Cats, it supports forum users' assertions that S.T.A.L.K.E.R. performance is up.

It's right behind the 8800 640MB GTS.

They tested with dynamic lighting, then with static lighting plus active filtering:

http://www.elitebastards.com/cms/in...sk=view&id=420&Itemid=27&limit=1&limitstart=8
 
Still looks like a dodgy driver: 2048x1536 4xAA/8xAF is faster on the HD2900XT than 1600x1200 4xAA/8xAF.

Jawed
 
Maybe someone should look at that AA pattern. Didn't we have a case awhile back where Cats were automagically downgrading AA level past a certain res?
 
Maybe someone should look at that AA pattern. Didn't we have a case awhile back where Cats were automagically downgrading AA level past a certain res?

Yes, to 2x, but that doesn't completely explain why the 4xAA setting would be damaging performance so much that reducing it to 2xAA and adding a hefty number of additional pixels provides noticeably faster performance. If anything, like with most games in the past, I'd expect such a resolution increase to be very taxing.
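For a sense of how taxing that resolution jump should be, the pixel counts alone tell the story:

```python
# Pixel counts for the two resolutions discussed above
hi = 2048 * 1536   # 3,145,728 pixels
lo = 1600 * 1200   # 1,920,000 pixels
print(f"{(hi / lo - 1) * 100:.0f}% more pixels")  # -> 64% more pixels
```

So even if the driver silently drops 4xAA to 2xAA at the higher resolution, the card is still shading roughly two-thirds more pixels, which makes the faster result at 2048x1536 look like a driver problem rather than an AA trade-off.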
 
mikeshardware.co.uk said:
ATI Radeon X2900 65nm GPU is expected to be released around Mid August. The successor of the Radeon X2900 XT, the 65nm refresh version is expected to feature a lower TDP alongside improved texturing and AA performance and the inclusion of UVD http://www.mikeshardware.co.uk/RoadmapQ307.htm#ATI Radeon X2900 65nm.


Is he talking about the 65nm R600 or the 65nm R650?

mikeshardware.co.uk said:
AMD Radeon R680 GPU is expected to be released in December, with availability in January 2008, and is expected to be based on a 55nm process. The R680 is the successor to the Radeon HD 2900 (R600) GPU and is expected to be based on an updated version of that architecture. R680 is expected to feature support for PCI Express 2 and will contain a hardware UVD engine. http://www.mikeshardware.co.uk/RoadmapQ407.htm#AMD R680.


Is he talking about the R680 as the successor to the 65nm R600 or to the 80nm R600?
 
Don't say that out loud. The lads over at Rage are convinced that nVidia is doing Transparency MSAA whilst ATi is doing Quality ADAA (meaning SSAA), and that it was actually the driver fairy that brought the performance improvements, not the implementation of EATM. :D

I will make sure, but I'm fairly sure that EATM has not replaced SSAA.
 