Anand talk R580

Won't Nvidia also have to spend transistors on non-angle-dependent AF and improved SM3.0 branching in order to keep up with ATI's new features? Those are transistors that Nvidia then cannot spend on simply increasing pipes.
 
SugarCoat said:
I will kindly point out that they did not disable the FEAR mis-optimization that ATI has running for their cards in that driver, for that particular game.
Why should reviewers disable AI? ATI suggests using it, it's enabled by default, and 99.9% of users won't touch it. If it's a driver bug, it's a driver bug.
Cat 5.11 is WHQL. Should we all start flipping every kind of switch to see what happens?
Should we do the same thing for NV drivers too?
One either uses "stock" settings on both cards, or not.
Tweaking one card but not the other is not how an unbiased review should be done.
 
SugarCoat said:
And to be honest, while I can be criticized for it, Nvidia charging 700 dollars for a card that performs 1 frame faster in SC with AA enabled (the only tests I look at; you don't buy cards like that to run without AA) is a loss.

Only 1 frame?

With the bug removed, doesn't the X1800 XT perform at 1920 about like the 7800 GTX 512 performs at 1600, or am I wrong?
IMG0014782.gif
 
I found this today on mydrivers.com, but as usual the machine translation makes me dizzy, and my English is already bad enough.

"We previously reported that ATI had taped out the R580 graphics chip. According to newly leaked information, the R580 uses 13 metal interconnect layers with an improved low-k inter-layer dielectric (ILD) process to further raise operating frequencies. Some rumors suggest that, because TSMC and IBM have a technology-exchange agreement and TSMC is again the foundry for the R580, some of the IBM technology used in the G70 will also appear in the R580.

The rumor says that the R580, like the RV530, uses a design that separates the pixel shader pipelines from the texture units. Under the traditional architecture, a single pixel pipeline contains the entire shading process; the R580's design differs from this. Since the R300, ATI has paired each pixel shader pipeline with 2 ALUs, one full and one partial. Nvidia's G70 pairs each pixel pipeline with 2 full ALUs. Like the G70 in that respect, the R580 will have 48 pixel shader pipelines, for a total of 96 ALUs, and 16 texture units. The rumor further indicates that ATI's planned RV560 will have 24 pixel shader pipelines and 8 texture units; the current RV530 has 12 pixel shader pipelines and 4 texture units.

The rumor indicates the R580 will most likely not launch in the near future, but should be released around the CeBIT 2006 show next spring. Of course, the R580's final specification and launch date must be announced officially by ATI."

http://translate.google.com/transla...&hl=en&ie=UTF-8&oe=UTF-8&prev=/language_tools
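Garbled as the translation is, the rumored numbers are easy to sanity-check. A quick back-of-the-envelope sketch (the figures are the rumor's, not confirmed specs; the G70 line is added for comparison from its well-known 24-pipe, one-TMU-per-pipe layout):

```python
# Shader/texture configurations from the rumor above (plus G70 for
# comparison). Per the rumor, ATI pairs 2 ALUs with each pixel
# shader pipeline.
ALUS_PER_PIPE = 2

chips = {
    # name: (pixel shader pipelines, texture units)
    "R580":  (48, 16),
    "RV560": (24, 8),
    "RV530": (12, 4),
    "G70":   (24, 24),  # one TMU coupled to each of its 24 pipelines
}

for name, (pipes, tmus) in chips.items():
    alus = pipes * ALUS_PER_PIPE
    print(f"{name}: {pipes} pipes, {alus} ALUs, {tmus} TMUs, "
          f"pipe:TMU ratio {pipes // tmus}:1")
```

Notably, every decoupled ATI part in the rumor lands on the same 3:1 pipe-to-TMU ratio (and 48 × 2 = 96 ALUs for the R580), versus the G70's traditional 1:1 coupling.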
 
It seems the memory controller makes a huge difference with AA and/or HDR. It makes FEAR better on an X1800 XT than on a 7800 GTX 512, makes the Radeon better than the GTX with an engine (Quake 4) that used to give Nvidia an advantage, and makes AA+HDR playable.

The drivers are still early/beta; I bet that when the drivers start to use the memory controller tweaks, the X1800 XT will get a boost in other titles beyond Quake 4 and FEAR.

These are the current results:
IMG0014782.gif



IMG0014788.gif


IMG0014793.gif

IMG0014818.gif








Is the $200+ price difference between the 7800 GTX 512 and the X1800 XT 512 worth this extra fps?
And what about the G70's lack of AF quality and of MSAA+HDR?

How many of us think it's worth it?
 
SynapticSignal said:
Is the $200+ price difference between the 7800 GTX 512 and the X1800 XT 512 worth this extra fps?
And what about the G70's lack of AF quality and of MSAA+HDR?

How many of us think it's worth it?

If you're the kind of guy looking for that, then it's worth as much as you're ready to pay. To me, it's not worth it. Having 20 fps less and $200 more in my pocket is OK with me :)

AF quality and MSAA+HDR: who cares when you're playing @ 19x12 and even higher? :LOL:
 
Spring '06 is too late for the R580 unless it has at least 24+ "pipes" in old terms (please don't start the 16-1-3-2 kind of discussion here). NV must be much further along in development of the next generation or two by now; they'll rip them a new one _again_, as I've already suggested for this round quite a few times over the last few months. If the R580 doesn't come by Christmas, at least as a paper launch, ATI will be in serious trouble business-wise. Already is, IMHO.

If they do manage to release the R580 by then, they would still have a chance to keep some market share on the high-end side of things, but it doesn't look like that will be the case. Not being able to deliver the R520 in quantity definitely made everyone but the most hardcore fans turn their heads to nV this round.

I'm talking strictly about the business side of things here, and as always, all IMHO.
 
_xxx_ said:
If you're the kind of guy looking for that, then it's worth as much as you're ready to pay. To me, it's not worth it. Having 20 fps less and $200 more in my pocket is OK with me :)

AF quality and MSAA+HDR: who cares when you're playing @ 19x12 and even higher? :LOL:

I care a lot about quality... you don't?
High resolution doesn't make the AF any better.
af-quality.jpg
 
SynapticSignal said:
It seems the memory controller makes a huge difference with AA and/or HDR. It makes FEAR better on an X1800 XT than on a 7800 GTX 512, makes the Radeon better than the GTX with an engine (Quake 4) that used to give Nvidia an advantage, and makes AA+HDR playable.

The drivers are still early/beta; I bet that when the drivers start to use the memory controller tweaks, the X1800 XT will get a boost in other titles beyond Quake 4 and FEAR.

These are the current results:

<graph>

Is the $200+ price difference between the 7800 GTX 512 and the X1800 XT 512 worth this extra fps?
And what about the G70's lack of AF quality and of MSAA+HDR?

How many of us think it's worth it?

You found the graphs where the X1Ks perform very well. And what about some other results? ;)

IMG0014785.gif


IMG0014789.gif

(FP16 for both)

IMG0014799.gif

(FP16 for Nvidia, FX10 for ATI)
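For context on those captions: FP16 stores four 16-bit float channels (64 bits per pixel), while FX10 is a 10-10-10-2 fixed-point format (32 bits per pixel), so the ATI path here writes roughly half the color-buffer data per pixel. A quick sketch of what that means for raw buffer size (illustrative arithmetic only; it ignores AA samples and any compression):

```python
def framebuffer_mib(width, height, bits_per_pixel):
    """Raw color-buffer size in MiB, ignoring AA samples and compression."""
    return width * height * bits_per_pixel / 8 / 2**20

FP16_BPP = 4 * 16            # RGBA, 16-bit float per channel -> 64 bpp
FX10_BPP = 10 + 10 + 10 + 2  # 10-10-10-2 fixed point -> 32 bpp

for fmt, bpp in (("FP16", FP16_BPP), ("FX10", FX10_BPP)):
    print(f"1600x1200 {fmt}: {framebuffer_mib(1600, 1200, bpp):.1f} MiB")
```

Halving the bytes per pixel roughly halves the framebuffer bandwidth at the same resolution, which is worth keeping in mind when comparing the two HDR results directly.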
 
SynapticSignal said:
I care a lot about quality... you don't?
High resolution doesn't make the AF any better.
af-quality.jpg

Frankly, when I saw those screenshots the first time I didn't even notice any difference until someone here pointed it out.

While ATI _does_ have somewhat better AF, it's not _that much_ better. In most situations you'll be hard-pressed to see any difference at all. Try D3/Q4 in Ultra mode (that enables 8xAF) and do a comparison.

But that's not the point. You have two products with _nearly_ the same IQ, no real difference noticeable to a normal user. One of these cards spanks the other's arse in most games by a significant margin. At that point, any IQ-difference discussion that needs a magnifying glass is just dead. People who look for performance will buy the faster hardware, period.
 
_xxx_ said:
Frankly, when I saw those screenshots the first time I didn't even notice any difference until someone here pointed it out.

While ATI _does_ have somewhat better AF, it's not _that much_ better. In most situations you'll be hard-pressed to see any difference at all. Try D3/Q4 in Ultra mode (that enables 8xAF) and do a comparison.

But that's not the point. You have two products with _nearly_ the same IQ, no real difference noticeable to a normal user. One of these cards spanks the other's arse in most games by a significant margin. At that point, any IQ-difference discussion that needs a magnifying glass is just dead. People who look for performance will buy the faster hardware, period.

Is there anything that isn't a "cube" in Doom and Quake? Are there any very vast surfaces? No, so how could there be a difference without the angle-dependent optimization?

It's very wrong to talk about ATI's HQ aniso as something discovered with a magnifying glass. It improves the quality on some surfaces. Actually, it doesn't "improve" anything; it just does the filtering right on every surface. What it does is avoid degrading quality the way other AF implementations do. IMHO angle-dependent AF is a shitty trick. And the funny thing is that ATI invented it and is now correcting it :D
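To make the "angle-dependent" part concrete, here's a toy model (my own illustration, not ATI's or Nvidia's actual hardware logic) of how a classic angle-dependent implementation drops the anisotropy level as a surface rotates away from the preferred 0/45/90-degree axes, while HQ angle-invariant AF keeps the requested level everywhere:

```python
def effective_max_af(requested, surface_angle_deg, angle_dependent=True):
    """Toy model of the AF level actually applied to a surface.

    Angle-dependent hardware applies full anisotropy only near the
    0/45/90-degree axes; angle-invariant ("HQ") AF applies it everywhere.
    """
    if not angle_dependent:
        return requested
    # Degrees away from the nearest preferred axis (multiples of 45)
    off_axis = min(surface_angle_deg % 45, 45 - surface_angle_deg % 45)
    # Halve the level for every ~8 degrees off-axis, bottoming out at 2x
    return max(requested >> int(off_axis // 8), 2)

print(effective_max_af(16, 0))            # on-axis: full 16x
print(effective_max_af(16, 22.5))         # worst case: drops to 4x
print(effective_max_af(16, 22.5, False))  # HQ mode: 16x everywhere
```

This is also why flight sims like Pacific Fighters are the worst case for angle-dependent AF: the terrain sits at arbitrary angles, exactly where the level gets cut the most.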
 
Tridam said:
It's very wrong to talk about ATI's HQ aniso as something discovered with a magnifying glass. It improves the quality on some surfaces.

To each his own, I guess. I see no difference in 90+% of situations, and very minor ones in the remaining 10%, mostly only visible in still shots. How much speed does the X1800 lose with max-IQ AF? On the other hand, nV's TrSAA looks better than ATI's AAA, for example. Or not? ;)

But as I said, that's not the point. The point is, the difference in IQ is negligible. As long as speed difference >> IQ difference, it's pretty clear what most people will buy.

How about the (un)availability of the cards, or of CrossFire systems? Is Avivo working with current drivers? I remember reading somewhere that it's not fully in the driver yet, but I'm not sure.
 
Tridam said:
IMHO angle-dependent AF is a shitty trick. And the funny thing is that ATI invented it and is now correcting it :D


I hope they won't change their mind in future products. I really like this feature and I'm saving up for a graphics card upgrade just for it. I so wish Nvidia would follow ATI on this one with the next generation of their products.
 
_xxx_ said:
To each his own, I guess. I see no difference in 90+% of situations, and very minor ones in the remaining 10%, mostly only visible in still shots. How much speed does the X1800 lose with max-IQ AF? On the other hand, nV's TrSAA looks better than ATI's AAA, for example. Or not? ;)

But as I said, that's not the point. The point is, the difference in IQ is negligible. As long as speed difference >> IQ difference, it's pretty clear what most people will buy.

Filtering quality matters to some people buying high-end boards, and that makes sense. Of course, some prefer speed, and that makes sense too.

IMG0014810.gif

The performance drop with HQ AF is small (Pacific Fighters is an exception).

TrAA and AAA are the same thing; AFAIK neither looks better than the other.
 
I didn't want to discuss quality, but rather the impact on the average end user's choice. Especially if you look at the availability of the XT boards (I can't find any on the largest German e-tailers' sites, and even XLs are quite hard to find).

We'll just take a look at the market share in a few months, ok? :)
 
Tridam said:
Filtering quality matters to some people buying high-end boards, and that makes sense. Of course, some prefer speed, and that makes sense too.

IMG0014810.gif

The performance drop with HQ AF is small (Pacific Fighters is an exception).

Regarding Pacific Fighters: angle-dependent AF is a real pain in flight simulators, so even a 20+% performance hit might be worth it.
 
_xxx_ said:
I didn't want to discuss quality, but rather the impact on the average end user's choice. Especially if you look at the availability of the XT boards (I can't find any on the largest German e-tailers' sites, and even XLs are quite hard to find).

We'll just take a look at the market share in a few months, ok? :)
That just proves the 7800 had a five-month lead over the X1800, not your point. ;) (The point is, the difference in IQ is negligible. As long as speed difference >> IQ difference, it's pretty clear what most people will buy.)
 
_xxx_ said:
I didn't want to discuss quality, but rather the impact on the average end user's choice. Especially if you look at the availability of the XT boards (I can't find any on the largest German e-tailers' sites, and even XLs are quite hard to find).

So your point is that ATI shouldn't improve AF quality because most users will just buy the faster board anyway?
 