Anand talk R580

serenity said:
That proves that the 7800 had a 5 month lead time over the X1800, not your point. ;) (The point is, the difference in IQ is negligible. As long as speed difference >> IQ difference, it's pretty clear what most people will buy.)

:rolleyes:

Post #149:
I'm talking strictly about the business side of things here, as always IMHO.
 
Tridam said:
So your point is that ATI shouldn't improve AF quality because most users just buy the faster board anyway ?

Of course not. My point is that ATI executed badly and produced much more hype than what they actually were/are able to deliver. Hopefully they'll do better next time.
 
serenity said:
But you tried to use the "most prefer speed over IQ" argument by saying "we'll see which is the most popular card". :LOL:

And that still stands as well, provided the difference in IQ is as small as it is between GTX and X1800. Has nothing to do with the above comments on business side of things, though.
 
_xxx_ said:
Frankly, when I saw those screenshots the first time I didn't even notice any difference until someone here pointed it out.

Here is a bigger problem... while I know that some people will not notice them until they are pointed out, I saw it right away (in less than a second). And once you know what it looks like, you even start to notice it in a faster-paced action game. So it's very hard to say it's a "small" difference, as some people like me will find it to be a big one... And neither of us is right or wrong in this matter; people see things differently...
 
_xxx_ said:
Of course not. My point is that ATI executed badly and produced much more hype than what they actually were/are able to deliver. Hopefully they'll do better next time.
Try these AF videos: X1800 / 7800 (16x AF with max. quality). I think the difference is visible enough.
 
_xxx_ said:
Of course not. My point is that ATI executed badly and produced much more hype than what they actually were/are able to deliver. Hopefully they'll do better next time.

Are you suggesting that they have actually delivered something???
 
Ever try a GeForce using a home projector?
The shimmering is nuts awful in BF2

You've identified there are two types of users.

The simple user who makes a decision based on bar graphs, and the other who has a little more understanding of what is going on and decides they want the better overall experience.

You don't think ATI could up their performance bars with bilinear and filtering hacks too?
 
no-X said:
Try these AF videos: X1800 / 7800 (16x AF with max. quality). I think the difference is visible enough.
Wow the difference there is pretty big. Is the AA the same on both? The AA looks a lot worse on the 7800.

Why is the draw distance for the plant life on the 1800 lower than 7800?

Jawed
 
ATI needs to get R580 out by Jan-Feb, have a slightly clock-bumped refresh, the R590, in summer, and then boost R600 specifications significantly to ensure that nothing Nvidia brings out (G80, G81, G82) can touch it.
 
Megadrive1988 said:
ATI needs to get R580 out by Jan-Feb, have a slightly clock-bumped refresh, the R590, in summer, and then boost R600 specifications significantly to ensure that nothing Nvidia brings out (G80, G81, G82) can touch it.

Have there been any reports of an R590? I can't remember.
 
Megadrive1988 said:
ATI needs to get R580 out by Jan-Feb, have a slightly clock-bumped refresh, the R590, in summer . . .

When I read Wavey talking about "first" R580 boards not having GDDR4, I tend to think this is indeed the way things are headed.

, and then boost R600 specifications significantly to ensure that nothing Nvidia brings out (G80, G81, G82) can touch it.

When I read statements like that about either IHV, I am always reminded of Snidely Whiplash, the arch-villain who could be counted on to try mightily, and with some skill and acumen, to NOT QUITE win at the end of this week's episode. Every week. Ad infinitum.

If the last 3.5 years have taught us anything at all (dear god, let them have taught us something!), it is that there are no Snidely Whiplashes in this business. It is unsafe and unrealistic to think you can ever accomplish "nothing can touch it" except as a backwards-looking historical metric.
 
Jawed said:
Wow the difference there is pretty big. Is the AA the same on both? The AA looks a lot worse on the 7800.

Why is the draw distance for the plant life on the 1800 lower than 7800?

Jawed
I didn't prepare these videos. I can only say that their author used the 8.173 drivers for the Radeon (+fix), the 78.01 official drivers for the GeForce, and this hardware configuration. In-game settings were probably the same.
 
Graphics_Krazy said:
Ever try a GeForce using a home projector?
The shimmering is nuts awful in BF2

Any user with a bit of sense for filtering quality who plays BF2 with filtering optimisations enabled on any GPU deserves to be shot.

You don't think ATI could up their performance bars with bilinear and filtering hacks too?

What exactly have they been doing for years now? Or are there ANY Radeons that don't use filtering optimisations by default since the dawn of R100?

That said I'd like to applaud Tridam's post above, which deserves another quote:

Is there something that is not a "cube" in Doom and Quake? Are there very vast surfaces? No, so how could there be a difference without the angle-dependent optimization?

It's very wrong to talk about ATI's HQ aniso as something discovered with a magnifying glass. It improves the quality on some surfaces. Actually, it doesn't improve anything; it just does the filtering right on every surface. What it does is not decrease the quality the way other AF does. IMHO angle-dependent AF is a shitty trick. And the funny thing is that ATI invented it and now corrects it.

And yes, I agree that less angle dependency is something that is noticeable on more occasions than many imagine; it could be that too many got so used to angle dependency that they can't notice a difference anymore. Still an oxymoron; if I drink decaf for months and you sneak a double-pumped espresso into my cup, my eyeballs will fall out.
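For readers unfamiliar with what "angle-dependent" means here, a toy Python sketch may help. This is purely illustrative and not any vendor's actual hardware algorithm; the 0/45/90-degree preferred axes and the 4x worst-case budget cut are assumptions chosen for the example, not measured behaviour:

```python
import math

def required_aniso(tilt_deg, max_aniso=16):
    """Ideal anisotropy degree for a surface tilted away from the viewer.

    A surface tilted by tilt_deg stretches texels by roughly 1/cos(tilt)
    along one axis, so that ratio (clamped to the hardware maximum) is
    the number of samples the filter ideally takes.
    """
    tilt = math.radians(tilt_deg)
    ratio = 1.0 / max(math.cos(tilt), 1.0 / max_aniso)
    return min(ratio, max_aniso)

def angle_dependent_aniso(tilt_deg, rotation_deg, max_aniso=16):
    """Toy model of the angle-dependent optimisation: surfaces whose
    screen-space rotation is far from the preferred 0/45/90-degree axes
    get their anisotropy budget cut, saving texture samples at the cost
    of blurrier filtering on those surfaces."""
    # Distance (in degrees) to the nearest preferred axis.
    dist = min(abs(rotation_deg % 45), 45 - abs(rotation_deg % 45))
    # Cut the budget by up to 4x at the worst-case 22.5-degree rotation
    # (the 4x figure is an assumption for illustration).
    budget = max_aniso / (1 + 3 * dist / 22.5)
    return min(required_aniso(tilt_deg, max_aniso), budget)
```

On a steeply tilted floor (tilt 80 degrees) the full filter wants about 5.8x aniso regardless of rotation, while the angle-dependent version drops to 4x when the surface sits at the worst-case 22.5-degree rotation; axis-aligned walls and floors (as in Doom/Quake-style "cube" levels) pay no penalty, which is exactly why the difference only shows up on angled surfaces.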
 
It's funny how people are saying ATI's angle-independent AF is something you need to be looking for to see it.
Play a game besides an FPS for once in your insignificant lifetime!

Oh, and NVIDIA's filtering looks awful in that video.
Are all optimizations off?
Shimmer city!
Reminds me of my 8500!
 
While nothing concrete is known about the 90 nm refresh of the G70, we can speculate that the product replacing the 7800 GTX will at least reach 550 MHz and that the high end could possibly hit 700 MHz at maximum. NVIDIA will not stop at just clock speed. We can expect that it will have a full 8 quads of pixel shaders, 10 vertex shaders, and 16 "Super" ROPs that will be able to handle the output of those shaders. I have heard rumors that the AA unit will be getting a makeover and it will be able to handle HDR anti-aliasing. I have also heard rumors that texture filtering will also be getting a boost and we can expect texture quality to match that of the older FX series. This product could easily hit 380 million transistors, and with the addition of 90 nm Low-K (remember, the regular G70 is 110 nm FSG- it does not get a transistor performance increase by using Low-K) this product will hit some impressive clockspeeds. One thing that does not look to change will be the memory controller. NVIDIA does not feel the need for a programmable memory controller like ATI has as of yet, and will instead rely on faster GDDR-3 memory to make up the difference.

The introduction of the 512 MB 7800 GTX is another success for NVIDIA in a year that has proven them dominant in the market. Do not expect NVIDIA to have such an easy time next year though. While ATI may have floundered with the R520 and RV530, the R580 and its midrange/budget derivatives are well on their way to the market. ATI is apparently very happy with how the R580 is coming along, and it looks as if all production on R520 parts will be halted by December. This means that the final R520 chips off the line will be delivered in a late February timeframe for fulfilling the final orders for the X1800 series. This means that ATI is gearing up for a new product to replace the R520 around that time. It is likely that the 512 MB 7800 GTX will be overshadowed by an ATI product at that time, but for the next four months NVIDIA will keep a hefty premium on this part. The big question that comes to mind is whether NVIDIA will be able to supply its 90 nm part in quantity at that time and offer instant availability for the high end to continue to compete with ATI. While NVIDIA has promised instant availability of products upon release, they have their work cut out for them to deliver by late Winter/early Spring of next year.


http://www.penstarsys.com/previews/graphics/nvidia/512_7800gtx/512gtx_3.htm

no idea how reliable that stuff is, will be interesting if true.
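The article's claim that NVIDIA would "rely on faster GDDR-3 memory to make up the difference" is just peak-bandwidth arithmetic: effective transfer rate times bus width. A minimal sketch (the clock figures in the usage note are illustrative, not confirmed specs):

```python
def mem_bandwidth_gbs(clock_mhz, bus_bits=256, pump=2):
    """Peak memory bandwidth in GB/s.

    GDDR-3 is double-pumped (pump=2 transfers per clock), so bandwidth
    is clock * 2 * (bus width in bytes). A 256-bit bus at 850 MHz gives
    850e6 * 2 * 32 bytes = 54.4 GB/s.
    """
    return clock_mhz * 1e6 * pump * (bus_bits // 8) / 1e9
```

For example, raising the memory clock from 600 MHz (38.4 GB/s on a 256-bit bus) to 850 MHz (54.4 GB/s) buys over 40% more bandwidth with no memory-controller changes, which is the trade-off the article describes NVIDIA making instead of adopting a programmable controller.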
 
radeonic2 said:
It's funny how people are saying ATI's angle-independent AF is something you need to be looking for to see it.

What's amazing is all the people who were claiming the opposite back in the days when NV had angle independence and ATI did not. ("not very perceptible because most games have 90-degree angles")
 