R520 = Disappointment

Kombatant's "you cannot go wrong with either product" is pretty spot on.

But what could have been. If ATI had released in May, the figures would have seemed very good indeed. Then, when G70 came out with speed and IQ in general not quite there, it would have seemed that NVIDIA had failed to catch up, and it would have got a mauling. However, G70 coming out earlier than R520 has taken some of the punch out of the ATI chip.
 
I'm disappointed with most reviews out there. One comment at Anand's was spot on: why complain about product paper launches and then release a rushed, bordering-on-invalid review just to hit a deadline too? I'm very happy Dave decided to break it into parts and publish those as self-sufficient modules instead of doing what others have done: speeding through timedemos, missing some card/resolution configurations, skipping investigation of new features' performance hits, etc.

I know I probably shouldn't be saying this but I feel so strongly about this I just had to.
 
radeonic2 said:
He did try it and said performance was shitty.

I didn't see that, but I'm playing the game at 1920x1200 with 4x AA on a X800 XT (max texture resolution, complex shaders enabled, etc.).
 
radeonic2 said:
From a pure technological standpoint, I'm not sure how anyone who isn't crazy could be disappointed.
HDR+AA, better FSAA across the board, way better AF, super efficient, etc.
X1800 XT = new leader.
X1800 XL = mixed views... at the moment it's a bit slower, but has better quality.

X1800 XT = new leader when you can actually buy one. I have a bad feeling this will be the phantom edition all over again, with sky-high prices on the few available cards.


Prove me wrong, ATI... I DARE you!!!! :)
 
But who is going to buy an X1800 XT knowing that ATI must bring out R580 as soon as possible? Or, if you prefer: not many people are going to buy this card knowing that Nvidia will soon unveil its 90nm G70 refresh, and afterwards ATI will attack again with the R580...

One thing I can't understand: if Xenos is so good with 250 million transistors, why didn't ATI go with R600 directly? Or, wait a minute... maybe Xenos is a bluff worse than R520 (see the new Ghost Recon 3 footage on IGN) and Nvidia is the real winner with its G70...
 
Nvidia has scheduled its G72 graphics processing unit (GPU), which will be manufactured using 90nm process technology, to be introduced in early 2006.

The G72 will compete with the 90nm R-series from ATI announced today.

The 90nm G72 is expected to have a much smaller die size than the 0.11-micron-based G70, allowing for multiple GPUs to be utilized on one graphics card through SLI technology.

http://cdrinfo.com/Sections/News/Details.aspx?NewsId=15202

At this point, I guess R580 will be around to duke it out with that?
 
radeonic2 said:
From a pure technological standpoint, I'm not sure how anyone who isn't crazy could be disappointed.

Oh really?

Those who are disappointed probably feel that way because:
- The product failed to live up to the hype / people's expectations were too high
- Not all products are available upon paper launch / LATE
- Not all feature sets will be used upon release
- Performance is not that great compared to the other manufacturer's - win some, lose some
- Price - debatable

Let's face it - it is late. This would have been a great card about a year ago. But not today. It wins some tests, it loses others. Honestly - put either top-tier card in the same PC and place it in front of an average Joe, and I bet you a soda the person behind the keyboard would not know ATI from Nvidia.

Feature sets are nice and all - but will they be used? By games today? This was an argument we all fought back when Nvidia went forth with SM 3.0 and ATI didn't... Now we have HDR AA support from ATI, but will you use it? Honestly? I mean, do you even need AA/FSAA at resolutions greater than 1600x1200, let alone any HD format? I know I disable it when I'm playing on my HDTV - zero need for it. Again, debatable.

I had higher expectations, so I'm disappointed. In fact, I will be honest and say that with those clock speeds I expected it to destroy Nvidia's 7 series. Not only that, I expected them to have every product available upon launch. It did neither of the things I cared most about. Therefore, it failed.

What this does tell me is that Nvidia's GPUs run a hell of a lot better than people are giving them credit for. When a 400 MHz GPU is keeping up with, and in some cases beating, one clocked at over 600 MHz, well, one has to wonder what the hell ATI did wrong.

Again - I think hype and expectations have a lot to do with people's reactions. Then the test results factor in the rest, I bet... just my opinion.
 
Love_In_Rio said:
But who is going to buy an X1800 XT knowing that ATI must bring out R580 as soon as possible? Or, if you prefer: not many people are going to buy this card knowing that Nvidia will soon unveil its 90nm G70 refresh, and afterwards ATI will attack again with the R580...

One thing I can't understand: if Xenos is so good with 250 million transistors, why didn't ATI go with R600 directly? Or, wait a minute... maybe Xenos is a bluff worse than R520 (see the new Ghost Recon 3 footage on IGN) and Nvidia is the real winner with its G70...

No one knows that. It's only assumed that it will be an early 2006 refresh and that it will stay on its release track, whereas the R520 went off track by months. It's an assumption. And frankly, you can wait around for hardware forever if you hold off. If you want something, you get it. Otherwise you'll find yourself always holding out and getting nowhere fast.

Xenos doesn't have to cater to half the things expected of the R520. They're two different scenarios, and I really wish people would stop comparing them as if they're apples to apples. Likewise, I doubt the R600 is ready, since it has to work with all the features Vista enables, and Vista will be in beta for at least another year. I don't think they have the power, or the go-ahead from MS, to do something like that.

Xenos is somewhere between an R520 and a fully unified part; the R600 should have its own special feature set.
 
Rur0ni said:
http://cdrinfo.com/Sections/News/Details.aspx?NewsId=15202

At this point, I guess R580 will be around to duke it out with that?
Seems reasonable. Back at the 7800 GTX launch, a January refresh of G70 on 90nm would have been considered a bit late. And I'm reasonably sure everyone would have expected R580 to be released competitively, around December. So now R580 gets some breathing room.

So, as I asked earlier, where are the 90nm 7200/7600s? I think they're looking like they'll be late, too.

Jawed
 
Love_In_Rio said:
But who is goint to buy an X1800XT knowing that Ati must take out R580 as soon as possible ?. Or, if your prefer, not many people are going to buy this card knowing that soon Nvidia will unveil its G70 90 nm refresh and afterwards Ati will attack again with the R580...

One thing i can´t understand: If Xenos is so good with 250 mill. transistors why didn´t Ati go with R600 directly ? Or, wait a minute... maybe Xenos is a bluff worse than R520 ( See the new gost recon 3 footage in ign ) and Nvidia is the real winner with its G70...

Bear in mind that Xenos consists of the 232-million-transistor chip plus the daughter die featuring the 10MB of EDRAM, ROPs, etc. Dave's article on this site states that we don't actually know how many transistors this daughter die contains, but it is likely to be at least 100 million and probably more (80 million for the EDRAM alone). The total transistor budget on the graphics side of the Xbox 360 is therefore probably a lot more than R520's.
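As a sanity check on that 80-million figure, here's my own back-of-the-envelope arithmetic (a rough sketch, assuming about one transistor per EDRAM bit cell, the usual 1T1C DRAM layout, and ignoring sense amps and other array overhead):

```python
# Rough estimate of transistors in 10MB of EDRAM, assuming ~1 transistor per bit.
edram_bytes = 10 * 1024 * 1024            # 10 MB of EDRAM
edram_bits = edram_bytes * 8              # one 1T1C cell per bit
print(f"~{edram_bits / 1e6:.1f} million transistors")  # -> ~83.9 million
```

That lands right around the 80-million number quoted above, before counting any of the daughter die's ROP and compression logic.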
 
saf1 said:
Oh really?

Those who are disappointed probably feel that way because:
- The product failed to live up to the hype / people's expectations were too high
- Not all products are available upon paper launch / LATE
- Not all feature sets will be used upon release
- Performance is not that great compared to the other manufacturer's - win some, lose some
- Price - debatable

Let's face it - it is late. This would have been a great card about a year ago. But not today. It wins some tests, it loses others. Honestly - put either top-tier card in the same PC and place it in front of an average Joe, and I bet you a soda the person behind the keyboard would not know ATI from Nvidia.

Feature sets are nice and all - but will they be used? By games today? This was an argument we all fought back when Nvidia went forth with SM 3.0 and ATI didn't... Now we have HDR AA support from ATI, but will you use it? Honestly? I mean, do you even need AA/FSAA at resolutions greater than 1600x1200, let alone any HD format? I know I disable it when I'm playing on my HDTV - zero need for it. Again, debatable.

I had higher expectations, so I'm disappointed. In fact, I will be honest and say that with those clock speeds I expected it to destroy Nvidia's 7 series. Not only that, I expected them to have every product available upon launch. It did neither of the things I cared most about. Therefore, it failed.

What this does tell me is that Nvidia's GPUs run a hell of a lot better than people are giving them credit for. When a 400 MHz GPU is keeping up with, and in some cases beating, one clocked at over 600 MHz, well, one has to wonder what the hell ATI did wrong.

Again - I think hype and expectations have a lot to do with people's reactions. Then the test results factor in the rest, I bet... just my opinion.


As you pointed out yourself, a few of your bulleted complaints could have been yanked right out of last year - the third and fourth in particular.

And they didn't launch at zero hour - so what? It's not a true paper launch until we've been waiting a month or two. Nvidia's GTX launch did a wonderful job of wiping people's memories clean, I can tell you that.

And your clock speeds are all screwy: a stock GTX is 430 MHz, and with dynamic clocks some parts (the important ones FPS-wise, anyway) run 40 MHz higher on a stock GTX. So really, how far apart are they? And you're giving them zero acknowledgement on drivers. Considering that ATI is now running what is, in my opinion, true AF, as well as sporting much fancier AA that can be used in all games, and considering the obvious need for driver revisions, I'd say you rushed your verdict.
 
saf1 said:
Oh really?

Those who are disappointed probably feel that way because:
- The product failed to live up to the hype / people's expectations were too high
- Not all products are available upon paper launch / LATE
- Not all feature sets will be used upon release
- Performance is not that great compared to the other manufacturer's - win some, lose some
- Price - debatable

Let's face it - it is late. This would have been a great card about a year ago. But not today. It wins some tests, it loses others. Honestly - put either top-tier card in the same PC and place it in front of an average Joe, and I bet you a soda the person behind the keyboard would not know ATI from Nvidia.

Feature sets are nice and all - but will they be used? By games today? This was an argument we all fought back when Nvidia went forth with SM 3.0 and ATI didn't... Now we have HDR AA support from ATI, but will you use it? Honestly? I mean, do you even need AA/FSAA at resolutions greater than 1600x1200, let alone any HD format? I know I disable it when I'm playing on my HDTV - zero need for it. Again, debatable.

I had higher expectations, so I'm disappointed. In fact, I will be honest and say that with those clock speeds I expected it to destroy Nvidia's 7 series. Not only that, I expected them to have every product available upon launch. It did neither of the things I cared most about. Therefore, it failed.

What this does tell me is that Nvidia's GPUs run a hell of a lot better than people are giving them credit for. When a 400 MHz GPU is keeping up with, and in some cases beating, one clocked at over 600 MHz, well, one has to wonder what the hell ATI did wrong.

Again - I think hype and expectations have a lot to do with people's reactions. Then the test results factor in the rest, I bet... just my opinion.
It's people's own damn fault for thinking it would stomp the GTX in every possible way.
Get real.
I'm amazed it can even keep up with the GTX, let alone stomp it in a few benchmarks.
I am, however, disturbed by the Doom 3 and Riddick performance.
If you don't think you need FSAA, you either need glasses or you're lying to yourself.
ATI did nothing wrong.
They have a highly efficient 16-pipe part competing with a 24-pipe part and doing so easily in most cases while delivering superior quality.
I'm willing to bet that if roles were reversed and Nvidia had released a card very similar to this instead of the 7800 GTX, people would be amazed that a 16-pipe part could be so fast, and when ATI came out with a 24-pipe part that didn't stomp a measly 16-pipe part, they would be like "WTF?".

It's not ATI's fault people hype everything up.
People here (*Dave*) have been saying not to expect huge performance leads over the 7800 GTX, and what do people do? They expect huge performance leads and are disappointed when that's not the case.
About feature sets not being used upon release - did you say that about the 9700, then the 6xxx series?
Price has yet to be seen, since street prices can vary a lot from the MSRP.
 
radeonic2 said:
ATI did nothing wrong.
Well, except for once again failing to support OpenGL fully. Given that I use Linux frequently, OpenGL support is extremely important to me. So it looks like I'm going to wait to upgrade until nVidia offers similar AA/AF features (though I was going to wait until late next year anyway due to financial reasons... so I doubt it'll be an issue).
radeonic2 said:
They have a highly efficient 16-pipe part competing with a 24-pipe part and doing so easily in most cases while delivering superior quality.
I don't think that first statement is fair. The clock speeds of their part put the 16 pipes of the R520 on theoretical parity with the 24 pipes of the G70. In other words, the two have roughly the same total shader throughput.
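To make that parity concrete, here's the rough arithmetic (a sketch using the stock clocks cited in this thread, 625 MHz for the X1800 XT and 430 MHz for the 7800 GTX, and treating each pipe as issuing one shader op per clock, which glosses over real ALU differences):

```python
# Theoretical aggregate shader throughput: pipes x core clock.
r520 = 16 * 625   # X1800 XT: 16 pipes at 625 MHz -> 10,000 M pipe-cycles/s
g70  = 24 * 430   # 7800 GTX: 24 pipes at 430 MHz -> 10,320 M pipe-cycles/s
print(f"R520: {r520}, G70: {g70}, ratio: {r520 / g70:.2f}")  # within ~3%
```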

And given that the X1800 XT has significantly higher power draw at stock speeds than the G70, I expect the G70 has much more overclocking headroom. As such, it seems even more likely now that nVidia will be capable of releasing an Ultra version of the G70 that will pretty soundly beat the X1800 XT in most programs.

The real advantages of the X1x00 are the high-quality anisotropic filtering, the support for AA with FP16 render targets, and the dramatic efficiency improvements with certain shader loads (see Far Cry, dynamic branching).

But the 7800 GTX still has its own significant advantages. Chief among them is power consumption, which directly leads to the potential for better overclocking. It also has nVidia's excellent OpenGL and cross-platform support. Then there's support for multisampling transparency AA for those titles that take a big hit on supersampling transparency AA (though this is a very minor benefit, as Humus already has a wrapper that you can use to enable a similar mode on any hardware). There's also availability now, along with lower prices. And SLI is a significant benefit as well, considering that Crossfire support with the R5x0 is still an unknown (both in time of availability and in performance/resolution support).
 
Chalnoth said:
Well, except for once again failing to support OpenGL fully. Given that I use Linux frequently, OpenGL support is extremely important to me. So it looks like I'm going to wait to upgrade until nVidia offers similar AA/AF features (though I was going to wait until late next year anyway due to financial reasons... so I doubt it'll be an issue).
True, but I actually kind of expected that, and the performance too, as I haven't read about any ATI OpenGL driver breakthroughs.
I just thought ATI might surprise me.
 
Chalnoth said:
Late compared to what, Jawed?
Well, we've had "rumours" of September 90nm low-/mid-range GPUs from NVidia, with thoughts (expectations) that this would lead into a December 90nm G70 refresh (with a roughly simultaneous R580).

It might only be a month or two later than this for these parts, of course.

I'm just wondering if it is going to be only a month or two. That's all. Always interesting to wonder, out loud, if NVidia is easily making the transition to 90nm.

We know that some NVidia chipsets have been 90nm for 6 months+, but GPUs are a different kettle of fish as far as I can tell.

Jawed
 
saf1 said:
What this does tell me is that Nvidia's GPUs run a hell of a lot better than people are giving them credit for. When a 400 MHz GPU is keeping up with, and in some cases beating, one clocked at over 600 MHz, well, one has to wonder what the hell ATI did wrong.
You know this means nothing, right?

IMO*, ATI made a pretty smart decision. They attempted to make a core that would run at a higher clock, so that they could reduce the parallelism and make a smaller chip, getting higher yields (hopefully; see the toy yield sketch below). In other words, it may end up being a really good idea, letting these boards carry really high margins. If that's the case, the price war will commence. If it doesn't pan out, well, then we'll know.

*which is worthless
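For what it's worth, the smaller-die-better-yield intuition is easy to illustrate with the classic Poisson yield approximation (a toy sketch only; the defect density and die areas below are made-up illustrative numbers, not actual foundry or ATI figures):

```python
import math

def die_yield(area_cm2, defects_per_cm2=0.5):
    """Poisson yield model: fraction of good dies = exp(-D * A)."""
    return math.exp(-defects_per_cm2 * area_cm2)

# Hypothetical die areas: a smaller high-clock design vs. a wider one.
small, large = 2.6, 3.3
print(f"small die: {die_yield(small):.1%}, large die: {die_yield(large):.1%}")
# -> small die: 27.3%, large die: 19.2% (and smaller dies also fit more per wafer)
```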
 
Jawed said:
Well we've had "rumours" of September 90nm low-/mid-range GPUs from NVidia, with thoughts (expectations) that this would lead into a December 90nm G70-refresh (with a roughly simultaneous R580).
Oh, yeah, that's right. Somebody from nVidia did make a statement about releasing lots of 90nm products this year. It will be interesting to see if nVidia is just going to aim to replace its current product line without significantly changing price/performance ratios, or if they'll attempt to put the squeeze on ATI by undercutting ATI's prices.

Of course, I'd prefer the latter, as it'll make for cheaper GPUs for all :)
 
That's what I was saying: 232 million + 20 million = 252 million transistors. Take the EDRAM out and put it on a PC card with its 256-bit bus to memory. The unified architecture doesn't need DirectX 10, and if the simulations prove that the performance is better than R520's... (at least ATI claimed that R500 performance was like a 32-pipe card's) once you are late to the market, why not smash Nvidia now with a much better and more efficient design?
 
Uhh.. Yeah.

Don't these final-release benchmarks from "trusted sources" just confirm what the blasphemous and biased Hardware Analysis preview of "leaked" benchmarks said?

Sure, we can't compare them definitively, because those were run on "special" timed benchmarks that aren't being used across the board.

But pretty much, it seems to me that his completely biased, totally off-base, wildly unprofessional journalism told us all a couple of weeks ago what a certain group of people on one side of the fence didn't want to believe: that the R520 isn't significantly faster than the 7800, and that the 7800 wins some while the R520 wins others.

It also seems that ATI's staunch refutation of his benchmarks, claiming them to be absurdly, ridiculously low, was also a bunch of crap.

The R520 is comparable, not superior, to the 7800 in benchmark performance. IQ, the "extra" feature set, etc., are up to the eye and mind of the beholder. And the reality is that even if the R520 were slower across the board, a certain set of people would still say it was superior due to those extra features/quality anyway.
 