R520 = Disappointment

radeonic2 said:
It's people's own damn fault for thinking it would stomp the GTX in every possible way.
Get real.
I'm amazed it can even keep up with the GTX, let alone stomp it in a few benchmarks.
I am, however, disturbed by the Doom 3 and Riddick performance.
If you don't think you need FSAA, you either need glasses or you're lying to yourself.
ATI did nothing wrong.
They have a highly efficient 16-pipe part competing with a 24-pipe part, doing so easily in most cases while delivering superior quality.
I'm willing to bet that if the roles were reversed and Nvidia had released a card very similar to this instead of the 7800 GTX, people would be amazed that a 16-pipe part could be so fast, and when ATI came out with a 24-pipe part that didn't stomp a measly 16-pipe part, they would be like "wtf".

It's not ATI's fault that people hype everything up.
People here (*Dave*) have been saying not to expect huge performance leads over the 7800 GTX, and what do people do? They expect huge performance leads and are disappointed when that's not the case.
About feature sets not being used at release: did you say that about the 9700, and then the 6xxx series?
Price has yet to be seen, since street prices can vary a lot from the MSRP.

Get real? Last time I checked, I didn't set the expectations. I'm just stating the fact that people are a bit disappointed; some have said as much in this thread alone.

Based on clock speeds alone I expected more, and I do not see it. Factor in that not all of the cards are available at launch, and that is another loss in my book. Feature sets are up to the buyer. I think someone in the thread said it best: what is the life expectancy of this card? I believe that will factor in more than anything else.

Image quality is difficult to compare. Why? Because it is in the eye of the beholder, and it will vary from person to person. You have those on these forums who can tell the difference, and you have the average consumer who cannot. They do not overclock, they do not update their drivers every month or quarter, nor could they tell the difference between Nvidia and ATI products if the box didn't have a sticker on the side!

This does not mean the 5xx series is a bad card. It just means that for some people it will not fit the bill, and they will wait it out.

As far as my comment about AA/FSAA goes: subjective, wouldn't you agree? To say I'm lying is not fair. I have a 60" widescreen HDTV on which I run both a 9800 and a 6800 GT. I play Doom 3, EQ2, and WoW on it with zero AA and find the image quality just fine. And my eyesight is 20/20, just for the record. You might find jaggies, but I do not.

Clock for clock, I expected more out of the ATI card, and I'm not seeing it. It is winning, which is what I expected, but it wins some and loses some. I think this may change in time, once honest reviewers get more hours with the cards. It may be that you cannot compare apples to apples; regardless, you can compare and contrast and note the differences.
 
digitalwanderer said:
But John! Their "average gamer" would never know of such a thing or be capable of doing it! :oops:

(Sorry, just bitterness...I'll try and stop)

The average gamer doesn't even know what an .ini file is. I would say that fewer than 20% of even the consumers who build their own PCs would know how to create that AA setting in a game on their own.

edited to fix my awful grammar
 
Pardon me for reading just the thread title, but I can't count how many times I said I did not expect a "G70-killer" in R520. I call it good instinct and nothing else.

It's a highly competitive product with both its advantages and disadvantages, just like the G70. At least we're FINALLY past those extremely silly 24/32-pipe rumours and the "za R520 shall wipe da floor with the G70" crap.
 
ondaedg said:
I would say that fewer than 20% of even the consumers who build their own PCs would know how to create that AA setting in a game on their own.
Wow, I guess I can't argue with such strong and convincing evidence that you pulled out of your ass like that. :rolleyes:
 
Unless someone can show me a screenshot comparison proving otherwise, ATI has the higher IQ of the two. ATI's aniso is just DIALED.
 
What amazes me here is the sudden change in preference when choosing a video card. Back when the NV40 was competing with the R420, the majority of this board declared that SM3.0 was useless and that, while the overall feature set of the NV40 was great, the raw speed of the R420 was all that mattered to them. Now that the reviews have shown that the X1800XT is not the speed demon everyone thought it would be, the preference has suddenly shifted to the feature set of the X1800XT over the 7800 GTX. I love how opinions change to accommodate the times...

It is also amazing to me that no one is talking about Sander's amazingly accurate review now that we have hindsight. What about ATI telling the media that Sander's benchmarks were flat-out wrong? Didn't they state in rebuttal that the benchmarks would show the X1800XT completely outclassing the competition and establishing it as the leader?

Furthermore, there are no X1800XTs available, and if they do indeed ship (at least a month from now) at $549, are they truly competitive with the GTX, which is below $450 (and as low as $430)? That is a price premium of roughly 22-28%.

And why aren't we discussing the midrange cards, when it was pointed out to me that the majority of this forum will not purchase the high end but will likely purchase a midrange part (as shown by a poll conducted here)? Nvidia's midrange parts, which are all of a year old, are beating up on ATI's new part, yet there is no mention of this.

I call it like I see it. Only if the X1800XT ships at the same price as a similarly performing GTX would I call it a draw, or a slight win for ATI (due to the image quality improvements, which I like). All these reviews have shown me is how well Nvidia has executed over the last 12 months and how good the NV40 architecture is overall.
Does anyone see an ATI mobile part that is even competitive with Nvidia's in the next 6-12 months?

BTW, wasn't it typical that BZB threw in the obligatory "Nvidia is probably using cheating drivers" statement? Thanks BZB, that was classic! It wouldn't be a Beyond3D thread without it ;)
 
digitalwanderer said:
Wow, I guess I can't argue with such strong and convincing evidence that you pulled out of your ass like that. :rolleyes:

And you implied that the majority of gamers are more than capable of creating that setting, with no evidence of your own. The only difference is that I will not resort to vulgarities.
 
ondaedg said:
It is also amazing to me that no one is talking about Sander's amazingly accurate review now that we have hindsight. What about ATI telling the media that Sander's benchmarks were flat-out wrong?

Yes, Sander's review of the X1800 Pro was spot on.
:rolleyes:
 
AlphaWolf said:
Yes, Sander's review of the X1800 Pro was spot on.
:rolleyes:

I wasn't aware that it was an X1800 Pro. If it was, then my bad. I will have to look that over.

Edited: it looks like the card he named as the Pro was really the XL.

Keep in mind, I am not saying his tactics were correct in any way. I am merely stating that the numbers he posted are pretty much in line with what the reviews are showing, except for the Splinter Cell numbers.
 
ondaedg said:
BTW, wasn't it typical that BZB threw in the obligatory "Nvidia is probably using cheating drivers" statement? Thanks BZB, that was classic! It wouldn't be a Beyond3d thread without it ;)

Nvidia did send out unsupported beta review drivers that boost benchmark scores to everyone reviewing the R520 - just a couple of days ago. There are still regular complaints about Nvidia drivers not rendering things they should, or scoring higher in benchmark mode than in actual gameplay.

What else would you call that standard M.O. of Nvidia?
 
digitalwanderer said:
But John! Their "average gamer" would never know of such a thing or be capable of doing it! :oops:

(Sorry, just bitterness...I'll try and stop)


To be honest, you'd be surprised how often it comes up. For its first few months, EQ2 required this tweak to get AA at all. It was in the manual, albeit very well hidden.
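
For anyone who never dug through it, the tweak was just a text edit along these lines. This is purely an illustrative sketch - the section and key names below are made up, not EQ2's actual settings:

Code:
; hypothetical game config (.ini) - names are placeholders for illustration
[Renderer]
FullScreenAntiAliasing=1   ; 0 = off, 1 = on
AntiAliasingSamples=4      ; 2x, 4x, ... whatever the card supports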
 
Bouncing Zabaglione Bros. said:
Nvidia did send out unsupported beta review drivers that boost benchmark scores to everyone reviewing the R520 - just a couple of days ago. There are still regular complaints about Nvidia drivers not rendering things they should, or scoring higher in benchmark mode than in actual gameplay.

What else would you call that standard M.O. of Nvidia?

Actually, to me that looks way too inept for NV when they go into Lex Luthor mode. :LOL:

Consider: we've heard the Euro guys say they were benching in Ibiza from Sept 30, and we've also heard that the Norte Americanos were favored and had been benching for a week or so before that. "A couple of days back" is way too late to impact any of that, and I suspect NV knew the schedule long before you and I did.

Then, we have various statements that these drivers NV released are pretty, well, "meh", performance-improvement-wise.

So I certainly understand you "playing the old tapes"; quick pattern-recognition is both a strength and a weakness of the race. But it looks to me like there are some factors here that go against the pattern.
 
Bouncing Zabaglione Bros. said:
Nvidia did send out unsupported beta review drivers that boost benchmark scores to everyone reviewing the R520 - just a couple of days ago. There are still regular complaints about Nvidia drivers not rendering things they should, or scoring higher in benchmark mode than in actual gameplay.

What else would you call that standard M.O. of Nvidia?

Perhaps you should look at it this way: why is it wrong for Nvidia to send out beta drivers when we are reading reviews of a video card that won't be available for at least another month? If those drivers aren't available by the time of launch (approximately one month from now), then you have a valid point. If those drivers go public before, on, or soon after the release of the R520, then you may want to take your words back.
 
geo said:
Consider: we've heard the Euro guys say they were benching in Ibiza from Sept 30, and we've also heard that the Norte Americanos were favored and had been benching for a week or so before that. "A couple of days back" is way too late to impact any of that, and I suspect NV knew the schedule long before you and I did.

So I certainly understand you "playing the old tapes"; quick pattern-recognition is both a strength and a weakness of the race. But it looks to me like there are some factors here that go against the pattern.

You call it "old tapes", but I've been reading complaints about several of the last betas - all unsupported, yet released. It's the same old same old, still happening today.

ondaedg said:
Perhaps you should look at it this way: why is it wrong for Nvidia to send out beta drivers when we are reading reviews of a video card that won't be available for at least another month? If those drivers aren't available by the time of launch (approximately one month from now), then you have a valid point. If those drivers go public before, on, or soon after the release of the R520, then you may want to take your words back.

We know that ATI releases supported drivers to the public every month. We know that Nvidia leaks unsupported drivers all the time, and produces special unsupported review drivers as spoilers when other companies launch competing products.

There's a big difference between new drivers for a soon-to-be-released product and buggy, unsupported spoiler drivers whose only purpose is to boost benchmark scores in the face of a competitor's launch.
 
SugarCoat said:
LN2 does amazing things :). From the pics, all I see are three 120mm fans (one on the CPU) in addition to the stock cooling.

I wish they'd do more than the damn 3DMark stuff, though.
I'm pretty sure someone beat 12k on a single 7800 GTX with just water cooling.
 
Dave Baumann said:
Notebook versions aren't planned at the moment (I'm wondering if they are looking to R580 for that now). However, "Idle" implies Windows 2D, where NVIDIA clocks down but ATI doesn't yet on the desktop parts (Terry mentioned in his presentation that they would be soon, IIRC).

I don't quite follow you, Dave. PCI-SIG is showing M52, M54, M56 and M58 (not to mention the GL versions). Surely you're not suggesting that M58 is based on something other than the R520 core?
 
jb said:
But once the ATI cards are out and start to fall in price, would ATI's suggestions make more sense? I mean, ATI used MSRPs to compare...
It really depends, doesn't it? Anyway, as I said, it's really good that reviewers typically benchmarked a range of cards around the suggested competition, so that people don't have to look far if the prices fluctuate.
 
Junkstyle said:
Unless someone can show me a screenshot comparison proving otherwise, ATI has the higher IQ of the two. ATI's aniso is just DIALED.
You know what really pisses me off about this? The entire reason nVidia no longer offers its nearly angle-independent anisotropic filtering is ATI's older angle-dependent implementation, which most users claimed had a minor image-quality impact at most.

And now people are saying that ATI's new anisotropic filtering is fantastic? The two stances don't add up.

Granted, I've always been in the angle-independent camp and was really disappointed by the NV4x, but it upsets me that the reaction is so much stronger now than it was back when ATI was using its highly angle-dependent implementation and nVidia wasn't.
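
For anyone who hasn't followed the angle-dependence argument, here is a rough sketch of what it means in practice. The footprint ratio is the standard way to derive the degree of anisotropy; the angle-dependent cap below is only a toy model of the "flower" pattern AF testers show, not any vendor's actual hardware formula, and the numbers are made up for illustration.

Code:
import math

MAX_ANISO = 16  # highest degree of anisotropy our toy "hardware" supports

def aniso_degree(dudx, dvdx, dudy, dvdy):
    # Angle-independent anisotropy: ratio of the texel footprint's
    # major axis length to its minor axis length.
    lx = math.hypot(dudx, dvdx)
    ly = math.hypot(dudy, dvdy)
    major, minor = max(lx, ly), max(min(lx, ly), 1e-8)
    return min(major / minor, MAX_ANISO)

def angle_dependent_cap(theta_deg, worst=2.0):
    # Toy model of angle-dependent hardware: the full degree is only
    # reached near multiples of 45 degrees and dips toward `worst` in
    # between (an approximation of tested behaviour, not real silicon).
    closeness = abs((theta_deg % 45.0) - 22.5) / 22.5
    return worst + (MAX_ANISO - worst) * closeness

# A strongly anisotropic 16:1 footprint rotated to various angles.
for theta in (0.0, 22.5, 45.0):
    a = math.radians(theta)
    dudx, dvdx = 16 * math.cos(a), 16 * math.sin(a)  # major axis
    dudy, dvdy = -math.sin(a), math.cos(a)           # minor axis
    n = aniso_degree(dudx, dvdx, dudy, dvdy)
    capped = min(n, angle_dependent_cap(theta))
    print(f"{theta:4.1f} deg: angle-independent {n:4.1f}x, "
          f"angle-dependent {capped:4.1f}x")

At 0 and 45 degrees both models deliver the full 16x, but at 22.5 degrees the angle-dependent one collapses to 2x; that is exactly the blurring on rotated surfaces people used to complain about.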
 