R520 = Disappointment

ondaedg said:
What amazes me here is the sudden change in preference when choosing a video card. Back when the NV40 was competing with the R420, the majority of this board declared that SM3.0 was useless and the overall feature set of the NV40 was great, but the raw speed of R420 was all that was important to them. Now that the reviews have shown that the X1800XT is not the speed demon everyone thought it was going to be, suddenly the preference has shifted to the feature set of the X1800XT over the 7800 GTX. I love how opinions change to accommodate the times...

Just a few notes on Shader Model 3.0:
  • At NV40 launch time, it was *completely* irrelevant because no titles used it at that time.
  • At this moment, right now, as I write this, SM 3.0 is still mostly irrelevant. A smattering of titles use it. Its usefulness has slowly increased over the recent past.
  • Over the next year, it will become more and more relevant. More and more titles will be using it. Its usefulness will rapidly increase for the foreseeable future.


It shouldn't take much effort to figure out why, a year ago, one could say SM 3.0 wouldn't be a factor in choosing a vidcard, but that same person will consider it a factor today.

Here's a comparable situation: how well vidcards support Windows Vista doesn't matter in the slightest right now, but it will matter in 2007.

I can't fathom why you say the change in opinion is "sudden" since it's taken more than a year to happen. It's neither sudden nor unexpected. I remember saying early this year that SM 3.0 would start to matter in 2006. So, in my case at least, my opinion on SM 3.0 really hasn't changed much, if at all.
 
Hubert said:
Yup. Where Nvidia has the power, Ati has the smarts. Of course there can be scenarios where one of them suits the task much better. While expecting future-proofness is stupid in this industry, I still favor Ati's solution. It's, let's say, more elegant. :)

What is more elegant? The smaller batches? Arguably yes. But going for high clocks vs wide parallelism? I don't think high clocks are necessarily elegant, and that road often leads to a dead end. Remember NetBurst? Intel got their ass handed to them by Athlon/Opteron, and now Intel is back to making chips which do more per clock cycle, and not necessarily trying to maximize clocks.

I think the high clocks are a stop-gap measure. I have a feeling that the R580 will be clocked lower, but utilize more parallelism. Xenos shows that you don't need high clocks (or even a high # of ROPs) to be a monster.
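
Just to hang some numbers on the clocks-vs-width trade-off, here's a throwaway sketch of my own, using the approximate launch specs (16 fragment pipes at 625 MHz for the X1800XT, 24 at 430 MHz for the 7800 GTX). It only illustrates that both routes land at roughly the same peak throughput; it says nothing about efficiency, which is where batch sizes and the memory controller come in.

[code]
// Back-of-the-envelope: "narrow and fast" vs "wide and slower".
// Pipe counts and clocks are the approximate launch specs of the two cards;
// one op per pipe per clock is assumed purely for illustration.
#include <cstdio>

int main() {
    struct Gpu { const char* name; int pipes; double clock_mhz; };
    const Gpu gpus[] = {
        {"X1800XT (16 pipes, high clock)", 16, 625.0},
        {"7800 GTX (24 pipes, lower clock)", 24, 430.0},
    };
    for (const Gpu& g : gpus) {
        double peak_mops = g.pipes * g.clock_mhz;  // millions of fragment ops/s
        std::printf("%-34s %5.0f Mops/s peak\n", g.name, peak_mops);
    }
    return 0;
}
[/code]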
 
geo said:
Someone asked an entirely legitimate question upstream -- how much does the extra memory and faster memory speed play a part here in the eye-popping instances where X1800XT kicks butt?
Doesn't X1800XL vs. 7800GT answer this?

ondaedg said:
What amazes me here is the sudden change in preference when choosing a video card. Back when the NV40 was competing with the R420, the majority of this board declared that SM3.0 was useless and the overall feature set of the NV40 was great, but the raw speed of R420 was all that was important to them.
Maybe when the initial benchmarks showed the X800s significantly outperforming the 6800s in the main "DX9" benchmark of the time, Far Cry, but not when later driver releases showed a much more even fight in that one game. I thought the majority of B3D forumgoers rightly credited the GF6 series for overall outclassing the X800 series once the launch uncertainty passed.

What about ATI telling the media that Sander's benchmarks are flat-out wrong? Didn't they state in rebuttal that the benchmarks would show the X1800XT completely outclassing the competition and establishing it as the leader?
It's early yet, but if Sander's #s are vindicated, ATI's PR dept has more to own up to.

And why aren't we discussing the midrange cards when it was pointed out to me that the majority of this forum will not purchase the high end but will likely purchase a midrange part (as shown by a poll conducted here)? Nvidia's midrange parts which are all of a year old are beating up on ATI's new part. Yet there is no mention of this.
I'm giving RV530 some extra time to prove itself given that its architecture seems quite a bit different from the current mold, but it's certainly underwhelming in current benchmarks.
 
DemoCoder said:
What is more elegant? The smaller batches? Arguably yes. But going for high clocks vs wide parallelism?

True. Still, I am one stupid gamer who buys Ati's PR shit about developers liking this chip.
Also, the programmable arbiter (memory controller) is very fancy, IMO :D
I am not saying I am not a bit concerned about power consumption and thus noise, but I'll probably end up with an X1800XL, and I also hope Ati can address the idle consumption issue via their drivers. Simply put, for me, the 7800 GT seems unattractive now.

I don't care about OpenGL or titles tailored for Nvidia products like Doom3 (hey, the engine does not seem that popular or spectacular, not even a VGA killer now anyway) or Riddick. I am a bit worried about how Witcher might perform; Aurora always favoured Nvidia. Maybe the added shader (fragment) effects will smooth out the difference, or maybe they won't.

I'm reading the Xbitlabs test. Interesting thought: the high consumption of the X1800XT might be partly because of the 512 MB of memory running at high clock speeds. (It is almost double what the lower-clocked 256 MB X1800XL consumes.) So comparing it to a 256 MB 7800 GTX might not be totally fair.
 
If this chip came out as planned, it would have been a winner. Now, although it is a very nice piece of HW, it just doesn't "shine" as it should in order to make any serious fuss on the market. So it seems like ATI lost this round (and that's even without counting the counter-attack from nV - whatever that might be, it'll surely be faster than the GTX and might even have some room for disabling AF optimizations/angle dependency etc.).

At least it _does_ offer some more value than the GTX (IQ-wise) so it's not "dead meat" which it definitely would be without those features. ATI barely made it this round, they'd better make sure that R580/590 etc. win in _every_ area by a significant margin, otherwise I see ATI landing somewhere between nV and S3/XGI etc. as "midrange", especially knowing how aggressive nV is and will continue to be.

As it stands, if I purchased a new card NOW, it would be green unless the prices for ATI parts reach the same levels. That's not likely at all, though, since nV has MUCH more headroom as far as pricing goes.
 
_xxx_ said:
If this chip came out as planned, it would have been a winner. Now, although it is a very nice piece of HW, it just doesn't "shine" as it should in order to make any serious fuss on the market. So it seems like ATI lost this round (and that's even without counting the counter-attack from nV - whatever that might be, it'll surely be faster than the GTX and might even have some room for disabling AF optimizations/angle dependency etc.).

True, but it still seems to shine, and I would say that what ATi lost this round, they lost before these cards were launched and reviewed: execution was poor and they shot themselves in the foot on the business side. The product is fine, but how it came about isn't.

At least it _does_ offer some more value than the GTX (IQ-wise) so it's not "dead meat" which it definitely would be without those features. ATI barely made it this round, they'd better make sure that R580/590 etc. win in _every_ area by a significant margin, otherwise I see ATI landing somewhere between nV and S3/XGI etc. as "midrange", especially knowing how aggressive nV is and will continue to be.

This I don't agree with at all. There is no reason to win in every category. What matters most now is SM 3.0 performance and HDR, along with IQ, but that's subjective. I say this because SM 3.0 should see a serious boost in adoption now that both major IHVs are playing on the same field. This is a good thing (said Martha). However, this may also mean that some real advancements are made and that branching will become important, for example. The X1800XT has better dynamic branching than the 7800 GTX and supports more HDR formats with AA. If the Xbitlabs review is to be believed, the R580 will absolutely crush the 7800 GTX in dynamic branching (hint: in their test, using a custom program, the X1600XT beats the 7800 GTX in dynamic branching tests). So, while the ground may look even when they play 'older' titles, in newer titles that rely on SM 3.0 dynamic branching to bring added quality the Nvidia G7x lineup may be facing total defeat.
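
To make the batch-size part of that concrete, here's a toy divergence model of my own sketching, not ATI's or Xbit's methodology. It assumes a batch executes the expensive branch path if any fragment in it takes that path, and uses the commonly quoted batch sizes of 16 fragments for R520 and on the order of 1024 for G70; both numbers are illustrative only.

[code]
// Toy model of SIMD branch divergence: a batch is charged for the expensive
// branch path if ANY fragment in it takes that path. Batch sizes of 16 (R520)
// and 1024 (G70) are commonly quoted figures, used here only for illustration.
#include <cstdio>
#include <random>

double wasted_per_pixel(int pixels, int batch_size, double hot_fraction,
                        std::mt19937& rng) {
    std::bernoulli_distribution takes_branch(hot_fraction);
    long long charged = 0, needed = 0;
    for (int p = 0; p < pixels; p += batch_size) {
        int hot_in_batch = 0;
        for (int i = 0; i < batch_size; ++i)
            if (takes_branch(rng)) ++hot_in_batch;
        needed += hot_in_batch;
        if (hot_in_batch > 0) charged += batch_size;  // whole batch pays
    }
    return double(charged - needed) / pixels;
}

int main() {
    std::mt19937 rng(42);
    const int pixels = 1 << 20;   // ~1 Mpixel frame
    const double hot = 0.05;      // 5% of fragments need the costly path
    // Random placement understates the gap: in real scenes expensive fragments
    // cluster spatially, which favours small batches even more.
    std::printf("batch   16: %.2f wasted branch executions per pixel\n",
                wasted_per_pixel(pixels, 16, hot, rng));
    std::printf("batch 1024: %.2f wasted branch executions per pixel\n",
                wasted_per_pixel(pixels, 1024, hot, rng));
    return 0;
}
[/code]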

As it stands, if I purchased a new card NOW, it would be green unless the prices for ATI parts reach the same levels. That's not likely at all, though, since nV has MUCH more headroom as far as pricing goes.
I was just thinking about this the other day and thought it would make an excellent thread question.

"If you are standing in a store with both the X1800XT and 7800 GTX in front of you on the table and with the money to buy either, which do you leave the store with?"

My answer would be the X1800XT were it not for these damn rumors that R580 will be released sooner rather than later. I want some of that!

Interesting times ahead, methinks.
 
wireframe said:
True, but it still seems to shine, and I would say that what ATi lost this round, they lost before these cards were launched and reviewed: execution was poor and they shot themselves in the foot on the business side. The product is fine, but how it came about isn't.

They lost simply because they're too late; that's what I meant, from a pure marketing standpoint. Better AF and HDR+AA are not enough to justify the pricing, and the benchmark numbers (with current games, which of course won't really let ATI's shader power shine) are also not looking that good. They'd need benchmarks which better show the efficiency advantage ATI has with dynamic branching etc., and there simply are none out there, or none which are popular/well known, except FEAR maybe.

This I don't agree with at all. There is no reason to win in every category. What matters most now is SM 3.0 performance and HDR, along with IQ, but that's subjective.

After all the delays and x-fire problems etc.: for the not-so-knowing masses, they would have had to beat nV with at least a 50% advantage in every single benchmark for the right "impact" in the media. Also, it would have had to be not more than 10-15% more expensive than nV in _that_ case. As it is, the pricing would have to be about equal in order for ATI to "reach the masses", and I don't believe for a second that they'll be able to follow nV there, pricing-wise. The 7800s are cheap and plentiful in comparison, and nV can lower prices as they wish. But they don't even need to, since Joe Doe will see nV winning half the benchmarks, trailing only ~10-20% in most of the rest, and being $100 cheaper even now. Go guess what Joe will buy ;)

I was just thinking about this the other day and thought it would make an excellent thread question.

"If you are standing in a store with both the X1800XT and 7800 GTX in front of you on the table and with the money to buy either, which do you leave the store with?"


Make it a poll with the question "I'd choose X1800XT over GTX if:" with options like "...ATI was not more than 10% more expensive", "...same cost" and "...GTX would cost $100 less" or so.

EDIT:
I'll do it for you :)
http://www.beyond3d.com/forum/showthread.php?t=24311
 
Hellbinder said:
I am still trying to figure out what benchmarks people are looking at...

Extremetech for one shows ATI in the lead most of the time.

which ones are U reading?
 
The Baron said:
Well yeah, but isn't that just kind of obvious from its function alone? Why would increasing AF on certain angles result in a generic performance hit when plenty of scenes don't have said angles?

This is why God created synthetic benchmarks. Somebody, get to it. Make us an AF tunnel with a user-selectable number of sides.

Well, until the almighty wakes up from his nap:

http://www.xbitlabs.com/articles/video/display/radeon-x1000_19.html
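
For anyone who'd still rather roll their own while waiting: here's a minimal sketch of what an N-sided AF tunnel boils down to, just enumerating the wall orientations such a test would present. The 15-degree "favoured angle" tolerance is an arbitrary illustration, not a measured figure for any chip.

[code]
// Minimal sketch of an N-sided "AF tunnel": walls sit at evenly spaced
// orientations, so a filter that only applies full anisotropy near 0/45/90
// degrees leaves the in-between walls blurrier. The 15-degree tolerance below
// is an arbitrary illustration, not a measured figure for any hardware.
#include <algorithm>
#include <cmath>
#include <cstdio>
#include <cstdlib>

int main(int argc, char** argv) {
    int sides = (argc > 1) ? std::atoi(argv[1]) : 12;  // user-selectable
    for (int i = 0; i < sides; ++i) {
        double angle = 360.0 * i / sides;              // wall orientation
        double m = std::fmod(angle, 45.0);             // offset from nearest 45-degree step
        double off = std::min(m, 45.0 - m);
        std::printf("wall %2d at %6.1f deg -> %s\n", i, angle,
                    off < 15.0 ? "favoured angle" : "off-angle");
    }
    return 0;
}
[/code]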
 
The Hexus benchmarks show the X1800 leading in Far Cry, FEAR and Battlefield 2. The Tech Report benchmarks add Splinter Cell to this list. The Driver Heaven benchmarks add Half-Life 2: Lost Coast, CS:S - VST, NFSU 2 and Fable. I don't understand how people can say the X1800 is losing in most games.
 
Subtlesnake said:
The Hexus benchmarks show the X1800 leading in Far Cry, FEAR and Battlefield 2. The Tech Report benchmarks add Splinter Cell to this list. The Driver Heaven benchmarks add Half-Life 2: Lost Coast, CS:S - VST, NFSU 2 and Fable. I don't understand how people can say the X1800 is losing in most games.

Not all games, but at least half of them. What matters though: it does not _crush_ the GTX in _any_ benchmarks.
 
BTW, just checked it out, the nVidia parts are about 100€ cheaper than their ATI counterparts here and now.

7800GT - ~360€
X1800XL - ~450€

7800GTX - ~470€
X1800XT - ~599€

Neither XT nor XL are in stock, though...
 
wireframe said:
My answer would be the X1800XT were it not for these damn rumors that R580 will be released sooner rather than later. I want some of that!

Not that I've seen it, but I'm surprised no one has made a poll yet.

Come November 11th, will you buy an X1800XT or just wait a few more months until R580 is released... well, I don't know exactly when it's released.

Beh, make it simple: X1800XT or R580, which one will you get :D
 
I think we can pretty much guess now why exactly ATi didn't release R400 on time and instead brought up R420. If you take a closer look at R520, people would've reacted with even more disappointment back then, even though there is absolutely no reason for it. It's just the way a lot of people look at reviews, and to be honest, I really don't like it.

R520 is a fantastic card. It's late, and it's not excusable that it draws that much power, but it again redefines quality at a level NV has to compete with. That's a win-win situation for us all. If ATi had again focused solely on performance, I wouldn't have found any excuse to buy a high-end >$400 SKU.
 
_xxx_ said:
BTW, just checked it out, the nVidia parts are about 100€ cheaper than their ATI counterparts here and now.

7800GT - ~360€
X1800XL - ~450€

7800GTX - ~470€
X1800XT - ~599€

Neither XT nor XL are in stock, though...

How long do you think it will take to create price stability between competitive products (if ever)? Nvidia would be smart to drop MSRP on the 7800GTX and add a 512 MB card at the same MSRP as the X1800XT. Personally though, I'm waiting until ATI releases the R580. They have the efficiency now, and they only need a wider part.
 
_xxx_ said:
Not all games, but at least half of them. What matters though: it does not _crush_ the GTX in _any_ benchmarks.
It depends what you mean by crush. At 1600*1200 w/4*AA and 16*AF it's at least 28% faster in FEAR, 20% faster in BF2 and 20% faster in Far Cry.
 
CMAN said:
How long do you think it will take to create price stability between competitive products (if ever)? Nvidia would be smart to drop MSRP on the 7800GTX and add a 512 MB card at the same MSRP as the X1800XT. Personally though, I'm waiting until ATI releases the R580. They have the efficiency now, and they only need a wider part.

They already said they're releasing the 512MB GTX soon.

And if it turns out to be a 90nm process, we'll have a new king of the hill and a reason for ATI to hurry with the release of the R580. Though I don't think it'll be 90nm (EDIT: the 512MB 7800, I mean).
 
Subtlesnake said:
It depends what you mean by crush. At 1600*1200 w/4*AA and 16*AF it's at least 28% faster in FEAR, 20% faster in BF2 and 20% faster in Far Cry.

I'm beginning to think losing in FEAR by such a large margin may come down to memory buffer size. We should find out when Dave releases his numbers.
 
Subtlesnake said:
It depends what you mean by crush. At 1600*1200 w/4*AA and 16*AF it's at least 28% faster in FEAR, 20% faster in BF2 and 20% faster in Far Cry.

And I can give you a bunch of games where it's not the case.

Also, most reviews see the 7800 in the lead with FarCry :???:


Crush = at least 50% faster in everything and anything.
 