Anyone else wondering if 6800 Ultra is really the 6800?

Joe DeFuria said:
I don't see OEM jumping on the nVidia parts based on any difference in feature set. PS 3.0, AFAIC, isn't something the OEMs will feel is a great marketing feature. Both parts will claim "DX9" compliance / support.

Given how successful the 5200 was with its checkbox 2.0 shader support, I'm not so sure that OEMs will overlook the marketing potential of 3.0 shaders.


I'm not too sure why some people are pouring scorn on the idea of ATi reaching a 500MHz core and 600MHz memory. Considering they are using 0.13 micron and low-k on the R420 (and have already used such a process to reach 500MHz before), and considering we've seen 600MHz GDDR3 on some of the 6800 Ultra boards, why does it seem impossible to people?
 
Hanners said:
Given how successful the 5200 was with its checkbox 2.0 shader support, I'm not so sure that OEMs will overlook the marketing potential of 3.0 shaders.

You misunderstand.

The "DX9 checkbox" is indeed important for OEMs...no matter how crappy that implementation may be. I don't see "PS 3.0" having anywhere near the same impact...no matter how good the implementation may be.

I'm not too sure why some people are pouring scorn on the idea of ATi reaching 500MHz core and 600MHz memory -

Are people really pouring scorn on that idea?

I am personally doubtful of the 600MHz core rumours myself, but 500MHz doesn't seem out of line to me. And while using bleeding-edge (600MHz) memory would be a first for ATI, it would be in line with their past statements about being willing to take a margin hit this round to maintain their leadership position.
 
It astounds me that the CEO of NVIDIA had the balls to claim that a technology being used by IBM, NASA, TSMC, Motorola and their chief competitor is 'dangerous' and get away with it!
 
surfhurleydude said:
Hanners said:
surfhurleydude said:
We've seen low-k from nVidia as well. That proves nothing :)

We have?

NV36 is low-k.

No, it's the basic FSG process from IBM - not low-k. IBM are having more trouble with their low-k process than TSMC are, and have pretty much bypassed it as a customer option now.

A curious note from a TSMC exec here - all 90nm nodes will use some form of low-k process (not just TSMC).
 
Joe DeFuria said:
The "DX9 checkbox" is indeed important for OEMs...no matter how crappy that implementation may be. I don't see "PS 3.0" having anywhere near the same impact...no matter how good the implementation may be.

I'll agree that it hasn't got the same impact. But we shouldn't underestimate the power of Nvidia's marketing machine :)

Edit:

And some Carmack NV40 loving:

"As DOOM 3 development winds to a close, my work has turned to development of the next generation rendering technology. The NV40 is my platform of choice due to its support of very long fragment programs, generalized floating point blending and filtering, and the extremely high performance," said John Carmack, president and technical director of id Software.

Not that surprising if the R420 is SM2.0 though.
 
Joe DeFuria said:
Seiko said:
Just like a badly written soap however....

Don't be so hard on yourself...I think what you've written is a pretty good soap opera. ;)

I do think Nvidia have shown ATI one area which may bail them out, and that's IQ. If ATI can introduce a new FSAA method or fantastic AF then they may win people over. If they don't, then I see many an OEM and gamer jumping ship this round due to the feature set difference.

I see it a bit differently.

I don't see OEM jumping on the nVidia parts based on any difference in feature set. PS 3.0, AFAIC, isn't something the OEMs will feel is a great marketing feature. Both parts will claim "DX9" compliance / support.

Developers may jump on nVidia parts, and to a lesser degree, gamers, based on the features...but as usual, it's going to be performance that largely dictates mind-share and likely drives demand.

I think the OEMs will view R420 and NV40 as largely "feature comparable", so they'll be more apt to go with one or the other based on other factors, performance (which might also be a wash), but more importantly cost, power consumption, ability of ATI/nVidia to fill orders, etc.

Well, I hope you're right, but so far my unimpressed view of OEMs simply going for ticks in the box hasn't been proven wrong.

Here's a question, and a genuine one at that: do you feel, going on current assumptions, that the R420 could be considered comparable to the NV40 in terms of DX9 features? I had genuinely started to believe that the R420 would offer no new features, but instead potentially twice the performance of the R350. ATI devrel pretty much confirmed this, although they may have simply been playing down the new chip's capabilities.
 
Then never mind, I'm a fucking idiot. I remember reading it somewhere, but apparently I was wrong; I confused the 9600XT with the 5700 Ultra. Oh well, shit happens.

But damn, those are some nice NV36 OCs without low-k though!!! :oops: :oops: :oops: :oops:

The weird thing though is that NV36s can OC to just a little under the 9600XT's max OC potential, yet the 9600XT uses low-k and the NV36 doesn't. :idea:
 
surfhurleydude said:
Then never mind, I'm a fucking idiot. I remember reading it somewhere, but apparently I was wrong; I confused the 9600XT with the 5700 Ultra. Oh well, shit happens.

But damn, those are some nice NV36 OCs without low-k though!!! :oops: :oops: :oops: :oops:
Just wait 'til you see what the R420 can hit with low-k! 8)
 
Seiko said:
Joe DeFuria said:
Seiko said:
Just like a badly written soap however....

Don't be so hard on yourself...I think what you've written is a pretty good soap opera. ;)

I do think Nvidia have shown ATI one area which may bail them out, and that's IQ. If ATI can introduce a new FSAA method or fantastic AF then they may win people over. If they don't, then I see many an OEM and gamer jumping ship this round due to the feature set difference.

I see it a bit differently.

I don't see OEM jumping on the nVidia parts based on any difference in feature set. PS 3.0, AFAIC, isn't something the OEMs will feel is a great marketing feature. Both parts will claim "DX9" compliance / support.

Developers may jump on nVidia parts, and to a lesser degree, gamers, based on the features...but as usual, it's going to be performance that largely dictates mind-share and likely drives demand.

I think the OEMs will view R420 and NV40 as largely "feature comparable", so they'll be more apt to go with one or the other based on other factors, performance (which might also be a wash), but more importantly cost, power consumption, ability of ATI/nVidia to fill orders, etc.

Well, I hope you're right, but so far my unimpressed view of OEMs simply going for ticks in the box hasn't been proven wrong.

Here's a question, and a genuine one at that: do you feel, going on current assumptions, that the R420 could be considered comparable to the NV40 in terms of DX9 features? I had genuinely started to believe that the R420 would offer no new features, but instead potentially twice the performance of the R350. ATI devrel pretty much confirmed this, although they may have simply been playing down the new chip's capabilities.

Here are my two cents. Either ATI has been playing the black information/intelligence game and has a fully DX9.0c-compliant part in the R4xx, or they are going to be on the wrong end of the same treatment they gave NVIDIA last year over DX9.0 compliance and PS2.0. If they are in that situation, they really have nothing to say about it, given how they marketed their full DX9 compliance in the last generation of cards.

Why buy a card that is twice as fast as the last generation (and currently CPU-limited, giving it good future-proofing) if it doesn't have full DX9.0c compliance? Especially since DX(x) is not coming out until sometime in 2006.
 
Seiko said:
Here's a question, and a genuine one at that: do you feel, going on current assumptions, that the R420 could be considered comparable to the NV40 in terms of DX9 features?

Yes and no.

I'll sum up what I've said before, as my opinion on the matter hasn't changed: I see Shader 3.0 vs. Shader 2.0 as very similar to Shader 1.4 vs. Shader 1.1.

There's no doubt that Shader 3.0 is "better" (more capable, more flexible, etc.) than 2.0. So in that sense (assuming the R420 is not PS 3.0), they are not comparable. However, I don't see any meaningful adoption of PS 3.0 to an extent that, to end users, it really makes any difference what they have in their machine.

I had genuinely started to believe that the R420 would offer no new features, but instead potentially twice the performance of the R350. ATI devrel pretty much confirmed this, although they may have simply been playing down the new chip's capabilities.

I'm personally not expecting any major differences in pixel shaders. Vertex shaders, maybe. For ATI this time around, I think they're really going to just try to be balls-out the fastest...which, considering the good foundation of their R3xx core, isn't necessarily a bad thing.
 
surfhurleydude said:
digitalwanderer said:
Just wait 'til you see what the R420 can hit with low-k! 8)

After just quickly glancing thru 9600 XT reviews, I'm not very excited.

Don't look at the first reviews; the silicon that hit the street was much better than the silicon given out as reference cards, especially with the updated drivers and the Overdrive function.
 
surfhurleydude said:
The weird thing though is that NV36s can OC to just a little under the 9600XT's max OC potential, yet the 9600XT uses low-k and the NV36 doesn't. :idea:

No...consider the significantly lower power draw of the RV360 (no power plugs on the RV360). Up the power of the RV360 to be comparable to the NV36, and then we can start to talk on an "apples to apples" basis...
 