Anyone else wondering if 6800 Ultra is really the 6800?

As a comment on Far Cry: I watched a video of the presentation, and the Far Cry developer said of each shader shown that it was only possible with PS 2.0 and 3.0. I don't think he ever said that any specific shader was only written for 3.0. I expect using PS 3.0 to show a performance increase for some shaders, but I don't expect any visual differences.

That said, the NV4x should pull ahead on future games when developers start to make use of the FP16 framebuffer and texture formats.
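For reference, the FP16 format mentioned here is the 16-bit "half" float layout: 1 sign bit, 5 exponent bits (bias 15), and 10 mantissa bits. A minimal C sketch of the float-to-half bit conversion, simplified by my own assumptions (denormals flush to zero, mantissa truncated rather than rounded):

```c
#include <stdint.h>
#include <string.h>

/* Convert a 32-bit float to the 16-bit "half" layout used by FP16
   texture/framebuffer formats. Simplified sketch: denormals flush to
   zero and the mantissa is truncated instead of rounded. */
uint16_t float_to_half(float f)
{
    uint32_t bits;
    memcpy(&bits, &f, sizeof bits);         /* reinterpret the float's bits */

    uint16_t sign = (bits >> 16) & 0x8000;  /* move sign to bit 15 */
    int32_t  exp  = ((bits >> 23) & 0xFF) - 127 + 15; /* rebias 8-bit -> 5-bit exponent */
    uint16_t man  = (bits >> 13) & 0x03FF;  /* keep the top 10 mantissa bits */

    if (exp <= 0)  return sign;             /* underflow: flush to signed zero */
    if (exp >= 31) return sign | 0x7C00;    /* overflow: signed infinity */
    return sign | (uint16_t)(exp << 10) | man;
}
```

The 10-bit mantissa is the trade-off: FP16 render targets give high dynamic range at half the bandwidth of FP32, which is why the format matters for future games.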
 
Chalnoth, at the link I'm giving you, you will find photographs of the Far Cry screen presentation, and beneath the first image you will see the words PS2.0; these are the very same pictures as those from PC Perspective.

The difference is that the Bit Tech photos came from the Geneva event, not the LAN event, which gives the impression they were trying to imply that PS3.0 is that much better than PS2.0.

http://www.bit-tech.net/feature/43.

Don't shoot me down Chalnoth, as I fear you; being new here I have not yet received my inoculations :LOL:
 
Identical, Chalnoth, except the photos from Bit Tech actually had PS2.0 written under the left one. It just leads me to think the FUD is already starting, and I was hoping we would not get any this round as I expect both new parts to be good.

Why would nVidia compare PS1.0 or other to PS3.0, other than to say "hey, look how good our FUD, oh, ahem, sorry, our new PS3.0 is," when their competition is ATI, and ATI have had higher specs than that for at least 2 years? :rolleyes:
 
DW Fan!!!!! said:
Why would nVidia compare PS1.0 or other to PS3.0, other than to say "hey, look how good our FUD, oh, ahem, sorry, our new PS3.0 is," when their competition is ATI, and ATI have had higher specs than that for at least 2 years? :rolleyes:
Simple, because this time around it will be reversed and nVidia will have the higher spec so they're going to push it as the biggest thing since Jesus lost his sandals. ;)

Seriously, this is gonna get a whole lot weirder/uglier before it gets better. Bank on it. I have a feeling that this isn't the only fanboy argument that is going to flip-flop in this generation's battle compared to last. (And when I say "fanboy argument" I don't mean there isn't some validity to the point, it's just I'm considering it as one of the rallying points over-active IHV enthusiasts are going to use to declare their IHV the best even if they have no clue what they're talking about. ;) )

Did I mention this is going to be a fun launch? This is going to be a fun launch! :LOL:
 
The arguments are already well under way at FM :oops: so I've decided to take a sabbatical and I'm not going to return until it dies down, but from what you say, it looks like it won't die down and this could be the worst generational flamefest ever to grace the forum world :devilish:

I might go out and buy myself an Enya CD so I can be relaxed while reading the forums, because if anything they are at least good to laugh at, FM especially :LOL:
 
DW Fan!!!!! said:
Why would nVidia compare PS1.0 or other to PS3.0, other than to say "hey, look how good our FUD, oh, ahem, sorry, our new PS3.0 is," when their competition is ATI, and ATI have had higher specs than that for at least 2 years? :rolleyes:
After watching the video of the NV40 launch, I have to say that the Far Cry presentation really was a low point. It really seemed to me like Dan Vivoli was putting words into the mouth of the Far Cry developer.

The developer stated that the advancements you're seeing are due to a "technology mod" that added shaders to the game. He stated that this mod was inspired by the GeForce 6800. He also said when describing each shader that it was only possible with PS 2.0 and 3.0. I would expect, however, that a GeForce 6800 should gain some efficiency running in PS 3.0. When those screenshots were displayed, the statement was simply, "before" and "after," no mention was made of pixel shader versions.

Dan Vivoli really did get on my nerves. There's a lot of crap that he spewed that he didn't really need to. The GeForce 6800 can really stand on its own here. Then again, it's possible that he really didn't know as much about the architecture as he should have to have given that presentation.
 
digitalwanderer said:
DW Fan!!!!! said:
Why would nVidia compare PS1.0 or other to PS3.0, other than to say "hey, look how good our FUD, oh, ahem, sorry, our new PS3.0 is," when their competition is ATI, and ATI have had higher specs than that for at least 2 years? :rolleyes:
Simple, because this time around it will be reversed and nVidia will have the higher spec so they're going to push it as the biggest thing since Jesus lost his sandals. ;)

Seriously, this is gonna get a whole lot weirder/uglier before it gets better. Bank on it. I have a feeling that this isn't the only fanboy argument that is going to flip-flop in this generation's battle compared to last. (And when I say "fanboy argument" I don't mean there isn't some validity to the point, it's just I'm considering it as one of the rallying points over-active IHV enthusiasts are going to use to declare their IHV the best even if they have no clue what they're talking about. ;) )

Did I mention this is going to be a fun launch? This is going to be a fun launch! :LOL:
In the DX9 part, nVidia have had the higher spec all along.
NV3x supports PS2.0+/VS2.0+; that is a higher spec than R3x0.
 
engall said:
In the DX9 part, nVidia have had the higher spec all along.
NV3x supports PS2.0+/VS2.0+; that is a higher spec than R3x0.
True, I should have said "useable higher spec"...but at this point that is pretty much a given around these forums so I just sort of spaced it.

My apologies, you are correct. nVidia had the higher specs for dx9, they just couldn't run them in any useable way. :)
 
digitalwanderer said:
engall said:
In the DX9 part, nVidia have had the higher spec all along.
NV3x supports PS2.0+/VS2.0+; that is a higher spec than R3x0.
True, I should have said "useable higher spec"...but at this point that is pretty much a given around these forums so I just sort of spaced it.

My apologies, you are correct. nVidia had the higher specs for dx9, they just couldn't run them in any useable way. :)
Never mind. I like you, buddy. :p :p
 
:oops: I actually think you all misconstrued my post :D

Posted by ME
Why would nVidia compare PS1.0 or other to PS3.0, other than to say "hey, look how good our FUD, oh, ahem, sorry, our new PS3.0 is," when their competition is ATI, and ATI have had higher specs than that for at least 2 years?

What I meant by that statement was: why would nVidia compare it to PS1.0 when ATI have had a minimum spec of PS2.0 for the last 2 years, as have nVidia? It sorta gives the impression that they dare not do it.

Probably my fault you guys misconstrued the question anyway, as I always get things ass-backwards :D :D

I hope you like me too, engall, or I'll set up a user name to follow you too :devilish: :devilish:
 
digitalwanderer said:
My apologies, you are correct. nVidia had the higher specs for dx9, they just couldn't run them in any useable way. :)

NVidia had higher specs in some areas but were also lacking some major features, MRTs for example. But they got away with that because those features were optional in SM2.0.
 
Chalnoth said:
After watching the video of the NV40 launch, I have to say that the Far Cry presentation really was a low point. It really seemed to me like Dan Vivoli was putting words into the mouth of the Far Cry developer.

Heh...:) Since when is it surprising to see nVidia PR people speaking "on behalf" of developers?...;) A departure for them would be to refrain from doing so.

He also said when describing each shader that it was only possible with PS 2.0 and 3.0. I would expect, however, that a GeForce 6800 should gain some efficiency running in PS 3.0. When those screenshots were displayed, the statement was simply, "before" and "after," no mention was made of pixel shader versions.

It's also reasonable to suspect that running 2.0 instructions under 3.0 might just as well be less efficient. Expect to see 3.0 used as an nVidia marketing bullet--examinations of the efficacy and scope of the nVidia nV40 support of ps3.x are still ahead of us...;)

Dan Vivoli really did get on my nerves. There's a lot of crap that he spewed that he didn't really need to. The GeForce 6800 can really stand on its own here. Then again, it's possible that he really didn't know as much about the architecture as he should have to have given that presentation.

You nailed it--generally the only time to go overboard about something like this is when you know your product is lacking something and you are attempting to compensate by way of manipulative PR (as we saw nVidia do so often with nV3x.) Even if Vivoli is indeed ignorant of specifics relative to the products made by the company which employs him to hype them, it seems to me that the less he knew the less inclined toward embellishment/exaggeration he would have been. Perhaps you sense that Vivoli is not as confident as you are in assuming that nV40 can indeed "stand on its own" in this respect, and this makes you nervous. Possibly you are fearful that instead of knowing less, Vivoli knows a bit more than has been publicly released thus far, and feels a need therefore to overcompensate from a PR perspective.

Then again, maybe Vivoli has become so accustomed to having to hype products which don't "stand on their own" that he was merely talking out of habit and exaggerating unconsciously....;) I feel like it won't be too long before all of these things become a lot plainer, one way or the other.
 
WaltC said:
Then again, maybe Vivoli has become so accustomed to having to hype products which don't "stand on their own" that he was merely talking out of habit and exaggerating unconsciously....;) I feel like it won't be too long before all of these things become a lot plainer, one way or the other.

Whilst the architecture no doubt has weaknesses, I honestly suspect it has a lot to do with this point you made (although I think you were joking, going by the smilie).

Honestly seemed like habit rather than anything that people need to read too much into.
 
WaltC said:
It's also reasonable to suspect that running 2.0 instructions under 3.0 might just as well be less efficient.
No. No it's not. There's nothing fundamentally different about operation in PS 2.0 and PS 3.0, nothing that would automatically cause performance slowdown.
 
PaulS said:
Whilst the architecture no doubt has weaknesses, I honestly suspect it has a lot to do with this point you made (although I think you were joking, going by the smilie).

Honestly seemed like habit rather than anything that people need to read too much into.

I was really only half-joking there, but am nonetheless not too worried about it, really--it's all going to "come out in the wash," eventually. The thought that occurs to me is that if nVidia was indeed keen on ps3.0 for a lot more than a marketing bullet, it seems to me that for the official launch of nV40 they could have come up with something a tad better than a mod for Far Cry which was poorly executed and poorly explained. But that's just me...;)

Chalnoth said:
No. No it's not. There's nothing fundamentally different about operation in PS 2.0 and PS 3.0, nothing that would automatically cause performance slowdown.

I'm speaking specifically of the nV40 implementation of ps3.0, and am only looking at it from the perspective of nV3x differences between ps2.0 and ps1.x. There was a big difference in nV3x between ps2.x and ps1.x hardware support such that nVidia felt compelled to "optimize" its drivers to knock ps2.x down to 1.x for much better performance (something ATi did not have to do because its R3x0 implementation of ps2.x was much better than nVidia's in nV3x.) The nV3x disparities had nothing to do with the differences between the 2.x and 1.x protocols, but had only to do with nVidia's specific implementation of ps2.x support in nV3x. I'm not saying this will be the case here, of course, just that we have to see a thorough test of ps3.x-2.x with nV40 before we can know for sure about the efficacy of its ps3.x implementation. Again, speaking only for myself, of course...;)
 
All well and good, I suppose, but the mystery remains: why would any sane person use 3.0 if it were slower than the 2.0 implementation?
 
WaltC said:
I'm speaking specifically of the nV40 implementation of ps3.0, and am only looking at it from the perspective of nV3x differences between ps2.0 and ps1.x.
Yes, and that was an entirely different scenario. Specifically, there was a precision difference between PS 2.0 and PS 1.x. PS 1.x could work at integer precision. PS 2.0 had to operate at FP precision. The NV3x was slow at PS 2.0 because it was slow at FP operations.

There cannot be any inherent drop in performance in moving from PS 2.0 to PS 3.0 as there was for the NV3x in moving from PS 1.x to PS 2.0, because there is no inherent change in how the pixel shader operations are performed.
 
Hey, wouldn't it be something if the game used ps2 rather than ps3? Just poking around a bit... ouch.
 