Anyone else wondering if 6800 Ultra is really the 6800?

kemosabe said:
Ardrid said:
So does anyone know what the first image was? It seems far too bland to be PS2.0 based.

http://www.presence-pc.com/news/n3625.html

My translation of the author's comments on those screenshots (and remember not to shoot the messenger, since I am in no position to corroborate or refute his claims):

"But NVDA goes further, and demonstrates 4 screenshots from the game FarCry, the first set allegedly using pixel shader 2.0 versus 3.0 for the second. Obviously, if taken at face value, the difference is considerable and we would quickly conclude that this technology provides a significant improvement in image quality in a game that has already been released. Except that several remarks can be made when looking at these screenshots.

Indeed, the image of the statue rendered with shader 2.0 seems to have been rendered without using any shader technology at all, and even without Normal Mapping. The output is therefore visibly not up to par with the capabilities of even the original GeForce.

The screenshot of the staircase using shader 2.0 seems to be of higher quality. From this shot, the rendering of the stones on the wall seem to use Paralax Mapping. This technique can be rendered via shaders 1.X (DirectX 8.X), but is ideally obtained with shader 2.0. The Displacement Mapping used on the left screenshot is only possible using shader 3.0, which would make this consistent. It is therefore more than probable that these shots are authentic, although we cannot state this with certainty. Note that in this case, the difference in quality compared to shader 3.0 is also less. Difficult to investigate this issue further without a GeForce 6800 card to run FarCry.
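For anyone wondering what he means by Parallax Mapping: the gist is just an offset of the texture lookup along the view direction, scaled by a value read from a height map. A rough Python sketch of the math (the function name, inputs and scale factor are made up for illustration; the real thing lives in pixel shader code):

Code:
# Illustrative sketch of the Parallax Mapping idea, not real shader code.
# Texture coordinates get nudged along the view direction by an amount
# proportional to the sampled height, faking depth on flat geometry.

def parallax_offset_uv(u, v, view_dir, height, scale=0.04):
    """Offset (u, v) toward the viewer based on a height-map sample.

    view_dir -- (x, y, z) view vector in tangent space, z pointing out
                of the surface (must be nonzero)
    height   -- sample from the height map at (u, v), in [0, 1]
    scale    -- artist-tuned strength; 0.04 is an arbitrary example value
    """
    vx, vy, vz = view_dir
    shift = scale * height
    # Dividing by vz exaggerates the offset at grazing angles, which is
    # where the fake depth effect is most visible.
    return u + vx / vz * shift, v + vy / vz * shift

# Example: a pixel halfway up the height map, viewed at a slant.
print(parallax_offset_uv(0.5, 0.5, (0.3, 0.1, 0.9), height=0.5))

Displacement Mapping, by contrast, actually moves geometry according to the height map, which is why the article ties it to shader 3.0 (vertex texture fetch) rather than to a tweak of the texture lookup.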

P.S. I have seen FarCry with PS 2.0, and those screenshots ain't it. They're showing the difference between FarCry on a 6800U with and without PS 3.0; the "before" shot of the water doesn't even look like it has any PS enabled. These have been discussed ad nauseam in other threads.
 
I didn't bother reading the whole thread, so please shoot me if this was already brought up :D

NV40 is a monster of a chip. Dig was wondering whether or not you could end up with a part that can run 4/4 quads at 400MHz or 3/4 quads at 500MHz; I think this is definitely possible in a number of ways. But then I also think it's already pretty hard to get 4/4 functionality at all, regardless of clock speed. Yields'n stuff ;)
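The back-of-the-envelope fillrate math is why I say either config could fly; a quick Python check (treating a quad as 4 pixel pipelines, with the hypothetical clocks from above):

Code:
# A quad is a block of 4 pixel pipelines, so peak pixel throughput is
# quads * 4 * core clock. Units: MHz in, Mpixels/s out. Both configs here
# are hypothetical, not announced products.
def peak_fillrate_mpix(quads, core_mhz):
    return quads * 4 * core_mhz

print(peak_fillrate_mpix(4, 400))  # 4/4 quads @ 400MHz -> 6400 Mpix/s
print(peak_fillrate_mpix(3, 500))  # 3/4 quads @ 500MHz -> 6000 Mpix/s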

OTOH I'm pretty confident that R420 will be a somewhat less complex chip (transistor-count-wise), due to the supposedly smaller feature set, and that it'll be easier to attain higher clock speeds on that "small" chip than it is on NV40.

Overall I expect small performance advantages for R420. I don't expect any bunnies out of the hat from NVIDIA to spoil the R420 launch; I really tend to believe that they've done the best they could on the engineering front and just couldn't afford to set any higher clock speeds.

Kudos to NVIDIA. This is the first thing coming from them since the NV25 that I consider attractive at all, and, to my pleasant surprise, much more than that. Me want :oops:

Maybe I'm just dreaming but I think that now, with NVIDIA finally having a compelling architecture on their hands again, we just might see a break from the mudslinging and dirty tactics, back to reasonable and professional behaviour ;)

PS: great review, Dave, many thanks for the interesting read.
 
zeckensack said:
NV40 is a monster of a chip. Dig was wondering whether or not you could end up with a part that can run 4/4 quads at 400MHz or 3/4 quads at 500MHz; I think this is definitely possible in a number of ways. But then I also think it's already pretty hard to get 4/4 functionality at all, regardless of clock speed. Yields'n stuff ;)
We've heard that some samples (the Unreal Engine 3 demo, Vegetto-EX's family member's) were clocked at 475MHz, so I'm curious if nV throttled back to 400 to increase yields or to lower power draw.
 
Snarfy said:
http://www.azuretwilight.org/gallery/2004_04/FarCry0017

http://www.azuretwilight.org/gallery/2004_04/FarCry0015

Boy, I'm getting tired of linking to these, lol.

That's PS 2.0, on a 9800 XT =)


I saw those. What I was looking for was a side-by-side comparison of the same angle showing the differences between PS2.0 and 3.0.

We need someone with a 6800 Ultra to do this. Either that, or you could take a high-res shot of those stairs with the gold lighting on them. I would, but I don't have FarCry yet. Perhaps I'll grab it tomorrow.

Those pics are also really dark.
 
Well, I'd do the gold stairway, but it's not in the game I have.

I've beaten FarCry, and didn't once see the textures they showed in the NV40 launch movie, so it's prolly just a special little map they wrote (it's quite easy with the Sandbox editor) with the new textures, just so no one else could go there and compare ^_^ :devilish: Kind of a mean thing to do.
 
Snarfy said:
Well, I'd do the gold stairway, but it's not in the game I have.

I've beaten FarCry, and didn't once see the textures they showed in the NV40 launch movie, so it's prolly just a special little map they wrote (it's quite easy with the Sandbox editor) with the new textures, just so no one else could go there and compare ^_^ :devilish: Kind of a mean thing to do.


Ahh... tricky little bastards, aren't they? hehe.

Perhaps we can get DaveB to do it. I also put in a request with Ryan at PC Perspective, since his comparison photos are terrible and don't show PS2.0 vs PS3.0. They look like they were taken with a GeForce4 or something.

I still don't understand why no reviewers did a proper side-by-side comparison... it boggles the mind. :?:
 
I thought PS2.0 was supposed to be able to do everything PS3.0 can do? So why would PS2.0 look any worse?

It could be slower, but it shouldn't be _that_ different...
 
Andrew said:
Ahh... tricky little bastards, aren't they? hehe.
Ayep, but I don't know if this is PCPers' or nV's fault.

Perhaps we can get DaveB to do it.
It appears he has to ship his card off to the next lucky contestant, as the supply of samples is limited. Bottlenecked, to use a term more familiar to us. ;)

I still don't understand why no reviewers did a proper side-by-side comparison... it boggles the mind. :?:
Time limits. I think they had a week or so to get these previews out the door, and I'm sure many were at ATi's editor's day when the NDA lifted.
 
Hmmmmmmmmm.......from the Albatron press release:


The GeForce 6800/6800UV VGA cards also apply AGP 8X that can support the design of most mainboards on the market. The GPU has been clocked up to speeds of 600 MHz and contains 16 pipeline design, therefore the performance of GeForce 6800 is twice as GeForce FX5950.

I bet the reviewers didn't get a true Ultra. This has got to be the reason why not a single reviewer has had any power issues using a PSU under 480W. I bet the 480W recommendation is for the "other" Ultras.
 
skoprowski said:
Hmmmmmmmmm.......from the Albatron press release:


The GeForce 6800/6800UV VGA cards also apply AGP 8X that can support the design of most mainboards on the market. The GPU has been clocked up to speeds of 600 MHz and contains 16 pipeline design, therefore the performance of GeForce 6800 is twice as GeForce FX5950.

I bet the reviewers didn't get a true Ultra. This has got to be the reason why not a single reviewer has had any power issues using a PSU under 480W. I bet the 480W recommendation is for the "other" Ultras.

Or NVIDIA is playing it safe, because if your box is loaded you may need close to the 480W. Maybe it's a safety factor, or maybe they bought stock in Antec.
 
Stryyder said:
skoprowski said:
Hmmmmmmmmm.......from the Albatron press release:


The GeForce 6800/6800UV VGA cards also apply AGP 8X that can support the design of most mainboards on the market. The GPU has been clocked up to speeds of 600 MHz and contains 16 pipeline design, therefore the performance of GeForce 6800 is twice as GeForce FX5950.

I bet the reviewers didn't get a true Ultra. This has got to be the reason why not a single reviewer has had any power issues using a PSU under 480W. I bet the 480W recommendation is for the "other" Ultras.

Or NVIDIA is playing it safe, because if your box is loaded you may need close to the 480W. Maybe it's a safety factor, or maybe they bought stock in Antec.
Or it's a typo.
 
MSI speaks:

Continuing the MSI tradition of incorporating only the best engineering process and design techniques, MSI NX6800 Ultra series take advantage of the most advanced and sophisticated Copper ULTRA™ Cooling technology. This advanced ventilation cooling mechanism enables higher performance through faster clock rates, while still remains in comparable low GPU temperature-- a perfect metal craft for a perfect graphics from MSI.
 
ChronoReverse said:
From that it seems that PS2.0 can do whatever PS3.0 can do but with more passes. It's also missing the shader-antialiasing (which I don't think could make THAT much difference).
It'll make a huge difference if any shaders use procedural textures. It could also potentially make a significant difference if branching is used.
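To put a rough number on the branching point, here's a toy cost model in Python (the per-path instruction counts are invented, and real SIMD hardware executes branches in groups, so treat this as illustration only):

Code:
CHEAP, EXPENSIVE = 4, 40  # invented instruction counts for the two paths

def cost_sm2(num_pixels):
    # SM2.0-style predication: every pixel pays for both paths, and a
    # predicate merely selects which result is kept.
    return num_pixels * (CHEAP + EXPENSIVE)

def cost_sm3(num_pixels, frac_expensive):
    # SM3.0 dynamic branching: each pixel pays only for the path it
    # takes (ignoring divergence, which erodes this in practice).
    expensive_px = int(num_pixels * frac_expensive)
    cheap_px = num_pixels - expensive_px
    return expensive_px * EXPENSIVE + cheap_px * CHEAP

pixels = 1280 * 1024
print(cost_sm2(pixels))       # 57,671,680 instruction slots
print(cost_sm3(pixels, 0.1))  # 9,961,472 when only 10% take the slow path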
 
skoprowski said:
Hmmmmmmmmm.......from the Albatron press release:

<...>
The GPU has been clocked up to speeds of 600 MHz <...>
I once (accidentally) clocked my Athlon XP 2400+ at 2.66GHz. It didn't even POST :D
 