GeForce 6800 Ultra - previews thread

surfhurleydude said:
Joe DeFuria said:
I would say that the 4x mode in the NV4x is indeed very comparable to the 4X mode of the R3xx.

However, it's a step down from the 6X mode on the R3xx. And if we're lucky R420 will offer 8X MSAA with a sparse sample pattern.

The largest complaint about NV4x AA (stated nicely in the B3D review) isn't about its sample pattern...it's that it only goes up to 4X. Given the power of the card, and that many games will be CPU limited, higher MSAA modes should be doable from a performance perspective.

Agreed 110%.

Agreed, too, though I would add lack of programmable patterns and gamma correction to the above. But the card has the power to run 4x at high resolutions, and that should certainly satisfy the bulk of the gaming market.
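To illustrate why the sample pattern matters at all: a toy sketch counting how many distinct coverage steps a near-horizontal edge gets from an ordered 4x grid versus a rotated/sparse 4x pattern. The offsets below are made up for illustration, not NV4x's or R3xx's actual positions:

```cpp
#include <cstdio>
#include <set>

struct Sample { float x, y; };

// Made-up 4x positions inside a unit pixel (illustrative only).
const Sample ordered4[4] = { {0.25f, 0.25f}, {0.75f, 0.25f},
                             {0.25f, 0.75f}, {0.75f, 0.75f} };
const Sample rotated4[4] = { {0.375f, 0.125f}, {0.875f, 0.375f},
                             {0.125f, 0.625f}, {0.625f, 0.875f} };

// Sweep a near-horizontal edge vertically through the pixel and count
// the distinct coverage values it can produce: more = smoother ramp.
int coverageLevels(const Sample* s, int n)
{
    std::set<int> levels;
    for (int step = 0; step <= 100; ++step) {
        const float t = step / 100.0f;
        int covered = 0;
        for (int i = 0; i < n; ++i)
            if (s[i].y < t) ++covered;
        levels.insert(covered);
    }
    return (int)levels.size();
}

int main()
{
    printf("ordered grid: %d coverage levels\n", coverageLevels(ordered4, 4)); // 3
    printf("rotated grid: %d coverage levels\n", coverageLevels(rotated4, 4)); // 5
    // The rotated/sparse layout exposes four distinct y offsets instead of
    // two, so near-axis edges get twice as many intermediate shades.
    return 0;
}
```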
 
John Reynolds said:
surfhurleydude said:
Joe DeFuria said:
I would say that the 4x mode in the NV4x is indeed very comparable to the 4X mode of the R3xx.

However, it's a step down from the 6X mode on the R3xx. And if we're lucky R420 will offer 8X MSAA with a sparse sample pattern.

The largest complaint about NV4x AA (stated nicely in the B3D review) isn't about its sample pattern...it's that it only goes up to 4X. Given the power of the card, and that many games will be CPU limited, higher MSAA modes should be doable from a performance perspective.

Agreed 110%.

Agreed, too, though I would add lack of programmable patterns and gamma correction to the above. But the card has the power to run 4x at high resolutions, and that should certainly satisfy the bulk of the gaming market.

Me too. Considering the kind of quality AA we've had with the ATI Radeons, I'm disappointed that Nvidia have not been able to improve on things again this generation.

It's really difficult to go backwards once you are used to high quality, and I know I'd rather have 100 fps and higher quality than 200 fps at low quality.
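On the gamma correction point above: a minimal sketch of why it matters, assuming a simple power-2.2 curve (real hardware uses sRGB-like transfer functions, and NV4x/R3xx specifics differ):

```cpp
#include <cstdio>
#include <cmath>

// Assuming a simple power-2.2 gamma curve for illustration.
float toLinear(float c) { return std::pow(c, 2.2f); }
float toGamma(float c)  { return std::pow(c, 1.0f / 2.2f); }

int main()
{
    // An edge pixel half covered by white (1.0) over black (0.0).
    float naive   = (1.0f + 0.0f) / 2.0f;                               // 0.50
    float correct = toGamma((toLinear(1.0f) + toLinear(0.0f)) / 2.0f);  // ~0.73

    // Averaging the stored (gamma-encoded) values displays much darker
    // than true half intensity, so the AA ramp on bright edges looks off.
    printf("naive resolve: %.2f   gamma-correct resolve: %.2f\n", naive, correct);
    return 0;
}
```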
 
When?

Have any of the reviews mentioned the actual availability date? I haven't noticed any in the reviews I've read.

I think nvidia gets the "Most Improved" award this round no matter what.
 
Re: When?

Me said:
Have any of the reviews mentioned the actual availability date? I haven't noticed any in the reviews I've read.

I think nvidia gets the "Most Improved" award this round no matter what.

To answer my own question:
Availability
The first GPUs based on the NVIDIA GeForce 6 Series, the GeForce 6800 Ultra and GeForce 6800 models, are manufactured using IBM’s high-volume 0.13-micron process technology and are currently shipping to leading add-in-card partners, OEMs, system builders, and game developers.

Retail graphics boards based on the GeForce 6800 models are slated for release in the next 45 days.
 
damn

Slashdot just posted the news, and now all the webservers are melting. Oh well, we got all the good info already anyway.
 
Bouncing Zabaglione Bros. said:
John Reynolds said:
surfhurleydude said:
Joe DeFuria said:
I would say that the 4x mode in the NV4x is indeed very comparable to the 4X mode of the R3xx.

However, it's a step down from the 6X mode on the R3xx. And if we're lucky R420 will offer 8X MSAA with a sparse sample pattern.

The largest complaint about NV4x AA (stated nicely in the B3D review) isn't about its sample pattern...it's that it only goes up to 4X. Given the power of the card, and that many games will be CPU limited, higher MSAA modes should be doable from a performance perspective.

Agreed 110%.

Agreed, too, though I would add lack of programmable patterns and gamma correction to the above. But the card has the power to run 4x at high resolutions, and that should certainly satisfy the bulk of the gaming market.

Me too. Considering the kind of quality AA we've had with the ATI Radeons, I'm disappointed that Nvidia have not been able to improve on things again this generation.

It's really difficult to go backwards once you are used to high quality, and I know I'd rather have 100 fps and higher quality than 200 fps at low quality.

Yes, I think ATI will be very flattered by the fact that the NV40, despite its marvelous features, still can't outdo their FSAA quality. My only concern is that they have banked on this and they too will fail to raise the IQ level this time round. To think we'd have basically R300 IQ for the next two years irrespective of higher framerates doesn't appeal too much to me.

Irrespective of this one small blip, I think it's hats off to Nvidia's engineers, who have obviously made great strides.
 
Seiko said:
To think we'd have basically R300 IQ for the next two years irrespective of higher framerates doesn't appeal too much to me.

FSAA isn't the only factor when it comes to IQ though.
 
Bjorn said:
Seiko said:
To think we'd have basically R300 IQ for the next two years irrespective of higher framerates doesn't appeal too much to me.

FSAA isn't the only factor when it comes to IQ though.

True, and even FSAA is very subjective. With 1600x1200 and 4xFSAA potentially becoming playable, do you need any higher? I.e. would gamers prefer 1280x1024 with 8xFSAA or 1600x1200 with 4xFSAA?

Unfortunately, limited 4xFSAA combined with angle-dependent AF means Nvidia's IQ can only go so far with current games, and to me that's what the R3xx series already offered 18+ months ago.

By the time games are using shaders to wow gamers, I think the NV40 will be fairly old hat?

To me, Nvidia are to be congratulated for taking such a large leap in features, but I do think they missed a trick by not upping the FSAA and AF enough over the R3xx series.

Thankfully for them, I suspect ATI haven't either. :(
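Rough back-of-envelope arithmetic for that question, counting raw samples per frame only (ignoring bandwidth, memory footprint, and the extra detail the higher resolution buys):

```cpp
#include <cstdio>

int main()
{
    // Raw samples per frame for the two options in the post above.
    const long long highAA  = 1280LL * 1024 * 8;  // 10,485,760
    const long long highRes = 1600LL * 1200 * 4;  //  7,680,000
    printf("1280x1024 @ 8x: %lld samples/frame\n", highAA);
    printf("1600x1200 @ 4x: %lld samples/frame\n", highRes);
    // 8x at the lower resolution actually touches more samples, but the
    // higher resolution adds real texture/shader detail that AA samples
    // alone never can.
    return 0;
}
```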
 
Seiko said:
By the time games are using shaders to wow gamers, I think the NV40 will be fairly old hat?

I'm guessing that Unreal Engine 3 will be the first really shader-heavy engine. But Half-Life 2 should be coming out this summer/fall, and
it's certainly nothing to sneeze at. And Doom3 will probably play a LOT better on the GF 6800 than on the 9800 XT and FX5950.
 
Re: damn

Me said:
Slashdot just posted the news, and now all the webservers are melting. Oh well, we got all the good info already anyway.
Man am I glad I got up early and loaded up everything already! :oops:

I think a bit of the whole bloody internet is going to melt today! ;)
 
Rys, please elaborate on the cooling unit. You say it is annoying in 3D mode, yet most reviews haven't really said too much about it. I'm running a Radeon 9800 Pro right now, and I'm wondering how the two compare to each other in terms of fan noise. I'm a complete nut when it comes to noise levels.
 
Looks like a good job for nvidia. Compared to the nv30 generation, it's amazing. Compared to the 9800XT, it does pretty damn well for itself :) Personally I'm an ATI guy, but it definitely looks like the nv engineers hunkered down and did this one right :)

Anand/Derek's article is up too, though I'm already a little concerned about what they're saying. http://www.anandtech.com/video/showdoc.html?i=2023&p=2

This year the latest in the DirectX API is getting a bit of a face lift. The new feature in DirectX 9.0c is the inclusion of Pixel Shader and Vertex Shader 3.0. Rather than calling this DirectX 9.1, Microsoft opted to go for a more "incremental" looking update. This can end up being a little misleading because whereas the 'a' and 'b' revisions mostly extended and tweaked functionality, the 'c' revision adds abilities that are absent from its predecessors.

I was under the impression that 9.0c wasn't much more than a new SDK and more HLSL compile targets, and that PS3.0 was already in the 9.0 spec from day one. Are they off the wall here, or is it just me?
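For what it's worth, what shader level is actually usable comes down to what the driver reports through the D3D9 device caps, regardless of SDK revision. A minimal check (standard D3D9 API, nothing NV40-specific) looks something like this:

```cpp
#include <cstdio>
#include <windows.h>
#include <d3d9.h>   // link with d3d9.lib

// Query what the installed driver actually exposes: shader model
// support is reported through the device caps, not the SDK version.
int main()
{
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return 1;

    D3DCAPS9 caps;
    if (SUCCEEDED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT,
                                     D3DDEVTYPE_HAL, &caps))) {
        printf("PS %u.%u, VS %u.%u\n",
               (unsigned)((caps.PixelShaderVersion >> 8) & 0xFF),
               (unsigned)(caps.PixelShaderVersion & 0xFF),
               (unsigned)((caps.VertexShaderVersion >> 8) & 0xFF),
               (unsigned)(caps.VertexShaderVersion & 0xFF));
        if (caps.PixelShaderVersion >= D3DPS_VERSION(3, 0))
            printf("Driver exposes pixel shader 3.0\n");
    }
    d3d->Release();
    return 0;
}
```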
 
The 6800U is capable of 8xAA. Unfortunately it's using supersampling so it's also SUPER slow.

Hence the benchmarks all at 4x.

Not that I'm complaining, 1600x1200 at 4x beats out a lower resolution at 8x for me.
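A rough cost model for why the supersampled mode hurts so much, assuming MSAA shades once per pixel while supersampling shades once per sample (the exact sample mix in the 8x mode is an assumption here):

```cpp
#include <cstdio>

// Rough cost model (an assumption, not a measurement): MSAA shades
// once per pixel, supersampling shades once per covered sample.
int main()
{
    const long long pixels = 1600LL * 1200;
    printf("4x MSAA shader work:   %lld pixel shades\n", pixels);
    printf("8x via supersampling:  %lld pixel shades\n", pixels * 8);
    // Even a mixed mode (say 2x SS layered on 4x MS) still doubles the
    // shading and texturing work, which is why 8x benchmarks so poorly.
    return 0;
}
```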
 
ChronoReverse said:
The 6800U is capable of 8xAA. Unfortunately it's using supersampling so it's also SUPER slow.

Hence the benchmarks all at 4x.

Not that I'm complaining, 1600x1200 at 4x beats out a lower resolution at 8x for me.

Personally I think all this talk of AA is getting ridiculous. Sure, Nvidia/Ati could've improved on R3xx quality this round, but come on....how many non-cybernetically enhanced people can notice the difference between 4x and 8x AA @ 1600x1200 in real time? I don't buy games to stare at screenshots. :p
 
trinibwoy said:
Personally I think all this talk of AA is getting ridiculous. Sure, Nvidia/Ati could've improved on R3xx quality this round, but come on....how many non-cybernetically enhanced people can notice the difference between 4x and 8x AA @ 1600x1200 in real time? I don't buy games to stare at screenshots. :p

I'm sure I would notice it. (Because I notice aliasing at 1600x1200 with 4X AA). Particularly in real time, because the "smoother" the frame rate is, the easier it actually is to notice aliasing effects.

And btw, I don't buy $500 cards to run CPU limited games, having untapped GPU power which could be tapped....if only it supported better AA. ;)
 
trinibwoy said:
Personally I think all this talk of AA is getting ridiculous. Sure, Nvidia/Ati could've improved on R3xx quality this round, but come on....how many non-cybernetically enhanced people can notice the difference between 4x and 8x AA @ 1600x1200 in real time? I don't buy games to stare at screenshots. :p
Pixel crawl is most noticeable when you're moving through a scene and concentrating on the opening to a hallway or something. Having to ignore the movement along the edges of the "solid" walls while looking for the movement of an enemy is distracting. Then, there's the joy of watching pixels race across the horizon while you're moving towards it.

I almost never notice aliasing at higher resolutions when I'm sitting still, but it's still annoying when I'm moving.
 
Seiko said:
Yes, I think ATI will be very flattered by the fact that the NV40, despite its marvelous features, still can't outdo their FSAA quality. My only concern is that they have banked on this and they too will fail to raise the IQ level this time round. To think we'd have basically R300 IQ for the next two years irrespective of higher framerates doesn't appeal too much to me.

This is what I kept thinking as I was reading all these previews. For all the performance gains, which are significant, IQ on the nv40 is mostly described as "comparable, equal, slightly better, etc.". The 6800u isn't meant to compete with a 9800pro; it handily beats it performance-wise, but in the IQ department it's only as good as 18-month-old tech. How can the nv40 compete IQ-wise with the r420 if the r420 improves IQ? Maybe IQ won't be significantly better on the r420 either; is improving IQ going to become increasingly difficult in the future?
 
Interesting previews we have here...

The worst preview that exists... comes from the one and only Anandtech.

http://www.anandtech.com/video/showdoc.html?i=2023&p=8

Anand tries to show NVidia's improved AA quality... in the pic... where the hell are the jaggies in the NON-AA pic?

According to Anand:

The disabling of trilinear optimizations is currently available in the 56.72

Isn't it SUPPOSED to be 60.72?

http://www.anandtech.com/video/showdoc.html?i=2023&p=17

The 1.1 patch of this game makes note of the fact that PS3.0 is implemented on the NV40 path. We have (as of yet) been unable to determine exactly what function PS3.0 is serving. Maybe it's something useful like branching, or maybe it's marketing speak (technically fp32 is a PS3.0 requirement). We just won't know until we can get ahold of the developers.

Someone obviously isn't doing their research...

HardOCP confirms that PS 3.0 detection isn't working:

http://www.hardocp.com/article.html?art=NjA2LDU=

Above you can see what the console reported as operational with each card. The GeForce 6800Ultra was detected as supporting Pixel and Vertex Shader 2.0 and will run FarCry with PS/VS 2.0 mixed with PS/VS 1.1. The GeForceFX 5950Ultra and the 9800XT also follow with the exact same shader versions. Now, we know that patch 1.1 of FarCry adds Pixel Shader 3.0 profile support. However, it does not appear to be exposed yet on the 6800Ultra, and it is unknown if there is any SM3.0 programming present. Suffice it to say FarCry with DX9.0b, Forceware 60.72, and patch version 1.1 will run with PS/VS 2.0.

Dave confirms what Brent had to say...

http://www.beyond3d.com/previews/nvidia/nv40/index.php?p=11

Note: For a list of currently supported DirectX caps on NV40 and the 60.72 drivers see here, and see here for a list of supported OpenGL extensions. The 60.72 drivers currently do not expose DirectX shader model 3.0 so relevant caps for this will not be present.

Also note, I don't recall Anandtech mentioning WHAT FORCEWARE VERSION was used in their testing (please link me to that, I could've missed it somehow)...

On the whole, the reviews were very informative...

Quite an impressive improvement from NVidia on the NV40...

There are a couple of issues that still exist:
Brilinear on the NV3X series (not fixable as of the 60.72 Dets) - see the sketch after this list
Cheats (specifically in 3DMark) on the NV3X series - these will probably stick around unless NVidia finally removes them (to downplay the NV3X and try to focus on the NV4X)... and POSSIBLY tries to be clean for the next revision of 3DMark (as of late, the 60.72 ARE "futuremark approved", but only for that hardware)
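For reference, "brilinear" is usually described as squeezing the trilinear blend into a narrow band around each mip transition and falling back to plain bilinear elsewhere. A toy sketch - the band width here is made up, since drivers don't document it:

```cpp
#include <cstdio>

// Trilinear blends the two nearest mip levels across the whole LOD
// fraction; "brilinear" only blends inside a narrow band around the
// transition and uses plain bilinear (weight 0 or 1) elsewhere.
float trilinearWeight(float lodFrac) { return lodFrac; }

float brilinearWeight(float lodFrac, float band)
{
    const float lo = 0.5f - band * 0.5f;
    const float hi = 0.5f + band * 0.5f;
    if (lodFrac <= lo) return 0.0f;      // pure lower mip (bilinear)
    if (lodFrac >= hi) return 1.0f;      // pure upper mip (bilinear)
    return (lodFrac - lo) / (hi - lo);   // short blend region
}

int main()
{
    for (int i = 0; i <= 8; ++i) {
        float f = i / 8.0f;
        printf("frac %.3f  trilinear %.3f  brilinear %.3f\n",
               f, trilinearWeight(f), brilinearWeight(f, 0.25f));
    }
    // Fewer texels get a real two-mip blend, saving texture bandwidth
    // at the risk of visible mip banding - which is the complaint.
    return 0;
}
```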

Those guys who complain about ATI's adaptive aniso really should shut up now though... NVidia's adopted pretty much what ATI has done (improved AA+aniso)...

Though, in the end, I would be REALLY SURPRISED if ATI does something REALLY DIFFERENT... (besides the rumors suggesting 2.b shader specs) like changing their aniso algorithm...

We will have to wait and see...

I'll probably drop by NVNews and see some fanboys at work... I can hear them changing their "votes" on ATI's adaptive filtering already (biases and all)

The two molex connectors.... hmm... I wonder about ATI...
 