8800 Series and Crysis - Are they just poor DX10 performers?

IMO, getting playable frame rates at Very High, 1680x1050 with 4xAA/16xAF will require quad-SLI 8800 GTs. How easy or hard would it be for Nvidia to produce a single-GPU card with the performance of four 8800 GTs?

Well, if their next single-card product can do 2x the GT (which isn't impossible...) then an SLI'd pair would net you this kind of performance.

I don't actually think it will require that much though.
 
Well, if their next single-card product can do 2x the GT (which isn't impossible...) then an SLI'd pair would net you this kind of performance.

I don't actually think it will require that much though.

What game sees perfect scaling with SLI or CF? Even 3DMark doesn't see that kind of scaling...
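To put rough numbers on why "4x 8800 GT" is harder than it sounds: multi-GPU setups never scale perfectly, so a quad setup delivers well short of four times the frames. A small sketch (the base fps and the 75% per-extra-GPU efficiency are illustrative assumptions, not benchmarks):

```python
# Rough sketch of multi-GPU scaling. The base fps and the scaling
# efficiency are illustrative assumptions, not measured benchmarks.
def effective_fps(single_gpu_fps, num_gpus, scaling_efficiency=0.75):
    """Each GPU beyond the first contributes only a fraction of a full
    GPU's throughput (driver overhead, inter-GPU sync, CPU limits)."""
    return single_gpu_fps * (1 + (num_gpus - 1) * scaling_efficiency)

base = 20.0  # hypothetical single 8800 GT fps at Very High, 1680x1050
print(effective_fps(base, 2))  # SLI pair: 35.0, not the ideal 40.0
print(effective_fps(base, 4))  # quad SLI: 65.0, not the ideal 80.0
```

So even if a next-generation single card really were 2x a GT, a pair of them in SLI would land closer to 3.5x a GT than 4x.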
 
I think Crysis is an example of physics, environmental interactivity and AI running into the same problem that graphics have already started having, a.k.a. the uncanny valley. The world feels so alive that the parts revealing it's only a computer game are that much easier to point out. The more open-ended you make a game, and the more ways the player can interact with the world, the more you have to second-guess how different people are going to play it.

Yeah, a more generalised type of uncanny valley is a good way to describe it.

Yes, I'm very happy with my performance as well, but I'm aching for the next generation of Nvidia cards to come out, so I can see what this game would feel like if everything were on Very High, AA were 8x or better, and the framerate were always closer to 60 than not.

I can see them sitting on the Geforce 9 for a while though. With the 8800GTX blowing every game away save this one, and the very affordable 8800GT being very close, they have no reason to bring it out until they see ATI's response.

I was really hoping the Geforce 9 would be out this year, I'm quite surprised they're not releasing it and pushing it alongside Crysis.

By the way, have you noticed any weird sort of effects when turning on AF? I tried 16x AF just to see what it'd be like, and things like trees in the distance seemed to look worse, IMO.
 
Would you prefer that they made it look like Quake 3 and have it run at 2000fps?

Devs don't strive to make it slow just for the hell of it. Crysis is the current leader in realtime computer graphics. After having played it on max settings, I cannot point to a single game that looks better. For that level of realism, very powerful hardware is needed.

I don't see what everybody's problem is. Crytek set their goals high. If you want it to run like FarCry does, then set it to medium. Then it looks and runs just as well as FarCry.

Quake 3 ran slowly when it first came out. I had a TNT2 Ultra, which was a good card at the time, and it was very slow if you turned the resolution up to a reasonable number with 32-bit color, etc...

It's actually an example of a great game because it scaled beautifully; other engines and games released around the same time with similar performance did not scale nearly as well. The fact that a few years later you got 250 fps was a great thing, not a bad one...
 
Yeah, a more generalised type of uncanny valley is a good way to describe it.
I've just noticed a lot of people saying the AI is horrible in this game, and while it still does some typical things like forget you ever existed if it doesn't find you in 20 seconds, I've also seen it do really impressive things. They've flanked me, ambushed me, they swerve the jeep if I shoot trees down in the road in front of them. Turn on cloak right in front of them and they panic and begin shooting in your general direction. But in a game where everyone can experience it differently, there's bound to be issues popping up simply because people are going to find ways to exploit the game in ways the programmers didn't account for, resulting in immediate breaks in the immersion.

I think this problem is just going to get worse in the future. A game like Oblivion lets everyone have a completely different gameplay experience, with plenty of options for what to do, but if everyone can have a completely different experience, then bug testing becomes more and more insurmountable.

I was really hoping the Geforce 9 would be out this year, I'm quite surprised they're not releasing it and pushing it alongside Crysis.
Indeed. I know that even with a high price, many people would probably buy a 9800 as an impulse buy if it was released the same day as Crysis, myself included.

By the way, have you noticed any weird sort of effects when turning on AF? I tried 16x AF just to see what it'd be like, and things like trees in the distance seemed to look worse, IMO.
Personally, I haven't tried it. The textures on Very High were pretty good, so turning on AF never crossed my mind. I have heard that both AF and forcing AA through the control panel are causing issues on Nvidia cards. They are apparently working on a fix in a later driver, although they also say you'll never be able to turn on transparency AA without issues; something about the way the engine renders foliage makes it impossible.

Speaking of setting tweaks, people might want to check out this thread on the official forums. Someone went through the time to discover exactly what changes with each setting, including a color graph, and shows you how you can make any manual changes.

Personally, I found it helpful. As I continue to balance looks vs. framerate, it's nice to learn how to raise individual setting details to the Very High level, instead of taking a performance hit from flipping a whole in-game setting to Very High, which includes changes I either don't care about or hardly notice. I don't know about you guys, but other than Texture Detail and Post-Processing (and Object Blur, but that's automatic in DX10 anyway), I can barely tell the difference between the High and Very High settings unless I really concentrate.
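For anyone who hasn't done this kind of per-setting tuning before: Crysis reads console variables from a config file in the game's root folder, and the quality presets map onto `sys_spec_*` cvars. A hedged sketch of what such a file might look like; the exact cvar names and the 1 (Low) to 4 (Very High) value scale are as reported in community tweak guides like the linked thread, so verify them there before relying on this:

```ini
sys_spec_Texture = 4
sys_spec_PostProcessing = 4
sys_spec_ObjectDetail = 3
sys_spec_Shadows = 2
```

The idea is exactly what's described above: push only the settings you actually notice (textures, post-processing) to the Very High tier, and keep or lower the rest to claw back framerate.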
 
Well, if their next single-card product can do 2x the GT (which isn't impossible...) then an SLI'd pair would net you this kind of performance.

I don't actually think it will require that much though.

I hope their next high-end single-card solution will be able to run the game at a decent framerate at 1600x1200 with 2xMSAA, MS TAA and 8x quality AF, with all in-game settings on Very High.
I think I will try to hold off on the game until then. January, I have heard.
 
I hope their next high-end single-card solution will be able to run the game at a decent framerate at 1600x1200 with 2xMSAA, MS TAA and 8x quality AF, with all in-game settings on Very High.
I think I will try to hold off on the game until then. January, I have heard.

That's a lot to ask... Aren't current high-end cards running the game at those settings well under 30 fps? It would be great if such a card were released so soon, though; then "the next Crysis" (in terms of graphical fidelity) could be that much closer to reality.
 
Quake 3 ran slowly when it first came out. I had a TNT2 Ultra, which was a good card at the time, and it was very slow if you turned the resolution up to a reasonable number with 32-bit color, etc...

It's actually an example of a great game because it scaled beautifully; other engines and games released around the same time with similar performance did not scale nearly as well. The fact that a few years later you got 250 fps was a great thing, not a bad one...

I remember Quake III and how it ran at about 25-35 fps on my TNT2. The funny thing was that Nvidia later released drivers that boosted my fps at the same settings to about 45-50. Here I am hoping for some new "magic" drivers that will let me play at more than 25-40 fps on the highest settings.
 
I remember Quake III and how it ran at about 25-35 fps on my TNT2. The funny thing was that Nvidia later released drivers that boosted my fps at the same settings to about 45-50. Here I am hoping for some new "magic" drivers that will let me play at more than 25-40 fps on the highest settings.

Or the pre-release SP demo is based on an older build, and Crytek has made some major strides in optimizing performance in the last month(s) before the final release. :)
 
Or the pre-release SP demo is based on an older build, and Crytek has made some major strides in optimizing performance in the last month(s) before the final release. :)

We can only hope. OTOH, they've been working on this for quite some time; have they really only found ways to make it playable a couple of months ago? :)
 
What I saw at CES in Jan '07 wasn't running shockingly slower. They were using 24" LCDs at non-native resolution. I don't know what the res was. The PCs had 8800 GTX (perhaps in SLI). It was playing smooth enough. A friend of mine actually played it (I posted a vid back then). They told me they had NV engineers on the way to help optimize it (as they have since announced).

I'd keep the hopes of super framerate boosts in check. I don't really see it happening.
 
Or the pre-release SP demo is based on an older build, and Crytek has made some major strides in optimizing performance in the last month(s) before the final release. :)

I am sure that is the case, just from seeing that lots of bugs present in MP beta 1 are still there, even though they were fixed in MP betas 2 and 3.
 