8800 Series and Crysis - Are they just poor DX10 performers?

That's strange...I know we have the same setup and High is very very playable for me @ 1440 x 900. In fact Very High is almost playable at around 20 FPS. I did OC my E6600 to 2.9 GHz, and that seemed to help a bit.

Then I guess I need to stop trying to discuss what's playable and what's not, because it seems even the definition of playable varies so much. :LOL: For me, 20 fps is not even close to being playable. Different people's sensitivity to frame rate is different; the very second it drops below 40 fps I can "feel" it, there's a sudden slowness. And I'm talking about fast-paced FPS games.

But it's all good; now that I somewhat understand how G80/G92 works, in my opinion the next high end from Nvidia will be nothing short of a monster, and maybe then I'll buy Crysis. Right now TF2 is all I need.
 
Oblivion was the same way when it was released. Now you can load it up on an 8800 or even an 8600 and blow it out of the water. Or you can load up some of the texture mods and see why PC gaming really is the place to be. This is how it has always been: a game comes along that blows away what was seen as the high end, and so a new high end is needed.

Exactly. Without games like Crysis, why would we ever need anything more powerful than the 8800? If anything, the fact that the 8800 slaughters pretty much every game out there is probably a big part of the reason why we haven't had anything more powerful for getting on a year now.

Now that Crysis is here, people will be scrambling to upgrade, and thus NV and ATI will be eager to get something more powerful out the door. It's what drives PC gaming.
 
Ooof, just finished playing the demo. Makes me realize a few things.

1. I'm extremely glad I got a monitor with the best scaling chip on the market right now. The fact that 1360x768 on a 30" monitor looks almost as good as 2560x1600 on a 30" monitor makes playing with Very High quality a very real possibility.

2. Unfortunately at 1360x768 on a 30" monitor AA is an absolute must which negates a lot of that performance gain. :D

3. Looking at what they did, I can VERY easily see how this game can very quickly become CPU limited. The amount of physics calculations used in this game puts Half-Life 2 to shame.

4. In reference to point 3, where's my physics on a GPU? I have a perfectly capable X1800 XT just chomping at the bit to do physics. **sigh**

5. Yeah, you're going to need a top of the line quad core in the fastest speed possible paired up with the fastest GPU you can get your hands on to play this at high res and high settings. I'm betting Intel, AMD, and Nvidia are just LOVING Crytek right now. :)

6. Finally, a graphical showcase that surpasses Oblivion and Vanguard. Although volumetric clouds in Vanguard are still the shiznit. :)

Regards,
SB
 
So where's Ageia? Sure seems like they should be loving Crysis and how much it pushes physics. Perfect opportunity for them to shine.
 
The game's auto-detect mechanism chose "medium" for my machine (1280x1024, Opty 170, X1950 XT, no AA) and my frames have stayed over 30 fps. As far as the graphics go at this detail level... I'd rather be playing something else. I dunno, I played the shit out of Far Cry, and this is more of the same, with worse framerates and jagged edges. Pretty much in line with what I was expecting, considering I don't have the newest and best of everything in my box atm :smile:
 
Then I guess I need to stop trying to discuss what's playable and what's not, because it seems even the definition of playable varies so much. :LOL: For me, 20 fps is not even close to being playable. Different people's sensitivity to frame rate is different; the very second it drops below 40 fps I can "feel" it, there's a sudden slowness. And I'm talking about fast-paced FPS games.

But it's all good; now that I somewhat understand how G80/G92 works, in my opinion the next high end from Nvidia will be nothing short of a monster, and maybe then I'll buy Crysis. Right now TF2 is all I need.

You know right after I made the post I was thinking "I probably have a different definition of playable from pjbliverpool". Personally I've found Crysis to be just fine at 45 FPS or so, but I understand how many people prefer 60 and above. Oh well, guess we'll all be needing new video cards then. :p
 
Somebody sounds very happy with the HQV chip :) I wish Samsung or Dell would have the sense to put it in one of their 27-30 inchers so that those of us in the EU can get it too.

I was expecting to be disappointed thinking that it was probably just being overhyped.

But while the monitor still falls short in some areas (the backlighting is uneven), the scaling and overall image quality are superb.

Just wish it had a faster response time. Ah well, it's absolutely perfect for what I want it for: a high-resolution 30" desktop with the ability to display lower-resolution games full screen without looking blurry, blocky, or generally crappy like all the other LCDs I've tried.

And I have to agree, I wish Samsung or Dell had released a monitor with this chip.

Regards,
SB

PS - 20" 4:3 monitors in portrait mode work great in conjunction with a 30" 16:10 monitor.
 
1. I'm extremely glad I got a monitor with the best scaling chip on the market right now. The fact that 1360x768 on a 30" monitor looks almost as good as 2560x1600 on a 30" monitor makes playing with Very High quality a very real possibility.
Which monitor would that be if I may ask?

I solved it another way. With my 24" monitor I can only choose 1920x1200 and 1680x1050 as 16:10 widescreen resolutions. I can create lower resolutions like 1400x900 and 1280x800 in the nVidia drivers. However, custom resolutions don't get scaled by the video card itself but always by the monitor, even though I have selected that the card should do it (and the card scales more smoothly).

So I dug into how custom resolutions are stored in the registry as a binary blob, and I managed to configure the two resolutions in such a way that the signal sent to the monitor is 1920x1200, while the resolution actually rendered is smaller and gets stretched by the video card, which improves image quality in games when playing at a non-native resolution.
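In case anyone wants to poke at the same data, here's a minimal sketch (Python, purely illustrative) for hex-dumping the binary values under a display-related registry key so the blob can be inspected. The key path is a placeholder; where the driver actually keeps the custom-resolution data depends on the driver version.

Code:
# Purely illustrative: hex-dump the REG_BINARY values under a registry key
# so a custom-resolution blob can be eyeballed. The key path below is a
# placeholder -- where the driver really keeps this data varies by version.
import winreg

KEY_PATH = r"SYSTEM\CurrentControlSet\Control\Video\{YOUR-ADAPTER-GUID}\0000"  # hypothetical

def dump_binary_values(path):
    with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, path) as key:
        index = 0
        while True:
            try:
                name, data, value_type = winreg.EnumValue(key, index)
            except OSError:
                break  # ran out of values
            if value_type == winreg.REG_BINARY:
                print(name, data.hex(" "))
            index += 1

dump_binary_values(KEY_PATH)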

So, I'm happily running Crysis at 1280x800 in Windows XP with the very high settings hack. :)
 
Let's not forget that Crytek plans to license the engine, so it should stay cutting edge for the next 12-18 months at the very least.
Also, forget the medium settings. Life begins at high, even if it means upgrading your rig. If you don't want to buy a quad core and two GTXs now, wait six months; that's what I'll do.
 
Cevat Yerli in issue 13 of Edge (Spanish edition). This is a translation back from the Spanish (the original interview is in English, of course ;)), sorry for any errors ;)


"It doesn't matter what you have, our game will beat it. If you have a DX10 SLI setup with 4 GB of RAM, we will still overwhelm your PC; not because we want to, but because we want to invest in the future."

He pauses and laughs: "Actually, if you have DX10 SLI it would probably be hard for us to overwhelm your PC. But there are profiles that would make high-end machines run at half their power. Then users would say: 'What the hell! I bought a new PC for this?' So two years from now you will be able to reinstall Crysis, and I hope you do, because you love its playability, and it will look much better. Games tend to age, and that is something I try to avoid."
 
Are these cards poor DX10 performers or are there some serious optimisation issues with this game?

I am a frustrated gamer who has spent a small fortune getting his system ready for this game. Am I going to have to wait until next gen to get the performance?

I read that DX10 was supposed to perform better, with less horsepower needed for better graphics. Unified shader architecture was also supposed to be the way forward.

Yet in Crysis I can run neither AA nor AF without taking a totally ridiculous hit in performance. How long are we looking at before a card is capable of running this maxed out?

I have already done some testing and found the game is very CPU dependent and responds well to a sizeable overclock. I have not messed with GPU overclocking for now.
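A rough way to sanity-check that kind of CPU dependence (purely illustrative, with made-up numbers): drop the resolution and see whether the framerate actually moves. If it barely changes, the GPU isn't the limit.

Code:
# Rule of thumb: if fps barely improves when you drop the resolution,
# the GPU isn't the limiting factor -- the CPU probably is.
def likely_cpu_bound(fps_high_res, fps_low_res, tolerance=0.10):
    """True if the lower resolution gained less than `tolerance` (10%) in fps."""
    return (fps_low_res - fps_high_res) / fps_high_res < tolerance

# Hypothetical measurements: 28 fps at 1920x1200, 30 fps at 1280x800.
print(likely_cpu_bound(28.0, 30.0))  # True -> probably CPU limited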

The system I run on is:

E6600 @ 3GHz
4GB RAM
X-Fi sound card
8800GTS 640MB
Vista 64-bit Ultimate

So am I waiting for G92, and what specific part of the GPU architecture would need improving to make this game run well?

It already runs well, my friend :D, but it could run better.
There are many people reporting a constant 35+ fps on a single 8800GTX at Very High settings, 1600x1200, and 2x AA.
I have noticed that the shadows are very demanding; setting them down to medium gives a big performance boost.

But let's not forget what is being rendered and calculated; the A.I. is amazing, the graphics as well, etc. That's what makes it worth it for me. I choose graphics over framerate unless the framerate is way too bad, which it's not :cool:.
 
Which monitor would that be if I may ask?

It's the Gateway XHD3000.

http://www.gateway.com/programs/widescreen/30_overview.php

They overhype on the page, but it IS a really nice bit of kit. You can check various reviews around the web which vary from glowing to overall positive. The reviews were what pushed me over the edge to get it. It IS quite expensive however compared to the competition.

If you don't need excellent scaling ability and don't want to use it for movie watching, I'd suggest the Dell, Samsung, or HP 30" screens. They are MUCH cheaper, but have absolutely horrible scaling. And considering some of the flaws (uneven backlight if you look at an all-black screen), some people will find the price unreasonable. However, the positives (scaling, input options, excellent black level and contrast) outweigh the negatives for me (uneven backlight, slow response of all 30" panels).

Obakan said:
I have noticed that the shadows are very demanding; setting them down to medium gives a big performance boost.

Ah, but this is one of the very few games where the shadows are convincing enough that I actually prefer to turn them on. In general, I ALWAYS turn shadows off. But this game and Vanguard have some excellent shadows. Thief 2 also had some amazing shadows for its time. Everything else, Doom 3 included, had absolutely horrid-looking shadows that I turned off if I could.

Regards,
SB
 
Does the 2900 XT do it better in DX10, then?

No
[attached benchmark graph: crysis.gif]
 
Hmm, those median low scores make the 8800 GTS a considerably more attractive card. Though the difference would likely be smaller when turning the AA down to get a more playable framerate in that particular case, I can imagine other situations where maintaining a decent minimum frame rate could leave the 8800 GT at notably lower settings.
 
It's just nuts how much of a bummer R600 turned out to be. Obviously there is something very wrong with the hardware, or their driver team has become totally incompetent. They've had, what, probably over a year now to get the card working well? Of course, considering Cat 7.9 made my X800 XL not work at all, maybe that is the case. Heh. I was almost ready to toss that card, figuring it was dying, until I went back to 7.8 and all was fine again.
 
I found it interesting that [H] went out of their way to disagree with you about improving R600 driver performance in their piece today.

At the GeForce 8800 GTS level though, there is plenty of competition with the AMD/ATI Radeon HD 2900 XT. While this video card started out quite rough for AMD, we are finding now with the latest drivers that it is doing quite well in games we have tested.
 

I wish reviewers would include MINIMUM fps when they do this sort of review. I mean, unless there is no difference worth mentioning? If I am someone who uses 16x10 with 4x/8x AA and 16xAF almost exclusively, why would I upgrade from my 3GHz-clocked E6320 to the QX9650 or any other CPU that's considered "better"???
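To make the point concrete, here's a tiny sketch with made-up frame times (not measured from anything) showing how a run that averages around 45 fps can still bottom out near 22 fps, which is exactly what an average-only chart hides.

Code:
# Made-up per-frame times in milliseconds -- not real benchmark data.
frame_times_ms = [16.7, 16.9, 17.1, 16.8, 45.0, 44.2, 16.6, 16.7, 17.0, 16.8]

fps_per_frame = [1000.0 / t for t in frame_times_ms]

avg_fps = len(frame_times_ms) / (sum(frame_times_ms) / 1000.0)
min_fps = min(fps_per_frame)

print(f"average: {avg_fps:.1f} fps")  # ~44.7 fps -- looks fine on paper
print(f"minimum: {min_fps:.1f} fps")  # ~22.2 fps -- what you actually feel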
 