8800 Series and Crysis - Are they just poor DX10 performers?

Why don't you guys just pretend that there is no high setting and play on medium? IMO it is a bit strange to want devs not to produce software that is beyond the reach of current hardware as long as that software is scalable to lower settings. It's not like crysis doesn't justify its performance with the visuals it generates.

Sure there is probably room to optimize but maybe this has more to do with ego than anything else. Nobody likes seeing their mighty machines so thoroughly humiliated :)
 
Good for you, but it's not really contributing to the discussion of the problems he (the OP) is experiencing using AA/AF.

There's no problem, he just has unrealistic expectations for a game that's supposed to push the graphical envelope.
Using AF is no problem since it's pretty much free on the 8000 series.
Why don't you guys just pretend that there is no high setting and play on medium? IMO it is a bit strange to want devs not to produce software that is beyond the reach of current hardware as long as that software is scalable to lower settings. It's not like crysis doesn't justify its performance with the visuals it generates.

Sure there is probably room to optimize but maybe this has more to do with ego than anything else. Nobody likes seeing their mighty machines so thoroughly humiliated :)


I fully agree, the game doesn't look "bad" at all on medium and I can play it at 1680x1050
 
There's no problem, he just has unrealistic expectations for a game that's supposed to push the graphical envelope.
Using AF is no problem since it's pretty much free on the 8000 series.



I fully agree, the game doesn't look "bad" at all on medium and I can play it at 1680x1050

AF is not free in DX10 mode.
 
While on one hand it seems silly that a dev would put out a game that no available GPU can max out at 60 FPS (at reasonable resolutions), at the same time I'm playing the game on High @ 30 FPS on my 8800 GTS and there is no doubt that it looks better than any other game I've played. So I'm not really sure what the big problem is.
 
Why don't you guys just pretend that there is no high setting and play on medium? IMO it is a bit strange to want devs not to produce software that is beyond the reach of current hardware as long as that software is scalable to lower settings. It's not like crysis doesn't justify its performance with the visuals it generates.

Sure there is probably room to optimize but maybe this has more to do with ego than anything else. Nobody likes seeing their mighty machines so thoroughly humiliated :)

Because perception is subjective. And for me, as well as many others, medium settings look A) like crap and B) miles upon miles from what we've come to expect the game to look like. Not to mention, no AA without a further huge hit to the already abysmal FPS is alone a deal breaker for me.
 
Those who run medium settings: simply tick the textures up. That will help TONS with how the game looks. I was a bit disappointed my first time playing at all medium, but I ticked Textures to High, got the same performance, and the game looks TONS better. There is a huge difference between Medium and High textures.
 
Lose the AA, it's a killer for Crysis and bugged atm. People are reporting having their framerates halved or worse with it enabled.
 
Ok guys some feedback on some testing.

Game runs better in DX9 mode at same settings.
The game seems fillrate-dependent: at 1280x720 I get an increase of 15fps in DX10 mode, which makes the game completely playable at 25-30fps. As soon as I go up to 1680x1050 (my native res) my fps drops to 16fps.

So would I be right that it is more a fillrate problem?
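Rough numbers to back that up: 1680x1050 is 1,764,000 pixels versus 921,600 at 1280x720, about 1.9x more, and the drop from ~30fps to 16fps is also about 1.9x. Frame time scaling almost one-to-one with pixel count is what you'd expect from a fill/pixel-shading limit rather than a CPU or geometry limit.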
 
IMO it is a bit strange to want devs not to produce software that is beyond the reach of current hardware as long as that software is scalable to lower settings.
Definitely true. It'd be great if I could bump up the visual quality of Myth II now to keep up with the times, but alas anything more than a Voodoo3 doesn't help much ;)

Nobody likes seeing their mighty machines so thoroughly humiliated :)
Hehe, I do, as long as it is justified. Progress ftw!
 
Just done a bit of basic math.

At 16-20fps I would need a 50% increase in performance from G92 to give me playable framerates (roughly 24-30fps).

Is this realistic to expect from a second-gen DX10 part?
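(Checking the arithmetic: 24/16 = 1.5 and 30/20 = 1.5, so it really is a 50% increase at both ends of the range.)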
 
batch, batch, batch

On the CPU issue:

http://www.shacknews.com/featuredarticle.x?id=639

Bolding/italics are my edits.
A64 3800+, WinXP, X1950GT 512, Medium:

My gpu benchmark result:
!TimeDemo Run 2 Finished.
Play Time: 78.17s, Average FPS: 25.58
Min FPS: 0.00 at frame 372, Max FPS: 35.76 at frame 68
Average Tri/Sec: 18584438, Tri/Frame: 726406
Recorded/Played Tris ratio: 1.26
!TimeDemo Run 3 Finished.

Crysis seems to have a big, big batch problem; it hurts my CPU. And now the question: did Crytek do anything to minimize the small-batch problem in their D3D10 renderer?

iXBT has played a little bit with NVIDIA PerfKit and an 8800 GTX: look at the Call of Juarez DX10 benchmark.
batch count (avg): 412 428
batch count (max): 722
primitive count (avg): 1380734
primitive count (max): 2308458
setup triangle count (avg): 851683
setup triangle count (max): 2920384
 
Could you explain to me what a batch is and why it hurts if the count is high?

When you're sending work to the GPU, you must try to group ('batch') as much work as possible into as few draw calls as possible. If you don't, the bottleneck of the system will not be the GPU executing the commands, but the CPU crossing from user mode into kernel mode and submitting the commands to the GPU.

See here: http://developer.nvidia.com/object/GDC_2003_Presentations.html
"Batch, Batch, Batch:" What Does It Really Mean?

Or just google "gpu batching".
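
To make it concrete, here is a tiny self-contained sketch (plain C++, not real D3D code, and the 0.03 ms per-call cost is a made-up illustrative number) of how per-draw-call CPU overhead scales:

#include <cstdio>
#include <vector>

struct Mesh { int triangles; };

// Made-up fixed CPU cost per draw call (submission plus driver overhead).
const double kCpuCostPerCallMs = 0.03;

double unbatchedCpuMs(const std::vector<Mesh>& scene) {
    // One draw call per mesh: CPU cost scales with object count,
    // no matter how few triangles each object carries.
    return scene.size() * kCpuCostPerCallMs;
}

double batchedCpuMs(const std::vector<Mesh>& scene, size_t meshesPerBatch) {
    // Meshes sharing render state get merged and drawn with one call,
    // so the call count (and the CPU cost) collapses.
    size_t calls = (scene.size() + meshesPerBatch - 1) / meshesPerBatch;
    return calls * kCpuCostPerCallMs;
}

int main() {
    std::vector<Mesh> scene(2000, Mesh{500}); // 2000 small objects, e.g. foliage
    std::printf("unbatched: %.1f ms CPU per frame\n", unbatchedCpuMs(scene));    // 60.0 ms
    std::printf("batched:   %.1f ms CPU per frame\n", batchedCpuMs(scene, 100)); //  0.6 ms
}

At 2000 unbatched calls the CPU alone caps you well under 20fps before the GPU has done any real work, which is why scenes full of small objects (foliage, debris) are exactly where small batches bite.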
 
So why would Crytek purposefully make it run like crap? Because that's exactly how it "feels" to me, like the GPU is hardly doing any work.
 
I played it on high on my 7900GT-on-AGP + Prescott 4.2GHz Hyperthreaded S478 + 2GB of RAM on Vista. At 1680x1050 even.

It played fantastically, at around 2fps ;) But geeeeeeeez, to look at it! I'm quite happy to play it at medium on my rig; it looks better at those settings than any other game I have at its highest, so I'm not too concerned.
 
Because perception is subjective. And for me, as well as many others, medium settings look A) like crap and B) miles upon miles from what we've come to expect the game to look like. Not to mention, no AA without a further huge hit to the already abysmal FPS is alone a deal breaker for me.

Sorry but I have to agree with what trinibwoy said.

The developer made this game to scale on multi-core CPUs, and from their interviews it seems like they went to the bleeding edge and made certain that a quad core is needed for their game code.

As for DX10 and NVIDIA and DAAMIT, it's all a matter of waiting and/or allowing those companies to optimize their drivers for Crysis, since both companies support DX10 in hardware. The way things are looking, it's better that they focus on their Vista DX10 drivers before releasing a higher-performance refresh card or a next-gen card.

This may be a bit off topic, but I see the problems people are having with Crysis as just the beginning; eventually Futuremark will release their Windows Vista DX10-only 3DMark Next that will most likely require a dual-core CPU to run respectably and a quad core to run decently.
 
That's great that you agree with trinibwoy. Since nothing in my post was about disagreeing with him, I'm not even sure why you quoted my post???
 