Crysis 2 being re-designed for GTX580

What does Metro 2033 do in DX10 Very High compared to DX11 Very High? Other than tessellation.

I've been playing it on High and Very High on my retro 8800 GTX on my 1360x768 TV. I actually have a hard time noticing what's different about Very High besides the rather large performance hit. DX10 High runs very smoothly almost all the time. I only notice significant slowdown in some parts where the complexity of the scene seems to skyrocket far beyond what is typical. Most of the time the scene complexity allows Very High to be completely playable for me, but in those extra-complex instances it turns into a slideshow.

Tessellation, possibly bokeh DOF, more samples, possibly better precision for some effects. The post-process effects being the same resolution as the framebuffer in 'very high' mode is what has the biggest performance impact (godrays). Both fully visible godrays and subtle godrays drain performance, and even more so with MSAA enabled. Fillrate abuse in its finest form.
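
To put rough numbers on that fillrate point, here's a minimal sketch, assuming a 1920x1080 framebuffer and three post passes (both figures made up for illustration, not taken from the game), of how much more pixel work full-resolution post-processing costs versus a half-resolution buffer, and how MSAA multiplies it again:

```cpp
#include <cstdio>

int main() {
    // Assumed numbers for illustration only; none of this is from the game.
    const long w = 1920, h = 1080;  // framebuffer size (assumption)
    const int passes = 3;           // post-process passes (assumption)

    const long fullRes = w * h * passes;              // 'very high': full-res post
    const long halfRes = (w / 2) * (h / 2) * passes;  // lower modes: quarter-area buffer

    std::printf("pixels shaded per frame, full-res post: %ld\n", fullRes);
    std::printf("pixels shaded per frame, half-res post: %ld (%.1fx fewer)\n",
                halfRes, (double)fullRes / (double)halfRes);

    // With 4x MSAA the effect may also have to touch 4 samples per pixel
    // on read/resolve, multiplying bandwidth cost on top of shading cost.
    std::printf("samples touched with 4x MSAA at full res: %ld\n", fullRes * 4);
    return 0;
}
```

Halving the resolution on each axis quarters the shaded area, which is why dropping post effects to a half-res buffer is such a common fillrate saver.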


Anyway, I said this because of the new Nvidia GPU tech demo:

http://forum.beyond3d.com/showpost.php?p=1491199&postcount=414
 
Is it? Shouldn't new features be used to make games run better, not worse? :???:

The problem is that the performance and the looks did not seem to match: the performance was terrible, but the incremental improvement in looks did not justify it. Who knows, though, maybe now it will work better. I was always more annoyed that Crysis did not scale well with new hardware.
 
Because Crysis is very CPU dependent... It's going to have that problem until the next full generation (instead of a refresh) after Sandy Bridge and Bulldozer.
 
Not really. Crysis is limited to using only two threads, so additional threads/cores go unused; that's why an E6600 performs the same as a Q6600 in Crysis, assuming otherwise identical systems (see the sketch below). If the target is 30-60 fps it is GPU dependent, but if the target is a locked 60 fps it becomes CPU dependent, due to an inefficiency that cannot be corrected unless future CPUs simply achieve more per MHz, since only two threads/cores will ever be used no matter what. That said, for 30 fps or better an E6600 is enough for Very High.

Crysis as a CPU benchmark with a locked 60 fps target is ridiculous and worthless. We need to wait for Crysis 2/CE3 for a benchmark engine that fully utilises hardware from 2006 to the present, since CE2 doesn't.
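
To illustrate that two-thread cap, here's a hypothetical sketch (not actual CryEngine code; the cap of 2 is just the claim from this post): an engine that hard-limits its worker count leaves a quad core's extra cores idle, which is exactly why the E6600 and Q6600 land in the same place.

```cpp
#include <algorithm>
#include <cstdio>
#include <thread>
#include <vector>

// busyWork() stands in for one thread's share of per-frame game work.
void busyWork() {
    volatile long sink = 0;
    for (long i = 0; i < 200000000; ++i) sink = sink + i;
}

int main() {
    const unsigned hwThreads = std::thread::hardware_concurrency();
    const unsigned engineCap = 2;  // the hard cap claimed in the post above
    const unsigned used = std::min(std::max(hwThreads, 1u), engineCap);

    std::printf("hardware threads: %u, threads the engine uses: %u\n",
                hwThreads, used);

    // A quad core finishes this no faster than a dual core at the same
    // clock, because the two extra cores are never given any work.
    std::vector<std::thread> workers;
    for (unsigned i = 0; i < used; ++i)
        workers.emplace_back(busyWork);
    for (auto& t : workers)
        t.join();
    return 0;
}
```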
 
Hopefully they have figured out how to really multithread their engine. I'm sure they've put a ton of work into it considering the consoles.
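
If they have, per-frame work would get split across however many cores exist instead of a fixed pair. A minimal parallel-for sketch of that idea (purely illustrative, not CryEngine's actual job system):

```cpp
#include <algorithm>
#include <cstddef>
#include <thread>
#include <vector>

// Minimal parallel-for: split n items across all hardware threads.
template <typename Fn>
void parallelFor(std::size_t n, Fn fn) {
    const unsigned workers = std::max(1u, std::thread::hardware_concurrency());
    const std::size_t chunk = (n + workers - 1) / workers;
    std::vector<std::thread> pool;
    for (unsigned w = 0; w < workers; ++w) {
        const std::size_t begin = w * chunk;
        const std::size_t end = std::min(n, begin + chunk);
        if (begin >= end) break;
        pool.emplace_back([=] {
            for (std::size_t i = begin; i < end; ++i) fn(i);
        });
    }
    for (auto& t : pool)
        t.join();  // scales with core count instead of capping at 2
}

int main() {
    std::vector<float> positions(1 << 20, 0.0f);
    // e.g. integrate a million particle positions across all cores
    parallelFor(positions.size(), [&](std::size_t i) { positions[i] += 0.016f; });
    return 0;
}
```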
 
So does this mean the 580 will rock at Crysis 2 but still chug on Crysis 1 at high res and max settings?

Now that would be funny.
 
It would be more than funny, it would be hilarious. I actually hope that is the case, though. Crysis just never really got where it needed to be. Nebula may think that 30 fps is groovy, but I want 60 :/
 
Crysis 2 may be less demanding than Crysis 1 since it has to run on consoles.
 
But Nebby poo only has a 4890 whilst you are looking at dual 6990 cards if I recall correctly from another thread. You have to understand that his cards are simply not as capable as yours...
 