Crysis 2 PC edition OT

Yep, an 8800GT probably could have handled it at a nice solid 30fps and 720p if it had been running on a properly optimised CE3, I expect.
Not if they'd created a new Ultra settings profile (aka Very High) like in Crysis 2 that steamrolls the latest cards. Admittedly that's primarily due to the object detail setting, which drops my 6950 from 30-40fps to about 5fps if I'm looking at a lumpy brick wall lol.
 
What's interesting to me is that although he says Crysis was horribly optimized, it's still one of the best looking (and sounding) games today, and runs pretty well on today's hardware compared to the other top lookers.
 
I must say that Crysis guy sounds rather aloof, and isn't being terribly charismatic in his customer relations... I only read a few posts because I was starting to get pissed off by his attitude. :p

This is Xzero, I think.

bigtabs said:
What's interesting to me is that although he says Crysis was horribly optimized, it's still one of the best looking (and sounding) games today, and runs pretty well on today's hardware compared to the other top lookers.
Because it was, and CE2 didn't even use more than two cores.
 
Quad cores weren't all that commonplace in 2007. The Q6600 came out at the start of the year, but availability wasn't great and prices were high. Development of the game was probably mainly on dual cores, and a huge percentage of gamers didn't have a quad. Many didn't even have a duallie.

Most games of that time weren't quad core optimized. In fact it was quite a sensation when you could find one that was. Quake 4, SupCom... I can't think of any more. In 2007 there were still forum threads asking what games would even make use of their dual cores.

Of course Crysis is badly optimized by today's standards. That's good. It means devs have been learning things over the last 4 years. For its day, though, it was just fine. Don't mistake not being able to get great frames at some arbitrary setting for bad coding.

Remember all that Crysis introduced? I don't. There's too damn much of it. How could they have been expected to do all these things with future knowledge of the very best methods of doing them?

Parallax Occlusion Mapping
Screen Space Ambient Occlusion
Chromatic Aberration
Full Scene Motion Blur
Object Motion Blur
Deformable vegetation
Volumetric Clouds
Godrays

Other intensive effects (although done before by a few games probably)
Soft Shadows
Day Night Cycle
Atmospheric Effects
Translucent Objects (leaves)
Subsurface scattering (skin)
Soft particles

Combine all that with unparalleled view distance, vegetation and destructibility, and possibly the best-sounding game of its generation (it still sounds excellent, close to recent Battlefield games in quality).

Do you honestly expect that all those things would hold up to their counterparts today? Oddly enough, to look at or otherwise experience they generally hold up very well. Performance-wise, what with the console focus of the last few years, of course all these things have been optimized to hell and back and become more performant since then. That's good.

The guy from the linked forum is looking at it from today's perspective. You don't seriously think they developed Crysis thinking to themselves "haha, this is terribly optimized, they won't even be able to use this feature! lol". They did the best they could with the technology and knowledge that was available to them at the time.

They're doing the same now, and in 5 years' time they can look back on this console Crysis as they're porting it to mobile phones and say "Christ, that was horribly optimized!"
 
Good point, BT. How do you optimize something that's never been used before? Oh sure, they could have spent three more years cleaning it up, but that's true of almost any game.
 
Actually, SupCom runs the heaviest part of itself (the simulation) in one thread, so it really only benefits from a dual core. Spreading the other threads across more cores helps only until the sim is maxed out. It also has horrible problems with pathfinding killing performance.

Quake 4 I remember having stuttering problems, so I turned off its SMP option. SMP wasn't very useful.

There aren't really any games that I know of that benefit from more than 3 cores. Unreal Engine 3 is the main one that comes to mind for >2 cores. It may not exactly be the pinnacle of game tech, but UE3 does seem well threaded.

Also need to remember that SMP is not a new concept and that not all software lends itself well to it. The consoles really need multithreaded games to perform well, so obviously there are brilliant people working on it.
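The SupCom situation is basically Amdahl's law: if the simulation thread stays serial, extra cores past the second barely help. A toy calculation (the 50% serial fraction below is just an illustrative assumption, not a measured figure for SupCom):

```python
# Amdahl's law: speedup from n cores when a fraction p of the work
# is parallelizable and the rest (e.g. a single simulation thread)
# stays serial. The p value used below is an illustrative assumption.

def amdahl_speedup(p: float, n: int) -> float:
    """Overall speedup with n cores, given parallel fraction p (0..1)."""
    return 1.0 / ((1.0 - p) + p / n)

if __name__ == "__main__":
    p = 0.5  # assume half the frame time is the serial sim thread
    for cores in (1, 2, 4, 8):
        print(f"{cores} cores: {amdahl_speedup(p, cores):.2f}x")
    # With p = 0.5 the speedup can never exceed 2x,
    # no matter how many cores you throw at it.
```

Which is why spreading the remaining threads across a quad only helps until the sim core is pegged.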


What I think about Crysis is that it needed major work but they released it instead of pumping more money into it. Another year of optimization would have streamlined it more and their audience would have had better hardware to run a more polished game on. The same thing basically happened to STALKER. Obviously it turned out that from a business standpoint it was time for release.
 
What I think about Crysis is that it needed major work but they released it instead of pumping more money into it. Another year of optimization would have streamlined it more and their audience would have had better hardware to run a more polished game on.

Honestly, judging by what some people seem to think, they would've been better off just not bothering with the v high settings. There was no rule saying that anybody had to play the damn thing maxed out on release day. Stupid devs should've left the higher settings in the config files where the people who appreciate them could go and use them.
 
Good point, BT. How do you optimize something that's never been used before? Oh sure, they could have spent three more years cleaning it up, but that's true of almost any game.

Are you serious with that? Carmack has been doing it all the time. Using your logic, Quake could have been released requiring a Pentium Pro as the minimum hardware, because it used tech that no one had ever done in a game engine. Yet they were able to optimize it to run as efficiently as possible and render at SVGA 640x480 in software at playable frame rates.

Just because a tech is new, it doesn't mean you can't write an implementation for it that uses the hardware efficiently. The problem with Crysis wasn't that it wasn't using the most efficient algorithms (like we're seeing now with screen-space AO approximations) but that the implementations were inefficient on their own, wasting hardware resources.
 
Holy Jesus, Allah, Zeus, Jupiter, Amun, Assur, Huangdi, and all the other major deities I might have missed, but...

I finally had the time and inclination to try Crysis 2 with the DX11 + high-res texture patch and OMG. Tessellation of world and scenery geometry is even more beautiful and immersive than I expected.

It was exceedingly jarring whenever I ran into a patch of ground or saw a wall that didn't have tessellation.

This has easily displaced Battlefield 3 as the best-looking DX11 game I have seen thus far. However, Battlefield 3 still has much better distant terrain rendering from what I remember of playing it at a friend's house (especially those gorgeous mountains in the background).

But for things near and dear and up close, Crysis 2 at Ultra with DX11 takes the cake. OMFG, the tessellation, at least in the very first level I'm running through right now, is drool-inducingly gorgeous. I sat and stared in awe at the first area, where APC tracks were dug deep into the ground and the dirt was raised up representing the grooves in the vehicle's tires.

Watching the light and shadow play over tessellated geometry just gave me goosebumps. This is the future of graphics. I'm going to rip somebody's heart out and make them swallow it if either of the next-gen consoles from MS and Sony doesn't feature capable tessellation hardware. Yes, Davros, I realize those are consoles, but the more hardware with good tessellation capabilities, the more likely we'll see games on PC making good use of the feature.

My god, I'm still relatively speechless at how incredibly immersive it is in Crysis 2. I'm not sure I can go back to any other game on PC now without feeling like the whole world is flat and uninteresting.

Wow... Just wow. This is what I was hoping for when tessellation was made an official part of DirectX. Need more... NEED MORE. :p

Oh, and thank you AMD for allowing me to cap the tessellation factor through CCC. :p I'm not sure my poor 5870 could handle full tessellation in Crysis 2 with their rather head-scratching decision to heavily tessellate FLAT surfaces. It already struggles a bit at Ultra settings. :p If anything is making me seriously consider a 7970, this is it.

Regards,
SB
 
Just imagine what the game will look like with Panasonic's new 20" 4K monitor :oops:

http://www.engadget.com/2012/01/09/panasonic-outs-worlds-smallest-and-thinnest-4k-x-2k-ips-lcd-m/

Finally, a monitor with about as high a pixel density as IBM's ancient 22" 4K (3840x2400) monitor (the T220/T221) from 2001. :D Well, depending on which 4K standard they're backing, it may or may not have a higher pixel density. :D

Looks like it's going with the 3840x2160 standard.

But at least it'll have far faster response time than that ancient monitor (average 45 ms, IIRC).
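The pixel-density comparison works out roughly like this (assuming the T221's usually-quoted 22.2" diagonal and the 3840x2160 resolution for the Panasonic panel):

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch from a panel's resolution and diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

if __name__ == "__main__":
    # IBM T220/T221: 3840x2400 at 22.2" (commonly quoted diagonal)
    print(f"T221:      {ppi(3840, 2400, 22.2):.0f} ppi")
    # Panasonic 20" panel at the 3840x2160 standard
    print(f"Panasonic: {ppi(3840, 2160, 20.0):.0f} ppi")
```

So at 3840x2160 the 20" panel actually edges out the old T221, roughly 220 ppi versus 204 ppi.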

I'm going to guess it'll be a similar or higher price tag also. If you have to ask you can't afford it. :)

Regards,
SB
 
I'm going to guess it'll be a similar or higher price tag also. If you have to ask you can't afford it. :)

Nah, these are going mainstream soon (my guess). 4K displays are all over CES and now even tablets are getting this pixel density. Finally, after 10-15 years, we're going to get an upgrade in desktop image clarity!!!
 