8800 Series and Crysis - Are they just poor DX10 performers?

Would you prefer that they made it look like Quake 3 and have it run at 2000fps?

Devs don't strive to make it slow just for the hell of it. Crysis is the current leader in realtime computer graphics. After having played it on max settings, I cannot point to a single game that looks better. For that level of realism, very powerful hardware is needed.

I don't see what everybody's problem is. Crytek set their goals high. If you want it to run like FarCry does, then set it to medium. Then it looks and runs just as well as FarCry.
 
Remember when HL2 and Doom 3 were released? They were the state of the art in computer graphics back then, but one thing that contrasts those games with Crysis is that, even with industry-leading visuals, their engines were so scalable that even a GeForce 3 could run the Doom 3 engine without losing much of the eye candy.

Far Cry was a 2004 title. I don't think many people want to pay to play a game that has the exact same visuals as one from three years ago. :LOL:
 
I was able to play Doom 3 and HL2 at 1680*1050 perfectly well on my PC in 2004, with an ATI X800. I even had AA and AF in HL2, and AF in Doom 3.

Crysis is certainly pushing the envelope in graphics technology, and they have an incredible number of features, but I just don't want to bother with getting a PC for it. I'll get an X360 instead (and a PS3 eventually) - they can meet me there...
 
Why don't you guys just pretend that there is no high setting and play on medium? IMO it is a bit strange to want devs not to produce software that is beyond the reach of current hardware, as long as that software is scalable to lower settings. It's not like Crysis doesn't justify its performance with the visuals it generates.

Sure there is probably room to optimize but maybe this has more to do with ego than anything else. Nobody likes seeing their mighty machines so thoroughly humiliated :)

I have to agree. The game looks ridiculously good on Very High but my GTS 640 can't touch it. In fact I'm barely getting playable frames at High with 1280x800.

Problem is, Medium doesn't look all that great IMO.
 
I'm getting a good 21 fps in the GPU test at the Very High setting at 1280*960 under Vista with an overclocked 2900XT. But considering that ATi still hasn't rolled out an optimized driver (a.k.a. hot-fix), there's room for improvement, I believe. ;)
 
Well, I suppose for all of you that don't like to have a game that tries to push the graphics envelope, Crytek should have just released the game with Medium settings + High Textures as the Ultra High setting and just chopped out the rest.

Then you'd all have been happy campers.

Sheesh. I still remember when Unreal brought EVERYONE's video card to its knees. There wasn't a single video card at the time that could get anything but a slideshow at 1600x1200. Hell, no card at launch could run it at 1280x1024 at a playable framerate. Almost everyone had to drop to 800x600 or 640x480 for playable framerates.

Unreal Tournament was almost as bad, except that a few cards could manage 1280x1024, although NOT at maxed settings. And 1600x1200 was still WAY out of reach.

I also don't seem to remember Far Cry running all that well at 1600x1200 with everything maxed and AA+AF. Lucky for Crytek that 24" monitors were far less popular then, or I'd imagine you all would have been blasting them for it not being playable at 1920x1200 with everything cranked up. :p

Meh. I'll be happy playing at 1440x900 or 1680x1050 on my 30" monitor. Thank god the Gateway 30" has such incredible scaling abilities. I honestly have an extremely hard time differentiating a 1920x1200 source running full screen from a 2560x1600 source running full screen. Text is still extremely sharp.

Regards,
SB
 
I don't remember all that many monitors with 1280x1024 and higher resolutions being around when Unreal came out. In 1998 the standard was more along the lines of 17'' tubes with a usable desktop resolution of 1024x768. When I got the game in early '99 it ran pretty well on a PIII 550/64MB/VoodooBanshee at 800x600. AA and AF settings didn't figure, simply because they didn't exist/weren't usable. It ran even better (less hard drive action) after upgrading to 128MB, then 256MB, and a GF2MX a few years later.

I can see how disappointing it must be for people with anything sub-88xx. I think the GTS 640 I have is borderline; it does alright, but I didn't enable AA and I'm not sure I'd be too keen to see the result framerate-wise.

PLUS I think what helps subjectively with Crysis is motion blur.
 
Yeah, I was going to say: when Unreal came out, 1024x768 was like today's 1920x1200, and 1600x1200 wasn't even considered. Even though his premises are, eh, not accurate, his point is understood. I mean, even though everyone is complaining, there is no denying the fact that when MAXIMUM settings are used, Crysis looks unbelievable. Nothing comes close as far as realism, not only in visuals but also in physics.
 
No pain no gain. /shrug

In a world of Ultra SLI, a single GTS looks very midrange-y to me.
 
Yes indeed, if you do not have an ultra GPU then you get to flick the settings down a notch and then you are OK.

If you then complain that medium looks bad, then what you are really complaining about is that you expected your rig to look as good as a rig costing 3x as much - is that reasonable? For the last 12 months dual 8800GTX owners have not had much to show for it, but now is payback time for them and they are laughing. Which I think is cool. (Note I am not one of them...) If it works without bugs, of course, heh :)

I've got a single overpriced 8800U at 650/2220/1800sp and it runs the hacked very high settings in DX9 at 20-30fps at 1280x960 with 0xAA / 16xAF, and although I would like it to be 1920x1200, sometimes you have to bite the bullet. It still looks fantastic at that res with the sunbeams shining down as I stroll around chopping down trees and hurling assorted wildlife 200 yards here, there and everywhere. 25fps for this game is certainly playable.

One problem though: the lack of AA is weird. It's like playing at 640x480 at some points with some structures, whereas others look perfectly OK. It gets so bad that some wooden beams look bent.
 
I'm glad Crysis pushes the envelope. What good does it do gamers for a game to be completely maxed out on one-year-old video cards, especially ones that are not top of the line?


I don't remember all that many monitors with 1280x1024 and higher resolutions being around when Unreal came out. In 1998 the standard was more along the lines of 17'' tubes with a usable desktop resolution of 1024x768. When I got the game in early '99 it ran pretty well on a PIII 550/64MB/VoodooBanshee at 800x600.

Back then one had to spring some money for a nice 21" monitor. I saved for a year just to pick up a Viewsonic 21" to run 1600x1200 desktop. I also saved for an entire year to build an entire system at the end of 1996 with a 17" monitor that could run 1280x1024. I made great use out of those monitors, until 18"/19" LCDs were the latest rage. Well worth the money to spend on the single piece of equipment you stare at all the time when using the computer.

When it came out, Unreal ran great on the Voodoo 1 at 640x480 and decently enough at 800x600. It was a game that I was happy to upgrade to a Voodoo 2. It ran even better once I had V2-SLI at 1024x768.
 
Why don't you guys just pretend that there is no high setting and play on medium? IMO it is a bit strange to want devs not to produce software that is beyond the reach of current hardware, as long as that software is scalable to lower settings. It's not like Crysis doesn't justify its performance with the visuals it generates.

Sure there is probably room to optimize but maybe this has more to do with ego than anything else. Nobody likes seeing their mighty machines so thoroughly humiliated :)

It is disappointing to have to argue with the group of people who think that games shouldn't demand more as time goes by. I can only guess that they haven't been playing games long enough to see how things have progressed over the years. Usually some console philosophy enters the argument too.

Crysis is really quite reminiscent of Unreal, as said above. Unreal took Voodoo Graphics, everyone's favorite 3D card, and pummeled it into the ground. And so ensued complaints about games becoming too demanding, etc, etc. It's just the great circle of human whining; more for free please, etc. Crysis at least offers scalability. You can run it on a very wide range of cards. You can't expect your X800/6800/fav 3 yr old card to be able to run at max quality.

If a game developer has the gall to absolutely push 3D tech to the edge of what is possible, why is that wrong? Progress is bad? New visual tech is unwanted? Apparently it insults / threatens the self-image of a number of people. One does not have to buy this game or own a PC at all. It reminds me of the old woman at the Time Warner cable office that I got to listen to and wait for a few months back when I was getting a new cable modem. She was demanding they give her cheaper cable because she can't live without it or some such. I wanted to tell her that I didn't have cable TV at all.
 
using 800x600 with everything on LOW. People are talking about playing it at playable framerates and having it look as good as all the hype videos Crytek has released, which is impossible.

I have an 8600GT in a second PC. It runs OK at 1024x768 HQ in XP. On medium settings, I can do 1680x1050 OK. It is overclocked to 730/950 though. I overclock everything lol.

Gigabyte 965P-DS3 mobo
C2D E4300 @ 2.80 GHz (stock Vcore heh)
2 GB DDR2-800
GeForce 8600 GT passive (I put a fan on its monster heatsink)
Audigy 4
XP SP2
 
It is disappointing to have to argue with the group of people who think that games shouldn't demand more as time goes by. I can only guess that they haven't been playing games long enough to see how things have progressed over the years. Usually some console philosophy enters the argument too.

Crysis is really quite reminiscent of Unreal, as said above. Unreal took Voodoo Graphics, everyone's favorite 3D card, and pummeled it into the ground. And so ensued complaints about games becoming too demanding, etc, etc. It's just the great circle of human whining; more for free please, etc. Crysis at least offers scalability. You can run it on a very wide range of cards. You can't expect your X800/6800/fav 3 yr old card to be able to run at max quality.

If a game developer has the gall to absolutely push 3D tech to the edge of what is possible, why is that wrong? Progress is bad? New visual tech is unwanted? Apparently it insults / threatens the self-image of a number of people. One does not have to buy this game or own a PC at all. It reminds me of the old woman at the Time Warner cable office that I got to listen to and wait for a few months back when I was getting a new cable modem. She was demanding they give her cheaper cable because she can't live without it or some such. I wanted to tell her that I didn't have cable TV at all.

For reference my first computer was a BBC Micro Model B with a tapeloader lol. So I certainly have seen the progress of games from the start.

My problem is that developers seem to want to push the envelope to silly levels. I mean, what is the point in having the latest hardware, only to find you need to wait another year to play the latest game maxed out?

Gaming is not cheap, and what is more frustrating is that there is nothing out there I can buy that would give me satisfaction in terms of performance. I do not like SLI as it is not reliable, produces way too much heat and consumes silly amounts of power.
 
My problem is that developers seem to want to push the envelope to silly levels. I mean, what is the point in having the latest hardware, only to find you need to wait another year to play the latest game maxed out?

See that's the problem. The complaint isn't that you are unable to play new games at a certain IQ level. The complaint is that you can't "max out" the in-game settings. Which means you would be happier if the "maxed out" settings were equivalent to Crysis' more playable medium settings. The consumer loses nothing when developers push the envelope except some false notion of pride or entitlement. Now if Crysis was unplayable and looked mediocre then that's a whole other story but I think most people agree that it generates the visuals to justify the framerates we're seeing.
 
And actually, at the time Unreal came out, 21" CRTs were fairly common in our gaming group. As were 19" CRTs capable of 1600x1200. Some of the group even splurged on 21" CRTs capable of 1800x1440.

Of course, none of us at the time were expecting any NEW game to be able to run at the max res of our monitors.

Now it almost seems like there's a vocal minority in the enthusiast community that expects every game released to run well at 1920x1200 or even 2560x1600 on top end video cards.

Personally, I just think those people are spoiled. Hell, I remember the days of trying to get Doom to run well at 320x240. :p

Regards,
SB
 
I have to agree. The game looks ridiculously good on Very High but my GTS 640 can't touch it. In fact I'm barely getting playable frames at High with 1280x800.

Problem is, Medium doesn't look all that great IMO.

That's strange...I know we have the same setup and High is very very playable for me @ 1440 x 900. In fact Very High is almost playable at around 20 FPS. I did OC my E6600 to 2.9 GHz, and that seemed to help a bit.
 
For reference my first computer was a BBC Micro Model B with a tapeloader lol. So I certainly have seen the progress of games from the start.

You're a youngster then :)

My problem is that developers seem to want to push the envelope to silly levels. I mean, what is the point in having the latest hardware, only to find you need to wait another year to play the latest game maxed out?

Gaming is not cheap, and what is more frustrating is that there is nothing out there I can buy that would give me satisfaction in terms of performance. I do not like SLI as it is not reliable, produces way too much heat and consumes silly amounts of power.
Well I'm not going to use the cheap-shot "buy a console", but your argument isn't far off needing that sort of response. Maybe "buy a smaller monitor", or "look forward to goodness" is better?! (*)

A typical game engine takes 2-3 years to develop, and that's 2-3 generations in terms of graphics hardware. To recoup its development investment the engine may well take another 2-3 years from the first time it hits the market. In other words the devs have to look ~4-6 years ahead. That's quite tough when the underlying hardware has a generational time-scale of 12-18 months.

PC gaming is alive because, and pretty much only because, the platform changes underneath it so rapidly. It has its good times and its bad times, and this isn't so great if you're used to the good times. Wait a while and they may well be back.

(*) I'm trying to run this game on a 2560x1600 monitor using an 8800GTS. This is going to fail, but it's fun, right?!
 
Gaming is not cheap, and what is more frustrating is that there is nothing out there I can buy that would give me satisfaction in terms of performance. I do not like SLI as it is not reliable, produces way too much heat and consumes silly amounts of power.

Oblivion was the same way when it was released. Now you can load it up on an 8800 or even an 8600 and blow it out of the water. Or you can load up some of the texture mods and see why PC gaming really is the place to be. This is how it has always been. A game comes along that blows away what was seen as the high end, and so a new high end is needed.
 