8800 Series and Crysis - Are they just poor DX10 performers?

Maybe it's me, but I hate the motion blur and disable it or put it at the lowest setting.

Yeah, I do too. I use an autoexec.cfg variable to do that so I can leave the rest of the post-processing features enabled.
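In case anyone wants to do the same, the relevant line in my autoexec.cfg is just the one cvar (assuming I'm remembering the name right; it's the dedicated motion blur setting, so the overall post-processing level can stay where it is):

r_MotionBlur = 0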

I like the depth of field effect though.
 
Motion blur's actually one of my favorite effects, especially when the FPS is high (around 40 or above).

The most stunning graphical aspect I have seen so far, though, has gotta be either the water or the 3D rocks.
 
The game was enjoyable on my Opteron 165 with a 2900 XT; the only slow-down I noticed was on the beach. Everything on medium, of course, but this was a LOT better than firing up Far Cry for the first time on my 9600 XT and 2.66GHz Northwood. Far Cry was unplayable in some parts; Crysis seems a far more balanced product.
 
It looks like they benched without AA. Perhaps that's why. In DX10, their benches show 8800 GT outperforming 8800 GTX and HD 2900 outperforming both 8800 GTS cards. DX9 is similar.

http://enthusiast.hardocp.com/article.html?art=MTQxMCwzLCxoZW50aHVzaWFzdA==

AA takes a large perf hit on the 2900 and GF8 series; I found it better to drop AA and instead use higher in-game settings. You could very well use "Medium" settings all around and enable 2x AA if you wanted.

The 8800 GT did not outperform the GTX in my testing, and I found the 2900 XT competing quite well with the GTS in Crysis, but the GT was faster than it for sure, closer to the GTX in performance.
 
Swaaye, don't you ever read Kyle's/Brent's benches? :p
They're not typical apples-to-apples benchmarks; usually their work aims for the highest playable in-game settings for every card.
 
This seems like one of those games where the variety of configurations and settings completely changes what's going on. For every benchmark I read saying the framerates are better with Vista or with an x64 OS, I read one saying it's worse.

Meanwhile, I found it funny that several people on this thread and on another forum I frequent are stating that the demo seems to be CPU-hungry, when other forums have people complaining about why the game is taxing their GPU and barely touching their multi-core processor.

I guess we'll get a better picture once the final version is out.

I also don't think the people saying the lower framerate feels much smoother are living in denial. There seem to be a lot of people online who feel that way. I even have a friend, a huge multiplayer fan who has caught a difference of 5 FPS more than once in my presence, who said he was pretty certain the demo never dropped under 30 FPS on his system and was shocked that it usually wavered around 25. Maybe it has something to do with all the animation taking place in the environment, I have no idea, but I'm much happier being around 30 than I thought I would be.

As for the whole "future-proof" discussion, I'm on the fence. Yes, it's impressive when you can play a game like Far Cry or Doom 3 a year or two in the future and it still manages to look amazing and isn't as dated as a console title would be, and people shouldn't assume that spending a lot of money on their setup is a guarantee that they can blow every single game out of the water with everything on high, especially since Crysis is clearly an exception. It's a technical marvel, and with how many polygons it's pushing at once and the fill-rate demands caused by all the geometry, of course even the highest-end cards are going to struggle with high resolutions and/or AA.

On the other hand, I can see the other side of the issue. With PC gaming not in the greatest of states at the moment, it's hard to argue that games like Crysis look much better than console games while also admitting that the $600 card you spent money on won't actually let you appreciate it. For me, a game that's optimized to run efficiently today is more impressive than one that I have to fiddle with for an hour or so to find the right compromise between looks and playability. Frankly, I see little benefit for either side of the equation. The buyer feels disheartened for getting a game that doesn't meet the expectations created by the screenshots and videos he/she has seen, and the developer who is trying to stay afloat with the high costs of a three-year cycle is releasing to a slew of comments and reviews suggesting that instead of paying for the game when it's $50, you'll probably get the best experience, the way it was meant to be played, when you pay $10 for it out of a used bargain bin a year from now.

So where's Ageia? Sure seems like they should be loving Crysis and how much it pushes physics. Perfect opportunity for them to shine.
Actually, it's probably the opposite. Crysis is pretty much like an advertisement for how you can get realistic physics that's purely software-based.

I think that's one thing that's not getting the attention it deserves. Even though the starting framerate is lower than expected, I've started several chain explosions in the game, and my framerate barely dips by 5 FPS no matter what's happening on-screen. It must be taking advantage of my dual-core.
 
Swaaye, don't you ever read Kyle's/Brent's benches? :p
They're not typical apples-to-apples benchmarks; usually their work aims for the highest playable in-game settings for every card.

Yeah they had me very confused. I looked over it again now and noticed that I had missed that they varied in-game detail settings for each card. Before, I only noticed that they were varying resolution.

Looking for the max perceived playable settings with respect to each card and a specific game is an interesting way to bench.
 
Maybe if you have a dual core, but if you're talking single core then you need to upgrade!

Yeah, got a dual-core 90nm Manchester at 2.5GHz on an nForce4 board. Looks like it'll hold me till Nehalem and I can skip the whole AM2/Conroe/Penryn era, with GPU upgrades along the way :) Interesting that Cevat was hyping quad-cores over duals, but neither shows any appreciable advantage at reasonable settings.
 
....With PC gaming not in the greatest of states at the moment, it's hard to argue that games like Crysis look much better than console games while also admitting that the $600 card you spent money on won't actually let you appreciate it.

It is true that not being able to have everything now is frustrating to humans. Heh. But really, you can play this game on a very wide range of hardware. I have an 8600 GT that cost me $100 and, when overclocked, it can run the game at 1680x1050 on Medium well enough. I also have an 8800 GTX that runs it very well at 1920x1200 HQ.

I'm not really sure what to say about people demanding developers artificially limit the level of technical achievement they want to go for just to appease those who don't want to upgrade and don't want to see their hardware become obsolete.

Maybe Crytek just overestimated what hardware would be able to do by now? This game has been in development for years, after all. Oblivion was the same way, really. It pounded the "top-end" cards of 2006 into the ground without mercy. The PC gaming world didn't come to an end as a result, did it?

If the game looked dated but ran well, people would bitch (UT3 demo for ex). When a game demands more than their hardware can handle, people bitch. Developers really can't win. Personally, I want them to go all-out on their technology and build the best damn games imaginable.

Besides, we can always enjoy more "mainstream" tech in numerous other games. HL2, Hellgate, WOW, etc.
 
Well, to be fair, at least part of the reason Crysis saw a near-linear increase in speed with my CPU overclock is that I'm on an ancient Socket 478 3GHz Prescott rig. No dual core here; the best I can offer is hyperthreading -- whatever that is / isn't worth.

I tried disabling hyperthreading "just to see", but it actually performed worse. So, not doing that again -- it stays on, and it stays at 4.2GHz stably. Obviously someone with a dual-core E6600 at stock vs. overclocked would probably be a better measure of CPU usage.

As for Kuddles, while I agree with some of what you said, this part blatantly stuck out to me:
With PC gaming not in the greatest of states at the moment, it's hard to argue that games like Crysis look much better than console games while also admitting that the $600 card you spent money on won't actually let you appreciate it.
PC gaming doesn't seem (to me) to be in any sort of sorry state; there are TONS of games coming out for the PC. Sure, not every one is a blockbuster, but nor is every console game. And as far as saying it's hard to argue that Crysis looks better than console games? Which console games are you comparing it to? Because I can't think of a single console game that comes close to Crysis even on Medium details at 720p resolution -- and we still have High and Very High settings to talk about later.

Console games are nowhere near where Crysis is on the PC, and in fact they're just now getting to where Far Cry was on the PC three years ago with shader usage, shadows, AI, physics and view distance. But even then, current consoles still have to "hack" with texture resolution, shadow methods and framebuffers to get it all squeezed in -- which then loses AA, or AF, or proper trilinear filtering, or god knows what else.

Console games aren't in ANY danger of approaching PC first-person shooters, and I'd be happy to see an example where you think I'm somehow missing it.
 
Actually, it's probably the opposite. Crysis is pretty much like an advertisement for how you can get realistic physics that's purely software-based.
After seeing what kind of physics-effect overkill Cell is capable of, I'm convinced that CPU power isn't the limiting factor anymore. The software has developed so far that the only thing preventing a fantastically interactive world is RAM. You need to keep the state of all objects (orientation and position at least, and more for non-rigid bodies) for the physics to feel real, or you'll go back to a place and see that the scene has reverted.

A 10,000x10,000 grid of movable objects, for example, is pretty sparse for a city or game level compared to reality, yet that'll take 3GB to keep track of. You'll rarely see more than 100 or 1,000 objects moving at once unless it's a temporary burst of particles, so IMO processing power isn't the problem.
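Just to sanity-check that 3GB figure, here's the back-of-the-envelope version (my assumptions: one position vector plus one orientation quaternion per rigid body, stored as single-precision floats, and nothing else):

objects = 10000 * 10000              # 100 million movable objects
bytes_per_object = (3 + 4) * 4       # 3 floats position + 4 floats quaternion = 28 bytes
total_bytes = objects * bytes_per_object
print(total_bytes / 2**30)           # ~2.6 GiB, i.e. roughly the 3GB quoted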

That's why AGEIA is a dead end.
 
PC gaming doesn't seem (to me) to be in any sort of sorry state; there are TONS of games coming out for the PC. Sure, not every one is a blockbuster, but nor is every console game. And as far as saying it's hard to argue that Crysis looks better than console games? Which console games are you comparing it to? Because I can't think of a single console game that comes close to Crysis even on Medium details at 720p resolution -- and we still have High and Very High settings to talk about later.
I wouldn't say PC gaming is in a sorry state, but it's also not doing very well right now and hasn't for quite a while. Sales are low, and when a multi-platform game comes out, PC versions usually do the worst. And what I was trying to say is that it's hard to defend PC games to another person when, as I said, you can buy the most expensive video card on the market today and you still have to compromise your settings from what the game looked like in a preview.

And Crysis is the exception to the rule. You still need to be on the high end. Most games, like Oblivion, BioShock, etc., only look better on the PC if you play them on the higher settings; turn them down a bit and the 360 versions look just as good, if not better.
 
After seeing what kind of physics-effect overkill Cell is capable of, I'm convinced that CPU power isn't the limiting factor anymore. The software has developed so far that the only thing preventing a fantastically interactive world is RAM. You need to keep the state of all objects (orientation and position at least, and more for non-rigid bodies) for the physics to feel real, or you'll go back to a place and see that the scene has reverted.

A 10,000x10,000 grid of movable objects, for example, is pretty sparse for a city or game level compared to reality, yet that'll take 3GB to keep track of. You'll rarely see more than 100 or 1,000 objects moving at once unless it's a temporary burst of particles, so IMO processing power isn't the problem.

That's why AGEIA is a dead end.

And another reason we really need to move to 64-bit OSes as soon as possible. There won't really be an elegant way to deal with this on 32-bit OSes, IMO.
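To put numbers on why 32-bit can't cope with that sort of object state (assuming the usual 32-bit Windows split, where a process only gets ~2GB of user address space by default):

address_space = 2**32        # 4 GiB of virtual address space on a 32-bit OS
user_space = 2**31           # ~2 GiB usable per process by default on 32-bit Windows
object_state = 3 * 2**30     # the ~3GB of object state from the post above
print(object_state > user_space)   # True - it simply doesn't fit in a single process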

Another reason why, even though I know it would have been financial suicide, I wish MS had only produced a 64-bit version of Vista. **sigh**

BTW - I was happy to note that Hellgate: London comes in both a 32-bit and 64-bit version. So at least some progress is being made there in the gaming world.

I'm wondering if any upcoming UE3 games will offer a 64-bit version.

Regards,
SB
 
I wouldn't say PC gaming is in a sorry state, but it's also not doing very well right now and hasn't for quite a while. Sales are low, and when a multi-platform game comes out, PC versions usually do the worst.

I would be interested to see some cross-platform sales comparisons. It's something I have looked for in the past but have never been able to find. Still, I think the sorry state of PC gaming is greatly exaggerated in the media. Compared to all consoles combined it's certainly low, but I see no reason to compare it against all consoles. It's a separate gaming platform in its own right, just like any console, and I'm willing to bet total PC sales are slaughtering total PS3 sales.

And Crysis is the exception to the rule. You still need to be on the high end. Most games, like Oblivion, BioShock, etc., only look better on the PC if you play them on the higher settings; turn them down a bit and the 360 versions look just as good, if not better.

That may be true, but then again most games don't require all that much horsepower to run on the higher settings and thus look better than the console versions. In fact, it's the very fact that Crysis requires more horsepower that makes it so much better looking than console games. There could quite easily have been higher-end settings added to Oblivion, BioShock, etc. that stressed GTXs at low resolutions.

It's curious that some PC gamers would have seen that as a bad thing, as opposed to playing the game at 100fps on a GTX and having it barely look better than the console version.
 
After all, there was an x64 version of UT2k4, so I would expect UT3 to come with an x64 binary as well...
 
After seeing what kind of physics-effect overkill Cell is capable of, I'm convinced that CPU power isn't the limiting factor anymore. The software has developed so far that the only thing preventing a fantastically interactive world is RAM. You need to keep the state of all objects (orientation and position at least, and more for non-rigid bodies) for the physics to feel real, or you'll go back to a place and see that the scene has reverted.
Theoretically, you could try using your HDD as a cache for this. HDDs are actually very fast nowadays *if* you can just read & write long, sequential chunks of data. Even low-end HDDs can easily achieve 30MB/s, which is 1M objects/second loaded or saved...

For this performance-sensitive task to work, however, you need much more direct access to the HDD than a PC is able to offer. I suspect the PS3's HDD cache might already help here, if used properly. The idea would simply be to have ~15MB chunks that you load when an area becomes visible and save when you are sufficiently far from it again.

Alternatively, you could use NAND for the same task, but it tends to be slower (faster random-access, lower bandwidth...)
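Roughly how those numbers work out, by the way (taking the ~30 bytes of state per object implied by the 1M objects/second figure, and the conservative 30MB/s of sequential throughput):

bytes_per_object = 30                        # implied by 30MB/s -> 1M objects/s
hdd_bandwidth = 30 * 10**6                   # bytes/second, long sequential reads/writes
print(hdd_bandwidth // bytes_per_object)     # 1,000,000 objects loaded or saved per second

chunk_size = 15 * 10**6                      # one ~15MB chunk per visible area
print(chunk_size // bytes_per_object)        # ~500,000 objects of state per chunk
print(chunk_size / hdd_bandwidth)            # ~0.5 s to stream a chunk in or out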

A 10,000x10,000 grid of movable objects, for example, is pretty sparse for a city or game level compared to reality, yet that'll take 3GB to keep track of.
I'm sure 30 bytes/object could be optimized further if that's your goal, however. But I agree that PhysX doesn't really solve either problem: objects can't be streamed in/out via the HDD and I very much doubt the on-card representation of the world is very memory-optimized.
 
I think there's some way to go on the software side. I don't think it's a matter of 'being there' on the software front and just needing the capacity to do more of the same. The physics and interactivity in Crysis are relatively great, in spots as perfect as they need to be (e.g. I just loved the bullets spraying through the leaves of a tree, and their reaction... it seemed really good), but - and I was going to make a point about this in a general Crysis thread - with the graphics being so good, it really highlights when things don't react as perfectly. For example, shooting a plastic chair. It shattering into a few big pieces doesn't really seem like a realistic reaction. That's one thing that stuck out at me, among many things.

I think also, as interfaces improve and become more sophisticated, the physics of today may start to look very inadequate. Imagine being able to map the motion of your hand and its individual digits, and being able to interact with and manipulate objects in the game on that level. I think that would require a big step up on the physics front from where we are now. Imagine tracing water over a surface with your finger, rifling through papers on a desk, digging a hole in the sand with your fingers, picking blades of grass... etc. (some of these examples are very arbitrary, but I intend to convey the granularity and sophistication of interactions that would be required, and at a certain level of detail). Currently, in many games (Crysis included at times), you often feel like a big box walking around in terms of your own physical interaction with the world (guns aside)... that ought to get a whole lot more granular and more detailed in time.

I think also the simulation of human movement and behaviour is going to become more detailed...that's another thing that sticks out in Crysis, the behaviour of the humans. Versus other games, it's not bad at all, of course, but versus where one might dream of us being in the future...

Then beyond the physical motion of objects, things like biological interactions and simulations could be explored more fully.

I think software, offline and realtime, still has a way to go on the simulation front, particularly in the nearer term in things like realistic destruction of different materials among other things. Maybe it depends how picky you are, but I want us to go further, absolutely. It's certainly not just a hardware power issue, but a huge software challenge remains IMO.

On the original topic, I got an 8800 GT OC'ed to 670MHz yesterday - the 'optimal' automatic settings on my rig with this card are 1024x768 with all settings set to Very High. It's playable with those settings. 1680x1050 with a mix of Medium, High and Very High settings is playable, but it's hard to go back from everything being on Very High...
 
IMO, getting playable frame rates at Very High, 16x10 with 4xAA/16xAF will require quad-SLI 8800 GTs. How easy or hard would it be for NVIDIA to produce a single-GPU card with the performance of 4x 8800 GT?
 
I think there's some way to go on the software side. I don't think it's a matter of 'being there' on the software front and just needing the capacity to do more of the same. The physics and interactivity in Crysis are relatively great, in spots as perfect as they need to be (e.g. I just loved the bullets spraying through the leaves of a tree, and their reaction... it seemed really good), but - and I was going to make a point about this in a general Crysis thread - with the graphics being so good, it really highlights when things don't react as perfectly. For example, shooting a plastic chair. It shattering into a few big pieces doesn't really seem like a realistic reaction. That's one thing that stuck out at me, among many things.
I think Crysis is an example of physics, environmental interactivity and AI running into the same problem that graphics have already started having, a.k.a. the uncanny valley. The world just feels so alive that the parts that reveal it's only a computer game are that much easier to point out. The more open-ended you make a game, and the more ways the player can interact with the world, the more you have to second-guess how different people are going to play it.

On the original topic, I got an 8800 GT OC'ed to 670MHz yesterday - the 'optimal' automatic settings on my rig with this card are 1024x768 with all settings set to Very High. It's playable with those settings. 1680x1050 with a mix of Medium, High and Very High settings is playable, but it's hard to go back from everything being on Very High...
Yes, I'm very happy with my performance as well, but I'm aching for the next generation of NVIDIA cards to come out, so I can see what this game would feel like if everything was on Very High, AA was 8x or better, and the framerate was always closer to 60 than not.

I can see them sitting on the GeForce 9 for a while, though. With the 8800 GTX blowing every game away save this one, and the very affordable 8800 GT being very close behind, they have no reason to bring it out until they see ATI's response.
 