Is Crysis Shader Model 2.0-based?

Because Crysis is extremely heavy on the GPU even without any SM4 effects, and it has a DX10 renderer that doesn't use any of DX10's features.
I'm not talking about textures (even my GeForce 6200 can handle them at their max res). Right now I can play only in Sketch Mode (lower than Low), though I can set Textures, Physics and Sound to Very High. If it were up to me, I would prevent any contact between software houses and hardware developers ($$$).

What are you talking about? I don't think we're talking about the same thing. Let me help you a bit more:

Albuquerque said:
I don't see why anyone would think NVIDIA has anything to do with the number of SM2 vs SM3 compiled shaders.

See that part in big bold letters? Yeah. That's the part that I'm having a hard time understanding. What does your response have to do with what I asked? I assume you MEANT to quote me and were intending to answer my question; if not, then please forgive my confusion.
 
Albuquerque said:
What are you talking about? I don't think we're talking about the same thing. Let me help you a bit more:

I don't see why anyone would think NVIDIA has anything to do with the number of SM2 vs SM3 compiled shaders.

See that part in big bold letters? Yeah. That's the part that I'm having a hard time understanding. What does your response have to do with what I asked? I assume you MEANT to quote me and were intending to answer my question; if not, then please forgive my confusion.

Don't feed the...erm, you know how that continues.
 
Whoa! I apologize, sir!
I was just pointing out that Crysis doesn't have a real DX10 engine, and that NVIDIA has surely helped Crytek financially at the cost of a very unoptimized game.
 
Because Crysis is extremely heavy on the GPU even without any SM4 effects, and it has a DX10 renderer that doesn't use any of DX10's features.
I'm not talking about textures (even my GeForce 6200 can handle them at their max res). Right now I can play only in Sketch Mode (lower than Low), though I can set Textures, Physics and Sound to Very High. If it were up to me, I would prevent any contact between software houses and hardware developers ($$$).
your 6200 can handle the textures at full res huh? :LOL:
cute
 
your 6200 can handle the textures at full res huh? :LOL:
cute

Yeah! It's not a lie! The problem is shading, postprocessing, lighting and volumetric effects. Oh! And I play at 640x480 windowed, 'cause I suffer from motion sickness. I can't handle an FPS at 1440x900, but that display is wonderful for desktop usage.
 
Yeah! It's not a lie! The problem is shading, postprocessing, lighting and volumetric effects. Oh! And I play at 640x480 windowed, 'cause I suffer from motion sickness. I can't handle an FPS at 1440x900, but that display is wonderful for desktop usage.
640x480 eh..
no wonder...
 
Don't feed the...erm, you know how that continues.

I really only prefer to bait them into fully showing their colors. And this one seems to do it well :)

I was just pointing out that Crysis doesn't have a real DX10 engine
And you deduced this how? By disassembling the executable? By using a shader analyzer on every shader available for every level? By writing your own SM4 shader, importing it into a level you built yourself, and observing the outcome?

I think not.

and that NVIDIA has surely helped Crytek financially
And again, you deduced this how? I know people love to sling big words at "faceless" companies, but I doubt anyone at NVIDIA wrote a check to Crytek for their 4-second intro movie. Rather, Crytek had the chance to sit down with development staff from NVIDIA to write shader code. I'm sure that much is true... The rest of your statement is simple trolling.

Oh, and how do you know the shaders are unoptimized? Or that the engine is unoptimized? What specific analysis led you to this understanding of how unoptimized Crysis really is?
 
I really only prefer to bait them into fully showing their colors. And this one seems to do it well :)

I was just pointing out that Crysis doesn't have a real DX10 engine

And you deduced this how? By disassembling the executable? By using a shader analyzer on every shader available for every level? By writing your own SM4 shader, importing it into a level you built yourself, and observing the outcome?

I think not.

and that NVIDIA has surely helped Crytek financially

And again, you deduced this how? I know people love to sling big words at "faceless" companies, but I doubt anyone at NVIDIA wrote a check to Crytek for their 4-second intro movie. Rather, Crytek had the chance to sit down with development staff from NVIDIA to write shader code. I'm sure that much is true... The rest of your statement is simple trolling.

Oh, and how do you know the shaders are unoptimized? Or that the engine is unoptimized? What specific analysis led you to this understanding of how unoptimized Crysis really is?

In all fairness, NVIDIA invested significant amounts in Crysis: engineers sitting there working with the Crytek guys, plus top-end cards and development systems, don't come cheap. But so what? More power to them for being active and ensuring the end user gets the best possible experience AND that their cards are painted in the best possible light (although Crytek seems to have screwed them with SLI initially). I fail to see why this has gotten so many panties in a bunch :)
 
Nvidia did recommend it in their older GPU guide. I think it is more relevant to the older generations like GFFX, where they could trade precision for speed with lower shader models. They haven't published a new guide for a long time.

Yeah...I went and found the guide I was thinking of and read through it, and it looks like a lot of the shader optimization tips it mentions were more targeted at the FX. This was the quote I was thinking of, from this guide:

Nvidia GPU Programming Guide said:
-Choose the minimum pixel shader version that works for what you’re doing
-When developing your shader, it’s okay to use a higher version. Make it work first, then look for opportunities to optimize it by reducing the pixel shader version.

I guess the guide seemed much more relevant when I was pulling it out of my brain's dusty memory banks. :oops: Thanks for clearing that up, Humus and co.
 
The game is really quite ugly on a SM2b-limited X800 series card, so I'd say it's SM3. Quite a few of the effects do not render at all and it looks worse than FarCry IMO.
 
The game is really quite ugly on a SM2b-limited X800 series card, so I'd say it's SM3. Quite a few of the effects do not render at all and it looks worse than FarCry IMO.

If you mean shadows and such, the game doesn't render them, while the last beta did. I'm guessing that it's driver problems.
 
If you mean shadows and such, the game doesn't render them, while the last beta did. I'm guessing that it's driver problems.
The look of the cloak is quite different, too. I can't remember everything I noticed anymore. I tried it on a X800XL months ago and it just looked quite shoddy, honestly. Turning up quality settings didn't help. Lots of bugs if that's what they are....
 