Alan Wake: Microsoft preparing to leave PC gamers behind (again)

Or tessellating water under the city 100% of the time, even in levels where there's none to see.

That's somewhat understandable, however. I'm assuming it's just a generic tessellation setup applied to all "ocean" water. It's fairly common practice when creating a 3D scene with an ocean to have a base "layer" of water, and then you layer geometry on top of that.

In theory the water underneath shouldn't have any effect, as hardware z-culling (or similar occlusion hardware) should prevent the graphics card from rendering any of the occluded water.

However, if you turn on wireframe mode so you can see it, the hardware then has to render the wireframe.
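To be concrete about what wireframe mode does and doesn't prove, here's a minimal D3D11 sketch (not Crytek's code, obviously): wireframe is nothing more than a rasterizer fill-mode switch, so the exact same tessellated patches are submitted either way, and a wireframe capture shows every generated triangle, including ones whose pixels would never survive depth testing in the normal solid view.

```cpp
// Minimal sketch, not Crytek's code: in D3D11 "wireframe" is just a rasterizer
// fill-mode change. The same patches run through the vertex/hull/domain stages
// either way, so a wireframe capture shows every tessellated triangle, even
// ones whose pixels would be depth-rejected in the normal solid view.
#include <d3d11.h>

ID3D11RasterizerState* CreateWireframeState(ID3D11Device* device)
{
    D3D11_RASTERIZER_DESC desc = {};
    desc.FillMode        = D3D11_FILL_WIREFRAME;  // vs. D3D11_FILL_SOLID
    desc.CullMode        = D3D11_CULL_NONE;       // debug views usually show back faces too
    desc.DepthClipEnable = TRUE;

    ID3D11RasterizerState* state = nullptr;
    device->CreateRasterizerState(&desc, &state); // returns an HRESULT, ignored in this sketch
    return state;
}

// Usage: context->RSSetState(wireframeState); then issue the exact same draw calls.
```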

The excessively tessellated flat surfaces are the problem with C2.

Regards,
SB
 
I believe the problem was that the performance impact was there just as if you were looking at the ocean even though it wasn't visible.
 
Yep, the tessellation mesh for the ocean was present under maps that were nowhere near the ocean.
 
Crysis 2 is infected with TWIMTBP...

Do we really need to make up excuses for why there are highly tessellated flat surfaces and non-present water, which cause heavy performance drops on DX11 AMD GPUs?
 
I don't think it's evil bias. I think the DX11 patch was just a quick hack job to appease the PC market. I doubt that NV specified heavy tessellation to be used in ways the player can't even see. Even if Crytek had gone out of their way to use tessellation effectively, AMD pre-7900 cards would still be at a significant disadvantage compared to GF100/110 because they are just plain slower with geometry.
 
Riiiight...
 
Let me come at it from another angle - Crytek voluntarily shit on the PC market with the initial release of Crysis 2, and the DX11 patch likely included some more of that half-assed magic. NVIDIA was very involved with Crysis 1 too, btw, providing prerelease G80 hardware and development assistance. The relationship surely goes back to FarCry or even earlier.

It's no secret though that a lot of developers like NV more, and it's interesting to hear why that is directly from the devs. The reasons are really in plain sight (like, see RAGE). It's unfortunate really, and ridiculous, but that is how it is.
 
All of that is true, but it doesn't explain why there's highly tessellated water being rendered in levels without water, or why simple structures like concrete slabs are pushing millions of polygons without any discernible difference between tessellated and non-tessellated (well, there's those handles, lol).
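For a sense of scale, here's some back-of-the-envelope arithmetic (illustrative numbers only, nothing measured from Crysis 2): a quad-domain patch tessellated at a uniform integer factor F comes out to roughly 2*F*F triangles, and D3D11 allows factors up to 64, so flat geometry tessellated anywhere near the limit gets expensive very quickly.

```cpp
// Illustrative arithmetic only; the patch count is a made-up number, nothing
// here is measured from Crysis 2. A quad-domain patch at uniform integer
// factor F tessellates into roughly an FxF grid of quads, i.e. about 2*F*F
// triangles, and D3D11 allows tessellation factors up to 64.
#include <cstdio>
#include <initializer_list>

int main()
{
    const long long patches = 200;               // hypothetical patch count for one flat slab
    for (int factor : {1, 8, 16, 32, 64}) {
        long long trisPerPatch = 2LL * factor * factor;
        std::printf("factor %2d: ~%5lld tris/patch, ~%8lld tris for %lld patches\n",
                    factor, trisPerPatch, trisPerPatch * patches, patches);
    }
    return 0;
}
```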

Of course many developers prefer nVidia; the company sends them a task force of code monkeys for free... I have no problem with that, I just wish the code monkeys' agenda didn't include sabotaging performance and features on solutions from the competition...

You can still welcome the sand being thrown in your eyes and excuse everything the TWIMTBP infections have brought:
- Batman: Arkham Asylum with AA artificially locked out on AMD GPUs,
- Assassin's Creed losing DX10.1 support through a patch that fixed nothing and only lowered performance on AMD GPUs,
- Ubisoft refusing to include tessellation optimizations for AMD GPUs in the HAWX 2 demo,
- Software PhysX being compiled as x87 code in order to make CPUs look weak for physics calculations (see the sketch after this list).
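On that last point, here's a tiny illustrative example (not PhysX source) of the kind of hot floating-point loop where the x87-versus-SSE distinction shows up: the function names and the integration step are made up for illustration.

```cpp
// Illustrative only, not PhysX source: the x87-vs-SSE complaint is about code
// generation for hot floating-point loops like this one. Compiled as x87, the
// scalar version pushes values through the legacy FPU stack one at a time; the
// SSE version below handles four floats per instruction.
#include <xmmintrin.h>
#include <cstddef>

void integrate_scalar(float* pos, const float* vel, float dt, std::size_t n)
{
    for (std::size_t i = 0; i < n; ++i)
        pos[i] += vel[i] * dt;                   // one float at a time
}

void integrate_sse(float* pos, const float* vel, float dt, std::size_t n)
{
    const __m128 vdt = _mm_set1_ps(dt);
    std::size_t i = 0;
    for (; i + 4 <= n; i += 4) {                 // four floats per iteration
        __m128 p = _mm_loadu_ps(pos + i);
        __m128 v = _mm_loadu_ps(vel + i);
        _mm_storeu_ps(pos + i, _mm_add_ps(p, _mm_mul_ps(v, vdt)));
    }
    for (; i < n; ++i)                           // scalar tail
        pos[i] += vel[i] * dt;
}
```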

The list goes on and on; there are tens of examples like these from the past 9 years or so, all with TWIMTBP games. There's even this recent event of "new users" registering on ChipHell during the Radeon HD7900 release, just to make single posts claiming Kepler will be faster and that HD7900 is bad. For me, it's a given fact that there's a large portion of that company making a living out of dealing low blows.

But if you still want to believe Crytek just accidentally the concrete slabs and the water then go ahead...



Regarding RAGE, I just follow a simple logic: if hundreds of PC games come out every year and are playable on AMD hardware on day one, then what happened is 99.9% id's fault.
AMD may actually have shipped the wrong driver at the time, but no developer in their right mind releases a game that doesn't work on half the gaming computers out there at launch. The lack of professionalism (and common sense, really) from id was atrocious, IMO. They can whine all they want and try to blame AMD for what happened, but a whining incompetent is still incompetent.
 
popcorn.gif
 
TTT, sure NV has done lots of evil with some TWIMTBP games. But with Crysis 2, everything works on AMD, plus there is still a performance hit on NV cards from the ridiculous tessellation work. So leaping to the conclusion that it was more than Crytek being sloppy doesn't really work IMO. There's no exclusive benefit to NV customers. On the other hand, we know Crytek's feelings about the PC market and I see little reason they would spend lots of extra time to rework assets for us and not take shortcuts.

Plus, the 6900 has the geometry performance of a 560, so the results would probably look the same even if they had tessellated every object with loving care.

But anyway, I find Crysis 2 super boring so I'm going back to Alan Wake now... :)
 
Speaking of DirectX 11, have we seen a single game that proves its worth (DX11 over DX9, let alone DX10)?
I think we're still stuck with DX9.
 
Metro 2033 perhaps.

Rage requires features on DX10+ cards.

And yes Crysis 2 DX11 does have a few neat effects but the texture pack was more valuable I think.
 
Regarding RAGE, I just follow a simple logic: if hundreds of PC games come out every year and are playable on AMD hardware on day one, then what happened is 99.9% id's fault.
AMD may actually have shipped the wrong driver at the time, but no developer in their right mind releases a game that doesn't work on half the gaming computers out there at launch. The lack of professionalism (and common sense, really) from id was atrocious, IMO. They can whine all they want and try to blame AMD for what happened, but a whining incompetent is still incompetent.

Let's examine the facts.

1. AMD has historically had moderate to severe trouble with OGL, let alone stuff that pushes the OGL boundaries.

2. There's been a Cayman bug in Quake 4 since the card was released. swaaye uncovered it late last year in the 3d drivers/etc forum here at Beyond3D; it's been there well over a year.

3. Very few of id's patches have had anything to do with fixing AMD problems, but there has been a steady stream of drivers from AMD fixing RAGE problems.

4. Nvidia wasn't spotless on RAGE either, but it quickly fixed most problems (you can never fix all problems in software, sadly...).

So, I can't really see anything to back up your assertion that it was all id's fault.
 
Speaking of DirectX 11, have we seen a single game that proves its worth (DX11 over DX9, let alone DX10)?
I think we're still stuck with DX9.
Um... Battlefield 3? I'm still waiting for people to benchmark/notice things like the massively better performance of deferred MSAA in DX11 vs DX10 there ;)

And if you're comparing to DX9 it's even more stark... what they do on PC in DX11 is so far ahead of the consoles it's not funny. It barely looks like the same game when I see console footage.

But I do agree that there are too few games that are actually taking advantage of DX10/11.
 
2. There's been a Cayman bug in Quake 4 since the card was released. swaaye uncovered it late last year in the 3d drivers/etc forum here at Beyond3D; it's been there well over a year.
I emailed AMD about that and got a "thanks for letting us know!" response. ;)
 
Metro 2033 perhaps.

Rage requires features on DX10+ cards.

And yes Crysis 2 DX11 does have a few neat effects but the texture pack was more valuable I think.
IMHO, Metro 2033 is perhaps not a good example of DX11. Doesn't it crawl on almost every DX11-capable card? I mean, there's no single card capable of producing playable frame rates in this DX11 game.

Also, I thought Rage for PC was essentially the same as Rage for XBOX360, minus texture resolution?

Um... Battlefield 3? I'm still waiting for people to benchmark/notice things like the massively better performance of deferred MSAA in DX11 vs DX10 there ;)

And if you're comparing to DX9 it's even more stark... what they do on PC in DX11 is so far ahead of the consoles it's not funny. It barely looks like the same game when I see console footage.

But I do agree that there are too few games that are actually taking advantage of DX10/11.
Ah... BF3. How could I miss that? The PC version does not support DX9, if I'm not mistaken.
The difference between BF3 for consoles (X360 & PS3) and BF3 for PC must be significant. (I've never seen the console version.)
 
I thought Rage for PC was essentially the same as Rage for XBOX360, minus texture resolution?
I tried to run it on an old X1950 and one of the missing GL extensions that caused it to not even load was one that appeared in OpenGL 3.0 (DX10 era hardware). So it seems you need R600/G80 or newer.
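For anyone curious what that kind of extension gating looks like in practice, here's a minimal sketch of the classic pre-GL3 startup check, assuming a current OpenGL context already exists; the extension named at the bottom is just a hypothetical DX10-era example, not something pulled from RAGE.

```cpp
// Minimal sketch, assuming a current OpenGL context already exists: the classic
// pre-GL3 way an engine can probe for a required extension at startup and fail
// with a readable message instead of refusing to load later. The extension name
// below is a hypothetical DX10-era example, not taken from RAGE.
#include <GL/gl.h>
#include <cstring>
#include <cstdio>

bool has_gl_extension(const char* name)
{
    const char* all = reinterpret_cast<const char*>(glGetString(GL_EXTENSIONS));
    if (!all) return false;
    const std::size_t len = std::strlen(name);
    for (const char* p = all; (p = std::strstr(p, name)) != nullptr; p += len) {
        const bool starts = (p == all) || (p[-1] == ' ');      // match whole tokens only
        const bool ends   = (p[len] == ' ') || (p[len] == '\0');
        if (starts && ends) return true;
    }
    return false;
}

// e.g.: if (!has_gl_extension("GL_ARB_texture_buffer_object"))
//           std::fprintf(stderr, "GPU/driver too old for this renderer\n");
```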
 
I'd say the best Dx11 games thus far are Crysis 2 with the Dx11 patch, BF3, and Civilization 5 (compute, tessellation and MRTs, although AMD still doesn't support MRTs).

There are also other cases where Dx11 is used for a speed boost over Dx9 rather than for obvious graphical techniques.

For Dx10, Just Cause 2 is a fantastic showcase.

I believe the problem was that the performance impact was there just as if you were looking at the ocean even though it wasn't visible.

I'm still unclear on how something that isn't being rendered by the hardware could have any performance impact whatsoever. If we're going by wireframe analysis, then that is a flawed analysis. And it isn't as if we can turn the water on and off to see what impact, if any, it has. Add to that the fact that the water isn't tessellated to nearly the extent of many of the flat surfaces in the game, and I just don't think the non-rendered water has any effect on performance, tessellation or not.
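We obviously can't toggle the water in Crysis 2 itself, but for anyone instrumenting their own renderer, this is roughly how you'd bracket a suspect draw with D3D11 timestamp queries and measure its GPU cost directly, visible pixels or not (sketch only, error handling and query pooling omitted).

```cpp
// Sketch only, assuming access to the renderer (which we obviously don't have
// for Crysis 2): bracket the draw call(s) you suspect with D3D11 timestamp
// queries to see what they actually cost on the GPU, visible pixels or not.
#include <d3d11.h>

struct GpuTimer {
    ID3D11Query* disjoint = nullptr;
    ID3D11Query* begin = nullptr;
    ID3D11Query* end = nullptr;

    void Create(ID3D11Device* dev) {
        D3D11_QUERY_DESC qd = {};
        qd.Query = D3D11_QUERY_TIMESTAMP_DISJOINT;
        dev->CreateQuery(&qd, &disjoint);
        qd.Query = D3D11_QUERY_TIMESTAMP;
        dev->CreateQuery(&qd, &begin);
        dev->CreateQuery(&qd, &end);
    }

    // Wrap the draw call(s) you want to measure between Start() and Stop().
    void Start(ID3D11DeviceContext* ctx) { ctx->Begin(disjoint); ctx->End(begin); }
    void Stop(ID3D11DeviceContext* ctx)  { ctx->End(end); ctx->End(disjoint); }

    // Poll some frames later; returns milliseconds, or a negative value if not ready.
    double ReadMs(ID3D11DeviceContext* ctx) {
        D3D11_QUERY_DATA_TIMESTAMP_DISJOINT dj = {};
        UINT64 t0 = 0, t1 = 0;
        if (ctx->GetData(disjoint, &dj, sizeof(dj), 0) != S_OK || dj.Disjoint) return -1.0;
        if (ctx->GetData(begin, &t0, sizeof(t0), 0) != S_OK) return -1.0;
        if (ctx->GetData(end,   &t1, sizeof(t1), 0) != S_OK) return -1.0;
        return double(t1 - t0) * 1000.0 / double(dj.Frequency);
    }
};
```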

IMHO, Metro 2033 is perhaps not a good example of DX11. Doesn't it crawl on almost every DX11-capable card? I mean, there's no single card capable of producing playable frame rates in this DX11 game.

The biggest fault of Metro 2033 was the ridiculous, non-adjustable shadowmap size that was forced on the user when tessellation for Dx11 was enabled. That's something Nvidia hardware could handle slightly better than AMD, especially since Nvidia's enthusiast cards at the time of Metro 2033's launch had more graphics memory than their counterparts (1.5 GB versus 1 GB).
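Just to put numbers on why shadowmap size matters for the 1 GB versus 1.5 GB comparison, here's some quick arithmetic with hypothetical resolutions (the actual size Metro 2033 forces isn't stated here):

```cpp
// Back-of-the-envelope only; the resolution Metro 2033 actually forces isn't
// stated here, so these sizes are hypothetical. The point is just how quickly
// shadow-map memory eats into 1 GB of VRAM versus 1.5 GB.
#include <cstdio>
#include <initializer_list>

int main()
{
    const int bytesPerTexel = 4;                 // a typical 32-bit depth format
    for (int size : {2048, 4096, 8192}) {        // hypothetical shadow-map resolutions
        double mib = double(size) * size * bytesPerTexel / (1024.0 * 1024.0);
        std::printf("%d x %d shadow map: ~%.0f MiB\n", size, size, mib);
    }
    return 0;
}
```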

Regards,
SB
 
IMHO, Metro 2033 is perhaps not a good example of DX11. Doesn't it crawl on almost every DX11-capable card? I mean, there's no single card capable of producing playable frame rates in this DX11 game.

I'm running Metro 2033 in DX11 with everything on very high at 1680x1050 (only tessellation and advanced DoF turned off) and it's 30+ fps most of the time on my GeForce 560 Ti.
 