Alan Wake: Microsoft preparing to leave PC gamers behind (again)

Yeah, on my unlocked 6950 2GB, the Crysis 2 Ultra object detail setting kills performance. It quite literally becomes a slideshow sometimes. Particularly when near a brick wall. It also doesn't look all that realistic - just super lumpy.
 
I knocked the tessellation factor down to 8 in the AMD CCC and brick walls won't slow down the game anymore, even with all the sliders maxed out, the texture pack installed and DX11 features enabled. My CF'd 6970s still aren't enough to run the game at my monitor's native res though, so I have to knock the resolution down to 1080p or it becomes unresponsive, but at that res it's quite fluid really.
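For a sense of why capping the factor helps so much: the tessellator's triangle output grows roughly with the square of the tessellation factor, so dropping from the DX11 maximum of 64 down to 8 cuts the generated geometry by around 64x. A minimal sketch of that arithmetic (the patch count and the quadratic approximation are illustrative assumptions, not measured figures):

```python
# Rough sketch: why capping the tessellation factor in CCC helps so much.
# For a triangle patch with integer tessellation factor f, the tessellator
# emits on the order of f**2 triangles (the quadratic growth is the key
# point; exact counts depend on the partitioning mode).
def approx_triangles(patches, factor):
    """Approximate triangle count after tessellation."""
    return patches * factor ** 2

wall_patches = 10_000          # hypothetical patch count for a brick wall
full = approx_triangles(wall_patches, 64)   # DX11 maximum factor
capped = approx_triangles(wall_patches, 8)  # the CCC cap from the post

print(full // capped)  # geometry load drops by ~64x
```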

Hopefully next gen, AMD will progress beyond 2 vertices/clock for geometry processing and then tessellation performance will really take off. The 7900 series is a cool chip, but not really enough to warrant an upgrade from Cayman IMO, not with the current price tag anyway.
 
Hopefully devs just don't do stupid tessellation of walls instead. :rolleyes:

Seconded and thirded. :p

Or over-tessellation of geometry (Hawx 2).

I'm going to hope most of these gaffes are just down to tessellation being a relatively new technology, with fairly limited development time spent on it so far, rather than Nvidia helping developers implement tessellation and purposely over-tessellating things that require little to none.

Even if I were currently running an Nvidia card, the over-tessellation of those things would infuriate me, as my card would be burning through power for no good reason. And on Nvidia I wouldn't even have the option to limit the maximum tessellation factor to reduce that idiotic waste of power.

Regards,
SB
 
It better look significantly better than the 360 version to achieve performance like that!


Well, 1080p is basically 4x the pixel requirements of 360 version. With FP16 on everything @ full res post-processing, the reqs would be > double? Let's be generous and say that's a minimum 8x increase in power (across all aspects of pixel throughput, so shading, texturing, ROPs) needed @ ~30fps.

Hmm... so I'm not really surprised to see the "6870 to 5870 group" perform as they do in the chart (27~31fps min) - it's a fairly ballpark expectation when you factor in clocks and the architectural improvements since Xenos.

The jump to 2560x1600 isn't exactly halving the framerate, but it seems fairly close in the grand scheme (minimum fps). Would be curious to see a PIX grab. :p
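The pixel arithmetic above is easy to sanity-check (resolution figures are the ones quoted in the thread; the helper name is just illustrative):

```python
# Back-of-the-envelope pixel math for the resolutions discussed above.
def pixels(w, h):
    return w * h

x360_low = pixels(960, 544)    # Xbox 360's sub-HD render target
x360_hd  = pixels(1280, 720)   # the parts rendered at 720p
pc_1080p = pixels(1920, 1080)
pc_1600p = pixels(2560, 1600)

print(round(pc_1080p / x360_low, 2))   # ~3.97, "basically 4x"
print(round(pc_1080p / x360_hd, 2))    # 2.25x vs the 720p parts
print(round(pc_1600p / pc_1080p, 2))   # ~1.98x, so near-halved fps is plausible
```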
 
Here are the settings compared to the XBOX360 version:
http://forum.alanwake.com/showthread.php?t=7725

How these were on the Xbox360 build:

Resolution - a mix of 960x544 and 1280x720, and some things, like fog particles, were rendered at half resolution on Xbox that are rendered at full resolution on PC.

Vsync - Xbox had "smart vsync" i.e. Vsync on if frame rate above 30, and off if below 30. PC doesn't support a mode like this. American Nightmare Xbox360 btw has a selectable Vsync (on, off, smart).

Antialiasing and FXAA - We had 4xAA on Xbox. FXAA was added to American Nightmare.

Anisotropic Filtering - Aniso was used on Xbox for some textures where it was important (like the road texture with the yellow line); most were only trilinear filtered.

Shadow Quality - Xbox360 is somewhere around the medium setting. The rendering differs a bit so no exact comparisons can be made.

SSAO Quality - Xbox360 build had something like the Low setting - maybe with a bit less noise.

Backdrop Quality - Xbox360 Alan Wake used Medium.

GodRay Quality - Xbox360 build had this turned off except for some specific scenes where we had the performance to spare and it had the most visual impact. The quality was less than "High".

Volumetric Light Quality - Xbox used the "Low" setting.

Draw Distance - Xbox build had this at Full - it was mainly added as an optimization. Draw calls are way more expensive on the PC than they are on the Xbox360, and this reduces draw calls.

LOD Distance - Xbox build had this around the middle mark.

FOV - Middle / default is the one Xbox360 version used too.
 
Oh nice. :)

Wonder what they mean by mix of 960x544 and 1280x720. The pre-rendered cut-scenes were 720p. >_>
 
Thoughts
-360 had little aniso. That machine seems to get hit hard by it. I've seen bilinear filtering in 360 games. This seems strange when you consider Radeon cards since the beginning have had optimizations to minimize aniso hit. What is up with Xenos?
-PC draw calls cost far more. Thanks DX9?

Ok, so now it's time for someone to benchmark the game on various PC graphics cards at those settings so we can see how efficient the PC version of the engine is. I'd be especially interested in older GPUs like the GF 7900GTX and Radeon X1900 XT.
Most recent games end up not working on my old X1950 because the drivers are 2 years old and developers don't test the cards. NV on the other hand is still supporting GF6.
 
-PC draw calls cost far more. Thanks DX9?

I guess it's easier to just add an option to dial down the draw distance than to port the game properly to DX11, even though there's probably no one out there trying to play this on a DX9-level chip (and very, very few on XP as well).
 
-360 had little aniso. That machine seems to get hit hard by it. I've seen bilinear filtering in 360 games. This seems strange when you consider Radeon cards since the beginning have had optimizations to minimize aniso hit. What is up with Xenos?

AF Texture ops hit main memory bandwidth (GDDR3).
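A rough model of why aniso hurts there: under the common approximation that a trilinear sample reads 8 texels and Nx aniso takes up to N trilinear probes along the line of anisotropy, 16x AF can multiply worst-case fetch traffic by 16 - a lot of extra trips to GDDR3 when the texture cache is tiny. A sketch of that (a simplified textbook model, not Xenos-specific numbers):

```python
# Simplified texel-fetch model behind the bandwidth argument: a trilinear
# sample reads 8 texels (2x2 from each of two mip levels); Nx anisotropic
# filtering takes up to N trilinear probes along the line of anisotropy.
TEXELS_PER_TRILINEAR = 8

def worst_case_texels(aniso_level):
    """Worst-case texels fetched for one shaded sample."""
    return aniso_level * TEXELS_PER_TRILINEAR

print(worst_case_texels(1))   # 8 texels (plain trilinear)
print(worst_case_texels(16))  # 128 texels: up to 16x the fetch traffic,
                              # painful when cache misses go out to GDDR3
```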
 
On DX11 you can definitely do a decent job of reducing draw call overhead by combining a lot of stuff into a single draw (using texture arrays, instancing, etc.), but it takes some thought. It's hard to bolt on to a DX9 engine, so ports will typically suffer.

Like the PS3 argument, going in the other direction is easy... so design for PC first and port to consoles in this case? :p
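The batching argument can be put in numbers with a toy CPU-side cost model (all the per-call overheads below are made-up illustrative values, not measurements of any real driver):

```python
# Toy cost model for draw-call overhead. The per-call figures are purely
# illustrative assumptions to show the shape of the problem, not data.
PER_CALL_US_PC_DX9 = 20.0  # assumed driver/runtime overhead per DX9 draw
PER_CALL_US_X360   = 2.0   # assumed near-to-metal console overhead

def cpu_ms(objects, per_call_us, instanced=False):
    """CPU time spent issuing draws for `objects` identical meshes;
    instancing collapses them all into a single call."""
    calls = 1 if instanced else objects
    return calls * per_call_us / 1000.0

trees = 3000  # e.g. a forest scene full of identical foliage meshes
print(cpu_ms(trees, PER_CALL_US_PC_DX9))                  # 60.0 ms: frame-breaking
print(cpu_ms(trees, PER_CALL_US_X360))                    # 6.0 ms: fine at 30fps
print(cpu_ms(trees, PER_CALL_US_PC_DX9, instanced=True))  # 0.02 ms
```

Which is why culling via the draw distance slider is the cheap fix, and instancing/texture arrays are the proper DX11 one.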
 
AF Texture ops hit main memory bandwidth (GDDR3).
Apparently the 32kB texture cache doesn't help either. (1, 2, 3)

It would be nice to check whether they still use A2C on all foliage in the PC version; it looked surprisingly good on X360.
 