The Game Technology discussion thread *Read first post before posting*

I can't call it nice: heavy aliasing, blur, lots of ridiculously low-res textures, a strangely choppy framerate, and often low-res shadows that flicker oddly.
Really, I'm curious whether Rockstar will ever develop a PS3 game that doesn't have framerate issues and uses MLAA [it's really good on PC in GTA 4]. I love their games, but I just can't stand a shitty framerate, and they just seem to hate the PC lately ;\. Oh man, it's so frustrating, because the LA Noire concept is so great.

Rockstar isn't developing LA Noire, Team Bondi is.
 
I recently bought Halo Reach, which I've read in this thread is considered a high technical achievement. That may be true, but the results aren't clearly apparent to me, as it didn't impress me as much as I'd hoped. It looks kind of flat and rough in places so far. Mind that I haven't progressed much, though, so I'm not sure if I'm missing something important that I'll encounter later.
Water effects seem to have been scaled down a bit compared to Halo 3, and I see lots of aliasing.
There is a noticeable increase in polygons on the models compared to 3, though, and the lighting is much improved. Still, I think the enemy models could use some more work: some textures on them are missing normal maps and appear slightly flat and blurry.

The first thing I thought of when I played Halo Reach was another game I had played earlier with a similar scale, and I wondered whether Bungie could have pushed Reach's visuals further.

I wonder if it's OK to discuss that other game as well (since 1) it wasn't talked about much and 2) Digital Foundry missed it) and compare the two from a technical standpoint: how they did it, what sacrifices they made to achieve specific results, and what areas they could borrow from each other. I don't want to make it another unhealthy comparison discussion, though.

Is it ok with the mods?
 
It's only viable for certain types of games:
- games willing to put less focus on other tech and on the environment compared to other titles
- games featuring only real human characters
- games with the budget to cast every single role and record/capture every line of dialogue and every bit of performance

Also, increasing the quality (higher resolution textures, more geometry, more complex shaders) means an even bigger memory footprint compared to methods less dependent on massive digitizing.

I really wouldn't like Uncharted-type games, for example, to replace their characters with these "real" people. Something like Mass Effect would have problems with its aliens and environments.

Heavy Rain is a better fit for this type of tech, on the other hand. I'd seriously look into it if I were them.

Laa-Yosh, is this the same tech we've seen in those AMD "Cinema 2.0" videos that were making the rounds on the internet a while back? As soon as I saw LA Noire I was reminded of those AMD tech demos. It's impressive that they're actually using that tech in a real game on current-gen consoles. I must say I'm really keen to understand how it all works, as it boggles my mind when I see it in action.

Also, isn't LA Noire being published by 2K Games, the publishers of Rockstar, or are Rockstar themselves publishing now? (A publisher within a publisher... wow :D)
 
I haven't seen those AMD videos, any links somewhere?

Not sure about the Rockstar stuff either, kinda confusing.
 
By the way, the basis of the tech is like this:

Use lots of cameras and a real actor.
Capture geometry, normal maps, and color maps for every basic expression. Store the huge amount of data you get.
Capture the actor's performance reading every line. Match elements of the performance to the large library of geometry and textures as well as you can, using lots of custom software.

Basically, record a kind of 3D video, then play it back in the engine, with the ability to re-light it.
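
To make the "record a 3D video and play it back" idea a bit more concrete, here's a minimal sketch of what a playback path could look like. This is just my own illustration of the description above; none of the names are from the actual game.

```cpp
// Hypothetical sketch of a "relightable 3D video" playback path, based on the description
// above. Not Team Bondi's actual code; every name here is made up for illustration.
#include <cstddef>
#include <cstdint>
#include <vector>

struct CapturedFrame {
    std::vector<float>   positions;  // coarse face mesh for this frame of the performance
    std::vector<uint8_t> normalMap;  // captured per-frame normal map (carries the fine detail)
    std::vector<uint8_t> colorMap;   // captured per-frame color map (iris, gaze, etc. baked in)
};

struct PerformanceClip {
    std::vector<CapturedFrame> frames;  // one entry per captured performance frame
    float framesPerSecond = 30.0f;      // capture rate is a placeholder value
};

// Each game frame: pick the captured frame that matches the playback time, upload its mesh
// and textures, and let the engine light it like any other normal-mapped surface.
// Assumes the clip contains at least one frame.
const CapturedFrame& sampleClip(const PerformanceClip& clip, float timeSeconds)
{
    std::size_t index = static_cast<std::size_t>(timeSeconds * clip.framesPerSecond);
    if (index >= clip.frames.size())
        index = clip.frames.size() - 1;  // clamp: hold the last captured frame
    return clip.frames[index];
}
```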
 
This is the Cinema 2.0 stuff I was talking about. Seems really similar...

http://www.youtube.com/watch?v=Z1EJKifq6JM&feature=youtube_gdata_player

I hope the link works; I can't access YouTube directly from here, so I had to wrangle it on my iPad and then email myself the link :p

So, from your description, does that mean the in-game character meshes have pretty much MEGA polygon counts on the heads and faces?! :oops: Or is it done using normal maps, etc.?

Either way the end result is incredible :)

Interestingly though, I wonder what it is about the tech that wouldn't allow its application to a more non-human in-game model? Maybe not something as crazily different as, say, Wrex from ME, but maybe something like the green-faced assassin guy in ME2 (I forget his name).

Is it possible to manipulate the recorded data so that you can add more custom expressions and movements to a mesh, so that it can be applied to a non-human?
 
No, the in-game heads are actually very rough and low-poly, and rely a lot on the normal maps to add detail. The characters don't have eyeballs, for example, or eyelids, just some blob in that place; the normal map adds the shape, and the color texture contains not only the iris but also where it is looking, with the reflections from the recording studio baked in as well.
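
As a toy example of why the normal map can carry so much of the detail on such a rough mesh, here's a tiny CPU-side sketch of normal-mapped diffuse lighting (my own illustration, nothing to do with the game's real shader):

```cpp
// Toy illustration of normal-mapped lighting: the coarse mesh normal is ignored and the
// per-frame captured normal map provides the shading detail. My own example, not the
// game's shader.
#include <algorithm>
#include <cmath>

struct Vec3 { float x, y, z; };

static Vec3 normalize(Vec3 v) {
    float len = std::sqrt(v.x * v.x + v.y * v.y + v.z * v.z);
    return { v.x / len, v.y / len, v.z / len };
}

static float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

// Diffuse lighting term driven by the normal fetched from the captured normal map
// (assumed here to already be in the same space as the light direction).
float diffuseFromCapturedNormal(Vec3 capturedNormal, Vec3 lightDir)
{
    return std::max(0.0f, dot(normalize(capturedNormal), normalize(lightDir)));
}
```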

The problem with aliens and non-human characters is that you have to record the shape, the color, and the motion from a real-life head. The entire system is very closed, because there are no clear separation lines between the mesh and the normal maps, and it's also probably very hard to manually animate anything. So you would have to use movie-VFX-level make-up and prosthetics on mostly humanoid creatures to get the proper data using the same kind of recording sessions.

Less humanoid stuff would then have to be produced, I don't know, probably via CG, at a much higher fidelity than usual for games, in order to match the quality of the human faces. You'd have to build a movie VFX level digital character to have some source to capture similar data from and reproduce the entire capture process in a full digital pipeline within the computer.
All those small, lifelike details, subtle movement, wrinkles and folds and such that we need to accept the result as realistic would have to be reproduced for aliens and fantasy creatures. Otherwise the discrepancies between the characters would break the immersion very quickly.

I'll watch the AMD presentation now ;)
 
By using a 1152x720 resolution they could also have two 4x8-bit (or 3x10+2) g-buffers + a depth buffer. This fits the EDRAM perfectly; no tiling required at all. However, this layout is extra tight: there's no room for extra material properties (just specularity and glossiness). The surface colors are 8 bits per channel, and the two-channel normal would likely be stored at 10 bits per channel (as 2x8-bit normal quality is not that good). If they have gone this route, they have likely also stored some extra material/lighting parameters in the stencil bits.
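
A quick back-of-the-envelope check of why that resolution fits (my arithmetic, assuming 32 bits per pixel for each of the two g-buffers and the depth/stencil buffer):

1152 x 720 = 829,440 pixels
829,440 pixels x 3 render targets x 4 bytes ≈ 9.95 MB, which squeezes under the 360's 10 MB of EDRAM
1280 x 720 x 3 x 4 bytes ≈ 11.06 MB, which would overflow the EDRAM and force tiling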

I'm curious to see their g-buffer arrangement as well. Do you think maybe they went with 3 buffers as you stated to avoid tiling (except perhaps with 16-bit depth) and made alternate use of some channels on different frames? For example, on even frames use two channels for spec and gloss, and on odd frames use the same channels for motion vectors, and in both cases reuse the data for two frames. They already use temporal AA, which blends data between successive frames anyway, so maybe re-using stuff like motion vectors or spec data across two frames wouldn't cause horrific visual artifacts.
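
Just to sketch what that even/odd reuse could look like in practice (pure speculation on my part, not any shipping engine's layout):

```cpp
// Rough sketch of the even/odd channel-reuse idea above -- pure speculation about how such a
// packing could look, not any shipping engine's actual layout. All names are invented.
#include <cstddef>
#include <cstdint>

struct GBufferTexel {
    uint8_t r, g, b;    // surface color
    uint8_t sharedA;    // even frames: specular intensity; odd frames: motion vector X
    uint8_t sharedB;    // even frames: glossiness;         odd frames: motion vector Y
};

// When shading frame N, the half of the data not written this frame is read from frame N-1's
// g-buffer, assuming the temporal AA resolve already keeps that buffer around.
void fetchMaterialAndMotion(const GBufferTexel* current, const GBufferTexel* previous,
                            std::size_t texel, bool evenFrame,
                            uint8_t& spec, uint8_t& gloss,
                            uint8_t& motionX, uint8_t& motionY)
{
    const GBufferTexel& cur  = current[texel];
    const GBufferTexel& prev = previous[texel];
    if (evenFrame) {    // spec/gloss written this frame, motion taken from last frame
        spec = cur.sharedA;      gloss = cur.sharedB;
        motionX = prev.sharedA;  motionY = prev.sharedB;
    } else {            // motion written this frame, spec/gloss taken from last frame
        motionX = cur.sharedA;   motionY = cur.sharedB;
        spec = prev.sharedA;     gloss = prev.sharedB;
    }
}
```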
 
Yesterday I had a conversation with a friend about Frostbite 2, and he told me that the engine will support MLAA. Is this for PS3 only? Is DICE considering any similar AA solution for the 360?

Thanks in advance. :)
 
Pretty good news if true :LOL: MLAA on the 360 could be done on the GPU, but honestly I don't know how good it could be, because one of the great advantages of the PS3 is the SPUs; SMS explained that the effect improves the more SPUs work on it combined, so I don't know how well it would work on the 360.
 
At the very least it should also be useful on PC, where MLAA is also supported. MLAA doesn't work as well when you apply it as a post-fx forced through the driver, because then the HUD is affected needlessly too, which can look ugly for smaller fonts, for instance. But MLAA still has a performance advantage on PC as well, so it's bound to be worthwhile to support there.
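
To illustrate the ordering difference (placeholder function names, just showing the idea):

```cpp
// Illustration of the ordering point above. When the engine applies MLAA itself, it can filter
// the 3D scene before the HUD is composited; a driver-forced post-process only sees the finished
// backbuffer, HUD text included. All function names are placeholders, stubbed out here.
void renderScene() {}   // draw the 3D scene into the backbuffer
void applyMLAA()   {}   // run the MLAA post-process over whatever is in the backbuffer
void drawHUD()     {}   // composite UI text and icons on top
void present()     {}   // flip to screen

void presentFrame_engineMLAA()
{
    renderScene();
    applyMLAA();    // edges smoothed before any UI exists in the image
    drawHUD();      // crisp text drawn on top, untouched by the filter
    present();
}

void presentFrame_driverForcedMLAA()
{
    renderScene();
    drawHUD();
    applyMLAA();    // the driver filters the finished frame, so small HUD fonts get smoothed too
    present();
}
```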
 
DICE have also been looking for an MLAA solution on PC that handles sub-pixel-sized detail. They might also take note of the method Nvidia is going to roll out that does exactly that (if they haven't already).

http://research.nvidia.com/publication/subpixel-reconstruction-antialiasing

Subpixel Reconstruction Antialiasing (SRAA) combines single-pixel (1x) shading with subpixel visibility to create antialiased images without increasing the shading cost. SRAA targets deferred-shading renderers, which cannot use multisample antialiasing. SRAA operates as a post-process on a rendered image with superresolution depth and normal buffers, so it can be incorporated into an existing renderer without modifying the shaders. In this way SRAA resembles Morphological Antialiasing (MLAA), but the new algorithm can better respect geometric boundaries and has fixed runtime independent of scene and image complexity. SRAA benefits shading-bound applications. For example, our implementation evaluates SRAA in 1.8 ms (1280x720) to yield antialiasing quality comparable to 4-16x shading. Thus SRAA would produce a net speedup over supersampling for applications that spend 1 ms or more on shading; for comparison, most modern games spend 5-10 ms shading. We also describe simplifications that increase performance by reducing quality.
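
For what it's worth, here's how I read the reconstruction idea in that abstract, boiled down to a toy example. This is only my simplified interpretation (depth-only similarity, 2x2 subsamples per pixel), not the paper's actual algorithm or NVIDIA code:

```cpp
// Very rough conceptual sketch of the reconstruction idea in the abstract: shading happens once
// per pixel, but depth is available at 2x2 subsamples per pixel; each subsample takes its color
// from the shaded pixel (centre or one of its four neighbours) whose depth it matches best, and
// the subsamples are then averaged. Simplified reading only -- the depth-only similarity metric
// and all names here are my own shortcuts, not the paper's method.
#include <cmath>

struct Color { float r, g, b; };

struct PixelSample  { Color shaded; float depth; };  // per-pixel shaded color + pixel-rate depth
struct SubsampleSet { float depth[4]; };             // 2x2 superresolution depth for one pixel

Color reconstructPixel(const PixelSample& centre,
                       const PixelSample neighbours[4],   // left, right, up, down
                       const SubsampleSet& subsamples)
{
    Color sum = { 0.0f, 0.0f, 0.0f };
    for (int s = 0; s < 4; ++s) {
        const PixelSample* best = &centre;
        float bestDiff = std::fabs(centre.depth - subsamples.depth[s]);
        for (int n = 0; n < 4; ++n) {
            float diff = std::fabs(neighbours[n].depth - subsamples.depth[s]);
            if (diff < bestDiff) { bestDiff = diff; best = &neighbours[n]; }
        }
        sum.r += best->shaded.r;  sum.g += best->shaded.g;  sum.b += best->shaded.b;
    }
    return { sum.r * 0.25f, sum.g * 0.25f, sum.b * 0.25f };  // box-filter the four subsamples
}
```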
 
I read that last week. It sounds interesting, and I can't wait to try it to see if their claims are accurate. As far as I can tell, they haven't implemented it yet. I wonder what cards this will be available on; I imagine the 400 and 500 series on up at a minimum.
 

Interesting, really curious to see this in action, or at least explained in more detail. If anyone implements MLAA on PC, they will hopefully offer the option to turn it off, as I don't want it on until they offer at least a sub-pixel solution, but ideally also one that leaves texture detail alone. But SRAA sounds promising. Bold statement in Nebula's link, though, where they claim it's equal to a high level of supersampling. We'll see. Their quick blurb doesn't mention whether it leaves texture detail alone and only works on real edges; that could be a bummer, but I guess we'll see on that as well. I'm crossing my fingers and hoping we have a new post-process AA standard here that doesn't muck with textures or add edge shimmer.
 
MLAA is a generic term for a type of anti-aliasing with different implementations across different engines. The ones used in GOW3, KZ3, and LBP2 are based on a version developed by Sony (Santa Monica) that makes use of the PS3's SPUs, and as such it would not be suitable for the Xbox 360. There are others (AMD's, etc.). The one used by DICE for Frostbite 2 might be a different implementation.
 