Digital Foundry Article Technical Discussion Archive [2013]

Status
Not open for further replies.
What do you mean by a system walkthrough?

PS: they even showed their profiling... I wish more devs out there would be as open as GG - good service to us fans!

More about post effects, more about area lights and shadowing systems, more about parallax occlusion tech, about their AF decisions, more about particle systems, more about volumetric lighting systems, etc.
 
I can see them in screenshots too, but I meant more inside information about precision, cost, limitations, etc.

It is always good to want more. But IMO, one already has to appreciate DF and GG for giving us even this much information (just compare this to other devs and games... e.g. did you read the recent FM5 article?)
 
It is always good to want more. But IMO, one already has to appreciate DF and GG for giving us even this much information (just compare this to other devs and games... e.g. did you read the recent FM5 article?)

Sure, it's better to have even that information than none at all :) But some pieces about God of War or Uncharted or even Killzone 3 were generally more in-depth, and after all the hype Richard posted on Twitter, I thought this would be the best article ever :)
 
The audio info is interesting in light of our recent discussions too.

MADDER sounds great, but seems a little too heavy on dev time with all those material and angle setups. Seems worth it, though.

As stated elsewhere - many of the statements need clarification, like:

"It's a system called MADDER - Material Dependent Early Reflections. The point is that there should be no illusion that it's reverb - because it isn't. It's real-time reflections based on geometry."

But reverb IS the result of reflections based on geometry.

If the MADDER engine calculates the early reflections (only first order?) based on the geometry, why would that increase the work for the sound designers? The engine would modify/filter the sounds completely automatically, without requiring multiple layers of sound. The reflection patterns don't actually change in the video - the reflection energy arrives first in the left channel and later in the right channel, no matter how you turn or where you are on the map. This could suggest that there is no real-time calculation of the reflection patterns in the engine, but rather a set of different gunshot sounds that get triggered according to a specific set of rules depending on the closest surfaces.
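For reference, the first-order image-source calculation described above can be sketched in a few lines. This is a hypothetical, heavily simplified model - the function name, the absorption coefficient, and the 1/r gain model are my own inventions for illustration, not anything from GG's MADDER:

```python
import math

SPEED_OF_SOUND = 343.0  # metres per second

def first_order_reflection(source, listener, plane_point, plane_normal, absorption):
    """Image-source method for a single first-order reflection:
    mirror the source across the reflecting plane, then derive the
    echo's arrival delay and gain at the listener's position.
    `absorption` is a hypothetical material coefficient in [0, 1]."""
    # Normalise the plane normal.
    norm = math.sqrt(sum(c * c for c in plane_normal))
    n = [c / norm for c in plane_normal]
    # Signed distance from the source to the plane.
    d = sum((s - p) * c for s, p, c in zip(source, plane_point, n))
    # Mirror the source across the plane to get the image source.
    image = [s - 2.0 * d * c for s, c in zip(source, n)]
    # Path length from the image source to the listener.
    path = math.dist(listener, image)
    delay = path / SPEED_OF_SOUND               # arrival time of the echo
    gain = (1.0 - absorption) / max(path, 1.0)  # material loss + 1/r spreading
    return delay, gain

# Gunshot 10 m in front of a rock face, listener 5 m in front of it:
delay, gain = first_order_reflection(
    source=(0.0, 0.0, 10.0), listener=(0.0, 0.0, 5.0),
    plane_point=(0.0, 0.0, 0.0), plane_normal=(0.0, 0.0, 1.0),
    absorption=0.2)  # invented rock absorption value
# The echo travels 15 m, so delay ≈ 44 ms; moving the listener or the
# source changes both numbers - exactly the behaviour Rlab says is
# missing from the videos.
```

If a system really computed this per surface every frame, turning in place or walking through the rock formation would audibly shift the delay and energy of each reflection.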
 
As stated elsewhere -
As replied elsewhere, 'reverb' in standard parlance means a common effect applied to an audio stream. The origins of the effect can be traced to acoustic engineering and physics and the description of sound waves reflecting off surfaces, but it's been so widely used in digital audio to mean a global reverb effect applied to the stream (from audio engineers to musicians with reverb units to consumers with reverb options on their hi-fis and PCs) that its meaning has shifted to this singular global effect. GG need to make the distinction between a reverb applied to the audio stream, as we've had since PS2 days, and geometry-specific, more physically correct reverberations. Calling it reverb would confuse the issue for Joe Consumer, who'd say, "we've had reverb since PS2 days."
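The 'global effect' sense of reverb described above - one unit applied to the whole mix regardless of geometry - can be illustrated with a single Schroeder-style feedback comb filter. This is a deliberately minimal sketch of the concept, not any shipping implementation:

```python
def comb_reverb(samples, delay_samples, feedback=0.5):
    """A single feedback comb filter: each input sample is mixed with a
    decayed copy of the output from `delay_samples` ago. Applied to the
    whole stream, it reverberates everything identically - the filter
    knows nothing about the geometry around the listener."""
    out = [0.0] * len(samples)
    for i, x in enumerate(samples):
        echo = out[i - delay_samples] if i >= delay_samples else 0.0
        out[i] = x + feedback * echo
    return out

# An impulse produces an exponentially decaying echo train:
tail = comb_reverb([1.0] + [0.0] * 9, delay_samples=3, feedback=0.5)
# tail == [1.0, 0.0, 0.0, 0.5, 0.0, 0.0, 0.25, 0.0, 0.0, 0.125]
```

Real reverb units chain several such combs and all-pass filters, but the point stands: the echo pattern is fixed by the unit's parameters, not derived from surfaces in the scene.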
 
GG need to make the distinction between a reverb applied to the audio stream, as we've had since PS2 days, and geometry-specific, more physically correct reverberations.

I do understand it's just a marketing statement, but I can't make the distinction between a generic reverb and the 'geometry-specific, more physically correct reverberations' in the videos. The reflection patterns don't change according to your orientation or position in the maps - which implies a simple static reflection/reverb system.
 
If the MADDER engine calculates the early reflections (only 1st order?) based on the geometry, why would that increase the work for the sound designers?

From what I read, they need to set up materials for sound and their angles in the level, so it's not fully automatic like some techniques used now - but I could have read it wrong.
 
I do understand it's just a marketing statement, but I can't make the distinction between a generic reverb and the 'geometry-specific, more physically correct reverberations' in the videos. The reflection patterns don't change according to your orientation or position in the maps - which implies a simple static reflection/reverb system.

Yes, it does actually. For instance, getting near buildings after being in the forest had the buildings reflect the audio, there were clear changes as you went through the different rooms, etc. What audio setup were you listening on? I was on my 5.0 surround system in the living room.
 
Yes, it does actually. For instance, getting near buildings after being in the forest had the buildings reflect the audio, there were clear changes as you went through the different rooms, etc. What audio setup were you listening on? I was on my 5.0 surround system in the living room.

Change of reverb sound when moving through different rooms does not imply anything - I can have a generic reverb which morphs between different settings according to the room. Just look at EAX - nothing special.

While moving in the forest map, you can clearly see reflective rocks on your right at the start - the reflection pattern doesn't change even though you move through the rock formation, nor when you change orientation. With real-time early reflection simulation, you would obviously experience a change in the reflected energy. The exact same behaviour is missing from the indoor environments.
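The 'generic reverb which morphs between settings' alternative needs nothing more than interpolating a parameter set by listener position. A hypothetical sketch - the preset names and parameters here are invented for illustration:

```python
def lerp_preset(a, b, t):
    """Linearly blend two reverb presets; t=0 gives preset a, t=1 gives b.
    An engine would drive t from the listener's position between zones."""
    return {k: a[k] + t * (b[k] - a[k]) for k in a}

# Invented presets in the style of EAX environment settings:
CAVE = {"decay_s": 2.8, "wet": 0.7, "damping": 0.2}
FOREST = {"decay_s": 0.9, "wet": 0.3, "damping": 0.6}

# Halfway between the two zones, blend the presets 50/50:
blended = lerp_preset(CAVE, FOREST, t=0.5)
# blended["decay_s"] ≈ 1.85 - halfway between cave and forest
```

Note that nothing here depends on listener orientation or on any actual surface - which is exactly why a smooth change of reverb between rooms doesn't by itself demonstrate geometry-driven reflections.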
 
The sounds being dynamically affected by your surroundings is certainly a nice touch; it removes some of that repeated-sound feeling and the monotonous nature of playing recorded sounds. Having said that, I don't think this is anything remotely accurate either. They just seem to re-play some frequencies of the original sound based on what is around you, and the results are distractingly directional for me. I haven't been running and gunning in a real forest lately, so I may be very wrong.
 
You mean like the stuff in this tech demo:

http://www.youtube.com/watch?v=Q0h3T_Y9B0Y

??

Edit: I think it was quite smart of DF not to cover the stuff already shown in the vid!

Yes. So they did clean up the workflow after all. The ability to change things quickly and in a robust fashion is a key advantage in next-gen game development.

Their audio engineer gave an interview about a "manpower lite" audio layer years ago. It looks like their first iteration is ready and deployed.

Same for their graphics subsystems.

I thought ND would deliver something like this first. Didn't expect GG to beat them to it.
 
Change of reverb sound when moving through different rooms does not imply anything - I can have a generic reverb which morphs between different settings according to the room. Just look at EAX - nothing special.

While moving in the forest map, you can clearly see reflective rocks on your right at the start - the reflection pattern doesn't change even though you move through the rock formation, nor when you change orientation. With real-time early reflection simulation, you would obviously experience a change in the reflected energy. The exact same behaviour is missing from the indoor environments.

IMO, the 'problem' is that your expertise on this subject far exceeds everyone else's.
Don't get me wrong - I fully respect you, and I see you as an authority on the subject.
I have seen you de-PR the Xbox audio architect with just a few posts, so I know that you know a lot.

Now, the situation is like this (I think) - no disrespect intended, I'll just try to illustrate my view using an analogy:
a game company says: "our light refraction calculations are 99.9% correct and should mimic real life really well".
everyone here says: "wow, that is so cool, you can really see that this solution is indeed 99.9% correct, a real improvement over previous solutions!"

and here you come in, as a (let's say) light expert:
"well, I don't see how the refraction calculations are 99.9% correct, because in the video you see a prism, but the light doesn't behave like it should."

the rest don't really understand the material, so they approach your views with doubt, when they should realise that you add real value to the subject.
Maybe we can get in contact with Guerilla so you can ask them a few questions? I think that would clear things up.
 
"It's a system called MADDER - Material Dependent Early Reflections. The point is that there should be no illusion that it's reverb - because it isn't. It's real-time reflections based on geometry."

But reverb IS the result of reflections based on geometry.

Yeah, that threw me too.

I think they should have said it wasn't a reverb effect applied to the sound rather than it wasn't reverb (which it is).
 
IMO, the 'problem' is that your expertise on this subject far exceeds everyone else's.
Don't get me wrong - I fully respect you, and I see you as an authority on the subject.
I have seen you de-PR the Xbox audio architect with just a few posts, so I know that you know a lot.

Now, the situation is like this (I think) - no disrespect intended, I'll just try to illustrate my view using an analogy:
a game company says: "our light refraction calculations are 99.9% correct and should mimic real life really well".
everyone here says: "wow, that is so cool, you can really see that this solution is indeed 99.9% correct, a real improvement over previous solutions!"

and here you come in, as a (let's say) light expert:
"well, I don't see how the refraction calculations are 99.9% correct, because in the video you see a prism, but the light doesn't behave like it should."

the rest don't really understand the material, so they approach your views with doubt, when they should realise that you add real value to the subject.
Maybe we can get in contact with Guerilla so you can ask them a few questions? I think that would clear things up.

I think what Rlab explains is fairly straightforward if you've already experienced what he describes in real life. ;)
 
Yes. So they did clean up the workflow after all. The ability to change things quickly and in a robust fashion is a key advantage in next-gen game development.

Their audio engineer gave an interview about a "manpower lite" audio layer years ago. It looks like their first iteration is ready and deployed.

Same for their graphics subsystems.

I thought ND would deliver something like this first. Didn't expect GG to beat them to it.

Dunno, LOU has some of the best sound ever - they might have been doing something similar already on PS3.
 
Dunno, LOU has some of the best sound ever - they might have been doing something similar already on PS3.

The PS3 is a special case: the SPUs basically handle culling, ray casting, and audio, among other things.

But because of its limited memory, I have always assumed the developers need to hand-optimize resource usage, so something like an automatic audio effects generator (to save human effort) may be out of reach.

I was expecting ND to be the first to do it, on PS4 next year. Didn't expect GG to have a go at launch.
 
I think the MADDER thing was cool, but I know that when I get to play the game I will not think much about it. If it were not for the video demonstrating the sound with MADDER off/on, I would not have even thought twice about it.

I am no sound aficionado, and the only time I have been impressed by sound (correctness?) was in the movie Heat. During the bank robbery, the sound of those rifles - seriously, that sounded real to me. I could even smell the cordite when they fired off the G3. Can't validate the M4 etc., but the G3 was dead on.
 
That framework is more important for devs than for the average gamer. It's designed to save dev effort and facilitate quick changes. We will benefit indirectly if the developers can experiment more during game development.
 