MGS4 3D engine post mortem [translation needed]

Thanks for the info Akumajou, this clears things up a bit.

I also want to ask why some people think the final version of MGS4 uses all of the PS3's hardware in the most optimised way. Do you have any proof? Any technical information?

NAO confirmed that, graphically, Heavenly Sword was RSX-only; Insomniac confirmed in their slides that R&C is RSX-only; and on a Naughty Dog slide it was written that Uncharted mostly uses 30% of the SPEs' power at once and that animation was their priority.

What does MGS4 do better than these games?

The only technical information that is available is with Konami, Hideo Kojima, Kojima Productions, and any presentations (including this one) that they may make.

That said, there are game developers known for trying to use the hardware to the "best of their abilities at the time" of developing the game engine for the genre and type of game they are making. A lot of people will miss that, and a lot gets lost in translation from Japanese, but that is roughly how I understand it.

These game developers, who tend to ignore easy-to-use programming language tools, are usually composed of seriously technical guys willing to break their fingers writing code after spending months studying the hardware.

This does not mean they are better than other game developers, just different; however, they are eventually able to make a more proper and accurate claim about using XX% of a given console's hardware.

In the case of the prototype PS3, Hideo Kojima's team was asked what they could do for their next planned MGS game; that is no different from when his team was asked the same thing back when the PS2 was about to be released. However, the main difference is that the evaluation hardware had generous amounts of RAM and bandwidth that would not be affordable in the final retail hardware as planned, unless Sony was willing to charge more money for the console. Given the PS3's history as we know it, we are well aware of all the complaints some gamers made about the price, even though they are ignorant of how much technology really costs: the manpower, the hours, etc. It's a very complacent generation of gamers.

Sometimes I think we are more disappointed in the RSX than Sony or game developers are.

You'd just expect a machine with such a CPU and XDR memory to run something like an 8800 prototype.

You could only expect such a machine to have an 8800 prototype if you (and no, I don't mean you specifically) live in a world where the laws of reality are violated.

I have been thinking about this for a long time now and I have plans to make a thread about it, based on my limited but very nerdy knowledge. The short of it is that for you to expect anything like a retail PS3 with 512 MB XDR, 512 MB GDDR, and preferably a 9800-class GPU, the GPU would have to be made on a 55 nm process at Sony's fabs (which make the RSX).

That would mean releasing the console in 2008, and because of that Sony would have no choice but to use a 45 nm Cell (IBM reached 45 nm in 2008), and the console would consequently cost slightly more than the PS3 we knew that launched in our reality.

Then you have Nintendo with a king-of-the-hill empire and an unchallenged Xbox 360 roaming free for nearly three years, racking up people who did not want to wait or who were shocked at the price once it was announced.

The thing to speculate about, then, is whether people would respond positively or take a lukewarm wait-and-see attitude toward such a PS3, given its higher specs, higher price, and first-gen games that would not look as polished or all that different from third-generation Xbox 360 titles. Games like MGS4 would end up with a 2009 or 2010 release, because after all you cannot realistically make the console more complex and fast and expect it to be easily tamed in a short time.
 
...however, the main difference is that the evaluation hardware had generous amounts of RAM and bandwidth that would not be affordable in the final retail hardware as planned...
As mentioned in other discussions on this point, development hardware tends to have more RAM to facilitate development and debugging. Kojima should not have used that extra capacity for the game demo (and I don't imagine he did).

I guess one area the specs did take a 'drop' was the change from 512 MB unified XDR to split RAM pools, though I don't know that the XDR-only idea was ever official. I guess it could be that Kojima targeted 50 GB/s of RAM bandwidth to the GPU, then this was changed and he found himself unable to work effectively with the split pools in the same way. Note I'm referring to Kojima here as a sort of figurehead personification of the development team, as clearly he wasn't one guy writing all the code ;)
 
Kojima should not have used that extra capacity for the game demo (and I don't imagine he did).

How did he not, if the realtime tech demo that's two years older than the final game was more impressive graphically?

I guess one area the specs did take a 'drop' was the change from 512 MB unified XDR to split RAM pools, though I don't know that the XDR-only idea was ever official. I guess it could be that Kojima targeted 50 GB/s of RAM bandwidth to the GPU, then this was changed and he found himself unable to work effectively with the split pools in the same way. Note I'm referring to Kojima here as a sort of figurehead personification of the development team, as clearly he wasn't one guy writing all the code ;)

PS3 never had 512 MB of unified XDR as far as I know.
The 2005 devkits were 512 MB XDR + 512 MB GDDR3 (from the 7800 GTX). From that point on, the changes in devkits never implied that a unified XDR solution was in the works.
 
How did he not, if the realtime tech demo that's two years older than the final game was more impressive graphically?

Because that's what it is, a tech demo. Tech demos generally are more impressive because of their limited scope.

PS3 never had 512 MB of unified XDR as far as I know.
The 2005 devkits were 512 MB XDR + 512 MB GDDR3 (from the 7800 GTX). From that point on, the changes in devkits never implied that a unified XDR solution was in the works.

I am sure that if it wasn't for the 360, Sony would have shipped the PS3 with just 256 MB of XDR for Cell and some GPU with an eDRAM frame buffer, much like the PS2. It would have made for a cheaper PS3.
 
If it wasn't for the PS3 (and Epic Games), the 360 would have only had 256 MB of RAM, so hey, competition is good, we'll leave it at that :)
 
V3 said:
I am sure that if it wasn't for the 360, Sony would have shipped the PS3 with just 256 MB of XDR for Cell and some GPU with an eDRAM frame buffer, much like the PS2.
I doubt the 360 had much to do with why that didn't happen - they switched to NVidia quite a while before the 360 was upgraded to 512 MB.
If they had stuck with something else, chances are we'd have ended up with a 512 MB XDR PS3 (+eDRAM) if MS still made their move (so probably not really cheaper at all), or both consoles would have been 256 MB.
 
How did he not, if the realtime tech demo that's two years older than the final game was more impressive graphically?
I'm guessing because the scope of the demo was more localised and controlled? Or maybe he used a greater amount of resources in the development version expecting optimizations to make the same results possible in the smaller RAM space? It is something of a mystery though!

PS3 never had 512 MB of unified XDR as far as I know.
Well, I think you're right. As I said, I don't think the XDR-only model was ever official. But before the nVidia announcement, we knew Cell was on XDR, and 512 MB of RAM looked likely. Anyone starting work then on a PS3 title would likely have been thinking '512 MBs to share with everything, and plenty of BW'. Then with the nVidia announcement they had to face a split RAM pool with two separate busses, which could well have thrown a spanner into their systems, perhaps one they never fully recovered from because they were too deep into the game to make significant changes?
 
Is it right that the first dev kits had really low bandwidth between the Cell and the GPU (2×6800 or 7800)?
I'm not sure of my memory in this regard.

Maybe they planned that the Cell and the GPU could work in a more synergistic manner than what ended up being possible?
 
Is it right that the first dev kits had really low bandwidth between the Cell and the GPU (2×6800 or 7800)?
I'm not sure of my memory in this regard.

Maybe they planned that the Cell and the GPU could work in a more synergistic manner than what ended up being possible?

Correct, it was neutered. We had a couple of the early kits for a while, which were replaced maybe ~2 months later. I can't tell you when the first kits came out, only that we got them around two months before the refresh.
 
How did he not, if the realtime tech demo that's two years older than the final game was more impressive graphically?

I think that first level (the demoed one) is the least impressive graphically. Later parts of MGS4 are really breathtaking... so the tech demo didn't show anything that could prepare you for the full game.
 
Is it right that the first dev kits had really low bandwidth between the Cell and the GPU (2×6800 or 7800)?
I'm not sure of my memory in this regard.
Yes, they were stuck on PCI-e

Maybe they planned that the Cell and the GPU could work in a more synergistic manner than what ended up being possible?
Maybe. We've heard next to nothing about the Cell<>RSX connection and what it enables, despite lots of exciting noise pre-PS3-launch that it'd be wonderful.
 
I doubt the 360 had much to do with why that didn't happen - they switched to NVidia quite a while before the 360 was upgraded to 512 MB.
If they had stuck with something else, chances are we'd have ended up with a 512 MB XDR PS3 (+eDRAM) if MS still made their move (so probably not really cheaper at all), or both consoles would have been 256 MB.

The 256 MB of XDR was known for quite a while from the Elpida statement that Sony was ordering a large amount of XDR chips from them. But like you said, the total of 512 MB in the PS3 was most likely due to the Xbox 360.

But I can't help but see the PS3 architecture as a salvage operation rather than a well-designed system, especially next to the Xbox 360.
 
Maybe. We've heard next to nothing about the Cell<>RSX connection and what it enables, despite lots of exciting noise pre-PS3-launch that it'd be wonderful.

As far as I know from what I've read on the net, graphics processing in PlayStations was never GPU-only; there were always some parts in their CPUs to help with or do graphics processing. Do you think there is a problem with this in the Cell+RSX combination, resulting in disappointing graphics in MGS4 or other games, or is it just that most games use only the GPU this generation?
 
As far as I know from what I've read on the net, graphics processing in PlayStations was never GPU-only; there were always some parts in their CPUs to help with or do graphics processing. Do you think there is a problem with this in the Cell+RSX combination, resulting in disappointing graphics in MGS4 or other games, or is it just that most games use only the GPU this generation?

Most multiplatform games before, maybe. If you watch one of the videos for Warhawk, the developer said the Cell can do vertex shader tasks; they used the CPU to create the clouds and water in the game. The slides from GDC for the PhyreEngine show that some effects, like depth of field and ambient occlusion, can be done on the SPUs. The engine will ship with every PS3 SDK free of charge so every developer can use it; I guess we'll probably see some multiplatform games doing that now, and most of them next year. MGS4 is probably too late to benefit from it since it is very, very far into development. It's mostly first-party or Sony developers that do things first, before anyone else.
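Obviously none of us can post real SPU code here, but the job-splitting idea those Warhawk/PhyreEngine talks describe - cut the framebuffer into strips and hand each strip to a worker as an independent job - can be sketched in plain Python. Everything below (the strip blend, the job count, the `depth_of_field` name) is my own illustration, not anything from the SDK:

```python
from concurrent.futures import ThreadPoolExecutor

def blend_strip(rows, blur_rows, coc):
    """Blend sharp rows with pre-blurred rows by a circle-of-confusion
    weight -- roughly what a depth-of-field job would do per strip."""
    return [[(1 - coc) * s + coc * b for s, b in zip(sr, br)]
            for sr, br in zip(rows, blur_rows)]

def depth_of_field(sharp, blurred, coc, num_jobs=4):
    """Split the framebuffer into horizontal strips and process each
    strip as an independent job, the way SPU post-processing is batched."""
    h = len(sharp)
    step = (h + num_jobs - 1) // num_jobs
    strips = [(sharp[i:i + step], blurred[i:i + step])
              for i in range(0, h, step)]
    with ThreadPoolExecutor(max_workers=num_jobs) as pool:
        # map() preserves strip order, so the strips stitch back together
        results = pool.map(lambda sb: blend_strip(sb[0], sb[1], coc), strips)
    out = []
    for r in results:
        out.extend(r)
    return out

# Tiny 8x4 "framebuffer": all-sharp (1.0) blended 25% toward all-blurred (0.0)
frame = depth_of_field([[1.0] * 4 for _ in range(8)],
                       [[0.0] * 4 for _ in range(8)], coc=0.25)
```

The point of the sketch is only that each strip is self-contained, which is why the effect parallelises across however many workers (or SPUs) you have.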
 
V3 said:
But like you said, the total of 512 MB in the PS3 was most likely due to the Xbox 360.
It's more the other way around. The 360 was always going to be 256 MB+ (as in, "we'll match whatever Sony has"). Prior to the NVidia deal announcement, the PS3 was expected to be 256 MB + something. Chances are if that had never happened, MS would have stayed at 256 MB as well (big cost savings), and we'd have had a 256 MB console future.
 
As far as I know from what I've read on the net, graphics processing in PlayStations was never GPU-only; there were always some parts in their CPUs to help with or do graphics processing. Do you think there is a problem with this in the Cell+RSX combination, resulting in disappointing graphics in MGS4 or other games, or is it just that most games use only the GPU this generation?
No. The CPU will always help out where it can and where it's smart to. You can have the CPU do whatever it needs to do in RAM and share that with the GPU. The difference in PS3 is that Sony engineered a fat 30 GB/s pipe to share data directly with the GPU without having to go through RAM, and they connected the memory pools up so they were addressable, or coherent, or something-or-other. A touted possible use could have been to have Cell generate procedural textures to be passed to the GPU directly for texturing... but the system doesn't seem to work that way, and the close relationship between Cell and RSX doesn't seem as close, or at least doesn't seem to benefit from being close, as early expectations had it.
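Just to put rough numbers on that idea, here's a plain-Python sketch (not PS3 code; the checker pattern, the function names, and the decimal-GB arithmetic are all my own illustration) of "CPU builds a texture procedurally, then it crosses a ~30 GB/s link":

```python
def procedural_texture(size, scale=8):
    """Generate a simple procedural checker texture on the CPU -- a toy
    stand-in for the kind of data Cell was expected to feed RSX."""
    return [[255 if ((x // scale + y // scale) % 2 == 0) else 0
             for x in range(size)]
            for y in range(size)]

def transfer_time_us(num_bytes, bandwidth_gbps=30.0):
    """Illustrative arithmetic only: microseconds to push a buffer over
    a 30 GB/s link (decimal gigabytes assumed)."""
    return num_bytes / (bandwidth_gbps * 1e9) * 1e6

tex = procedural_texture(256)      # 256x256, 8-bit luminance
size_bytes = 256 * 256             # 65,536 bytes
t = transfer_time_us(size_bytes)   # roughly 2.2 microseconds at 30 GB/s
```

So raw link bandwidth was never the obstacle for small procedural textures; whatever stopped the Cell-to-RSX streaming scenario, it wasn't arithmetic like this.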
 
The difference in PS3 is that Sony engineered a fat 30 GB/s pipe to share data directly with the GPU without having to go through RAM, and they connected the memory pools up so they were addressable, or coherent, or something-or-other. A touted possible use could have been to have Cell generate procedural textures to be passed to the GPU directly for texturing... but the system doesn't seem to work that way, and the close relationship between Cell and RSX doesn't seem as close, or at least doesn't seem to benefit from being close, as early expectations had it.

So, some developers tried to use the hardware like this but the hardware failed them?
 
We don't know. No-one talks about it! It has been mentioned in discussions about procedural content that you'd still want Cell to create the content and copy it to RAM for RSX to read. Whether it's possible to create textures on demand per quad, I don't know.
 
I'm inclined to agree with the idea that they just couldn't take full advantage of the given hardware at the time, as opposed to it being underpowered. It really does highlight a possible 'gulf' in experience that Kojima himself alluded to. Some of the rendering techniques weren't the best they could have been, especially as we can compare to the likes of Uncharted; the shadows in particular were plagued by artefacts. Nevertheless, the game is very impressive: big, cinematic, with lovely post-process effects in cut-scenes. It's also nice to see some time was put into water simulation. For such an epic title, and a first from the studio on that platform, I'm interested to see what they bring to the table in future projects.
 
No. The CPU will always help out where it can and where it's smart to. You can have the CPU do whatever it needs to do in RAM and share that with the GPU. The difference in PS3 is that Sony engineered a fat 30 GB/s pipe to share data directly with the GPU without having to go through RAM, and they connected the memory pools up so they were addressable, or coherent, or something-or-other. A touted possible use could have been to have Cell generate procedural textures to be passed to the GPU directly for texturing... but the system doesn't seem to work that way, and the close relationship between Cell and RSX doesn't seem as close, or at least doesn't seem to benefit from being close, as early expectations had it.

Isn't that like how the ProFX engine was supposed to work on PS3? Ready at Dawn said they got procedural textures running in their new engine for their next game, but they still haven't announced which system they are working on. I think DICE also said something about Mirror's Edge using the Cell to allow them to get higher-res textures.
 