Killzone 2 technology discussion thread (renamed)

Status
Not open for further replies.
First of all, Shifty, most of my knowledge is not academic. For both my previous job (telecoms engineer) and my current IT work I have learnt by doing and by reading. I know a fair amount about the Cell, for example, from reading all the research documents and forums like this one, including the IBM dev forums. Learning this way suits me, whereas academic study did not (as I am dyslexic). So I can understand in-depth scientific reports to a certain level, and reading posts about those reports brings a deeper understanding. I have a very good memory for these kinds of things and can link them together to form opinions and understanding. Before your and Fran's posts I did not take into account the fact that, like the Cell, GPUs and other processors are also not so constrained by the OpenGL and DX software. When I read this from Fran, and later your in-depth analysis, I understood my misconception.


You could go on with a second post of the month about what a "deferred renderer" is ;) If not, any volunteers to explain it in a few words?

I would like to learn about this too. My knowledge of GPUs is quite weak. From what I gather, there seems to be a performance gain due to fewer calculations having to be made; is this correct? Any comparison between forward and deferred rendering by someone who knows their "onions" would be welcome.

Thanks Terra
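To make the question above concrete, here is a back-of-envelope cost model, with purely illustrative numbers (not from this thread): classic forward shading pays for every light on every rasterised fragment, overdraw included, while deferred shading pays a fixed G-buffer pass and then one lighting evaluation per visible pixel per light.

```python
# Rough cost model for forward vs. deferred shading.
# All constants below are illustrative assumptions, not measurements.

def forward_cost(visible_pixels, lights, overdraw):
    # Forward: every rasterised fragment evaluates every light,
    # including fragments later hidden by closer geometry (overdraw).
    return visible_pixels * overdraw * lights

def deferred_cost(visible_pixels, lights, gbuffer_write=1, gbuffer_read=1):
    # Deferred: one geometry pass fills the G-buffer, then each light
    # is applied once per visible pixel it touches.
    return (visible_pixels * gbuffer_write
            + visible_pixels * lights * gbuffer_read)

pixels = 1280 * 720   # 720p screen
overdraw = 3          # assumed average fragments rasterised per pixel
for lights in (1, 4, 16):
    f = forward_cost(pixels, lights, overdraw)
    d = deferred_cost(pixels, lights)
    print(lights, f, d, "deferred cheaper" if d < f else "forward cheaper")
```

This deliberately ignores material variety, bandwidth, and MSAA cost, which is exactly why the real choice is situational, but it shows why deferred scales better as light count grows.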
 
Breaking News............................


Fran and Fafalada leave their respective employers to form a new studio, with the intent of developing a truly unique new IP. The game, titled "Bitch Slap", has players assuming roles from different workplace departments with the goal of passing the buck (and blame for bonus points) to other departments. It ends with not a traditional Boss Battle but an entire Board of Directors Battle.

Stay tuned...........
 

What happens when you lose?

Any ideas for character development?

Is it going to be First Person View?

What's the lead development platform?

Can you confirm whether the game will support 1080p with 8xFSMSAA at 60fps?

Thanks
 

But most important, what kind of renderer will the engine be using?...
 
Cheers to you, Shifty.
You reminded me why I started lurking on this site a couple of years ago. I will never DXify a console part loosely ever again.
 
We need to bring back the "post of the month"... or something like that. It's a shame that only a few of us will see that post...

Trust me, man, it won't be just a few who see this thread; it will get far more views than it deserves. It's a Killzone thread. :LOL:
 
This is the key point!

... GPU's are sold on the level of hardware acceleration they provide for certain features that DirectX implements... Xenos is a great example of this. In the PC space it would be called a DirectX 9 part because it doesn't feature all the needed DX10 bits to be DX10. But it has more features than a DX9 GPU. The end result is ... DirectX 9.5!

... Where a DirectX 10 GPU supports features x,y,z in hardware, and RSX doesn't support these features, Cell is in a position to implement these features.

So ... what you're saying is ... Xenos is DX 9.5, but Cell is DX10?
 
No, any CPU out there can basically do any DirectX API (past, present and future); the problem is not what you can do but how fast you can do it.
 

I was trying to make a joke. :(

I actually put some thought into editing Shifty's post like that... (though not enough!)
 
GPU Gems 2, specifically chapter 9, discusses deferred rendering in S.T.A.L.K.E.R. The book was written in the GeForce 6800 era (2004).

I mention this because people seem to believe that deferred shading is impossible on an NV4x GPU.
 
Oh, I forgot something.

Extract from page 145:

The first myth - "Deferred shading is slow on current hardware" - arises mostly from the fact that the current generation of games tries to load-balance most of the lighting work between the vertex and pixel pipelines. When all the calculations are performed at the pixel level (this is the only way to go with deferred shading), performance will be similar, because lighting pixel shaders for deferred renderers are not that much more complicated than those for forward renderers. The only added work is G-buffer sampling and, possibly, unpacking. But your application is much less likely to become bottlenecked by the CPU or the vertex pipe. The actual mileage will vary depending on the data set and, more important, on your rendering engine's actual overdraw, measured as the number of pixels passing the z-test divided by the screen area.
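The G-buffer idea in the extract can be sketched in a few lines of ordinary code. This toy example (pure Python; the screen size, surfaces, and light values are all invented for illustration) rasterises two overlapping surfaces into a tiny G-buffer with a z-test, then runs a separate lighting pass that reads the G-buffer instead of re-submitting geometry. The hidden fragments rejected by the z-test in the first pass are exactly the overdraw the extract talks about.

```python
# Toy software "deferred renderer" on a tiny 4x4 screen.
# Names and data layout are illustrative only.

W = H = 4

# Geometry pass target: each G-buffer texel stores (albedo, normal_z);
# the depth buffer performs the z-test, just as real hardware would.
gbuffer = [[None] * W for _ in range(H)]
depth = [[float("inf")] * W for _ in range(H)]

def rasterise(x, y, albedo, normal_z, z):
    # Keep only the closest fragment at each pixel (z-test).
    if z < depth[y][x]:
        depth[y][x] = z
        gbuffer[y][x] = (albedo, normal_z)

# Two "quads": a far back wall, and a nearer box that overwrites part of it.
for y in range(H):
    for x in range(W):
        rasterise(x, y, albedo=0.8, normal_z=1.0, z=10.0)   # back wall
for y in range(2):
    for x in range(2):
        rasterise(x, y, albedo=0.2, normal_z=1.0, z=5.0)    # near box

# Lighting pass: shade each stored pixel once per light by sampling the
# G-buffer -- no geometry is re-submitted, no hidden pixel is shaded.
lights = [0.5, 1.0]   # simple scalar light intensities
frame = [[sum(albedo * nz * light for light in lights)
          for (albedo, nz) in row] for row in gbuffer]
```

The lighting loop touches each visible pixel exactly once per light, which is the scaling property that makes many-light scenes attractive for this approach.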
 

I don't think anyone was saying that deferred shading was slow on current hardware; it's just hard to implement proper MSAA due to DX9 constraints on the PC.

When you talk about all the calculations being performed at the pixel level, wouldn't RSX be a strong point for this? And couldn't you remove the lighting load from the GPU, perhaps with the Cell using the WorldLight IP, which seems to be a lighting effect done after rendering?

And what is the Cell actually helping with in this instance? People have said possibly deformable terrain (i.e. walls). However, this is the first time that the SPEs have been used in parallel to do geometry, as far as I know.
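For what it's worth, the "SPEs doing geometry in parallel" idea speculated about above boils down to splitting the vertex array into independent fixed-size jobs. A hypothetical sketch (plain Python standing in for SPE jobs; the chunk size and the rotation transform are invented for illustration):

```python
import math

def transform_chunk(vertices, angle):
    # Rotate each (x, y) vertex about the origin -- a stand-in for the
    # model-view transform an SPE job might apply to its slice of a mesh.
    c, s = math.cos(angle), math.sin(angle)
    return [(c * x - s * y, s * x + c * y) for x, y in vertices]

def transform_all(vertices, angle, chunk_size=4):
    # Carve the mesh into fixed-size jobs. On Cell, each chunk would be
    # DMA'd into an SPE's local store, processed, and DMA'd back; the
    # chunks are independent, so they can run on separate cores.
    out = []
    for i in range(0, len(vertices), chunk_size):
        out.extend(transform_chunk(vertices[i:i + chunk_size], angle))
    return out

mesh = [(float(i), 0.0) for i in range(10)]   # vertices along the x axis
rotated = transform_all(mesh, math.pi / 2)    # 90-degree rotation
```

The sketch runs the jobs serially, but because no chunk reads another chunk's data, the same division of work maps directly onto parallel processors.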
 

To be honest, the headache I get reading your posts has got worse with each one. The people you are constantly contesting are literally gods of the trade (at least in my imaginative mind they have glowing auras and singing in the background around them), and seeing you consistently say "Nuh uH!" was very frustrating.

A word of advice: lurk more. Posting as you are really brings down the quality of a forum such as this one. (Then again, you caused Shifty to respond with the most epic quote I have ever seen, on any forum, ever.)

P.S. This post is not meant to be insulting.
 

Mmmm, to be honest I am a little offended by your post. I am in the IT industry and know my stuff. Sure, as I have admitted, GPUs are not my strong point. I am not a fanboy, and the misunderstanding I had was one that many people shared. I have "lurked" here for over two or three years, yet I have questions and topics I wish to discuss that I think people here are interested in too.

People here do know their stuff, I absolutely agree, and I want to learn from them. I learn by asking and discussing, and if I make mistakes I hold my hand up. How am I bringing down the quality of the forum? I have only made something like 24 posts, and some of them have been topic headers like this one: things I brought to this forum to add to discussions.

While people here are superb and helpful (Shifty, for example), I find some people here quite arrogant and condescending too. If people were to make false or mistaken assumptions about my areas of speciality, I would be more inclined to point out their mistakes (which I have done) and pass on my knowledge rather than discourage them.

What you just said is "don't post here because you're not intelligent or don't have enough knowledge", which is a very arrogant attitude. I would understand if I were a troll or trying to disrupt these forums, but I am trying to contribute, especially by bringing technical news of interest to people, so I think your attitude is quite sad and, to be honest, immature.

I did have a successful career as an IT/telecoms engineer; I do know my way around IT (just in a different field). I was very good at my job, which was very technical in nature, and the way I learnt was by talking and discussing things with people who were experts in their field. That is why I am here.

Most of my knowledge of GPUs in particular comes from this very forum, from "lurking" as you said. To understand more, I thought I should come here and ask questions or discuss these issues. Yet it seems that because I haven't got a degree or your level of knowledge, I am somehow upsetting these forums by my very presence?

Communities should be inclusive of people who are interested in a subject and should always help those who want to learn more. If the mods believe I am being disruptive or bringing down the reputation of the boards, then please contact me and I will desist from posting here. Don't draw conclusions about people just because they haven't got a degree. I can't help being a little upset with your attitude, and it's not because of your implication that my technical knowledge of GPUs, for example, is weak. During my short time posting here I have learnt a fair bit, especially from Fran and Shifty.

(And I wasn't trying to say "nuh uh"; I was genuinely puzzled at what they were trying to say, hence my post "I don't understand what you're trying to say". When Fran and then Shifty explained it superbly and very helpfully for me (and for other people who had the same misconception), I thanked them. I was not trying to argue with anyone, just trying to understand the point being made, and in the end Shifty and Fran helped me out in a way I could understand. People reading this excellent forum have varying levels of knowledge, but all share the same attraction to tech, in this case console tech. Isn't that what this community is all about: sharing knowledge of a subject that is dear to many people's hearts?)

So I apologise to anyone who thought I was being obtuse; I was not. Neither was I trying to argue with programming gods (of course programmers and devs know more than me); to understand where they were coming from, I was trying to have a discussion from my point of view. If you read Shifty's last comment to me, he said he knew where I was coming from and gave sound advice.

Lastly, I just wanted to add that thanks to this thread I have been able to pass this knowledge on to other people who had the same misconception, so others have learned from this thread too.

So please read my discussion/arguments as those of someone who is puzzled and trying to understand what the devs are saying, rather than an obtuse "fanboy" trying to cause issues with limited knowledge.
 
^ I agree with you, mate. I'm in the same boat as you, since I'm no expert at anything related to gaming (I'm a Finance and Economics major), but I love gaming. I found this site back in 2003 and joined when it wasn't that popular, but I don't go around saying "I joined earlier than you, so give me more respect than I deserve", because I believe you must earn your respect. The only way to learn properly is to ask questions and discuss things. If you think otherwise, I don't think you can go far in your career.
 
First off, I apologize for going totally OT for a second here, but I would just like to offer you, Terra, a different perspective on what the other posters were trying to communicate. While I don't necessarily agree with their methods/wording, I do understand where they are coming from. I don't think anyone here blames you for your inexperience with the graphics technology field; rather, I believe their responses were derived from two things: their reverence for these great devs who are willing to share knowledge with us "common" folk, and also your tone when replying to said devs.

I believe what they were trying to say was this: instead of doubting the devs' knowledge by implying their disagreements were essentially agreeing with you (which in some eyes seems rude and insulting), you could have asked where the disagreement stemmed from in the first place (which would still have resulted in that excellent post by Shifty). Perhaps the other posters felt a certain condescending tone when you replied in this fashion. Implying that their disagreement is in fact false, because it is ultimately agreeable with your line of thought, may be interpreted by some as "You are wrong, Mr. Dev, and I was right all along". That is my take on the situation, anyhow.


Back on topic... Do we know exactly what "Killzone 2 devs use deferred rendering" implies or means? Sorry if I missed a few things here and there among the multitude of links, but it's not written in stone that KZ2 will be exclusively using deferred rendering (shading), is it? I think most people who have had experience with both would agree that picking one over the other is largely situational and really depends on your specific scene/environment. If I were a betting man, I would put my money on a mixed approach, perhaps with a bit more emphasis on deferred shading (which would explain why they are making announcements about it; perhaps new ground-breaking techniques?). Just as Fran said, having a rich environment presumably means lots of light sources (favouring deferred) AND a lot of variety in materials (favouring forward), therefore a mixture would technically be the best solution.

To me the question is: did the KZ team conquer many of the issues of deferred shading with clever tricks, which are also part of Sony's EDGE? From my understanding, deferred shading is a fillrate/bandwidth-intensive method, which "parts" of the PS3 actually excel at, so this may be a possibility. It's also definitely good to hear that MSAA might be possible with methods that aren't restricted by APIs, rather than the big misconception floating around the forums this past generation, with many believing that hardware MSAA was not available with deferred shading due to hardware limitations.

Either way, it's good news, since "theoretically" deferred shading scales better than forward rendering as scene complexity increases; therefore it is safe to say that the KZ team is aiming quite high in regard to scene complexity with this title if they are making a big deal out of deferred rendering.
 
It's highly doubtful... it's not realistic to think that you have an engine that does deferred rendering in some scenes and typical rendering in others. At most, you can have an engine that uses a deferred approach in some areas only (like UE3, which uses solely a deferred shadowing approach, not a general deferred rendering one), but reading what the KZ2 guys are saying, I think they have a fully deferred renderer in place, and they're proud of it. :)
 
^ I agree with you, mate. I'm in the same boat as you, since I'm no expert at anything related to gaming (I'm a Finance and Economics major), but I love gaming. I found this site back in 2003 and joined when it wasn't that popular, but I don't go around saying "I joined earlier than you, so give me more respect than I deserve", because I believe you must earn your respect. The only way to learn properly is to ask questions and discuss things. If you think otherwise, I don't think you can go far in your career.

Nicely said!

Off topic: is your name by any chance taken from a Vietnamese singer?
 