The Game Technology discussion thread *Read first post before posting*

In the quotes you posted, nAo mentioned a "future GPU".

The term he used was actually "hypothetical".

nAo said:
I'm not advocating Xenos nor RSX, I was just expressing an opinion on a hypothetical GPU :smile:

Which I interpreted as "NDA, wink wink nudge nudge". I think this was before the 96 KiB XDR cache of the RSX was common knowledge.
Edit: Just checked, yes it was.

Arun said:
it also has an extra 96KiB of cache dedicated to communication with the XDR memory pool, in order to improve bandwidth utilization and average latency
 
Actually for some reason you seem to have skipped this part of my post...

http://www.virtualr.net/need-for-speed-shift-new-screens-interview/


So yes, I have a basis, and devs are indicating it is true. And this seems to indicate a 2x multiplier, hence doubling the 180Hz cvar value I posted earlier from the game's files.

Not really skipped, but read with a discerning eye for the nuances. My quote comes from a very specific context: how the engine runs on the consoles, the fidelity of their AI and Physics threads @ 120Hz, and why the Renderer @ 30Hz was sufficient.
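
To make the 120Hz/30Hz relationship concrete, here's a toy fixed-timestep sketch (my own illustration, not EA's or anyone's actual engine code): the simulation ticks at 120Hz and every fourth tick a frame is rendered, which is exactly a 30Hz renderer sitting on top of 120Hz AI/physics.

PHYSICS_HZ = 120                            # AI/physics tick rate from the dev quote
RENDER_HZ = 30                              # render rate from the dev quote
PHYSICS_DT = 1.0 / PHYSICS_HZ
STEPS_PER_FRAME = PHYSICS_HZ // RENDER_HZ   # = 4 sim ticks per rendered frame

def update_ai_and_physics(dt):
    pass  # placeholder: integrate car state, run AI, etc.

def render_frame():
    pass  # placeholder: draw the latest simulation state

def run(seconds):
    total_steps = int(seconds * PHYSICS_HZ)
    frames = 0
    for step in range(1, total_steps + 1):
        update_ai_and_physics(PHYSICS_DT)
        if step % STEPS_PER_FRAME == 0:
            render_frame()
            frames += 1
    return total_steps, frames

print(run(1.0))   # -> (120, 30): 120 sim ticks and 30 frames per second of game time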

In your quote there is more PR, hyperbole and generality. He mentioned a very loose number ("running at around") and didn't say the updates or threads were running at 400Hz, but "parameters", whereas when specifically talking about the update frequencies of the threads (AI and Physics) he mentioned 120Hz, and 30Hz for the render thread. And it doesn't surprise me, considering how daft some of their PR was (it seemed as bad as Shane Kim being ignorant of his OWN software announcements!) in regards to the landscape of the market, that their message was tweaked after the fact. It doesn't look good when you are marketing half the framerate and one-third the physics updates.

The devil is in the details. Shift may have more going on per update, so even with fewer updates it does more. Or maybe EA was able to build a multiplatform engine with more cars than Forza on track AND do more detailed modeling at the same rate (sure does NOT play like it). Your quote shows obvious signs of marketing FUD, though, with the comment about "unlimited threads" on consoles. Pretty much a throwaway comment, don't you agree? The transition from a very specific 120Hz for Physics Threads / 30Hz Rendering Threads on the consoles to general "parameters" "around" 400Hz in conjunction with "unlimited threads" should give you an idea of how EA was reacting. One is a little more specific and technical than the other ;) But who am I to argue with the Maximum PC Due :LOL:

Anyhow, enjoy your 30Hz console game ;)
 
The term he used was actually "hypothetical".

Which I interpreted as "NDA, wink wink nudge nudge". I think this was before the 96 KiB XDR cache of the RSX was common knowledge.
Edit: Just checked, yes it was.

That seems to be a significantly selective use of quoting as he says, "I'm not advocating Xenos or RSX" immediately prior to his comment.

http://forum.beyond3d.com/showpost.php?p=1355007&postcount=627

I'm not advocating Xenos nor RSX, I was just expressing an opinion on a hypothetical GPU

Anyhow, unless you can show this small cache being utilized for such tiling (do you even know it is possible??) and that this wouldn't destroy FlexIO traffic, you are pretty much beating a dead horse.

Better suggestion on PD: They are awesome developers (top 1%) who have spent 5 years working on and refining their renderer.
 
In your quote there is more PR, hyperbole and generality. He mentioned a very loose number ("running at around") and didn't say the updates or threads were running at 400Hz, but "parameters", whereas when specifically talking about the update frequencies of the threads (AI and Physics) he mentioned 120Hz, and 30Hz for the render thread.

Of course he says parameters, as you wouldn't run a 360Hz update rate for track object physics. It's not a global value, and I doubt it is for any racing game. And physics parameters are what define the car's physics.

And it doesn't surprise me, considering how daft some of their PR was (it seemed as bad as Shane Kim being ignorant of his OWN software announcements!)

Much like Turn 10? PR is always tinted in some way or another for most if not all dev houses: "car polys, best ever", etc. (hint: Dan), "only game with 360Hz physics as far as we know", etc... :LOL:

Or maybe EA was able to build a multiplatform engine with more cars than Forza on track AND do more detailed modeling at the same rate (sure does NOT play like it).

And neither should it, as Forza 3 ain't the reference! Reality is!

The transition from a very specific 120Hz for Physics Threads / 30Hz Rendering Threads on the consoles to general "parameters" "around" 400Hz in conjunction with "unlimited threads" should give you an idea of how EA was reacting. One is a little more specific and technical than the other ;) But who am I to argue with the Maximum PC Due :LOL:

However the interview with the ~400Hz comment was made in July and the one with 120Hz was made in May. Development improvements?

Anyhow, enjoy your 30Hz console game ;)

Actually I play it on PC so it is solid 60fps for me but thanks... and 800Hz tick rate! :p
 
Actually I play it on PC so it is solid 60fps for me but thanks... and 800Hz tick rate! :p


Wow, you are a real sim fan. It is great to see that you are not fooled by marketing from any company.

Do you have a wheel to try pro-mode of Shift? How is it to drive?

Thank you.
 
That seems to be a significantly selective use of quoting as he says, "I'm not advocating Xenos or RSX" immediately prior to his comment.

http://forum.beyond3d.com/showpost.php?p=1355007&postcount=627
I am confused, didn't I include exactly that quote?
I appreciate his attempt at staying clear of the versus debate and sticking to the technical matters, but it didn't work very well, did it?


Anyhow, unless you can show this small cache being utilized for such tiling (do you even know it is possible??) and that this wouldn't destroy FlexIO traffic, you are pretty much beating a dead horse.
If they choose to use the FlexIO for this, how would it "destroy" it? You may be right that I may be beating a dead horse, but having a render buffer on the XDR side is not that uncommon from what I read. Guerrilla Games (probably more devs as well) are doing post-processing on the SPUs, and they cannot possibly be reading from GDDR3 memory given the ridiculously low read bandwidth available there; I think even PhyreEngine has libs for SPU post-processing.

Looking at this developer screen at PD, it seems like they are no strangers to rendering in tiles.
http://download.gameblog.fr/images/jeux/0/TGS09_POLYPHONYDIGITAL_80.JPG

I must admit I have no idea why the screen looks like this. Maybe they are benchmarking different shader programs against each other, looking for visual artefacts, collecting execution statistics, etc. Maybe this is standard procedure during development; maybe some dev can give better insight.
Anyway, given the tiles in the picture, they are in the proximity of what nAo suggests and would fit well within the XDR cache of the RSX.

Better suggestion on PD: They are awesome developers (top 1%) who have spent 5 years working on and refining their renderer.
Sure, but they are not magicians; they must be using some pretty smart tricks that no one else, or at least very few, are doing.
 
The tiles in the picture are 16 horizontal by 9 vertical (16:9). With a resolution of 1280x1080, each tile is 80 x 120.
720p = 720 x 1280 -> 80 x 80 tile
1080p = 1080 x 1920: GT5 Prologue upscaled 1080x1280 -> 120 x 80 tile
GT5 HD upscaled 1080x1440 -> 120 x 90 tile
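
Quick sanity check on those numbers (my own arithmetic; the 8 bytes per pixel is an assumption of a 32-bit colour plus 32-bit depth target, the real formats are unknown). Both tile sizes come in under the 96 KiB figure mentioned earlier:

KIB = 1024

def tile_dims(width, height, cols=16, rows=9):
    # 16 x 9 grid of tiles as counted in the screenshot
    return width // cols, height // rows

for name, (w, h) in {"1280x1080 (GT5P render target)": (1280, 1080),
                     "1280x720  (720p)": (1280, 720)}.items():
    tw, th = tile_dims(w, h)
    per_tile = tw * th * (4 + 4) / KIB   # assumed 4 B colour + 4 B depth per pixel
    print(f"{name}: {tw} x {th} tile, ~{per_tile:.0f} KiB")
# -> 80 x 120 tile, ~75 KiB   and   80 x 80 tile, ~50 KiB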

The screen in the picture is a high-end Bravia (old model); pretty sure it's a 1080p screen, but it doesn't really matter as it accepts 720p input as well.

Edit: damn you folks are quick, I discovered the error myself within a minute. :smile:
 
Not sure where you got the 1280x1080 numbers from?

720p = 720 x 1280 -> 80 x 80 tile
1080p = 1080 x 1920: GT5 Prologue upscaled 1080x960 -> 120 x 60 tile
GT5 HD upscaled 1080x1440 -> 120 x 90 tile

The screen in the picture is a high-end Bravia (old model); pretty sure it's a 1080p screen, but it doesn't really matter as it accepts 720p input as well.
http://forum.beyond3d.com/showthread.php?t=46241

GT5P was 1280x1080 2XAA per this forum.
 
If they choose to use the FlexIO for this, how would it "destroy" it? You may be right that I may be beating a dead horse, but having a render buffer on the XDR side is not that uncommon from what I read. Guerrilla Games (probably more devs as well) are doing post-processing on the SPUs, and they cannot possibly be reading from GDDR3 memory given the ridiculously low read bandwidth available there; I think even PhyreEngine has libs for SPU post-processing.
Here is a Guerrilla presentation about what the SPUs are doing in KZ2; it's pretty interesting, and the result is pretty neat indeed. Post-processing is done at a quarter of the resolution (640x360), the render buffer is indeed in XDR DRAM, and bandwidth is not an issue since the SPUs are compute bound. It sounds like things are even better now than they were when KZ2 shipped.
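
For a rough sense of why bandwidth isn't the problem, here's a back-of-the-envelope calculation (my own numbers, assuming an RGBA8 target and one read plus one write per frame; the presentation's actual formats and pass counts may differ):

W, H, BYTES_PER_PIXEL, FPS = 640, 360, 4, 30
MIB = 1024 * 1024

buffer_bytes = W * H * BYTES_PER_PIXEL   # one quarter-res target
traffic = buffer_bytes * 2 * FPS         # read it in + write it out, per second

print(f"buffer size: {buffer_bytes / MIB:.2f} MiB")            # ~0.88 MiB
print(f"SPU traffic: {traffic / MIB:.0f} MiB/s at {FPS} fps")  # ~53 MiB/s, tiny next to XDR/FlexIO bandwidth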
 
Here is a Guerrilla presentation about what the SPUs are doing in KZ2; it's pretty interesting, and the result is pretty neat indeed. Post-processing is done at a quarter of the resolution (640x360), the render buffer is indeed in XDR DRAM, and bandwidth is not an issue since the SPUs are compute bound. It sounds like things are even better now than they were when KZ2 shipped.

Wow. That is great information, my friend. It is unfortunate that Linux is no longer available on PS3.

I wonder if anyone has made a rendering benchmark for Cell.

ps.
That is a very large document. Amazing file size.
 
This is just a question to Joker454 or anyone else who may be able to enlighten me :)

In a lot of multiformat games the 360 version is not v-synced while the PS3 version is. Why? Usually this means that in places where the PS3 version drops frames, the 360 tears instead. Even if the PS3 version would tear a whole heap more, isn't framerate always more important than image integrity? I'm guessing the answer is more involved than what I'm thinking, but I'm still curious.

I'm talking games like Fallout 3, Resident Evil 5 etc.
 
This is just a question to Joker454 or anyone else who may be able to enlighten me :)

In a lot of multiformat games the 360 version is not v-synced while the PS3 version is. Why? Usually this means that in places where the PS3 version drops frames, the 360 tears instead. Even if the PS3 version would tear a whole heap more, isn't framerate always more important than image integrity? I'm guessing the answer is more involved than what I'm thinking, but I'm still curious.

I'm talking games like Fallout 3, Resident Evil 5 etc.
You missed a long discussion some time ago... there are different theories about that: v-sync on the PS3 is preferred to a simple double buffer, because the latter causes more tearing there compared to the 360, and maybe there is a better propensity to get triple buffering working on the PS3 hardware. So some developers seem to prefer image integrity to fps in the PS3 version.
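
A toy way to see the trade-off (my own sketch, not anyone's engine code): with v-sync and plain double buffering, a frame that misses the 16.7 ms budget has to wait for the next vblank and the GPU stalls, so the framerate steps down; without v-sync the flip happens immediately, so you keep more frames but get tearing.

REFRESH = 1.0 / 60.0   # 60 Hz vblank interval, in seconds

def present_times(frame_times, vsync):
    """Return the time at which each frame becomes visible."""
    t, shown = 0.0, []
    for ft in frame_times:
        t += ft                              # GPU finishes the frame at time t
        if vsync:
            t = (t // REFRESH + 1) * REFRESH # double buffer: stall until the next vblank
        shown.append(t)
    return shown

frame_times = [0.016, 0.018] * 30            # every other frame just misses the 16.7 ms budget
for vsync in (True, False):
    shown = present_times(frame_times, vsync)
    fps = len(shown) / shown[-1]
    print("v-sync on " if vsync else "v-sync off (tears)", f"{fps:.1f} fps")
# -> roughly 40 fps with v-sync vs ~59 fps without, for the same workload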
 
I could check later on, though who knows what settings they used, or even if they restarted the game after changing settings...

What is the minimum video card requirement?

SM3.0, 256MB VRAM, 6600GT/1600XT; single-core AMD A64 3200+ / P4 3.0GHz.
 
BTW, it looks like the game IS a 360 port, going by what users are posting is present in the game exe and console. Lots of XBL references. So it's possible they just took things out for the worst-case scenario and left it at that, given everything would look better on a higher-end system with higher res and better AA + textures, even with things missing.
 