RSX: Vertex input limited? *FKATCT

- 'mission' type games can get around the disc space limit on 360 by shipping on multiple discs. Not an elegant solution, but doable.

- It's questionable whether many studios have the budget to fill a 50 GB disc with assets.

- For multiplatform games like GTA, will they even bother, since it's so much easier to make one set of assets that works on both platforms, and hence be limited to 8 GB or so?

- Correct me if I'm wrong on this, but from what I've heard, streaming load times are slightly longer on PS3 because the optical head has a longer settling time, so it will be slightly slower at streaming than 360. I'm going on another dev's info on this, so I haven't verified it.

- Single-platform games with huge budgets like Metal Gear don't need to care, so they may very well fill the whole darn 50 GB with stuff, which would be awesome.

More discs = more complexity in the development phase, but again, those that need more discs are rarely hindered by budgets. For the smaller developers, though, it's easier to compromise and aim for one disc.

You don't need to fill 50 GB worth of data in order to claim "superior storage technology" if you are in the Sony camp. Anything beyond what can fit on a DVD is enough. 10 GB would still not fit on a DL DVD :)

AFAIK the DVD vs. Blu-ray comparison for streaming speed, seek times etc. depends on whether it's SL DVD or DL DVD vs. Blu-ray: SL DVD is always faster than Blu-ray, while DL DVD is on par with Blu-ray and sometimes slower.
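To put some ballpark figures on that comparison (assuming the commonly quoted drive specs of the era: a 2x BD-ROM reading at a constant rate, and a 12x CAV DVD drive whose rate scales with the read radius; the inner-edge ratio below is an assumption, so treat all of this as back-of-the-envelope):

```python
# Back-of-the-envelope sequential read rates in MB/s. These are the
# commonly quoted drive specs, not measured numbers.
BD_1X = 4.5                      # 1x Blu-ray ~ 36 Mbit/s
BD_2X = 2 * BD_1X                # CLV drive: constant 9.0 MB/s everywhere

DVD_1X = 1.385                   # 1x DVD ~ 11.08 Mbit/s
DVD_12X_OUTER = 12 * DVD_1X      # CAV drive peaks at the outer edge
DVD_12X_INNER = DVD_12X_OUTER * 0.4   # assumed inner/outer radius ratio

print(f"2x BD:           {BD_2X:.1f} MB/s (constant)")
print(f"12x DVD (outer): {DVD_12X_OUTER:.1f} MB/s")
print(f"12x DVD (inner): {DVD_12X_INNER:.1f} MB/s")
```

Which is roughly why a DVD can come out faster or slower than Blu-ray depending on where the data sits on the disc, and why data layout matters so much for streaming.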

Unless GTA is memory-limited or "stream-limited", it should be fairly easy for them to use the extra storage for higher-quality textures, though the difference wouldn't be mindblowing (but surely enough to tick it off on the box :))

One question: you seem to indicate that the memory difference is one of the most important differences. At most, wouldn't the difference be something like 5%?
 
Those numbers seem more or less in line with what we've seen. They don't include the EDRAM that the 360 has, though; on PS3 we have to use some of the precious VRAM for frame buffers, which you don't need to do on 360.

This is not entirely correct. The EDRAM is a scratch pad memory, not storage memory; you can't really store anything in there. You render to it, immediately resolve, clear, rinse and repeat. I wouldn't count it as available memory. You still need to resolve the EDRAM content to a framebuffer in main memory that will be displayed. Some memory can still be saved because the framebuffer in main memory is not using MSAA (EDRAM will resolve the samples), and with correct timing you don't need to double-buffer it. But a framebuffer ready to be displayed is still present in main memory.
 
Also because you don't know RSX?
More likely because they're fundamentally the same design. They work the same way fundamentally, transforming vertices and rendering fragments, just with slight tweaks in how they go about it. Kinda like a left-hand-drive car versus a right-hand-drive car: both can take left and right corners, but each has a visibility advantage turning one way and a disadvantage the other. Each GPU will be more efficient at some tasks, but there aren't many tasks that can't be done at all. Given the UE3.0 engine and GeoW, technically there's nothing going on there that either GPU can't handle, because it's a fairly generic engine.

In a customized, single-platform engine, the differences will become more notable, such that the analogy might shift to being between a Lotus Elise and a large Corvette. Both still get round the track quickly by burning fuel in an internal combustion engine, but one relies on lightweight agility and the other on raw power. Not that I'm likening either GPU to either car - I'm just trying to show how fundamentally they are the same, and that you can get the same job done.
 
So this internal job manager is purely a trade secret for SCE WWS developers? Not available at all to third-party developers?
Well, it's a bit harsh to say it's a 'trade secret', to be honest. It's just code that we have internally - SCE WWS is kinda separate to the bit of SCEI that deals with PS3 SDK development.. we're busy working on stuff for SCE-published titles. They're busy working on stuff for the platform in general (and all the developers using it). I guess the main reason we have it isn't because it's some kind of miracle system (it's good, but it ain't gonna magically make everything just *run* on SPUs!), but because it helps various SCE WWS developers use a common job framework - which further promotes code sharing between the different studios. If all of us used a different job manager, it would make it difficult to share SPUs to the extent we do now.

As was said before, this sits on top of (or inside, I guess) the SCEI-released SPU runtime system.. so there's nothing to stop other developers from doing the same thing.. in fact, I'd be surprised if the larger devs (with multiple PS3 titles in production) hadn't already standardised on a single job management scheme, for exactly the reasons I stated above.

Cheers,
Dean
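For illustration, the "single job management scheme" idea boils down to one queue of self-contained jobs drained by a pool of workers (SPUs in the real thing, threads in this toy). This is purely illustrative: it is not SPURS or any Sony API, just the general shape of the approach.

```python
# Minimal sketch of a shared job manager: a single queue of jobs that
# a pool of workers drains. Not SPURS or any real Sony API -- purely
# illustrative of the idea being discussed.
import queue
import threading

jobs = queue.Queue()
results = []
results_lock = threading.Lock()

def worker():
    while True:
        job = jobs.get()
        if job is None:                    # sentinel: no more work
            break
        output = job["fn"](*job["args"])   # run the job's kernel
        with results_lock:
            results.append((job["name"], output))

# Any system (animation, physics, audio...) submits to the one queue --
# agreeing on a single scheme is what lets different studios' code
# share the workers, as described above.
for i in range(4):
    jobs.put({"name": f"skin_mesh_{i}", "fn": lambda n: n * n, "args": (i,)})

workers = [threading.Thread(target=worker) for _ in range(2)]
for w in workers:
    w.start()
for _ in workers:
    jobs.put(None)                         # one sentinel per worker
for w in workers:
    w.join()
print(sorted(results))
```

All job names and fields here are made up for the sketch; the point is only that every system speaks to one scheduler rather than each rolling its own.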
 
From what I have read, Sony has two different codebases: the one included in the SDK, and another one for first/second parties (the SCE World Wide Studios codebase).
I'll get straight to the point on this.. your post makes it sound like SCEI have one magical SDK for 1st/2nd party teams, and one slightly crummier SDK for 3rd parties.. this is absolutely not the case. We *all* use the SCEI supplied SDKs.

As far as multiplatform-style development goes, as far as I'm aware (and hey.. it's New Year's Day, so my head is a little rough at the moment), PSSG is the only thing that falls into that category. And even that is not a core component of the distributed SCEI SDK.. it's an optional download.

libraries and developed by teams operating as part of (or for) SCE WWS. The only visible result of this is the "flag algorithm" (Motorstorm, GTHD and Heavenly Sword flags look mostly the same).
I don't know how you came to that conclusion (in short, I'd love to know what you read in order to come up with the details in your posting), but it's completely wrong. That's like me saying because all those games have characters that are skeletal, they must be using the same skeletal animation and rendering system.

And besides, flag systems are 10 a penny.. they're not the hardest thing to do on any system..

Cheers,
Dean
 
As DeanoC said, SPURS is the job manager included in the SDK for everyone, but Ninja Theory uses a more advanced version from the SCE WWS codebase. From what I have read, Sony has two different codebases: the one included in the SDK, and another one for first/second parties (the SCE World Wide Studios codebase).

This is not exactly true. I don't think Sony has 2 different SDKs; more likely internal studios share the same kind of technology.


- The codebase for first/second parties is more PS3-specific and includes both libraries developed by Sony and libraries developed by teams operating as part of (or for) SCE WWS. The only visible result of this is the "flag algorithm" (Motorstorm, GTHD and Heavenly Sword flags look mostly the same).

The SDK is almost the same, but Sony has very talented teams that can customize the code and share it; probably EA, for example, does the same.
 
Oh, seems like I missed this one. Gears does leverage Unreal, but Insomniac has also been leveraging an engine that Naughty Dog had written and tweaked over many, many years as well. Seems like a fair comparison to me.

Yes, Insomniac did use ND's engine, but that was on PS2; we're discussing PS3. Resistance is using a modified version of Ratchet & Clank's engine (which is based on ND's PS2 engine), so I'd assume there are still some PS2 hardware limitations present in the engine itself. Comparing a next-generation game that uses a tweaked last-generation engine, one that had little to no support for any next-generation effects in the first place, to an engine that was purpose-built for the next-generation hardware is, I feel, extremely unfair. :)
 
Resistance is using a modified version of Ratchet & Clank's engine (which is based on ND's PS2 engine)

Where did you hear that from? I always thought Ratchet & Clank (PS3) was using the resistance engine, just improved for obvious reasons like time. Neither of which was developed by ND.
 
Where did you hear that from? I always thought Ratchet & Clank (PS3) was using the resistance engine, just improved for obvious reasons like time. Neither of which was developed by ND.

:???: Resistance is using a modified version of PS2's R&C engine, and PS2's R&C engine is a modified version of ND's Jak games' engine. How could they use the PS3 R&C engine when the game hasn't been properly unveiled? Maybe I didn't explain myself very well in my post :???:
 
This is not exactly true. I don't think Sony has 2 different SDKs; more likely internal studios share the same kind of technology.




The SDK is almost the same, but Sony has very talented teams that can customize the code and share it; probably EA, for example, does the same.
As Dean has stated, there is one SDK for all; on top of that, developers build libraries etc. Most large publishers have some libraries that can be used by any of their teams; for example, EA has RenderWare tech that its teams can use if they want. Same for Sony WWS (which is the games-publisher side of Sony and has almost nothing to do with the SCEI PS3-maker side of the company): teams who are published by them (which obviously includes 1st parties, but also people like nt and evo) have access to that shared tech.
WWS is not that different from EA with regard to access to 'SCEI' (which is what most people mean when they say 'Sony').
It's also worth noting that which bits a team uses is up to them. In some cases we choose not to use WWS shared tech, and use our own or buy middleware in.

BTW, flag tech isn't shared. But given it's a standard Verlet system, it's not surprising they look the same as each other.
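For readers wondering what "a standard Verlet system" means here: cloth and flags are typically simulated as a grid of particles integrated with position Verlet, plus iteratively relaxed distance constraints between neighbours. A toy 1-D version, with made-up numbers, just to show the shape of it (no relation to any shipped flag code):

```python
# Toy position-Verlet "rope" (a 1-D stand-in for a flag's cloth grid).
# Each particle keeps current and previous position; velocity is
# implicit in their difference. Distance constraints between
# neighbours are relaxed a few times per step, and pinned particles
# (the flagpole attachment) are snapped back each pass.

def verlet_step(pos, prev, accel, dt):
    new = [2 * p - q + a * dt * dt for p, q, a in zip(pos, prev, accel)]
    return new, pos           # old positions become the new "previous"

def satisfy_constraints(pos, rest_len, pinned, iterations=3):
    for _ in range(iterations):
        for i in range(len(pos) - 1):
            delta = pos[i + 1] - pos[i]
            # Move both particles half the error each, toward rest_len.
            corr = (abs(delta) - rest_len) * (1 if delta > 0 else -1) / 2
            pos[i] += corr
            pos[i + 1] -= corr
        for i in pinned:
            pos[i] = 0.0      # re-pin the attachment point
    return pos

pos = [float(i) for i in range(5)]      # particles spaced 1.0 apart
prev = list(pos)                        # starting at rest
for _ in range(10):                     # constant "wind" acceleration
    pos, prev = verlet_step(pos, prev, accel=[0.5] * 5, dt=0.1)
    pos = satisfy_constraints(pos, rest_len=1.0, pinned=[0])
print(pos)
```

The appeal for SPUs is that the integration step is a trivially parallel pass over a flat array, and the constraint relaxation is just a few more simple passes, which is why everyone's flags end up looking alike.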
 
Question for the devs: How effective is 3Dc on the Xenos and is there any reason not to use it for Xbox 360 games that are exclusive to the platform. Is there a catch to using it?
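For context on what 3Dc is: it stores only the X and Y components of a unit tangent-space normal in two block-compressed channels, and the shader reconstructs Z. A sketch of just the reconstruction step (illustrative; the real format also block-compresses the two channels, similar to DXT5's alpha blocks):

```python
# 3Dc keeps only X/Y of a unit normal; Z is rebuilt in the shader as
# sqrt(1 - x^2 - y^2), which works because tangent-space normals
# point "outward" (z >= 0).
import math

def decode_3dc_normal(x_byte, y_byte):
    # Map the stored 0..255 channel values back to [-1, 1].
    x = x_byte / 255.0 * 2.0 - 1.0
    y = y_byte / 255.0 * 2.0 - 1.0
    z = math.sqrt(max(0.0, 1.0 - x * x - y * y))
    return (x, y, z)

n = decode_3dc_normal(128, 128)  # roughly a flat-on (0, 0, 1) normal
print(n)
```

The trade-off implied by the question: you spend a few shader instructions on the reconstruction in exchange for two high-precision channels instead of squeezing normals into a colour format.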
 
:???: Resistance is using a modified version of PS2's R&C engine, and PS2's R&C engine is a modified version of ND's Jak games' engine. How could they use the PS3 R&C engine when the game hasn't been properly unveiled? Maybe I didn't explain myself very well in my post :???:

Hm? According to this, it was simply a case of Naughty Dog and Insomniac sharing some code (which went both ways). Nothing about modified engines at all. I couldn't find anything on it, but AFAIK the Resistance engine was all-new as well, built from the ground up for PS3. How often are components of a PS2 engine going to be useful for PS3 development (with the aim of being competitive with, or even better than, most other engines out there on release)?
 
:???: Resistance is using a modified version of PS2's R&C engine, and PS2's R&C engine is a modified version of ND's Jak games' engine. How could they use the PS3 R&C engine when the game hasn't been properly unveiled? Maybe I didn't explain myself very well in my post :???:

I always thought, as TurnDragoZeroV2G said, that Resistance was built from the ground up for the PS3, and that Ratchet & Clank for the PS3 (coming in '07, I believe) is an extension of the Resistance engine, as there has been more development time to work on it. I don't think either game has anything to do with ND code or any PS2 engines.
 
Where did you hear that from? I always thought Ratchet & Clank (PS3) was using the resistance engine, just improved for obvious reasons like time. Neither of which was developed by ND.


Ratchet & Clank is using their upcoming engine, named Isla or something like that; I don't quite remember.
 
This is not entirely correct. The EDRAM is a scratch pad memory, not storage memory; you can't really store anything in there. You render to it, immediately resolve, clear, rinse and repeat. I wouldn't count it as available memory. You still need to resolve the EDRAM content to a framebuffer in main memory that will be displayed. Some memory can still be saved because the framebuffer in main memory is not using MSAA (EDRAM will resolve the samples), and with correct timing you don't need to double-buffer it. But a framebuffer ready to be displayed is still present in main memory.

Thinking out loud.

A 720p 4xMSAA framebuffer is ~ 30MB. A fully resolved framebuffer is (taking a stab at this) ~ 3.5MB? Someone feel free to correct my math.
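The arithmetic behind those two estimates, assuming 32-bit colour and 32-bit depth per sample:

```python
# 720p render-target sizes, assuming 32-bit colour and 32-bit depth.
W, H = 1280, 720
BYTES_COLOUR = BYTES_DEPTH = 4
MB = 1024 * 1024

# With 4xMSAA, both colour and depth carry four samples per pixel.
msaa4 = W * H * 4 * (BYTES_COLOUR + BYTES_DEPTH)
# The resolved framebuffer is a single 1x colour image.
resolved = W * H * BYTES_COLOUR

print(f"4xMSAA colour+depth: {msaa4 / MB:.1f} MB")    # ~28 MB
print(f"resolved 1x colour:  {resolved / MB:.2f} MB") # ~3.5 MB
```

So the stab in the post above is about right: roughly 28 MB for the full 4xMSAA colour+depth targets, and about 3.5 MB for the resolved image.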

I think this is where the majority of the difficulty in discussion forums comes in. How you look at things, how resources are being used, and how things are stated can all make a big difference in how people perceive them.

e.g. Fran notes that there are savings, but there is still a small framebuffer in system memory. His answer wasn't the simplest, but it was fair and accurate as far as I could tell. I think scratch pad is a better description of what eDRAM does, but one way or another it does (indirectly) save some system memory. This is one of the reasons 1-to-1 comparisons are so difficult: the PS3 and Xbox 360 use resources differently. The classic example we always bicker about is system bandwidth for graphics. It isn't accurate or fair to say RSX has ~45GB/s of bandwidth and Xenos 22.4GB/s of bandwidth for graphics rendering; likewise it isn't accurate to use the 278.4GB/s spec for the Xbox 360 for graphics rendering either (because not all that bandwidth is available to all graphics rendering tasks, and the system pool is also shared with the CPU).

I guess what I am trying to say is that how each machine's design impacts the available resources differs depending on what is being attempted. And I think this is where many of the misunderstandings and disagreements we see on the forums come from.
 
A 720p 4xMSAA framebuffer is ~ 30MB. A fully resolved framebuffer is (taking a stab at this) ~ 3.5MB? Someone feel free to correct my math.

That's correct. You basically save two 4X (or 2X) render targets (colour and depth), and only need a resolved 1X framebuffer, unless you also need to resolve the depth buffer for post-processing, which is common. But it must also be said that if you are doing post-processing, and there's really no reason not to have a post-processing step nowadays, you can probably easily reuse that video memory for other purposes on PS3 after you have manually resolved them. In this common scenario, there probably wouldn't be a big memory saving in having EDRAM.

e.g. Fran notes that there are savings, but there is still a small framebuffer in system memory. His answer wasn't the simplest, but it was fair and accurate as far as I could tell. I think scratch pad is a better description of what eDRAM does, but one way or another it does (indirectly) save some system memory. This is one of the reasons 1-to-1 comparisons are so difficult: the PS3 and Xbox 360 use resources differently. The classic example we always bicker about is system bandwidth for graphics. It isn't accurate or fair to say RSX has ~45GB/s of bandwidth and Xenos 22.4GB/s of bandwidth for graphics rendering; likewise it isn't accurate to use the 278.4GB/s spec for the Xbox 360 for graphics rendering either (because not all that bandwidth is available to all graphics rendering tasks, and the system pool is also shared with the CPU).

I wasn't very clear in my post. I was under the impression that EDRAM was being seen as the 'framebuffer', which is not really the case, since the EDRAM is a write-only memory and nothing can read from it. The EDRAM can 'write itself' (resolve) to main memory. The DAC can only read the framebuffer from main memory, where it effectively lives.

On the other hand, it's true that rendering to EDRAM will save some memory compared to a more traditional architecture, but you pay for this saving by having to deal with predicated tiled rendering. There's no free supper :)
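To illustrate where predicated tiling comes from: a 720p 4xMSAA colour+depth target doesn't fit in the 10 MB of EDRAM, so the frame has to be rendered in strips, each resolved to main memory in turn (sizes below assume 32-bit colour and 32-bit depth per sample):

```python
# Why predicated tiling exists: the full 4xMSAA target exceeds EDRAM,
# so the screen is split into tiles that each fit, rendered one by one.
EDRAM_BYTES = 10 * 1024 * 1024
W, H, SAMPLES = 1280, 720, 4
BYTES_PER_SAMPLE = 4 + 4              # 32-bit colour + 32-bit depth

target_bytes = W * H * SAMPLES * BYTES_PER_SAMPLE
tiles = -(-target_bytes // EDRAM_BYTES)    # ceiling division
rows_per_tile = H // tiles

print(f"target: {target_bytes / 2**20:.1f} MB -> {tiles} tiles "
      f"of ~{rows_per_tile} rows each")
```

The cost alluded to above is that geometry overlapping more than one strip has to be resubmitted (or predicated away) per tile, which is the "dealing with" part.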
 