PS3 hardware design choices - good or bad? *spawn

It makes sense. The PS3 timeline may have had to adapt to the HD DVD vs. Blu-ray battle as well as pressure from MS.

I also believe that Sony got scared by what MS offered with the first Xbox (no matter how it turned out) and what they could come up with for its successor; hence the huge amount of money sunk into Cell (the Broadband Engine). I believe it was already a losing proposition to try to compete head to head with MS (focusing on hardware only).

For the Vita I think the guy is right on the money. It's madness that the PSV and the Xperia Play are not the same thing; I've been saying it for a long while.
Likewise, it's madness for Nintendo to ship a Wii U controller that is not the DS successor. For a company that has so few products, that lack of foresight is unbelievable...
 
If Sony made unusual mistakes with PS3 due to fear of the 360's launch timetable, perhaps they'll be able to avoid making them with PS4.

Sony's parochial blinders (Japan's building Vita as if a new PSP was all they needed in the post-iPhone world) and the arrogance that the author talks about starting in the PS2 era bode poorly for Sony, though.

Sony working with (buying?) Gaikai seems like a promising sign that they are hungry and paying attention now, though, so perhaps they'll make something happen this next gen.

That's all OT, though.

It was interesting reading that Sony really had been working with IBM (and Toshiba?) on a graphics solution for PS3. Thank goodness that got shot down. Even if PS3 came out too soon with a bandwidth starved GPU, it has to be better that they used a mainstream part than if they'd tried for a GS2 that continued to go against the rest of the market's direction in graphics processing.
 
It was interesting reading that Sony really had been working with IBM (and Toshiba?) on a graphics solution for PS3. Thank goodness that got shot down. Even if PS3 came out too soon with a bandwidth starved GPU, it has to be better that they used a mainstream part than if they'd tried for a GS2 that continued to go against the rest of the market's direction in graphics processing.
I indeed believe that at the time Sony failed to properly assess their strengths and weaknesses.
Going with IBM is a sign that they were no longer able to deliver by themselves. They no longer had an advantage over either Nintendo or MS; it was mostly about buying a design, and MS had more than what it takes to compete.
They also failed to really assess where the technology was heading (if the 360 was an unknown, the PC was not). They knew unified (GPU) architectures were coming, or should have guessed (which goes back to point one, i.e. their declining technological prowess).

One thing is that they did not underestimate the Xbox's reception in the US (pretty good imho), so they hurried up. But if they were thinking of launching later than they did, they were also underestimating Nintendo and its Wii (which should have been replaced already).

Overall, my belief is that for Sony the right system might have been more of a super Wii than what we got: a CPU trying to be a supercomputer on a chip, plus a PC GPU.
The push to HD came too early. It was too costly to make a suitable system for HD, etc.
There are still facts technophiles here like to ignore: Gears of War 2 was played on SDTVs roughly 50% of the time. Plenty of 360s are not connected to a network. The same may apply to the PS3.

Looking back, I'm still in awe at how Sony could not manage to kick MS out of the market. Definitely something to study in business school.

Looking forward, the problem for Sony is that MS is in their seat in the US now, and even if they make good decisions, MS can do so too and is in another league at this point with regard to financial shape as well as technical prowess.
 
Sony working with (buying?) Gaikai seems like a promising sign that they are hungry and paying attention now, though, so perhaps they'll make something happen this next gen.

Jason Garrett-Glaser (aka Dark Shikari), one of the lead developers of x264, who also works for Gaikai, said on the x264 developers' IRC channel that the E3 announcement does not involve Sony.
 
Overall, my belief is that for Sony the right system might have been more of a super Wii than what we got: a CPU trying to be a supercomputer on a chip, plus a PC GPU.
The push to HD came too early. It was too costly to make a suitable system for HD, etc.


MS lost, what was it, the 1 billion dollars it took to make the 512 MB system happen? I guess Sony probably never would've had the GDDR3 then, since IIRC it was their way of matching the 360's RAM amount without the much higher cost of doubling the XDR RAM, while at the same time giving the RSX a dedicated pool of VRAM (not sure on the story, but I figure as much, though the RSX would've been even worse off without a dedicated framebuffer). Neither console needed 512 MB of RAM to make HD a reality. Look at some of the best-looking PC titles from 2004 and 2005 that really show off what can be done within such a "limited" memory footprint. The extra RAM gave devs an excuse to focus more on graphics instead of game design. We definitely would've seen more 60 FPS titles, that's for sure. We still could have had 720p with 2x MSAA titles; texture resolutions would "suffer" but would still be high enough to make 1280x720 worthwhile, along with the high geometry counts that we still would've had anyway. I can't imagine playing even Far Cry PC, FEAR or Black and White 2 below such a resolution, all titles that I think would've been comfortable on a 256 MB Xbox 360 or PS3.
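For scale, the framebuffer cost of that "720p with 2x MSAA" target is easy to estimate. A quick sketch, assuming 32-bit color plus a 32-bit depth/stencil buffer per sample (the format choice is my assumption, not from the post):

```python
# Rough framebuffer budget for 1280x720 with 2x MSAA.
# Assumes 32-bit color + 32-bit depth/stencil per sample (my assumption).
width, height, msaa = 1280, 720, 2

bytes_per_sample = 4 + 4                 # color + depth/stencil
sample_bytes = width * height * msaa * bytes_per_sample
resolve_bytes = width * height * 4       # resolved back buffer for scanout

total_mb = (sample_bytes + resolve_bytes) / (1024 * 1024)
print(round(total_mb, 1))  # ~17.6 MB, a small slice of even a 256 MB console
```

Under those assumptions the render targets alone are well under 10% of a 256 MB pool, which supports the point that the RAM pressure comes from assets, not from the HD framebuffer itself.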

Apparently post v2.2 MSI Afterburner measures VRAM usage on AMD cards now, so I think I'll get to measuring VRAM amounts :p

I just think that developers are hurting themselves by letting industry pressure define how they make and produce titles. Perhaps if they budgeted more towards game design instead of super crazy awesome graphics, and showed off innovative game design more openly to the press, the gaming world would get more on board with whatever they had to present.

Finally,
 
I can't imagine playing even Far Cry PC, FEAR or Black and White 2 below such a resolution, all titles that I think would've been comfortable on a 256 MB Xbox 360 or PS3.

Given that the PS3's OS at launch used ~135 MB or so of RAM, and that it took them a long time to get it under 100 MB, I'm left thinking you may not have been happy with how games would have looked on a 256 MB PS3.
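Worth spelling out the arithmetic (the ~135 MB and sub-100 MB figures are the poster's recollections, not official numbers):

```python
# What a 256 MB PS3 would have left for games, given the OS footprint
# figures recalled above (poster's recollection, not official numbers).
total_ram_mb = 256
os_at_launch_mb = 135
os_after_slimming_mb = 100

print(total_ram_mb - os_at_launch_mb)      # 121 MB left for games at launch
print(total_ram_mb - os_after_slimming_mb) # 156 MB after later firmware slimming
```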
 
Short answer:
* I see no real reason for the resolution to be "fixed" arbitrarily for marketing purposes. Nothing would have prevented the system from rendering a few titles in 720p (or close).

* I'm not sure I get your comment with regard to RAM. I don't see how you're connecting the dots between geometry, game design, etc. Possibly an English issue on my side.

* When I say a super Wii, I mean something more powerful (by powers of 2) but sharing some design traits with the Wii, mainly staying at a safe distance from the PC approach to high-performance 3D rendering.
By the way, such a system would already have been replaced by now, so some of the limitations would have been a lot less relevant.
 
Given that the PS3's OS at launch used ~135 MB or so of RAM, and that it took them a long time to get it under 100 MB, I'm left thinking you may not have been happy with how games would have looked on a 256 MB PS3.
I thought it was closer to 64, but I don't recall where I saw that...
 
I thought it was closer to 64, but I don't recall where I saw that...

It was really bad at launch, well over 100 MB, which left many of us in a bit of a panic as we then had to fit a square peg (the 360 build) into a round hole (the PS3 build). Good times :)
 
Short answer:
* I see no real reason for the resolution to be "fixed" arbitrarily for marketing purposes. Nothing would have prevented the system from rendering a few titles in 720p (or close).

HD was and still is a big selling point of the 360 and PS3. Yes, the number of households/users with HDTVs and monitors has grown immensely in the last 7 years, but I think the normal uninformed public still adheres to the golden tone of someone or something saying "HD", especially with the rise in resolution on mobile devices.

* I'm not sure I get your comment with regard to RAM. I don't see how you're connecting the dots between geometry, game design, etc. Possibly an English issue on my side.
The state of game design wouldn't have been much affected by half the RAM in the current systems. What we see now is just an outgrowth of last gen for the most part, the exceptions being the Wii and Kinect. The large amount of RAM, I think, is just another excuse for devs to continue overtargeting visuals instead of spending valuable time and money developing gameplay. Just look at how many games have framerate issues, most likely due in part to pushing graphics too hard. Yes, a half-size memory footprint has the potential to make things worse, but it prevents overtaxing the GPU.

* When I say a super Wii, I mean something more powerful (by powers of 2) but sharing some design traits with the Wii, mainly staying at a safe distance from the PC approach to high-performance 3D rendering.
By the way, such a system would already have been replaced by now, so some of the limitations would have been a lot less relevant.

I remember the rumors of the dual core 1.5 GHz PPC + X1600 Wii spec, and in the end such a system would've worked to Nintendo's advantage in the long run. The GC commonality I think worked against them, making cheap shovelware way too ubiquitous and easy to make. Such a thing couldn't work too well on the HD sets, since the minimum level they had to aspire to would be much greater, and therefore more expensive.
 
Sony's parochial blinders (Japan's building Vita as if a new PSP was all they needed in the post-iPhone world) and the arrogance that the author talks about starting in the PS2 era bode poorly for Sony, though.

So according to the author of that article, a forward thinking non-blinkered Sony should have released an abortive phone that would've been creamed by the major smartphone makers and rejected by portable gamers for being the abortive phone that it is. This was supposed to compete with the 3DS? Or was it supposed to compete with the iPhone? In either case, how?

I already have a smartphone, and it does its job very well. I don't want another pricey and rather redundant smartphone, but I do want something portable that's actually good for gaming. Luckily Sony released the Vita: it's cheap, technically sophisticated, offers the best controls on a portable, and can deliver content through an app store. This isn't some pre-iphone dark ages device.

It was interesting reading that Sony really had been working with IBM (and Toshiba?) on a graphics solution for PS3. Thank goodness that got shot down. Even if PS3 came out too soon with a bandwidth starved GPU, it has to be better that they used a mainstream part than if they'd tried for a GS2 that continued to go against the rest of the market's direction in graphics processing.

Unless it was awesome. It's hard to speculate without knowing what the custom Toshiba part was like.
 
HD was and still is a big selling point of the 360 and PS3. Yes, the number of households/users with HDTVs and monitors has grown immensely in the last 7 years, but I think the normal uninformed public still adheres to the golden tone of someone or something saying "HD", especially with the rise in resolution on mobile devices.
I do agree. Still, you've got to admit that a lot of people played on SDTVs for a long part of this generation. If the generation had been a more reasonable length, the issue would have been further lessened.
For the ignorant public a lot of games are @1080p, if you see what I mean ;)
Still, I'm OK to concede that HD (720p) was a reasonable target at that point in time.
But I still think that at that point in time silicon budgets were too constrained. I'm happy with neither form of the PS360, and that's after 7 years.
I think a compromise had to be made; more on that later.
The state of game design wouldn't have been much affected by half the RAM in the current systems. What we see now is just an outgrowth of last gen for the most part, the exceptions being the Wii and Kinect. The large amount of RAM, I think, is just another excuse for devs to continue overtargeting visuals instead of spending valuable time and money developing gameplay. Just look at how many games have framerate issues, most likely due in part to pushing graphics too hard. Yes, a half-size memory footprint has the potential to make things worse, but it prevents overtaxing the GPU.
I do not agree. I would say what you're describing is more a side effect of the move to AAA games than anything else. Everything has to be pushed to eleven. The competition is rough, so risk with regard to gameplay or genres is avoided as much as possible. Overall I would say that nowadays games are way better than they were before on average. I mean, you remember all those old games that more often than not turned into slide shows? There are still issues, but I think it's way better now.

I remember the rumors of the dual core 1.5 GHz PPC + X1600 Wii spec, and in the end such a system would've worked to Nintendo's advantage in the long run. The GC commonality I think worked against them, making cheap shovelware way too ubiquitous and easy to make. Such a thing couldn't work too well on the HD sets, since the minimum level they had to aspire to would be much greater, and therefore more expensive.
Well, as much as I dislike the Wii hardware, history has proved it was the right choice. It made them a hell of a lot of money.
I think that what will have hurt Nintendo (if the Wii U fails) is holding on to the Wii for too long, when imho they had a shot at pushing something new one or two years ago: a real half-gen upgrade, which might have embarrassed both Sony and MS, who could not really afford to move forward at that point in time.

What I mean by a super Wii may not have been clear. I mean a system that would have been to the PS2 what a "super Wii" would be to the GameCube.
Like Nintendo, they should have avoided the too-modern PC GPU. I mean, they are great, but moving from the T&L/DirectX 7 era to the RSX/DirectX 9.0c era cost a lot of transistors.
Especially if aiming at HD resolutions, the saved transistors might have proved useful.

I have a tough time figuring out what Sony's technical judgement on the PS2 hardware was. I'm mostly ignorant of the PSP design, and a quick look gave me the impression that they were moving away from the PS2 approach with regard to how geometry was handled.
In the PS2 the EE was in charge of all the geometry, and the GS was a rasterizer/fragment processor (for lack of a better word).
In the PSP (going by the wiki, so I might be misled completely) it looks like the GPU is more complete.

So if I get it right, I would assume that Sony came to the conclusion that the approach in the PS2 was not the most efficient. To me that makes Cell even more of a strange choice, as it looks like it's trying to do what the EE was doing and much more.

Point is, the PS2 achieved great things; they should have tried to push the design further.
Could they, by themselves, have designed more complex MIPS cores, vector units, and a GPU?
Even without aiming at Cell or RSX levels of performance, I believe the answer was no.
Did they need to aim that high? My belief is no as well.

My belief is that in 2006 they could and should have produced something more akin to the PSV.
Whatever architecture they would have used for the CPU (be it MIPS, ARM or PowerPC) is irrelevant.
As I see it, the problem is not only that they were at the limit of what they could do, but at the limit of their expertise. I think there is quite some truth to Bill Gates' comment about them "not knowing what they were doing". They went for goals that were way above their expertise on the matter.
I believe Sony could not have designed even a simple (if simple is a proper description for this) out-of-order superscalar MIPS processor, yet at the time they wanted to build Cell and beat the whole world.
Same for the GPU (which it seems was aborted early): they wanted to compete with companies that were pushing out multiple GPUs a year, investing a lot, etc. In short, companies for whom GPUs were the actual business.

To my mind they got ahead of themselves; they lost control of the project.

It seems to me that they wanted an EE2 and a fragment processor, so a system acting more or less like the PS2.
Sony should have considered more seriously the successful solutions used by CPU and GPU manufacturers to make their products better, instead of trying to go for their own. It's not like those solutions could not have been fitted into their design choices.

Looking at both the PS2 and the rest of the industry, the reasoning could/should have gone something like this.
1) We need/want a better/faster CPU.
* What do we have? A low-clocked, in-order, narrow CPU with a super short pipeline and no L2 cache.
* What are the industry leaders doing? Wider CPUs, longer pipelines, out-of-order execution, L2 caches.
* Was that an option when the PS2 was designed? No: too complicated, too big, costs a lot of power.
* Is it still not an option? The answer should have been no; by now it was feasible.
* Should we pursue the level of performance the industry leaders are providing? Another no: too big, too hot.
* Can we do it ourselves? Looks like it would have been another no.
=> We need a pretty simple but modern CPU, that's to say a longer pipeline, higher clock speed, on-chip L2.
That sounds like nothing, but the Pentium Pro, clock for clock, beat the original Pentium by between 25% and 35%. On average another 30% (clock for clock) was gained by integrating the L2 on die (from Katmai to Coppermine). Was that prohibitive for a system launching in 2006? No: Coppermine (the Xbox CPU, by the way) was worth 29 million transistors with its 256 KB of L2.
And those were early implementations of these techniques.
Including further refinements and a somewhat bigger L2, and starting from the EE CPU, we might reach 200% of the sustained performance clock for clock. At 1.2 GHz, for example, that's 8 times the sustained performance; still not marketable GFLOPS figures :(
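The back-of-the-envelope above works out as follows (the ~300 MHz EE baseline and the 2x clock-for-clock gain are the post's own rough figures, not measurements):

```python
# Sketch of the post's estimate: ~2x sustained perf clock-for-clock,
# times the clock jump from the EE (~300 MHz) to a 1.2 GHz part.
ee_clock_mhz = 300          # Emotion Engine CPU core, roughly (294 MHz actual)
target_clock_mhz = 1200
perf_per_clock_gain = 2.0   # "200% of the sustained performance clock for clock"

total_speedup = (target_clock_mhz / ee_clock_mhz) * perf_per_clock_gain
print(total_speedup)  # 8.0, i.e. the "8 times the sustained performance" above
```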

* Where are they heading? Multi-threading and symmetric multi-processing.
* Why? They're hitting the wall on single-thread performance, mostly for three reasons: clock speed, power, and main memory latencies.
* Should we get there? SMT sounds like an easy yes; SMP was indeed a tough arbitration, and they might instead have decided to squeeze out some extra single-thread performance. Intel and AMD barely made it around 2005/6 (I don't remember for sure when the first real dual cores happened; AMD was first).

=> Summing up: we need to design (or buy) a CPU with good single-thread performance. Pursuing high clock speeds is an iffy prospect. We are sure we need good SIMD. We are looking for a huge jump in performance.

2) We need/want better vector units.
* Do we need two of them?
Like SMP, it makes things more complicated. Software people are not that willing to adapt to SMP, and the same may apply to vector processors.
Looking at how modern GPUs operate, geometry is handled more efficiently by more specialized unit(s) within the GPU; that's what T&L and, in more modern GPUs (at the time of the PS3's design), vertex pipelines were handling. Let's go for one.
* How do we make it faster?
Higher clock speed, wider SIMD, implement scatter/gather.
* Do we want memory coherency? Yes.
* How do we keep it busy? No choice: SMT to hide memory latency. OoO execution sounds way too complex to introduce.
* Can we do it alone? No.

=> We need to develop a heavily threaded (like 4-way) vector processor sharing the same memory space as the CPU. Actually, we may want to reuse the result in other devices.
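As a loose illustration of the scatter/gather requirement (plain Python standing in for SIMD lanes; no real ISA implied):

```python
# Toy model of SIMD gather/scatter: gather loads lanes from arbitrary
# addresses in one operation, scatter writes them back the same way.
memory = [10, 20, 30, 40, 50, 60, 70, 80]
indices = [7, 2, 5, 0]

lanes = [memory[i] for i in indices]        # gather -> [80, 30, 60, 10]

for i, v in zip(indices, [x * 2 for x in lanes]):
    memory[i] = v                           # scatter doubled lanes back

print(lanes)   # [80, 30, 60, 10]
print(memory)  # [20, 20, 60, 40, 50, 120, 70, 160]
```

Without hardware gather/scatter, each of those lane accesses becomes a separate scalar load or store, which is exactly the overhead the requirement is meant to avoid.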

3) We need a better GPU.
* Can we pass on handling geometry on the GPU? (See above.) No.
* What are the manufacturers doing? Moving away from the fixed-function pipeline.
* How is that going? (At the time of design) not that well. As I remember, the DirectX 8.0 and 8.1 GPUs were more state-of-the-art DirectX 7 GPUs and took serious performance hits in DirectX 8 mode and above. So it's promising but costly.
* What are the most noticeable improvements for the average end user? Texture filtering and MSAA greatly improve the IQ.
* Is bandwidth a constraint? Yes.
* What are the solutions? EDRAM or TBDR.
* Do we aim for HD? If yes, EDRAM is bothersome.
* Side note: EDRAM is costly and limits the processes one can use.

=> We need to design (or buy) a GPU with a really strong T&L engine (stronger than what was on the market, as manufacturers were moving away from it).
We need a GPU that makes optimal use of bandwidth and can handle high levels of texture filtering and anti-aliasing.
---------------------

I'm kind of toying here, but it looks a lot like what the students were doing when I worked for a business school: stay pretty open-minded, see what competitors are doing, what works and what doesn't, etc.
Then you match that against your strengths and weaknesses and see how it turns out.
I don't claim that what I wrote is correct, just that I find it at least more coherent than what the PS3 project looks like from the outside.

With regard to partnerships, I would have hoped that Sony and Toshiba had joined forces and been up to the task of delivering the CPU and VPU. Either way, IBM was the only option.
On the GPU side I would have wished for a partnership between Sony and PowerVR (or whoever owned them at the time), but anything truly custom from either NV or ATI would have worked, EDRAM or not.

As a side note, I would have wished Sony had noticed that RAM seems to be a godsend for developers and that more RAM can go a long way. I would have wished the RAM budget were higher in the PS3, even if that meant cutting corners elsewhere. One option would have been to go with cheap RAM.
A 1 GB unified pool of RAM would have done great, imho.
 
I ended up staying too late yesterday, but after that kind of exercise a question arose in my mind. A technical one.

Let's say that, either as a design choice or because of silicon and power budget constraints, one had passed on a completely programmable GPU.
The choice would then come down to: which part of the GPU would you (devs) favor as the programmable part?
A T&L engine supporting a programmable pixel pipeline (say shader model 2.0b)? Or the other way around?

I myself would say the latter, but I don't know better.

___________________

With regard to how the system would have ended up based on the above requirements, it could have looked like this.

=> One OoO core with strong SIMD; 2-way SMT doesn't make it into the design.

=> The VPU ends up 16 wide; most of the requirements didn't make it into the device, as it was too complex. It ends up with a local store and a DMA engine. The ISA is closer to Larrabee's than to the SPU's.

=> Sony reassesses the GPU technology, as completely programmable GPUs and unified pipelines are set to be the next step in GPU evolution. No matter MS's failure with the first Xbox, they feel threatened.

=> PowerVR is not an option at the time, so Sony decides to go with Nvidia or ATI. They choose to have a fragment processor only and to have most of the geometry handled by the vector units.

=> Sony chooses a GPU heavily biased toward texturing power and ROPs: so, versus the RSX, trading some pixel pipes for more or better texture units and more or better ROPs.

=> Sony goes with a limited amount of fast (GDDR3) VRAM, 64 MB or 128 MB, to be used for rendering only. Through the CPU, the GPU can read from main RAM.

=> Sony goes for 2 VPUs again.

=> Sony realizes what extra RAM would bring to the system (a bit like what happened between Epic and MS); the system is to ship with 1 GB of DDR2.
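Gathered into one place, the hypothetical system sketched above (all figures are the post's own speculation; none of this is a real design):

```python
# Summary of the post's speculative 2006 alternative PS3 (pure what-if).
hypothetical_system = {
    "cpu":  {"cores": 1, "out_of_order": True, "simd": "strong", "smt": None},
    "vpus": {"count": 2, "simd_width": 16, "local_store": True, "dma": True,
             "isa": "Larrabee-like"},
    "gpu":  {"type": "fragment processor only", "geometry": "done on VPUs",
             "bias": "texture units + ROPs",
             "vram_mb": 128,  # post suggests 64 or 128
             "vram_type": "GDDR3"},
    "main_ram": {"size_mb": 1024, "type": "DDR2"},
}

print(hypothetical_system["main_ram"]["size_mb"])  # 1024
```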
 

It's interesting, though, to think about how much adding that extra RAM drove up the cost of the console while in the end it might have had no real effect on what kind of games developers could've made.

I took some VRAM usage readings yesterday with various PC games, some of them multiplatform. I think it gives us an idea of what can be done even with limited VRAM. Granted, I should also let you know that while I was taking these data points, Win7 itself was using about 110 MB of VRAM, so do the math...

Far Cry 1, maxed, 1080p, no AA, 8x AF .............................. 260 MB VRAM (w/ Win7), ~500 MB system RAM
Far Cry 1, maxed, 720p, 2x AA, 4x AF ............................... about the same as above
CoD4, maxed, 1080p, 4x AA .......................................... 560 MB VRAM (w/ Win7), 735 MB system RAM
CoD4, maxed, 720p, 2x AA ........................................... 400-470 MB VRAM (w/ Win7), 730 MB system RAM
CoD4, maxed, 720p, 2x AA, high textures, min AF .................... 300-320 MB VRAM (w/ Win7), ~500 MB system RAM
CoD4, maxed, 720p, 2x AA, high tex, normal spec/normal, min AF ..... 250 MB VRAM (w/ Win7), ~455 MB system RAM
CoD4, maxed, 480p, 2x AA, high textures, min AF .................... 290 MB VRAM (w/ Win7), ~500 MB system RAM
L4D2, maxed, 1080p, 4x AA, 16x AF .................................. 500-560 MB VRAM (w/ Win7), ~575 MB system RAM
L4D2, maxed, 720p, 2x AA, 4x AF .................................... 400-500 MB VRAM (w/ Win7)
BF2, maxed, 1080p, 8x AA ........................................... 500-600 MB VRAM (w/ Win7)
BF2, maxed, 720p, 2x AA ............................................ 400-500 MB VRAM (w/ Win7)
Oblivion, maxed, 1080p, no AA ...................................... 330 MB VRAM (w/ Win7)
Oblivion, maxed, 720p, no AA ....................................... 304 MB VRAM (w/ Win7)

The values that fascinate me the most are the Call of Duty 4 ones. No wonder it can run so fast on the 360 and PS3: it's not bloated with visual assets. The "CoD4 @ 720p, 2x AA, high texture settings, min AF" reading, while below 360 and PS3 visual spec (raw textures across all three platforms are the same), intrigues me because once you subtract the Win7 VRAM amount to get ~200 MB, it becomes possible to deduce that a 256 MB Xbox 360 or PS3 could still have run a further memory-optimized version of CoD4 at true 720p without too much of the visual acuity reduced (but still at 60 FPS?). Oblivion didn't use as much as I thought it would; console performance issues are probably due to background streaming hiccups (hence the virtual need for an HDD on the 360 version). Of course, these PC VRAM numbers are not too accurate in many cases either, considering PC games can vary greatly in VRAM use (just look at BF2). Recent Source titles like TF2 and L4D2 also vary wildly in VRAM usage, from usually about 300 to 500 MB, regardless of resolution, AA, and AF.
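The subtraction used in that analysis, spelled out (the 110 MB overhead and the readings are this post's own measurements; the 310 MB figure is the midpoint of the 300-320 reading):

```python
# Estimate game-only VRAM by removing the measured Win7 desktop overhead
# from the raw Afterburner readings (all numbers from the table above, MB).
WIN7_OVERHEAD_MB = 110

readings_mb = {
    "CoD4 720p 2xAA high-tex min-AF": 310,  # midpoint of the 300-320 reading
    "Far Cry 1080p noAA 8xAF": 260,
    "Oblivion 720p noAA": 304,
}

game_only = {name: mb - WIN7_OVERHEAD_MB for name, mb in readings_mb.items()}
print(game_only["CoD4 720p 2xAA high-tex min-AF"])  # 200 -> plausibly fits 256 MB
```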

I just really like the CoD4 numbers, and it doesn't surprise me that a game that is multiplatform like it is, and even built for 60 FPS performance on consoles, does not use much VRAM at all.

I think I'll make some more readings.
 
One thing I do think about sometimes is whether they would have included another 256 MB of GDDR3 had they not also included the BRD. Would that have made a significant difference in the long run compared to what they ended up doing? If Microsoft could fit 512 MB on a single 128-bit bus, then they ought to have been able to do the same as well.
 
If Microsoft could fit 512 MB on a single 128-bit bus, then they ought to have been able to do the same as well.

Perhaps, though do recall RSX was an MCM with the four GDDR3 DRAMs on-package (simplification of the motherboard tracing).
 
It's interesting though to think about how adding that extra RAM truly overran the cost of the console and in the end might have had no real effect on what kind of games developers could've made.

I took some VRAM usage readings yesterday with some various PC games, some of them multiplatform. I think it gives us an idea of what can be done, even with limited VRAM. Granted I should also let you know, at the time I was taking these data points down, Win7 was using about 110 MB of VRAM in the readings, so do the math.....

Far Cry 1, maxed, 1080p, no AA, 8x AF: 260 MB VRAM (w/ Win7), ~500 MB system RAM
Far Cry 1, maxed, 720p, 2x AA, 4x AF: about the same as above
CoD4, maxed, 1080p, 4x AA: 560 MB VRAM (w/ Win7), 735 MB system RAM
CoD4, maxed, 720p, 2x AA: 400-470 MB VRAM (w/ Win7), 730 MB system RAM
CoD4, maxed, 720p, 2x AA, high textures, min AF: 300-320 MB VRAM (w/ Win7), ~500 MB system RAM
CoD4, maxed, 720p, 2x AA, high textures, normal specular/normal maps, min AF: 250 MB VRAM (w/ Win7), ~455 MB system RAM
CoD4, maxed, 480p, 2x AA, high textures, min AF: 290 MB VRAM (w/ Win7), ~500 MB system RAM
L4D2, maxed, 1080p, 4x AA, 16x AF: 500-560 MB VRAM (w/ Win7), ~575 MB system RAM
L4D2, maxed, 720p, 2x AA, 4x AF: 400-500 MB VRAM (w/ Win7)
BF2, maxed, 1080p, 8x AA: 500-600 MB VRAM (w/ Win7)
BF2, maxed, 720p, 2x AA: 400-500 MB VRAM (w/ Win7)
Oblivion, maxed, 1080p, no AA: 330 MB VRAM (w/ Win7)
Oblivion, maxed, 720p, no AA: 304 MB VRAM (w/ Win7)

The values that fascinate me most are the Call of Duty 4 ones. No wonder it can run so fast on the 360 and PS3: it's not bloated with visual assets. The "CoD4 @ 720p, 2x AA, high texture settings, min AF" reading intrigues me. While that's below 360/PS3 visual spec (raw textures are the same across all three platforms), once you subtract the Win7 VRAM overhead you get ~200 MB, which suggests that a 256 MB Xbox 360 or PS3 could still have run a further memory-optimized version of CoD4 at true 720p without losing too much visual fidelity (though perhaps not at 60 FPS?). Oblivion didn't use as much as I thought it would; its console performance issues are probably due to background streaming hiccups (hence the virtual necessity of an HDD for the 360 version). Of course, these PC VRAM numbers aren't too accurate in many cases either, since PC games can vary greatly in VRAM use (just look at BF2). Recent Source titles like TF2 and L4D2 also vary wildly, usually from about 300 to 500 MB regardless of resolution, AA, and AF.
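The "do the math" step above can be sketched as a tiny script. This is only an illustration of the subtraction, assuming a flat ~110 MB Win7 overhead; the helper name and the use of 310 MB as the midpoint of the 300-320 MB reading are my own choices:

```python
# Estimate how much VRAM the game itself uses by subtracting the
# OS overhead measured at the time of the readings (~110 MB on Win7).
WIN7_OVERHEAD_MB = 110

# Readings quoted above (midpoints used where a range was reported).
readings_mb = {
    "CoD4 720p, 2x AA, high textures, min AF": 310,
    "Far Cry 1 1080p, no AA, 8x AF": 260,
    "Oblivion 720p, no AA": 304,
}

def game_only_vram(measured_mb, overhead_mb=WIN7_OVERHEAD_MB):
    """VRAM attributable to the game alone, after removing OS overhead."""
    return measured_mb - overhead_mb

for name, mb in readings_mb.items():
    print(f"{name}: ~{game_only_vram(mb)} MB for the game itself")
```

The CoD4 case lands at ~200 MB, which is the figure behind the "a 256 MB console could still have run it" argument.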

I just really like the CoD4 numbers, and it doesn't surprise me that a multiplatform game, one even built for 60 FPS performance on console, doesn't use much VRAM at all.

I think I'll take some more readings.

Are you serious or just kidding? :oops: Do you really believe that half the RAM in the PS3/Xbox 360 would have produced games with the same game design?

Do you know that RAM matters not only for graphics but even more for designing bigger environments in games? Do you really believe that Skyrim, GTA4, GT5's full Nürburgring track, Uncharted 2/3's huge set pieces, etc. would have been possible with 256 MB of RAM?

Have you ever heard that, for developers, the MAIN PROBLEM with multiplatform games ported from Xbox 360 to PS3 comes down to a difference of just 20-30 MB of available RAM between the two consoles? And you're saying that halving the RAM won't affect game design?

I believe the quantity of RAM is one of the most important assets for game design (more assets, bigger levels, more things happening at the same time, etc.).
I even believe that if the PS3/Xbox 360 had 1 GB of RAM they could last years longer before we'd ever need a next-gen console. Current-gen consoles are severely limited by RAM (Battlefield 3 on PC vs. consoles is a good example). Developers are working miracles managing the 512 MB of RAM in these consoles (Assassin's Creed 3 is an incredible example).

And even The Last of Us, with environments that aren't so big but are very rich in detail, is a miracle by Naughty Dog. How they fit all those high-quality assets (sound effects, music, textures, animation, physics, high-poly models, AI routines, etc.) into just 512 MB is simply incredible; they are really pushing streaming, compression techniques, and memory-efficient management on the PS3 to the limits.
 
What I'm saying is that 256 MB of RAM would've been enough to support the same gameplay and styles we have today. There would have been a visual hit, yes, but for the most part we could've had the same titles we currently enjoy.
 
Are you serious or just kidding? :oops: Do you really believe that half the RAM in the PS3/Xbox 360 would have produced games with the same game design?
Never underestimate what can be cut while keeping the structure of a game intact. If CoD4 could be crammed onto the Wii and RE2 could be squashed into a 64 MB cartridge, well, it's impossible to say "can't" until someone has actually tried.
 
DoctorFouad said:
Are you serious or just kidding? :oops: Do you really believe that half the RAM in the PS3/Xbox 360 would have produced games with the same game design?...

Well, that's difficult to judge, imo!
Remember Crysis 2, where lots of people said that Crytek 'dumbed down' the gameplay and sandbox to fit the consoles' limits? Most of the time, the argument was that there wasn't enough memory to handle Crysis 1's open-world gameplay... until they released Crysis on both consoles!
 