Your thoughts on how much RAM will be needed next gen?

"Paul... Yellowstone is not available yet in ANY board, in any segment... the technology is NOT yet mass-produced... and we are already ~mid 2003..."

It was purely an example of how the PS3 using Rambus RAM exclusively would rapidly become cheaper than putting a few chips in a PC. Not to mention, I doubt 512 MB of Yellowstone will cost a lot in a year and a half.

"Uhm... the current GS has 4 MB of 48 GB/s e-DRAM... "

Which, as we know, wasn't enough.

"54.33 MB"

How will this be enough for a system that could very well hit 1000X the power of the PS2? I'm starting to think the PS3 will have VRAM problems again.

And when I said the whole system would be based on RDRAM, I meant Yellowstone. Gotta remember, an entire system based on it would be more cost-effective than putting it in a PC.
 
"How will this be enough for a system that could very well hit 1000X the power of the PS2? I'm starting to think the PS3 will have VRAM problems again."

1) Texture Compression

2) the 1,000x figure is misleading... 1,000x won't be the RAW specs, but more of a figure that represents the polygons/frame achieved with long Vertex and Pixel Programs, several texture layers and other effects which would simply CHOKE the current GS ( especially on the Pixel Shading side of things )...

1 TFLOPS < 1,000 * 6.2 GFLOPS...

3) Texture streaming ( Virtual Texturing anyone ? ;) )...
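The arithmetic behind point 2 is worth spelling out. Taking the commonly cited 6.2 GFLOPS Emotion Engine peak against the rumored 1 TFLOPS PS3 target (both figures from the posts above), the raw ratio is nowhere near 1,000x:

```python
# Rough sanity check on the "1,000x" figure, using the numbers cited above:
# the Emotion Engine's ~6.2 GFLOPS peak vs. the rumored 1 TFLOPS PS3 target.
ps2_gflops = 6.2          # Emotion Engine peak, in GFLOPS
ps3_tflops = 1.0          # rumored PS3 target, in TFLOPS

raw_ratio = (ps3_tflops * 1000) / ps2_gflops
print(f"Raw FLOPS ratio: ~{raw_ratio:.0f}x")   # ~161x, nowhere near 1,000x

# Hence 1 TFLOPS < 1,000 * 6.2 GFLOPS ( = 6.2 TFLOPS ), exactly as stated:
assert ps3_tflops * 1000 < 1000 * ps2_gflops
```

So the 1,000x claim only makes sense as an "effective work per frame" figure, not raw FLOPS.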


"Which, as we know, wasn't enough."

Well, if the GS had HW support for S3TC this would be quite good ( ~1 MB more than Flipper ) ;)
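To see why S3TC support would matter so much for 4 MB of e-DRAM, here is a quick back-of-the-envelope (assuming DXT1's fixed 4 bits per texel against uncompressed 32-bit texels):

```python
# Back-of-the-envelope: texture capacity of the GS's 4 MB of e-DRAM,
# with and without S3TC. DXT1 packs a 4x4 texel block into 8 bytes,
# i.e. 4 bits per texel, vs. 32 bits per texel for uncompressed RGBA.
EDRAM_BYTES = 4 * 1024 * 1024

uncompressed_bpt = 32 / 8            # 32-bit RGBA -> 4 bytes per texel
dxt1_bpt = 8 / 16                    # 8 bytes per 16 texels -> 0.5 bytes/texel

texels_raw = EDRAM_BYTES / uncompressed_bpt   # ~1M texels (one 1024x1024 map)
texels_dxt1 = EDRAM_BYTES / dxt1_bpt          # ~8M texels: an 8:1 win
print(texels_raw, texels_dxt1)
```

An 8:1 ratio is exactly why a few MB of compressed texture cache can punch well above its weight.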
 
Oh yeah, I know you don't multiply everything by 1000 :) I'm not that dumb, I just used it as an example.

I'm not trying to be an ass or anything, i'm just curious.

Even with all the things that can be done on the PS3 to compress VRAM space and stream textures, do you think developers will still 'bitch' and claim that there should be more VRAM in the first place?

Also, do you think that the 64 MB will hold up?
 
What do you mean "hold up" ?

And yes I know that you are curious ( I am too ), I do not think you are an ass :)

I think developers will still bitch about VRAM and external RAM and e-DRAM... RAM is never enough for developers :p there could always be some more ;)
 
Hold up meaning: will it be enough for a system as powerful as the PS3, and will there be problems like the PS2 had with its 4 MB of VRAM?

Also, do you see an HDD being in the PS3? And how big? I'm guessing something made by Sony, 60 GB.

And could a part of the HDD kinda be used as VRAM? Or just for storing textures, like main memory?

And I guess the whole PS3 transferring-power-over-the-internet thing was bogus, right? And what Sony really meant by distributed computing was really several CPUs in one.
 
You know, the next consoles could have SEVERAL gigs of RAM and there'd still be devs claiming they're running out of space. It'd be mostly due to sloppy code and lack of compression though... ;)
 
Paul said:
Hold up meaning: will it be enough for a system as powerful as the PS3, and will there be problems like the PS2 had with its 4 MB of VRAM?

Also, do you see an HDD being in the PS3? And how big? I'm guessing something made by Sony, 60 GB.

And could a part of the HDD kinda be used as VRAM? Or just for storing textures, like main memory?

And I guess the whole PS3 transferring-power-over-the-internet thing was bogus, right? And what Sony really meant by distributed computing was really several CPUs in one.

HDD and Blu-Ray can hold textures; you have to have your memory hierarchy ( from fastest to slowest, each level can buffer data from the one below it )... I personally would not miss an HDD, as a re-writable Blu-Ray "lite" would fit PlayStation 3 much better than simply a fast DVD and an HDD...
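The memory-hierarchy idea above can be sketched as a chain of caches, each level buffering texture data from the slower one below it (level names, sizes and the FIFO eviction here are purely illustrative, not real PS3 specs):

```python
# A toy memory hierarchy: each level caches texture data fetched from the
# slower level below it. Names/capacities are illustrative only.
class Level:
    def __init__(self, name, capacity, backing):
        self.name, self.capacity, self.backing = name, capacity, backing
        self.store = {}                      # texture_id -> texel data

    def fetch(self, tex_id):
        if tex_id in self.store:             # hit at this level
            return self.store[tex_id]
        data = self.backing.fetch(tex_id)    # miss: go one level down
        if len(self.store) >= self.capacity: # evict oldest (FIFO, for simplicity)
            self.store.pop(next(iter(self.store)))
        self.store[tex_id] = data            # buffer it here for next time
        return data

class Disc:                                  # slowest level: the game media itself
    def fetch(self, tex_id):
        return f"texels-for-{tex_id}"

# fastest -> slowest: e-DRAM buffers main RAM, which buffers an HDD cache,
# which buffers the disc
edram = Level("e-DRAM", 4, Level("main RAM", 64, Level("HDD cache", 1024, Disc())))
print(edram.fetch("rock_diffuse"))           # first fetch streams up every level
```

The second `fetch` of the same texture is then served straight from the fastest level, which is the whole point of streaming through a hierarchy.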

As far as memory being the bottleneck... well, with decent Texture Compression we will see much less crying over the VRAM space; between streaming ( something Cell does provide is bandwidth ;) ) and Texture Compression, that 54.33 MB will be quite enough ;) Of course, developers could go crazy and try to stuff HUGE 3D textures into the VRAM to avoid streaming... maybe that way they won't reach the maximum potential, but they will still be able to produce a graphical quality that is nicely higher than several PlayStation 2 titles. The same will be true for Xbox 2 and GCN 2, if they are similar jumps from their predecessors as PlayStation 3 is being developed to be ( and I think that performance-wise they will be at least comparable, if not slightly better or slower, which won't matter that much... ).

As far as the "internet" thing... well it is true and untrue...

The architecture, Cell, CAN do that... if you read the patent you will see that software Cells ( also known as apulets ) can migrate to another chip in the same device, onto another device connected to that network, or even across different networks... Software Cells have in their headers a Source ID, Destination ID and Previous Cell ID, IIRC... Each of these IDs has room for an IP address...
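That header layout could be sketched along these lines (the field names and types are my own guesses based on the description above, not the patent's actual layout):

```python
# Hypothetical sketch of a software Cell ("apulet") header as described:
# a Source ID, Destination ID and Previous Cell ID, each with room for
# an IP address. Field names are guesses, not the patent's real layout.
from dataclasses import dataclass

@dataclass
class CellID:
    device_id: int        # which Cell device on the network
    ip_address: str       # "there is space for it", per the patent description

@dataclass
class SoftwareCell:
    source: CellID        # where the apulet originated
    destination: CellID   # which device should run it next
    previous: CellID      # the last hop it migrated through
    program: bytes        # the APU code + data payload

cell = SoftwareCell(
    source=CellID(1, "192.0.2.1"),
    destination=CellID(2, "192.0.2.2"),
    previous=CellID(1, "192.0.2.1"),
    program=b"...apulet code...",
)
print(cell.destination.ip_address)   # 192.0.2.2
```

With IP addresses baked into the IDs, migrating a Cell within one chip or across the internet is, at least conceptually, the same routing operation.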

Will they do it in PlayStation 3? Only internally, maybe ( that is how the Broadband Engine and Visualizer might also work together ), and the first external steps would not be sharing power for real-time 3D rendering; rather, the ease of migration of software Cells, the uniform APU ISA, and the fact that Cell can reportedly scale from PDAs to consoles and big servers will make it easier and faster for all these Cell-based devices to inter-operate, communicate and share data...

When the technological barrier that prevents doing real-time 3D rendering with the help of big fat Sony servers over the internet ( latency and network bandwidth constraints... also, reliability of connections would be an issue ) is overcome, this could move forward...

Sharing cycles with other users' PlayStation 3s is the MOST far-fetched thing, but the Cell architecture is not what is stopping us here... it is our common sense and the fact that there are too many issues with the concept ( again, networks are too slow and unpredictable, and balancing the load across Cell devices across a nation would be a painful and messy undertaking... )
 
I think the Visualizer's APUs and PUs can do nice real-time VQ decoding :) hehe, who knows, maybe they will use something even better ( optimized for 3D textures too )...
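VQ texture decoding is essentially one table lookup per block, which is why it maps so well onto simple parallel units. A minimal sketch, with a made-up 2-entry codebook of 2x2 greyscale blocks:

```python
# Minimal VQ texture decode: each index in the compressed map selects a
# pre-trained codebook entry (here a 2x2 block of greyscale texels).
# The 2-entry codebook and 4-block texture are made up for illustration.
codebook = [
    [[0, 0], [0, 0]],          # entry 0: dark block
    [[255, 255], [255, 255]],  # entry 1: bright block
]
index_map = [1, 0, 0, 1]       # the compressed texture: one index per block

decoded = [codebook[i] for i in index_map]   # decoding is a pure lookup
print(decoded[0])              # [[255, 255], [255, 255]]
```

All the work is in training the codebook offline; at runtime the decoder just indexes into it, which even a very simple APU can do at full rate.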
 
Aren't you guys being a bit too generous on the memory? I would think that Sony would continue with the "little memory, large bandwidth" approach. Since graphics apps require lots of fast-changing data always on the move, I would think 256 MB of RAM in total should be enough if the bandwidth is there to keep the flow steady... Panajev?
 
I agree... as I said, I expect 128-256 MB of external RAM and e-DRAM in good quantity on the Broadband Engine and the Visualizer: they need the tremendously high bandwidth that on-chip DRAM can provide :)
 