Your thoughts on how much RAM will be needed next gen?

Sorry, it was 36 Mbit/s... which would mean 4.5 MB/s, which is not that slow ;) ( uncompressed )


36 Mbit/s is the single-speed ( 1x ) READ and WRITE rate; of course, with the passing of time we will see 2x and 4x speed Blu-Ray devices.
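To make the unit conversion explicit, here is a minimal sketch ( the 2x and 4x grades are extrapolations, not announced products ):

```python
# Convert Blu-Ray's 36 Mbit/s base rate into MB/s at several drive speeds.
BASE_RATE_MBIT = 36  # 1x Blu-Ray read/write rate, in megabits per second

for speed in (1, 2, 4):
    mbit_s = BASE_RATE_MBIT * speed
    mb_s = mbit_s / 8  # 8 bits per byte
    print(f"{speed}x: {mbit_s} Mbit/s = {mb_s} MB/s")

# Output:
# 1x: 36 Mbit/s = 4.5 MB/s
# 2x: 72 Mbit/s = 9.0 MB/s
# 4x: 144 Mbit/s = 18.0 MB/s
```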
 
From Philips web site...

Blu-ray Disc Key Characteristics

Large recording capacity, High-speed data transfer and Easy to use.

Large recording capacity up to 27GB:
By adopting a 405nm blue-violet semiconductor laser, with a 0.85NA field lens and a 0.1mm optical transmittance protection disc layer structure, it can record up to 27GB video data on a single sided 12cm phase change disc. It can record over 2 hours of digital high definition video and more than 13 hours of standard TV broadcasting (VHS/standard definition picture quality, 3.8Mbps)
High-speed data transfer rate 36Mbps:
It is possible for the Blu-ray Disc to record digital high definition broadcasts or high definition images from a digital video camera while maintaining the original picture quality. In addition, by fully utilizing an optical disc’s random accessing functions, it is possible to easily edit video data captured on a video camera or play back pre-recorded video on the disc while simultaneously recording images being broadcast on TV.
Easy to use disc cartridge:
An easy to use optical disc cartridge protects the optical disc’s recording and playback phase from dust and fingerprints.



Main Specifications

Recording capacity: 23.3 GB / 25 GB / 27 GB
Laser wavelength: 405 nm ( blue-violet laser )
Lens numerical aperture (NA): 0.85
Data transfer rate: 36 Mbps
Disc diameter: 120 mm
Disc thickness: 1.2 mm ( optical transmittance protection layer: 0.1 mm )
Recording format: Phase change recording
Tracking format: Groove recording
Tracking pitch: 0.32 um
Shortest pit length: 0.160/0.149/0.138 um
Recording phase density: 16.8/18.0/19.5 Gbit/inch^2
Video recording format: MPEG2 video
Audio recording format: AC3, MPEG1 Layer2, etc.
Video and audio multiplexing format: MPEG2 transport stream
Cartridge dimension: Approx. 129 x 131 x 7 mm

Of course, MPEG2 will not be the only codec used to compress video on Blu-Ray... and 36 Mbit/s allows for better than typical MPEG2 quality...
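As a rough sanity check on the spec sheet's recording-time claims ( the ~24 Mbps HD broadcast rate below is a typical figure I'm assuming, not something stated by Philips ):

```python
# Rough recording-time check against the Philips numbers above.
# Assumes capacities are decimal gigabytes (10^9 bytes), the usual
# convention for optical media.

def recording_hours(capacity_gb: float, bitrate_mbps: float) -> float:
    bits = capacity_gb * 1e9 * 8
    return bits / (bitrate_mbps * 1e6) / 3600

# SD broadcast at 3.8 Mbps on the smallest 23.3 GB disc:
print(f"{recording_hours(23.3, 3.8):.1f} h SD")   # ~13.6 h -> "more than 13 hours"

# HD broadcast at an assumed ~24 Mbps on the same disc:
print(f"{recording_hours(23.3, 24.0):.1f} h HD")  # ~2.2 h -> "over 2 hours"
```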
 
Grall:
There are classic games made all by one dude - many of Jeff Minter's creations come to mind. How long ago was it since that happened, really?
I think it's happening right now, on Minter's latest for GameCube. I believe he has access to Lionhead's staff and resources if he needs it, though.
 
Saem said:
I think for non-graphics tasks a CPU would need about 64 to 128 MB

Graphics tasks would need LARGE frame buffers for all the buffers, plus significant texture space. I figure 256 MB in that department. Add a few MB for sound and so on, and I figure anything over 512 MB is likely overkill.
As for textures, if you're using virtual texturing, you don't need much more memory than a framebuffer takes up, because the textures are cut and scaled to fit within a frame. Couple that with the possible use of some sort of TLBR ( no Z-buffer ), better texture compression, maybe even compression of the frontbuffer, and you end up with significant savings. Of course, if you take a conservative "old-fashioned" approach, then you'll need 64 MB of VRAM for 1080i at 64-bit.
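For what it's worth, here is where that "64 MB of VRAM for 1080i at 64-bit" figure roughly comes from; the double-buffered-color-plus-32-bit-Z layout is my assumption, not something specified above:

```python
# Buffer-size arithmetic for a 1080-line frame at 64-bit color.
WIDTH, HEIGHT = 1920, 1080
COLOR_BYTES_PER_PIXEL = 8   # 64-bit color, as stated above
Z_BYTES_PER_PIXEL = 4       # assumed conventional 32-bit Z-buffer

color_mb = WIDTH * HEIGHT * COLOR_BYTES_PER_PIXEL / 2**20
z_mb = WIDTH * HEIGHT * Z_BYTES_PER_PIXEL / 2**20

total_mb = 2 * color_mb + z_mb  # front + back color buffers, plus Z
print(f"color buffer: {color_mb:.1f} MB each")  # ~15.8 MB
print(f"buffers total: {total_mb:.1f} MB")      # ~39.6 MB
# The rest of a 64 MB VRAM pool would go to textures; a tiler that
# keeps the Z-buffer on-chip reclaims the ~7.9 MB Z allocation.
```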
OT: Squeak, is your name taken from the programming language?
Yes.
 
Sorry, it was 36 Mbit/s... which would mean 4.5 MB/s, which is not that slow
That's still pretty damn slow if you have to fill 1 GB of memory. Even with a perfect, sequential read from that disc, it would take almost FOUR minutes! Compared to today's consoles, where the same operation takes approx 10 seconds, I think it's quite disastrous...
 
I was thinking that myself. I would have thought that Blu-Ray would be a whole lot faster than 4.5 MB/s, but if that is the case, then I just don't see how it could effectively service any sort of hardware that may be using several hundred MB. 512+ MB is clearly something that should be fed by an HDD, where you could count on 50+ MB/s or so of throughput by the time 2005 rolls around. Maybe it could work with optical media, maybe not, but my point is that it becomes a more and more questionable proposition as the target RAM increases beyond hundreds of MB, IMO.
 
HDD, where you could count on 50+ MB/s

:LOL: HDDs in theory should be past that speed by now... of course a huge Serial ATA HDD with 50 MB/s of transfer speed is cheap ;) huh ?

Marconelly... 1 GB of RAM... well I expect next-generation consoles to have less than that... however, as I said that is only 1x Blu-Ray specs...

Would it be impossible to think about 2x or 4x for PlayStation 3 ?

I think 2x should be quite reasonable.. and that would mean 9 MB/s ( 4x would be 18 MB/s, but I do not see it in PlayStation 3 )...

64 MB of e-DRAM for Broadband Engine, 32-64 MB of e-DRAM for the Visualizer and 128 MB of Yellowstone DRAM...

That is 224-256 MB of total main RAM... if you wanted to go nuts with things you would have 256 MB of Yellowstone DRAM bringing the total to ~512 MB of RAM... I do believe that 256 MB of total RAM would be enough, especially considering that we will have even more power to decompress data and more than enough bandwidth to move compressed and uncompressed data...

256 MB would be filled, at 36-72 Mbps ( 4.5-9 MB/s ), in 56.9-28.4 seconds.

512 MB would take close to 2 minutes ( 1x ) and 1 minute ( 2x )...

Do you find these loading times so bad ?

How often are we completely filling the whole PlayStation 3's RAM ? When we are streaming data in and out we are not going to transfer THAT much data... we have e-DRAM and Local Storage for a reason ;)
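A quick sketch of the fill-time arithmetic in this post ( using the 1x and 2x rates computed earlier ):

```python
# How long to fill a given amount of RAM from a 1x or 2x Blu-Ray drive.
def fill_time_s(ram_mb: float, rate_mb_s: float) -> float:
    return ram_mb / rate_mb_s

for ram in (256, 512):
    for label, rate in (("1x", 4.5), ("2x", 9.0)):
        print(f"{ram} MB at {label}: {fill_time_s(ram, rate):.1f} s")

# 256 MB at 1x:  56.9 s    256 MB at 2x: 28.4 s
# 512 MB at 1x: 113.8 s    512 MB at 2x: 56.9 s
```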
 
Couple that with the possible use of some sort of TLBR (no z-buffer) better texture compression, maybe even compression of frontbuffer and you end up with significant savings.

Hrm, I've heard of rendering being done without a Z-buffer, but I've never seen any descriptions. Could you possibly go into that?

I'm guessing TLBR is some sort of translation look-aside buffer?
 
I dunno, I think 256 MB would be the cap on the next-gen systems. Maybe less for the PS3, considering the 64 MB of on-die RAM they are talking about should cost a lot more money than off-die RAM. PS3 with 64 MB of on-die RAM, 128 MB of system RAM and maybe 16 MB of sound RAM. Xbox 2 probably 256 MB of system RAM, 32 MB of sound RAM and 128 MB of video RAM ( more than the PS3, since it won't be on-die and thus not as fast, so they would need more ). Maybe 256 MB if they really need it.
 
I think he meant TBR... still, there is a Z-buffer, on-chip... and PVR still supports an external Z-buffer ( FP )...

I thought that might be it, but that didn't make sense, since there is still a Z-buffer.
 
jvd, I'd agree... I think you also have to add the e-DRAM on the Visualizer... even in the patent images the e-DRAM for the Visualizer is not shared with the Broadband Engine...

64 MB for the Broadband Engine+ 32-64 MB for the Visualizer ( already has Image Cache ) + 128 MB of Yellowstone system DRAM should be a safe guess IMHO...

We might have Sound RAM and I/O CPU RAM, but they may very well not be separate pools; we could use the system RAM ( 128 MB of Yellowstone DRAM ) for at least the I/O RAM... It would not be difficult to make it work at an 800 MHz signaling rate in PlayStation 2 compatibility mode ( feed a 100 MHz clock instead of a 400 MHz clock to the Yellowstone DRAM PLLs )...

But maybe we could see Sound RAM as a separate memory pool...

Considering the patent, I do not see the I/O CPU having separate RAM, well in theory all the Yellowstone DRAM would be I/O memory ( this is external memory, not e-DRAM )...
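Summing the speculative pools from this post ( every figure here is a guess from the thread, not a confirmed spec ):

```python
# Speculative PlayStation 3 memory pools, low/high guesses in MB.
pools_mb = {
    "Broadband Engine e-DRAM": (64, 64),
    "Visualizer e-DRAM": (32, 64),
    "Yellowstone system DRAM": (128, 128),
}

low = sum(lo for lo, hi in pools_mb.values())
high = sum(hi for lo, hi in pools_mb.values())
print(f"total: {low}-{high} MB")  # 224-256 MB, as stated above
```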
 
Sorry, it was 36 Mbit/s... which would mean 4.5 MB/s, which is not that slow

Actually, that is first-gen Blu-Ray... the PS2, which came out soon after the first DVD players, is equipped with a 4x DVD drive, and the Xbox with a 6x DVD drive... The same should likely hold true for next-gen consoles...

12-20 MB/s is likely... not to mention the HDD. Between the console booting up, the logos, the start screen, any story sequence, etc., that's a good 7-15 seconds to load the first area...

That's sufficient time to load 100 MB+, or enough to load the code and begin streaming textures, etc... with compression, enough to fill 200-300 MB+, and data can continue filling up the HDD before you reach the next area...
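The streaming budget being argued here, spelled out ( the 12-20 MB/s rates are this post's guess for next-gen drives ):

```python
# How much data fits in a 7-15 second boot/intro window at 12-20 MB/s.
for rate_mb_s in (12, 20):
    for window_s in (7, 15):
        print(f"{rate_mb_s} MB/s x {window_s} s = {rate_mb_s * window_s} MB")

# 12 MB/s x 7 s  =  84 MB     20 MB/s x 7 s  = 140 MB
# 12 MB/s x 15 s = 180 MB     20 MB/s x 15 s = 300 MB
# Roughly 100-300 MB before counting any compression gains.
```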
 
Zidane, I agree as I already posted...

I do not see PlayStation 3 with a 4x Blu-Ray device, more like 2x... which is still 9 MB/s... I would not mind the 18 MB/s promised by 4x Blu-Ray, but I am not expecting to see it...

Panajev2001a said:
I think 2x should be quite reasonable... and that would mean 9 MB/s... That is 224-256 MB of total main RAM... Do you find these loading times so bad ?
 
Guys, guys... Look... It's a 2005 design for fuck's sake. It's NOT going to have 128MB main RAM simply because that would make it look STUPID in comparison to PCs of that timeframe. Simple vanity will make console designers want to stick in more RAM than that, not to mention all the other practical reasons.

I don't care how you think 128 MB would be "enough", etc. Nobody else does. Anyone who thinks PS3 will only have twice the main ram of XB a full four years after the console went on sale is frickin delirious. But let's wait until final PS3 specs are announced and I can laugh at all of you over how wrong you were... :LOL:

I still say half a gig. It's POSSIBLE they go for a quarter gig, but that is still relatively small compared to today's consoles and very small compared to what PCs will have by that point in time. I say at least 60% chance half gig, 30% chance quarter gig, 10% or less chance one gig.


That even Blu-Ray loads fairly slowly isn't much of an argument against lots of RAM. Who says you have to fill the entire memory before the game can start up? Getting the title screen UI up and running would take less than a second if you didn't bother with splash screens and all that crap.

While you watch the intro movie the first time you boot a game, it will copy essential core files to the harddrive anyway, which speeds up subsequent loads. A 50 MB/s harddrive will cost peanuts in 2005. Look, just about every current drive today does 40+ MB/s in the outer zones, and the new WD Raptor is past 60. Performance desktop drives might even be pushing a hundred by 2005, especially considering they'll be 15k rpm units.
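For comparison, a sketch of the same 256 MB load from optical media vs. a cached copy on the HDD ( the 50 MB/s figure is this post's 2005-era assumption ):

```python
# Time to load 256 MB from each source.
LOAD_MB = 256

for source, rate_mb_s in (("1x Blu-Ray", 4.5), ("2x Blu-Ray", 9.0), ("HDD", 50.0)):
    print(f"{source}: {LOAD_MB / rate_mb_s:.1f} s")

# 1x Blu-Ray: 56.9 s
# 2x Blu-Ray: 28.4 s
# HDD: 5.1 s
```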


*G*
 
Grall said:
Guys, guys... Look... It's a 2005 design for fuck's sake. It's NOT going to have 128MB main RAM simply because that would make it look STUPID in comparison to PCs of that timeframe. Simple vanity will make console designers want to stick in more RAM than that, not to mention all the other practical reasons.

So how do you compare GameCube launching with 24MB main RAM and 16MB slow RAM when my PC already had 256MB in it? And many people even at the time had more than 256.

It DOESN'T MATTER how they compare to PC's.

I don't care how you think 128 MB would be "enough", etc. Nobody else does. Anyone who thinks PS3 will only have twice the main ram of XB a full four years after the console went on sale is frickin delirious. But let's wait until final PS3 specs are announced and I can laugh at all of you over how wrong you were... :LOL:

As I said earlier, 128MB seems small, but 256-384MB sounds fantastic.

I still say half a gig. It's POSSIBLE they go for a quarter gig, but that is still relatively small compared to today's consoles and very small compared to what PCs will have by that point in time. I say at least 60% chance half gig, 30% chance quarter gig, 10% or less chance one gig.

512MB is just excessive, and will basically ASK for developers to get lazy with their code.

IIRC, Xbox was originally intended for 128MB, but then MS decided to cut it back to 64MB to be sure devs would tighten up their coding technique.

That even Blu-Ray loads fairly slowly isn't much of an argument against lots of RAM. Who says you have to fill the entire memory before the game can start up? Getting the title screen UI up and running would take less than a second if you didn't bother with splash screens and all that crap.

Well, taking advantage of the boot screen and maybe a few ad screens can allow you to pre-fill your main RAM with basic, redundant code that never really goes away, so you can just dynamically load levels or something like that.
 
256 MB of main system RAM would be sufficient if the console is not a UMA architecture. If it is UMA, then 512 MB should be the bare minimum. The graphics side of things is going to need some decent compression plus a whopping amount of RAM just for some of these CG pics you guys post here.

You think 512 MB of main system RAM is too much just because some devs will code sloppily? How's that any different from half of the devs out there already? Best to give them more headroom so they can at least have minimal slowdown in the game even if it is sloppily coded. What's the point of having a meager 128 MB of RAM when even the top devs are going to be constrained by this small amount of memory? Physics and AI code could get quite hefty in the next generation, especially if the game is going to be simulating a large number of dynamic objects all moving within the scene.

Face it, code is going to keep getting bigger, with more powerful procs to keep the memory full. You're going to get a lot of sloppily coded games, but I doubt it will be much different from today's games. Those who code sloppily will get shown up by the devs who take the time to learn the hardware and make it show its true colors.
 