Will Next Gen Consoles have Enough Memory?

london-boy said:
pc999 said:
Remember (Acert93) that exclusives like Doom 3 (when it's out) already/will use the HD to store textures. I don't know if ATI's HyperMemory fits here (or a non-PCI version or something like it).
And the GC already had virtual memory.
And the same names keep coming up (ATI, MS, N...), so some of these things already exist; it makes a lot of sense that we'll get all of them in their next consoles.

U mean Doom3 for Xbox?
Swapping textures from the hard drive is very slow, and we're not talking about a game like Jak and Daxter, where there are no loading times, the textures are really low-res and there's a different focus on graphics.
Doom3 is a typical "load level, play level, load next level" game, unless they re-work it for the Xbox...

I think Doom 3 on PC already streams data in addition to the level loads; within a level it streams data off the hard drive. (Or if not, then it's streaming data from main RAM to video RAM... well, that's just AGP texturing, not really streaming, just loading as needed, but the Xbox doesn't have a separate pool of main RAM to load video data from.)
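The in-level streaming being described boils down to loading only the texture detail you currently need, within a fixed memory budget. A minimal sketch of that idea (purely illustrative — the names `mip_bytes`, `pick_mip`, `stream_textures` and all the numbers are made up, not from any real engine):

```python
# Illustrative sketch of on-demand texture streaming within a fixed
# memory budget: nearer objects get higher-resolution mip levels,
# and we only "load" what fits. All names and numbers are hypothetical.

def mip_bytes(base_size, level, bpp=4):
    """Bytes for one mip level of a square base_size texture."""
    side = max(1, base_size >> level)
    return side * side * bpp

def pick_mip(distance):
    """Crude heuristic: drop one mip level per 10 units of distance."""
    return min(4, int(distance // 10))

def stream_textures(objects, budget):
    """Greedily choose a mip level per (name, distance, base_size)
    object, nearest first, never exceeding the byte budget."""
    loaded, used = {}, 0
    for name, distance, base_size in sorted(objects, key=lambda o: o[1]):
        level = pick_mip(distance)
        cost = mip_bytes(base_size, level)
        while level < 4 and used + cost > budget:
            level += 1                      # fall back to a smaller mip
            cost = mip_bytes(base_size, level)
        if used + cost <= budget:
            loaded[name] = level
            used += cost
    return loaded

# A near wall gets mip 0 (full res); a distant crate gets a smaller mip.
print(stream_textures([("wall", 2.0, 1024), ("crate", 35.0, 1024)],
                      budget=8 * 1024 * 1024))
```

A real engine would do the disk reads asynchronously and evict stale mips, but the budget-driven selection is the core of why streaming beats "load the whole level".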
 
Fox5 said:
london-boy said:
pc999 said:
Remember (Acert93) that exclusives like Doom 3 (when it's out) already/will use the HD to store textures. I don't know if ATI's HyperMemory fits here (or a non-PCI version or something like it).
And the GC already had virtual memory.
And the same names keep coming up (ATI, MS, N...), so some of these things already exist; it makes a lot of sense that we'll get all of them in their next consoles.

U mean Doom3 for Xbox?
Swapping textures from the hard drive is very slow, and we're not talking about a game like Jak and Daxter, where there are no loading times, the textures are really low-res and there's a different focus on graphics.
Doom3 is a typical "load level, play level, load next level" game, unless they re-work it for the Xbox...

I think Doom 3 on PC already streams data in addition to the level loads; within a level it streams data off the hard drive. (Or if not, then it's streaming data from main RAM to video RAM... well, that's just AGP texturing, not really streaming, just loading as needed, but the Xbox doesn't have a separate pool of main RAM to load video data from.)

Yeah, Doom3 already splits levels and loads them "on the fly", so to speak, but it's nothing on the scale of J&D, as there are still loading screens here and there between one level and the next.
 
london-boy said:
?? U seem a bit confused. They might have to rewrite the shaders to fit the Xbox GPU's capabilities, and I believe that, but that has nothing to do with rewriting the game to stream the levels, or whatever you mentioned, as that would require MAJOR code rewriting. They might as well scrap the original and start from scratch, and we all know that is NOT going to happen.

No, they really did say that.

Of course Vicarious Visions is making extensive use of the Xbox’s built-in hard drive to reduce loading times and also will result in smooth gameplay once you have begun playing a level.

http://previews.teamxbox.com/xbox/705/Doom-3/p4/

She mentioned that feedback has been extremely positive in regards to early tests, and that through optimization and innovative use of the Xbox's hard drive

http://xbox.ign.com/articles/491/491780p2.html

I first saw that in an interview, but I can't find it now; these quotes hint in that direction, though...
 
london-boy said:
U mean Doom3 for Xbox?
Swapping textures from the hard drive is very slow, and we're not talking about a game like Jak and Daxter, where there are no loading times, the textures are really low-res and there's a different focus on graphics.
Doom3 is a typical "load level, play level, load next level" game, unless they re-work it for the Xbox...

Surely it's faster than streaming from the DVD?

Halo vs Halo 2? ;)

(talking about texture loading, not level loading/caching to harddrive speed of course)
 
london-boy said:
?? U seem a bit confused. They might have to rewrite the shaders to fit the Xbox GPU's capabilities, and I believe that, but that has nothing to do with rewriting the game to stream the levels, or whatever you mentioned, as that would require MAJOR code rewriting. They might as well scrap the original and start from scratch, and we all know that is NOT going to happen.


So what IS taking all this time for the Xbox port to finish? Vicarious Visions has all the game content....

*wishes for an interview or update on game development progress*
 
Alstrong said:
So what IS taking all this time for the Xbox port to finish? Vicarious Visions has all the game content....

*wishes for an interview or update on game development progress*


Errr....

So what's taking so long to finish Duke Nukem Forever?

And what took so long to release HL2, or Doom3....... :devilish:
 
Resurrecting this old thread because I was bored and just browsing around.

This http://pc.watch.impress.co.jp/docs/2004/0709/kaigai101.htm was reported some time ago, and my Japanese has improved a bit since. :oops:

We already have confirmation that all the XDR suppliers are aiming for 256MBit chips in '05 because that was requested by 'the main customer'(guess who? the main customer even made a presentation there!).

Now there's this slide that was actually shown during the RDF.
http://pc.watch.impress.co.jp/docs/2004/0709/kaigai_7.jpg

We can actually make a pretty good guess that PS3 is gunning for 256MB @ 51GB/s with 8 chips. Almost 'confirmation-worthy' guess IMO.
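That guess is easy to sanity-check with back-of-the-envelope arithmetic, assuming 256 Mbit XDR devices with a 16-bit interface at the 3.2 GHz effective data rate Rambus has been quoting (those interface figures are my assumption, not from the slide):

```python
# Back-of-the-envelope check of the 256 MB @ ~51 GB/s guess, assuming
# 8 x 256 Mbit XDR chips, each 16 bits wide at a 3.2 GHz data rate.

chips = 8
capacity_mb = chips * 256 // 8      # 256 Mbit per chip -> 32 MB, x8 chips
per_chip_gbs = 3.2 * 16 / 8         # data rate (GHz) * width (bits) / 8
total_gbs = chips * per_chip_gbs

print(capacity_mb, total_gbs)       # -> 256 51.2
```

So 8 chips lands exactly on 256 MB and 51.2 GB/s, which is why the slide reads as near-confirmation.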
 
I won't be surprised if they use only 4 memory chips (256 Mbit each). The total memory would then be 128 MB, or four times the PS2's memory.

Remember, this is a console and supposed to be low cost.

Merry Xmas :D
 
Hi, I was busy back when people were actively posting in this thread, but less so now, so I thought I'd post roughly the same thing I recently posted in another thread:


We all know that Unreal Engine 3 was designed with next generation consoles in mind,
UnrealEngine3 is targeted at high-end DirectX9 and future GPU's and next-generation game consoles
(source: http://www.homelanfed.com/index.php?id=21751 )..


... However, Tim Sweeney himself stated the following:
If you only have a 256 meg video card you will be running the game one step down, whereas if you have a video card with a gig of memory then you'll be able to see the game at full detail.
(source: http://www.beyondunreal.com/content/articles/95_1.php )

and
It doesn't exactly take a leap of faith to see scenarios in 2005-2006 where a single game level or visible scene will require >2GB RAM at full detail.
(source: http://www.beyond3d.com/interviews/sweeneyue3/index.php?p=3 )

........


... Now... granted, a console like Xenon doesn't have to worry about running a bloated OS, system apps, or utilities like virus scanners and firewalls in the background... but there is no way that 256 MB of unified RAM is going to let Unreal Engine 3 run at anything more than drastically reduced :cry: detail levels if Mr. Sweeney is talking about >2 GB of system RAM plus 1 GB of video RAM (>3 GB total) to run the engine at maximum detail.

Now, I don't expect Xenon to run Unreal Engine 3 based games at absolutely max detail, but the delta between 256 MB and >3 GB of RAM is huge. With 256 MB I just don't see Xenon running Unreal Engine 3 anywhere near the detail levels it's capable of :cry: ...



On a related note, I believe Sony and Samsung's recent cross-licensing deal ( http://www.beyond3d.com/forum/viewtopic.php?t=19128 ) should let Sony give the PS3 more RAM, and/or faster RAM, and/or RAM at a lower cost to themselves.






(I first discovered that BeyondUnreal interview from this thread: http://www.beyond3d.com/forum/viewtopic.php?t=12514 )
 
How optimised is the Unreal engine, though? Traditionally PC software has been bloated; PC devs tend to write the code and wait for the hardware to catch up. Do you NEED a 1024x1024 texture when the object will only ever be 16 pixels square on screen?

There's lots achieved on 32-64 MB consoles that a 256 MB (excluding video RAM) PC can't handle. Any dev demanding 2 GB is probably being a bit sloppy.
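The memory cost behind that 1024x1024-vs-16-pixels point is worth spelling out. Assuming uncompressed 32-bit texels (4 bytes each):

```python
# Memory for an uncompressed 32-bit square texture: the full-size
# 1024x1024 map vs. the 16x16 mip a 16-pixel object actually needs.

def tex_bytes(side, bpp=4):
    """Bytes for one side x side texture at bpp bytes per texel."""
    return side * side * bpp

full = tex_bytes(1024)     # 4,194,304 bytes = 4 MB
needed = tex_bytes(16)     # 1,024 bytes = 1 KB
print(full // (1024 * 1024), "MB vs", needed, "bytes")   # -> 4 MB vs 1024 bytes
```

A factor of 4096 between what's shipped and what's sampled is exactly the kind of slack that mipmap streaming (or just authoring sanely sized assets) claws back.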
 
As a programmer myself, I agree that there cannot be too much memory from the development standpoint. There will always be a use for more.

However, MS seems to have invested heavily in making RAM size a lesser issue. A cached framebuffer is a nice chunk returned to RAM (and saves a heck of a lot of bandwidth). The recent patent about generating triangles on the CPU and the GPU reading them straight from L2 cache could be another significant space saver.

I don't necessarily like it, but 256 MB is probably the amount it'll be. But if the thing is used correctly, it might be enough for 5 years.
 
Shifty Geezer said:
How optimised is the Unreal engine, though? Traditionally PC software has been bloated; PC devs tend to write the code and wait for the hardware to catch up. Do you NEED a 1024x1024 texture when the object will only ever be 16 pixels square on screen?

There's lots achieved on 32-64 MB consoles that a 256 MB (excluding video RAM) PC can't handle. Any dev demanding 2 GB is probably being a bit sloppy.

I think it would be easier for a console with 32-64 MB of RAM to look good at 640x480 than for a PC with 256 MB to look good at 1600x1200. I don't doubt console games are more streamlined and have the advantage of fixed HW to design for... but I think 720p will be the future minimum resolution. That is a 3x increase in pixel count over 480p, so higher-resolution textures will be needed to match the higher output resolution. Some of that memory is going to be gobbled up just compensating for the higher-resolution output.
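The 3x figure is straight pixel-count arithmetic:

```python
# Pixel counts for 480p vs 720p: the framebuffer (and the texture
# detail needed to fill it) scales with the number of pixels.
sd = 640 * 480        # 307,200 pixels
hd = 1280 * 720       # 921,600 pixels
print(hd / sd)        # -> 3.0
```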

Now my opinion is that many console game textures are drab and undetailed and often the environments feel very empty. More stuff requires more memory. As a gamer I want large, open, detailed worlds with a lot of stuff in them.

Like you said, I am sure they can work magic with 256MB of memory. But top tier programmers could stuff even more into a game with 512MB. I would even pay extra for a 256MB memory expansion :) I bought the N64 one, so why not?! :)
 
From the TXB link for Doom 3 above:

The dev team did some research and found back doors and even a few undocumented instructions for the NV2X



Anyone have any clue what these could be... and how someone would find undocumented instructions?!?!?
 
Next Gen Memory

With the PS3 being a streaming architecture, I think another big aspect beyond the size of main RAM is the size of the small co-processor memories (128 KB each). As I understand it, data will flow through the system and each processor will do its bit to push the calculation forward. You are limited by how much state fits in the stream, as well as by dependencies between calculations; these can cap the complexity of the calculation you can do efficiently.
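The constraint described above can be sketched simply: a co-processor with only 128 KB of local memory has to walk a much larger buffer in chunks that fit, rather than ever holding the whole working set. This is an illustrative model only (real hardware would DMA the next chunk in while computing on the current one, i.e. double-buffer, which this sketch does not do):

```python
# Sketch of chunked streaming through a small co-processor memory:
# process a buffer far larger than the local store by walking it in
# pieces that fit. Sizes and names are hypothetical.

LOCAL_STORE = 128 * 1024            # 128 KB of local memory
CHUNK = LOCAL_STORE // 2            # leave room for a second buffer

def process(chunk):
    """Stand-in for the per-chunk kernel (here: sum the bytes)."""
    return sum(chunk)

def stream_compute(data):
    """Accumulate a result over a big buffer, never holding more
    than one CHUNK of it at a time."""
    total = 0
    for off in range(0, len(data), CHUNK):
        total += process(data[off:off + CHUNK])
    return total

data = bytes(range(256)) * 2048     # 512 KB, far bigger than local store
print(stream_compute(data))
```

The catch the post points at: this only works when the per-chunk kernel doesn't need state from outside its chunk; cross-chunk dependencies are what force data back through main RAM and limit what you can do efficiently.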
 