PS3 = 512Mbit XDR-DRAM (and Nintendo to adopt XDR-DRAM?)

Panajev2001a said:
e-DRAM = VRAM, like on the GS.

I use VRAM = Video Memory, which in the PC domain means onboard memory, not on-die memory. Call it whatever you want: external, GPU-only memory.


A 256-bit XDR solution would mean 512 pins for data lines alone running through the system's PCB, which is, IMHO, absurd to ask of PlayStation 3.

Why do you think this is absurd? GDDR-3 on current-generation desktop GPU cards utilizes 800 PINS! XDR is a reduction, and 128-bit XDR would have superior bandwidth to all known graphics GPUs and eDRAM in production (e.g. PS2 GS), with far fewer pins.
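As a rough sanity check on the bandwidth side of this argument (the 3.2 Gbit/s-per-pin XDR rate and 1.6 Gbit/s-per-pin GDDR3 rate below are my assumptions based on parts announced at the time, not figures stated in this thread):

```python
# Peak bandwidth for a memory bus: width (bits) x per-pin data rate (Gbit/s) / 8.
def bandwidth_gb_s(bus_width_bits, gbit_per_pin):
    """Peak bus bandwidth in GB/s."""
    return bus_width_bits * gbit_per_pin / 8

xdr_128   = bandwidth_gb_s(128, 3.2)  # 128-bit XDR at an assumed 3.2 Gbit/s/pin
gddr3_256 = bandwidth_gb_s(256, 1.6)  # 256-bit GDDR3 at an assumed 1.6 Gbit/s/pin
gs_edram  = 48.0                      # PS2 GS eDRAM peak, per Sony's published spec

print(f"128-bit XDR  : {xdr_128:.1f} GB/s")   # 51.2 GB/s
print(f"256-bit GDDR3: {gddr3_256:.1f} GB/s") # 51.2 GB/s
print(f"GS eDRAM     : {gs_edram:.1f} GB/s")
```

At those assumed rates, a 128-bit XDR bus matches a 256-bit GDDR3 bus and edges past the GS eDRAM's peak with half the data pins, which is the trade being argued here.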

Keeping the front-buffer, the back-buffer and the Z-buffer all in VRAM and at 32 bits for 720p would take about 10.55 MB.

720p is a non-starter. By the time the PS3 ships, 1080p sets will be as cheap and ubiquitous as 720p sets today. And you forgot textures. Texturing from system RAM will suck, and contend with CELL.


4xFSAA would increase the size of the back-buffer and of the Z-buffer and the total VRAM requirement would become ~32 MB.

Great, another 200+ million transistors to fit on a die which already has 200+ million transistors allocated to 24-32 pixel pipelines.
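The buffer figures in this exchange check out arithmetically. A quick sketch (assuming 32-bit color, a 32-bit Z buffer, and roughly one transistor per eDRAM bit, a simplification that ignores sense amps and redundancy):

```python
MB = 1024 * 1024

def buffer_mb(width, height, bytes_per_pixel=4, samples=1):
    """Size in MB of one render target at the given supersample count."""
    return width * height * bytes_per_pixel * samples / MB

# 720p, no AA: front buffer + back buffer + Z buffer, all at 1x.
no_aa = 3 * buffer_mb(1280, 720)
# 720p, 4xFSAA: front stays at 1x, back and Z grow to 4 samples per pixel.
fsaa4 = buffer_mb(1280, 720) + 2 * buffer_mb(1280, 720, samples=4)
# eDRAM transistor cost at ~1 transistor per stored bit.
transistors_m = fsaa4 * MB * 8 / 1e6

print(f"720p buffers       : {no_aa:.2f} MB")              # ~10.55 MB
print(f"720p 4xFSAA buffers: {fsaa4:.2f} MB")              # ~31.64 MB
print(f"eDRAM transistors  : {transistors_m:.0f} million") # ~265 million
```

Both the ~10.55 MB and ~32 MB figures quoted above fall out directly, and at one transistor per bit the ~32 MB of eDRAM alone accounts for the "200+ million transistors" complaint.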

DeanoC, there is no contented memory bus. The GPU would have its own private bank of XDR all to itself. If you mean an ultrawide bus (2000+ bits), it's not really "free". It just means that the bus is wasted for every pixel rendered which is not using alpha blending. It's transistors spent on a hypothetical workload. PS2's architecture drives one to want to utilize lots of blending, because that's what it does best; on the other hand, you can't utilize many textures. An architecture where large-scale texturing is cheap, but blending isn't, yields a different workload bias. One is not necessarily preferred over the other. It depends on what you want to accomplish.
 
DemoCoder said:
720p is a non-starter. By the time the PS3 ships, 1080p sets will be as cheap and ubiquitous as 720p sets today. And you forgot textures. Texturing from system RAM will suck, and contend with CELL.

Mass market for 1080p will not happen before 2010 - 2012. So it's more of an issue for the PS4/XB3 than for these next-gen consoles.
 
The widescreen HDTV boom is currently in progress. The number of people who have bought a 720p display pales in comparison to the number of people who will buy one in the next 3 years. Every new display produced in 2006 is either 3LCD, DMD xHD3, or some other 1080p derivative. 480p will be completely phased out and 720p will be on its way out by then. You completely underestimate the speed at which display manufacturers are rolling out resolution improvements. Half-way into the PS3's lifespan, the number of people who own 1080p displays will be on par with the number of people who own 720p today, probably more, and 720p models from major vendors like Sony, Samsung, Toshiba, etc. will most likely be junk models that are not pushed at retailers.


There is simply ZERO POINT to putting Blu-ray into the PS3 if your design point is 480i/p (as claimed in another thread), because Europeans are stuck in crappy EDTV land. 720p is the absolute minimum that the next gen must support, but people who buy Blu-ray discs in the next two years, and $500 PS3s, are not the sort of folk who own shitty displays.


Anyway simply reducing the requirement from 1080p to 720p doesn't rescue eDRAM. Just give it up, there is no way an NV5x, even on 65nm, can fit enough eDRAM for AA backbuffer, z, and fullscreen textures, and 24-32 pixel pipes.
 
As I've said before, I believe that if PS3 has 256 MB of main RAM, it would be like PS2 having had 16 MB of main RAM: a step backwards. And now that the Cell CPU is not going to have any eDRAM, as far as is now known, it is even more important that PS3 have 512 MB of memory, if not more, despite the APU/SPU/SPE LS increase from 128K to 256K.

256 MB main external memory for CPU plus 256 MB external for GPU is a bare minimum IMO. And since the GPU is an Nvidia design, there might not be any eDRAM in that either.


Panajev, what are the chances that PS3 has no embedded memory at all? Not counting the caches and LS of the Cell CPU and the usual small caches found in Nvidia GPUs.
 
What led to the boom in widescreen TV sets was mainly WIDESCREEN CONTENT - if we see a good transition to newer formats such as Blu-ray and HDTV that boost HD, perhaps that will lead to a boom in getting HD setups? Not sure if I expect 1080p, but 720p would most definitely be nice.
 
DemoCoder said:
The widescreen HDTV boom is currently in progress.

Not here in the UK, it isn't. :(

Hardly any TVs here support HDTV - some of the newer flat panel TVs do but virtually none of even the high-end CRTs do. Highly irritating when we know that things such as Blu-ray are just around the corner. :?
 
DemoCoder said:
The widescreen HDTV boom is currently in progress. The number of people who have bought a 720p display pales in comparison to the number of people who will buy one in the next 3 years. Every new display produced in 2006 is either 3LCD, DMD xHD3, or some other 1080p derivative. 480p will be completely phased out and 720p will be on its way out by then. You completely underestimate the speed at which display manufacturers are rolling out resolution improvements. Half-way into the PS3's lifespan, the number of people who own 1080p displays will be on par with the number of people who own 720p today, probably more, and 720p models from major vendors like Sony, Samsung, Toshiba, etc. will most likely be junk models that are not pushed at retailers.

I think 720p will reach the state of popularity you're talking about. 1080p will still take longer, according to all the forecasts and market studies I've read. The television market is not as volatile as you think. People don't normally buy TVs every year, and they usually need a good reason for it. And trust me, the PS3 is NOT a good enough reason for the common customer. HDTV broadcasting is. And all the major networks around the world are not planning on adopting that kind of transmission until the beginning of the next decade. And that will be crucial for making these TVs cheap and popular.


Anyway simply reducing the requirement from 1080p to 720p doesn't rescue eDRAM. Just give it up, there is no way an NV5x, even on 65nm, can fit enough eDRAM for AA backbuffer, z, and fullscreen textures, and 24-32 pixel pipes.

C'mon! Give these guys a little credit. Don't you think they knew about 720p? I bet they must have a display or two with such resolution lying around in their research department. ;)
 
DemoCoder said:
Panajev2001a said:
e-DRAM = VRAM, like on the GS.

I use VRAM = Video Memory, which in the PC domain means onboard memory, not on-die memory. Call it whatever you want: external, GPU-only memory.


A 256-bit XDR solution would mean 512 pins for data lines alone running through the system's PCB, which is, IMHO, absurd to ask of PlayStation 3.

Why do you think this is absurd? GDDR-3 on current-generation desktop GPU cards utilizes 800 PINS! XDR is a reduction, and 128-bit XDR would have superior bandwidth to all known graphics GPUs and eDRAM in production (e.g. PS2 GS), with far fewer pins.

Well, those desktop GPU cards with over 800 pins for data alone cost a pretty penny: the GPU in PlayStation 3 would need to be more cost-efficient than those (even though SCE does sell at a loss for the first year or so).

Keeping the front-buffer, the back-buffer and the Z-buffer all in VRAM and at 32 bits for 720p would take about 10.55 MB.

720p is a non-starter. By the time the PS3 ships, 1080p sets will be as cheap and ubiquitous as 720p sets today.

Which means an insignificant (read "quite low") percentage of the total target user-base for Xbox 2 and PlayStation 3.

And you forgot textures. Texturing from system RAM will suck, and contend with CELL.

Did I say it would texture from main RAM? I did not; there would be space in the e-DRAM buffer to stream textures, just as is done on the GS and on PSP's GPU. You would have more space than you have on the GS or on PSP's GPU, of course ;).
 
enough eDRAM to hold a 1080p 4xFSAA backbuffer and related textures.
I think you are fooling yourself if you think 1080p with 4xFSAA is feasible (much less important) on next-gen consoles. It's ridiculous to waste resources on FSAA at a resolution like that.
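For scale, here is the same buffer arithmetic at 1080p (my numbers, not from the thread; assuming 32-bit color and Z, front buffer at 1x, back and Z buffers at 4 samples per pixel):

```python
MB = 1024 * 1024
px = 1920 * 1080                 # 1080p pixel count

front   = px * 4 / MB            # 32-bit front buffer, ~7.9 MB
back_4x = px * 4 * 4 / MB        # 4xFSAA back buffer, ~31.6 MB
z_4x    = px * 4 * 4 / MB        # 4xFSAA Z buffer, ~31.6 MB

total = front + back_4x + z_4x   # ~71 MB before a single texture is stored
print(f"1080p 4xFSAA buffers: {total:.1f} MB")
```

Roughly 71 MB of buffer storage before textures is far beyond any plausible eDRAM budget for this generation, which is the point being made.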
 
DemoCoder said:
DeanoC, there is no contented memory bus.
Contended rather than contented surely :)
The contention is between reads and writes amongst themselves. eDRAM's ability to have stupidly high pin counts solves this problem. It's relatively easy to bank eDRAM: I can have a 1024-bit bus for the framebuffer and a 512-bit bus for the textures. No matter how much texturing goes on, there is always a fixed cost. That's extremely hard with an external memory system unless you have separate memory pools à la 3DFX.

Having fixed memory bandwidth allocations makes the memory controller's job much simpler, and that gives you 'free' stuff.

1080P will not be a standard resolution on any next-gen console, 720P is the chosen one.
 
DemoCoder said:
There is simply ZERO POINT to putting Blu-ray into the PS3 if your design point is 480i/p (as claimed in another thread), because Europeans are stuck in crappy EDTV land. 720p is the absolute minimum that the next gen must support, but people who buy Blu-ray discs in the next two years, and $500 PS3s, are not the sort of folk who own shitty displays.

I do not think PlayStation 3 will retail for $499.99, but more around $299.99 ;). Sure, they are probably going to sell it at a nice loss for the first year, but we cannot yet say how much that loss will be on each unit sold.

Also, PlayStation 3 is the CE product that drives Blu-ray into the mainstream; it is not marketed towards the high-end CE market: it is not what they plan as the Home Server/PSX 2.

It will take quite a while for true 1080p sets to drive 1080p into the mainstream: they need to hit the $650-750 price point, which has yet to be reached even by devices that are "only" true 720p sets.
 
Btw, can someone explain how we can be sure that, given one chip's size, we know how many of them will be used in PS3 (which gives us total memory) and what kind of bandwidth the memory will have?

Wasn't it said that Cell (one chip) would have 50GB/s memory bandwidth, and 100GB/s I/O bandwidth?
 
What is 1080p, actually?

Most if not all TFT LCD screens I see here (Belgium) are 1200*1000 resolution, and plasmas are 824*640 or something like that.


And yes, I also think HDTV is booming. With those falling flat-panel prices, I'm tempted to buy one, maybe NEXT year. I already have a 32" widescreen Philips CRT, but those big LCDs are new, cool, must get! They just need to drop a bit more in price.
 
And I'll say this, then we'll see what happens...

Plasma/LCD screens prices are falling, certainly.

But the prices of ones that only support 480p are falling the most, so people will eventually go out in the millions, surely before 2007, and buy "a plasma TV", because they do not know what 720p or 1080i/p is, or what 480p is.
Of course they will most likely buy the cheap plasma displays that only support 480p, like they bought the cheaper widescreen TVs that only supported 480i five years ago. So we're back to square one. These people will not upgrade to another plasma display for YEARS. And in the meantime they will plug in their PS3 through normal cables that don't support progressive scan, and they won't know the difference, unless component or VGA cables are provided in the box (I'm pretty sure next-gen machines will have at least RGB cables in the box; composite will die).


Sorry for being pessimistic, but i have a very low opinion of people.
 
If you ask me, I'm perfectly satisfied with 480i for games. We watch movies on normal TV resolution all the time, and I bet no one ever complained about the image being unrealistic. The poor resolution doesn't make the scenery in the movies I watch become any less realistic, or the girls on porn seem any less hot.

My point is, there's a lot to be achieved for realism, besides the number of pixels.
 
Alejux said:
If you ask me, I'm perfectly satisfied with 480i for games. We watch movies on normal TV resolution all the time, and I bet no one ever complained about the image being unrealistic. The poor resolution doesn't make the scenery in the movies I watch become any less realistic, or the girls on porn seem any less hot.

My point is, there's a lot to be achieved for realism, besides the number of pixels.

True, but getting rid of interlacing is a must. Even 480p is already visibly easier on the eyes. Even faked (like some TVs do, converting interlaced inputs into cleaner pro-scan ones).
 
marconelly! said:
It's ridiculous to waste resources on FSAA on a resolution like that.
Like hell it is. One of the main things I'm looking forward to in the new generation is non-shitty IQ. With all the power that these new systems have to offer I hope developers will put more emphasis on IQ even if it means that something else will have to go. Though I'm not convinced 1080p will be standard like DemoCoder seems to expect.
 
cybamerc said:
marconelly! said:
It's ridiculous to waste resources on FSAA on a resolution like that.
Like hell it is. One of the main things I'm looking forward to in the new generation is non-shitty IQ. With all the power that these new systems have to offer I hope developers will put more emphasis on IQ even if it means that something else will have to go. Though I'm not convinced 1080p will be standard like DemoCoder seems to expect.

Like DeanoC said, 720p has been targeted as the HDTV resolution for the next generation. 1080p will not be standard for a long time.

Besides, I think it all comes down to compromise. Personally I think 720p without AA is good enough, and a good enough jump from this generation's 480i; anything more is icing on the cake. I'd prefer they focus on other things like better shading and more geometry than on AA modes that will hardly be noticed at such high resolutions, considering how many people will play at those resolutions anyway.
 
Yeah, I don't think there's a chance of 1080p being a standard, but rather 720p instead.

Personally, I pretty much agree with what Alejux said, as I'd much rather have some film-quality-looking CGI in realtime at 480i or 480p than games that look like what we have today, or slightly better, only in 1080p with 4xFSAA... You can get that kind of look today on any decent PC anyway.
 