Lots 'o Embedded DRAM on R500

cybamerc said:
The next gen consoles will support HDTV resolutions so they'll need as much framebuffer space as a PC.
We'll see. Most TV's are still not HDTV's. Don't be surprised if the next generation of consoles are still primarily low-resolution.
 
Gubbi said:
Bandwidth is easy, just throw pins at the problem and scale frequency. Latency is limited by a nasty physical constant, c.
It's not easy, and it's expensive. Adding pins runs into space constraints very quickly, and electromagnetic interference places limits on possible layouts.
 
Tagrineth said:
Revolutionary. :LOL: :LOL: :LOL: :LOL: :LOL:

Hell of a revolution... I mean, WOW, PS2 has only been doing that exact same thing for what... four years, not counting pre-production tests?

Depends on what we're really talking about here, what it's going to be used for, and a whole lot of other things. People have already pointed to the Flipper, for the record.

In retrospect I don't see the PS2 as having had unlimited resources either, so if anything, in order to get a more reasonable comparison, next-generation consoles should be compared to next-generation consoles.
 
Chalnoth said:
cybamerc said:
The next gen consoles will support HDTV resolutions so they'll need as much framebuffer space as a PC.
We'll see. Most TV's are still not HDTV's. Don't be surprised if the next generation of consoles are still primarily low-resolution.

Agreed. It seems 1080i is rare enough on HDTVs (forget about 1080p) that 720p seems the highest worthwhile target to aim for. The new wave of "EDTV's" may lower MS' sights even more.

OTOH, the higher resolutions without AA seem possible if MS considers AA a necessity for the lower, more common HDTV resolutions.
 
Dio said:
Gubbi said:
Bandwidth is easy, just throw pins at the problem and scale frequency. Latency is limited by a nasty physical constant, c.
Historically, the main reason for using embedded DRAM has always been bandwidth. Before R300 many people said a 256-bit memory bus was crazy, (frankly, it still does feel crazy to me) and I'm not quite sure if the world is ready for 2000-pin chips... and scaling memory frequency isn't particularly easy (memory frequencies are scaling significantly slower than core clock frequencies both in CPU's and VPU's).

I'm not sure c really comes into one more than the other.

Now there's an interesting post to come back and look at in the by and by.
 
Pete said:
Agreed. It seems 1080i is rare enough on HDTVs (forget about 1080p) that 720p seems the highest worthwhile target to aim for. The new wave of "EDTV's" may lower MS' sights even more.

OTOH, the higher resolutions without AA seem possible if MS considers AA a necessity for the lower, more common HDTV resolutions.
Remember that it will still be up to game developers to select resolutions. In the past, console game developers have been spoiled by not having to bother with any user-selectable performance settings. In this spirit, console developers may be resistant to supporting many different resolutions, even with HDTV support.

To tell you the truth, I don't doubt that next-gen consoles will support HDTV resolutions. However, they may not support HDTV resolutions with, for example, FSAA. Or, they may just not run optimally at the higher resolutions (particularly if an eDRAM solution is chosen).

Remember that the X-Box, as an example, supports internal resolutions up to 1024x768, but this is hardly ever used.
 
I'd rather wait until the full, true specifications of the XBox2 chip get announced; until then, neither optimistic nor pessimistic speculation floats my boat.


However, they may not support HDTV resolutions with, for example, FSAA.

Does anyone here have any idea about the chip's specifications or its anti-aliasing algorithm?

Remember that the X-Box, as an example, supports internal resolutions up to 1024x768, but this is hardly ever used.

Which means exactly what? By the time the XBox2 gets released, how old exactly will the original XBox, and thus its underlying technology, be?

From what I've heard the NV2A chip does use some sort of geometry compression, for instance (with unknown effectiveness, of course). Now, if I assume that the R5xx chip contains a PPP (does anyone really know?), what would the opportunities and differences look like then?

By the way, aren't developers using Quincunx whenever possible on the XBox? To what resolution would you need to upsample to get a 2x2 grid at 640x480?
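
For that last question, the arithmetic is simple enough to sketch. Purely illustrative, assuming a plain 2x2 ordered grid and 4 bytes of colour plus 4 bytes of Z per sample (my assumptions, not known XBox figures):

def supersampled_target(out_w, out_h, grid_x=2, grid_y=2, bytes_per_sample=8):
    # Render-target size needed for a grid_x by grid_y ordered grid that gets
    # downfiltered to out_w x out_h, plus a rough buffer cost in MB.
    # bytes_per_sample = 4 bytes colour + 4 bytes Z/stencil (an assumption).
    render_w, render_h = out_w * grid_x, out_h * grid_y
    size_mb = render_w * render_h * bytes_per_sample / (1024 * 1024)
    return render_w, render_h, size_mb

print(supersampled_target(640, 480))   # (1280, 960, ~9.4 MB)

So a 2x2 grid at 640x480 means rendering at 1280x960.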
 
Ailuros said:
Which means exactly what? By the time the XBox2 gets released, how old exactly will the original XBox, and thus its underlying technology, be?
I'm speculating on the software developers, not the hardware.

And consoles have traditionally supported lower resolutions than PC graphics cards. I just don't think that HDTV has enough penetration yet to change this by much.

Now, if it is indeed true that the X-Box 2's chip will use eDRAM, then it may actually be a way for the chip to support higher resolutions. If it is not necessarily designed to house the entire framebuffer on-die, but instead to intelligently cache it at higher resolutions, then it could go far. If it is designed to hold the entire framebuffer on-die, then there may be problems with performance at higher resolutions.
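
To put some rough numbers on that (purely illustrative; I'm assuming 4 bytes of colour and 4 bytes of Z/stencil per sample and no compression, which is not a claim about the actual chip):

def framebuffer_mb(width, height, samples=1, bytes_per_sample=8):
    # Naive on-chip framebuffer cost: colour + Z for every sample, no compression.
    return width * height * samples * bytes_per_sample / (1024 * 1024)

for name, (w, h) in {"480p": (640, 480), "720p": (1280, 720), "1080": (1920, 1080)}.items():
    print(f"{name:5s} no AA: {framebuffer_mb(w, h):5.1f} MB   4x AA: {framebuffer_mb(w, h, 4):5.1f} MB")

# 480p  no AA:   2.3 MB   4x AA:   9.4 MB
# 720p  no AA:   7.0 MB   4x AA:  28.1 MB
# 1080  no AA:  15.8 MB   4x AA:  63.3 MB

Even with those crude assumptions, a full high-resolution buffer with AA gets big quickly, which is exactly why caching rather than housing the whole thing on-die would matter.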
 
Dio said:
Gubbi said:
Bandwidth is easy, just throw pins at the problem and scale frequency. Latency is limited by a nasty physical constant, c.
Historically, the main reason for using embedded DRAM has always been bandwidth. Before R300 many people said a 256-bit memory bus was crazy, (frankly, it still does feel crazy to me) and I'm not quite sure if the world is ready for 2000-pin chips...

Maybe not. But there are custom packaging technologies out there, and when you're going to produce multiple millions of such devices, you get economies of scale.

Dio said:
and scaling memory frequency isn't particularly easy (memory frequencies are scaling significantly slower than core clock frequencies both in CPU's and VPU's).

Plain wrong. Memory interface clock has been scaling faster than core clock for 3 generations now. You probably meant that core performance has increased faster than per pin memory bandwidth.

Cheers
Gubbi
 
When I first got into hardware, core and memory clocks were the same. Now, the core clock is usually ahead of the memory clock. Memory technology has improved to double the data rate per pin in the meantime - perhaps this is what gives you the impression that memory clock rates have increased?

Relatively, bandwidth has increased by about a factor of 7 (64bit->256bit, SDR->DDR, but clock slipping a bit) while peak core performance has increased by a factor of 8 (1 pixel/clock -> 8 pixels/clock). So memory performance is slipping slightly behind (which is more significant than it sounds, given that the 64-bit days were typically with 16-bit screen and Z). In other ways memory technology has slipped off a bit too (no 'DDR SGRAM').
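
As a rough sanity check on those factors (the clocks below are assumed round numbers for illustration, not exact product specs):

# Rough reconstruction of the "factor of 7 vs factor of 8" comparison.
base_core, base_mem = 125, 125    # late-90s baseline: 64-bit SDR, 1 pixel/clock
new_core, new_mem = 325, 310      # R300-class part: 256-bit DDR, 8 pixels/clock

bandwidth_gain = (256 / 64) * 2 * (new_mem / base_mem)   # width x DDR x memory clock
fillrate_gain = 8 * (new_core / base_core)               # pixels/clock x core clock

print(f"bandwidth up {bandwidth_gain:.1f}x, peak fill up {fillrate_gain:.1f}x")
print(f"bandwidth per pixel of peak fill: {bandwidth_gain / fillrate_gain:.2f}x of the baseline")

# bandwidth up 19.8x, peak fill up 20.8x
# bandwidth per pixel of peak fill: 0.95x of the baseline

Same conclusion either way: per pixel of peak fill there's slightly less bandwidth than there used to be, and that's before you account for the move to 32-bit colour and Z.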

It will be interesting to see if this continues.
 
Chalnoth:

> Most TV's are still not HDTV's.

And most PCs come with slow built-in gfx cards and 15/17'' monitors. The prices of HDTVs are dropping, and the FCC has ruled that by July 2007 all TVs with a screen size larger than 13 inches must be able to receive a DTV signal; a lot, if not most, of those sets are gonna be HDTV capable. Additionally, I'd speculate that console owners are more likely to own an HDTV-capable set than the general population, especially the early adopters. Moreover, with computer monitors getting larger, the same types of flatpanel displays being used for both monitors and TVs, shared digital inputs, etc., consoles won't be limited to TV sets.

> Don't be surprised if the next generation of consoles are still primarily
> low-resolution.

I would be surprised. PS2 already supports resolutions up to 1280x1024 and Xbox supports 1080i. No way in hell they're gonna limit themselves to 480p/575p.

> Remember that it will still be up to game developers to select
> resolutions.

Not if the console/set-top box makers dictate that a certain resolution must be supported.

> Remember that the X-Box, as an example, supports internal resolutions up to 1024x768,

It'll output a 1080i signal. The framebuffer can be as large as there's room for.




Pete:

> It seems 1080i is rare enough on HDTVs (forget about 1080p)

To the renderer it's irrelevant whether the output is 1080i or 1080p; the rendering resolution is the same, unless you do field rendering, with nasty tearing as a result.
 
cybamerc said:
Additionally, I'd speculate that console owners are more likely to own an HDTV-capable set than the general population, especially the early adopters.
I don't think so. Console owners really are varied. It's why consoles are such big business: so many people buy them.

I would be surprised. PS2 already supports resolutions up to 1280x1024 and Xbox supports 1080i. No way in hell they're gonna limit themselves to 480p/575p.
I'm not expecting hard limits. I'm expecting limits imposed by the constraints of the hardware. That is, you just don't see games that run at those resolutions, because it's not worth the performance to do so. I just don't see this changing in the next generation of consoles, as external memory bandwidth is still expensive.

Not if the console/set-top box makers dictate that a certain resolution must be supported.
That would be an interesting scenario. I'm just not sure I believe it would happen. We'll have to see.

Pete:

> It seems 1080i is rare enough on HDTVs (forget about 1080p)

To the renderer it's irrelevant whether the output is 1080i or 1080p; the rendering resolution is the same, unless you do field rendering, with nasty tearing as a result.
Hrm, I thought 1080i was the highest, and the only setting with no progressive counterpart?
 
Chalnoth:

> I don't think so. Console owners really are varied.

Possibly. Even so HDTV will be standard soon enough.

> That is, you just don't see games that run at those resolutions, because
> it's not worth the performance to do so.

That's a matter of opinion. I should hope IQ gets more attention next gen. Resolution is one way of improving that.

> Hrm, I thought 1080i was the highest, and the only setting with no
> progressive counterpart?

1080p was added to the ATSC specification after it was made part of the DVB HDTV standard. It makes perfect sense anyway. 1080i has the same resolution; the difference is how the display is refreshed.
 
It's a bandwidth difference primarily. Interlace requires 1/2 the datarate.
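
For the record, at the same refresh rate the arithmetic works out like this (pixel counts only, ignoring blanking intervals):

# 1080i vs 1080p pixel rates at a 60Hz refresh, ignoring blanking.
width, height, refresh = 1920, 1080, 60
progressive = width * height * refresh          # a full frame every refresh
interlaced = width * (height // 2) * refresh    # one 540-line field every refresh

print(progressive, interlaced, progressive / interlaced)
# 124416000 62208000 2.0

Which also lines up with cybamerc's point: unless you render per-field, the frame you render is 1920x1080 either way; it's only the signal datarate that halves.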

Anyway, in another 3 years, I expect most people who buy bleeding edge consoles will have some form of HDTV. This Superbowl saw record numbers of people buying HDTVs. Widescreen rear-projection DLP sets in the 36-inch range are hitting sub-$3000. Sub-40" plasmas (granted, 800x600 res) are sub-$3000. The price has been dropping dramatically every year. I bet even some HDTV CRTs will go <$1000 in a year or two.

Almost every X-Box today runs at 480p. A handful run at 720p or 1080i and are beautiful. The X-Box2 had *better* support at least 720p as the default. I think developers are going to have to sacrifice 2x fillrate to go to 720p; just as we demand 60fps today, tomorrow we should demand 720p minimum.
 
Remember that in Europe things are very different. To say the least. HDTV will not be common here for years to come.

All I'm asking for is VGA output, like a proper VGA cable out as soon as the consoles launch. Then I'm sorted.

In the end almost everyone has HDTV at home in the form of computer monitors.

If the gamer wants to play on his TV at 480i, good. If he wants to play on his computer monitor at 720p (or the equivalent in monitor terms), he'll have the option. I don't see what's wrong with that.
 
Dio said:
When I first got into hardware core and memory clocks were the same. Now, the core clock is usually ahead the memory clock. Memory technology has improved to double the data rate per pin in the meantime - perhaps this is what gives you the impression that memory clock rates have increased?

Ok, now we're quibbling over words. Your definition of memory clock is how many transactions you can do per second, and mine is how fast the interface is clocked (i.e. the clock-doubled rate).

Bandwidth is the datarate.

It of course depends on what reference you start out with. I've seen:

Voodoo2 with 80Mbit/s/pin datarate (pipelined EDO, impressive) @90MHz core
Voodoo3 with 166Mbit/s/pin @ 166MHz core

If you use the original GeForce 1 DDR as a reference you have 333Mbit/s/pin @120 MHz core and you get a completely different picture.

Next gen will see 800MHz GDDR2/3 (1600Mbit/s/pin); I doubt cores will clock @ 800MHz.
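
To illustrate how much the choice of reference matters, using the figures above (the next-gen row and its assumed 600MHz core clock are of course guesses):

# Per-pin datarate vs core clock for the parts quoted above.
parts = [
    ("Voodoo2", 80, 90),
    ("Voodoo3", 166, 166),
    ("GeForce DDR", 333, 120),
    ("next-gen GDDR2/3", 1600, 600),   # assumed 600MHz core, purely a guess
]
for name, mbit_per_pin, core_mhz in parts:
    print(f"{name:17s} {mbit_per_pin:5d} Mbit/s/pin @ {core_mhz:4d}MHz core "
          f"-> {mbit_per_pin / core_mhz:4.1f} bits/pin per core clock")

# Voodoo2              80 Mbit/s/pin @   90MHz core ->  0.9 bits/pin per core clock
# Voodoo3             166 Mbit/s/pin @  166MHz core ->  1.0 bits/pin per core clock
# GeForce DDR         333 Mbit/s/pin @  120MHz core ->  2.8 bits/pin per core clock
# next-gen GDDR2/3   1600 Mbit/s/pin @  600MHz core ->  2.7 bits/pin per core clock

So starting from a Voodoo-era baseline, per-pin datarate has grown much faster than core clock; starting from the GeForce DDR, it has merely kept pace. Pick your reference, get your picture.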

Cheers
Gubbi
 
DemoCoder:

> It's a bandwidth difference primarily. Interlace requires 1/2 the datarate.

Indeed, but that is not really an issue for consoles, and soon won't be for DVDs either, leaving only broadcast, for which interlaced scanning was originally created.



london-boy:

> Remember that in Europe things are very different. To say the least.
> HDTV will not be common for years to come in here.

Well, it doesn't really matter one way or the other since it's the Japanese and American markets that dictate how the console is designed. That said, all of the big manufacturers are shipping TVs with HD tubes and JVC even has a few models with HD tuners... I suspect that Panasonic will follow soon and the rest will be forced to eventually with CRTs being phased out in favor of flatpanels.
 
Well, I just thought I'd drop a line 'cause a large percentage of people here are from Europe or the UK...
Don't get me wrong, there are "affordable" TVs capable of pro-scan here, although I'm not sure whether they go up to 720p; I'm very skeptical about that.
It's just that here in Europe we might end up like in this generation, with software that doesn't support pro-scan at all apart from on PS2, while the rest of the world is enjoying (or at least has the option to enjoy) Soul Calibur 2 at 720p on Xbox...
We don't even get the option.
 
Gubbi said:
Your definition of memory clock is how many transactions you can do per second, and mine is how fast the interface is clocked (i.e. the clock-doubled rate).
My point is that over time, MClk:EClk ratio has dropped, while the other aspects of the bus (DDR/bitcount) have stayed in balance with the core's peak throughput.

I don't know if this will stay true or change. I have no idea what GDDR3 will do to the numbers (and probably couldn't talk about it if I did!)

(My baseline was the Savage3D, the first unified memory architecture that did a pixel per clock with SDRAM. Voodoo2 used a non-unified architecture and so isn't really comparable... I could have taken the baseline even earlier, say the Virge DX, which used SDRAM with a chip that couldn't do a pixel per clock, where I could have got even bigger numbers in my favour ;) )
 
While 100Hz PAL TV's are readily available, and even low-end widescreen models often support this now, I've still yet to see any move towards HDTV in Europe. I suspect the major factor is that PAL picture quality isn't the travesty that NTSC is, so there is significantly less drive to upgrade.

That said, the shift by broadcasters to 16:9 is pretty much complete in the UK, and I suspect the other large economies too, although from the footy it is clear that many countries still mostly use 4:3. My guess is that in a couple of years it will be hard to buy a 4:3 TV over here, but maybe the small TV market will cling on a few years longer.
 