Lots 'o Embedded DRAM on R500

Now, if it is indeed true that the Xbox 2's chip will use eDRAM, that may actually be how the chip supports higher resolutions. If the eDRAM is not necessarily designed to house the entire framebuffer on-die, but instead to intelligently cache it at higher resolutions, then it could go far. If it is designed to hold the entire framebuffer on-die, then performance at higher resolutions may suffer.
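As a quick back-of-envelope sketch of why "entire framebuffer on-die" gets expensive at high resolutions (the 10MB eDRAM figure below is a made-up example for illustration, not a known R500 spec):

```python
# Rough framebuffer sizing: does colour + Z fit in a given eDRAM pool?
# The eDRAM capacity is a hypothetical example, not a known spec.
EDRAM_MB = 10

def framebuffer_mb(width, height, bytes_per_pixel=4, buffers=2):
    """Size in MB of one colour buffer plus one Z buffer, 32 bits each."""
    return width * height * bytes_per_pixel * buffers / 2**20

for w, h in [(640, 480), (1280, 720), (1920, 1080)]:
    mb = framebuffer_mb(w, h)
    fits = "fits" if mb <= EDRAM_MB else "does not fit"
    print(f"{w}x{h}: {mb:.1f} MB -> {fits} in {EDRAM_MB} MB of eDRAM")
```

SDTV-class resolutions fit comfortably; a 1920x1080 colour + Z pair is roughly 16MB before any FSAA, which is exactly why a cache rather than a full on-die store makes sense at high resolutions.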

It wasn't very hard to translate a rather misleading speculation, was it? Nine out of ten small tidbits get interpreted in an obscure or even exaggerated way. I'm not in a position to know what the specific HW will really contain and for what purposes, but an intelligent cache makes far more sense, doesn't it?

As a reminder of the usual nonsense that circulates the rumour mill: the NV40 was supposed to have 16 pipelines and/or 16MB of eDRAM... *cough*
 
london-boy:

> Don't get me wrong, there are "affordable" TVs capable of pro-scan
> here, although I'm not sure whether they go up to 720p; I'm very
> skeptical about that.

There are rear projection screens capable of 720p.



Dio:

> I suspect the major factor is that PAL picture quality isn't the travesty
> that NTSC is so there is significantly less drive to upgrade.

Actually, the reason is that local authorities and broadcasters alike prefer quantity over quality. Instead of getting HDTV we're getting more channels. This is the situation in all of Europe. In fact, the only country that has committed to broadcasting HDTV in the DVB standard is Australia. It's quite a commitment too, as they are required to broadcast in both SDTV and HDTV during a transition period.
 
cybamerc said:
Actually, the reason is that local authorities and broadcasters alike prefer quantity over quality. Instead of getting HDTV we're getting more channels.
I don't think the authorities are much involved - in the UK at least; we've farmed just about the entire thing out to Murdoch now. If he could see a commercial advantage to HDTV you can bet he'd be on it like a shot.

I reckon we'll have to wait until the BBC throws money down the drain for a few years to build the market.

Just like the 16:9 transition really. The good news is that 16:9 is done, DAB radio is done, so they are likely to be looking at HDTV next. The bad news is that the analogue->digital signal switch coming up (2007, is it?) seems likely to me to absorb the 'technical advancement' cash - especially now the BBC is running the whole thing.
 
Dio said:
My baseline was the Savage3D (the first unified memory architecture that did pixel/clock with SDRAM). Voodoo2 used a non-uniform architecture and so isn't really comparable.

Yes, the V2 had three 64-bit buses. My numbers were for one such bus (640MB/s per bus). I first thought of the V3 when we started this discussion.

Dio said:
I could have taken the baseline earlier (say, the Virge DX, which used SDRAM with a non-pixel/clock supporting chip, where I could have got even bigger numbers in my favour ;) )
That would be cheating :) . The Virge DX is not a proper 3D chip (god-awful 3D performance, crap feature set etc.). Voodoo 1, perhaps.

Anyway. With next-generation memory subsystems at 50GB/s you can wade through five 32-bit 1920 x 1080 buffers 20 times per frame. And that is without Z-optimizations (hierarchical Z, compression etc.). I just don't see framebuffer/Z-buffer bandwidth being a problem (unless you want to seriously upgrade FSAA). Latency for dependent texture reads, however, will be a problem.
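For what it's worth, that figure checks out if you assume 60 frames per second (the frame rate is my assumption, not stated above):

```python
# Spelling out the arithmetic: five 32-bit 1920x1080 buffers,
# each touched 20 times per frame, at an assumed 60 fps.
width, height = 1920, 1080
bytes_per_pixel = 4          # 32-bit colour or Z
buffers = 5
passes_per_frame = 20
fps = 60

bytes_per_frame = width * height * bytes_per_pixel * buffers * passes_per_frame
gb_per_second = bytes_per_frame * fps / 1e9
print(f"{gb_per_second:.1f} GB/s")  # just under the 50GB/s budget
```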

Cheers
Gubbi
 
mboeller:

> You could even buy an rear projection TV with nearly 150cm diagonal
> screen size from Toshiba with 1920 x 1080p for ~ US$ 5000,-.

Yeah... I was talking about PAL sets though. The selection is somewhat better in the US (and other places).



Dio:

> I don't think the authorities are much involved

I assume the terrestrial net is still owned by the government? No doubt broadcasters have a lot of say, though. It's all about money in the end, and more channels means more money for everyone.

There's still satellite and cable though. And of course more and more people are using their TV sets as monitors for devices such as DVD players and video games. Broadcast standards shouldn't dictate what kind of sets are made available to the European market... and won't for much longer.
 
Gubbi said:
the Virge DX is not a proper 3D chip (god awful 3D performance, crap feature set etc)
DX was massively better than the original Virge, but yes, <1ppc chips are not that interesting.

I just don't see framebuffer/Z-buffer bandwidth being a problem (unless you want to seriously upgrade FSAA). Latency for dependent texture reads however will be a problem.
No argument that latency is an issue. ATI's engineers seem to be good at designing latency compensation, though, and that probably filters my priorities. And, undoubtedly, we would like to seriously upgrade AA :) - if we can do it right.
 
Dio said:
cybamerc said:
Actually, the reason is that local authorities and broadcasters alike prefer quantity over quality. Instead of getting HDTV we're getting more channels.
I don't think the authorities are much involved - in the UK at least; we've farmed just about the entire thing out to Murdoch now. If he could see a commercial advantage to HDTV you can bet he'd be on it like a shot.

I reckon we'll have to wait until the BBC throws money down the drain for a few years to build the market.

Actually, the local authorities (as in the government) have everything to do with it. I think the US government effectively forced HDTV to happen by tying use of certain parts of the RF spectrum/satellites to a commitment from the telcos/broadcasters etc. to implement HDTV.

Oh, and it's unlikely that the BBC could afford to kick-start HD in the UK without a fundamental change to its funding method.

John.
 
Dio said:
DX was massively better than the original Virge, but yes, <1ppc chips are not that interesting.


Do you have any specs and/or reviews? I wasn't able to find any. It seems these cards were never really tested by a 3D site.
 
mboeller said:
Dio said:
DX was massively better than the original Virge, but yes, <1ppc chips are not that interesting.


Do you have any specs and/or reviews? I wasn't able to find any. It seems these cards were never really tested by a 3D site.

Yeesh, I just had a post-traumatic stress flashback of choosing between the Diamond Stealth and the Matrox Mystique (oy, there's a name you don't hear much anymore) based on a PC Mag Editor's Choice rec of the Diamond...
 
Doesn't surprise me too much. S3's name was being dragged through the mud at that point. The Virge and Virge/VX (which used dual-ported VRAM) were the early chips; the DX/GX followed those, and the GX2 came later.

IIRC the main differences were:
- slightly higher engine clock (don't know what either clock was, though)
- support for SDRAM (DX) and SGRAM (GX)
- free perspective correction (was very expensive on Virge)
- faster bilinear filtering

After those changes it wasn't that bad a chip. Which isn't to say it was a good chip, but S3 didn't quite deserve the crap that was thrown at them over the Virge.

The GX2 had a higher clock again than the GX (and some more tweaks that I fail to remember because I was moving focus to the Savage3D team at that point).
 
Dio said:
After those changes it wasn't that bad a chip. Which isn't to say it was a good chip, but S3 didn't quite deserve the crap that was thrown at them over the Virge.

What I recall is that they missed a product cycle and ended up selling it for a year longer than they should have at a time when the newness of the niche guaranteed that anything a generation old was going to look pathetic. How that maps to "deserve" is a subjective question. :LOL:
 
Dio said:
After those changes it wasn't that bad a chip. Which isn't to say it was a good chip, but S3 didn't quite deserve the crap that was thrown at them over the Virge.

The GX2 had a higher clock again than the GX (and some more tweaks that I fail to remember because I was moving focus to the Savage3D team at that point).
I seem to recall seeing 3D Winbench 2000 running on a Virge (probably a GX2)... it was pretty much a slideshow at 10x7.
 
cybamerc said:
> It seems 1080i is rare enough on HDTVs (forget about 1080p)

To the renderer it's irrelevant whether the output is 1080i or 1080p; the rendering resolution is the same unless you do field rendering, with nasty tearing as a result.
Who said SLI is dead? ;)
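cybamerc's point is just pixel arithmetic: unless the renderer draws individual fields, 1080i and 1080p both demand the full frame every refresh. A trivial sketch:

```python
# Pixels the renderer must produce per refresh.
full_frame = 1920 * 1080        # 1080p, or 1080i without field rendering
one_field = 1920 * (1080 // 2)  # field rendering: every other scanline
print(full_frame, one_field)    # field rendering halves the work
```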
 
mboeller said:
Dio said:
DX was massively better than the original Virge, but yes, <1ppc chips are not that interesting.


Do you have any specs and/or reviews? I wasn't able to find any. It seems these cards were never really tested by a 3D site.

No one gave a damn. It was all about Voodoo and Verite back then. A shame, because when I finally switched from the Virge to the RageII+, the RageII was actually slower in most cases and had much worse dithering than my old Virge. The Virge got a huge number of driver updates, but the RageII only a few. The Virge's DirectX speed improved around 50% during the time I had it. The reason I didn't switch back to S3 was that the ATI 3DCharger had 4MB of memory, while my Diamond Stealth 3D 2000 had only 2MB. In the switch I also lost 24-bit rendering, though the Virge was never really more than a slideshow at that anyway.
 
I think you just hit on the reason why it was all about Voodoo and Verite back then. Everybody else's products sucked. Yes, some sucked more than others, but I don't think many people cared.
 
Pete said:
cybamerc said:
> It seems 1080i is rare enough on HDTVs (forget about 1080p)

To the renderer it's irrelevant whether the output is 1080i or 1080p; the rendering resolution is the same unless you do field rendering, with nasty tearing as a result.
Who said SLI is dead? ;)
Me. But what in the world does that have to do with the above post?
 
Heathen said:
Oh, and it's unlikely that the BBC could afford to kick-start HD in the UK without a fundamental change to its funding method.

They'll just double the licence fee. ;)

To be honest I'd be very happy to pay double the TV licence if that led to HDTV becoming standard within the next 3 years. VERY happy.
 
Nappe1 said:
It was all about Voodoo and Verite back then.
3dfx yes, Rendition no. I never saw a Rendition card. Maybe that was just the UK though.

Virge's DirectX speed improved around 50% during the time I had it.
"The fate of the company is in your hands..." (in-joke)

In the switch I also lost 24-bit rendering, though the Virge was never really more than a slideshow at that anyway.
Really? I usually found 8888 and 565 colour gave identical performance because the core was the limiting factor. Maybe the lack of video memory was the problem.
 