If the original Xbox used eDRAM instead

jackal256

Newcomer
Ignoring architecture differences between the PS2 and Xbox: what difference would it have made if its GPU had used 8 MB of high-bandwidth eDRAM instead of the 64 MB config it used? The GameCube is pretty much a custom ATI graphics card with 3 MB of eDRAM as video RAM.
 
Nvidia simply wasn't designing their GPUs for eDRAM, but for large pools of video memory, which NV2A could support through the northbridge. Redesigning NV2A to use eDRAM would've been prohibitively expensive (Nvidia didn't have experience with it either), and the system still would've needed the 64 MB of memory to get the most out of it anyway. In the end, the NV2A would still prove to be the superior graphics chip of the generation overall.

And the Gamecube's Flipper GPU was not really ATi, it was ArtX. ArtX was founded by former Silicon Graphics employees in 1997, then bought up by ATi in 2000 as they were finishing up Flipper. The ArtX team would go on to develop the Radeon R300 series (Radeon 9700, 9800, X600, etc).
 
I'd just give the GPU and CPU their own banks of RAM. That would greatly boost efficiency. Especially for the GPU. That should push its performance up enough to target 720p for most games instead of just a few. Maybe 64MB GPU + 16MB CPU?

Though I'm sure it was completely impractical from a cost standpoint at the time, and that it had been considered.

XBox is sort of a super N64. With the same UMA problems and benefits.
 
Was the NV2A known for being bandwidth starved at all? I kind of doubt it, because I seem to remember Xbox having some games that rendered in HD.
 
Was the NV2A known for being bandwidth starved at all?
Yes. It certainly didn't have enough bandwidth during alpha blending.

Theoretically, it can do 4 pixels per clock, so with full blending @ 4 B per pixel (read + write) that requires ~7.5 GB/s just for a single target (233MHz core clock). Throw in a Z-buffer and that's doubled, and then you have compounding issues with shared bandwidth plus practical bandwidth utilization, and you get a fairly starved GPU during those operations in a time when forward renderers were dominant.
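If anyone wants to sanity-check the arithmetic, here's a quick back-of-the-envelope sketch using the figures above (4 pixel pipes, 233 MHz core, 4 B colour and 4 B Z per pixel); the ~6.4 GB/s in the comment is the console's total shared DDR bandwidth:
Code:
# Back-of-the-envelope fill-rate bandwidth, using the figures quoted above:
# 4 pixel pipes, 233 MHz core clock, 4 B colour and 4 B Z per pixel.
CLOCK_HZ       = 233e6
PIXELS_PER_CLK = 4

def fill_bw_gbs(bytes_per_pixel):
    """GB/s of memory traffic needed to keep all pipes busy at this per-pixel cost."""
    return PIXELS_PER_CLK * CLOCK_HZ * bytes_per_pixel / 1e9

print(fill_bw_gbs(4 + 4))          # colour read + write -> ~7.5 GB/s
print(fill_bw_gbs(4 + 4 + 4 + 4))  # plus Z read + write -> ~14.9 GB/s
# ...against roughly 6.4 GB/s of shared DDR for the whole system.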

This is readily apparent with the MGS2 port (rain stuff) and during particular scenes in Ninja Gaiden when there's blood effects borking the framerate briefly. Doom 3 had major framerate issues as well despite the massive amount of work Vicarious Visions did to alter the campaign levels. FarCry Instincts also had issues when rendering the forest.

Team Ninja worked hard for the DOA games, but they're the exception.

There's a reason MS went overboard for 360 to solve that issue. Sticking with that design philosophy for Durango was in some ways a logical decision because of the benefits of having a non-shared space, but ultimately there were other factors that still made it less than ideal in a practical sense (software side) compared to the simpler setup of Liverpool. But I digress.
 
It would probably be more realistic to look at faster RAM being possible, given that 128 MB @ 500 MHz was a reality in 2002 for sub-$150 cards, I think.
 
It would probably be more realistic to look at faster RAM being possible, given that 128 MB @ 500 MHz was a reality in 2002 for sub-$150 cards, I think.
I might be missing part of the discussion, but MS/nVidia would have been looking at early 2001 to lock down memory specs I think. DDR @ 500 (250MHz) was the top end for Geforce 3 circa October 2001, and given the ludicrous contract prices that MS fumbled into, it probably just wasn't an option, sadly. It'd have given them 8GB/s on the 128-bit bus.

I do wonder if it'd have been a better trade-off to ease off the GPU clock even further (it was already down from 250MHz) for higher yields and put the money towards faster RAM, but who knows if the nV contract would have been flexible there.
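Just to put numbers on it, a tiny sketch of the bus math (assuming the same 128-bit bus as the shipped console; 500 MT/s is the hypothetical faster DDR):
Code:
# Bus bandwidth = bus width (bytes) x transfer rate. 128-bit bus assumed,
# as in the shipped console; 500 MT/s is the hypothetical faster DDR.
def bus_bw_gbs(bus_bits, transfers_per_sec):
    return bus_bits / 8 * transfers_per_sec / 1e9

print(bus_bw_gbs(128, 400e6))  # shipped 200 MHz DDR -> 6.4 GB/s
print(bus_bw_gbs(128, 500e6))  # "DDR @ 500"         -> 8.0 GB/s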
 
I'd just give the GPU and CPU their own banks of RAM. That would greatly boost efficiency. Especially for the GPU. That should push its performance up enough to target 720p for most games instead of just a few. Maybe 64MB GPU + 16MB CPU?

Though I'm sure it was completely impractical from a cost standpoint at the time, and that it had been considered.

XBox is sort of a super N64. With the same UMA problems and benefits.
Wonder how they went about the routing from the CPU to NV2A (bandwidth and lanes). Was Desktop Coppermine/Celeron feeding a dual chan 128-bit DDR? Anyone have a system block diagram? I'm not sure we ever got one of those with the full figures.

At a high level, probably this (simplified)
Code:
CPU <-> north bridge <-> DDR
             |
            NV2A

Dedicated memory spaces would have been costly for the motherboard & memory chips. If anything, the trade-off would have been NUMA with 64MB maximum (Geforce 3 cross bar was 4x32-bit IIRC) rather than 64MB GPU + x MB CPU (more mobo routing & memory chips), so it would have been better to have the flexibility of a UMA and have devs work around it.

---------------


It'd probably need to be a more custom design from the get-go in order to get something more like a super Gamecube (Wii!), but circa 2001...

e.g.
eDRAM
512-bit width (4 x 128-bit)
512/8 * 233MHz = 14.9 GB/s (≈13.9 GiB/s) -> which is exactly what the 4 ROPs would need (quick sketch after the buffer sizes below).

32bpp backbuffer + 32-bit Z (8 B per pixel)... well,
640x480 = 2.34MiB
640x576 = 2.81MiB
704x480 = 2.57MiB
720x480 = 2.63MiB
800x480 = 2.93MiB

848x480 = 3.10MiB (848 aligns with 16)
854x480 = 3.13MiB

(anamorphic scaling, of course)

4 x 768kiB slices for 3MiB ?
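For what it's worth, here's the quick sketch of those hypothetical figures (512-bit internal bus at the 233 MHz core clock, and 8 B per pixel for 32bpp colour + 32-bit Z):
Code:
# Hypothetical eDRAM figures from above: 512-bit internal bus at 233 MHz,
# and 32bpp colour + 32-bit Z (8 B per pixel) for the buffer sizes.
CLOCK_HZ = 233e6

def edram_bw_gbs(bus_bits):
    return bus_bits / 8 * CLOCK_HZ / 1e9

def framebuffer_mib(width, height, bytes_per_pixel=8):
    return width * height * bytes_per_pixel / 2**20

print(round(edram_bw_gbs(512), 1))                # ~14.9 GB/s
for w, h in [(640, 480), (720, 480), (848, 480)]:
    print(w, h, round(framebuffer_mib(w, h), 2))  # ~2.3 to ~3.1 MiB, in line with the list above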

Then nV would have had to add some more cache to NV2A on the texture side or something.

lalalalalala

As Mobius mentioned, the experience wasn't there for eDRAM (limited to NEC tech at the time IIRC) to be a factor in the design, and Xbox was more or less the trenches from which the 360 design arose anyway. It just wasn't going to happen barring Marty McFly cheating.
 
Wonder how they went about the routing from the CPU to NV2A (bandwidth and lanes). Was Desktop Coppermine/Celeron feeding a dual chan 128-bit DDR? Anyone have a system block diagram? I'm not sure we ever got one of those with the full figures.

At a high level, probably this (simplified)
Code:
CPU <-> north bridge <-> DDR
             |
            NV2A

Dedicated memory spaces would have been costly for the motherboard & memory chips. If anything, the trade-off would have been NUMA with 64MB maximum (16-bit I/O per chip, 8 chips) rather than 64MB GPU + x MB CPU (more mobo routing & memory chips), so it would have been better to have the flexibility of a UMA and have devs work around it.

It'd probably need to be a more custom design from the get-go in order to get something more like a super Gamecube (Wii!), but circa 2001...

As Mobius mentioned, the experience wasn't there for eDRAM (limited to NEC tech at the time IIRC) to be a factor in the design, and Xbox was more or less the trenches from which the 360 design arose anyway. It just wasn't going to happen barring Marty McFly cheating.

This page describes the nForce chipset implementation.

Side note: nForce was rad.
 
This is readily apparent with the MGS2 port (rain stuff) and during particular scenes in Ninja Gaiden when there's blood effects borking the framerate briefly. Doom 3 had major framerate issues as well despite the massive amount of work Vicarious Visions did to alter the campaign levels. FarCry Instincts also had issues when rendering the forest.

MGS2's rain couldn't be done better on the OG Xbox? Played both a few weeks ago (PS2/Xbox); it seems a so-so port from more than just a graphics perspective. MGS2 was designed for the PS2 from the ground up.
 
MGS2's rain couldn't be done better on the OG Xbox? Played both a few weeks ago (PS2/Xbox); it seems a so-so port from more than just a graphics perspective. MGS2 was designed for the PS2 from the ground up.
They'd probably have to compromise somewhere. PS2 fillrate/bandwidth was a huge huge advantage. At the time, there might not have been a particularly good solution other than cutting down on the effect or the framebuffer resolution to begin with.

Using a lower res alpha buffer isn't necessarily a win since there is a different overhead involved, and besides, I don't recall if that sort of solution was in the minds of developers just yet (and perhaps not for a porting job in that circumstance). Dynamic resolution was in a couple titles, but again, not something that was well known.
 
There's a reason MS went overboard for 360 to solve that issue.

I wonder how many members of the ArtX team were involved with that, since they would've had the experience in eDRAM. Couple that with Xenos' shaders being derived from R500's vertex shaders, and R500 being a second follow-on of R300 developed by the acquired ArtX team... I'd love to see some real documentation on Xenos' development.
 
I wonder how many members of the ArtX team were involved with that, since they would've had the experience in eDRAM. Couple that with Xenos' shaders being derived from R500's vertex shaders, and R500 being a second follow-on of R300 developed by the acquired ArtX team... I'd love to see some real documentation on Xenos' development.
Indeed... I've kinda pinged Richard @ DF a few months back about whether we could bug Phil for some history there or at least point to some folks who were directly involved. It would make for an interesting retrospective series for Xbox history.

But... yeah, I dunno if it'd happen or not. Spam-tweet DF I guess. ¯\_(ツ)_/¯
 
This is readily apparent with the MGS2 port (rain stuff) and during particular scenes in Ninja Gaiden when there's blood effects borking the framerate briefly. Doom 3 had major framerate issues as well despite the massive amount of work Vicarious Visions did to alter the campaign levels. FarCry Instincts also had issues when rendering the forest.

While I agree that MGS2's Xbox port isn't the best, it wasn't great for PC either. If I remember correctly the PC port had the same issues; I think the recommended specs were a 9700 Pro, which I had in my AXP 2100+ system at the time, and MGS2 didn't run well on that system, I can tell you. The 9700 Pro was a monster btw, beat anything Nvidia had at DX9.
Doom 3 was/is a nice port for what it is really; with just 64 MB of RAM and a custom 733 MHz CPU, for a high-end game designed with ultra-high-end PC hardware in mind, I think it was quite an achievement. Same for Far Cry, to a slightly lesser extent.

See, aside from Ninja Gaiden, the other games Xbox got were mostly ports, or PC games at some point in their development. We would have to see Konami developing MGS2 for the Xbox from scratch instead to see what they could do with the hardware.

Team Ninja worked hard for the DOA games, but they're the exception.

Like Konami worked very hard for MGS2 on PS2; talented studios working hard with a big budget get the most out of the hardware, especially if the game wasn't built for another, very different platform to begin with.

They'd probably have to compromise somewhere. PS2 fillrate/bandwidth was a huge huge advantage. At the time, there might not have been a particularly good solution other than cutting down on the effect or the framebuffer resolution to begin with.

One of PS2's most particle-intense games, Black, ran wonderfully on the OG Xbox; nothing was pared down to my knowledge. Instead, the Xbox version looks, sounds and runs better. A shootout in a room could cause massive smoke, dust, debris etc. flying around without any slowdown, at 60fps. Loads of nice effects in that game also. Many consider Black one of the best looking 6th gen games, some even saying it could qualify as an early 7th gen game.
Panzer Dragoon Orta had some nice effects/particles too I remember, though I haven't played it for over 15 years. Quantum Redshift had a track in the rain, which I thought looked very nice, up there with MGS2's rain, perhaps. Found a video; YouTube doesn't do the graphics justice but it still looks nice today in person.
A lot can be going on in Redshift too, explosions etc., at 60fps with no slowdowns. With a cheat you can get ultra-high-speed boosts, and the game doesn't slow down one bit then either.


Seems that most think Xbox wasn't that good at particles/effects, but I don't think it was that bad at all. I guess when its vertex shaders are put to good use the Xbox is quite capable in those areas too.
I've never seen anything like ZoE2 on PS2 though, not on OG Xbox or GameCube at least. Even though it had slowdowns on PS2, it was very special for its time.
One would think Xbox could have done MGS2 much better, perhaps at 60fps in the rain, just looking a bit different.

Oh and, if it's worth anything, I managed to get Quake 3 running at 1280x960 on the OG Xbox; it's just the PC version, not even a port. Runs very well for what it is. Not bad for being at high settings.

I don't think MS should have done anything different, same for Sony and Nintendo. PS2 was the best utilized and got the best games imo.
 
Indeed... I've kinda pinged Richard @ DF a few months back about whether we could bug Phil for some history there or at least point to some folks who were directly involved. It would make for an interesting retrospective series for Xbox history.

But... yeah, I dunno if it'd happen or not. Spam-tweet DF I guess. ¯\_(ツ)_/¯

If I used Twitter, I guess I'd bug them too.

While I agree that MGS2's Xbox port isn't the best, it wasn't great for PC either. If I remember correctly the PC port had the same issues; I think the recommended specs were a 9700 Pro, which I had in my AXP 2100+ system at the time, and MGS2 didn't run well on that system, I can tell you. The 9700 Pro was a monster btw, beat anything Nvidia had at DX9.
Doom 3 was/is a nice port for what it is really; with just 64 MB of RAM and a custom 733 MHz CPU, for a high-end game designed with ultra-high-end PC hardware in mind, I think it was quite an achievement. Same for Far Cry, to a slightly lesser extent.

I'd like to get a 1 GHz-ish Pentium 3 + Geforce 3 system to see if I can get FEAR and Call of Duty 2 to run on it at some kind of worthwhile baseline. While I understand the latter not coming to the Xbox for "political reasons" (that gen had its own CoD series), I'd bet that an Xbox port of FEAR was considered and explored. A pared-back version with fewer particles and physicalized objects seems easily doable at 480p.
 
At 640x480 you can go pretty far with a GeForce 3. You can run the full PC Doom 3 adequately that way, so FEAR seems doable. I don't remember how acceptable FEAR's D3D8 mode is though.
 
https://www.anandtech.com/show/1416/5
Not too hopeful about it. :p

hm... Ti 4400 should be slightly above Geforce 3 Ti500, I think, although the 4 series has 2 vertex shader pipes (NV2A was probably somewhat more of an underclocked 4 series in that regard).

There's probably a fairly significant CPU dependency for the early stencil shadow games as well.

I suppose we'd be looking at 480p30, so that's maybe the basis for their attempt to port to NV2A along with the drastically altered campaign and various cut-backs (no flashlight shadowing, texture compromise) to accommodate the lower console specs.
 
I'd like to get a 1 GHz-ish Pentium 3 + Geforce 3 system

Same here, I should never have done away with any hardware I've owned. Should have kept the P2 450 with the Voodoo 3 too.

Ti 4400 should be slightly above Geforce 3 Ti500

It's more than slightly; I went from a Ti 500 to a Ti 4200 and the difference was there in later, more modern games. Twin vertex shaders and 128 MB of memory with higher clocks wasn't a bad upgrade. The Ti 4200 was ultra good value for the price too, no reason for a Ti 4600 back then.
 