An original Xbox can do Doom 3 at 720p natively with extra RAM

Xbox, in its first year, had a bunch of games with extra content or exclusive features. Not exclusive titles, but features or levels that were exclusive to Xbox. I don't remember every title, but I do remember Spider-Man was a big one: the movie tie-in game based on the Raimi film had a Kraven the Hunter level that only appeared on Xbox. I think Tony Hawk had some exclusive levels or characters as well. And there were definitely more.

And again, I get that adding more memory helps Xbox achieve more, but really, given cost at the time and what its contemporary competition had, memory was hardly Xbox's weakest point. In fact, it was one of its strengths.

I'm not sure if you meant "...in addition to exclusive titles...", but there were certainly quite a few Xbox exclusive titles at launch. This was a time when a Japanese developer (Team Ninja) was developing almost exclusively for Xbox. Itagaki was a huge fan of the Xbox and X360.

Anyway, yeah, the Xbox versions of a lot of multiplatform games often featured Xbox specific content. Soul Calibur II as another example had Spawn in the Xbox version.

Regards,
SB
 
They should have just had user-accessible DIMM slots. Drop in your own PC3200 DIMMs.

Maybe a socketed CPU as well. That would make the PIII-S Tualatin upgrades a bit simpler. ;)
 
Yes, but that's without optimization. A lot of games struggled at 480p or had frame drops even with optimization, mostly on the other machines.

What I'm saying is: give devs 512 MB of RAM and target 480p to save resources. I think that would have greatly extended its life in a "what if" scenario.

It's a bandwidth-starved machine with a CPU that had less power than the PS2's CPU. If it had at least used 8 MB of eDRAM at 18 GB/s, that would have greatly helped with the bandwidth issues, with the 64 MB split off as main RAM.
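For reference, the bandwidth figures being thrown around in this thread can be sanity-checked with some back-of-the-envelope arithmetic. The numbers below are the commonly cited public specs (128-bit unified DDR on Xbox, the GS's very wide eDRAM bus on PS2), not measurements, so treat them as ballpark:

```python
# Peak bandwidth = bytes per transfer * transfers per second.
# Figures are commonly cited public specs, not measurements.

def peak_bw_gbs(bus_bits: int, transfers_per_sec: float) -> float:
    """Peak memory bandwidth in GB/s."""
    return (bus_bits / 8) * transfers_per_sec / 1e9

# Xbox: single 64 MB DDR pool, 128-bit bus at 400 MT/s,
# shared between CPU and GPU.
xbox_unified = peak_bw_gbs(128, 400e6)   # ~6.4 GB/s

# PS2: 4 MB eDRAM on the Graphics Synthesizer, 2560-bit
# internal bus at 150 MHz.
ps2_edram = peak_bw_gbs(2560, 150e6)     # ~48 GB/s

print(xbox_unified, ps2_edram)
```

Which is the whole "bandwidth starved" argument in two lines: the Xbox's one unified pool had to feed everything, while the PS2's tiny eDRAM had an order of magnitude more bandwidth for framebuffer traffic.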
 
Bit of a bump of an old thread, but I seem to remember the OG Xbox using a Conexant CX25871 video encoder that has a maximum input resolution of 1024x768?

Meaning that no game on Xbox would be rendering at native resolutions above that, as the Conexant chip wouldn't accept the input signal.

So Doom 3 couldn't possibly be rendering at native 720p.
 
Bit of a bump of an old thread, but I seem to remember the OG Xbox using a Conexant CX25871 video encoder that has a maximum input resolution of 1024x768?

Meaning that no game on Xbox would be rendering at native resolutions above that, as the Conexant chip wouldn't accept the input signal.

So Doom 3 couldn't possibly be rendering at native 720p.

Oh well, that's a looooong time ago; this got tossed around during the 6th-generation console wars, lol. Damn, the memories. Anyway, from what I remember when this "Xbox can't output more than 1024x768" claim was going around, the Conexant encoder chip (there were Focus TV encoder chips as well) has programmable vsync/hsync parameters, so specific resolutions can be set that way, even HD ones.

There are games rendering 720p natively on the OG Xbox, as well as true 1080i. Using a modded Xbox running XBMC, you can run Quake 3 (the PC port) at 720p or higher, to the point where the console will struggle, but it does output the set resolutions natively. One of my Xboxes is running XBMC at true 1080i.

I think the confusion comes from the PS2's (softmodded) OPL display modes? Those are actually not the resolutions as set.
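For context on what "programmable hsync/vsync parameters" buys you, here's a quick sketch using the standard CEA-861 720p60 timing figures. These are the public spec numbers, not anything specific to the Conexant chip; the point is just that an encoder whose sync timing is programmable can be handed an HD mode:

```python
# Standard CEA-861 720p60 timing: the kind of numbers a programmable
# TV encoder's hsync/vsync registers would need to be fed to output HD.
# These are public spec figures, nothing chip-specific.

PIXEL_CLOCK_HZ = 74.25e6          # 720p60 pixel clock
H_ACTIVE, H_TOTAL = 1280, 1650    # active vs total pixels per line
V_ACTIVE, V_TOTAL = 720, 750      # active vs total lines per frame

# Refresh rate falls out of the totals, blanking included.
refresh_hz = PIXEL_CLOCK_HZ / (H_TOTAL * V_TOTAL)
print(refresh_hz)  # 60.0
```

So a hard "1024x768 maximum" only holds if the encoder is locked to fixed mode tables; with programmable sync parameters, 1280x720 active within a 1650x750 total is just another set of register values.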
 
Oh well, that's a looooong time ago; this got tossed around during the 6th-generation console wars, lol. Damn, the memories. Anyway, from what I remember when this "Xbox can't output more than 1024x768" claim was going around, the Conexant encoder chip (there were Focus TV encoder chips as well) has programmable vsync/hsync parameters, so specific resolutions can be set that way, even HD ones.

There are games rendering 720p natively on the OG Xbox, as well as true 1080i. Using a modded Xbox running XBMC, you can run Quake 3 (the PC port) at 720p or higher, to the point where the console will struggle, but it does output the set resolutions natively. One of my Xboxes is running XBMC at true 1080i.

I think the confusion comes from the PS2's (softmodded) OPL display modes? Those are actually not the resolutions as set.

You know of any old threads that talk about bypassing the chip's 1024x768 limit and games that did it?

Forcing PS2 games to 480p dramatically improves the image over 480i, even though it's not real 480p.
 
You know of any old threads that talk about bypassing the chip's 1024x768 limit and games that did it?

Those are 2002-to-2004 forum days, where many discussions were had. At least one of those forums (the PVC forums) was shut down 15 years ago already. I remember reading many 6th-gen discussions here at B3D too, but the search engine doesn't really net any results. Try the Sega-16 boards as well; I think they're still there.
There were even more discussions regarding GT4 being 1080i or not. Those were the really interesting console wars, I think; nowadays it's all about trying to find that minuscule difference between the systems, but OK.

Forcing PS2 games to 480p dramatically improves the image over 480i, even though it's not real 480p.

Yeah it does, especially on a non-CRT display. Here in Europe we got S-Video cables supplied with the system... even SCART was better. But of course component was the way to go.
 
Those are 2002-to-2004 forum days, where many discussions were had. At least one of those forums (the PVC forums) was shut down 15 years ago already. I remember reading many 6th-gen discussions here at B3D too, but the search engine doesn't really net any results. Try the Sega-16 boards as well; I think they're still there.
There were even more discussions regarding GT4 being 1080i or not. Those were the really interesting console wars, I think; nowadays it's all about trying to find that minuscule difference between the systems, but OK.

Yep, hardware was so much more interesting back then... "My console is the best because it can render 65,000,000 polygons." "Now my console is better because it can render 65,000,001 polygons." :runaway:

Yeah it does, especially on a non-CRT display. Here in Europe we got S-Video cables supplied with the system... even SCART was better. But of course component was the way to go.

I'm currently playing 6th gen via RGB SCART (I'm in the UK) on a 28" Philips Pixel Plus CRT... so many games from that time I want to replay, and others I want to play for the first time :love:
 
Yep, hardware was so much more interesting back then... "My console is the best because it can render 65,000,000 polygons." "Now my console is better because it can render 65,000,001 polygons." :runaway:

Don't forget the bit wars :p Yes, that was still kind of a thing in the early 6th-gen days on some forums. The architectures were so vastly different that comparisons were almost impossible other than looking at the games themselves.

I'm currently playing 6th gen via RGB SCART (I'm in the UK) on a 28" Philips Pixel Plus CRT... so many games from that time I want to replay, and others I want to play for the first time

Nothing beats a solid CRT for those older consoles. Anyway, the best way to relive these games (or play them for the first time) is to softmod your PS2 and Xbox, and possibly GameCube, opening up possibilities like streaming off the HDDs, and the game libraries of course.
 
That's no shock; it came out over 1.5 years after the PS2. Remember, the pace of hardware improvement has been decreasing since Moore's law started to falter.
It's like comparing the PS5 or Xbox Series X to a console coming out in 2024 and going, wow, the one from 2024 is a lot more powerful.


I mean, does the same standard apply to the PS3 (a year after the 360), or the GameCube (a year after the PS2, same time as the Xbox)? Or heck, even back to the SNES; I'm not sure exactly, but it came a good while after the Mega Drive and was not decisively more powerful.

Coming later is not necessarily an indicator of power. See also pretty much most Nintendo consoles.
 
Coming later is not necessarily an indicator of power
Sure, you can pick examples that contradict it, but if you put every single example into a spreadsheet and run a linear regression on the data, you will see that after 1.5 years you would expect the machine, on average, to be more powerful; that's just a fact.
The same is true of a lot of other things. A favourite used by climate change deniers is: "Oh, today, Friday the 15th of October, was colder than the 15th of October last year, thus the models are bullshit."

BTW, I think every Nintendo console was more powerful than the previous Nintendo console. Is this not the case?
NES > SNES > N64 > GC > Wii > Wii U > Switch. I know the Wii gets stick, but I thought it was more powerful than the GC.

Now if the order was
NES > SNES > GC > N64 > Wii > Wii U > Switch
then that would be surprising.

Sure, you have exceptions in parts, e.g. the CPU in the PS3 > the CPU in the PS4, but overall the PS4 > PS3.
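The "spreadsheet plus linear regression" point above can be sketched in a few lines. The GFLOPS figures below are rough, commonly quoted peak numbers, picked purely for illustration (and put on a log scale, since raw throughput grows multiplicatively); they are not from this thread:

```python
# Toy version of the argument: regress log10(peak GFLOPS) on release
# year across consoles. GFLOPS values are rough public ballpark peaks,
# for illustration only.
import math

consoles = [
    ("PS2", 2000, 6.2), ("GameCube", 2001, 9.4), ("Xbox", 2001, 20.0),
    ("Xbox 360", 2005, 240.0), ("PS3", 2006, 230.0), ("Wii", 2006, 12.0),
    ("Wii U", 2012, 176.0), ("PS4", 2013, 1843.0),
    ("Xbox One", 2013, 1310.0), ("Switch", 2017, 393.0),
    ("PS5", 2020, 10280.0),
]

xs = [year for _, year, _ in consoles]
ys = [math.log10(gflops) for _, _, gflops in consoles]
n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n

# Ordinary least-squares slope: covariance over variance.
slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))

print(slope > 0)  # a later release predicts more power, on average
```

Individual counterexamples (Wii, Switch) don't flip the sign of the slope; that's exactly the "on average" claim.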
 
Depends how you see it. Hardware-wise, the 2001 Xbox was actually quite a bit older (for the time) than its release date suggests. The P3 733 is pretty much a 1999 CPU part; the NV2A is a bit harder to judge, but I think the GF3 is quite close in raw power, and that's an early 2001 part if I remember correctly. The 64 MB of DDR400 (?) is from around that same time frame. Seeing as the PS2 launched worldwide in October 2000, the actual Xbox hardware is kind of from the same era.

Not that it matters a single bit; the GC, PS2 and Xbox all belong to the same generation (6th gen).

Sure, you have exceptions in parts, e.g. the CPU in the PS3 > the CPU in the PS4

That's quite wrong.
 
I mean, does the same standard apply to the PS3 (a year after the 360), or the GameCube (a year after the PS2, same time as the Xbox)? Or heck, even back to the SNES; I'm not sure exactly, but it came a good while after the Mega Drive and was not decisively more powerful.

Coming later is not necessarily an indicator of power. See also pretty much most Nintendo consoles.
True. In the case of the PS3, it was delayed and was not intended to release after the 360. There is nothing to suggest Sony was able to leverage that time to improve the base specifications. Had it actually launched on time, I cannot imagine what the basic OS functionality would have been like, because it was pretty rough when it launched in the EU, which was 5-6 months after Japan and the USA.

Microsoft was changing the original Xbox right up to the wire, like the infamous change from AMD to Intel CPU cores. Sometimes a delay gives you time, but you can do nothing with it. Sometimes you change things for no apparent reason, like the AMD-to-Intel switch being "pure politics", according to the original Xbox designer.
 
Sure, you can pick examples that contradict it, but if you put every single example into a spreadsheet and run a linear regression on the data, you will see that after 1.5 years you would expect the machine, on average, to be more powerful; that's just a fact.
The same is true of a lot of other things. A favourite used by climate change deniers is: "Oh, today, Friday the 15th of October, was colder than the 15th of October last year, thus the models are bullshit."

BTW, I think every Nintendo console was more powerful than the previous Nintendo console. Is this not the case?
NES > SNES > N64 > GC > Wii > Wii U > Switch. I know the Wii gets stick, but I thought it was more powerful than the GC.

Now if the order was
NES > SNES > GC > N64 > Wii > Wii U > Switch
then that would be surprising.

Sure, you have exceptions in parts, e.g. the CPU in the PS3 > the CPU in the PS4, but overall the PS4 > PS3.
Within its own product history Nintendo has done OK, but let's be honest: GC to Wii and Wii U to Switch were almost lateral moves. Sure, the newer consoles are better, but it isn't a "normal" generational gap. The difference between the Xbox One and the One X is larger than Wii U to Switch, I think. And the Wii U released not long before the PS4 and Xbox One and struggled to match the 360 in many games.

True. In the case of the PS3, it was delayed and was not intended to release after the 360. There is nothing to suggest Sony was able to leverage that time to improve the base specifications. Had it actually launched on time, I cannot imagine what the basic OS functionality would have been like, because it was pretty rough when it launched in the EU, which was 5-6 months after Japan and the USA.

Microsoft was changing the original Xbox right up to the wire, like the infamous change from AMD to Intel CPU cores. Sometimes a delay gives you time, but you can do nothing with it. Sometimes you change things for no apparent reason, like the AMD-to-Intel switch being "pure politics", according to the original Xbox designer.
It doesn't really matter why something came out later, only that it did. You would expect a system that came out later to outperform an older one across the board, but with the PS3 that was not the case, as most 360 games looked or ran better, often with quicker load times. And the Wii, yeah, that didn't hold up next to the 360 at all either.

I mean, has anyone here looked at the games that were on both the Xbox (original) and the Wii? The Wii port of Far Cry is terrible; the Xbox original is playable. Or how about Splinter Cell: Double Agent? The Wii version looks a generation behind even though it's technically a generation ahead.

Here's the thing: through the Wii, we get a good look at what the GameCube could do with a bunch of extra memory and real development effort optimizing for the system. It even has extra clock frequency to make it faster still. But I don't think anyone would argue it could run Doom 3 and have it look as good as the base Xbox version, much less one with extra memory.
 
It doesn't really matter why something came out later, only that it did. You would expect a system that came out later to outperform an older one across the board, but with the PS3 that was not the case, as most 360 games looked or ran better, often with quicker load times.

Of course it matters. A design that is released late in order to leverage the latest technology you would expect to be more powerful. But if a design is delayed because of a production issue, then you wouldn't.

As @Rangers said, the Switch launched four calendar years after the PS4 and Xbox One just because. It didn't use any cutting-edge technology, and no production issues were reported. It probably wasn't possible to launch it earlier because the software (games) wasn't ready. Nobody expected it to have better performance than the four-year-old devices it was now competing with.
 
It's a bandwidth-starved machine with a CPU that had less power than the PS2's CPU. If it had at least used 8 MB of eDRAM at 18 GB/s, that would have greatly helped with the bandwidth issues, with the 64 MB split off as main RAM.


There was a multiplatform dev on here years ago who put that to rest. He said that if devs took the time to optimize for the Xbox, it would take just about any PS2 game and improve everything: fps, resolution, and sometimes textures. State of Emergency is a good example; everything was built to take advantage of the PS2's high bandwidth, with tons of NPCs on screen. Later they decided to do an Xbox port and ran into some issues, but it ended up running at 2x the framerate with higher-res textures. I can dig up the article if you'd like.

And there were other examples of CPU-intensive games with higher framerates and/or resolution on Xbox. There were literally several cases where the fps was double that of the PS2, even with the PS2 as lead platform. The Ford Racing series comes to mind. Games like Hulk were open world with tons of objects onscreen, developed for the PS2 at 480p, and ran at 720p on Xbox, with a higher framerate according to GameSpot.

 
The PS2's Emotion Engine and its dual vector units were a powerhouse for sure, but whatever advantages they provided were not easy to harness. One VU was dedicated to graphics (geometry, T&L), with a direct bus connection to the Graphics Synthesizer (GS) chip. The other VU was tied to the MIPS core along with the FPU, acting as another SIMD-capable coprocessor for devs to use in whatever manner they wanted. The problem is that there were difficulties in using that second VU. Without the direct connection to the GS, it was hard to use efficiently for graphics; hell, I think for anything. Plus the FPU, despite not being as wide (I can't remember if it was scalar or dual-SIMD capable), was probably enough for most games, so devs often didn't bother.

OTOH, the Xbox GPU had two dedicated vertex shader units in a fully realized graphics pipeline. The CPU had its own built-in SIMD capabilities, not sidelined to graphics. There was a massive amount of RAM in comparison. Plus it had DirectX, so devs didn't have to play as many of the software and optimization games that PS2 devs did. It was easier, and already faster. Without getting into the SIMD specifics of the EE's vector units, the two main advantages of the PS2 were its eDRAM bandwidth and pixel overdraw. There are actually some examples of these two aspects that were beyond the Xbox, and even problematic for the 360 and PS3 to match when considering the render resolution targets of the respective systems. ZoE: The 2nd Runner and Gran Turismo 4's heat shimmer effect come to mind, the former's HD remaster being a force of nature. Doesn't really matter, though.

Xbox was a system with a very specific context and bleeding-edge dedicated graphics hardware. It had DirectX and a full suite of graphics features ready to exploit that the PS2 technically wasn't even capable of in hardware. The PS2 lacked that context, being based on the old way of doing graphics, before fully pipelined GPUs.
 
On a related note, I knew the (console and PC) apocalypse was at hand when I saw Xbox LAN parties, thanks to that Ethernet port.
 