What if MS, Sony, Nintendo go back in time to change 360, PS3, Wii HW circa 2004

Proelite

What would they have done differently?

MS:
1. More budget for cooling! Make the console larger if necessary
2. Dedicated audio DSP
3. 12-16MB of eDRAM instead of 10MB
4. No Core pack; launch with a 20GB HDD standard at $399
5. Wi-Fi standard
6. Fix the D-pad

Sony:
1. Target release in mid-to-late 2007
2. Add an extra PPU to Cell
3. Drop Nvidia from the start; switch to ATI/AMD
4. $499 SKU only, with a 40GB HDD
5. A more upgradeable OS with a non-shitty update system
6. I don't know whether increasing or unifying the RAM was possible, but they should have looked at it, if only as a last priority.
7. MUST WIN DIGITAL FOUNDRY battles convincingly 100% of the time.

Nintendo:
I think it went well for them.
 
Microsoft could have waited an extra year, launched alongside Sony, and fixed up their crappy disc drive and cooling system. That's all they had to do.

Both Microsoft and Sony could have gone with AMD x86 CPUs plus ATI GPUs, so they could have had backwards compatibility this gen for cheap. Dual-core Athlon 64s plus unified shaders could have been had in time if they launched in late 2006.

Nintendo would likely have planned an earlier successor to the Wii and planned for five-year console generations to keep up with Sony and MS.
 
Cell was probably never a good idea anyway?

Would using a more Xbox-like CPU and a better GPU (the PS3 was a year later, after all) have worked better?

Let's say a PS3 with an Xbox 360 CPU or something like it, 1GB or 768MB of DDR3, and half or a third of a G80. Wouldn't it have been massively better? I know it's probably unrealistic due to cost and time, but I've always wondered how something like a G84 (8600 GT) would have looked in a console. The G84 was already cheap enough in early 2007, and at least on PC it started to clearly outperform the 7600-7800 GT in newer games.
 

What about a Cell variant with two slower out-of-order PPUs and four SPUs? I think a design like that would have performed quite well in the real world.

2 OOE PPUs @ 2.4 GHz
4 SPUs @ 2.4 GHz
1 GB of GDDR3 RAM
~500 GFLOPS ATI GPU with 16MB of eDRAM
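Back-of-the-envelope math on that CPU spec (a rough sketch, assuming each SPU keeps the real Cell's 8 single-precision FLOPs per cycle, and generously crediting each PPU's VMX unit the same):

```python
# Rough peak single-precision FLOPS for the proposed Cell variant.
# Assumes the real Cell's 8 SP FLOPs per SPU per cycle (4-wide FMA)
# and, generously, 8 FLOPs per cycle per PPU via its VMX unit.
clock_ghz = 2.4
spu_gflops = 4 * 8 * clock_ghz   # 4 SPUs  -> 76.8 GFLOPS
ppu_gflops = 2 * 8 * clock_ghz   # 2 PPUs  -> 38.4 GFLOPS
total = spu_gflops + ppu_gflops
print(f"SPUs {spu_gflops:.1f} + PPUs {ppu_gflops:.1f} = {total:.1f} GFLOPS peak")
```

That's roughly half the ~200 GFLOPS usually quoted for the 3.2GHz PS3 Cell (PPU + 7 SPEs), trading peak throughput for throughput that ordinary code could actually reach.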
 
Don't forget the elephant in the room.

A 360 with an HD-DVD drive built in might have changed the landscape of HD movies.

360: 12MB of eDRAM for better 720p support (rough numbers below).
PS3: unified memory; ditch Rambus and the crossbar.
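Rough numbers behind the eDRAM point (a sketch, assuming Xenos-style 4 bytes of color plus 4 bytes of depth/stencil per sample):

```python
# eDRAM footprint of a 1280x720 render target at various MSAA levels,
# assuming 4 bytes color + 4 bytes depth/stencil per sample.
def framebuffer_mb(width, height, samples):
    return width * height * samples * (4 + 4) / 2**20

for samples in (1, 2, 4):
    print(f"720p {samples}xAA: {framebuffer_mb(1280, 720, samples):.1f} MB")
# 1x: ~7.0 MB  -> fits the real 10MB
# 2x: ~14.1 MB -> needs tiling at 10-12MB, just fits in 16MB
# 4x: ~28.1 MB -> needs tiling no matter what
```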

If I recall correctly, the PS3's GPU was added late, after the second Cell didn't perform acceptably. Nvidia bent Sony over a barrel. MS probably had ATI locked down to exclusivity, and Nvidia knew it.
 
What if they go back in time, or what if they COULD go back in time?

Oh, and my answer is that it's a waste of a time machine.
 
What if they already did? What we remember is exactly the result of their 1-time-use time machine.
 
fun again:

XBox 360:

4-core PWRficient CPU @ 2.4 GHz with 4MB L2 cache

1GB of 256-bit UMA GDDR3 @ 800MHz (= 51.2 GB/s)

64-ALU AMD GPU (a complete GPU with ROPs, northbridge for the CPU, etc.) @ 600 MHz (= 384 GFLOPS)

no eDRAM chip; the 51.2 GB/s should be good enough for 1280x720 HD

HDD for all versions

quiet and efficient cooling system

Downside:
the 1GB of 256-bit GDDR3/5 would still be expensive throughout the lifetime of the console due to the 8 memory chips needed.
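For what it's worth, the quoted figures check out (a sketch, assuming double-data-rate transfers and Xenos-like ALUs doing a vec4+scalar MADD, i.e. 10 FLOPs per ALU per cycle):

```python
# Sanity-check the 51.2 GB/s and 384 GFLOPS figures above.
bus_bits, mem_mhz = 256, 800
bandwidth_gbs = bus_bits / 8 * mem_mhz * 2 / 1000   # DDR: 2 transfers/clock
alus, gpu_mhz = 64, 600
gflops = alus * 10 * gpu_mhz / 1000                 # 10 FLOPs/ALU/cycle
print(f"{bandwidth_gbs:.1f} GB/s, {gflops:.0f} GFLOPS")  # 51.2 GB/s, 384 GFLOPS
```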
 
I was thinking of something really close in my previous post. Such a set-up could have been troublesome to shrink past the 55nm node, though had the generation been a "normal" length, it would not have been an issue.

Taking that into account, here is my "last" word on the matter; it's still the same easy a posteriori line of thinking, knowing where software and hardware were headed eight-plus years after the design. I would go with something like this:
1 APU + Smart eDRAM

APU: TSMC bulk 90nm process
4x PPC 750 @ 800MHz
4 Xenos SIMD arrays + 24 texture units @ 400MHz
A sound processor.

Smart eDRAM: NEC 90nm process
8 ROPs, 8MB, @ 400MHz

Memory set-up:
128 bit bus to 1 GB of DDR2 800MHz
Same 32GB/s link between the main die and the smart eDRAM.
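For comparison (a quick sketch, reading "DDR2 800MHz" as DDR2-800, i.e. 800 MT/s effective, next to the real 360's 700MHz GDDR3):

```python
# Main-memory bandwidth: proposed 128-bit DDR2-800 vs the real 360's
# 128-bit GDDR3 at 700MHz (double data rate).
ddr2_gbs  = 128 / 8 * 800 / 1000        # 12.8 GB/s (800 MT/s effective)
gddr3_gbs = 128 / 8 * 700 * 2 / 1000    # 22.4 GB/s
print(f"DDR2-800: {ddr2_gbs:.1f} GB/s, real 360 GDDR3: {gddr3_gbs:.1f} GB/s")
```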

Power consumption and costs:
IBM has the PPC 750 at 16mm^2 on their process; Broadway is 19mm^2. Bulk is denser.
Xenon is ~180mm^2.
I think the APU would have ended up in the 300mm^2 range.
As for power: with the 20% downclock versus Xenos, and despite the extra ALUs and the inclusion of those low-power CPU cores, I would not be surprised if such a chip burned just a tad more than the Xenos we got (GDDR3 memory controllers, like their GDDR5 successors, may burn more power than their DDR2/3 counterparts).
The smart eDRAM die would have been a tad smaller.

-------------------------
The reasons behind those choices:
1) Speed demons were a bad choice for the CPU; they burn a lot of power, and it still shows after many shrinks. That was not known at that point in time: the brightest minds, be it at Intel or IBM, were hoping for 10GHz monsters to take over computing.
2) They were quite big on top of it, and it is tough to get close to their peak throughput.
3) The Wii U is sort of a testament to that: 4 cores (even clocked lower than Espresso's 1.2 GHz) could have done the job. In some regards it would underperform Xenon (not to mention Cell), but when all is said and done, looking at both the silicon and power footprint, Nintendo made a wise choice. It is a worthy trade-off.
4) Looking at this gen and the PC world, more RAM was better than faster RAM. Over the lifespan of the system, DDR2 (even 1GB) should have been cheaper than GDDR3. I could not find when DDR2 PC6400 was released; if it was not available, I guess slower DDR2 could have worked. In this hypothetical system the CPU could not go through as much data as Xenon (not to mention Cell), freeing more bandwidth for the GPU. Overall I think it was a worthy trade-off.
5) A bigger but slower GPU: power consumption would have been improved, not to mention reliability would have been less of an issue.
6) Cooling, heat, reliability: we would have a single chip burning north of what Xenos or Xenon alone burnt. It would have been bigger, so it would have been easier to dissipate the heat, and there would have been a single cooling solution from scratch.
7) The eDRAM size: clearly, 10MB looks a lot like a relic of a system aiming at SD resolution + 4xAA. Looking at how AA evolved (toward non-hardware-based solutions), I think 8MB was just fine, along with a downclock (synchronized with the GPU speed, 400MHz).
8MB is all you need for a forward renderer at 720p (see the rough numbers after this list). A trade-off is that even a tight G-buffer close to 720p would not have fit. Looking at the performance of a lot of games this gen and the resolutions devs chose, I don't think it would have been an issue; something like 960x720 could have been the de facto target for G-buffers. Looking at the performance of, say, Crysis, I'm not sure it would have been such an issue. PR-wise, the "free" 4xAA proved more of a clusterfuck than anything else; actually the whole mandatory 720p requirement was more a hindrance than anything.
8) The design's main focus would not have been on adding a crazy custom (2-wide, paired-single) SIMD unit to the CPU, but on the memory subsystem: a good cache hierarchy, fast access to the memory controller, and implementing the silicon on a bulk process.
9) Xenos was the right architecture; nothing to change here. Memexport was forward-looking and would have been great in an APU with fast communication between the CPU and the shader core.
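Rough numbers behind point 7 (a sketch, assuming 4 bytes per pixel per buffer, a forward target of color + depth/stencil, and a hypothetical "tight" G-buffer of depth plus two color targets):

```python
# eDRAM footprints at 720p and a sub-HD G-buffer resolution,
# assuming 4 bytes per pixel per buffer.
def footprint_mb(width, height, buffers):
    return width * height * buffers * 4 / 2**20

print(f"720p forward (color+Z):     {footprint_mb(1280, 720, 2):.1f} MB")  # ~7.0 -> fits in 8MB
print(f"720p tight G-buffer (3):    {footprint_mb(1280, 720, 3):.1f} MB")  # ~10.5 -> does not fit
print(f"960x720 tight G-buffer (3): {footprint_mb(960, 720, 3):.1f} MB")   # ~7.9 -> just fits
```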

Overall, in some regards that system would be inferior, though I think the extra RAM would have done a lot for the system.
 

The best way is for all three to wait until 2007.

Microsoft would have a 65nm Xenon CPU; a 55nm GPU similar to the Radeon 3870, plus at least 24-30MB of eDRAM; a 256-bit bus; 1GB of unified RAM; gigabit Ethernet; and the disc drive could be HD-DVD.

Sony: a 65nm Cell BE, all SPUs enabled, at 3.8GHz; for the GPU, a 55nm Nvidia G92 with a 256-bit bus; 512MB of XDR DRAM and 512MB of GDDR3 at 1100MHz; a SATA2 interface and a 6x Blu-ray drive; a dedicated hardware sound chip; and the same heavy-duty heatsink/fan system the PS3 already used.

Nintendo: perhaps give them a full GPU based on a 55nm Nvidia G70, clocked at 750MHz, and a 4x Blu-ray drive.

This takes into account the hindsight advantage of waiting for hardware advancements, leaving the storage-format war to high-end customers for about an extra year and a half, much like DVD, and taking advantage of die shrinks for cooler-running CPUs and GPUs. Of course more power would be nicer; just keep in mind that an Xbox 360 specced the way I describe would easily offer over 2.5 times the rendering power, and likewise for such a PS3, while Nintendo would be at maybe the current level of graphics power. And this is from a power-monger's point of view.

The PC would have had DX10.1 as the standard requirement, and Crysis would easily have been long forgotten instead of being used as a bogeyman.
 
MS: launch in 2005 with a single SKU, with a hard drive and 768 megs of RAM. Sony: forget Blu-ray and launch in 2005. It was the only thing that would have saved them.

Waiting 2 years, Blu-ray still wouldn't have been cheap, and the Xbox 360 would have had 2 years to cost-reduce. I don't think any graphical jump they might have gotten would have overcome the lead and the price points.
 
Save them from what? And Sony launched later; how could MSFT have changed what was already shipped? (Not to mention they had already bumped the memory from 256MB to 512MB, and I'm not sure they could have gone further at that time on a 128-bit bus.)
There is almost a year and a half between the launch of the 360 and the launch of the PS3 in Europe.
I think they could have launched worldwide in 2007. The PS2 was still selling quite well, too.

Imo Sony was not under pressure to launch anything. What they launched, or could have launched, is another matter, though. For reference, 65nm GPUs were shipping in fall 2007.
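On that 512MB ceiling (a sketch, assuming x16 GDDR3 parts and 512Mbit as the largest density that was practical around 2005, which is what the real 360 shipped with):

```python
# Capacity of a 128-bit bus populated with x16 GDDR3 chips at 512Mbit
# density (assumed here to be the 2005-era practical maximum).
bus_bits, chip_bits = 128, 16
chips = bus_bits // chip_bits         # 8 chips fill the bus
total_mb = chips * 512 // 8           # 512Mbit each -> 512MB total
print(f"{chips} chips x 512Mbit = {total_mb}MB")
```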
 
Save them from losing half their market share and half a decade or more of losses.

The question was what they should have changed going back to 2004.

MS should have gone with more RAM and launched at $400 with a hard drive. No other SKU.

Sony should have stuck with DVD and launched in 2005 alongside MS. Being a year late and more expensive cost them big.
 
According to Mark Cerny's recent speech, Cell was basically done back in 2003. And I vaguely remember it being a 4 year, $400 million development process, so you'd need to go back to 1999 to avoid wasting money and fix the PS3.
 

You make it sound like Cell was a huge mistake. It would derail the thread, but then why the hell did Microsoft ditch the Xbox 1's x86 CPU in favor of a PPC-based one, and then turn around and go back to x86? The answer is obvious: every new console has different hardware. You cannot expect game consoles to simply have the same, but evolved, hardware every console generation, and all those CPU contracts do cost a lot of millions...

Also, Cell BE paid off in terms of the years the PS3 has been out, like, what, seven? And it will no doubt continue to get support for at least a couple more years. Yeah, it was not the greatest thing since sliced bread, but it sure did its job for a CPU. It's too bad the competition got started in 2005, and by relation the console generation did too; not that these companies care, but they really should have waited a couple more years, though that's all hindsight of course :D

Wasn't it stated that Sony engineers wanted 65nm for Cell because of their 4.0GHz fantasies back in 2003? Sure, we all know what happened, but at 65nm, a "what if" 2007 PS3 could definitely have shipped at 3.8GHz with all 8 SPUs enabled. Of course, once they put down their money and set the decisions in stone, it's pretty much dead, but this is a fantasy debate.

MS: launch in 2005 with a single SKU, with a hard drive and 768 megs of RAM. Sony: forget Blu-ray and launch in 2005. It was the only thing that would have saved them.

Waiting 2 years, Blu-ray still wouldn't have been cheap, and the Xbox 360 would have had 2 years to cost-reduce. I don't think any graphical jump they might have gotten would have overcome the lead and the price points.

No, of course neither Blu-ray nor HD-DVD would have been cheaper, but the mass-production drives in 2007 and 2008 would have been much faster. The prices would have been similar, and in a fantasy world Microsoft could have gained a lot from an HD-DVD drive's storage capability and a fully DirectX 10.1-compliant 55nm GPU. It would have been a world of difference: resolutions would not have shot up, but image quality, effects, and framerate would have been much better in theory, so a game targeting 30fps at 720p would have looked beyond Crysis. I would wager the CoD series would have been 720p 60fps, but they would have had much heavier competition.

MS could not wait. For Sony it might have been a good idea...

Only, if Sony were to do that in the real world, it would have been a huge trade-off. If they had taken that risk, it would have been diminishing returns in terms of the average gamer caring about graphics. Only first-party games would benefit, and they would have had to be miraculous system sellers...
 