Yeah, I'm starting to think an all-AMD Xbox 720 box is the way to go. Throw the latest Phenom in there along with a decent ATI video card and call it a day. They are simple to code for, everyone's familiar with them, tons of mature tools exist for them, and they run relatively cool, which is very important for Microsoft in particular. Porting those games to PC would be cake. They could swing a package deal from AMD for the CPU/GPU guts of the machine and maybe land a $299 launch price point.
Even if an AMD CPU is slower than a Cell2, it won't really matter. It takes a lot of extra CPU power before the typical consumer will even notice the difference, more so if the tools are comparatively poor on a Cell2. So if an AMD-driven CPU is ~25% slower than a Cell2, no big deal. Mature tools will let your launch titles look really good, and the titles at launch + 1 year will look very, very good, which is critically important since those are the titles that will likely be going up against the PS4 launch. Larrabee seems like too much of a risk at this point. Plus, letting Intel sit out yet another generation of consoles will make them even more malleable next time around for Xbox 1080 negotiations.
I think the whole idea of spending a fortune on a console and hoping to recoup it over 10 years is over. Get in cheap, milk it for ~5 years, then get the next box out. Start making a profit just 6 to 12 months in, instead of waiting years. Part of the logic of keeping a console out there for a really long time was that it takes developers and tools a long time to get up to speed. But that argument gets thrown out the window if the machine is simple to code for from day one, as an all-AMD box would be.
An "all AMD" machine really isn't a horrible idea if AMD offers a solid cost outlook (and adding ~10M CPU and ~10-20M GPU orders a year wouldn't hurt, nor would potential royalties, cross platform support, pushing forward platform specific features, and so on). e.g. Look at the lifecyle of a 5-7 year console:
Stage #1: Press Release Stage
Stage #2: Initial Wow Factor
Stage #3: Longterm Potential
In Stage #1 you need (a) great paper specs (programmable performance/FLOPs) and (b) tech demos. Assuming a fixed area budget for silicon, there would be good reason to shift real estate to the GPU, as GPUs have a lot more execution units. Per mm2 a GPU is going to offer more theoretical bang for the buck; Cell may walk all over an x86 CPU in peak FLOPs per mm2, but a GPU does the same to Cell. iirc the Xenos parent die was ~180mm2 (w/ 232M transistors @ 90nm, TSMC) and the daughter die was ~70mm2 (w/ 105M transistors @ 90nm, NEC). Xenon (iirc) was 165M transistors (1MB cache), ~160mm2 on 90nm (Chartered, TSMC). That is 410mm2 for the Xbox 360 silicon budget. Some devs have suggested the performance bottlenecks will swing in a direction where eDRAM's cost/benefit will be more of a negative, so if MS opts for an eDRAM-less design they have about 400mm2 of silicon to work with (assuming budgets and givens similar to the 360's). AMD, per mm2, wasn't too far off what IBM did with Xenon (e.g. AMD's X2 had 154M transistors, 2x512KB of cache included, within 142mm2 on 90nm).
Looking at 32nm, does MS really want/need 12-16 X2-style CPU cores? An X2 core is much faster than a Xenon core per clock, but even then you have to wonder about the utilization of 12-16 standard cores. What if MS chose 6 AMD-style cores and shifted the leftover CPU budget and the eDRAM budget over to the GPU (~150mm2)? That is nearly a 100% increase in GPU silicon real estate. Execution units in GPUs are quite small, but even assuming only a 50-80% increase in programmable GPU FLOPs, the point hits home: peak FLOPs (important in marketing, Stage #1) shoot through the roof.
Unless Sony and Nintendo made similar design concessions, it would appear MS would easily take the FLOPs crown.
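To put rough numbers on that budget shuffle (the 32nm split below is my own back-of-envelope guess, not anything announced), a quick Python sketch:

# 360-era silicon budget, using the 90nm figures above
xenos_parent = 180    # mm2, ~232M transistors (TSMC)
xenos_daughter = 70   # mm2, ~105M transistors of eDRAM (NEC)
xenon_cpu = 160       # mm2, ~165M transistors, 1MB cache
total = xenos_parent + xenos_daughter + xenon_cpu
print(total)          # 410 -> call it ~400mm2 if the eDRAM daughter die is dropped

# Hypothetical next-gen split at 32nm (purely illustrative numbers)
gpu_if_cpu_heavy = 180    # mm2 left for the GPU if you spend big on 12-16 cores + eDRAM
freed_by_6_cores = 150    # mm2 freed by going with ~6 cores and no eDRAM
print(freed_by_6_cores / gpu_if_cpu_heavy)  # ~0.83, i.e. pushing toward ~100% more GPU real estate

Nothing exact about those last two numbers; the point is just how quickly the GPU share balloons once the CPU stays modest.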
From a tech demo perspective, MS would be able to push out some amazing graphics and GPU-based physics demos to wow the press and consumers alike. Just look at Cinema 2.0 or the Ruby/City ray tracing tech demos. And with DX and massive developer support/experience, it wouldn't be hard to hire Epic, id, Crytek, etc. to do something special for the release announcement.
Stage #2: Initial Wow Factor. Long before you hit your release date you need to give developers the resources to produce the titles that will be available, and wow consumers, in the launch window. A mythical all-AMD machine would leverage fast x86 cores and DX. The latter is important because MS controls DX, and it is one of MS's key assets in the console arena. Getting developers to migrate from 6 threads of Xenon code on 3 cores to ~6 discrete x86 cores would be a chore, but many developers are already multi-platform on the PC, so it isn't a complete rewrite, and it won't hurt that these x86 cores will be substantially faster than the Xenon threads. And while an x86 core is going to have a much lower peak potential than some other designs, what it does do for the launch window is important: sluggish code and/or engines that continue to be capped by a primary-thread dependency could benefit in the short run (yes, write better code; yes, task your jobs better so you don't have this sort of bottleneck). Very demanding tasks and/or low-hanging fruit could be offloaded to the GPU (e.g. cinematic physics).
But the big win in Stage #2 is graphics. Release games look good compared to the old generation, but titles in year 2, and especially 3 and 4, really make launch titles look pretty poor. One hurdle is the reality of diminishing returns (if we jump from the ~10k models we have now to ~100k next gen, how many consumers will really notice?). But if you can toss 50-80% more silicon resources at graphics (compared to the previous generation's budgets), you have a higher likelihood of wowing with your initial visuals and convincing consumers of your power. Developers will get x86, DX, proven tools, and a ton of GPU resources. They should be able to migrate code over fast to hit the release date and really push visuals. Long term they may be fretting over the fact that they are already at the wall on the CPU cores, which leads to:
Stage #3: Longterm Potential. Whether MS goes with Cell CPUs, a Larrabee-style CPU, or something else, the problem will always remain that developers are going to have to learn something new, adopt new tricks and techniques, and overall work outside the box and invest a lot of R&D to get the last few % of performance. Large teams are going to run into issues where junior programmers can really screw things up and the best approach isn't always transparent. In an industry dominated by strategic release dates, spending months testing new techniques takes away from adding features to the core product. Any direction MS goes, they are going to have developers complaining (some very vocally), and it isn't going to be easy for anyone. But it is important that MS have the resources available so developers who desire it have something tangible to tap into.
The big picture is MS owns DX; they can do with it what they want. And through DX they can nudge and direct the IHVs to a small degree. GPUs are already on the path of opening up extensive GPGPU support, so it wouldn't be so wild for MS to leverage this as "their direction." Even with less efficient algorithms, I think this could prove a long-term win. Imagine a technique (e.g. a physics task) that takes 2% of CPU resources per frame but 10% on the GPU. We could shift our silicon budget back to the CPU and gain 50% CPU performance (getting that task down toward 1% of the frame) ... but the flip side would be losing 30-50% of the GPU's performance. I am sure there are scenarios where a GPU-centric design comes out short, but there will be many, many examples where it comes out ahead in the first thing consumers see (visuals) and may offer peak performance that a competing CPU cannot attain in some areas.
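A toy frame-budget version of that trade-off (the 2%/10%/50% figures are the same hypotheticals as above, nothing measured):

# Same physics task, costed two ways
physics_on_cpu = 0.02   # share of the CPU's per-frame budget it eats
physics_on_gpu = 0.10   # share of the GPU's per-frame budget it eats

# Option A: shift silicon back to the CPU for ~50% more CPU throughput,
# at the cost of ~30-50% of the GPU (say 40%).
cpu_boost = 1.5
gpu_cut = 0.40
cpu_time_saved = physics_on_cpu - physics_on_cpu / cpu_boost
print(cpu_time_saved)   # ~0.007 of the CPU's frame budget reclaimed
print(gpu_cut)          # ...versus giving up ~40% of total GPU throughput

# Option B: keep the big GPU, eat the 10% to run the task there, and keep
# the smaller CPU free for whatever doesn't map well to the GPU.

Per-task efficiency favors the CPU, but the absolute CPU time reclaimed is a sliver of the frame, while the GPU cut hits the entire visual budget.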
A GPU-centric design would open the door to potential. It doesn't force developers to go this route, but it would be in their interest long term ... as well as MS's. Remember, MS owns DX. Seeing developers write important non-graphics game code through DX would make them giddy. Of course, if Sony and Nintendo go with GPUs with basic DX feature support, MS has also just created a portable platform: if you write some nifty physics code on MS's Xbox/AMD GPU, you could port it over to Sony's mythical NV-GPU PS4. Sure, Sony may want you to rewrite the code for their Cell CPUs (if they go that route), and depending on the PS4's budgets it may be necessary, but it may remain an option.
Right now, though, who knows what will happen. This gen is just kicking off into the mainstream. We do not know the release windows yet. We don't know the partners. We only have a vague idea of what process nodes will be available in the 2010-2012 window. We don't know who has what up their sleeves beyond 32nm. We don't know how useful DX11 will be yet (especially compute shaders), and we have no idea if Larrabee will offer similar visual quality/performance per $ compared to AMD and NV. We don't know the future of Cell. We are in the dark about future input mechanisms being developed. At some point we would need to do an overview of all the players and what we know they may be able to offer.
NV. Strong DX GPUs as well as blossoming GPGPU potential. Great visual performance and great visual quality. The CUDA and AGEIA moves are strong points, as is Cg. Between the handheld market and consoles, NV can be picky about what deals work best for them, and they may view their product as a premium offering; they don't seem inclined toward special designs. Fiscally strong, although the G200 misstep as well as potential GPU recalls show how fickle the industry is. Being squeezed by AMD and Intel doesn't help, so they may be inclined to be aggressive.
AMD. In-house GPUs and CPUs. Strong DX GPUs as well as blossoming GPGPU potential. Great visual performance and great visual quality. The move toward multi-GPU designs offers some new potential (especially if the silicon real estate is expanded) if memory issues can be resolved; it also allows for more flexibility if someone throws a last-minute performance curveball. Experience with eDRAM. Quality x86 CPUs, and they are working on new vector unit designs as well as new CPU designs. Ability to use their own fabs or third-party fabs in some situations. AMD needs cash, and a strong stream of GPU and CPU orders could give them some cash flow. ATI/AMD has shown in the past a willingness to work closely with MS on DX and to do console-specific work.
Intel. Killer x86 CPUs. New CPUs (Sandy Bridge) and vector units on their way. Great compilers and a ton of software support/experience. Way ahead on process technology and appears best situated to handle future hurdles. Larrabee is a huge unknown (will AA and AF be any good? Will it be within 20% on performance per $?). Intel has picked up Havok as well as some developers. Is Intel willing to be cost-competitive within a performance ballpark to get Larrabee into a console? How soon will Larrabee v2 be out? Will Intel be able to fix the performance issues with their first design? How committed is Intel?
IBM. Would they do a custom Cell-style chip for MS?
They are always doing neat stuff. IBM is pretty flexible but isn't in the news a lot (and my tooth hurts), so I will end here...
But you get the point. MS has a lot of options. The keys are keeping accessibility high, early performance high, and long-term performance potential viable, leveraging their assets, keeping costs down, and developing means to facilitate more efficient game development. I think software, more than anything, is vital to next gen. Would MS be crazy to pick up an engine developer? If MS went with NV, there is an engine developer quite fond of NV that uses AGEIA... Content creation is a big issue, as are budgets and release dates. Toss in storage issues, online, memory designs, input mechanisms, and MS's first-party issues, and I think they have a LOT of work cut out for them. If they don't release a year early, all those cheap exclusives are GONE. I don't think you can gamble on what Sony will do this time, so contracting exclusive support early on is going to be important.
Btw, if my post pains you to no end, well then join the club. Thank my dentist for ripping, cracking, and snapping a wisdom tooth out. I gotta share the pain.