Could this gen have started any earlier?

Shifty Geezer

Inspired by another post: we consider this gen late and last gen really, really old. But could this gen have started any sooner? Let's say a new machine had launched in 2011, as was the expectation in the "predict next gen hardware 2006 edition" thread. Would it have been any good? Seems to me this gen is a reasonable update from last gen. The power increase isn't massive but the visuals are notably better. So my gut reaction is that a 2011 machine at a reasonable price point would have been a half-way improvement, not good enough to really usher in a new gen of games and experiences.

What were the comparable PC components of the time, and at what price points? What kind of specs and results could we have been looking at? Current-gen-ish graphics, dialled down to 7, at 900p from a $400 console?
 
Yes, I think MS could have and should have launched in 2012 at the latest. 2011 was possible if they had been willing to take a loss again, but 28nm wasn't really ready yet, so I'm not sure it was possible to swing that unless they wanted the supply issues of the 360 to make a reappearance.

Maybe Kinect took too long to develop (boy, does that look like a mistake now), and the reset of their "vision" as well, but everything else in the X1 could have been delivered in 2012. Launching a year ahead, I think they would have been in a much, much stronger position now. Sony could have launched a year earlier as well, although they would probably have had to go with only 4 GB of RAM.
 
Yes, they could have, but the technology step would have been even smaller.
Jaguar is much more potent than its predecessors, and GCN was not really in mass production. Yes, they could have launched sooner, but at what price?
From last gen to current gen, the step was "small". But that is the price of better graphics: the better it gets, the less noticeable each improvement becomes. Last gen games still look amazing. There is no step as big as PS2 -> PS3 or Xbox -> Xbox 360; no, the step to the current gen is a really small one.
On the other hand, the current gen consoles are not as loud as the last gen ones, and that is what counts most, at least for me :)
Maybe they could have built in much better hardware and raised the price a bit more, and then we would have gotten much louder machines. But nowadays, silence is golden. Also, we don't have masses of consoles failing because of heat.
 
2011 was possible if they had been willing to take a loss again
They'd just done a ton of work to lower the 360's long-term production costs, and the aging console's sales were still rising leading up to that year. As a consumer it sort of makes sense to think the generation was too long, but Microsoft had every reason not to rush out a replacement that early.
 
2011 is pre-GCN, pre-28nm and pre-Jaguar... the previous stuff from AMD would have been significantly slower at a similar die size... their APUs in 2011 were using VLIW5 GPUs and K10.5/Bobcat cores at 32nm GloFo and 40nm TSMC.

800 SPs and 4 higher-clocked cores seems realistic.
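(Back-of-the-envelope, using the usual peak-rate math of SPs × 2 FLOPs × clock: 800 VLIW5 SPs at HD 5770-ish clocks would be 800 × 2 × 0.85GHz ≈ 1.36 TFLOPS on paper, between the Xbox One and PS4 we actually got, though VLIW5 flops were notoriously harder to keep fed than GCN's.)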
 
2011 is pre-GCN, pre-28nm and pre-Jaguar... the previous stuff from AMD would have been significantly slower at a similar die size... their APUs in 2011 were using VLIW5 GPUs and K10.5/Bobcat cores at 32nm GloFo and 40nm TSMC.

800 SPs and 4 higher-clocked cores seems realistic.
Is there an example of that setup in a PC running a modern game like SWBF or FO4 or Witcher 3? How does it fare?
 
Is there an example of that setup in a PC running a modern game like SWBF or FO4 or Witcher 3? How does it fare?

There are some APU results on GameGPU.ru:

SWBF
http://gamegpu.ru/videoobzory/star-wars-battlefront-beta-video-obzor-apu.html
FO4:
http://gamegpu.ru/videoobzory/fallout-4-video-obzor-apu.html
Witcher 3:
http://gamegpu.ru/videoobzory/the-witcher-3-wild-hunt-video-obzor-apu.html

Pity about the limited settings and resolution; it makes the results look a bit like a "why did you bother with this test?" effort. Based on the videos it may simply be some PR effort by AMD to show off against Intel for integrated graphics.
 
Is there an example of that setup in a PC running a modern game like SWBF or FO4 or Witcher 3? How does it fare?
I played The Witcher 3 on a Q6600 and a 6950 and it ran fine.
It would have been playable at ultrawide resolution, just barely.

[Witcher 3 benchmark screenshots]
 
There are some APU results on GameGPU.ru:

SWBF
http://gamegpu.ru/videoobzory/star-wars-battlefront-beta-video-obzor-apu.html
FO4:
http://gamegpu.ru/videoobzory/fallout-4-video-obzor-apu.html
Witcher 3:
http://gamegpu.ru/videoobzory/the-witcher-3-wild-hunt-video-obzor-apu.html

Pity about the limited settings and resolution; it makes the results look a bit like a "why did you bother with this test?" effort. Based on the videos it may simply be some PR effort by AMD to show off against Intel for integrated graphics.

http://www.anandtech.com/show/9763/the-amd-a8-7670k-apu-review-rocket-league/5

I know it's new, but it's still the same old Bulldozer kit on the ancient process.

The A8 can run Alien Isolation at 720p ultra settings at 33fps, GTA 5 at 720p low settings at 54.88fps, and GRID Autosport at 52fps at 1080p medium. In Shadow of Mordor you get 47.5fps at 720p low.
AMD A8-7670K
2 Modules, 4 Threads
3.6 GHz (3.9 GHz Turbo)
R7 Integrated Graphics
384 SPs at 756 MHz
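(For scale, assuming the standard SPs × 2 FLOPs × clock: 384 × 2 × 0.756GHz ≈ 0.58 TFLOPS, well under half of the Xbox One GPU's ≈1.31 TFLOPS.)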



But I certainly think they could have launched in 2012 and gone Bulldozer, though I think they would have been better served using a discrete CPU and GPU instead of an APU.
 
A posteriori, I would not have launched in either 2011 or 2012. If I were to start the eighth gen earlier, I would have started it in 2010, in the wake of the availability of 40/45nm lithography. The IPs powering the seventh generation of consoles were still relevant, and some tweaking here and there, as well as extra silicon, could have gone a long way. Following suit to the original Xbox, both the PS3 and the 360 were gigantic steps forward from the previous generation's "standard", aka the PlayStation 2: lots of RAM, crazy fast CPU and GPU, and a matching power consumption. On the PC side of things, manufacturers were also pushing TDPs through the roof. I believe that either Sony or Microsoft could have stolen a page from Nintendo's book and gone with a more progressive jump.
That strategy could have made particular sense for MSFT, as they had greater plans for their system, and my experience with the 360 is that it was pushed a little too hard, and its limitations also hindered the effort MSFT put in. Microsoft had greater financial backing than Sony at the time (the RRoD clusterfuck notwithstanding). MSFT also went through the significant effort of creating Valhalla, and that money could have been passed on to the new model.
Let's introduce the Xbox 540:
NUMA, PC-like architecture.
Xenon II: 4 Xenon II cores, 2MB of L2, clocked slower, integrated memory controller/northbridge, single channel, connected to 1GB of DDR3 (10.6GB/s).
Xenos II: a Redwood PRO in disguise, half the memory controller disabled ala RSX, 512MB of GDDR5 (32GB/s).
4x BRD player / standard HDD / low-power system / silicon budget below their Valhalla APU / uses the same peripherals as the 360.
Backward compatible system (on a per-title basis / software solution), launching at $299 in H2 2010. BRD playback would have been activated online for a fee.
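(The bandwidth figures check out against standard parts: single-channel DDR3-1333 is 1333MT/s × 8 bytes ≈ 10.6GB/s, and Redwood's 128-bit GDDR5 bus cut in half leaves 64 bits, which at 4Gbps effective gives 64/8 × 4 = 32GB/s.)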

The replacement would have just hit the street.
 
PowerPC rightfully died, no thanks. :p

That said, I'd be curious if they'd have done Xenon II with OOOE back in.
I'm not sure it'd be worth going with such a stepping stone as Redwood though, or at least I'm not sure what the point would be as opposed to just superclocking Xenos and be done with the R&D.
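(Rough numbers: Xenos is usually quoted at ≈240 GFLOPS at 500MHz, while a Redwood PRO at ~650MHz is ≈520 GFLOPS peak, so a superclocked Xenos would need roughly double the clock just to match on paper, before counting Redwood's newer feature set.)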
 
PowerPC rightfully died, no thanks. :p
We can't rewrite history, but sure, another console contract or two could have prolonged its agony :devilish:
That said, I'd be curious if they'd have done Xenon II with OOOE back in.
I'm not sure it'd be worth going with such a stepping stone as Redwood though, or at least I'm not sure what the point would be as opposed to just superclocking Xenos and be done with the R&D.
In 2010 there were no good, reasonably cheap CPUs for console manufacturers to use outside of custom solutions.
For Xenon: use eDRAM for the L2, increase the L1 sizes, and add the integrated memory controller. With a relatively tiny R&D budget I would have passed on any significant changes; it would have remained a narrow in-order core supporting 2-way SMT. I would not touch the VMX128 units either, improvements or fixes aside, but same ISA, no widening.
Looking at the Wii U CPU size, it could have ended up delivering really interesting performance in a tiny package, and at the time there were no other options.
I believe both the Cell and Xenon could have seen one more iteration had the manufacturers decided on a shorter hardware cycle. The Cell would have been a particular case as it was so powerful to begin with. A second PPE along with 2MB of shared L2 would already have gone a hell of a long way toward balancing things out, then support for DDR3. Not too crazy on the R&D side of things, and that would have been a hell of a tiny monster.

For the GPU, I searched for something cheap, not requiring R&D, and it sort of worked with the silicon budget I had in mind, ~170 sq.mm, about the size of Valhalla. Redwood is 104 sq.mm. It would not be such a crazy thing to do; the Xbox and the PS3 used close-to-off-the-shelf parts for their GPUs. Another reason I thought of it, instead of simply a bigger Xenos (and a bigger daughter die), is that for the low cost I wanted to reach, having three chips sounded like a bad idea. So in my mental model, I shifted all the R&D to the CPU and changed the design from UMA to NUMA. I also chose narrow buses in case they deemed it worthwhile to shrink the chips.

The thing I like about that cheap approach is that the competitor, caught with its pants down, would have to wait till 2012 for 28nm to become available to offer something significantly better, and still with no great CPU options around, especially CPUs that could end up in an APU.
 
No.

40nm GPGPUs would have had to follow the linear route and be discrete GPUs.

Both 2011 solutions from AMD and Nvidia were too hot and too power hungry, even if both consoles had used the launch PS3's copper heatpipe heatsink/fan system.

The current gen console GPUs in the APUs are in the 1.6~1.8 TF computational range... if we had gone 2011 (compared to what we got now) we would be consuming almost twice the wattage and running 80 degrees average in a room with the AC at full power... remember, PCs are larger boxes compared to consoles. To put it in perspective, the NV47/G70 GPU was initially fabbed at 110nm on discrete cards at 302 million transistors, and later benefited from 90nm mass production (yields, costs; although wasn't Sony after 65nm? We could have gotten a G92-based RSX instead, for twice the GPU performance, with MS going RV670 also).
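(For reference, by the usual paper math of SPs × 2 FLOPs × clock: PS4 is 1152 × 2 × 0.8GHz ≈ 1.84 TF, Xbox One is 768 × 2 × 0.853GHz ≈ 1.31 TF.)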

CPU-wise, a 32nm Cell BE evolution and a hexacore Xenon evolution would have been feasible.

Storage would be Blu-ray, but RAM... that's a bit tricky because of the different solutions we have now... they would have had to be $600 to $700 consoles with an optional camera/Kinect as an extra cost.

You also have the hackers adding to the cost of development, because both last gen consoles... were hacked and pirated... calling for a much earlier anti-hacking solution, which might not have been as good as what's in place now.

I voted for a delayed current gen on the much more mature 28nm with discrete CPU and GPU solutions, but only because the graphics whore in me wanted the then-unknown GTX 980 GPU with its efficient TDP and cool running temps plus ballpark 4 TF of compute, forcing MS to have to use either a 280X or maybe a 290X Radeon variant. :D
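(The GTX 980's paper number, for reference: 2048 CUDA cores × 2 FLOPs × 1.126GHz base ≈ 4.6 TF, at a 165W TDP.)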

The CPU would be a 2014-fabbed solution, or evolved Jaguar cores at nearly 3.0GHz x8 cores.
 
The economic problems greatly manifested themselves in 2008, and probably a year earlier.

That said, the notion that console generations last an average of five years is heavily flawed.

If both Microsoft and Sony (and by default Nintendo) had chosen to wait for 65nm and 55nm processes for the CPUs (evolved Cell BE, Xenon and Wii solutions, all cooler and faster with more cores) and GPUs (preferably GT200 and RV770 + 20MB of eDRAM), we would be talking about a bigger jump in image quality, standard resolution options and compute, forcing the current gen to be much more powerful, launching in 2015 at the earliest on extremely mature process nodes, or waiting for the next drop.

Previous gens that lasted shorter did so because there was more competition back then; four or more big consoles are just not feasible when every maker wants a bigger slice of the pie.
 
2011 is pre-GCN, pre-28nm and pre-Jaguar... the previous stuff from AMD would have been significantly slower at a similar die size... their APUs in 2011 were using VLIW5 GPUs and K10.5/Bobcat cores at 32nm GloFo and 40nm TSMC.

800 SPs and 4 higher-clocked cores seems realistic.


Yeah, I've had an HD 4890 (800 SPs) in my PC, factory overclocked to 950MHz, since 2009. It had more flops than the GPU in the Xbox One (although the XBO one is much more advanced and probably significantly more capable in reality), which makes me sad...
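(Paper math: 800 SPs × 2 FLOPs × 0.95GHz ≈ 1.52 TFLOPS, versus the Xbox One GPU's ≈1.31 TFLOPS.)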

It would have been an immense power hog though, although it's still TINY compared to the behemoths of current GPUs. Amazing how each gen of PC GPUs got a much bigger form factor, until the point where they just couldn't get bigger, I suppose.

At least the RAM was plentiful this gen, on the bright side.

All in all, it would have depended on priorities. Top of the line in 2011 might have competed more or less with what we got in 2013, if the platform holder was willing to pay for it. But given what we got in 2013, especially from MS, anything earlier would have been quite distasteful.

1080P is an issue too. You want to be at the native res of people's displays. Xbox One even failed to do that, but at least the majority of titles (that I've tracked in my sig, anyway) on One are 1080P. To go lower earlier would have meant more 720P.
 