Larrabee, console tech edition; analysis and competing architectures

I will look to computer games development to illustrate my point. Computer game titles are made to be forward and backwards compatible with previous and future hardware generations, so I'm not talking about a model that is completely out of this world with regard to computer gaming. Crysis is the perfect example of this scalability, built with both the future and the past in mind.

With a 3-4 year model, the console manufacturer can cover all of their bases at the same time. Brackets mean discontinued sales of games and hardware, but still supported through backwards compatibility: (Ancient) -> Ancient -> Past -> Current -> Future. What it would mean is that all consoles would support two generations of games behind them and one generation of games ahead of them. The "Past" offering would be like the PS2 of the current generation, the PS3 would be Current, and the PS4 would be Future. Remember, with a 4-year model vs a 6-year model, over a 12-year time scale it's still the same amount of legacy support.

Now that consoles essentially use the same hardware as computers, there is no reason not to tie into the faster upgrade cycles. No longer are console manufacturers designing their own CPUs; they are either modifying off-the-shelf components or having a PC component manufacturer do it for them. The Xbox CPU and GPU and the Wii and PS3 GPUs are all based strongly on PC hardware. One of the main strengths of PC hardware is strong forwards and backwards compatibility. You can play Crysis on a 6900GT if you want to, at a lower resolution. So why not make the same distinction on consoles? Is it really that hard to make a game that scales to a few distinct levels of hardware? What if Crytek only had to code for a 1900 XT, a 3870 and a 3870 X2? Generalized hardware can be scaled in a predictable way. Think 2-4 cores or 64-128 shader units.
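
A rough sketch of what "scaling to a few distinct hardware levels" could look like in practice. This isn't from any shipping engine; the tier names, thresholds and settings below are invented for illustration, assuming only a handful of sanctioned console configurations.

```cpp
// Hypothetical example: pick a quality tier from a small, known set of
// hardware configurations instead of probing arbitrary PC hardware.
#include <cstdio>
#include <thread>

enum class Tier { Past, Current, Future };

struct Settings {
    int renderWidth, renderHeight;
    int shadowMapSize;
    bool highResTextures;
};

// With only three sanctioned hardware configs, detection collapses to a
// couple of comparisons rather than a full PC-style capability matrix.
Tier detectTier(unsigned cpuCores, unsigned shaderUnits)
{
    if (cpuCores <= 2 && shaderUnits <= 64)  return Tier::Past;
    if (cpuCores <= 4 && shaderUnits <= 128) return Tier::Current;
    return Tier::Future;
}

Settings settingsFor(Tier t)
{
    switch (t) {
    case Tier::Past:    return {1024, 576, 1024, false};
    case Tier::Current: return {1280, 720, 2048, true};
    default:            return {1920, 1080, 4096, true};
    }
}

int main()
{
    // On a real console the core/shader counts would be fixed constants per
    // SKU; hardware_concurrency() merely stands in for that here.
    Tier t = detectTier(std::thread::hardware_concurrency(), 128);
    Settings s = settingsFor(t);
    std::printf("render %dx%d, shadows %d, hi-res textures: %d\n",
                s.renderWidth, s.renderHeight, s.shadowMapSize,
                (int)s.highResTextures);
    return 0;
}
```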

This model does away with $400 launch consoles. For the whole system, consider the cost savings of reusing the same motherboard, the same chassis, same power brick, same memory type, same storage devices, same optical drive, and just sticking in a new chip. This is EXACTLY the same argument brought by us PC fanboys ;) when we talk about just upgrading the memory and the GPU. Just imagine a smooth release of consoles on a predictable cycle.

Finally, consider that we are now entering a vastly parallel world with regards to computer hardware development. For the last 6 years clock speeds have remained static whilst the number of cores and the design of those cores have been going through shifts. GPU hardware is becoming more generalized, and PC CPUs are becoming more parallel. This may not happen in the next generation, but I feel it's the future.

Try thinking about this with your PC brain; it will be more used to understanding the need to upgrade every 2 years! :LOL:
 
The design is probably the cheapest part of launching a new console. Manufacturing and marketing are what's going to kill you if you do it every 2.5 years. Only recently did MS set aside two billion dollars for the Xbox 360 manufacturing defect. Too much risk, too little return, if any.

I don't think it's the manufacturing, it's the marketing that can kill you. (It does help if you have the "best" console. ^^) So much of a console is PC hardware anyway, so manufacturing costs can only go down.


But what I would like to see is consoles divided up like graphics cards in terms of their performance and functions. Say $300 gets you basic gaming functionality with sub-720p @ 30 fps games with standard-res textures and models, then you have the high end at $5000 that does 1080p @ 120 fps with high amounts of AA and high-res art, as well as more than gaming functions. But launched during the same window of opportunity.

That's sort of what I was saying (I know, we posted at the same time, hehe).

PC developers are used to moving targets. I think people have gotten too used to "optimizations". Hardware moves a lot faster than software. I wonder how much developer time would be saved if they didn't have to "optimize" their code to the extent they do on consoles. Just because things are this way now doesn't mean it shouldn't change in the future.
 
The design is probably the cheapest part of launching a new console. Manufacturing and marketing are what's going to kill you if you do it every 2.5 years. Only recently did MS set aside two billion dollars for the Xbox 360 manufacturing defect. Too much risk, too little return, if any.

$1.057 billion

Besides, the console market is more software-driven. People expect third-generation software to look better than first-generation software. Developers exploiting the hardware for all it's worth is surely the better option from both the console owner's and the manufacturer's point of view.

Also, unlike other hardware like PCs or MP3 players, consoles make most of their money from software royalties.

I can see handhelds moving to a shorter product cycle if there is enough competition, but currently it's only the DS and PSP. It's not like the mobile phone or MP3 player sector, which is more competitive.

But what I would like to see is consoles divided up like graphics cards in terms of their performance and functions. Say $300 gets you basic gaming functionality with sub-720p @ 30 fps games with standard-res textures and models, then you have the high end at $5000 that does 1080p @ 120 fps with high amounts of AA and high-res art, as well as more than gaming functions. But launched during the same window of opportunity.

That sounds like a really terrible idea to me.
 
Back on topic. Since Larrabee is technically a CPU too, what if a console contained two or more Larrabees, abandoning the popular CPU+GPU combo? One Intel slide shows four Larrabees chained together; would that kind of design make a better console compared to the CPU+GPU combo expected next gen?

Consoles typically don't have beefy CPUs, and judging by the specs Larrabee is not that weak as a CPU. It may turn out to be a not-so-great GPU, but I am hoping two or more of them will make it competitive.

That kind of design could load balance not only between vertex and pixel work, but also between the graphics and non-graphics parts of what the game requires. Wouldn't that be ideal? Discuss.
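
To make the idea a bit more concrete, here is a very rough sketch (not based on any real Larrabee API; the job pool and job types are invented here): with identical general-purpose cores, rasterization, physics and game logic can all be plain jobs in one shared queue, so the split between "graphics" and "everything else" becomes a per-frame scheduling decision rather than a fixed silicon budget.

```cpp
// Hypothetical sketch of a unified job pool shared by graphics and
// non-graphics work on a homogeneous set of cores.
#include <condition_variable>
#include <functional>
#include <mutex>
#include <queue>
#include <thread>
#include <vector>

class JobPool {
public:
    explicit JobPool(unsigned cores) {
        for (unsigned i = 0; i < cores; ++i)
            workers.emplace_back([this] { run(); });
    }
    ~JobPool() {
        { std::lock_guard<std::mutex> lk(m); done = true; }
        cv.notify_all();
        for (auto& w : workers) w.join();   // drains remaining jobs first
    }
    void submit(std::function<void()> job) {
        { std::lock_guard<std::mutex> lk(m); jobs.push(std::move(job)); }
        cv.notify_one();
    }
private:
    void run() {
        for (;;) {
            std::function<void()> job;
            {
                std::unique_lock<std::mutex> lk(m);
                cv.wait(lk, [this] { return done || !jobs.empty(); });
                if (done && jobs.empty()) return;
                job = std::move(jobs.front());
                jobs.pop();
            }
            job();   // could be a tile of pixels, a physics island, or AI
        }
    }
    std::vector<std::thread> workers;
    std::queue<std::function<void()>> jobs;
    std::mutex m;
    std::condition_variable cv;
    bool done = false;
};

int main() {
    JobPool pool(8);                           // e.g. two Larrabee-style chips
    for (int tile = 0; tile < 16; ++tile)      // "graphics" work
        pool.submit([tile] { /* shade tile */ });
    for (int island = 0; island < 4; ++island) // "non-graphics" work
        pool.submit([island] { /* step physics island */ });
    return 0;                                  // pool drains and joins in dtor
}
```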
 
It would be no small feat to fit the equivalent of a four-socket motherboard into a console chassis.

The early PS3 kinda did it, with all those chips. It had the EE/GS, Cell, RSX and a companion chip, all with their own memory chips.

But let's start with two Larrabees to replace the typical CPU+GPU combo.
 
The early PS3 kinda did it, with all those chips. It had the EE/GS, Cell, RSX and a companion chip, all with their own memory chips.

Mind you, that's barring the fact that you're not using all of those at peak operation, and the EE/GS does not consume nearly the same amount of power. Also keep in mind the routing for a supposed four-socket setup and the power supply to all of the sockets. The motherboard would be quite a beast by comparison.

But let's start with two Larrabees to replace the typical CPU+GPU combo.

That's a bit more sensible. :)
 
I've never programmed for either core. Certainly Cell has a much higher peak FLOP rate than the XeCPU. But I'm surprised that a single SPU runs circles around a single XeCPU core. Why would that be?

Well, two words: LHS and L2 misses. If you think 10-cycle L2 on Larrabee is slow, how about 50 cycles on XeCPU?
It should be mentioned that the SPU runs circles around the PPU as well. It's not that the SPUs are that fast, it's that this particular implementation of PPC is just terrible. Split register designs, conversions through memory, lack of decent branch prediction, 6-cycle L1, 50-cycle L2, lack of OoO / store queue snooping: all of these handicap it big time. The XeCPU's 128 VMX registers go unused almost all the time, because the compiler thrashes them back to memory. It is VERY difficult to actually beat straight FPU code with VMX because of all the LHS and L2 issues.
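
To make the LHS point concrete, here is a small illustration (not code from anyone's engine): the classic offender is any value that has to cross between the float/vector and integer register files, because on these cores that trip goes through memory.

```cpp
// Illustration of the load-hit-store pattern that stalls in-order PowerPC
// cores like the XeCPU/PPU: the load below must wait for the preceding
// store to retire, costing dozens of cycles per conversion on those chips.
#include <cstdio>

int truncate_to_int(float f)
{
    // On XeCPU/PPU the compiler typically emits something like:
    //   fctiwz  f0, f1         ; convert in the FPU register file
    //   stfd    f0, temp(r1)   ; spill the result to the stack...
    //   lwz     r3, temp+4(r1) ; ...and immediately load it back -> LHS stall
    return static_cast<int>(f);
}

int main()
{
    float sum = 0.0f;
    int acc = 0;
    for (int i = 0; i < 1000; ++i)
    {
        sum += 0.37f;
        acc += truncate_to_int(sum);   // one LHS round-trip per iteration
    }
    std::printf("%d\n", acc);
    return 0;
}
```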

[edit: to avoid derailing the awesome topic.]
 
This is EXACTLY the same argument brought by us PC fanboys

That's because what you're describing *is* PC gaming! ;)

The model you're espousing would need to be hardware profitable at every step; what would remain the same is simply the APIs. It literally is exactly the PC model, right down to the parts, as the economies of scale that normally allow for custom LSIs would be ruined if companies went with offerings that deviated from the mainstream, given that aggressive a performance and release timeframe. This isn't a model that's going to reduce launch prices below $400; quite the contrary! It would just be something like the 'Phantom' project executed correctly. (Not that a lot of people weren't hopeful for the Phantom, mind you...)
 
When are the next generations of the PlayStation (PS4?) and Xbox due?

next Xbox: 2011-2012

PS4: probably 2012

Is there any chance that Microsoft might make a "0.5" generation release of a new Xbox? Something that would be released in a year or two, that would be comfortably more powerful than the existing Xbox 360 and PS3, but would come out two years or so before the next PlayStation update? Of course, this Xbox 2.5 would be compatible with current games (maybe even play them with more detail) while also allowing new games. Part of what I'm basing this conjecture on is that Microsoft's gambit of pushing hard to get the Xbox 360 out before the PS3 seems to have really helped the Xbox out.

No.
 
I don't think it's been mentioned in this thread (correct me if otherwise) that Larrabee is said to have on-board space for a fixed-function unit. This could include rasterizing hardware, making some versions of Larrabee more like GPUs.

Along with the cores and the large chunk of L2, Larrabee products will include both a memory controller and a "fixed-function unit." The nature of this fixed-function unit will vary with the Larrabee product, so that a Larrabee GPU might have raster hardware in that slot,

http://arstechnica.com/articles/paedia/hardware/clearing-up-the-confusion-over-intels-larrabee.ars


So how about a console with 2 (or even 4) Larrabees with raster hardware, thus covering CPU, Physics and Graphics?
 
That's because what you're describing *is* PC gaming! ;)

The model you're espousing would need to be hardware profitable at every step; what would remain the same is simply the APIs. It literally is exactly the PC model, right down to the parts, as the economies of scale that normally allow for custom LSIs would be ruined if companies went with offerings that deviated from the mainstream, given that aggressive a performance and release timeframe. This isn't a model that's going to reduce launch prices below $400; quite the contrary! It would just be something like the 'Phantom' project executed correctly. (Not that a lot of people weren't hopeful for the Phantom, mind you...)


Well, if all they are now are cut-down PCs... why not just follow the PC model more closely, then? PC prices have come down immensely over the last few years, so a console partially subsidised by software, TV on demand and Xbox Live wouldn't be too bad now, would it?

So an Xbox re-released now with a better chip and double-density RAM would raise the price above $400, considering that Microsoft is making a profit on the sale of each console? $279 + $100 or less for more RAM and a better CPU or GPU doesn't seem too far-fetched! They have reduced the cost of all the OTHER componentry in the Xbox... :)
 
BTW, SPUs can execute up to two instructions per clock cycle, and on decently written code a single SPU runs circles around an XeCPU core at any time of the day.

Now that many of you have had 2+ years to work with the SPEs, maybe you can clarify this: when you say "decently written code", are you saying that any type of code, decently written, will run circles around it? Or would you qualify that as "code that pairs well with the SPE design, and is written well, runs circles around it"?

As a side question, you have been on at least two different teams working with Cell, so you can probably make a general observation here: what percentage of the "code guys" (leads all the way down to scripters) are writing "decently written code" on SPEs? Are the struggles a lot of dev teams are having less SPE-related and more related to other issues (e.g. the number of cores, split memory pools and a slightly smaller footprint for multiplatform games, tool unfamiliarity, or whatever other observations you have seen)?

How do these relate to the OP? My inquiry (and maybe a good thread in itself) is to sum up what Larrabee is up against. If developers are getting a firm handle on "good practices" for using SPEs, a large portion of the coding team can write safe, performant code for SPEs, the techniques being used look to scale wonderfully over 16+ cores, and the code is serviceable on other platforms, this could answer some of the questions Larrabee poses, and begin to sketch the competition and outlook for Larrabee, as well as the tech we might see in upcoming consoles.
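
For reference, and purely as a generic sketch (this is my assumption of what "good practices" usually means here, and it is not Cell/libspe code): the pattern most often cited is the double-buffered streaming job, i.e. compute on one fixed-size chunk of local data while the next chunk is being fetched asynchronously. std::async stands in below for the DMA an SPE would issue into its local store, which is why the same shape of code maps onto SPEs, many-core chips like Larrabee, and ordinary SMP alike.

```cpp
// Generic double-buffered streaming sketch (not Cell-specific code).
#include <algorithm>
#include <cstdio>
#include <future>
#include <vector>

constexpr std::size_t CHUNK = 4096;   // "local store" sized working set

// Stand-in for a DMA fetch: copy one chunk from main memory into a local buffer.
std::vector<float> fetch(const std::vector<float>& src, std::size_t offset)
{
    std::size_t n = std::min(CHUNK, src.size() - offset);
    return std::vector<float>(src.begin() + offset, src.begin() + offset + n);
}

float process(const std::vector<float>& chunk)
{
    float sum = 0.0f;
    for (float v : chunk) sum += v * v;   // purely local compute, no pointer chasing
    return sum;
}

int main()
{
    std::vector<float> data(1 << 20, 0.5f);
    float total = 0.0f;

    // Prime the pipeline with the first fetch, then always keep the next
    // chunk in flight while the current one is being processed.
    auto inflight = std::async(std::launch::async, fetch, std::cref(data), 0);
    for (std::size_t off = 0; off < data.size(); off += CHUNK) {
        std::vector<float> current = inflight.get();
        std::size_t next = off + CHUNK;
        if (next < data.size())
            inflight = std::async(std::launch::async, fetch, std::cref(data), next);
        total += process(current);
    }
    std::printf("%f\n", total);
    return 0;
}
```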

Edit: Btw, I agree with Carl that each generation needs to assess the shifting resource demands in games as well as where the technology is going.
 
What's funny reading the comments here and the other Larrabee thread is that a "Xenon 2" would have to be really close to a Larrabee to be a competitive CPU (even with fewer cores).
Anyway, it's clear that MS has a lot of work to do on Xenon if they want a competitive CPU for their next box.
Sony is likely to use a "kind of Cell", i.e. the PPU/SPU mix could change from the current 1/7 setup.

As for Larrabee, it had better have good graphics performance, as the new 3870 X2 is touted at >1 TFLOPS.
I know that numbers are often meaningless, but 1 TFLOPS for Larrabee when it comes out won't impress anybody. What I'm trying to say is that specialised hardware has already hit the teraflop bar.
It will be tough for a more general-purpose device to compete on graphics prowess.
 
Well, if all they are now are cut-down PCs...

Well I don't agree that that's what they are though, even if some of the decisions seemed 'lazy' to some extent. Both are very much custom, with the most PC-like part being PS3's RSX. But the bandwidth, memory setups (and type), and architectures are quite specialized compared to a PC.

PC prices have come down immensely over the last few years, so a console partially subsidised by software, TV on demand and Xbox Live wouldn't be too bad now, would it?

You wouldn't really be able to subsidize the hardware at all though, since each individual system iteration would never reach the volume of targeted software offerings that would warrant doing so. What you're correct about though is that an equivalent PC would be very inexpensive today (in some respects). At the same time, an 'equivalent' PC hardware-wise wouldn't be able to perform as well under the suggested model, because you'd be putting developers in the PC mindset of not coding to a specific platform per se, but simply within an API.

Not that I don't agree that PC componentry is inexpensive enough that now would probably have been a better time to attempt the Phantom! :)

So an Xbox re-released now with a better chip and double-density RAM would raise the price above $400, considering that Microsoft is making a profit on the sale of each console?

Well, it's not certain that MS actually is making a profit on the hardware; it's never been explicitly stated. I could believe it either way. But yes, if the better CPU and GPU took it back to the old die sizes, I'm not sure what would keep it from being priced in a range similar to the old model. Granted, memory prices have fallen through the floor, but that's not a situation we can take as permanent.

Anyway I understand what you're saying, but I think part of it is dependent on what exactly the scope of the 'upgrades' to the chips are.

$279 + $100 or less for more RAM and a better CPU or GPU doesn't seem too far-fetched! They have reduced the cost of all the OTHER componentry in the Xbox... :)

The other components can't really be cost reduced too greatly though. The case, motherboard, DVD drive, 20GB hard drive... these aren't things that are going to cost much less today than they did two years ago. For 360 the majority of the cost reductions will be linked to the silicon itself.
 
What's funny reading the comments here and the other Larrabee thread is that a "Xenon 2" would have to be really close to a Larrabee to be a competitive CPU (even with fewer cores).

It depends on whether Larrabee is being viewed as a CPU or GPU here, something that's fluid to an extent. I don't think anyone's making a direct comparison on Larrabee vs the XeCPU per se, rather just stating that the XeCPU needs improvement in whatever its next iteration is. As for Larrabee, I think a lot of the question hangs on whether the chip, in an 'all Larrabee system' would serve acceptably were it both the base architecture for the CPU and the GPU. There can be alternate parts used for either/or, but essentially it's trying to guess at Larrabee's own strengths vs future trending in the alternative architectures that's at the root of the thread.
 
I wasn't trying to directly compare Xenon to Larrabee, but if MS is about to cram ~8 cores together and have something efficient (cost/size/perf/watt), they could arrive at the same solutions Intel did (or will).
I was thinking more about the chip's memory architecture:
A ring bus, really fast L1, fast non-shared L2.
It looks like the shared L2 cache in Xenon was a good idea for a flexible design, but it's way too slow (from what I read here), and it looks like it would scale really badly.
But my bad, I wasn't clear at all.

Anyway, from what I read here about the shortcomings of Xenon, MS may as well start from scratch, as a lot of things seem broken:
The cores are pretty big in regard to their perfs
Caches are slow and subject to thrashing
There's something wrong with the registers
etc.

EDIT
Would MS have to pay Sony money if they were to use the SPE as a building block?
Like adding stuff (fast L1 & L2 caches, branch prediction, etc.) to the SPE instead of designing a new PPC core; the SPE looks like a pretty clean base.
 
So no one mentions the possibility that Intel launches into its own console venture?
 