Intel pushing Larrabee console deal with Microsoft

The 360 CPU is already a far cry from a "traditional" CPU when it comes to adequately running general game code. While it's certainly possible LRB cores are even worse, we haven't seen any public data to support that (what we know suggests rough parity).
The thing is that GPR power hasn't been a real issue since last generation; we're close to a point where having more of it just isn't going to yield any perceptible benefit whatsoever.
In regard to the Xenon/LRB comparison, I'm not sure that LRB will fare that badly.
OK, it is likely to take a hit on some tasks due to lower clock speed, but it has a way shorter pipeline, and its access to L1 and L2 is way better.
I don't remember the Xenon L1 latencies, but I remember that the L2 is horrible, close to 40 cycles if my memory is right; on the other hand, Larrabee may have the fastest cache access ever. Larrabee also has two more threads per core to hide latencies.
Well, there's that idea that MS eventually wants to move away from manufacturing hardware boxes and just control the Virtual-Console specification instead.
I could see MS doing that later on (after the system is well established).
 
PS3 is too dependent on CELL, with a somewhat lackluster GPU, when it should've had a higher-end (custom G80-based, with EDRAM) GPU to balance out the cutting-edge (for the time) CELL.

I thought that the main "issue" with the PS3 was the NUMA memory and not the RSX.
 
Larrabee cannot, or should not, take over the role of the strong conventional multi-core or many-core CPU, and Larrabee also cannot take over the role of a conventional high-end GPU, IMO. Unless a lower-end, less advanced system is the plan.

...

CPU <> Larrabee GPGPU <> GPU or Rasteriser.

Larrabee might make an excellent & very important component for a next-gen console, but Larrabee should not be the heart of any next-gen console.

1) I think a 3 chip configuration is incredibly unlikely. It'd be expensive, high power, hot, and a bastard to load balance.

2) Larrabee is obviously a throughput-based architecture, and will not provide the single-threaded performance of a traditional desktop CPU. However, neither does Xenon. Non-vectorized single-threaded performance on Xenon is quite underwhelming, and I wouldn't be surprised if Larrabee 2 can get very close or even beat it in this regard.

3) Another possibility for Larrabee would be the addition of a few bigger out-of-order cores optimized for single-thread performance, to be used for any serial workload where latency is important. You could maintain the same ISA and stick to a homogeneous, developer-friendly programming model while allowing applications to optimize by controlling thread/task affinities (see the sketch after this list).

4) As for GPU performance, I wouldn't be so quick to dismiss them. At this point it's all very much just speculation, but with their process advantage and pure engineering force, I think at least Larrabee v2 will be very competitive (after they pick all the low-hanging fruit from their first-iteration mistakes).
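
To put some flesh on point 3: here's a minimal sketch, under the assumption of a homogeneous ISA with a couple of big OoO cores plus many throughput cores, of how an application could steer work with thread affinities. The core numbering, the pthread calls and the whole layout are my own illustration rather than anything from Intel or a console SDK.

```cpp
// A minimal sketch, assuming a hypothetical layout where core 0 is a big
// OoO core and cores 1..N are the simple throughput cores. None of this is
// from any real Larrabee or console SDK; Linux pthread affinity is just a
// stand-in for whatever affinity API such a machine would expose.
#ifndef _GNU_SOURCE
#define _GNU_SOURCE
#endif
#include <pthread.h>
#include <sched.h>
#include <cstdio>

// Pin the calling thread to one core (error handling omitted for brevity).
static void pin_current_thread_to_core(int core)
{
    cpu_set_t set;
    CPU_ZERO(&set);
    CPU_SET(core, &set);
    pthread_setaffinity_np(pthread_self(), sizeof(set), &set);
}

// Serial, latency-sensitive work goes on the assumed big core.
static void* game_main_loop(void*)
{
    pin_current_thread_to_core(0);
    // ... run game logic, scripting, etc. here ...
    return nullptr;
}

// Wide, throughput-oriented work goes on the remaining cores.
static void* worker(void* arg)
{
    pin_current_thread_to_core(1 + static_cast<int>(reinterpret_cast<long>(arg)));
    // ... pull jobs (rendering, physics batches) from a task queue here ...
    return nullptr;
}

int main()
{
    pthread_t main_loop, workers[3];
    pthread_create(&main_loop, nullptr, game_main_loop, nullptr);
    for (long i = 0; i < 3; ++i)
        pthread_create(&workers[i], nullptr, worker, reinterpret_cast<void*>(i));

    pthread_join(main_loop, nullptr);
    for (int i = 0; i < 3; ++i)
        pthread_join(workers[i], nullptr);
    std::printf("all threads done\n");
    return 0;
}
```

The point is that the same ISA runs everywhere; the application only decides which kind of core a given thread should prefer.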

All in all, I don't see any problem with Larrabee being "the heart of" a next gen console. You're probably right though that such a configuration is a bit unlikely, and I think Microsoft will play it a bit more safe by giving developers a (relatively) high single thread performance CPU that doesn't feel too alien and is easier to get up and going with.
Personally I would be quite excited to see something like 2 Larrabee chips in a NUMA configuration, with one of them stripped of all the texture units, display output, and other graphics specific parts, and have that space used for a couple of larger ISA compatible OoO cores.
 
I thought that the main "issue" with the PS3 was the NUMA memory and not the RSX.

The PS3 memory architecture does give you some load balancing issues, but no more than a PC system would. There is the issue that most developers seem to be running low on main memory rather than graphics memory, which is something I don't think Sony expected. The poor read performance by Cell from RSX local doesn't help in this regard and makes it largely unviable to use RSX local memory for data to be consumed by Cell.

This is not directly a performance issue though, just an extra consideration when you're laying out your memory budgets. I'm not sure what information is public and what is not, so I'll limit my comment on performance to just saying that many games end up having to throw some SPUs at what might generally be considered GPU tasks, to be on par with 360.
 
If Microsoft wants to make a huge leap beyond Xbox 360 and compete with PS4, they'll need a console that has:

a strong multi-core or many-core CPU from IBM, Intel or AMD
Intel's Larrabee or 2nd-gen Larrabee
an ATI or Nvidia GPU

Larrabee cannot, or should not, take over the role of the strong conventional multi-core or many-core CPU, and Larrabee also cannot take over the role of a conventional high-end GPU, IMO. Unless a lower-end, less advanced system is the plan.

For a next-gen console (be it XB3 or someone else's console):

CPU <> Larrabee GPGPU <> GPU or Rasteriser.


Larrabee might make an excellent & very important component for a next-gen console, but Larrabee should not be the heart of any next-gen console.

Xbox 360 is pretty well balanced between Xenon and Xenos.

PS3 is too dependent on CELL, with a somewhat lackluster GPU, when it should've had a higher-end (custom G80-based, with EDRAM) GPU to balance out the cutting-edge (for the time) CELL.


Larrabee will be better than CELL1 at rendering graphics itself; however, I doubt first-gen Larrabee will compete with the best high-end or upper mid-range GPUs from Nvidia & AMD/ATI.


Sorry if I'm not making sense; I've had 2 hours of sleep in the last 48-50 hours.
I do not agree with your perception of Larrabee.
Larrabee 1 could well suck, and Larrabee 2 too; then our speculations would be worthless :LOL:
I think that you're wrong on the need for many/some CPUs optimized for serial execution.
Say you have two 24-core Larrabees. That's 48 cores.
Lots of the demanding tasks will be spread among that bunch of cores. Having some cores running the main game loop, the OS, the runtime environment, etc., even in a non-optimal fashion (in regard to what, say, a Penryn would achieve) won't hurt (or be THE limiting factor).
And nothing says that LRB will suck that much at serial execution: it's a really short pipeline, flushing a 5-stage pipeline won't have the same impact on performance as doing so on a long-pipeline CPU, and it will benefit from really low access times to L1 and L2. Having four threads will help even more to hide latencies and keep the core busy.
 
The PS3 memory architecture does give you some load balancing issues, but no more than a PC system would. There is the issue that most developers seem to be running low on main memory rather than graphics memory, which is something I don't think Sony expected. The poor read performance by Cell from RSX local doesn't help in this regard and makes it largely unviable to use RSX local memory for data to be consumed by Cell.

That was news to me; then again, I only know what I understand from what's posted on this board. But what I had managed to get into my head was that it was the graphics memory that was "low", i.e. better textures etc. on X360.

This is not directly a performance issue though, just an extra consideration when you're laying out your memory budgets. I'm not sure what information is public and what is not, so I'll limit my comment on performance to just saying that many games end up having to throw some SPUs at what might generally be considered GPU tasks, to be on par with 360.

So it's just the hassle of having to rewrite your code to use the SPUs instead of just throwing it at the GPU. Hopefully this is a write-once, reuse-for-new-projects task.
 
If Microsoft wants to make a huge leap beyond Xbox 360 and compete with PS4, they'll need a console that has:

a strong multi-core or many-core CPU from IBM, Intel or AMD
Intel's Larrabee or 2nd-gen Larrabee
an ATI or Nvidia GPU

Larrabee cannot, or should not, take over the role of the strong conventional multi-core or many-core CPU, and Larrabee also cannot take over the role of a conventional high-end GPU, IMO. Unless a lower-end, less advanced system is the plan.

For a next-gen console (be it XB3 or someone else's console):

CPU <> Larrabee GPGPU <> GPU or Rasteriser.

Larrabee might make an excellent & very important component for a next-gen console, but Larrabee should not be the heart of any next-gen console.

Xbox 360 is pretty well balanced between Xenon and Xenos.

PS3 is too dependent on CELL, with a somewhat lackluster GPU, when it should've had a higher-end (custom G80-based, with EDRAM) GPU to balance out the cutting-edge (for the time) CELL.

Larrabee will be better than CELL1 at rendering graphics itself; however, I doubt first-gen Larrabee will compete with the best high-end or upper mid-range GPUs from Nvidia & AMD/ATI.

Sorry if I'm not making sense; I've had 2 hours of sleep in the last 48-50 hours.

I think it would really depend on the overall package. Nintendo has shown that you don't need a really powerful overall hardware package to succeed. Yes, there are other factors that made that true, but still, if MS's next console is good enough with only LRBs, say in MCM form, and can be made at a relatively low cost, it would already be halfway there.

It would come down to development tools and how easy it would be for developers to transition from the X360 architecture to an LRB-only type architecture. Developers are already very familiar with X360 hardware, so deviating from it is of course a risk, but developers have already made that leap. Moving from the X360 programming model to an all-multicore-CPU model shouldn't be that hard if the tools are there to help them; after all, the tri-core in the X360 is just a smaller LRB.

Personally, I would really like to see an LRB-only solution from somebody next generation, whether it's MS or Nintendo or even SONY. Nintendo would probably be happy with a single LRB + Wii SoC for BC.


MS already has all the major game studios making games for the X360; I'm pretty sure they wouldn't mind programming a CPU-only machine next gen if the tools are good. I think the results would be quite interesting and would open the way for a console standard.
 
So it's just the hassle of having to rewrite your code to use the SPUs instead of just throwing it at the GPU. Hopefully this is a write-once, reuse-for-new-projects task.

Code is rarely that static. You always come up with new ways of doing things, or need changes to support new features. Having manually unrolled code inflates the time spent on changes/maintenance (bug fixes). Using intrinsics compounds the problem because you are programming at a low level of abstraction.
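
To illustrate the abstraction-gap point with something concrete (my own example, not the code being discussed): the same trivial loop written plainly and as hand-unrolled SIMD intrinsics. I'm using x86 SSE rather than SPU intrinsics since more people can compile it, but the maintenance cost it shows is the same — every change to the math has to be mirrored by hand in the unrolled version.

```cpp
// A small illustration (not code from this thread) of the abstraction gap:
// the same loop written plainly and as manually unrolled SSE intrinsics.
#include <xmmintrin.h>  // SSE intrinsics
#include <cstdio>

// Plain version: easy to read, easy to change.
void scale_add_plain(float* out, const float* a, const float* b, float k, int n)
{
    for (int i = 0; i < n; ++i)
        out[i] = a[i] * k + b[i];
}

// Hand-vectorised, manually unrolled version of the same thing.
// Assumes n is a multiple of 8.
void scale_add_sse(float* out, const float* a, const float* b, float k, int n)
{
    __m128 vk = _mm_set1_ps(k);
    for (int i = 0; i < n; i += 8) {
        __m128 a0 = _mm_loadu_ps(a + i);
        __m128 a1 = _mm_loadu_ps(a + i + 4);
        __m128 b0 = _mm_loadu_ps(b + i);
        __m128 b1 = _mm_loadu_ps(b + i + 4);
        _mm_storeu_ps(out + i,     _mm_add_ps(_mm_mul_ps(a0, vk), b0));
        _mm_storeu_ps(out + i + 4, _mm_add_ps(_mm_mul_ps(a1, vk), b1));
    }
}

int main()
{
    float a[8] = {1, 2, 3, 4, 5, 6, 7, 8}, b[8] = {}, out[8];
    scale_add_plain(out, a, b, 2.0f, 8);
    scale_add_sse(out, a, b, 2.0f, 8);
    std::printf("out[7] = %f\n", out[7]);  // 16.0
    return 0;
}
```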

Having a CUDA-like environment for CELL would make a lot of sense.
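
As a purely hypothetical sketch of what "CUDA-like" could mean here: you write a per-element kernel once and a runtime fans it out across whatever cores (or SPUs) are available. The launch_kernel name and the std::thread-based runtime below are inventions for illustration only, standing in for what an SPU job scheduler would actually do under the hood.

```cpp
// Hypothetical sketch of a "CUDA-like" data-parallel abstraction. No such
// official API existed for Cell; std::thread stands in for the SPU runtime
// and launch_kernel() is an invented name.
#include <thread>
#include <vector>
#include <cstdio>

// Fan a kernel out over the index range [0, n) using `workers` threads.
template <typename Kernel>
void launch_kernel(int n, int workers, Kernel kernel)
{
    std::vector<std::thread> pool;
    for (int w = 0; w < workers; ++w) {
        pool.emplace_back([=] {
            // Each worker takes a contiguous slice of the range, much like
            // a block takes a slice of the grid in CUDA.
            int begin = (n * w) / workers;
            int end   = (n * (w + 1)) / workers;
            for (int i = begin; i < end; ++i)
                kernel(i);
        });
    }
    for (auto& t : pool)
        t.join();
}

int main()
{
    std::vector<float> a(1024, 1.0f), b(1024, 2.0f), out(1024);

    // The "kernel": written once, with no intrinsics or DMA in sight.
    launch_kernel(1024, 6, [&](int i) { out[i] = a[i] + b[i]; });

    std::printf("out[0] = %f\n", out[0]);  // 3.0
    return 0;
}
```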

Cheers
 
c2.0, what could you see Intel doing in order to improve Larrabee's performance?
I submitted a post here:
http://forum.beyond3d.com/showpost.php?p=1213664&postcount=200

I think with any (new) architecture there's always room for improvement. I'm sure the engineers and designers have a backlog of ideas they'd like to try, simulate, evaluate, etc., but at some point you just have to lock down and prepare for fabrication.
Then there's market response: what kind of software ends up being thrown at it, and how this software behaves. This should give them a much broader range of scenarios to tune for, and they might find opportunities there (probably too late for Larrabee v2, but maybe later generations).
As for concrete examples, I think it's much too early to speculate (we don't even have Larrabee 1 specs). I think memory architecture is going to play an increasingly important role as we move to more and more parallel architectures, both for sheer throughput and for the latency of communication and synchronization, so I expect to see some work there.
 
I've been pushing the theme of this thread for a while now, and though Intel lobbying Microsoft doesn't mean that this will come to pass, people saying it won't come to pass due simply to the issues of XBox 1 are way off base. This is much more important for Intel than it is for MS, and if there is even a chance they can succeed in adoption, they will push for it and push hard. I wouldn't be surprised if Intel offered MS chips at or below cost in order to secure the contract, with many favorable forward-looking provisions included as well. On a mm^2 basis, that would probably make these chips the cheapest that MS could source from anywhere.

What MS would provide Intel in return is a large and captive developer audience to become familiar with the architecture when they might otherwise be hesitant or wary, and a massive tools and DirectX effort aimed at enhancing Larrabee's approachability.

Again this isn't saying I think L will show up in a console or it won't, but you can be sure that XBox 1 style contracts won't be a factor in the situation whatsoever.
 
I think with any (new) architecture there's always room for improvement. I'm sure the engineers and designers have a backlog of ideas they'd like to try, simulate, evaluate, etc., but at some point you just have to lock down and prepare for fabrication.
Then there's market response: what kind of software ends up being thrown at it, and how this software behaves. This should give them a much broader range of scenarios to tune for, and they might find opportunities there (probably too late for Larrabee v2, but maybe later generations).
As for concrete examples, I think it's much too early to speculate (we don't even have Larrabee 1 specs). I think memory architecture is going to play an increasingly important role as we move to more and more parallel architectures, both for sheer throughput and for the latency of communication and synchronization, so I expect to see some work there.
Thanks for your response. I'm lacking patience, but you're right, I'll see soon enough how Larrabee fares.
By the way, 3Dilettante also answered me in the other thread, and it appears that I'm not a hardware designer for a reason... and a good one :LOL:
 
I've been pushing the theme of this thread for a while now, and though Intel lobbying Microsoft doesn't mean that this will come to pass, people saying it won't come to pass due simply to the issues of XBox 1 are way off base. This is much more important for Intel than it is for MS, and if there is even a chance they can succeed in adoption, they will push for it and push hard. I wouldn't be surprised if Intel offered MS chips at or below cost in order to secure the contract, with many favorable forward-looking provisions included as well. On a mm^2 basis, that would probably make these chips the cheapest that MS could source from anywhere.

What MS would provide Intel in return is a large and captive developer audience to become familiar with the architecture when they might otherwise be hesitant or wary, and a massive tools and DirectX effort aimed at enhancing Larrabee's approachability.

Again this isn't saying I think L will show up in a console or it won't, but you can be sure that XBox 1 style contracts won't be a factor in the situation whatsoever.

Except for the money MS might save going with Intel, isn't doing what devs want important too? I mean, I don't know much about Larrabee, but won't it be quite a bit different to work on compared to what AMD or Nvidia have? The X360 finally seems to have given MS some decent market space, at least as far as being friendly with devs and selling loads of software goes.

Wouldn't they be cutting off their own fingers if they tried to save money but brought trouble to the devs? A lot of devs aren't too happy with the costs of making games, and the general idea seems to be that the X360 is a relatively easy machine to work on compared to the PS3. It doesn't seem too smart to me to maybe make the mistake Sony did and use something which might give devs a hard time.

OTOH, if development isn't really an issue, I don't see why Intel's chip couldn't end up in a console.
 
Except for the money MS might save going with Intel, isn't doing what devs want important too? I mean, I don't know much about Larrabee, but won't it be quite a bit different to work on compared to what AMD or Nvidia have? The X360 finally seems to have given MS some decent market space, at least as far as being friendly with devs and selling loads of software goes.

Wouldn't they be cutting off their own fingers if they tried to save money but brought trouble to the devs? A lot of devs aren't too happy with the costs of making games, and the general idea seems to be that the X360 is a relatively easy machine to work on compared to the PS3. It doesn't seem too smart to me to maybe make the mistake Sony did and use something which might give devs a hard time.

OTOH, if development isn't really an issue, I don't see why Intel's chip couldn't end up in a console.

Well of course, that's (part of) the reverse side of the equation. Microsoft needs to evaluate how far they can bring the tools forward in order to provide a turn-key solution to developers not willing to shift effort/thinking. It's sort of a tug-of-war between the benefits for Intel, the hassles to Microsoft, and what Intel can do to help offset those hassles in order to encourage MS adoption.

On top of this MS may wish to maintain some architectural relation in order to preserve B/C in future consoles... though who knows how important B/C will be going forward in terms of benefits vs constraints. And I think most importantly of all, Big L is an unknown, so whatever test samples and advance documentation may provide, these are choices that need to be made years in advance, and MS would be making a gamble that the thermal and performance characteristics would come in line with targets. Intel is the master, but they have some costly target misses under their belt as well.
 
I'll reiterate from the other thread: it's incredibly early and all, but I think there is at least a 50-50 chance Larrabee does show up in the next Xbox.

The thing I'll have to wonder about is whether the performance will be there versus a traditional ATI or Nvidia monster. If not, there's no point.

If LRB is tailored towards being much more general purpose, as it seems to be, well, that's the last thing you need in a console GPU, right? I mean, as long as it has great gfx performance the GP stuff wouldn't hurt anything, but it certainly won't be useful in a console.

Unless people are thinking it could, say, act as both CPU and GPU?
 
If LRB is tailored towards being much more general purpose, as it seems to be, well, that's the last thing you need in a console GPU, right? I mean, as long as it has great gfx performance the GP stuff wouldn't hurt anything, but it certainly won't be useful in a console.

On the contrary, a console is the perfect arena for non-mainstream hardware. The extra programmability could definitely be useful, and only on a console would you have the install base to justify investing in that potential. For PC, most developers will stick to the dominant standard and avoid specializing, especially for a small part of the market. For Larrabee discrete graphics, Intel are likely to have to push it out with partner programs, paying developers, to ensure some exclusive content is produced to show off the platform.
 
On the contrary, a console is the perfect arena for non-mainstream hardware. The extra programmability could definitely be useful, and only on a console would you have the install base to justify investing in that potential. For PC, most developers will stick to the dominant standard and avoid specializing, especially for a small part of the market. For Larrabee discrete graphics, Intel are likely to have to push it out with partner programs, paying developers, to ensure some exclusive content is produced to show off the platform.

What I mean is, a GP-tailored GPU should theoretically, on PC, have the ability to do things besides graphics (like video encoding) better, right?

Those things aren't useful in a console. A console GPU only needs to focus on pumping out pretty pictures.

Or are you saying, LRB's flexibility would help it push out better graphics than a "standard" GPU?
 
In my opinion Intel is wasting their time talking to either Microsoft or Sony. They should be talking to Nintendo. Both Sony and Microsoft have multi-core architectures which can simply have the number of cores increased to give a boost in performance with full backward compatibility in dev tools and games. It doesn't make sense for them to do anything else. Nintendo on the other hand will need to bring in a new multi-core console at some stage.
 
Frankly, Nintendo will do just fine with a 2- or 4-core OoOE CPU and a decent GPU. Oh wait, probably even MS and Sony would do fine with the same basic, just more powerful, configuration.
 
Based on some of the replies to my post the other night, I was thinking: if a custom, Microsoft-tailored Larrabee 2 (in 2011-2012) could offer a few powerful OoO cores with good single-threaded performance, plus a bunch more of the (improved) simpler cores that first-gen Larrabee will have (64 to 128 instead of 16-48), then Microsoft could have an awesome, Cell2-rivaling/beating CPU/GPGPU that's also the front end for a pixel pumper: a monster ATI GPU, or just a rasteriser with EDRAM (like PS2's GS but generations ahead), or something like a Xenos2. Thus, a 2-chip console. The ATI chip would not do anything except render to the screen; it wouldn't need to calculate anything on the front end (vertex shaders, geometry shaders), which would all be in the hands of the custom Larrabee 2. The ATI chip could even be a split die on one package, logic + EDRAM. Just guessing.
 