Predict: The Next Generation Console Tech

Status: Not open for further replies.
Well, Charlie's writing that Intel has secured the GPU contract for the PS4. I'm not linking to the article, since it's just the typical ranting and horrible analysis, but if that one point is true it's certainly a big deal. Of course, in the many Larrabee discussions that have taken place around here, we knew that Intel would very much need/want a console to jump-start Larrabee's industry adoption. If it is true, then it will be interesting to see what happens on the CPU side: whether they stay with Cell or go x86. Either would work fine alongside Larrabee, but I would at least expect Intel to push for it.
 
An x86 "PPE" with SPEs :?:
 
If Intel does get a version of Larrabee into the PS4, adding an x86 CPU from Intel as well would require Sony to overcome its misgivings about having a single-source supplier for both its CPU and GPU (one that isn't Sony, anyway).

Microsoft's experience with trying to get cost reductions out of Intel might still be informative, though the future of the semiconductor industry is pretty foggy, and Intel might be more flexible if it finds its fab capacity would otherwise go unused.

x86 on the PS4 would be the original Xbox situation, times two, with regard to a lack of control over the main components.

The idea of the PS4 using Larrabee as both CPU and GPU could be mooted--if Intel's position on Larrabee's target markets allowed Larrabee to be used as a host processor. Some statements from the company seem to indicate it will not.

I'm not sure if the timing is right by 2012 for Intel to massage away the ISA fork Larrabee has introduced, if it ever intends to.
 
I suppose the extent to which Intel is "brought on" will play a large role. Is Intel going to be involved step by step in helping to architect the system, or is it going to be more the traditional collaborative GPU-vendor relationship we're used to from the console industry? Either way, we know this is going to have a big effect on ISA and tools development, which I do expect will be very close/collaborative throughout development.

I think Sony wouldn't have been averse to going STI all the way on the GPU as well, for instance, if STI had had a viable GPU to offer, which just makes me wonder about the above all the more.

We know that IBM wasn't interested in losing Sony or its R&D dollars (or fab output draw), though, so one would think arguments to remain on that track would have been easy enough to make. Cell, if not as an architecture per se, has been a godsend for IBM: it put their minds in the heterogeneous game and their people on tool development precisely before they might have been blindsided by the present shift in computing.

I am very curious as to whether the CPU will remain IBM or not. I mean I love all novel architectures and Cell especially, so I would love to see its life extended in a v2.0, if only to see what it would look like.
 
Ah, the Inquirer.

It might be a good deal for Sony if Intel is as aggressive as it was in capturing Apple's business: first-rate fabbing technology, high-volume capacity, and low costs in return for Sony leading the way in making Larrabee relevant.

But this would seem to blow out backwards compatibility for the PS3-to-PS4 transition, and it seems a weird choice for Sony when they don't have a full DirectX-style stack to abstract the hardware and to allow for software rendering with Larrabee.

Weird, weird, weird.
 
What does this mean for Cell? Could it possibly coexist with Larrabee?

Of course. But one might wonder, BC aside, why they'd use Cell instead of a second Larrabee... unless, as oft speculated, the CPU is going to be 'small' in the PS4 (i.e. a 12- or 16-SPE Cell), in which case cost might be an obvious reason why they wouldn't put in a second Larrabee versus a small Cell.

As for the Inquirer report... it certainly would be interesting, and I'm sure Intel has been talking to all the console providers, but I sort of have my doubts about this coming to fruition (and certainly have doubts about the Inquirer regardless :p).
 

I think, as it stands now, Larrabee is off the CPU track for all intents and purposes and should just be viewed as a GPU. If it can be used in that regard, then it very well might be. But that's sort of what 3dilettante was batting around above.
 
Using Larrabee is an interesting idea, but given Charlie's record of reliability in reporting on the PS3, it's very, very hard to believe.
 
Given Sony's track record of addiction to exotic and non-standard solutions for its consoles, Charlie's assumption could be quite plausible here.
 
I just don't get it, do people really believe in all this?
(image: "I Want to Believe" poster)


:oops:
 
Why would Sony go with Intel instead of a proven GPU developer like Nvidia or AMD? I'm admittedly not up to speed on the latest details regarding Larrabee, but I have a hard time believing Intel will be able to best Nvidia and AMD at their own game.
 
I can imagine how the PS3's RSX was a kind of humiliation for Sony, but an inevitable one, given the time constraints for picking a graphics part for the console. A dumbed-down piece of PC hardware, slapped straight into a very custom flagship platform, doesn't fit well with the overall "untouchable" image inherited from the PS2.
All this is just to hint that whatever the choice for PSNext, it won't be anything less than what MS managed to get in its deal with ATI, and I can't see how (or whether) NV could agree to that level of IP contract for a very custom ASIC.
 
I find it really hard to believe Sony will be using an Intel GPU in the PS4.

There's no way in hell Intel would let Sony second-source production to someone else (if that were even possible given Intel's process edge), and if not, how could they make a contract where Sony doesn't risk getting treated the same way Microsoft was over the Xbox's CPU?

The source: a boot lady. I'm sorry, but I think I'll let this rumour pass for now.
 
I also find it hard to believe that Sony will want to go with NV again after RSX: broken scaler, can't emulate the GS for PS2 emulation, and a separate bus for the extra memory chips. I'm pretty sure they'd prefer something with unified memory that can scale properly and emulate the PS2. Though going with another vendor might make it hard to emulate the PS3.

I thought Sony contracted PowerVR for the PSP2? PowerVR should be able to do a GPU for the PS4 too, I presume.
 
From my layman's perspective (compared to many of you):

It seems that a lot of the extra workload for next-generation CPUs is extremely floating-point intensive if you consider physics, animation, and perhaps next-generation interfaces that involve a lot of data points. Couldn't these be offloaded onto a hypothetical Direct3D 12 GPU, which would leave the CPU free to perform the tasks it does strictly better than the GPUs of the day?

Furthermore, if we take Intel's assertion that off-chip bandwidth is expensive as gospel, wouldn't it make sense for the console itself to utilise a 'local store' of memory to keep frequently used information close by, without having to go over an expensive outside bus?
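The local-store argument can be put in back-of-envelope terms. A minimal sketch, with entirely hypothetical numbers (not real console specs): if a working set is re-read several times per frame, a local store large enough to hold it means each byte crosses the external bus only once per frame instead of once per reuse.

```python
# Back-of-envelope: off-chip traffic with and without an on-package local store.
# All numbers below are hypothetical illustrations, not real console specs.

frame_rate = 60          # frames per second
working_set_mb = 32      # data touched per frame (MB)
reuse_per_frame = 8      # times each byte is re-read during a frame

# Without a local store, every reuse goes over the external bus:
naive_gb_s = working_set_mb * reuse_per_frame * frame_rate / 1024

# With a local store big enough to hold the working set,
# each byte crosses the bus only once per frame:
cached_gb_s = working_set_mb * frame_rate / 1024

print(f"naive: {naive_gb_s} GB/s, local store: {cached_gb_s} GB/s")
# prints: naive: 15.0 GB/s, local store: 1.875 GB/s
```

The ratio is just the reuse factor, which is why on-package memory pays off most for data that's touched many times per frame.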

So, taking both into account, wouldn't some sort of multi-chip module make a lot of sense in terms of manufacturing efficiency, especially as it would help keep latency to a minimum and facilitate the use of the GPU's massive parallel-processing advantage? It's about efficiency and cost, and wouldn't making the best use of the advantages, and mitigating the disadvantages, of their available resources make the most sense in developing a cheap and powerful system for the next generation?

I don't really see the total wattage of a next-generation console exceeding 150 W, with 100 W feeling more realistic, so having everything covered under the same packaging should save some money and potentially give better performance as well.
 