Predict: The Next Generation Console Tech

That's assuming Cell doesn't evolve either though.
Thing is though, with a GPU that programmable, do we really want another discrete CPU with a different arch, even if it's easy to work with?
No, we don't.. so you had the same 'crazy' idea :) Cell evolution or not, better having just one architecture to learn/to code for.

One architecture to rule them all!
 
nAo said:
No, we don't.. so you had the same 'crazy' idea :) Cell evolution or not, better having just one architecture to learn/to code for.
Of course, IIRC weren't we both part of the discussions of this same crazy idea (one architecture to rule them all) 5-6 years ago around here, when the first Cell patents surfaced ;)

But it does look more feasible than ever now.
 
OAtRTA is the future! The same ISA across all platforms, handhelds and consoles, no struggling with different languages. It would be like writing a book in English as opposed to writing it in English, German and Swahili. And if every platform used the same architecture, the way x86 dominates the PC, there'd be no developers whinging about a platform being too hard - they'd just have to get on with it! Of course, if there was only one common architecture, there may as well be only one console...
 
Aside from the inability to differentiate a platform if everyone adopts "x86" for everything, may I ask where all this faith in Larrabee being able to keep pace with AMD's and Nvidia's next offerings is coming from? (I highly doubt these companies are going to take Larrabee lying down.)

We've seen precisely nothing in real-world terms, to my knowledge, that defines what Larrabee is actually capable of. I am not one to make light of Intel engineers' skills, but I still feel there should be a healthy amount of skepticism in place until we actually see something up and running "with the best of the rest" in whatever markets Larrabee competes in.

On that matter, Larrabee's target market seems to have changed just recently from the low-to-mid range to the high end of the spectrum, so maybe this anecdotal evidence is suggestive of Larrabee actually being able to get the job done compared to its peers.

I dunno. Is somebody seeing something the rest of us aren't? (Besides Intel.)
 
It doesn't matter whether it's Larrabee or Cell or something else entirely. As long as it works and is a single platform, I'm sure many a dev will welcome it with open arms. AMD and nVidia's offerings will have to be a standalone CPGPU combo thing to satisfy this market. If whatever they make can't run an OS and full system on its own, or requires two different coding models to access CPU and GPU functions, they won't be offering the Ultimate Ease solution.
 
Hs
Isn't Intel missing a huge opportunity to clean up the x86/x87 instruction set?
(Not move away from it, just correct the weird parts or drop support for some really old features.)
/Hs
 
It doesn't matter whether it's Larrabee or Cell or something else entirely. As long as it works and is a single platform, I'm sure many a dev will welcome it with open arms. AMD and nVidia's offerings will have to be a standalone CPGPU combo thing to satisfy this market. If whatever they make can't run an OS and full system on its own, or requires two different coding models to access CPU and GPU functions, they won't be offering the Ultimate Ease solution.

I understand this.

I was speaking more to just how likely it is that what is being suggested will happen.

I don't see it as very likely from the hardware perspective.

It's dopey to suggest this, but I would place more faith in OpenCL providing the illusion of a single platform for everyone than I would in a single platform at the hardware level ever coming into existence. Barring the market becoming completely monopolized, of course.

It's a dopey suggestion because, like anything else, OpenCL is an API, and to compete on consoles you often go a bit beyond what "standard" APIs give you in other spaces.
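
Just to make the "illusion of a single platform" concrete, here is a minimal, purely illustrative OpenCL host sketch (not from any console SDK; error handling omitted): the same kernel source is handed to whatever device the runtime exposes, CPU, GPU or a Cell-like accelerator, and the host code never changes.

Code:
/* Minimal OpenCL sketch: one kernel source, any device. Illustrative only. */
#include <CL/cl.h>
#include <stdio.h>

static const char *src =
    "__kernel void scale(__global float *v, float k) {"
    "    size_t i = get_global_id(0);"
    "    v[i] = v[i] * k;"
    "}";

int main(void) {
    float data[1024];
    for (int i = 0; i < 1024; ++i) data[i] = (float)i;

    cl_platform_id plat; cl_device_id dev;
    clGetPlatformIDs(1, &plat, NULL);
    clGetDeviceIDs(plat, CL_DEVICE_TYPE_DEFAULT, 1, &dev, NULL); /* whatever is there */

    cl_context ctx = clCreateContext(NULL, 1, &dev, NULL, NULL, NULL);
    cl_command_queue q = clCreateCommandQueue(ctx, dev, 0, NULL);

    cl_mem buf = clCreateBuffer(ctx, CL_MEM_READ_WRITE | CL_MEM_COPY_HOST_PTR,
                                sizeof(data), data, NULL);

    cl_program prog = clCreateProgramWithSource(ctx, 1, &src, NULL, NULL);
    clBuildProgram(prog, 1, &dev, NULL, NULL, NULL);
    cl_kernel k = clCreateKernel(prog, "scale", NULL);

    float factor = 2.0f;
    clSetKernelArg(k, 0, sizeof(buf), &buf);
    clSetKernelArg(k, 1, sizeof(factor), &factor);

    size_t global = 1024;
    clEnqueueNDRangeKernel(q, k, 1, NULL, &global, NULL, 0, NULL, NULL);
    clEnqueueReadBuffer(q, buf, CL_TRUE, 0, sizeof(data), data, 0, NULL, NULL);

    printf("data[2] = %f\n", data[2]); /* 4.0 regardless of which device ran it */
    return 0;
}

Of course this only papers over the hardware; the console-specific "beyond the API" work is exactly what it can't give you.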

I would love to be wrong though.

I do expect things will be much easier in the next round (it's pretty hard to misinterpret how desirable it is), but I don't see the powers that be as nearly so benevolent as to be willing to fight on even terms for the sake of offering the ultimate ease-of-use option to developers.

If you mean that one platform may use only Larrabee and another platform would use only Cell2, then I apologize for misunderstanding you. It's somewhat more plausible, but I still don't think it is all that likely. It still requires a fairly large leap of faith in either architecture, and for AMD and Nvidia both to fail in enticing the console vendors to use their hardware, for competitive advantage if nothing else.

I'm not against the idea at all. I just don't see any reason to believe it will happen...as of yet.
 
Of course, IIRC weren't we both part of the discussions of this same crazy idea (one architecture to rule them all) 5-6 years ago around here, when the first Cell patents surfaced ;)

But it does look more feasible than ever now.

Will it be atrocious as far as the price:performance ratio goes, though?

Would you seriously want, in 3-5 years, a single Larrabee-based solution over a CPU+GPU combo?

Well, it would fit my view of a Windows 7 customized PS4-Xbox union to really rule them all ;).

Backwards compatibility with PS3 would be a nightmare though...

Even if Larrabee is everything it's claimed to be and more... an evolved CELL v2 plus a modern and optimally integrated nVIDIA GPU would not be bad performance-wise or software-development-wise, as you could transfer all of your PS3 knowledge onto it. Think about it from the OS, libraries, and tools points of view too: would it really be better to throw all the current stuff away and rewrite it from scratch? More importantly, for someone in the audience ;), would you trust SCE to get it right in time for launch (or hopefully 6+ months earlier) ;)?
 
If Sony adopts Larrabee as the GPU for its next-gen console they will have to get rid of CELL; it would be embarrassing to have developers not using the CPU because the 'GPU' is way easier to program for (no local store, better ISA, better tools, etc.) ;)

Why would developers not use it (again, even assuming that a CELL v2 would not show some nice new advancements, like a caching hierarchy for the SPUs and other things which might make the programming model more compatible with what Larrabee provides) after so many years spent learning what to do with it?
 
I just can't see PS4 being anything other than Cell2 plus an Nvidia or ATI GPU. With the huge development cost of Cell, plus all the benefits they would derive from complete backwards compatibility with dev tools, games, etc., anything else wouldn't make much sense.

I think that Sony had the design idea for a processor that would last at least two or three generations, hence the huge investment in the Cell project.

Meaning that the investment would be spread over two or three generations, making the development cost of PS4/PS5 that much cheaper and bringing profits much earlier within those consoles' lifespans.
 
If you mean that one platform may use only Larrabee and another platform would use only Cell2, then I apologize for misunderstanding you...
I'm entertaining both ideas. The idea of a single-chip console is good for devs. PS4 could be Cell2, and neXtBox could be Larrabee, and the portables from these companies would likewise use the same chip in a scaled-down version. Developers would have the simplest hardware model to worry about. Can you write Cell2 code? Yes? Then you can develop for PS4, and scale the same code down to PSP2 effortlessly, and scale up the rendering of PSP2 titles as PS4 downloads, and use the same rendering techniques on CellTV, etc. This was the vision entertained with the early news about Cell.

Now, even better for the devs, I'm sure, would be standard hardware across manufacturers, so that all boxes were based on Chip Architecture X and code was perfectly portable across boxes. I can't see that one ever happening, though.

Panajev2001a said:
Well, it would fit my view of a Windows 7 customized PS4-Xbox union to really rule them all ;).
You keep your Windows away from my consoles!
 
Why would developers not use it (again, even assuming that a CELL v2 would not show some nice new advancements, like a caching hierarchy for the SPUs and other things which might make the programming model more compatible with what Larrabee provides) after so many years spent learning what to do with it?
Do you think CELL2 will come with a fully working and optimized software renderer, OpenCL/compute shaders, Ct, TBB, a shader compiler, and all sorts of Intel tools already available on x86 (which Intel will likely extend to support Larrabee)?

These might not seem so important, but I've seen lots of (even smart..) devs freaking out because to debug CELL code they have to do it outside Visual Studio. This makes me think that Sony needs to give devs a powerful machine, but also some sort of 'dev experience' that they are already comfortable with.
Project complexity will grow even more; we need to make development as easy as possible (easy, NOT dumb), perhaps relaxing perf requirements a bit.
SPUs are insanely faster than 360 cores at almost everything, but gamers have to try hard to see any difference between the two consoles. Something went wrong somewhere, don't you think?

IMHO it wouldn't make any sense to have two different highly programmable multicore architectures on the same console (more stuff to learn, more support to provide to developers, more complicated tools to develop/maintain/debug). If you have one that can handle graphics pretty well, go for it. Pair it with an easy-to-use/fast/cheap CPU or slap two massively multicore chips in it. Make it fast, make it relatively cheap and, most of all, make sure that developers are not getting shafted again (it doesn't matter if they got shafted for real, it's crystal clear that many believe they did..) and give them powerful and simple tools to write code on these beasts. The rest will come.... unless Wii2 takes over the world as we (used to) know it :)
 
Reading the comments, it sounds like MS could be the only one of the three manufacturers able to deliver on the software side.
Intel seems to have put a lot of work into the cache; do you guys think IBM/DAMIT would be able to provide a compliant clone by the time frame in which the next systems are likely to launch?
Could MS afford the R&D if none of the aforementioned companies are interested in using the chip on their own?
The ATI/AMD case is interesting; they might be willing to share efforts with MS, or even let MS buy the IP, BUT I'm not sure AMD could legally do so if the chip is x86-based.
 
In your opinion then, nAo, Cell hasn't really lived up to expectations due to it still not having the right dev tools?
 
I concede that Sony has a wondrously difficult challenge in front of them if they are to attempt to outpace both MS and Intel on the tools front "alone."

The caveat that I offer is throwing IBM back into the equation. If IBM wishes to retain Sony ( Sony is as much a customer as it is a partner IMO ) then they will need to throw the full brunt of their ingenuity at making Cell2 as accessible as possible.

I feel IBM is more than willing to capitulate, given that better tools will only strengthen the value of their other products as well. This is assuming STI does as I expect and brings modern memory coherence to Cell in a cheap but effective manner. If not, there is no hope of writing such tools and indeed Cell2 will be dead in the water. If so, then any tools for PPC will extend to Cell2 and will only differ when it comes to low-level optimization.

I feel IBM's RAD can give Visual Studio a run for its money any day of the week and I'm not alone according to Evans Data Reports.

http://www.evansdata.com/reports/2008IDE.php?rid=QXJ003

Sony SHOULD bite the bullet and hand developers IBM's RAD. It will go a LONG way to making developers' lives "easier." Further, I feel that Sony should start acting like any other customer and leave tools development to IBM... they certainly know much better what they're doing, and secondly this would secure enterprise-level support for the tools themselves. No knock on the open source community, but the sort of support demanded in today's modern world just isn't there.

I believe Sony has the talent on hand to write a completely software-driven and well-optimized renderer for Cell2. With SCEA, SCEE, and SCEJ on tap and ripe with people who know what real-time graphics are all about, one could argue that Sony is in an even better position than Intel to accomplish this task. If Sony and IBM cannot collaborate to produce decent if not exceptional results here, I will be quite amazed... and wholly disappointed.

Who would implement the OpenCL standard for Cell2 is an open question. The GNU group may do it... for "free", but then I expect IBM to do so as well. I can't imagine OpenCL support isn't in the cards for Cell2 if IBM wishes to seriously contend in the HPC market heading into the future. Who would do a better job? That's of course highly debatable. Who would do a bad job? Neither, IMO. IBM certainly cannot afford to miss the mark here.

This discussion is probably suited to another thread but it is clear that offering developers easy access to whatever hardware you bring to market is of critical importance.

This is something that Sony has obviously underestimated this round, and it is clearly where we all can look to ascertain "what went wrong." However, if Sony is smart enough to learn anything from recent events, they have compelling options out there to consider which will make everyone happy without completely tossing Cell out with the bathwater.

The expense to Sony to get it right for developers IMO is more than worth it. In truth, Sony can't afford not to.

If we are going to entertain the idea of a single multi-core architecture to do it all in Sony's next console, to the exclusion of whatever AMD or Nvidia present, then I would like to offer that sticking with Cell2 isn't quite so bleak a proposition... if handled properly.

If Sony attempts to go it alone, or without trying to secure well-supported enterprise solutions, they're pretty much predestined to fail as far as I am concerned. History is not on Sony's side if they do otherwise.

I'm still not sold that it is likely a single "arch" solution will be offered by either MS or Sony, but I want to make it clear I am not opposed to the idea.
 
Don't forget Nintendo either; they could be interested in Larrabee because, unlike MS and Sony, Nintendo did not invest heavily in this area this generation. Maybe they could have just a single Larrabee chip in their next console to handle all the work.
 
In your opinion then, nAo, Cell hasn't really lived up to expectations due to it still not having the right dev tools?
Well, I'm wondering why we don't have something like CUDA (at least a subset of it..) on CELL; what were/are they waiting for?
I'm also wondering why Microsoft and Sony have not provided developers with optimized shader compilers for their CPUs. Texture sampling aside (which doesn't map well to the 360 and PS3 CPUs), the modern shader programming model can be used to tackle a lot of different problems, while being simple to use, fast to compile and relatively easy to debug. It seemed to me such a straightforward thing to do, but I guess I was wrong :)
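
For illustration only (this is a hypothetical sketch, not any real PS3/360 toolchain or API), the "shader-style" programming model being described boils down to a pure per-element function over a data stream. A platform-provided compiler could map the same kernel onto SPUs, VMX units or GPU ALUs without the developer changing the source:

Code:
/* Hypothetical sketch of a shader-style kernel written in plain C. */
#include <stdio.h>

typedef struct { float x, y, z; } vec3;

/* Runs once per element, like a pixel shader over a stream:
 * a pure function of its inputs, no pointer chasing, no shared state. */
static vec3 lambert_kernel(vec3 n, vec3 l, vec3 albedo)
{
    float ndotl = n.x * l.x + n.y * l.y + n.z * l.z;
    if (ndotl < 0.0f) ndotl = 0.0f;
    vec3 out = { albedo.x * ndotl, albedo.y * ndotl, albedo.z * ndotl };
    return out;
}

/* On a PC this is a plain loop; a console toolchain could instead split the
 * range across SPUs or hardware threads, streaming the arrays through local
 * store with double-buffered DMA. */
static void shade_all(vec3 *out, const vec3 *normals, vec3 light_dir,
                      const vec3 *albedo, int count)
{
    for (int i = 0; i < count; ++i)
        out[i] = lambert_kernel(normals[i], light_dir, albedo[i]);
}

int main(void)
{
    vec3 normals[2] = { {0, 1, 0}, {1, 0, 0} };
    vec3 albedo[2]  = { {1, 0, 0}, {0, 1, 0} };
    vec3 light      = { 0, 1, 0 };
    vec3 out[2];
    shade_all(out, normals, light, albedo, 2);
    printf("%f %f\n", out[0].x, out[1].y); /* 1.0 and 0.0 */
    return 0;
}

The restrictions are the whole point: because each element is independent and touches no shared state, the same source can be retargeted to very different hardware, which is exactly what a vendor-supplied "shader compiler for the CPU" would exploit.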
 
I feel IBM's RAD can give Visual Studio a run for its money any day of the week and I'm not alone according to Evans Data Reports.

http://www.evansdata.com/reports/2008IDE.php?rid=QXJ003

Sony SHOULD bite the bullet and hand developers IBM's RAD.
It will go a LONG way to making developers' lives "easier."
It's already difficult to convince people with a 360 AND a PS3 devkit on their desks to work on PS3 (with Visual Studio); if you take it away from them, you'd better cancel the PS3 version of your game too :)
Sad but true..

I believe Sony has the talent on hand to write a completely software-driven and well-optimized renderer for Cell2. With SCEA, SCEE, and SCEJ on tap and ripe with people who know what real-time graphics are all about, one could argue that Sony is in an even better position than Intel to accomplish this task. If Sony and IBM cannot collaborate to produce decent if not exceptional results here, I will be quite amazed... and wholly disappointed.
In order to have an efficient software renderer based on rasterization and texture mapping you'd need to drastically modify the CELL architecture. I just don't see this happening.
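
To illustrate why (a rough, hypothetical sketch, not actual renderer code): the inner loop of texture mapping is dominated by data-dependent reads, which is exactly what the SPUs' software-managed 256 KB local store handles worst; a cached CPU or a GPU's texture units absorb these accesses in hardware.

Code:
/* Bilinear texture fetch: four reads whose addresses depend on the per-pixel
 * (u, v) known only at shading time. Single-channel texture for brevity. */
#include <stdio.h>

typedef struct { int w, h; const float *texels; } texture;

static float sample_bilinear(const texture *t, float u, float v)
{
    float fx = u * (float)(t->w - 1);
    float fy = v * (float)(t->h - 1);
    int x0 = (int)fx, y0 = (int)fy;
    int x1 = x0 + 1 < t->w ? x0 + 1 : x0;
    int y1 = y0 + 1 < t->h ? y0 + 1 : y0;
    float ax = fx - (float)x0, ay = fy - (float)y0;

    /* Scattered, unpredictable reads: an SPU would have to DMA each region
     * into local store by hand before it could touch these texels. */
    float c00 = t->texels[y0 * t->w + x0];
    float c10 = t->texels[y0 * t->w + x1];
    float c01 = t->texels[y1 * t->w + x0];
    float c11 = t->texels[y1 * t->w + x1];

    float top = c00 + (c10 - c00) * ax;
    float bot = c01 + (c11 - c01) * ax;
    return top + (bot - top) * ay;
}

int main(void)
{
    const float data[4] = { 0.0f, 1.0f, 0.0f, 1.0f }; /* 2x2 checker */
    texture t = { 2, 2, data };
    printf("%f\n", sample_bilinear(&t, 0.5f, 0.5f));  /* 0.5 */
    return 0;
}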
 