Predict: The Next Generation Console Tech

Status
Not open for further replies.
16-wide vector units are efficient for graphics but not much else. That's why most systems (including x86 chips) use 4-wide units.

It doesn't matter that much. In the end you need to use both for graphics. And whether you use 1 vector or 4, in most cases you'll want 4, because of ALU instruction latency.
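The latency point can be illustrated even in plain scalar C (a sketch, not console code): summing through one accumulator chains every add behind the previous one, while four independent accumulators, mirroring the "use 4 vectors" advice, let the adds overlap in the pipeline.

```c
#include <stddef.h>

/* Summing with a single accumulator serializes every add behind the
 * previous one; with a multi-cycle FP add latency the ALU sits idle.
 * Four independent accumulators (one per "vector") keep it busy. */
float sum4(const float *x, size_t n)
{
    float a0 = 0.f, a1 = 0.f, a2 = 0.f, a3 = 0.f;
    size_t i;
    for (i = 0; i + 4 <= n; i += 4) {
        a0 += x[i + 0];   /* four independent dependency chains,  */
        a1 += x[i + 1];   /* so the adds can overlap in the pipe  */
        a2 += x[i + 2];
        a3 += x[i + 3];
    }
    for (; i < n; ++i)    /* tail elements */
        a0 += x[i];
    return (a0 + a1) + (a2 + a3);
}
```

The same trick is what a 4-wide SIMD unit does in hardware across its lanes; compilers often apply it automatically when allowed to reassociate floating-point adds.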

I can see Larrabee being used as a GPU but I can't see it beating Nvidia or ATI. It might beat Cell, but only on highly parallel tasks.

Larrabee is quite similar to modern GPU designs (read: NV/AMD): a couple of round-robin schedulers with data passed on a circular bus, and throughput increased by alternating load/store with ALU operations.
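A rough idea of that "alternate load/store with ALU" trick, sketched in plain C rather than actual Larrabee code: software-pipeline the loop so the next iteration's loads are issued while the ALU still works on the current values, which is the single-thread analogue of round-robin threads hiding load latency.

```c
#include <stddef.h>

/* Software-pipelined dot product: overlap the loads of x[i] and y[i]
 * with the multiply-add on the previously loaded pair, instead of
 * stalling on the load before every ALU operation. */
float dot_pipelined(const float *x, const float *y, size_t n)
{
    if (n == 0)
        return 0.f;
    float acc = 0.f;
    float xv = x[0], yv = y[0];        /* prologue: first loads      */
    for (size_t i = 1; i < n; ++i) {
        float nx = x[i], ny = y[i];    /* issue next loads early...  */
        acc += xv * yv;                /* ...while the ALU works     */
        xv = nx;
        yv = ny;
    }
    return acc + xv * yv;              /* epilogue: last pair        */
}
```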
 
Why can't Sony customise the SPE with wider vector units and other additions to make it suitable for a GPU? Then the PS4 could just use a single chip with PPE + SPEs + modified SPEs + cache + texture units, like how Toshiba modified the Cell BE into the SpursEngine by adding extra components, or what Intel did with Larrabee.

I think they need to adopt the strategy from the GPU makers: an X2 sort of product for their hardcore market while not alienating the mass market.
 
The INQ guy has been running the same PS4 rumor for almost a year:
http://www.realworldtech.com/forums/index.cfm?action=detail&id=90374&threadid=90000&roomid=2
If it makes sense in monetary terms (especially in this time of recession), an Intel GPU for the PS4 is as good a deal as the NVIDIA deal for the PS3, or even better. Intel showed its flexibility by selling Atom, which is undermining its own highly profitable high-end CPU market. After all, I believe the next-generation consoles won't be defined by who can push how many shader ops, but by what kind of app they are optimized for, as the continuation of the "Wii shock". If an Intel GPU can do the work, why not?
 
MS+Sony joint venture... that'd be the only way you'd get DX11 on a PlayStation! There is the small possibility of Larrabee cores being included in a Cell chip alongside SPUs, working as the Visualizer component of the original patent. If the software is there to drive them as a GPU system, devs wouldn't have to think about writing code for them any more than they do when targeting RSX. Of course the option would be there to use Larrabee cores for something else, but there's also the option for devs to use the GPU for GPGPU work. If they really want to wrestle with two programming models, good for them!
 
I just can't imagine such a mixed system being produced. If Intel is really in, I'm sure they're producing the whole shebang.
 
I highly doubt it too, but it is still a possibility. nVidia was talking in the early days as if their GPU tech could be combined into a Cell. So if you're going to have a 3rd party (or fourth, in the case of STI + GPU people!) contributing a graphics processing model, it makes no odds whether it's nVidia, ATi or Intel, nor what programming model it uses if the GPU people are providing the tools and software interface.

So if we're going to have a one-chip combined CPU + GPU, or rather a more flexible mass-processing unit, it looks like Intel contributing, and Cell perhaps benefiting from their fabrication technologies, is on the cards. However, I expect if Intel were involved, they'd be pushing an Intel-only solution.
 

A one-chip envelope leaves them out of the performance race. How many transistors against a Xenon 2 + ATI GPU? In that case they would be going the Wii route.
 
That's assuming MS go the high-end expensive route too. If we follow the concerns about limited price reductions due to limited process shrinks over the life of the next generation, starting at a high price would mean ending at a relatively high price. It's just another option to consider.
 
£299 / $350 are ideal price points for consoles (at launch)

In all fairness they should go back to the simple PS2 price/SKU model of console selling, IMO.
 
On Larrabee, I don't think it makes too much sense. First, from a tech POV they don't need a Larrabee, as Cell should be quite good (maybe not as good, but close) at most of the things a Larrabee would do, and it would bring a number of the above-mentioned problems. And for "true GPU tech", Nvidia/ATI/PowerVR should be much better.

So unless they are really competitive on price it doesn't make much sense.

Just one question: does Intel support OpenGL or just DX?

£299 / $350 are ideal price points for consoles (at launch)

That may depend on a lot of factors.

You must consider that they probably don't want a console with a big cost (hard to reduce). Then, if they go a lower-power route, they need to give a perception of value for that money, which is harder, and so on...
 
Did MS go with Xenon because they were determined to go with ATI (after falling out with nVidia) and that meant, for political reasons, Intel was out?

And if MS wanted to retain ATI (for backwards compatibility and other reasons), they couldn't go with Intel again?
 
MS didn't go with Intel because they wanted to own the IP, as is standard in the console world... an issue on which they felt burned last time. Probably none of it had to do with the choice of graphics firm.
 
I find this whole rumor highly unlikely.
Sony can easily extend their SPU design to be a better fit for pixel/texture work. Even Larrabee is known to have some fixed function hardware for texture sampling.
SPU pixel work is possible even now on PS3, but it is rather limited due to the ridiculous limitations on reading RSX's VRAM from the PPU/SPU. Given some fast unified memory, SPUs can do pretty decent work processing pixels. Actually, scratch that: given a decent chunk of EDRAM with read-modify-write access, SPUs can do miracles.
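To see why read-modify-write access matters, consider alpha blending, the classic RMW pixel operation: each output pixel needs the old framebuffer value back before the new one can be written. A minimal C sketch (illustrative only, not SPU code):

```c
#include <stdint.h>
#include <stddef.h>

/* dst' = dst + alpha * (src - dst), in 8-bit fixed point.
 * The blend cannot proceed without first READING dst, which is why
 * slow reads from VRAM cripple this kind of work, and why fast RMW
 * memory (e.g. EDRAM) would let SPUs process pixels efficiently. */
static inline uint8_t blend8(uint8_t dst, uint8_t src, uint8_t alpha)
{
    return (uint8_t)(dst + ((alpha * (src - dst)) >> 8));
}

void blend_span(uint8_t *fb, const uint8_t *src,
                const uint8_t *alpha, size_t n)
{
    for (size_t i = 0; i < n; ++i)
        fb[i] = blend8(fb[i], src[i], alpha[i]); /* read-modify-write */
}
```

Every blended pixel pays the framebuffer read cost once; if that read has to cross to VRAM at PS3-like speeds, the loop is memory-bound no matter how fast the ALU is.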
 

I don't see how having a slow read from VRAM is a big limitation when the RSX can read from XDR.
 
The point was that IF an SPU were to do pixel shading, it would need FAST read AND write access to VRAM. On PS3, PPU/SPU reads from VRAM are VERY slow. You can obviously work in main (XDR) memory and then move the result to VRAM, but those trips add up pretty quickly.
 