Predict: The Next Generation Console Tech

It's a make-believe website posting unsubstantiated make-believe twaddle, and shouldn't be referenced for anything. It shouldn't even be visited.

Well, there are some snippets of truth in there, even if they are just making things up.

1) Larrabee is rather less than likely to be the PS4 GPU.
2) POWER7 cores - the PowerXCell 32iv* was said to have 4 PPEs based on POWER7 technology.


*Interestingly, IBM only said the chip with 2 PPEs was cancelled; they never said anything about the 4-PPE version.


So with that (and the news that Larrabee has been canned), what will be in the PS4 now?
 
The cancelled chip was the successor to the PowerXCell 8i, which is used in IBM's Roadrunner supercomputer. Roadrunner is made up of Opteron and Cell chips, for compatibility with existing Opteron code that some of IBM's customers have. What IBM will probably do is combine Opteron chips with pure-SPE chips, because the Power core on the Cell is redundant when running Opteron-based applications with the SPEs acting as number-crunching accelerators.
 
1) Larrabee is rather less than likely to be the PS4 GPU.
2) POWER7 cores - the PowerXCell 32iv* was said to have 4 PPEs based on POWER7 technology.

1.) Not sure anybody really ever believed it would be (I always thought the idea was a joke).
2.) PowerXCELL was never any more related to POWER7 than it was to a PPC603C. POWER7 is a massive, hyper-expensive server MCM product with no business really even being in a console. We keep getting "Oooo, but it's got DPFP"... and this is going to be *the* defining feature next gen, how?


The reality is, honestly, we don't know anything better now than we did six months ago, other than that IBM itself has lost interest in future revs of CELL (which does not rule out the possibility of a custom CELL regardless), and that it looks like we're realistically back to a toss-up between Nvidia and ATI.
 
I think yesterday's news that Larrabee will initially come out only as an SDK is conclusive proof that the next Cell, used in the PS4, will be so powerful that it will have no problem emulating Larrabee in software!
 
So with that (and the news that Larrabee has been canned), what will be in the PS4 now?

Breaking: Intel canned Larrabee to work on PS4.

PS4 will most likely just use the current Cell for the CPU, but with all 8 SPUs enabled and a higher clock, plus whatever GPU Sony decides to go with. Likely candidates are NV, AMD, PowerVR, and maybe Toshiba.

I hope they go with Toshiba; it'll most likely be underpowered, but at least it'll be some crazy solution that will be interesting to discuss on this board.

If Kutaragi were still with Sony, the PS4 would be easier to predict, because he would hype it like there's no tomorrow. Without him, Sony lacks vision.
 
Well, Larrabee is canned, and SCC will be the only many-core project Intel will push. Too bad it's an even more threatening chip for IBM: many x86 cores, scalable, no extra headache coming from vectorization.
I'm not confident in IBM pushing a heterogeneous chip. Tilera, Intel, IBM? (Even though those products aim at different applications/workloads/markets.)
"One size fits all" sounds like tomorrow's maxim. By the way, I have a question :) (I should add this to my signature...) in regard to texture units.
Say you have a CPU with a VPU (no matter its width) that can work on 64-bit data, twice as many 32-bit values, or four times as many 16-bit values. Would the lack of proper texture units hurt that much? I mean, most texture operations are done at 8-bit precision, so 16-bit precision may be a bit of a waste, but it may make things easier on both the software and hardware side.
What do you think in this regard?
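To be concrete, the kind of inner loop I'm thinking of is bilinear filtering, something like the scalar sketch below (standing in for one 16-bit vector lane; my own toy code, not anything real):

#include <stdint.h>

/* Bilinear filter of an 8-bit texture at a 24.8 fixed-point coordinate.
   Every intermediate fits in 16 bits, which is the point: 16-bit vector
   lanes are wide enough for this kind of work. */
static uint8_t sample_bilinear(const uint8_t *tex, int w, int h,
                               int32_t u, int32_t v)
{
    int x  = u >> 8,   y  = v >> 8;      /* integer texel coords */
    int fx = u & 0xff, fy = v & 0xff;    /* 8-bit fractions      */

    /* clamp addressing at the texture edges */
    int x1 = x + 1 < w ? x + 1 : x;
    int y1 = y + 1 < h ? y + 1 : y;

    int t00 = tex[y  * w + x], t10 = tex[y  * w + x1];
    int t01 = tex[y1 * w + x], t11 = tex[y1 * w + x1];

    /* horizontal then vertical lerp; max intermediate is 255*256 < 2^16 */
    int top = (t00 * (256 - fx) + t10 * fx) >> 8;
    int bot = (t01 * (256 - fx) + t11 * fx) >> 8;
    return (uint8_t)((top * (256 - fy) + bot * fy) >> 8);
}

Though I suspect the arithmetic is the cheap part; dedicated texture units also do address swizzling, format decompression, and latency hiding, so precision isn't the whole story.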
 
Better performing at what kind of tasks? Are these tasks performance-critical for next-gen videogames? Can these tasks be run even faster on a different architecture, like Fermi and/or Larrabee? There are a lot of complex questions involved. I don't see the point of having a Cell-like CPU to run stuff like skinning, motion blur or depth of field, especially when paired with a next-generation GPU.
I'm not particularly knowledgeable when it comes to CPU architectures or programming, so I can't give a specific task or what-not, but basically any kind of task that is amenable to fast vector math. The kinds of jobs could be physics simulation, AI, audio, some rendering tasks, and other CPU jobs for handling data. As for whether or not Larrabee or GPGPU can run these tasks better, I don't know. I assume Cell would perform better than your typical Wintel CPU, but that's as far as I go.
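For instance, something shaped like this toy integration step (my sketch, not from any real engine) is what I mean by vector-friendly - the same math on every element, contiguous data, no branches:

#include <stddef.h>

/* Semi-implicit Euler step over a structure-of-arrays particle pool.
   Uniform arithmetic over contiguous arrays is exactly the shape of
   workload that wide vector units (or SPEs) chew through. */
void integrate(float *px, float *py, float *pz,
               float *vx, float *vy, float *vz,
               size_t n, float dt, float gy)
{
    for (size_t i = 0; i < n; i++) {
        vy[i] += gy * dt;        /* gravity */
        px[i] += vx[i] * dt;
        py[i] += vy[i] * dt;
        pz[i] += vz[i] * dt;
    }
}

Pointer-chasing scripted AI is the opposite case, which is why I hedge on how much of a game frame this really covers.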

OOOE/SMT are two different techniques to hide memory latency and improve utilization of execution units. Pretty much every modern CPU design uses at least one of these techniques.
I understand that OoOE improves utilization by avoiding stalls due to data latency. An OoOE PPE sounds like a good idea to me, but I heard that IBM was looking to pair SPEs with a POWER6 core, which led me to ask the question. I'm also wondering, given an SPE-like memory model, if there's a way to design your jobs so they won't stall. SPEs are in-order, and that's not likely to change for the next iteration of Cell.
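The trick I've seen mentioned is double-buffered DMA: fetch chunk i+1 while crunching chunk i, so the SPU only waits if compute finishes before the transfer does. Something like the sketch below, written from memory of the SDK's MFC intrinsics, so treat it as illustrative:

#include <spu_mfcio.h>

#define CHUNK 4096

static char buf[2][CHUNK] __attribute__((aligned(128)));

void process(char *data, unsigned size);   /* stand-in for the real kernel */

void run_job(unsigned long long ea, unsigned nchunks)
{
    int cur = 0;

    mfc_get(buf[0], ea, CHUNK, 0, 0, 0);   /* kick off the first transfer */

    for (unsigned i = 0; i < nchunks; i++) {
        int nxt = cur ^ 1;

        /* start fetching chunk i+1 on the other tag while... */
        if (i + 1 < nchunks)
            mfc_get(buf[nxt], ea + (i + 1) * (unsigned long long)CHUNK,
                    CHUNK, nxt, 0, 0);

        /* ...we wait only on chunk i's tag, then crunch it */
        mfc_write_tag_mask(1 << cur);
        mfc_read_tag_status_all();
        process(buf[cur], CHUNK);

        cur = nxt;
    }
}

That only works when the access pattern is predictable, though, which is sort of my question for less regular jobs.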
 
1.) Not sure anybody really ever believed it would be (I always thought the idea was a joke).

I found it somewhat unbelievable given the source.

2.) PowerXCELL was never any more related to POWER7 than it was to a PPC603C. POWER7 is a massive, hyper-expensive server MCM product with no business really even being in a console. We keep getting "Oooo, but it's got DPFP"... and this is going to be *the* defining feature next gen, how?

POWER7 will be in a big MCM in some configurations, but in others it'll just be a normal chip in a normal package.

However, the cores themselves don't appear to be very big or power-hungry, and given they're likely considerably faster than the PPE, they could be used as a new PPE.

They've done this before - the PowerPC G5 used a core from POWER4.
 
See, I could see something closer to this - like a few (two) smaller, stripped-down POWER7 cores as the PPE, with maybe 12-16 SPEs in tow - but I wonder if it'd even be relevant. I.e., would POWER7 cores have something more substantial to offer than, say, something from AMD, or another design from IBM? We still have to grapple with how lopsided the hardware might end up (things shifting toward powerful GPUs that largely ignore the CPU). That could figure greatly into what type of CPU we actually end up staring at. We could just end up with a ridiculously powerful GPU paired up with some fairly spindly quad-core AMD of the day, and still have the strongest beast freak on the block (ginormous arms, dangly little legs).
 
So are next-generation console chips likely to be fabbed on SOI, bulk high-k, or just a regular process?

What are the design tradeoffs between targeting SOI, for perhaps lower power usage, and your regular high-k processes? Is it simply a tradeoff between power + cooling and power regulation circuitry vs. increased cost per viable die?

Lastly, if the 32nm SOI process at GlobalFoundries is ramping up now, how long would it take to build up a stockpile of 5-8M units, assuming the chip tapes out late Q1/early Q2? I'm not assuming that it's happening; I just want to know if it is possible.
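My own back-of-the-envelope, where every number is a pure guess rather than anything known about an actual chip:

#include <stdio.h>

int main(void)
{
    /* all assumptions, purely illustrative */
    double wafer_diam_mm = 300.0;    /* 300mm wafers           */
    double die_area_mm2  = 200.0;    /* guess at die size      */
    double yield         = 0.5;      /* guess at early yield   */
    double wafers_per_mo = 20000.0;  /* wafer starts per month */
    double edge_loss     = 0.85;     /* gross die fit factor   */

    double wafer_area = 3.14159 * (wafer_diam_mm / 2) * (wafer_diam_mm / 2);
    double good_die   = wafer_area / die_area_mm2 * edge_loss * yield;
    double per_month  = good_die * wafers_per_mo;

    printf("good die per wafer: %.0f, units per month: %.1fM\n",
           good_die, per_month / 1e6);
    printf("months for 5M: %.1f, for 8M: %.1f\n",
           5e6 / per_month, 8e6 / per_month);
    return 0;
}

On guesses like those it's roughly 3M good chips a month, so a launch stockpile is only a few months of ramp; the real unknowns are the yield figure and how much of the fab's capacity a console customer actually gets.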
 
I would love to see a 1080p 30fps standard next gen. No exceptions. I don't like how some games go below 720p, or how some games go below 30fps.
 
I want to settle on 720p (well, freedom to the developers; I don't doubt we'll see a lot of 1080p titles).
Next gen, consoles will be limited by watts (and then bandwidth); you won't again see a custom version of a high-end chip as on PS3. You can safely rule out a full-size Fermi or a similar AMD offering.

I favor a solid 720p, well anti-aliased (MSAA + transparency supersampling + selective supersampling built into the code of noisy shaders), as more efficient on a 720p display than downscaled 1080p. (Perhaps by then most people will own 720p TVs, followed by 1080p and SD sets, in no particular order.)

Downscaled 1080p is a crappy and expensive 2.25x ordered-grid supersampling ;)
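The 2.25x is just pixel counting, as a trivial check shows:

#include <stdio.h>

int main(void)
{
    int p1080 = 1920 * 1080;   /* 2,073,600 pixels */
    int p720  = 1280 * 720;    /*   921,600 pixels */

    /* rendering 1080p for a 720p display shades 2.25x the
       pixels, on a regular 1.5 x 1.5 grid per output pixel */
    printf("%.2fx\n", (double)p1080 / p720);
    return 0;
}

And it's "ordered grid" because the extra samples sit on a regular 1.5x1.5 lattice, which anti-aliases edges worse per sample than the rotated/sparse patterns MSAA uses.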
 
Something like the 1280x1080 used in GT: Prologue is a nice compromise if you ask me. It delivers really pristine image quality on 1080p displays with simple horizontal scaling, and doesn't come with the full pixel cost of standard 1080p (1.5x the pixels of 720p, versus 2.25x).

A mandate that not a single title can ship without 8x AF is needed as well. Lack of proper texture filtering is the easiest way to make any game look like an ugly mess no matter how many effects it pushes, even more so than aliasing (which a nice resolution like 1280x1080 will go a decent way toward solving anyway), imo.
 
Heck... why not go all the way and have a 1080p 60fps standard ;-)

60 would be nice for sports, fighting, and racing games... but it could be too restrictive for FPS, TPS, or RPGs that set out to market their games on advanced graphics. I can tell the difference between 30 and 60, I think anyone can, but I can tell even more of a difference between 25 and 30, or 20 and 30. That's just me, but it tends to make my head spin.

I want to settle on 720p (well, freedom to the developers; I don't doubt we'll see a lot of 1080p titles).

I read somewhere that 1080p sets lately are outselling 720p, mainly because the cost of TVs has come down so much that 1080p is a lot more viable. 720p made sense this gen (2005), but if we're talking 2011 or 2012 for next gen, and most likely a 7-year life cycle (2018-2019), it makes a lot more sense to go 1080p, regardless of whether you have a market base still on tubes or a huge chunk on 720p.

Gears of War looked fantastic on my 480p set. The downscale acted as supersampling and gave some nice FSAA. In fact, the game looked somewhat worse to me when I played it on my 720p set.

I personally can't stand the look of 1280x720 stretched past 32 inches, and I won't be able to bear another generation of it.
 
Aren't you going to be rendering 2.25x the pixels again? That would require a 2.25x leap in hardware power just to stay where we are right now.

No, I think 720p will be the standard, with loads of FSAA.
 
How about some different standards? I say 720p minimum; no tearing is a must, max 5% dropped frames in 30 Hz games, and max 10% dropped frames in 60 Hz games. As long as these conditions are met, just optimize for the highest resolution at your target refresh rate, with 720p at 30 Hz as the minimum.
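To put rough numbers on those budgets (my arithmetic, over a fixed one-minute window):

#include <stdio.h>

int main(void)
{
    /* dropped-frame budgets over one minute at each target rate */
    printf("30 Hz at 5%%:  %.0f drops/min allowed\n", 30 * 60 * 0.05);
    printf("60 Hz at 10%%: %.0f drops/min allowed\n", 60 * 60 * 0.10);
    return 0;
}

That's 90 and 360 frames a minute, respectively.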
 