Predict: The Next Generation Console Tech

Status
Not open for further replies.
Nothing to do with the rumours, but how about two Jaguar-based APUs on the same package connected by something like HyperTransport (dual-die Opteron style), where the APUs could generate jobs for each other, e.g. if a polygon covered a tile boundary, send the post-transform vertex data and shader program to the other GPU.

Then you could have two identical chips, each with a 128-bit DDR4 bus (allowing fast cross-die access, Opteron style) and use either traditional tile rendering or AFR rendering. Stick 32 MB of eDRAM on each die and you'd have a huge tile or AFR framebuffer available.

In time you could include both in one die with minimal inefficiency - Jaguar clusters will presumably need some kind of interface anyway even if they're on the same die from day one.
 
Just curious.
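The tile hand-off idea above can be sketched in a few lines. Everything here (the two-way screen split, the tile width, the triangle representation) is invented purely for illustration, not a description of any real hardware:

```python
# Hypothetical sketch: deciding whether a post-transform triangle must be
# forwarded to the other APU because it crosses a screen-tile boundary.
# Assumes the screen is split into a left tile (GPU 0) and right tile (GPU 1).

TILE_W = 1920 // 2  # assumed boundary at x = 960

def owner_tile(x):
    """Return 0 for the left GPU's tile, 1 for the right GPU's tile."""
    return 0 if x < TILE_W else 1

def gpus_covered(triangle):
    """Given a triangle as [(x, y), ...] screen-space vertices, return the
    set of GPUs whose tiles the triangle's bounding box touches."""
    xs = [v[0] for v in triangle]
    return {owner_tile(min(xs)), owner_tile(max(xs))}

tri = [(900, 100), (1000, 150), (950, 300)]  # straddles x = 960
print(gpus_covered(tri))  # {0, 1}: send vertex data + shader to the other GPU
```

If the returned set contains both GPUs, the submitting APU would ship the post-transform data across the die-to-die link, as described above.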

We have the 7970M outputting about 29 GFLOPS per watt of TDP, and the upcoming Sea Islands HD 8850/8870 are rumored to output 23-25 GFLOPS per watt.

If the next Xbox isn't going to use an off-the-shelf PC GPU, what kind of GFLOPS-per-watt numbers should we expect? Over 30?
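The arithmetic behind these GFLOPS-per-watt figures is simple. The 7970M shader count and clock below are its commonly quoted specs; the 75 W TDP is an assumption chosen because it reproduces the ~29 figure cited above:

```python
# Peak single-precision FLOPS = shaders * 2 (one FMA = 2 FLOPs) * clock.
# 7970M: 1280 shaders at 850 MHz (commonly quoted specs); 75 W TDP assumed.

def peak_gflops(shaders, clock_mhz):
    return shaders * 2 * clock_mhz / 1000  # GFLOPS

gflops = peak_gflops(1280, 850)
print(gflops)                 # 2176.0
print(round(gflops / 75, 1))  # 29.0 GFLOPS per assumed TDP watt
```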

Much less. The high-end mobile GPUs can reach such good efficiencies only through binning. Laptops with ~70W GPUs sell sufficiently small numbers that it's ok if only a few percent of all chips made make the top bin for them. All the rest are sold as discrete cards. Consoles can't really benefit from binning all that much. 2-chip consoles can benefit a little, by putting a high consumption gpu with low consumption cpu and vice versa. A solution with two symmetrical APUs would also be good for this.

Generally, when comparing with PC parts, you should always look at the worst version of a chip to think how it would work in a console. MS surely isn't going to manufacture 2x the working chips they need and trash half of them to get to the level of chip quality that the higher-end gpus have.

Sea Islands is not going to get a new process, so we shouldn't expect miracles from its performance.
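The binning point can be illustrated with a toy simulation (all numbers invented): if per-chip power draw varies around a mean, only a small tail of the distribution qualifies for a strict low-power top bin, while a console, which must use nearly every working die, gets the whole distribution:

```python
# Toy binning model: per-die power draw drawn from a normal distribution.
# The 100 W mean, 8 W spread, and 85 W bin cutoff are all made up.
import random

random.seed(0)
power = [random.gauss(100, 8) for _ in range(100_000)]  # watts per die

top_bin = [p for p in power if p < 85]  # mobile top bin: strict power cap
print(f"top bin: {100 * len(top_bin) / len(power):.1f}% of dies")
print(f"average die: {sum(power) / len(power):.0f} W")
```

With these made-up numbers, only a few percent of dies land in the top bin, matching the "few percent" quoted above for high-end mobile parts.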
 
Is it possible that Durango / Orbis will be using 32nm/28nm SOI? Could that lower the TDP by a significant amount compared to 28nm bulk, achieving high-end PC performance at <200W?
 
Is it possible that Durango / Orbis will be using 32nm/28nm SOI? Could that lower the TDP by a significant amount compared to 28nm bulk, achieving high-end PC performance at <200W?

I don't see how that's effective, because companies making standalone $500 GPUs don't use SOI. Maybe if they used SOI now and switched to bulk on shrinks? But that idea seems far-fetched.
 
What if Sony is planning for Orbis to have a 5-year life like the PS2? If so, how powerful can the hardware in Orbis be while Sony still profits from day 1? Also, we see the most visually and technologically impressive games on a console in its last years. Even an HD 6850 and a quad-core processor could give a good graphical bump over the PS3 and last for 5 years!
 
Is it possible that Durango / Orbis will be using 32nm/28nm SOI? Could that lower the TDP by a significant amount compared to 28nm bulk, achieving high-end PC performance at <200W?

SOI doesn't help that much anymore, and it wasn't that big a boon when it did help. Its best times were adding ~10% better CPU clock speed for the same power a few nodes back. It's definitely not going to take a system running at ~400-500W or more like a high-end PC and dropping several hundred watts.

GPUs, aside from the somewhat disastrous turn with Llano and Trinity, don't use SOI because they've historically been on bulk, and the benefits of SOI didn't justify the additional design work and increased cost per wafer.
The picture isn't better with console components that sell in volume and can't ask for several hundred dollars just for the GPU board.

AMD is also going bulk-only from here on; doing so will save it $20 million a quarter, because it had to pay extra to GlobalFoundries for research into a process tech nobody else cared about.
 
AMD is also going bulk-only from here on; doing so will save it $20 million a quarter, because it had to pay extra to GlobalFoundries for research into a process tech nobody else cared about.
Seeing as bulk planar 20nm is shaping up to be a colossal bust, SOI (in the form of FDSOI) is seeing some renewed interest ... everyone is running out of options, it's FinFETs or FDSOI ... planar bulk is dead.
 
I tried this in my translator using Google Chrome, but it's hard to understand.

Does anyone know Japanese so we can translate it correctly?

This is about the PS4 and the rushed job of the Xbox 360.
http://pc.watch.impress.co.jp/docs/column/kaigai/20120608_538586.html

Also a big Xbox executive was hired in March 2010 for the next Xbox Architecture.
Look at some of his awards. Really impressive.

From LinkedIn:

Eric Mejdrich
Sr Director of SOC Architecture and Principal Architect Xbox at Microsoft
http://www.linkedin.com/pub/eric-mejdrich/1b/108/214
 
A couple of quick thoughts. On 32 vs 28 nm, something that may affect the decision is IBM's involvement (if there is any) and use of any embedded RAM. IBM has done a lot of work with regards to embedding RAM on the 32nm node. I'm not sure how mature similar solutions are at 28nm.

The other thing I keep thinking about is that MS has hired chip/SoC architects, which to me would indicate more involvement in the chip/SoC design than just getting an off-the-shelf Jaguar APU. I'm beginning to think we will see something really custom from MS. It could still be x86 based, but it wouldn't surprise me if it didn't resemble an off-the-shelf solution.
 
What if Sony is planning for Orbis to have a 5-year life like the PS2? If so, how powerful can the hardware in Orbis be while Sony still profits from day 1? Also, we see the most visually and technologically impressive games on a console in its last years. Even an HD 6850 and a quad-core processor could give a good graphical bump over the PS3 and last for 5 years!

The problem is competition from Microsoft, and that is great for us gamers. Actually, if there is one thing that makes me optimistic about next-gen hardware, it's the competition between Microsoft and Sony; neither company can ignore the other's plans. Sony fears that if they release weak PS4 hardware, Microsoft could surprise them and release a very powerful next Xbox, putting them in a very uncomfortable position, and the opposite is also true. The only solution I see for both companies is simply what I believe they are doing nowadays:

1- not finalizing their hardware decisions until the very end (maybe just 6 months before the hardware release date, which means final SDKs most probably not before June 2013)

2- while at the same time trying to spy on the other company's hardware specifications via chats with multiplatform developers, asking them questions like: what would you like us to improve in our hardware to make multiplatform development easier for you? The answers would reveal the competitor's plans and allow correction of strategic errors (amount of RAM, GPU power, etc.)

That's why I predict we won't see a huge difference in overall capabilities between the PS4 and the next Xbox, because that would be suicidal for both of them; each one will try to copy the other, or at least try not to fall far behind.
 
1- not finalizing their hardware decisions until the very end (maybe just 6 months before the hardware release date, which means final SDKs most probably not before June 2013)

That's pretty much impossible. I would be very very surprised if the hardware specs aren't yet finalized for the next gen. A little wiggle room with regards to clock speed notwithstanding, I think they're pretty much set in what they'll release next year. That's just my guess though.
 
"AMD has virtually no presence in the fast-growing tablet market, and currently sells most of its chips into the declining PC and graphics markets. AMD currently offers the Z-series chips for tablets, which have performed poorly in the market, and the Jaguar chip could provide a spark to AMD's tablet ambitions.

AMD's tablet rivals include ARM, which dominates the market and whose processor designs are used in Apple's iPad, Google's Galaxy Nexus 7 and Amazon's Kindle Fire. AMD will also compete with Intel, which is expected to release a low-power Atom processor code-named Clover Trail later this year. Intel says 20 tablet designs are in the works based on Clover Trail."
http://www.electronics-eetimes.com/...-applications.html?cmp_id=7&news_id=222913711



"Advanced Micro Devices will describe Jaguar, a low-power x86 core for notebooks, tablets and embedded systems at Hot Chips. Jaguar packs four x86 cores into one unit with a large shared L2 cache to compete both with Intel’s Core and Atom chips.

In a separate keynote talk, AMD will announce a follow-on for its HyperTransport processor interconnect. Freedom Fabric aims to link thousands of cores at more than a terabit/second, likely based on technology acquired from SeaMicro.

AMD is expected to try to make Freedom Fabric an industry standard across x86, graphics and ARM cores, competing with the proprietary Quick Path Interconnect on Intel’s CPUs. Last week, the RapidIO Trade Association said it is trying to get ARM and its SoC partners to adopt its technology as a processor interconnect.

As for the Jaguar core, AMD predicts that based on simulations it will deliver more than ten percent higher frequencies and more than 15 percent more instructions per clock than Bobcat, its current low power x86 core. Jaguar will appear in 2013 in AMD’s Kabini SoC for low-power notebooks and in Temash, AMD’s first sub-5W SoC, aimed at tablets."

http://www.electronics-eetimes.com/...-applications.html?cmp_id=7&news_id=222913711
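A quick back-of-envelope on AMD's Jaguar-vs-Bobcat claim quoted above: ">10% higher frequencies" and ">15% more instructions per clock" compound multiplicatively, since per-core throughput scales as IPC times clock:

```python
# Combining AMD's two quoted gains (treated as point estimates here):
# throughput ~= IPC * clock, so the gains multiply rather than add.
freq_gain = 1.10  # ">10% higher frequencies"
ipc_gain = 1.15   # ">15% more instructions per clock"
print(f"per-core uplift: {(freq_gain * ipc_gain - 1) * 100:.1f}%")  # ~26.5%
```

So the quoted figures imply roughly a quarter more single-threaded performance than Bobcat, as a lower bound.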


I have one question:
Why does almost everyone on this forum tend to believe there will be an AMD Jaguar chip in every PS4/next Xbox? How could Microsoft and Sony make a deal with a company that has almost no real experience with tablet processors? Why not Intel or ARM? And why use a tablet processor for a dedicated home console?

" AMD predicts that based on simulations it will deliver more than ten percent higher frequencies and more than 15 percent more instructions per clock than Bobcat,"
I don't believe the next Xbox or PS4 would use such an underpowered processor; there are plenty of better alternatives :rolleyes:
 
I don't believe the next Xbox or PS4 would use such an underpowered processor; there are plenty of better alternatives :rolleyes:

What alternatives are you thinking about? Available ARM cores are slower than Jaguar. A deal with Intel is probably not a viable option. Bulldozer has inferior performance per watt compared to Jaguar. PowerPC A2 has lower single-thread performance than Jaguar. A quad-core or six-core Power7 derivative might be great, but I am not sure IBM would design something like that. The other option is Microsoft hiring a team to design their own high-performance ARM CPU, but that's risky and unlikely.

By the way, even if the CPU is Jaguar based, it might be heavily customized.
 
I've grown used to that idea; it's at least a lot faster than Wii and Xenon.
There aren't that many faster CPUs: first there's Intel's *bridge and *well parts, but Intel doesn't share its tech much; then the Bulldozer iterations, which are too fat for a console APU; then maybe Fujitsu SPARC, POWER7+ and Itanium. I think I haven't forgotten anything. Yes, I forgot the Z mainframe.

Maybe the high expectations come from Intel being so incredibly advanced, so you'd think you'd have something comparable to a 2500K or 3770 in a late-2013 console. As for the other chips mentioned, they live in a high-cost, high-margin market, and most of those options (including Intel, even Bulldozer) imply passing on the opportunity of using an APU.

Hell, an alternative is Nvidia's Maxwell, with its custom ARMv8 CPUs, but it's too late for this new gen.
 
I dont believe xboxnext or ps4 would use such an underpowered processor, there are plenty of better alternatives :rolleyes:

Also: it has low paper flops but is good at actually delivering those flops. If we can believe eight cores will be used, you have a decent number of reasonably usable flops.
After reading about the unbelievably low IPC in Xenon, I think we had it worse last time. Consoles have always had an underpowered CPU: the N64 didn't use a Pentium Pro and the GameCube didn't have a G4 (Nintendo examples are maybe a bad choice, but I hope I'm understood :))
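For reference, the "paper flops" of an 8-core Jaguar work out to a reasonable figure. The clock speed below is an assumption, and the per-cycle throughput comes from how Jaguar is commonly described (one 128-bit SSE multiply plus one 128-bit SSE add per cycle, i.e. 8 single-precision FLOPs):

```python
# Back-of-envelope peak for a hypothetical 8-core Jaguar CPU.
# Clock is assumed; 8 SP FLOPs/cycle = 128-bit mul + 128-bit add (4 + 4).
cores = 8
flops_per_cycle = 8
clock_ghz = 1.6  # assumed

print(cores * flops_per_cycle * clock_ghz)  # ~102 peak SP GFLOPS
```

Roughly 100 GFLOPS of peak: low on paper next to a GPU, but, as noted above, flops a conventional out-of-order core can actually deliver.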
 
I think it is really a matter of money, and we are speaking of a lot of money.
It (apparently) cost Nintendo 1 billion for a major Broadway revision. The whole point is how much AMD asked for an existing CPU (or one they are committed to develop, in our case Jaguar) and the matching GPU, given that they have the existing tech and know how to link both (in the case of an APU) or come up with a modification of an existing IP.
On the other hand, how much would it cost to negotiate the deal with multiple actors?
I think it is really likely that if:
AMD asks C for the CPU and GPU
IBM asks A for a custom CPU
AMD asks B for a custom GPU
then we have:
A + B > C
And that is without factoring in that IBM doesn't have a suitable off-the-shelf CPU, so there would be a need for a really custom part (most likely costing more than Jaguar).
 
I've grown used to that idea; it's at least a lot faster than Wii and Xenon.
There aren't that many faster CPUs: first there's Intel's *bridge and *well parts, but Intel doesn't share its tech much; then the Bulldozer iterations, which are too fat for a console APU; then maybe Fujitsu SPARC, POWER7+ and Itanium. I think I haven't forgotten anything. Yes, I forgot the Z mainframe.

Maybe the high expectations come from Intel being so incredibly advanced, so you'd think you'd have something comparable to a 2500K or 3770 in a late-2013 console. As for the other chips mentioned, they live in a high-cost, high-margin market, and most of those options (including Intel, even Bulldozer) imply passing on the opportunity of using an APU.

Hell, an alternative is Nvidia's Maxwell, with its custom ARMv8 CPUs, but it's too late for this new gen.

The initial rumors on the Xbox 720 dev kit claimed it used an Intel quad-core chip with hyperthreading (4 physical cores, 8 logical threads, likely a Sandy Bridge i7). That was followed by rumors that it switched to an 8-core AMD processor. If those rumors are to be believed, I think it makes sense that whatever processor MS chooses will be similar in single-threaded performance. I doubt ARM or IBM are in the picture anymore, at least not as the main CPU.
 
Seeing as bulk planar 20nm is shaping up to be a colossal bust, SOI (in the form of FDSOI) is seeing some renewed interest ... everyone is running out of options, it's FinFETs or FDSOI ... planar bulk is dead.

If AMD goes that route, it will be because GF offers FD-SOI as a general process offering. AMD's problem with the PD-SOI process was that it was AMD-only, and it had to pay extra for the privilege (and is now paying hundreds of millions of dollars this year to either not use it or underutilize it).

"In a separate keynote talk, AMD will announce a follow-on for its HyperTransport processor interconnect. Freedom Fabric aims to link thousands of cores at more than a terabit/second, likely based on technology acquired from SeaMicro."
The fabric doesn't replace coherent Hypertransport, so it's not a processor interconnect that AMD should claim as linking cores. They're either linked because the cores are on the same chip, or they have no idea of the existence of the other cores.
For consoles, there's no need to virtualize resources like the shared IO and storage planes of a Seamicro server rack.
Non-coherent system traffic can use PCIe. This is trivially so because CPUs in Seamicro boards use PCIe to communicate with the FF ASIC.
It's not relevant to consoles.

I have one question:
Why does almost everyone on this forum tend to believe there will be an AMD Jaguar chip in every PS4/next Xbox? How could Microsoft and Sony make a deal with a company that has almost no real experience with tablet processors? Why not Intel or ARM? And why use a tablet processor for a dedicated home console?
ARM isn't better, at least not for consoles. It doesn't really reach for higher (for them) performance until 2014 at the earliest. ARM has no history of strong GPU capability, or of good memory and system-level performance at the level of a console, or really for tablets and phones for that matter.
Intel doesn't have all that outstanding of a GPU, although its cores and general system infrastructure are among the best for client systems--if you want to pay the going rate for the best.
IBM does have decent cores and decent system implementation, but no graphics or media processor presence.

AMD has a good GPU, it has decent system interconnect experience, and its CPUs are still better than the usual console fare.
That, and AMD is desperate for cash, so they probably promised to do a lot of computationally kinky things for the money.

I don't have concrete info that the consoles will use Jaguar, but let's remember that the money and performance bar for console chips isn't that high. Jaguar in this context isn't half-bad.
 