Predict: The Next Generation Console Tech

what kind of CPU+GPU do you guys expect if MS really goes for Intel+NVIDIA?

if microsoft wants many low-power cores, they could go for an 8-core Silvermont-based (next-gen Atom) chip, especially if it supports AVX like AMD's Jaguar cores will. If they want fewer but more powerful cores, they could go with something like a 4-core/8-thread Haswell-based chip (a quad-core Haswell at reasonable clocks, without the integrated graphics, should be pretty power efficient, even at load).

on NVIDIA's side, how would a cut-down GK104 fare as a console GPU?
the mobile 100W TDP GTX 680M has 1344 SPs @ 720 MHz.
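For scale, a quick back-of-the-envelope sketch (my own arithmetic, not from any source), assuming Kepler's usual 2 FLOPs per SP per cycle from the fused multiply-add units:

Code:
/* Peak FP32 throughput for the GTX 680M figures quoted above.
   Peak only, not sustained; assumes 2 FLOPs/SP/cycle (FMA). */
#include <stdio.h>

int main(void)
{
    const double sps = 1344.0;              /* shader processors */
    const double clock_ghz = 0.72;          /* 720 MHz */
    const double flops_per_sp_cycle = 2.0;  /* fused multiply-add */

    double gflops = sps * clock_ghz * flops_per_sp_cycle;
    printf("Peak: %.0f GFLOPS (~%.2f TFLOPS)\n", gflops, gflops / 1000.0);
    return 0;
}

That works out to roughly 1.94 TFLOPS, a fair bit under the desktop GTX 680's ~3.1 TFLOPS.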

eight-core Atom is the single worst CPU on earth you can imagine for a console, and I believe its existence was already hinted at months ago. it's a server CPU for doing a huge number of small and repetitive tasks, such as being a high-traffic web front end for the machines which do the real work behind the scenes.

a dual core "real" Intel CPU will murder it; also, any (imaginary) AVX support would just be serially executing stuff, with the instructions being there for the sake of compatibility (and the Ivy Bridge Pentium has AVX disabled just to piss customers off)

now a quad core Intel, anything that ends in "bridge" or "well" (probably the latter), would of course be incredibly fast, with no need to cripple it by removing SMT either. the only embarrassing thing would be selling it to Microsoft at a fraction of the price of the equivalent desktop CPU: what if the whole console is $299 but the 4C/8T desktop CPU costs almost as much? :) just a slight marketing problem.

pretty tempting is the idea of, say, two *-well cores and 16 Xeon Phi cores (for instance), but you have to feed it with bandwidth, and really, game developers won't necessarily like a Cell situation again.
one console can differentiate itself by getting such many-core capacity, but it will be expensive, taking budget from the GPU, and the cores will just sit unused.
 
Has someone with info that you know said GF is fabricating the chips? Because I read an SA article mentioning that Durango chips are in production, and that must have been half a year ago at least. I don't think that's credible, and I don't think the IGN rumor suggesting something similar was credible.

I've either seen it indicated or rumored in the past. It's been rather consistent, as I've yet to see any mention of TSMC involvement. As for SA's/IGN's info, some of us have speculated Oban is related to the Wii U. Both have a lot of coincidences, right down to Oban being a Japanese coin.

This though... He has the dev kit, confirmed to be legit by some developers working on it, and you can't get any more credible than that, I would assume.

But the only thing confirmed in that article was that he had the dev kit. In the section in question, even Leadbetter starts off with...

Beyond that, further information is sketchy and unreliable.
And ends with...

The hardware configuration seems difficult to believe as it is so divorced from the technological make-up of the current Xbox 360, and we could find no corroborative sources to establish the Intel/NVIDIA hook-up, let alone the eight-core CPU.
And then there is this section:

The presence on this screen of the "immintrin" element strongly suggests that the Durango coding environment is built around x86 CPU architecture, supporting the AVX (advanced vector extensions) instruction set that was added in last year's Sandy Bridge revision. However, AVX is now supported on some of the most recent AMD processors too.
http://www.tomshardware.com/news/amd-bobcat-jaguar-kabini-avx,16384.html

A note posted by an AMD developer suggests that the Kabini processor, a 28 nm mobile chip that will integrate the Jaguar processing core, will support Intel's AVX instruction set, as well as bit-manipulation instructions (BMI).

The quote: "We are expecting some changes to tunings and costs. We will update them in near future. There are ISA changes as well like btver2 supports AVX, BMI."

It was recently reported that AMD scrapped SSE5 in favor of AVX, which likely also brings SSE4 to Jaguar. Jaguar's predecessor Bobcat only supports SSE3.



Kabini is expected to be released in 2013 as a low-price and low-power mobile processor with up to four cores and a power envelope spanning 9 to 25 watts. The program of the upcoming Hot Chips 24 conference in Cupertino indicates that AMD will be revealing more information about Jaguar at the end of August.
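Regarding the "immintrin" element in that Digital Foundry screenshot: immintrin.h is the standard x86 intrinsics header, and AVX code written against it looks something like this minimal sketch (purely illustrative; nothing here is from the actual Durango toolchain):

Code:
/* Minimal AVX example using immintrin.h: adds two float arrays
   eight elements at a time with 256-bit registers.
   Assumes n is a multiple of 8 and 32-byte-aligned pointers. */
#include <immintrin.h>

void add_arrays(const float *a, const float *b, float *out, int n)
{
    for (int i = 0; i < n; i += 8) {
        __m256 va = _mm256_load_ps(a + i);
        __m256 vb = _mm256_load_ps(b + i);
        _mm256_store_ps(out + i, _mm256_add_ps(va, vb));
    }
}

The same source compiles for any AVX-capable x86 part, Intel or AMD, which is exactly why the header alone doesn't pin down the CPU vendor.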
For me, I would need to see more people saying the same thing to believe it right now, because I've seen nothing publicly or privately about MS going with Intel and nVidia. Although I'm biased towards those two, so I wouldn't mind seeing it happen. :LOL:

But while I like them for a PC, their business practices haven't been kind to their hardware partners. And I'd rather see MS get a good deal on their hardware than have something that puts them in a bad position down the road.
 
Forbes, for whatever reason, watches internet forums like GAF on a regular basis. They have had other stories taken directly from forums.


Forbes is a content farm. Pretty much anyone can post a story there. It's why their game "coverage" so frequently sounds like fanboy nonsense: because it is. And they get paid for clicks, which leads to populist pandering, and there is little to no editorial oversight.
 
pretty tempting is the idea of, say, two *-well cores and 16 Xeon Phi cores (for instance), but you have to feed it with bandwidth, and really, game developers won't necessarily like a Cell situation again.
one console can differentiate itself by getting such many-core capacity, but it will be expensive, taking budget from the GPU, and the cores will just sit unused.

Exotic, but nevertheless interesting. :smile:
Would an 8C/16T Silvermont-based CPU really be that bad in a console? There is not much info about this architecture yet, but it's known that it will be an OoO design and should bring a serious IPC boost over the weak Bonnell cores in today's Atom. Jaguar is also a low-power design, and it's rumoured to be inside the next PS. I agree that 2-4 Haswell cores (and Steamroller cores for the PS4) would still be a better choice.
 
No, but having the exact same console as your competitor doesn't make much sense from a marketing point of view. I doubt MS would be comfortable going against Sony with just two differences: the controller, and having to pay for online.

When have they ever actually marketed the actual specs? They'll both paste a lot of bullet points on a PowerPoint presentation, one will have an Xbox logo, the other will have a PlayStation logo, and they will both play games and do a bunch of other crap. If the controller you're talking about is Kinect, I think that's plenty of difference for MS to run with. Do you really believe the average consumer gives a damn who made the parts inside the box? What the box does is what matters, and that's what they will continue to market.

And the 670 doesn't beat Pitcairn in perf/watt, not that it matters. It's close, but so is the 7950. The real point I'm trying to make is that they won't just use something different for the sake of being different; if the part is a good value, it'll actually be a nice win for 3rd-party devs.
 
Alpha Kit = (8-core Intel CPU + Nvidia GPU)
Beta Kit = (8-core ARM CPU + Nvidia GPU), aka Project Denver, aka NV3A

If Nvidia didn't have the ARM 64 cores ready and Microsoft wanted to start work on early kits, they just used some Intel cores to get things going.
 
Does the idea of putting in one big core for all the single-threaded stuff plus multiple little cores actually make sense? After the first one or two fat cores, do you really benefit from having more of the same, or can you do more with many smaller cores?
 
No, but having the exact same console as your competitor doesn't make much sense from a marketing point of view.
Actually, there's a case for it: you're no worse. ;) Sell a little bit cheaper, or with a different set of experiences, and you don't need to get bogged down in a stupid tech war amongst gamers who don't understand the specs. If MS lost some customers this gen because some people believed the PS3 was more powerful than the XB360, that wouldn't happen if they are the same box. Many branded consumer items are the same, only badged differently, and it would work just as well for marketing a console if there's no room any more for exotic hardware and fancy-sounding component names to sell them on.
 
Yeah, like the iPhone 4 and the Samsung Galaxy... the same chips (more or less), just different packaging and screens. Both were mildly successful, if I remember correctly.
 
eight-core Atom is the single worst CPU on earth you can imagine for a console, and I believe its existence was already hinted at months ago. it's a server CPU for doing a huge number of small and repetitive tasks, such as being a high-traffic web front end for the machines which do the real work behind the scenes.

a dual core "real" Intel CPU will murder it; also, any (imaginary) AVX support would just be serially executing stuff, with the instructions being there for the sake of compatibility (and the Ivy Bridge Pentium has AVX disabled just to piss customers off)

pretty tempting is the idea of, say, two *-well cores and 16 Xeon Phi cores (for instance), but you have to feed it with bandwidth, and really, game developers won't necessarily like a Cell situation again.
one console can differentiate itself by getting such many-core capacity, but it will be expensive, taking budget from the GPU, and the cores will just sit unused.
Why would an 8-core (16-thread) Atom derivative with AVX (256-bit vector units) be the worst thing ever while Xeon Phi cores would be great? Both are simple x86 cores with in-order execution (serially executing instructions), wide vector units and hyperthreading (to fight memory and pipeline stalls). The original Pentium (P54) core (which is the basis for Xeon Phi) isn't that much different from an Atom core.

An 8-core Atom derivative (8C/16T) with AVX would offer 4x the peak vector processing performance of a "real dual core" (2C/4T) Sandy/Ivy Bridge (assuming full-width AVX units). Of course, real performance is much closer to peak performance on a modern out-of-order core than on a simple in-order core (Atom or Xeon Phi).

I don't think it would be that bad an idea for gaming-oriented hardware to have a CPU with lots of simple cores and wide vector units on each core. What made Cell programming hard wasn't the core count, but the lack of automatic coherent caches (no cached addressing of main memory) and the difference in instruction sets (the SPU was designed for pure vector processing).
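To make the 4x figure concrete, the peak arithmetic looks like this (a sketch under the same assumptions: full-width 256-bit AVX units, eight fp32 lanes per core, identical per-core issue rates and clocks, so only the core count differs):

Code:
/* Peak fp32 vector lanes per cycle behind the "4x" comparison.
   Assumes both chips have full-width 256-bit AVX units and equal
   clocks/issue rates; only the core count differs. */
#include <stdio.h>

int main(void)
{
    const int lanes = 8;         /* 256-bit AVX / 32-bit floats */
    const int atom_cores = 8;    /* hypothetical 8C/16T Atom derivative */
    const int bridge_cores = 2;  /* "real dual core" (2C/4T) Sandy/Ivy */

    int atom_peak = atom_cores * lanes;
    int bridge_peak = bridge_cores * lanes;
    printf("%d vs %d lanes/cycle -> %dx\n",
           atom_peak, bridge_peak, atom_peak / bridge_peak);
    return 0;
}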
eight-core Atom is (...) CPU for doing a huge number of small and repetitive tasks
But that's exactly what scalable, data-driven game engines are doing. A CPU with lots of cores and wide vector units would be perfectly suited for that kind of processing.
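Something along these lines (a toy sketch of the pattern, with invented names, not any real engine's code): the frame's work gets cut into small independent batches over structure-of-arrays data, and any core can grab one:

Code:
/* Toy sketch of a data-driven engine task. Each batch is a small,
   repetitive, cache-friendly job with no cross-batch dependencies,
   so the inner loop vectorises cleanly and any core can run it. */
typedef struct {
    float *pos_x, *pos_y, *vel_x, *vel_y;  /* structure-of-arrays */
    int first, count;                      /* the slice this batch owns */
} Batch;

void integrate_batch(Batch *b, float dt)
{
    for (int i = b->first; i < b->first + b->count; ++i) {
        b->pos_x[i] += b->vel_x[i] * dt;
        b->pos_y[i] += b->vel_y[i] * dt;
    }
}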
Does the idea of putting one big core for all the single threaded stuff plus multiple little cores actually make sense? After the first one or two fat cores do you really benefit from having more of the same or can you do more with less and more?
That would actually make a lot of sense. Not all problems are easy to parallelize, and some algorithms benefit much more from the functionality offered by a complex core: branch prediction, out-of-order execution, efficient store forwarding, prefetching, etc. However, you wouldn't want to go the Cell route of completely separate cores with different instruction sets and no cache coherency. For a programmer it is more efficient to have a unified address space (all memory locations can be addressed from all threads) and a single instruction set (you can run the same binary code on any core and have perfect interoperation).

However, if coherent caches and identical instruction sets are required, you of course have to design these cores together. Gluing together different kinds of cores (like in Cell) wouldn't be possible. Atom and Sandy/Ivy do not have identical cache systems or instruction sets, and neither do Piledriver and Bobcat. ARM's A7/A15 however do, and ARM has already announced this technology (http://www.arm.com/products/processors/technologies/biglittleprocessing.php).
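One practical upside of the single-ISA requirement: the same binary can probe features at run time and pick the right path on whatever core it lands on. A minimal sketch using GCC's <cpuid.h> (the two kernels are hypothetical stand-ins; a complete AVX check would also test the OSXSAVE bit and XGETBV, omitted here for brevity):

Code:
/* One binary, run-time dispatch: checks the AVX bit via
   CPUID leaf 1, ECX bit 28, then picks a code path. */
#include <cpuid.h>
#include <stdio.h>

static void kernel_avx(void)    { puts("wide-vector path"); }
static void kernel_scalar(void) { puts("scalar fallback"); }

int main(void)
{
    unsigned eax, ebx, ecx, edx;
    if (__get_cpuid(1, &eax, &ebx, &ecx, &edx) && (ecx & (1u << 28)))
        kernel_avx();     /* the same binary still works if AVX is absent */
    else
        kernel_scalar();
    return 0;
}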
 
Very interesting turn of events here, with Nvidia back in the game, pun intended. But can we rule out the possibility of Sony going with Jen-Hsun Huang again this time, or are they very much set in stone with AMD? At this rate, a Cell 2 with 24 SPEs inside a PS4 might just pop up next week.
 
It's odd to suddenly have everything you think you know about next gen upended, but I still think the BG/lherre/bkilian trio have painted a pretty consistent and likely picture.

Even DaE's twitter today has some tweet seeming to say the Durango is an AMD box (yeh, he really doesn't know what he's talking about lol). So our one source for Nvidia/Intel isn't even saying that, apparently.
DaE @superDaE

Also for the record, its an AMD. #durange #car @IGN I never said it ONLY had 8GB of memory.
 
Could it be possible that, IF there really is an Nvidia GPU in those Durango dev kits, it's only there because all of the current "next gen" tech is developed on Nvidia hardware?

Would it make sense to give developers hardware closer to what they are currently developing on, so they can begin work as early as possible, and then switch to AMD hardware when it's further along in development?
 
It's odd to suddenly have everything you think you know about next gen upended, but I still think the BG/lherre/bkilian trio have painted a pretty consistent and likely picture.

Even DaE's twitter today has some tweet seeming to say the Durango is an AMD box (yeh, he really doesn't know what he's talking about lol). So our one source for Nvidia/Intel isn't even saying that, apparently.

He's definitely an interesting character.
 
I think he is trolling

 
Could it be possible that, IF there really is an Nvidia GPU in those Durango dev kits, it's only there because all of the current "next gen" tech is developed on Nvidia hardware?
I don't see what advantage that'd have. PC engines running on nVidia GPUs are going through DX, so are dependent on the drivers for hardware utilisation and shouldn't be having any particular nVidia bonus. Switching to AMD GPUs should run just fine, otherwise PC devs are being asked to run two separate codebases for nVidia and AMD, defeating the point of DX! ;) And if devs are managing to make more efficient use of nVidia hardware, then a last minute switch to AMD will only get in their way so it'd behoove the console companies to provide the same sort of architecture ASAP.
 
I just remembered Rangers' post when he IM'd that DaE individual. He said it's an Intel/Nvidia combo and that it's the 5xxx series. That guy is clearly trolling. I don't think he can, or wants to, give specs out. He probably didn't even try to sell it; most probably he got hold of a dev kit, or pictures from one, and is now looking for attention.
 
Well, never mind my previous dumb comment :p This DaE guy has me scratching my head.

I don't see what advantage that'd have. PC engines running on nVidia GPUs are going through DX, so are dependent on the drivers for hardware utilisation and shouldn't be having any particular nVidia bonus. Switching to AMD GPUs should run just fine, otherwise PC devs are being asked to run two separate codebases for nVidia and AMD, defeating the point of DX! ;) And if devs are managing to make more efficient use of nVidia hardware, then a last minute switch to AMD will only get in their way so it'd behoove the console companies to provide the same sort of architecture ASAP.

Well, my line of thinking was that they were less concerned about lower-level coding and just wanted something to start working on. I could be wrong, but I thought there were rumors that one of the very early 360 dev kits used Nvidia cards in SLI (possibly for Gears), and I was thinking something similar may be going on here. I never meant to insinuate they would be changing hardware at the last minute. :p
 
They will use ARM cores on Mondays, POWER7 on the days divisible by 2, and AMD's Steamroller on the remaining days.
The different GPUs are activated by the moon cycles.

anyone feel free to tweet this leak
 
They will use ARM cores on Mondays, POWER7 on the days divisible by 2, and AMD's Steamroller on the remaining days.
The different GPUs are activated by the moon cycles.

anyone feel free to tweet this leak

To avoid leaks, MS created a Tsuleakami. :LOL:
 