Predict: The Next Generation Console Tech

Man, I keep worrying that next-gen consoles won't be powerful enough. I hope Epic Games, Square Enix, etc. put a lot of pressure on Sony and MS so their new engines run flawlessly.

If the specs turn out to be disappointing then everyone in the industry suffers, including the customers. Developers will have to make sacrifices in the environments, number of characters, enemies, objects, etc. Framerates will drop in order to hit that caliber of graphics. Development time and costs will rise because devs will have to squeeze whatever they can out of inferior hardware, just as they do now, instead of focusing on the story and gameplay. We will have to wait longer for games, and devs will be more inclined to make DLC to make up for their time.

Basically everyone loses in this scenario because Sony and MS might cheap out like Nintendo has. It's funny that we have to worry whether next-gen consoles can handle next-gen engines, because if they can't then they shouldn't be released until it's feasible.

AMD is in trouble as a company right now, so I don't see why they wouldn't hand Sony and MS a future GPU design in advance, optimized for gaming, at a cheap price. Maybe if Nvidia made some competitive offers to Microsoft and Sony then the chances of getting a real next-gen system would greatly rise.
 
I remember a rumor saying the next Xbox won't use an optical drive, but cartridges instead.

I've pushed that idea quite a few times and pretty much been shot down by financial and technical constraints every single time... Mass production is one issue and so is cost.

Microsoft does have the design language of 'peripheral venting'. If they didn't have an optical drive they could also save on cooling, given the large space an optical drive takes up. That's probably the only other pro to cartridges for them I can think of, at least in terms of the initial design.

As we've seen with cost breakdowns, a small cut in production costs can translate into a large difference in cost for the end user once we take into account the % margins flowing through the various distributors and manufacturers.
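To put some made-up numbers on the margin-stacking point (the margin percentages and BOM figures below are purely illustrative placeholders, not real channel data):

```python
# Toy illustration of how a BOM saving compounds through the channel.
# Every number here is a made-up placeholder, not a real figure.

def retail_price(bom_cost, margins=(0.10, 0.15, 0.20)):
    """Apply each middleman's percentage margin on top of its input cost."""
    price = bom_cost
    for m in margins:
        price *= 1 + m
    return price

with_drive = retail_price(200.0)      # hypothetical BOM including optical drive
without_drive = retail_price(185.0)   # hypothetical BOM with the drive removed

print(f"Retail with drive:    ${with_drive:.2f}")
print(f"Retail without drive: ${without_drive:.2f}")
print(f"A $15 BOM saving becomes ${with_drive - without_drive:.2f} at retail")
```

With those placeholder margins, a $15 production saving shows up as roughly a $23 difference at the till, which is the amplification effect being described.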
 
Don't worry about the next-gen Xbox, I think, since MGS Vancouver's (now renamed Black Tusk) first game will use the UE4 engine.

Sony will follow suit too, I guess.
 
That looks like a reasonable rumour; it fits in with everything else we've heard about Durango.
It's shaping up to be quite a powerful little console actually, and we don't even know what else MS has done to customise the hardware.
 
Saw an interesting reply on NeoGAF; he gave some data on three possible options for a 1.6GHz nextbox CPU (if the 1.6GHz rumor is true):
http://www.neogaf.com/forum/showpost.php?p=44997411&postcount=573
Thraktor said:
Just to go into a little more detail, I feel from various (and conflicting) rumours there are three possibilities:

IBM PowerPC A2 based

This fits the four-cores, four-threads-per-core rumour, and the 1.6GHz rumour. The cores are about 6.58mm² on a 45nm process, so on a 32nm process you could fit four cores and 8MB eDRAM cache within 30mm² or so, which is pretty small for a console CPU (about the same die size as Wii U's 45nm "Espresso" CPU). Power draw would be around 10W at 1.6GHz.

AMD Jaguar based

The Jaguar architecture is designed to go up to 2GHz, so 1.6GHz would be a reasonable clock for it in a console environment. It's a single-threaded architecture. At 28nm each core (including 512KB cache) is about 3.2mm². Designed for 2-4 cores, but an eight-core chip would come to about 30mm² or so as well. I can't find data on power draw, but ~10W would probably be a good guess here also.

AMD Bulldozer based

I'm including Piledriver, Steamroller, etc. here. The "eight-core"* Bulldozer is 315mm² at 32nm, which is fucking huge for a console CPU (you'll notice that it's literally ten times the size of eight Jaguar cores). It pulls 125W at its stock speed of 3.6GHz, and if they were using it in Durango they'd have to clock it down massively to prevent it melting the console (possibly even to 1.6GHz). In theory they could use a "four core" variant at about half the size, which would put it at roughly the same size as Xenon was at 90nm, but still be somewhat of a power-hog.

*I put eight-core in quotation marks because they aren't really eight-core chips. They have four modules on-board, and each module is something half-way between a dual-threaded core and two independent cores. A "four-core" variant would then be a dual-module variant, in reality.

There are also the rumours of an Intel chip, but I don't put much faith in it, as the logic seemed to be "It has AVX support, therefore it must be Intel" (not true, both Jaguar and Bulldozer support AVX), and it claimed it was an 8-core chip. Intel's only 8-core chips are extremely expensive Xeon server processors, and the only architecture they could use to cram 8 cores in a console-friendly die is Cedarview (Atom 32nm), which doesn't support AVX.
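As a rough sanity check on the die-size arithmetic in the quoted post, here's a quick sketch; the ideal (node ratio)² area scaling is an optimistic simplification, and the uncore/eDRAM allowances are assumptions, not known specs:

```python
# Back-of-the-envelope check of the die-size figures in the quoted post.
# Ideal area scaling ~ (new_node / old_node)^2 is a simplification.

def scale_area(area_mm2, from_nm, to_nm):
    """Ideal (optimistic) area scaling between process nodes."""
    return area_mm2 * (to_nm / from_nm) ** 2

# IBM PowerPC A2: 6.58 mm^2 per core at 45 nm, shrunk to 32 nm
a2_core_32nm = scale_area(6.58, 45, 32)     # ~3.3 mm^2 per core
a2_four_cores = 4 * a2_core_32nm            # ~13 mm^2 before the 8 MB eDRAM
print(f"A2: {a2_four_cores:.1f} mm^2 for 4 cores, ~30 mm^2 with 8 MB eDRAM")

# AMD Jaguar: ~3.2 mm^2 per core (incl. 512 KB L2 slice) at 28 nm
jaguar_eight = 8 * 3.2                      # ~25.6 mm^2, ~30 mm^2 with uncore
print(f"Jaguar: {jaguar_eight:.1f} mm^2 for 8 cores (plus shared uncore)")

# AMD Bulldozer: 315 mm^2 for the whole "eight-core" die at 32 nm.
# The whole-die figure includes L3 and northbridge, so this overstates
# the core-only comparison a bit (hence "roughly ten times" in the post).
print(f"Bulldozer die vs 8x Jaguar cores: {315 / jaguar_eight:.1f}x larger")
```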
 
Don't worry about the next-gen Xbox, I think, since MGS Vancouver's (now renamed Black Tusk) first game will use the UE4 engine.

Sony will follow suit too, I guess.

What makes you think it will be UE4? Why not build an in-house engine shared with 343i? All the other Halo games have been in-house.
 
I don't think 8 Jaguar cores are weak, and 4 A2 cores would be around the Wii U CPU's die size, but I don't think they'd do that (if MS use A2 it should be 8 cores or more).
 
IPC would be lower than Athlon II and much lower than Thuban and Trinity...
Then, like your post #15912: if they had another choice that sounded a lot better but didn't use it, that means this 1.6GHz CPU is better than your choice. Maybe it's not Jaguar, not A2, not Bulldozer, but some other CPU, or actually an A2 with 16 cores, who knows.
 
So that's Durango imo: 8GB DDR RAM, 8 Jaguar cores, and a 6870/6950-class GPU, albeit in GCN form. May seem underwhelming, but I bet we'll be quite pleased with the results. In light of Bloomberg recently confirming a fall 2013 release, it's right to expect the hardware to be pretty locked down.

A GCN version of the 6870/6950 would be higher than what I expected; a 6950-class GCN part especially I can't see happening. On the other hand, if they go for a low-power-draw CPU they might put their budget on the GPU instead...
 
IPC would be lower than Athlon II and much lower than Thuban and Trinity...
According to AMD's official slides (and analysis around the net), Jaguar core IPC should be pretty close to Bulldozer core IPC in multithreaded workloads (when both cores of each module are fully taxed, and both cores are utilizing the shared FPU). BD modules have a shared 2-way L1 instruction cache, while Jaguar cores have their own 2-way instruction caches (the Jaguar instruction cache should thrash much less). Also Jaguar cores have better L1 data caches: twice the size and associativity (8-way 32 kB in Jaguar vs 4-way 16 kB in BD).

Both cores can decode/retire two instructions in total per cycle. Peak integer throughput is the same (but BD has more flexibility). Also both Jaguar and BD cores have identical peak FLOPs per cycle: two BD cores can together execute two 4d FMAs per cycle (4*2*2 = 16 FLOPs per cycle for two cores), while a single Jaguar core can execute a 4d FADD + FMUL per cycle (4*(1+1)*2 = 16 FLOPs per cycle for two cores).

Trinity (and especially Steamroller) have much better IPC than Jaguar, and BD should also have notably better IPC in cases where only a single thread is running on each module (the L1 cache problems disappear & the shared FPU is completely owned by a single core). And the IPC isn't telling the whole truth: BD can run at twice the clock rate of Jaguar. So even if the IPC were close, BD would still be twice as fast as Jaguar. High clocks mostly benefit desktops, but are also important in laptops, since turbocore boosts the performance in single-threaded applications a lot. The ultra-portable 17W dual-core 1.6 GHz Trinity boosts up to 2.4 GHz (http://www.pclaunches.com/notebooks/lenovo-ideapad-s405-with-upcoming-17w-amd-trinity-apu.php). A 2.4 GHz Trinity core would slaughter Jaguar in single-threaded benchmarks. And unfortunately there are still lots of single-threaded applications around. However, in multithreaded benchmarks Jaguar shouldn't be that bad (when compared to other ultra-low-voltage processors).
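To put numbers on the peak-throughput comparison above, here's a quick sketch; the 1.6 GHz Jaguar clock is the rumoured figure and the 3.2 GHz BD clock is just the "twice the clock" assumption from the post, not a confirmed spec:

```python
# Peak single-precision throughput from the per-cycle figures above.
# Clock speeds are rumoured/assumed, not confirmed.

def peak_gflops(flops_per_cycle, clock_ghz, units):
    """Peak GFLOP/s = FLOPs per cycle per unit * clock * number of units."""
    return flops_per_cycle * clock_ghz * units

# Jaguar: 4-wide FADD + 4-wide FMUL per cycle = 8 FLOPs/cycle per core
jaguar = peak_gflops(flops_per_cycle=8, clock_ghz=1.6, units=8)      # 8 cores

# Bulldozer: shared FPU does two 4-wide FMAs per module per cycle
# = 16 FLOPs/cycle per module (8 per "core")
bulldozer = peak_gflops(flops_per_cycle=16, clock_ghz=3.2, units=4)  # 4 modules

print(f"8x Jaguar   @ 1.6 GHz: {jaguar:.0f} GFLOP/s peak")     # ~102 GFLOP/s
print(f"4-module BD @ 3.2 GHz: {bulldozer:.0f} GFLOP/s peak")  # ~205 GFLOP/s
```

Same FLOPs per cycle per pair of cores, so the factor of two comes entirely from the clock, which matches the point above.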
 
Well, Microsoft doesn't need to "put their budget" into any part of the spec :LOL:

Well, that is very true :smile:, although I am sure the bean counters over there, MS or not, would like to see this thing turn a profit sooner rather than later. Anyway, I was thinking mostly from a wattage/thermal point of view: what they can fit into a "console-sized" box that won't need jet-engine-loud fans to prevent it from melting. They wouldn't want to cause en masse heart attacks among their bean counter population over yet another RROD-style 1 billion write-off...
 
PlayStation Orbis will have either Piledriver or Steamroller cores, because in that VG247 article it was clearly mentioned that an A10 will stay as the base for Orbis! AFAIK the A10 name only applies to AMD's mainstream CPUs; the core count will be 4 or more. The core clock will also be significant, if not 4 GHz. And every rumor is pointing at only one GPU in Orbis.
 
Well, Microsoft doesn't need to "put their budget" into any part of the spec :LOL:

The power budget, not the cash budget.

Last gen, the performance of the console was pretty much determined by the cost they were willing to pay for the silicon. This gen, I don't think it is anymore -- instead, they have a pretty hard limit on how much power they can dissipate in a console envelope. Every watt spent on the CPU is a watt not spent on the GPU. That's why a lot of devs, me included, want a low-power CPU this time around. Once you have those 8 Jaguar cores, spending one watt of power on the GPU is just *better* than putting it in the CPU.
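Just to illustrate the fixed-envelope trade-off with made-up numbers (the ~120 W envelope and the non-CPU/GPU overhead are pure assumptions for the sketch; the CPU wattages are the rough estimates from earlier in the thread):

```python
# Fixed power envelope: every watt the CPU burns is a watt the GPU can't have.
# The envelope and overhead figures are hypothetical, for illustration only.

CONSOLE_ENVELOPE_W = 120        # hypothetical total board power budget
OTHER_COMPONENTS_W = 30         # assumed RAM, I/O, optical drive, losses, etc.

cpu_options = {
    "8x Jaguar @ 1.6 GHz": 10,      # ~10 W estimate from earlier in the thread
    "Heavily downclocked Bulldozer": 60,  # rough guess, not a measured figure
}

for name, cpu_w in cpu_options.items():
    gpu_w = CONSOLE_ENVELOPE_W - OTHER_COMPONENTS_W - cpu_w
    print(f"{name}: {gpu_w} W left for the GPU")
```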
 
Could a single Jaguar core with AVX emulate a single Xenon thread? If you've got 8 cores then you could do one thread per core and possibly emulate the Xbox 360? Or is backwards compatibility out of the question without recompiled binaries?
 