Predict: The Next Generation Console Tech

Status
Not open for further replies.
Something like a Pitcairn with at least 2 GB of GDDR RAM and a hex-core XCPU would suffice...
Wow, that's pretty lowball on the CPU. You're sure you'd be OK with only a 2x increase in CPU power?

It seems to you GPU == Power. Let's say you had a lower-level GPU but an 8x more powerful CPU, would the box still be "underpowered"? How about a 3x CPU and a lower GPU, but oodles of RAM?

Edit: edited to remove specific values and GPU names. It was distracting some folks who thought it was indicative of something. Sorry folks, I just googled "Pitcairn GPU" and chose the model one lower for my example, it's pure coincidence it matched up with some other folks expectations.
 
One thing I still expect to be an issue is memory stacking. The difference between a console with stacked memory and one without would basically be a generation gap -- you can easily ship an order of magnitude more bw within the same cost and power envelope, and even with half of that the difference will be very visible. So given how much money Sony has poured into the tech, I'd assume they will ship their next gen exactly when they can manufacture a chip with wide stacked memory interfaces in volume.

How much bandwidth do you think would be possible if Sony used stacking? And does that put a limitation on the amount of memory that could be used?
 
Well, bandwidth is definitely important, there's no disputing it.
I based my choice on the amount of RAM: say MS really ups the bar (whatever the reason), I believe the extra RAM would come in handy to keep up with the next box.
Extra RAM (like 4GB vs 2GB, or even only 3GB) may allow for plenty of dirty tricks (I'm thinking of effects that rely on previously rendered targets), and will ensure the same assets can be used on both systems, etc.

If we look at Trinity and Llano, they do way better with mostly the same bandwidth as the RSX. There would be 50% more bandwidth (possibly more if Sony goes with faster-than-1600MHz DDR3, which is still significantly cheaper than GDDR5).
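For what it's worth, here's the back-of-the-envelope math behind these bandwidth comparisons (the bus widths and transfer rates below are my own assumptions, not leaked specs): peak bandwidth is just bus width in bytes times the effective transfer rate.

```python
def peak_bw_gbs(bus_bits, mtps):
    """Theoretical peak bandwidth in GB/s: bus width (bits) / 8 * effective MT/s."""
    return bus_bits / 8 * mtps / 1000

# RSX's GDDR3 pool: 128-bit bus at 1400 MT/s effective -> ~22.4 GB/s
rsx = peak_bw_gbs(128, 1400)

# DDR3-1600 on the same 128-bit bus -> ~25.6 GB/s;
# a 256-bit bus would be needed for a big step up (~51.2 GB/s)
ddr3_128 = peak_bw_gbs(128, 1600)
ddr3_256 = peak_bw_gbs(256, 1600)
```

So DDR3 only pulls meaningfully ahead of RSX's pool if the bus gets wider or the clocks go well past 1600 MT/s.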

Definitely, it's a significant trade-off. But I think it would be pretty suicidal for Sony to try to compete head to head with MS if the latter decides to go with loss-leading hardware.
If they fail to reclaim the performance crown, they might as well position the machine from scratch as the machine for budget-sensitive gamers: pretty much moving from "it does everything" to "it's an affordable entertainment machine that runs games for cheap."

I'm serious about them passing on an HDD altogether and letting customers use SD cards instead.
Definitely a win for the form factor. With such a set-up I could see the thing being in the Dreamcast ballpark wrt size.

I believe these are the minimum specifications to expect from both Sony and Microsoft for their next-gen consoles; anything less would be just suicidal:

- a quad-core OoO CPU running at 2 GHz, with a 400 GFLOPS level of performance.
- an ATI 6000-series GPU architecture with a 1 TFLOPS level of performance, running at 650 MHz.
- 2 GB of GDDR5 unified memory with 64 GB/s of bandwidth.
- an HDD (320 GB).
- a 6x Blu-ray drive.

expected cost: $350. Three options: sell it at $349, sell it at $299 at a loss, or at $399 at a profit.

and these are the maximum realistic specifications we could expect from Sony and Microsoft:

- an 8-core OoO CPU running at 3.4 GHz, with a 1.2 TFLOPS level of performance.
- an Nvidia GTX 680+ level of GPU performance at 3.5 TFLOPS, running at 850 MHz.
- 6 GB of GDDR5 unified memory with 200 GB/s of bandwidth.
- an HDD (1 TB).
- an 8x Blu-ray drive.
- a 128 GB SSD for cache, the operating system, and main program installation.

expected cost: $800. Only two options: sell it at $499 with huge losses, or sell it at $599 with a $200 loss per console.

PS4 and Xbox Next specifications would fall between these two extremes. The company that comes closest to the high-end extreme would win the next-gen hardware technical-superiority battle; and the closer a company gets to the high-end extreme, the more it needs to subsidize its console.
 
599 isn't even an option. Sony tried that and failed miserably. I really don't think there is much of a market for consoles above 400 euros/dollars.

And it's not only about initial production costs but also about how fast production costs can come down. For the high-end stuff that might take a while. Now just consider having to sell 10+ million consoles at a $200 loss or more. You're taking a 2-billion-dollar gamble.
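Just to spell out where that 2 billion comes from (the unit count and per-unit loss are the assumptions above, nothing more):

```python
units = 10_000_000      # assumed consoles sold at a loss
loss_per_unit = 200     # assumed dollars lost per console
total_subsidy = units * loss_per_unit  # 2,000,000,000 dollars
```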
 
599 isn't even an option. Sony tried that and failed miserably. I really don't think there is much of a market for consoles above 400 euros/dollars.

And it's not only about initial production costs but also about how fast production costs can come down. For the high-end stuff that might take a while. Now just consider having to sell 10+ million consoles at a $200 loss or more. You're taking a 2-billion-dollar gamble.

True, I agree. That's why I'm thinking Sony and Microsoft need to sell their consoles at $399 and just take a $50 loss per console. A console costing $450 could be something special: the $50 could be used to include 3 GB of GDDR5 at 120 GB/s instead of just 2 GB at 64 GB/s. The $50 could also be used to put in a faster GPU, a 2 TFLOPS GPU instead of a 1.2 TFLOPS one.

I'm sure Sony and Microsoft will subsidize their next consoles in fall 2013 by at least $50 per console. Now, how do you use this subsidy to increase the performance and efficiency of the hardware? That will determine which console is technically superior.

How you use a mere $50 per console would determine technical superiority, crazy stuff, and a scary decision for the engineers at Sony or Microsoft to make... you increase bandwidth while the other increases RAM: you're toast for multiplatform games! You increase RAM but the other includes a far better GPU: you're toast too! Wow, scary decision-making... :oops:
 
Wow, that's pretty lowball on the CPU. You're sure you'd be OK with only a 2x increase in CPU power?

It seems to you GPU == Power. Let's say you had a Cape Verde level GPU but an 8x more powerful CPU, would the box still be "underpowered"? How about a 3x CPU and a Cape Verde GPU, but 8GB of RAM?

That's the least I'd expect...

You know, with not wanting to "run red ink everywhere" ... ;)

But yes with the GPGPU capabilities of MODERN gpu's, I expect more work to be offloaded to the GPU nextgen.

As I said, I'm not being unreasonable in my expectations.

But yeah, dropping an 8x more powerful CPU and a 4x more powerful GPU in isn't a recipe for success ... FYI. I hope MS doesn't disappoint their userbase. ;)
 
One thing I still expect to be an issue is memory stacking. The difference between a console with stacked memory and one without would basically be a generation gap -- you can easily ship an order of magnitude more bw within the same cost and power envelope, and even with half of that the difference will be very visible. So given how much money Sony has poured into the tech, I'd assume they will ship their next gen exactly when they can manufacture a chip with wide stacked memory interfaces in volume.

Hmmm, optimistic much? Yes, it sounds like the best approach: release a new platform when the technology is ready, but I'm not sure that's how the strategists look at it. We can all hope! At this point, having to wait one extra year for significantly more mature technologies could make a HUGE difference at the same or similar costs, but I think some companies (all 3?) are more devoted to market-placement strategy than to asking, "What is the 5-8 year prognosis for this technology for our developer partners, and the value to our customers?" In fact I think that is exactly what Sony was trying to sell last gen (performance value), but they misspoke in terms of cost and the impact on developing partners.

Wow, that's pretty lowball on the CPU. You're sure you'd be OK with only a 2x increase in CPU power?

It seems to you GPU == Power. Let's say you had a Cape Verde level GPU but an 8x more powerful CPU, would the box still be "underpowered"? How about a 3x CPU and a Cape Verde GPU, but 8GB of RAM?

There is a balance but a Cape Verde GPU isn't very good.

Part of the problem people are seeing is that, from a gameplay perspective, the PC is matching the 360's mechanics despite Xenon. Why do you need 8x the CPU power when the PC shows no indication of the value of it? What the PC does show is that growing the GPU results in a richer and more detailed world.

Now I see this as a chicken-egg scenario where the old consoles are holding back the PC. But in the same vein GPUs are picking up more processing loads AND even with the neutered consoles holding back PCs that has not stopped the march of graphical progress.

A new processor that is more efficient, with fewer stalls, in a console where more resources are freed up (e.g. a dedicated sound chip), could be 4x as fast while being half as big on 32nm as Xenon was on 90nm. So the question becomes: should a CPU go even bigger than that, and what is the reward? Or should that budget be shifted over to the GPU, which is seeing much, much bigger strides and a) has immediate consumer feedback in visuals that developers will expose right out of the box (the same could not be said of upping CPU resources more) and b) offers more long-term resource-pool potential?

I don't think anyone will argue that more RAM is a bad thing, but a lot of it falls back to system design. If a fast 32GB/64GB/128GB SSD is being considered (see: HDDs are expensive right now and the prospect of dropping SSD pricing looks more promising), less memory may be fine, especially with virtual texturing. If you are loading these things up with slow-as-mud optical drives and HDDs are again optional ... please, just stab me now. Less RAM is bad, but more RAM also means an eternity of load times (oh well, hopefully a TSR will mandate load-once approaches with gameplay within X time, using the huge RAM pool as a buffer).
 
lherre has been dropping more info in the wii u speculation thread at gaf.

basically seems he believes it will be wii u low, orbis/ps4 mid, and next box high for specs.

he also said xbox 3 is at least 3-4x wii u in his opinion.

http://www.neogaf.com/forum/showpost.php?p=38277835&postcount=7458

compared to wii u:

http://www.neogaf.com/forum/showpost.php?p=38274838&postcount=7318

Ps4 is not final yet and can change a lot but Xbox 3 it's more or less clear right now and the gap is big in almost every part

adjust speculation accordingly. or go back to sticking your head in the sand and claiming a 6670 for xb3 lol, whichever.

For PS4 seems lherre hints RAM may be the spec still in flux (probably Epic trying to get them to push it up closer to XB3 levels as rumored)

But I'm not sure it should matter if PS4 and Wii U are much less powerful than Durango, as it seems, since we've established people don't really care for graphics, right guys? :p

Also bg posted this:

Yeah. It won't be "too big". An analogy I and the late BurntPork have used is that PS4/Xbox 3 would be like a high-end gaming PC and Wii U a mid-range gaming PC. It won't have the same level of effects, but it would still be able to play the same games.

And lherre replies this:

I'll put wiiu as a low specs Pc compared to the others with the actual data

However, to be fair, in another post he replies that Wii U:Durango is more like Dreamcast:Xbox 1 than PS2:Xbox 1 in his opinion (which IMO is not a gigantic gap, but it doesn't jibe with the rest of his comments, and to be fair the asker only gave him those two options).

As usual I'm not even sure where to put this info: here, the Wii U GPU thread (which seems to actually be the general Wii U power discussion thread contrary to its name), or the Wii U general thread. Such is the dilemma with most Wii U info...
 
lherre has been dropping more info in the wii u speculation thread at gaf.

basically seems he believes it will be wii u low, orbis/ps4 mid, and next box high for specs.

he also said xbox 3 is at least 3-4x wii u in his opinion.

Well, if the gap is supposed to be big in every part, then we have to be looking at 4GB of RAM, right? Isn't the Wii U rumored to have 1-1.5GB of RAM? I wouldn't call 2GB a big jump compared to that.
 
Actually true I suppose, if you presume Wii U is very close to 360/PS3 (which will piss off all the Nintites if you're saying that :p).

But I am pretty sure it's not 6670. 4GB of RAM is too much trouble for a 6670. Also as we've discussed ad nauseum 6670 would pretty much rule out 1080P, which also would be disappointing.

We can bet, Acert. steviep already owes me a game; I'm pretty good at xbox bets :p
 
What would you guys say to this trade-off: going from, for example, a little bit of GDDR5 to a lot of DDR3?

And what would happen if one of the next gen consoles (that isn't the Wii U) had a GPU in the neighbourhood of 1-1.5TF instead of the 2.5tf monsters you guys are expecting?
 
So basically you're hearing the Next Xbox has 4-8 GB of DDR3 and a 1-1.5 TF GPU :p

A 1-1.5 TF GPU is fairly weaksauce. The 6670 we all hate is rated at 768 GFLOPS, the 7770 is 1.28 TF, and I consider that... well, just look at it in PC terms: a 7770 is a joke.

Given I don't think DDR3 is very probable for a next-gen console, I'd have to resort to questioning your sources again...
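As an aside, the TF/GFLOPS ratings being quoted are just shader count x clock x 2 FLOPs (one multiply-add per ALU per clock). Plugging in the retail cards' publicly listed shader counts and clocks (cited from memory, so double-check them) reproduces the numbers above:

```python
def peak_gflops(shaders, clock_mhz):
    """Theoretical single-precision peak: shaders * MHz * 2 (FMA) / 1000."""
    return shaders * clock_mhz * 2 / 1000

hd6670 = peak_gflops(480, 800)    # 480 SPs @ 800 MHz -> 768.0 GFLOPS
hd7770 = peak_gflops(640, 1000)   # 640 SPs @ 1 GHz   -> 1280.0 GFLOPS, i.e. 1.28 TF
```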
 
I was throwing out a hypothetical and asking for your analysis, not confirming or denying anything. lol

Cape Verde is also nothing to sneeze at - a fantastic mid-range GPU in the PC space. One of the best bangs-for-the-buck AMD has put out in a long time (I say that as I'm shopping to Crossfire a couple Pitcairns lol).

I was just curious as to what you guys thought about a trade-off of a lot of memory (but it has to be of a slower type, like DDR3, since higher densities on speedy memory and stacking and all that isn't going to be ready for prime time just yet). It's one way to get developers the ram they've been begging for, isn't it?
 
but then you have no memory bandwidth. you'd probably have to go hefty edram then, which imo just sucks. the edram is costly, will rob you of tons of gpu power, and is a bitch to work with (if indeed the cape verde/low GPU is true, the edram eating all the silicon budget would be why)

i have faith in microsoft engineers though out of all three companies, so whatever they're up to would likely make sense. it's amd engineers i'm not so keen on...

cape verde would just be crap for a 2013 console. again, put it in pc terms: if people would laugh at a cape verde gaming rig in 2012, what is it going to be good for in a console in 2016, '17, '18?

I'm not dissing you steviep but I am aware your "angle" here is to push for Wii U being pretty close to PS4/720, and so a cape verde 720 would be something you'd like, so I have to take your specs with a grain of salt, and I'm dubious of these but any info is good to hear and one more thing to throw into the pile.
 
The Wii U's GPU isn't even in the same league as a Cape Verde - there is no angle here. And a Cape Verde is in one of my computers. It is fantastic and can play all but the highest end games with great detail. AMD Markham (formerly ATi) has some awesome engineers, and I say that as someone with first-hand knowledge of that fact.

As you said, the issue of removing the bottlenecking in a system could be alleviated through many means.

But some of you guys are expecting 680s and Tahiti-level stuff inside these boxes, and I think that may be unrealistic.
 
Ehh, on AMD, I just disagree with the inclusion of EDRAM in the 360. That's probably debatable but I think on balance it harmed the system. That was an AMD idea.

AMD GPU's are pretty good. Bulldozer though was purely an awful engineering effort.

You say the Wii U GPU isn't in the same league as Cape Verde; well, the 7770 is the top Cape Verde at 1.28 TF (and it's clocked kind of high at 1 GHz, I wonder if they could even clock it that high in a console). If PS360 are ~250 GFLOPS, then there's not a lot of wiggle room for Wii U to top PS360 yet still not be "in the same league" as Cape Verde.

Certainly it would provide a nice boost over the current consoles, but I don't think enough of one to compel new buyers. It would also make 1080p of dubious feasibility imo. You'd be dealing with, on the surface, only 5x as much power, so rendering 2.25x more pixels would already put a serious dent in that.
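The 2.25x figure, for anyone wondering, is just the pixel-count ratio between 1080p and 720p:

```python
pixels_1080p = 1920 * 1080   # 2,073,600 pixels
pixels_720p = 1280 * 720     #   921,600 pixels
ratio = pixels_1080p / pixels_720p  # 2.25
```

So a GPU with ~5x the raw power spends nearly half of that gain on the resolution bump alone, before any new effects.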
 
The idea of DDR3 in next-gen consoles is very problematic given the advantages of a unified memory architecture: easy development + flexibility + efficiency.

after what happened with multiplatform games in this PS3/Xbox 360 era (unified shaders vs fixed pixel/vertex shader pipelines, unified memory vs separate memory pools, huge bandwidth for basic GPU rendering (eDRAM for Xenos) vs a bandwidth bottleneck for RSX, three dual-threaded, relatively easy-to-develop-for cores vs a weak PPU and nightmare SPEs...)

all this points to the following:

1- all next-gen consoles will adopt a unified memory architecture approach.
2- given this unified memory approach, DDR3 is just ruled out (too little bandwidth); GDDR5 is a necessity.
3- all next-gen consoles will adopt easy-to-develop-for CPUs, most probably OoO CPUs with at least 4 symmetric cores. No more asymmetric fancy Cell design.
 
DoctorFouad said:
1- all nextgen consoles would adopt a unified memory architecture approach.
Yeah, but the real question is whether that common address space will "only" be flexibly dividable between GPU and CPU (as in XBOX360) - or whether it will actually be unified in a sense that GPU, CPU, etc. can use it to directly share data they mutually work on. The latter is the actual break-through - and would require way more bandwidth to work properly (as the RAM would basically function as off-chip L3 Cache).

DoctorFouad said:
3- all nextgen consoles would adopt an easy to develop for CPUs, most probably OoO CPUs, with at least 4 symmetric Cores. No more asymmetric fancy CELL design.
More mainstream CPUs at the core of the respective systems is probably true. That being said, I'm pretty sure we'll see a lot of fixed-function stuff added to the usual GPU+CPU combo in the next-gen consoles (if there's one field in which fixed-function units make sense, it's in highly specialized, "fixed function" multimedia consoles).

Whether some of that fixed-function stuff will actually come in the flavor of Cell derivatives, I don't know. But judging by what Cell-based chips can do in modern smart TVs, Sony actually re-using some parts of their Cell architecture (instead of paying money for third-party designs) doesn't seem too unlikely (also see Brad Grenz's earlier post).
 