Predict: Next gen console tech (9th iteration and 10th iteration edition) [2014 - 2017]

The transition to next gen will be dictated by when package-level integration of high-bandwidth DRAM (die stacking or otherwise) becomes a reality.

High-bandwidth/low-latency access to a fair chunk of DRAM will change the way GPUs are built (more logic, less on-die RAM).

Cheers
 
You think that the new technologies such as HBM and HMC will negate the need for large chunks of SRAM?
I didn't think the die-stacking solutions would negate that need, as they would still be slower than having large amounts of cache on the actual die.
 
I think MS will be driven by market realities as much as ideal hardware with which to make a big jump.

They really shit the bed (tent) with Xbox One, not so much in terms of sales relative to the 360, but definitely relative to their insanely unrealistic expectations. So now MS need to get out ahead of Sony, and do so with a product that's right for the market.

Opportunities to take advantage of disruption within the TV and console space would seem to be based around 4K, VR and AR. MS can't wait too long for other technologies or they'll miss the chance to be first movers or "biggest movers" on these things.

For MS I'd bet my pennies on 2017, with a fully BC machine (crucial at this point given digital libraries, long tails for DD games, and moving early) on the highest-performance node they can mass-launch on. They'll go with AMD and have a big x86 APU.

Sony have the luxury of another year or two to get their transition done at a point that'll allow them to milk the PS4 and launch strong on the PS5.
 
How about a big.LITTLE approach if they stick with AMD? Is AMD even capable? :p

A few "2017/8-ish" x86 cores + octa-core Jaguars (Pumas?).

Ahem. -_-
 
I'd argue quite the opposite, actually, simply because of diminishing returns in graphics. A new platform so soon after the One/PS4 will do little to convince us that we're getting a 'true' generational leap.

If new machines can't do that, then MS/Sony might as well milk the current gen for all it's worth.

Unless VR takes over. Which I highly doubt as I still see it as a tiny niche.

As always, I stand to be corrected.
 
big.LITTLE is a design to solve the problem of hungry cores in a tiny power envelope, driven by the limitations of battery power and the consumer's expectation that their phone should at least stay charged all day! How would it help a games console?
Instead of tiny Jaguar cores you could have dedicated silicon for some of the jobs at hand, unless they were kept around for system compatibility, and that would mean the big cores straying from x86. I don't see Sony or MS doing that - maybe Nintendo will.
2017 is only two years away, so I have to agree: that is way too soon.
 
You think that the new technologies such as HBM and HMC will negate the need for large chunks of SRAM?
I didn't think the die-stacking solutions would negate that need, as they would still be slower than having large amounts of cache on the actual die.

Abundant memory bandwidth would move the design point towards spending relatively more transistors on computational logic than on the memory system (SRAM, IO pads, on-the-fly compression techniques, etc.).

It won't negate the need for close SRAM buffers/caches, but it would certainly moderate the size of these.
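
To put rough numbers on the bandwidth side (a back-of-the-envelope sketch in Python; the bus widths and pin rates are the first-gen HBM and PS4-style GDDR5 figures, not a prediction of any actual console):

```python
# Peak bandwidth in GB/s = (bus width in bits / 8) * per-pin data rate in Gbps
def peak_bw_gbs(bus_width_bits, pin_rate_gbps):
    return bus_width_bits / 8 * pin_rate_gbps

gddr5 = peak_bw_gbs(256, 5.5)   # PS4-style GDDR5 bus: 176 GB/s
hbm = peak_bw_gbs(1024, 1.0)    # one first-gen HBM stack: 128 GB/s
print(gddr5, hbm, 4 * hbm)      # four stacks on an interposer: 512 GB/s
```

With that kind of headroom on tap, big on-die SRAM pools earn their area a lot less easily.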

Cheers
 
I'll take a guess at MS' plans for Xbox. By 2022 I foresee Surface becoming the new Xbox. The two will be fully converged by that time; Kinect, PC and games will all be served through a single device. They will figure out a method by which the Surface can easily sync up with any TV it is beside, be it through a cable, a docking station or wirelessly. Controllers will still be around; touchscreens will not replace traditional methods of input yet.

VR will not fly; unless I see eye-tracking technology coming soon, it's a pure vomit fest. Your eyes can't roam the VR scene, and when you turn your head you naturally move your eyes before your head, so things are often out of focus exactly when you need them in focus. The engine tries to guess where you are looking (but that's usually at the reticle).

I'm an owner of the Oculus Kickstarter kit; 10 minutes of usage produces nearly 5 hours of headache. I'm fairly good with shooters, and I agree that not everyone is affected in the same way. On the official boards I've had people cite articles that Asians tend to get sick more easily. Well, I'm not going to be less Asian anytime soon, so they'd better figure it out or I'm never going to buy into 3D VR gaming. It being used for 3D movies, though, that's pretty badass.

Reality is, you have no idea how you are affected by VR goggles until you try them. You can force yourself to get used to it, but it's a super jarring experience. You can't convince me to pay $600 for a VR console with a headset that I can't use for longer than 10 minutes before headache city, and by headache city I'm including nausea. That being said, it's the best 3D experience I've ever had, despite the fact that I feel like I'm looking through a fly screen while doing it. Wearing glasses is also a small problem, btw; it's not easy to pass a headset around, as each person must calibrate it to their own eyes, and that takes time as well... anyway, that was an OT rant.

More focus on online processing. It could be a fairly powerful CPU/GPU as the world improves on both battery technology and power requirements, with a running life of maybe about 9 hours without a charge. There will be an improvement to the GPU, but it will be relatively middle of the pack like this generation, though with additional fixed-function co-processors (e.g. the hybrid ray tracer in PowerVR). I believe that the current move to compute shaders will drastically change the way these GPUs are loaded/engineered in the future and how games will render. Having tons of available bandwidth will become a massive requirement, so I believe HBM chip stacking will be present. I don't know if embedded RAM will stick around; maybe it will, maybe it works well with compute, who knows.
 
My prediction is MS and AMD produce a quad-Durango processor in an MCM package with HBM.

You get 32 CPU cores with 4 TFlops of GPU performance and 128 MB of eSRAM.

Quad CrossFire on a chip.

LOL.
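
For what it's worth, the napkin math roughly checks out against Durango's published specs (8 Jaguar cores, 32 MB eSRAM, ~1.31 TFlops); 4x the TFlops actually lands at ~5.24, so the 4 above is the pessimistic rounding:

```python
# Scale Xbox One's (Durango's) published specs by four
cores, esram_mb, tflops = 8, 32, 1.31
print(4 * cores, "cores,", 4 * esram_mb, "MB eSRAM,", round(4 * tflops, 2), "TFlops")
# 32 cores, 128 MB eSRAM, 5.24 TFlops
```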
 
My prediction is MS and AMD produce a quad-Durango processor in an MCM package with HBM.

You get 32 CPU cores with 4 TFlops of GPU performance and 128 MB of eSRAM.

Quad CrossFire on a chip.

LOL.
I know this is a joke, but that would certainly ensure backwards compatibility.
; )
 
I would not underestimate the fact that in 2 years CES and all the tech news sites will be flooded with 8K TVs, and anyone suffering from Small Penis Overcompensation Disease will buy one and complain about sub-8K-resolution games.
So 4K is not the target; it's too low, at least for Xbox and PS.
 
I would not underestimate the fact that in 2 years CES and all the tech news sites will be flooded with 8K TVs, and anyone suffering from Small Penis Overcompensation Disease will buy one and complain about sub-8K-resolution games.
So 4K is not the target; it's too low, at least for Xbox and PS.
Typical fifth-gen target res: ~320x240
Typical sixth-gen target res: ~640x480 (4x pixels of previous)
Typical seventh-gen target res: ~1280x720 (3x pixels of previous)
Typical eighth-gen target res: ~1920x1080 (2.25x pixels of previous)
Suggested ninth-gen target res: ~7680x4320 (16x pixels of previous)

Seems legit.
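
Quick sanity check on those multipliers (Python, using the pixel counts of the resolutions listed above):

```python
# Pixel-count ratio between each typical target resolution and the previous one
res = [(320, 240), (640, 480), (1280, 720), (1920, 1080), (7680, 4320)]
pixels = [w * h for w, h in res]
for (w, h), prev, cur in zip(res[1:], pixels, pixels[1:]):
    print(f"{w}x{h}: {cur / prev:g}x pixels of previous")
# 640x480: 4x, 1280x720: 3x, 1920x1080: 2.25x, 7680x4320: 16x
```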
 
I would not underestimate the fact that in 2 years CES and all the tech news sites will be flooded with 8K TVs, and anyone suffering from Small Penis Overcompensation Disease will buy one and complain about sub-8K-resolution games.
So 4K is not the target; it's too low, at least for Xbox and PS.

The amount of GPU power they'll need for 4K is going to be pretty crazy. Looking at Far Cry 4 on PC, SLI GTX 980s don't even give you a locked 60Hz. I know GPUs will advance a lot in the next five years or so, but I think that might be asking a lot. When would you expect the first single-slot card to be able to run a game like Far Cry 4 at 4K?
Typical fifth-gen target res: ~320x240
Typical sixth-gen target res: ~640x480 (4x pixels of previous)
Typical seventh-gen target res: ~1280x720 (3x pixels of previous)
Typical eighth-gen target res: ~1920x1080 (2.25x pixels of previous)
Suggested ninth-gen target res: ~7680x4320 (16x pixels of previous)

Seems legit.

Something like quad-SLI GTX 980s seems doable :)
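
If you naively assume required GPU throughput scales linearly with pixel count (an upper bound, since geometry and simulation costs don't grow with resolution; purely my own back-of-the-envelope, not a benchmark):

```python
# Naive upper bound: GPU work grows linearly with the number of pixels rendered
base = 1920 * 1080
for name, w, h in [("1440p", 2560, 1440), ("4K", 3840, 2160), ("8K", 7680, 4320)]:
    print(f"{name}: ~{w * h / base:.2f}x the pixel work of 1080p")
# 1440p: ~1.78x, 4K: ~4.00x, 8K: ~16.00x
```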
 
The Samaritan tech demo needed 3 GTX cards in SLI, while now we have Far Cry 4 as a launch title.
Plus, until the final announcement everyone was sure that the PS4 would be built with 2 GB of RAM - 4 GB if we were really, really lucky.
We are talking about 5 or even 6 years in the future, with nth-gen HBM and flux capacitors.
But overall, the SPOD will drive the expectations of what is barely enough.
 
Sad but true :D. PR departments will push for sharper-gen titles or 8K remasters; I can see those interviews already: "How can you be disappointed with those assets in 8K?" :S. Meanwhile, on the other side, PC gamers are scratching their heads over how to conform to the latest trend of dual-sided 8K PC monitors, with hardware companies pushing: "How could you possibly enjoy games on single-sided monitors before, peasant?", "...to sweeten the deal, soon we will even dig out some old mobile VESA standard and launch it as a revolutionary duplex-sync for the right price. It allows you to swing around both sides inside the bezel; nothing will be the same again! (ROPs and VRAM to drive them will come later, sorry guys)". God, this stupid pixel chase, it's tearing me apart :D
 
The Samaritan tech demo needed 3 GTX cards in SLI, while now we have Far Cry 4 as a launch title.
Plus, until the final announcement everyone was sure that the PS4 would be built with 2 GB of RAM - 4 GB if we were really, really lucky.
We are talking about 5 or even 6 years in the future, with nth-gen HBM and flux capacitors.
But overall, the SPOD will drive the expectations of what is barely enough.

They reduced that to just one GPU when they moved from high levels of MSAA to Epic's temporal AA. All their single-GPU demos came after they ditched MSAA [Elemental, Infiltrator].
 
I dunno. I doubt it. 4K was available when the PS4 and X1 launched, and neither MS nor Sony made a fuss about supporting it. Yeah, they kinda didn't want to talk about it (because there's no support, I guess), but that's it. A new revision of either console with HDMI 2.0 might support Netflix and some odd stuff at 4K, but nothing else.

It's the same as when the 360 and PS3 launched, just with less bullshitting. The PS3 supported 1080p from day one, the 360 sort of later (the first model had no HDMI and there was no real full-HD support via component), and both did get a handful of unscaled full-HD games. But it was never really targeted... although Sony claimed their asses off about how the PS3 supported it, etc. In hindsight, it really pisses me off (that I didn't see through the charade).

4K will be the target next gen. And honestly, having bought a 4K set recently (got a good price and was in the market for a TV anyhow), I can't see a real benefit in going beyond that unless we get cheap, huge 8K TVs. And that's really unlikely, as the markup above 55'' is already insane if you go for "this year's model".
 
If they do go for 4K next gen as a standard, then titles that run native will most likely have the graphical complexity of titles we will see later this year, only at a much higher definition. I don't know about other people here, but I don't think that is the smartest use of power. You also have to think about 4K's impact on storage as well as on digital titles. I don't see the home internet companies increasing bandwidth by 10x in 5 years. I also don't see 50-terabyte hard drives being common or affordable either. I think 1080p will be just fine for next gen, and if the consoles have the extra grunt to push many more pixels, it would be better spent on something like 2x supersampling.
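
To put hypothetical numbers on the storage/bandwidth worry (pure illustration; the 50 GB baseline and the assumption that ~80% of a package is resolution-scaled art are made up for the sketch):

```python
# Hypothetical: a 50 GB 1080p-era title whose art assets (assumed ~80% of the
# package) scale with the 4x pixel count of 4K
base_gb = 50
assets_gb = 0.8 * base_gb
scaled_gb = (base_gb - assets_gb) + assets_gb * 4   # ~170 GB per title
for mbps in (25, 100):                              # typical home connections
    hours = scaled_gb * 8 * 1000 / mbps / 3600      # GB -> megabits -> hours
    print(f"{scaled_gb:g} GB at {mbps} Mbps: ~{hours:.1f} h to download")
# 170 GB at 25 Mbps: ~15.1 h; at 100 Mbps: ~3.8 h
```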
 