Predict: Next gen console tech (9th iteration and 10th iteration edition) [2014 - 2017]

Status
Not open for further replies.
"low-cost" needs a comparison to conventional DRAM setups.

I don't have any numbers, but everything Samsung has been saying about its upcoming low-cost HBM solution is that it's intended to alleviate the pricing issues that kept HBM2 out of all but the highest-end GPU parts. I believe Samsung wants lcHBM positioned to be accessible to high- to mid-range desktop GPUs/APUs and mainstream notebook products.

It seems as though that's their intended goal. Whether the realities play out that way, however, is anyone's guess.

If I were a betting man, as much as I'd hope low-cost HBM can be priced low enough and be available in high enough quantities to be used in PS5, I'd actually put money on PS5 going with another traditional 16-chip GDDR6 solution.
 
Something I've said before, but thinking about all this again makes it even clearer to me: these mid-gen machines have made it much harder to build a compelling next gen by 2020-21 (earlier for some).
Especially the 1X, with its extra memory for assets.

With no mid gen, they could've released a console in 2019 with 8 TF, Zen, a 2-3 TB HDD, and 16 GB of fast memory. It would've easily looked much better than the XO and PS4 at 4K.

Not saying I'm against mid gens, though; just that they've made it a lot harder for the next gen to be compelling.

This is all the more reason devs are prohibited from making mid-gen exclusive games, imho.

I think the value of a hypothetical 8-12TFLOPs PS5 APU will become readily apparent when we start seeing games built from the ground up to specifically take advantage of all those extra FLOPs and a nice fat AMD Zen-based CPU. So in many ways I disagree with your point.
 
Gameplay could change a lot, especially with a more performant CPU.

If a game were built from the ground up for the mid-gen consoles, I'm not sure how much visual difference there would be compared to what they're producing now, even though that has to run on the base hardware.

On gameplay I agree with you; graphically, not so much.
 
I expect something like an AMD Vega with 4/8 GB HBM plus an 8-core AMD Zen with 8/16 GB DDR4.
I expect optical storage, and unfortunately an HDD.
(With games reaching 50 GB each, 10 could be installed simultaneously on a 500 GB SSD, which I find reasonable enough.)
 
In 4 years we will have moved from 8 GB to 12 GB of RAM on consoles, which is a 50% gain.
And we only did that because, for the first time, companies decided to offer optional mid-gen upgrades. That means nothing in the grand scheme of [generational upgrade] things.

When the time is right, the console makers will pick what best suits them and what will best suit developers' needs.
Epic "forced" Xbox executives to go from 256MB to 512MB of unified RAM in the X360, while Randy Pitchford was instrumental in pushing Sony to go from 4GB to 8GB of RAM in the PS4.
 
If AMD's caching technology comes through, I would expect a highly cost-optimized memory subsystem:
4 GB lcHBM/HBM2/HBM3, as cache for the DDR4
16 GB DDR4
128 GB flash, as cache for the HDD
1-2 TB HDD
Optional optical drive.

I expect around $100 for the RAM (HBM + DDR4) and $100 for the permanent storage (flash + HDD + optical). We're currently in a supply shortage for RAM and flash, with correspondingly high prices, but that will change.
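For illustration, here's one hypothetical way those two ~$100 budgets could break down. Every per-component price below is a made-up placeholder, not a real quote:

```python
# Hypothetical bill-of-materials split matching the ~$100 RAM + ~$100 storage
# budgets above. All component prices are invented for illustration only.

ram_bom = {"4GB HBM stack": 60, "16GB DDR4": 40}                  # USD, assumed
storage_bom = {"128GB flash": 40, "1TB HDD": 40, "optical drive": 20}

print("RAM subtotal:", sum(ram_bom.values()))          # 100
print("Storage subtotal:", sum(storage_bom.values()))  # 100
```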

Cheers
 
I'm not sure only 4GB of HBM makes sense. I think most of the cost of HBM is due to the stacking and not the memory die itself.

To justify HBM over a GDDR solution, you need to be shooting for bandwidth in excess of 768 GB/s, and a GDDR6 solution on a 384-bit bus can achieve that.

A 384-bit bus with twelve 2 GB (16 Gbit) GDDR6 chips would give a bandwidth of 768 GB/s with 24 GB of RAM, which would be sufficient and most likely way cheaper than an HBM solution or some hybrid.
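As a sanity check on that arithmetic, here's a quick sketch. The 16 Gbps per-pin data rate is an assumed figure for early GDDR6 parts, and the function names are just illustrative:

```python
# Rough GDDR6 bandwidth/capacity math for the 384-bit configuration above.
# 16 Gbps per pin is an assumption (a typical early-GDDR6 speed grade).

def gddr6_bandwidth_gbs(bus_width_bits: int, pin_rate_gbps: float) -> float:
    """Aggregate bandwidth in GB/s: pins * per-pin rate / 8 bits per byte."""
    return bus_width_bits * pin_rate_gbps / 8

def capacity_gb(chip_count: int, chip_density_gb: int) -> int:
    """Total capacity from identical chips."""
    return chip_count * chip_density_gb

bw = gddr6_bandwidth_gbs(384, 16.0)   # 384 data pins at 16 Gbps each
cap = capacity_gb(12, 2)              # twelve 16 Gbit (2 GB) chips
print(f"{bw:.0f} GB/s, {cap} GB")     # 768 GB/s, 24 GB
```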
 
Loading data straight from its source to memory is ideal.
What would be the point of an extra memory pool?
Copying my resources twice [disk -> slow RAM -> usable RAM]?
Introducing more latency [when loading from disk]?
Using it as a disk cache? [In that case I think an SSD/HDD hybrid would be better than putting an extra burden on the devs' shoulders...]
Yes, but currently (on the PS4) users are able to use an external HDD (slow, fast, whatever) that replaces the internal one. What if users replace your hypothetical fast internal SSD with a slow external HDD?

A fast internal SSD/HDD (used as reliable low-latency storage by developers) would forbid the use of external HDDs to store games. Or they would need to put two drives in the console, a mandatory fast one and an optional slow one -> costly for the manufacturer and more work for the developers.
 
So, just a fast internal NVMe M.2 SSD, removable so the customer can upgrade it.
(We already delete our old games to make room for new ones on PC, so why not on console? [Davros doesn't count.])
 
Gameplay could change a lot, especially with a more performant CPU.

If a game were built from the ground up for the mid-gen consoles, I'm not sure how much visual difference there would be compared to what they're producing now, even though that has to run on the base hardware.

On gameplay I agree with you; graphically, not so much.

"If" being the operative word here. Games aren't and are unlikely to ever be built from the ground up for the Pro/XB1X, so the point is moot. Games on those mid-gen boxes will forever be condemned to being uprezzed ports of XB1 games with mostly the same asset fidelity.

Developers simply aren't looking to employ more computationally expensive lighting systems, for example, on these boxes. They will, however, on PS5/XBNext. So I'm pretty confident that there'll be a clear difference in both the visual and graphical quality of games between reasonably priced next-gen systems in 2019/20 and current-gen systems. Certainly, "enough" to make them worth a purchase in the eyes of consumers.
 
I'm not sure only 4GB of HBM makes sense. I think most of the cost of HBM is due to the stacking and not the memory die itself.

To justify HBM over a GDDR solution, you need to be shooting for bandwidth in excess of 768 GB/s, and a GDDR6 solution on a 384-bit bus can achieve that.

A 384-bit bus with twelve 2 GB (16 Gbit) GDDR6 chips would give a bandwidth of 768 GB/s with 24 GB of RAM, which would be sufficient and most likely way cheaper than an HBM solution or some hybrid.

Don't stack it, and do away with the logic die as well. Four 8 Gbit dies on an interposer would add less than $10 in interposer cost ($600 per 300mm wafer). You'll have close to 1 TB/s of bandwidth at lower power consumption and more freedom in deciding your final memory size.
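The back-of-envelope version of that interposer figure, for anyone curious. The ~800 mm² interposer footprint (GPU plus four HBM dies) is an assumption, and yield loss is ignored:

```python
import math

# Interposer cost estimate from the $600-per-300mm-wafer figure above.
# The 800 mm^2 interposer footprint is a hypothetical size, not a known spec.

WAFER_COST_USD = 600.0
WAFER_DIAMETER_MM = 300.0
INTERPOSER_AREA_MM2 = 800.0   # assumed: GPU die plus four memory dies

wafer_area = math.pi * (WAFER_DIAMETER_MM / 2) ** 2   # ~70,686 mm^2
cost_per_mm2 = WAFER_COST_USD / wafer_area
interposer_cost = cost_per_mm2 * INTERPOSER_AREA_MM2
print(f"~${interposer_cost:.2f} per interposer")      # under $10, ignoring yield
```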

We'll have to see how good AMD's HBCC is.

Cheers
 
In the context of mixed-precision shading, it'll be interesting to watch the developers who are beginning to experiment with the 4Pro now, as they ought to be able to hit the ground running come the next-gen launch.
 
Don't stack it, and do away with the logic die as well. Four 8 Gbit dies on an interposer would add less than $10 in interposer cost ($600 per 300mm wafer). You'll have close to 1 TB/s of bandwidth at lower power consumption and more freedom in deciding your final memory size.

We'll have to see how good AMD's HBCC is.

Cheers

Are you suggesting a separate package altogether?
 
A fast internal SSD/HDD (used as a reliable low latency storage by developers) would forbid the use of external HDDs to store games. Or they would need to add 2 HDDs in the console: a mandatory fast and an optional slow -> costly for the manufacturer and more work for the developers.

Then this is one of the reasons an SSD/HDD hybrid is better when the SSD and HDD are separate devices.

Seagate's "SSHD" drives, where both are inside an HDD's casing, are perhaps the best-known solution; it's transparent to the OS and user and is also easy for OEMs (a laptop or iMac can use an HDD, SSHD or SSD). It's a drop-in solution which doesn't really require anything in terms of software, motherboard firmware or OS configuration.
But using a separate SSD as a cache has been possible for many years, even on consumer hardware: Intel's Smart Response Technology on Z68 or Z77, and the ZFS file system on Solaris and FreeBSD, and to a lesser extent Linux.

Having an M.2 PCIe slot on the motherboard would make it easiest: a base option of 64 or 128 GB, a high-end option of 512 GB (which basically adds the ability to store whole games, or most of their contents), or the ability to put in a 2 TB drive or bigger.
You could also sell a base model with no caching SSD present whatsoever. Most people are probably interested in the $399 version with a price drop to $299 after a couple of years, not in something $50 or $100 more. But then developers have to make the game play at least decently without an SSD.

Another option is some kind of ReRAM, 3D XPoint, etc. soldered onto the motherboard of each console; no idea if that can get cheap enough even for a small amount. At least it should be more reliable. If you put in a small amount of flash, like 32 GB, would it be slow and die from write exhaustion? Or would its write speed at least keep up with the HDD's read bandwidth, if you use good-quality flash?
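On the wear-out worry, a rough endurance calculation. The 1,000 P/E cycle figure is an assumed TLC-class endurance and the 150 MB/s write rate approximates an HDD's sequential speed; real parts and workloads vary a lot:

```python
# Worst-case endurance math for a small flash cache: how long would 32 GB
# last if rewritten continuously at HDD-like speeds?
# 1,000 P/E cycles is an assumed TLC endurance figure, not a real part's spec.

CACHE_BYTES = 32e9
PE_CYCLES = 1_000            # assumed program/erase endurance per cell
WRITE_RATE = 150e6           # bytes/s, roughly an HDD's sequential rate

total_writable = CACHE_BYTES * PE_CYCLES   # total bytes before wear-out
seconds = total_writable / WRITE_RATE
print(f"~{seconds / 86_400:.1f} days of nonstop writes")
```

Real workloads write intermittently, so the duty cycle and the wear-leveling quality matter far more than this worst case, but it shows why a tiny cache hammered at full speed is a concern.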
 
In the context of mixed-precision shading, it'll be interesting to watch the developers who are beginning to experiment with the 4Pro now, as they ought to be able to hit the ground running come the next-gen launch.
This is pertinent, particularly the last post. If I read the posts (from blu) correctly, mixed precision shading (done right) basically combines the performance of FP16 with the quality of FP32, at least in this particular test.

http://www.neogaf.com/forum/showthread.php?t=1361283&page=2
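The "FP16 performance with FP32 quality" idea can be illustrated with a tiny numerics experiment: pure-FP16 accumulation stalls once the running sum's representable spacing exceeds the increment, while keeping an FP32 accumulator over FP16 inputs stays accurate. This is just a NumPy sketch of the precision trade-off, not a GPU shader benchmark:

```python
import numpy as np

# Summing 10,000 small fp16 values two ways: entirely in fp16, and with
# fp16 inputs but an fp32 accumulator (the "mixed precision" idea).

increment = np.float16(1e-4)

fp16_sum = np.float16(0.0)
fp32_sum = np.float32(0.0)
for _ in range(10_000):
    fp16_sum = np.float16(fp16_sum + increment)  # fp16 storage and math
    fp32_sum += np.float32(increment)            # fp16 data, fp32 accumulate

print(float(fp16_sum))  # stalls well below the true value of ~1.0
print(float(fp32_sum))  # close to 1.0
```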
 
"If" being the operative word here. Games aren't and are unlikely to ever be built from the ground up for the Pro/XB1X, so the point is moot. Games on those mid-gen boxes will forever be condemned to being uprezzed ports of XB1 games with mostly the same asset fidelity.

Developers simply aren't looking to employ more computationally expensive lighting systems, for example, on these boxes. They will, however, on PS5/XBNext. So I'm pretty confident that there'll be a clear difference in both the visual and graphical quality of games between reasonably priced next-gen systems in 2019/20 and current-gen systems. Certainly, "enough" to make them worth a purchase in the eyes of consumers.
I don't think "if" is the operative word here, though.
I think that even if devs didn't have to scale to the base models, they would still be aiming for the same resolutions as they currently are on the mid-gens.
In terms of tech, there isn't much difference between the base models and the mid-gens apart from more of the same.
So the engines wouldn't be much different, especially when aiming at the higher resolutions.
The 4Pro obviously has the ID buffer, but that is already being made use of.
So I still stand by this: graphically, the games wouldn't be much different if built from the ground up for the mid-gens.

Not to mention that most engines are cross-platform now, and I don't see that changing even when we move into the next gen.
 
This is pertinent, particularly the last post. If I read the posts (from blu) correctly, mixed precision shading (done right) basically combines the performance of FP16 with the quality of FP32, at least in this particular test.

http://www.neogaf.com/forum/showthread.php?t=1361283&page=2

Well, it's a neat isolated case with a rather different architecture under test (I'm not sure how Mali compares to GCN), so I'd be wary of making blanket performance comparisons. It's certainly promising, though. There's a lot that goes on in a modern renderer! (The bottlenecks differ depending on what the devs want - 1 million troops!)

What I was getting at is more that developers experimenting now on the one platform may lead to a better showcase at the next-gen launch (development time, older vs. newer renderer), once everyone can target that level of architecture, even if it turns out to be a seemingly mediocre leap in raw fp32 flops over the mid-gen, and even if it's just for certain things: "go crazy with x-effect!", "Kratos can use his hair to attack!" (not that games should be tech demos first vs. games, but I digress).
 
What are the chances of XPoint making it into next-gen consoles? XPoint is supposedly cheaper than DRAM, so you could get 128 GB of XPoint instead of 32 GB of DDR4, or something like that. Then you could have 8 GB of HBM caching for 128 GB of XPoint, and then a regular HDD for mass storage.
 