Next Generation Hardware Speculation with a Technical Spin [2018]

Considering that Microsoft has been working more closely with Nvidia lately on DXR and such, is it possible that they would consider going with Nvidia for their next-generation console?

This last gen, I think one of the reasons Sony was able to dominate was the overall similarity between the two companies' products; if Microsoft went with Nvidia this time, it would really shake things up.
 
And just what CPU can Nvidia offer? Nothing from x86. And with just a different GPU they would have a harder time with BC.
 
Considering that Microsoft has been working more closely with Nvidia lately on DXR and such, is it possible that they would consider going with Nvidia for their next-generation console?

This last gen, I think one of the reasons Sony was able to dominate was the overall similarity between the two companies' products; if Microsoft went with Nvidia this time, it would really shake things up.
The reason Sony has dominated this time is that they didn't screw up (PS is still a go-to brand) while MS did; having the same architecture had nothing to do with it. That is, if the XB1 had been nVidia-based with an ARM or Power CPU, it would still have been overpriced with an unwanted Kinect sensor and a lousy TV TV TV message, and would still have faced the same struggles.

The reason AMD worked so well for the 360 was that it was a superior architecture. What would MS going with nVidia next gen achieve? AFAIK they offer the same game experience, just at lower power draw, by and large. Things like Tensor Cores are in super-expensive systems for specialist applications. And whatever hardware nVidia implements to facilitate DXR, AMD can (and likely will) add something suitable, because they are provided the same DX spec by MS and know what's needed.

nVidia aren't off the table, but I don't think an nVidia-based console has any inherent advantages, while the difficulties in creating an nVidia-based system have been well covered already.
 
Considering that Microsoft has been working more closely with Nvidia lately on DXR and such, is it possible that they would consider going with Nvidia for their next-generation console?

The bigger question is whether NV would be willing to work on a low-margin custom part for console use. Nintendo basically used an off-the-shelf part that was a commercial failure for NV. I don't think MS would be willing to settle for an off-the-shelf part for the next console.

MS working on incorporating ray tracing into DirectX does raise some interesting possibilities for a future console. At the same time, though, by putting it into DirectX, MS makes it possible for AMD to create hardware with assists for ray tracing in DirectX as well.
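As an aside, it's worth being concrete about what a "hardware assist" would actually offload: the hot loop of any ray tracer is acceleration-structure traversal plus ray/primitive intersection. Below is a toy sketch of the standard Möller-Trumbore ray/triangle test, in plain Python, purely illustrative and tied to no vendor's implementation.

```python
# Toy illustration only: Moller-Trumbore ray/triangle intersection, the kind
# of inner-loop math (along with BVH traversal) that dedicated RT hardware
# moves into fixed function. Plain Python, no dependencies.

def ray_triangle(orig, d, v0, v1, v2, eps=1e-8):
    """Return the hit distance t along the ray, or None on a miss."""
    sub   = lambda a, b: (a[0] - b[0], a[1] - b[1], a[2] - b[2])
    dot   = lambda a, b: a[0] * b[0] + a[1] * b[1] + a[2] * b[2]
    cross = lambda a, b: (a[1] * b[2] - a[2] * b[1],
                          a[2] * b[0] - a[0] * b[2],
                          a[0] * b[1] - a[1] * b[0])

    e1, e2 = sub(v1, v0), sub(v2, v0)
    h = cross(d, e2)
    det = dot(e1, h)
    if abs(det) < eps:                  # ray parallel to the triangle plane
        return None
    inv = 1.0 / det
    s = sub(orig, v0)
    u = inv * dot(s, h)
    if u < 0.0 or u > 1.0:              # outside first barycentric bound
        return None
    q = cross(s, e1)
    v = inv * dot(d, q)
    if v < 0.0 or u + v > 1.0:          # outside second barycentric bound
        return None
    t = inv * dot(e2, q)
    return t if t > eps else None       # only count hits in front of the ray

# One ray vs one triangle; a 1080p frame with a few rays per pixel runs
# tests like this (plus traversal steps) hundreds of millions of times.
print(ray_triangle((0, 0, -1), (0, 0, 1),
                   (-1, -1, 0), (1, -1, 0), (0, 1, 0)))  # -> 1.0
```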

So, if MS were going to investigate including some form of ray tracing in a future console, is it more likely that they would
  • Pay for a high-margin custom part from NV, thus guaranteeing that the next MS console would be significantly more expensive than the next Sony console?
  • Work with AMD on a custom part capable of assisting with DX ray tracing, in order to get a low-margin custom part that keeps them price-competitive with Sony?
  • ...and the long shot: work with Intel on a custom part using something from their new initiative WRT GPU development? Intel doesn't like low-margin parts either, but a console win would be a potential marketing showcase for their GPU efforts.
Regards,
SB
 
Incidentally, PVR have actual ray-tracing hardware and not just assists on $3000 GPUs...

With no idea what the RT assists are or what they'd cost, I'm not sure anyone should be looking for RT in the next consoles. It sounds great, but it's probably two generations off before we get something like the Star Wars demo in games. Real-time RT (RT²) will factor first in the media and visualisation industries at a premium, before being cost-reduced and optimised down to consumer gaming, if conventional business and economics are to be followed.
 
The reason Sony has dominated this time is that they didn't screw up (PS is still a go-to brand) while MS did; having the same architecture had nothing to do with it.
And more (and better-selling) exclusives.
It's not like E3 2013's presentations were everything that made Sony sell 2x more consoles, even in 2018.
 
Incidentally, PVR have actual ray-tracing hardware and not just assists on $3000 GPUs...

With no idea what the RT assists are or what they'd cost, I'm not sure anyone should be looking for RT in the next consoles. It sounds great, but it's probably two generations off before we get something like the Star Wars demo in games. Real-time RT (RT²) will factor first in the media and visualisation industries at a premium, before being cost-reduced and optimised down to consumer gaming, if conventional business and economics are to be followed.

Yes. A lot can happen in the graphics industry in 2-3 years' time, but even if RT becomes as common in GPUs as, say, tessellation, you'd still need to get the industry to transition to it.

How many developers would be ready to use RT? Would hardware that is fast at assisting with RT still be fast at traditional rendering? I.e., would an RT-capable GPU still show generational growth in performance, or would implementing RT come at the expense of growth in traditional rendering speed? That's especially important for the consumer market, which can't support something like a V100.

Regards,
SB
 
And just what CPU can Nvidia offer? Nothing from x86. And with just a different GPU they would have a harder time with BC.

I would do an Nvidia GPU with EMIB to an Intel CPU. That's the only solution that makes sense to me if Nvidia is providing the GPU.

I would be less worried about BC. They've managed to make every generation BC with the previous one, the last two transitions being x86 to PPC and back. Two x86 in a row? Piece of cake!

Possibly cements 2020 release window?
The Micron slides are more telling.

[Micron slide: forecast GB of GDDR5 vs GDDR5X/6 shipped, by market segment, through 2020]
 
The reason Sony has dominated this time is that they didn't screw up (PS is still a go-to brand) while MS did; having the same architecture had nothing to do with it. That is, if the XB1 had been nVidia-based with an ARM or Power CPU, it would still have been overpriced with an unwanted Kinect sensor and a lousy TV TV TV message, and would still have faced the same struggles.

Showing the USP (unique selling point) of your device (e.g. the Switch's hybrid design, or the PS4's indie-friendly specs) isn't a wrong move, nor does it dictate what price consumers will pay for it. The proof is the iPhone X, starting at $1K, flying off shelves because of its Kinect-rooted USP, or Echos/Minis also flying off shelves because of their voice capabilities. And looking at where devices are now (e.g. the aforementioned devices) compared to Xbox One's original intent, and the question likely asked of Microsoft engineers back then, "where do you see advances going in the next 5 years", they actually nailed it.

What they didn't nail was the noise. For example, the Snowden factor has all but subsided, with all the privacy-violating devices, again, flying off shelves regardless of price. That major factor alone is always ignored when dismissing Microsoft with the same "Kinect, tvtvtv" social media... feedback.

Regardless of specs or price, the next device has to avoid leaks so that it can stand on its own merits, not have early strategic snippets of it bandied about in the worst possible light because nobody wants to, or tries to, see the whole picture when it comes to Microsoft. And no, it's not some nefarious "taking away my consumer rights, cry me a river" crap.
 
The Micron slides are more telling.

[Micron slide: forecast GB of GDDR5 vs GDDR5X/6 shipped, by market segment, through 2020]

Interesting. What console out now, or coming out this year, has GDDR5X/6? That would indicate that by the end of 2018 there will be a sizable number of consoles with GDDR5X/6.

Regards,
SB
 
I read the graph as "through to the end of 2020", so the end of 2018/beginning of 2019 is where GDDR5X/6 starts to ramp up (there seem to be five piece-wise linear segments).

edit: or rather, each numbered year = end of the year

edit: don't know what the presenter was saying during the slide, though.
 
Showing the USP (unique selling point) of your device (e.g. the Switch's hybrid design, or the PS4's indie-friendly specs) isn't a wrong move, nor does it dictate what price consumers will pay for it.
I'm confused. You're saying MS did nothing wrong? And the reason they're not matching PS4 sales is...they don't have nVidia hardware?? o_O
 
For early adopters, sure, it was more expensive and shackled to the unwanted Kinect, but later in its life it was about the same price as the PS4, I think, at least for the last couple of years or so.

There aren't many compelling exclusives for it either, and if you're looking to play with your friends you'll probably buy the same console they have, which is likely to be the PS4. Developers are also probably going to want to build their games for the console that won, and so you have a tricky situation for Microsoft.

Would it have been better had they partnered with Nvidia this generation? Probably not; it would have come in hotter and more expensive based on their technology at the time. I think now, though, is a different story, and it might be something they would consider.
 
I read the graph as "through to the end of 2020", so the end of 2018/beginning of 2019 is where GDDR5X/6 starts to ramp up (there seem to be four piece-wise linear segmentations).

edit: or rather, each numbered year = end of the year

edit: don't know what the presenter was saying during the slide, though.

I read it as five linear segments: end of 2018, nothing in console; end of 2019, something in console; straight line between. Interesting that they seem to be measuring in GB rather than discrete GDDR(5/X/6) chips. One way to make the graph take off, I guess.

My thoughts on Micron's expectations based on the console graph:

- Limited growth of GDDR6 by the end of 2019 may indicate a shrink of one or more of the current systems, with GDDR5 still dwarfing GDDR6.

- The 2018 jump in GDDR5 approaches the magnitude of the 2019 drop in GDDR5 and bump in GDDR6. Total memory quantity increases only slightly. A possible switch in memory type for the X1X (yeah, I know, a bit of a stretch!)?

- By end of 2020 GDDR5 is in massive decline, fast enough to indicate some current systems switching from GDDR5 to GDDR6.

- Massive increase in GB shipped by end of 2020 indicates systems with more than 8 GB of RAM making up a significant proportion of console sales (a back-of-envelope sketch follows the TL;DR below). A possible shrunk X1X couldn't do this on its own, with its limited sales and 12 GB. Potential 16+ GB PS5 expected by Micron by end of 2020??

TL;DR: Based on the graph I would propose:

- At least one console shrink with GDDR6 appearing in 2019.
- PS5 landing before end of 2020. At least 16 GB of GDDR6. Possibly 24.
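
To make the GB-shipped reasoning concrete, here's a back-of-envelope sketch in Python. Every unit count below is hypothetical, invented purely for illustration, since the slide gives no absolute figures:

```python
# Back-of-envelope only: all unit counts here are hypothetical, NOT taken
# from the Micron slide. Point: GB shipped can climb even while units
# shipped shrink, once a minority of 16 GB machines enters the mix.
scenarios = {
    "mostly 8 GB consoles":      [(60e6, 8), (6e6, 12)],
    "16 GB next-gen in the mix": [(40e6, 8), (4e6, 12), (15e6, 16)],
}
for label, mix in scenarios.items():        # mix = (units, GB per unit)
    units = sum(u for u, _ in mix)
    gb    = sum(u * g for u, g in mix)
    print(f"{label}: {units / 1e6:.0f}M units -> {gb / 1e6:.0f}M GB")
# mostly 8 GB consoles:      66M units -> 552M GB
# 16 GB next-gen in the mix: 59M units -> 608M GB
```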
 
TL;DR: Based on the graph I would propose:

- At least one console shrink with GDDR6 appearing in 2019.
- PS5 landing before end of 2020. At least 16 GB of GDDR6. Possibly 24.

Agree with your conclusions. The question is: what is the benefit of a console shrink with GDDR6? Is saving power worth the extra cost? Will they help out their supplier by taking the really low-binned modules?
 
Would it have been better had they partnered with Nvidia this generation? Probably not; it would have come in hotter and more expensive based on their technology at the time. I think now, though, is a different story, and it might be something they would consider.
Why? What does nVidia bring?
 
I would do an Nvidia GPU with EMIB to an Intel CPU.
EMIB is a very wide bus that Intel uses to connect HBM2 to the Vega M in Kaby Lake G.
For GPU-CPU communication they use a much simpler and comparatively anemic PCIe x8 bus.
If console makers went back to a discrete GPU and CPU over an SoC (they probably won't), I doubt they would use EMIB for that bus.
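
Rough per-direction numbers to show the scale of that gap, assuming PCIe 3.0 x8 and Kaby Lake G's published HBM2 configuration (one 1024-bit stack at 1.6 Gbps/pin); a sketch, not measured figures:

```python
# Back-of-envelope, peak theoretical, per direction. Assumes PCIe 3.0 x8
# and Kaby Lake G's HBM2 config (1024-bit stack at 1.6 Gbps per pin).
pcie3_lane = 8e9 * (128 / 130) / 8     # 8 GT/s, 128b/130b encoding -> bytes/s
pcie_x8    = 8 * pcie3_lane            # CPU <-> GPU link in Kaby Lake G
hbm2_emib  = 1024 * 1.6e9 / 8          # GPU <-> HBM2, routed over EMIB

print(f"PCIe 3.0 x8 : {pcie_x8 / 1e9:6.1f} GB/s")    # ~  7.9 GB/s
print(f"HBM2 (EMIB) : {hbm2_emib / 1e9:6.1f} GB/s")  # ~204.8 GB/s
```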

The Micron slides are more telling.
Micron seems awfully optimistic in those slides if they think HBM (within "others") will lose half its market share by 2020.
SK Hynix and Samsung might have a different opinion on that.
An image search shows me those slides are from early 2017. Perhaps they knew they had bet on the wrong horse (HMC) when they made those slides and were trying to trick investors with those forecasts.

The proof is the iPhone X, starting at $1K, flying off shelves because of its Kinect-rooted USP
The iPhone's average selling price has been steadily increasing over the years, with Apple keeping the 1-year-old and 2-year-old models in production to offer lower-cost alternatives. The first model launched for $500, and its closest competition was the ~$730 Nokia N95 with arguably lower specs.
If the first iPhone had launched at $800, no one would have touched it with a ten-foot pole. Believe it or not, the iPhone started at a pretty aggressive price, before gaining brand recognition.
Like the first couple of iPhone generations, Microsoft was in no position to ask 25% more than the competition, considering they had the lower-performing console.

Agree with your conclusions. The question is: what is the benefit of a console shrink with GDDR6? Is saving power worth the extra cost? Will they help out their supplier by taking the really low-binned modules?
As pointed out yesterday, back in 2013 Sony actually switched the RSX's memory pool from GDDR3 to GDDR5 and halved the bus to 64-bit in the process.
By transitioning to 4× 16 Gbit GDDR6 chips, the PS4's and PS4 Pro's SoC+RAM could look like this:

[Mock-up image: PS4/PS4 Pro SoC flanked by four GDDR6 packages]


They could make a tiny motherboard out of it.
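
For scale, a quick sanity check of that 4-chip layout, assuming standard 32-bit-wide GDDR chips and taking 14 Gbps as a typical early GDDR6 bin (an assumption, nothing announced):

```python
# Peak bandwidth = bus width (bits) * per-pin data rate (Gbps) / 8 -> GB/s.
def bw(bus_bits, gbps):
    return bus_bits * gbps / 8

print(bw(256, 5.5))   # launch PS4: 256-bit GDDR5 @ 5.5 Gbps -> 176.0 GB/s
print(bw(256, 6.8))   # PS4 Pro:    256-bit GDDR5 @ 6.8 Gbps -> 217.6 GB/s
print(bw(128, 14.0))  # 4x 32-bit 16 Gbit GDDR6 @ 14 Gbps -> 224.0 GB/s, 8 GB
```

Even on a halved 128-bit bus, that speed bin would match or beat both current machines' bandwidth while keeping the full 8 GB.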
 
As pointed out yesterday, back in 2013 Sony actually switched the RSX's memory pool from GDDR3 to GDDR5 and halved the bus to 64-bit in the process.

Good point on the PS3 change.
 
I read the graph as "through to the end of 2020", so the end of 2018/beginning of 2019 is where GDDR5X/6 starts to ramp up (there seem to be five piece-wise linear segments).

edit: or rather, each numbered year = end of the year

edit: don't know what the presenter was saying during the slide, though.

I read it as five linear segments: end of 2018, nothing in console; end of 2019, something in console; straight line between. Interesting that they seem to be measuring in GB rather than discrete GDDR(5/X/6) chips. One way to make the graph take off, I guess.

That doesn't make sense if you look at the PC graph. The 2016 notch on that graph matches up with Nvidia launching GDDR5X-equipped GPUs (May 2016). So each year marks the start of that year, not the end.

I believe those were also the first GDDR5X products on the market (the standard was ratified in Jan. 2016).

[edit] Then again, if you zoom way in on the graph, it looks like the GDDR5X/6 section may extend into the first segment. So, not so sure now.

Regards,
SB
 