Next Generation Hardware Speculation with a Technical Spin [2018]

Sky's the limit, I guess, if you want to look at it that way.
Something to consider is that the RTX 2070 is actually slower than a 1080 Ti, and the 1080 Ti has 11.5 TF of power.
Why this is an important consideration is that Nvidia has dedicated silicon for RT, and it only exists in the RTX line, which is 2070 and up.

So in theory it's possible that the PS5 (with something greater than 11.5 TF) could come out with its eyes set only on being a solid 4K machine, but that's all it would be able to do, and it's clear that Nvidia was willing to give that crown up once it started walking down the RT path.

Not entirely sure if having more FLOPs is the answer here, is all I'm saying.

I'm definitely leaning heavily towards next gen having some RT capabilities, at the very least, I expect this for Xbox Scarlett.
RT on a next gen console is not viable in my opinion; there's just too much to do if we're still prioritizing a traditional graphical leap. You will leave no room for RT once you hit a decent res and a suitable set of next gen effects, not to mention the 60fps some are hoping to mainstream. If Scarlett incorporates RT then it's well and truly silicon wasted; I doubt most people, or even early adopters, would down-res to 1080p just to have it on. You just haven't seen the amount of batshit crazy effects afforded by traditional rendering on a 12 TF machine yet; once the trailers start to roll out you might very well think otherwise ;).
Maybe PS5 Pro or PS6 would give us a more RT friendly environment.
Of course, the usual grain of salt for WCCFTech rumors.

Vega 64 achieves 12.665 TF in 486 mm^2, giving a TF/mm^2 of 0.026.

These numbers would be a TF/mm^2 of 0.062, which is a growth of 2.39x.

TSMC has a shrinking factor of 70% for 16nm->7nm. Keep in mind this is likely 16FF+ to 7nm SoC, not 7nm HPC. 2.39x is a roughly 58% area reduction, iso clock, so these numbers seem plausible to me.

Say a next gen APU has 200mm^2 available for graphics and memory controller. That gives 12.44TF, assuming they can hit the same clock. This also assumes Navi has a similar density to Vega. GloFo 14nm and TSMC 16FF+ are pretty close from what I remember.
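
For anyone who wants to follow the arithmetic, here's a minimal Python sketch of that back-of-the-envelope math. The 0.062 TF/mm^2 density and the 200 mm^2 graphics budget are the rumoured/assumed inputs from the posts above, not confirmed figures:

```python
# Back-of-the-envelope density math from the posts above.
vega64_tf = 12.665           # FP32 TFLOPS of a Vega 64
vega64_area_mm2 = 486        # Vega 10 die size in mm^2

vega_density = vega64_tf / vega64_area_mm2        # ~0.026 TF/mm^2

rumoured_density = 0.062     # assumed 7nm TF/mm^2 implied by the rumour
scaling = rumoured_density / vega_density         # ~2.39x
area_reduction = 1 - 1 / scaling                  # ~58% smaller, iso clock

gpu_budget_mm2 = 200         # assumed area for GPU + memory controller
projected_tf = gpu_budget_mm2 * rumoured_density  # ~12.4 TF

print(f"Vega 64 density:      {vega_density:.3f} TF/mm^2")
print(f"Density scaling:      {scaling:.2f}x (~{area_reduction:.0%} area reduction)")
print(f"Projected at {gpu_budget_mm2} mm^2: {projected_tf:.1f} TF")
```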
Seems like a 12-14tf console is very doable then without breaking the bank. Hope this rumor turns out true.
Weren't there rumors of PS5 SDKs containing 20TF GPUs?
Yeah, that was a while ago, like the link below; we used to think it was an Xfire config, but it could well be that 7nm machine.
 
Why would you down-res to get RT? There's no reason to keep the rasterization and RT resolution the same. The shadows or reflections that RT is being used for ATM are traditionally quarter-res on consoles anyway, I think.

Of course, we are assuming AMD needs special units for this; Nvidia may have other objectives or reasons for their implementation. What's to stop it being far more compute with memory or data-type assistance? Then it's part of the compute package and not a waste of area if it goes unused.

I doubt there is only one way to achieve the same results; performance may be lower, but the choice of how to use the hardware would be far higher.

On Nvidia's cards, it seems they have many new rasterization features which may have taken die space and which existing software may not use. Old games may be slower, but will that be the case when those features are used? This would be a key point for consoles, where all the efficiency gains from the last 5 years can be fully exploited. The Xbox One X still keeps very close to the One, so its rasterization is probably behind what could be possible now with the same 6 TF figure.

Random rambling which is hopefully not totally incorrect.

I am keen to see what BFV looks like on an RTX card vs a 1080, image quality wise, and see whether RT delivers a jump regardless of die cost, and what more flops would deliver.
 
RT on a next gen console is not viable in my opinion; there's just too much to do if we're still prioritizing a traditional graphical leap. [...] Maybe PS5 Pro or PS6 would give us a more RT friendly environment.

Pretty much my thinking. Although I think we'll see the occasional experiment from indies - naturally, running slower than it does on PCs with RTX hardware.

Ray tracing, when sufficiently developed in a few years' time, will provide a very obvious, very visible upgrade over traditional rendering. Because of that, I think it would make more sense for ray tracing to be deployed in a PS6 rather than a PS5Pro.

Seems like a 12-14tf console is very doable then without breaking the bank. Hope this rumor turns out true.

Agreed. But if a 12-14TF console is possible without breaking the bank, an 8-10TF console should be feasible relatively cheaply, and a 16-20TF console should be possible for the Pro and X1X crowds.

With both Sony and Microsoft having streaming solutions this generation, I think there's a case to be made for them wanting to render and output as many simultaneous streams as possible, from the smallest amount of hardware, and using hardware that can be part of the standard supply chain.

So why not make that hardware available to the public too? Launch with a Pro model, with exactly double the amount of rendering hardware - clocked higher than the base model, in order to ensure a more stable framerate - and use that model as the equivalent of server blades.

A home user can run two separate games, two instances of the same game (ensuring splitscreen multiplayer for any multiplayer game), or a single game utilising the resources of two base consoles.

But as awesome as I think that idea is, I do wonder what would be the best way to implement it? Would a further iteration of the PS4Pro be best: butterfly design applied to the entire APU? Or would separate dies be best, letting Infinity Fabric glue together whichever configuration?
 
So why not make that hardware available to the public too? Launch with a Pro model, with exactly double the amount of rendering hardware - clocked higher than the base model, in order to ensure a more stable framerate - and use that model as the equivalent of server blades. [...]
Just so I have a clear idea where you're coming from, could you list out the SKUs/devices with what you think the price points would be?
Seems way out there to me at the moment.
 
Ray tracing on next gen might be like tessellation on 360. It could be possible, but limited. It could also be like tessellation on current AMD hardware, functional but slow. I think it's realistic to expect a certain level of RT capabilities in hardware in the next 2 years from AMD, and that those features will make their way to console.
 
I assume devices naturally means PSVR2. I wouldn’t get too excited beyond that as an additional device.

Potentially...

My thinking is that it's in their interest to provide people with the widest range of ways to play on PlayStation.

-- Portable and VR --

There have been rumblings of a dedicated PlayStation tablet. Given that LocoRoco is an emulated version of its PSP counterpart, we can safely say that Sony could release a tablet, and its games could also play on the PS4/Pro/5.

Oculus are releasing the standalone Quest early next year, and smartphone VR solutions are still the most popular VR platform.

So why not kill two birds with, well, three stones?

Release a 6" PlayStation tablet, with a 16:9 1440p 120Hz screen, and an SoC that's at least up to par with the Quest. Include dedicated hardware for low latency video streaming, in order to make it perfect for both Remote Play and wireless, tethered VR. Include a split DualShock - the PS5's standard controller - but also provide compatibility with the DualShock 4.

Separately, release a VR headset for the tablet to slot into. This contains the cameras and lights for tracking, as well as its own battery, which will power both the headset and the tablet.

Lastly, release the PS5's camera, with a PS4 adapter. The camera can transmit images for the VR headset, and is also used in conjunction with the headset's sensors to determine the headset's position.

Any PST game will run on the PS4/Pro/5, meaning any PSTVR game will run on the launch PSVR. So they can tackle the portable market at the same time as the VR market, whilst utilising their existing install bases.

-- Micro Console --

7nm Navi and Zen2 should be able to emulate the PS4 in a tiny device. Use an external power brick, in order to free the device up for portable shells, in-car use, etc. ~128GB of flash instead of an HDD, no ODD, one USB Type-C port at the front for charging a controller, one USB Type-C port at the back for external storage, an HDMI 2.1 port, and a port for the PS5 camera.

I've already covered the home consoles: phase out the Pro, and launch with two tiers.
 
You just haven't seen the amount of batshit crazy effects afforded by traditional rendering on a 12 tf machine yet
I can imagine. Watch Dogs 2 @ 4K Ultra barely creeps beyond 30 fps on a 1080 Ti, which is more or less the target goal for next gen consoles as per your commentary.
When you approach ultra+ quality settings, you are looking at features that are doing their best to approximate what RT does. You may as well have RT hardware to do it.

Everything else concerning 4K is being done already on X1X. It's lighting, shadows, and reflections that will really define next gen here and I'm not entirely convinced that our current methods will be able to perform better than RT.
 
We're at the point of diminishing returns with traditional rasterization techniques. By introducing a console with a high base TF count, resource-intensive new techniques like ray tracing can finally be used. Recent posts by Remedy and dev Dennis Gustafsson note that it takes about 9 ms at 1080p for effective ray traced effects on a 1080-level card. It's clear next gen will have at least 1080 to 1080 Ti level GPU performance, so this lends itself well to ray traced effects at 30fps. Combine that with neural upscaling and we're into next gen.
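
To put that ~9 ms figure in context, here's a quick sketch of the budget split. The 9 ms cost is the number quoted above; the 30fps target is my assumption:

```python
# Simple frame-budget split for ray traced effects at 30 fps.
frame_budget_ms = 1000 / 30     # ~33.3 ms per frame at 30 fps
rt_cost_ms = 9.0                # quoted cost of RT effects at 1080p on a 1080-class card
remaining_ms = frame_budget_ms - rt_cost_ms

print(f"Frame budget at 30 fps:   {frame_budget_ms:.1f} ms")
print(f"RT effects:               {rt_cost_ms:.1f} ms ({rt_cost_ms / frame_budget_ms:.0%} of the frame)")
print(f"Left for everything else: {remaining_ms:.1f} ms")
```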
 
Just so I have a clear idea where you're coming from, could you list out the SKUs/devices with what you think the price points would be?
Seems way out there to me at the moment.

Yeah, sure.

People have been talking about the $400 launch price, and how we can't expect that alongside a generational upgrade while continuing to push 4K. I say the base model should be focused on 1440p-1800p, and the Pro model on 4K/dual gaming.

-- Base PS5 --

$350

- 8c/16t Zen2 CPU clocked at 3GHz
- 70CU Navi GPU clocked for 8-10TF (rough clock math in the sketch after this post)
- 16GB GDDR6 memory
- 64GB NVMe
- 1TB HDD
- UHD BR drive

-- PS5 Pro Duo --

$550

- 2 x 8c/16t CPU clocked at 3.8GHz
- 2 x 70CU Navi GPU clocked higher than the base model's for just over double the performance (think PS4 -> PS4Pro)
- 2 x 16GB GDDR6 memory
- 2 x 64GB NVMe
- 2TB HDD
- UHD BR drive
- 2 x HDMI 2.1 ports, 2 x camera ports

That's based on the fact that the APU+memory in the PS4 came to something like $190, whereas this time I reckon it'll be more like $250.

They can enter the market aggressively to gather both cost conscious and performance conscious gamers, and they can also negate the risk of a competitor having more performance.

Base PS5 owners who subscribe to PSNow will be able to stream from the Pro, but play it locally if preferred. PS4 owners, or PlayStation tablet owners, who subscribe to PSNow can stream from the PS5Pro. When Sony's PSNow PS5Pro blades are being overloaded, any number of them can divide into two.
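
As a rough sanity check on the 70 CU / 8-10 TF line above: assuming the usual GCN/Navi math of 64 shaders per CU and 2 FLOPs per shader per clock, the required clocks look like this. The specific clock speeds are purely illustrative guesses on my part, not part of the proposal:

```python
# TFLOPS = CUs x shaders-per-CU x 2 FLOPs (FMA) x clock; GCN-style assumptions.
def tflops(cus: int, clock_ghz: float, shaders_per_cu: int = 64) -> float:
    return cus * shaders_per_cu * 2 * clock_ghz / 1000

print(f"70 CU @ 1.00 GHz:      {tflops(70, 1.00):.2f} TF")      # ~8.96 TF
print(f"70 CU @ 1.12 GHz:      {tflops(70, 1.12):.2f} TF")      # ~10.0 TF
print(f"2 x 70 CU @ 1.15 GHz:  {2 * tflops(70, 1.15):.2f} TF")  # ~20.6 TF, ~2.3x the base
```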
 
So why not make that hardware available to the public too? Launch with a Pro model, with exactly double the amount of rendering hardware - clocked higher than the base model, in order to ensure a more stable framerate - and use that model as the equivalent of server blades. [...]
If Sony is a little bold they could potentially experiment with the double-SKU (power-wise) strategy; after all, it's unprecedented in the console space and could yield unexpected returns. The benefit is clearly there, and they could even do another mid-gen launch on top of it (8K, RT, etc.). I'm all for premium hardware at launch.
When you approach ultra+ quality settings, you are looking at features that are doing their best to approximate what RT does. You may as well have RT hardware to do it.
Well, I think we have a different take on Ultra settings for next gen games :). I believe the direction for pushing graphics with traditional rendering is not necessarily focused on getting things as accurate as possible the way RT does, but more on expanding simulation: hair, cloth, fluid dynamics for fire and water, particles, animation, explosions and destruction. You don't need pixel-accurate RT shadows or reflections for the player to perceive a believable puddle or room, but you would instantly notice water, fire, drapes or hair behaving realistically, which adds tons more visual impact. I'm not going to pretend RT is useless; certain games might benefit from it far more than others. I'm just saying it's not the best balance. Maybe I'm wrong :), just a hunch.
 
-- PS5 Pro Duo --

$550

- 2 x 8c/16t CPU clocked at 3.8GHz
- 2 x 70CU Navi GPU clocked higher than the base model's for just over double the performance (think PS4 -> PS4Pro)
[...]

No. Just no. Especially the "Pro Duo".
 
I can imagine. Watch Dogs 2 @ 4K Ultra barely creeps beyond 30 fps on a 1080 Ti, which is more or less the target goal for next gen consoles as per your commentary.
When you approach ultra+ quality settings, you are looking at features that are doing their best to approximate what RT does.

Bad example. Ultra settings are always very inefficient afterthoughts meant to put an extra cherry on top of a niche market's ice cream. They are definitely not representative of what a game built for a 1080 Ti from the ground up would look like. There were PS360-gen games that ran sluggishly on ultra settings on cards faster than the PS4, and still didn't look an inch as good as any PS4 exclusive.
 
-- Base PS5 --

$350

[...]

-- PS5 Pro Duo --

$550

[...]
I'll just say that I find your Pro Duo wildly optimistic at that price point.
If you were saying it would be the PS5 Pro launched a few years later, then that would've made a bit more sense to me.

Base PS5 sounds reasonable, though.
 
Care to expand upon that some?
First of all, the idea of using dual stuff is ridiculous. Even with the extra control and known hardware, it just adds too much extra hassle to get it to work at all.
2 CPUs means the devs need to carefully place specific threads on specific cores and make sure that nothing vital needs to communicate with anything on the other CPU, or with the memory connected to it (see the sketch after this post).
2 GPUs means the same thing, unless you use AFR, which has its own problems every PC enthusiast is awfully familiar with and which never scales perfectly. It also adds extra complexity to handle frame pacing without adding too much latency. And you would need to reserve double the memory for them, since they would obviously need to be connected to their own CPUs if we're assuming they'll use APUs (which seems the only logical option, really).
2 memory pools need to be carefully handled to make sure everything is always in the right chips connected to the right CPU, to avoid latencies beyond terrible.

Why would you want two NVMe drives? For RAID 0, which would rocket the chance of losing all the data for most likely marginal speed boosts in real-world console operations?

Why 70 CUs? Even if AMD put in the effort to allow the more than 4 shader engines required for that, 70 is illogical. 72 would work, but 2 disabled CUs for redundancy might be too little for a chip that size, and the next logical step, 80 with 10 for redundancy, seems excessive already.
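
Not console code, obviously, but here's a Linux-only Python sketch of the kind of explicit placement a two-CPU, two-memory-pool design would push onto developers: every subsystem has to be pinned to the cores of "its" node so it never touches memory hanging off the other CPU. The core numbers and subsystem names are made up purely for illustration.

```python
# Illustration only: pinning work to one node's cores (Linux-only APIs).
import os
from multiprocessing import Process

NODE0_CORES = {0, 1, 2, 3}   # cores assumed to be wired to CPU/memory pool 0
NODE1_CORES = {4, 5, 6, 7}   # cores assumed to be wired to CPU/memory pool 1

def run_subsystem(name, cores):
    os.sched_setaffinity(0, cores)   # pin this process to one node's cores
    print(f"{name}: running on cores {sorted(os.sched_getaffinity(0))}")
    # ...allocate and touch this subsystem's data here, so its pages land
    # in the local node's memory pool rather than the remote one...

if __name__ == "__main__":
    subsystems = [
        Process(target=run_subsystem, args=("sim + render submit", NODE0_CORES)),
        Process(target=run_subsystem, args=("audio / streaming", NODE1_CORES)),
    ]
    for p in subsystems:
        p.start()
    for p in subsystems:
        p.join()
```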
 