Next Generation Hardware Speculation with a Technical Spin [2018]

Status
Not open for further replies.
Generally speaking, you can't wait on technology. There's always something better 12-18 months away, and sometimes if you wait, you'll end up waiting 3 years for it to actually arrive.

I doubt we are in that general situation today. I feel we are at a big junction where the game either changes or doesn't. If neural networks, hybrid rendering, etc. become a thing, it could change the world as much as unified programmable shaders did (RSX wasn't so great compared to the competition). Also, the next-gen manufacturing technology is not really there yet; 7nm is early compared to the type of manufacturing process consoles normally launch on.

If anything, Sony is in a position where they can confidently wait and "print money" with the PS4. No need to go gambling. Whatever they decide to release as the PS5 stays with them a long time, as consoles don't change their architectures every few years.
 
In a system with multiple memory types and storage, as has been suggested here by multiple people multiple times, a southbridge+OS kind of chip might be able to serve other interesting purposes. It could move data between the different memory types without consuming the main game SoC's bandwidth.
You could even have it do some type of decompression, so it can read data that comes in big compressed blocks (a useful and common practice to optimize disk seek times) and only send the actually wanted part to the SoC.
I don't know how much a big bus width costs, but maybe the thing could even have a wider bus than the actual SoC, so it can always feed the SoC as much data as possible while juggling these other operations on its own.
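To make the decompress-and-forward idea concrete, here's a toy sketch in Python. zlib stands in for whatever block compressor a real disk format would actually use, and the block size, offsets, and function names are all invented for illustration:

```python
import zlib

BLOCK_SIZE = 64 * 1024  # hypothetical on-disk block size

def pack_block(data: bytes) -> bytes:
    """What the disk stores: one big compressed block (good for seek times)."""
    return zlib.compress(data)

def serve_range(compressed_block: bytes, offset: int, length: int) -> bytes:
    """What the southbridge-style chip would do: decompress the whole
    block locally, then forward only the bytes the SoC actually asked for."""
    plain = zlib.decompress(compressed_block)
    return plain[offset:offset + length]

# The SoC requests 256 bytes out of a 64 KiB block; only those 256 bytes
# would ever cross the bus to the main SoC.
block = pack_block(bytes(range(256)) * 256)
wanted = serve_range(block, offset=1000, length=256)
```

The point of the sketch is just the bandwidth asymmetry: the full block is decompressed on the helper chip's side of the bus, while the main SoC only ever sees the slice it requested.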

As bandwidth is increasingly the biggest bottleneck for graphics and large simulations, that sort of win may prove itself worthwhile in a console design.
 
Discussing whether the "OS" can be moved to a secondary processor, or even onto a secondary chip, needs clarity on what exactly the "OS" is. The OS is responsible for managing the overall platform and presenting a consistent view of the console to the application. That can mean privileged access to parts of the hardware, synchronization operations, resource allocation, system services, and security layers. Games will routinely hit points where threads interact with these elements, and even with modern compute layers, the sleight of hand used to lower compute overhead is necessarily managed by a privileged part of the system.

Pushing those off the high-performance domain, or putting an IO interface in the path, trades "offloading" some instruction cycles from a high-performance SoC for waiting on a distant and slow controller to get back to it. (edit: And depending on how exactly this separation is handled, the synchronization and safety hazards of doing this can be brutal.)
There are other elements that are more superficial or less necessary during gaming loads, but this also means their burden is proportionally small.
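That trade can be put in back-of-the-envelope numbers: offloading only pays off if the cycles freed on the fast SoC exceed what it burns stalling on the remote controller's reply. A minimal Python sketch, with every figure invented purely for illustration and a blocking wait assumed:

```python
def offload_pays_off(cycles_saved: int,
                     round_trip_ns: float,
                     soc_ghz: float) -> bool:
    """True if moving the work off-SoC frees more cycles than the SoC
    loses stalling on the remote controller (assumes a blocking wait)."""
    cycles_stalled = round_trip_ns * soc_ghz  # ns * (cycles per ns)
    return cycles_saved > cycles_stalled

# A hypothetical 2 GHz SoC offloading a 1,000-cycle OS call over a
# 2-microsecond round trip stalls for 4,000 cycles: a net loss.
net_win = offload_pays_off(cycles_saved=1000, round_trip_ns=2000, soc_ghz=2.0)
```

Under these made-up numbers the offload is a loss; only work that is both large and infrequent relative to the round-trip latency clears the bar.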
 
I'm seeing references to the secondary processor/chip interacting with resources or functions being used by the main SOC, which would involve the rest of the OS if the developers want the system to be functional.
If the OS is defined as just the fluff that isn't running while the game is, and where the game isn't running or is throttled while it does, then the payoff seems limited compared to the complexity introduced to do this safely.
 
Discussing whether the "OS" can be moved to a secondary processor, or even onto a secondary chip, needs clarity on what exactly the "OS" is. The OS is responsible for managing the overall platform and presenting a consistent view of the console to the application. That can mean privileged access to parts of the hardware, synchronization operations, resource allocation, system services, and security layers. Games will routinely hit points where threads interact with these elements, and even with modern compute layers, the sleight of hand used to lower compute overhead is necessarily managed by a privileged part of the system.

Pushing those off the high-performance domain, or putting an IO interface in the path, trades "offloading" some instruction cycles from a high-performance SoC for waiting on a distant and slow controller to get back to it. (edit: And depending on how exactly this separation is handled, the synchronization and safety hazards of doing this can be brutal.)
There are other elements that are more superficial or less necessary during gaming loads, but this also means their burden is proportionally small.

That's what I was asking about really.

I can see the sense in a secondary processor to deal with apps and UI. I can see the sense in using a secondary processor to output the final image, so the likes of Twitch and Remote Play can operate with no impact to the game.

But I wonder if there are other uses for a semi custom ARM SoC? We see upscalers in consoles, and an anti-aliasing HDMI cable was released fairly recently. Is there anything akin to those, which could conceivably sit outside of the high performance hardware, yet still be useful?
 
At this point in time a second CPU or whatever for some OS functions is just overly complicated and unnecessarily adds cost. Surely with 8-core Zen chips and GDDR6 there's less need than ever for exotic, complicated designs. Let's just keep the OS bloat under 4 GB or I will lose it, lol. 6-7 Zen cores available for games seems pretty alright for next gen, and maybe we get SMT as well.

What I have wondered, though, is whether it was possible for Nintendo to use the already existing small cores on the Tegra X1 to free up that precious fourth main A57 core for games. It kinda seems lazy on their part, but maybe someone here can elaborate.
 
At this point in time a second CPU or whatever for some OS functions is just overly complicated and unnecessarily adds cost. Surely with 8-core Zen chips and GDDR6 there's less need than ever for exotic, complicated designs. Let's just keep the OS bloat under 4 GB or I will lose it, lol. 6-7 Zen cores.

And will that one Zen core be able to operate while in standby without the fans having to be on? I consider that an important feature.
 
ARM cores are comparatively much more powerful than those available in 2013 for previous-gen consoles. I would think they could get quite a capable solution there if need be. I don't see how Zen would be any better at running at low power than Jaguar, which was specifically designed for low-TDP uses.
 
Sony's backwards compatibility patent was updated in April.
2018-04-30 US15967246 Pending
2018-08-30 US20180246802A1 Application

https://patents.google.com/patent/US9892024B2/en?oq=9892024
I'm not too familiar with reading patents wrt changelogs, but the bottom of the page seems to just show:

2018-04-30 AS Assignment

Owner name: SONY INTERACTIVE ENTERTAINMENT LLC, CALIFORNIA

Free format text: MERGER AND CHANGE OF NAME;ASSIGNORS:SONY INTERACTIVE ENTERTAINMENT AMERICA LLC;SONY INTERACTIVE ENTERTAINMENT LLC;REEL/FRAME:045671/0744

Effective date: 20180323

So nothing of import for the technical side of things?
 
That's what I was asking about really.

I can see the sense in a secondary processor to deal with apps and UI. I can see the sense in using a secondary processor to output the final image, so the likes of Twitch and Remote Play can operate with no impact to the game.

But I wonder if there are other uses for a semi custom ARM SoC? We see upscalers in consoles, and an anti-aliasing HDMI cable was released fairly recently. Is there anything akin to those, which could conceivably sit outside of the high performance hardware, yet still be useful?
Perhaps this depends on the usage model for the UI. A more traditional console model would have the UI off-screen during gameplay, and bringing it up would involve reducing the focus of the game. An outside HUD, or UI for another app, would be a minor performance cost to justify dedicating another SoC to.
As far as a secondary processor to output a final image, the GPUs already have compositors and scalers, which are dedicated processors or processor blocks for such purposes. Encoder/decoder blocks are also frequently present, so what else needs to be offloaded in this scenario is unclear to me.

What some kind of streaming or remote play scenario requires is that another application or client contend for the same output, buffers, IO, and network resources. The lowest-overhead method would be to have one or more cores in the same memory hierarchy and compute domain be involved in the arbitration and data movement.
Increasing separation with something like a lower-performance version of the same architecture needs to be compared to running the streaming app on one of the main cores. If there are points in the workload where there's an expectation that a big core will be well-utilized, shunting that workload onto a slower core reduces performance in a way that could reduce quality or force the main system allocation to stall longer than if a main core was performing it. This may be some portion of the encoding process, the speed of data movement, or some platform synchronization or arbitration function that takes longer and obligates faster cores to spin until the slower core is done.
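The stall scenario above can be modeled with a toy Python sketch: a "big core" thread hands a platform arbitration step to a slower helper and cannot proceed until the helper finishes, so wall time tracks the slow core. The slowdown factor and timings are invented for illustration:

```python
import threading
import time

SLOW_CORE_FACTOR = 4  # assumed: the helper core runs this step 4x slower

def arbitration_step(duration_s: float, done: threading.Event) -> None:
    """Stand-in for a platform synchronization/arbitration function."""
    time.sleep(duration_s)
    done.set()

def frame_with_helper(step_s: float) -> float:
    """The big core hands the step to a slow helper and must wait for it.
    Returns the wall time the big core spent blocked."""
    done = threading.Event()
    helper = threading.Thread(target=arbitration_step,
                              args=(step_s * SLOW_CORE_FACTOR, done))
    start = time.perf_counter()
    helper.start()
    done.wait()   # the fast core stalls here until the slow core is done
    helper.join()
    return time.perf_counter() - start

# A 10 ms step on the big core becomes ~40 ms of waiting on the helper.
elapsed = frame_with_helper(0.010)
```

The sleep is obviously a stand-in, but it captures the shape of the problem: whatever sits on the critical path runs at the slow core's pace, and the fast core's time budget pays for it.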

Using a different microarchitecture, even one that is still x86, like the idea of using Jaguar cores in a primarily Zen system, may be problematic as well. Apps on the secondary cores would have to be compiled without Zen's ISA extensions, and system functions would need to be written separately due to the architectural changes. Sharing memory spaces may not be as easily managed: the TLB system has evolved with Zen, and the cache coherence protocol has expanded.
A different architecture entirely, like ARM, almost always means encapsulating the systems from each other to the point that they only see each other as IO devices when active concurrently, and frequently they perform many operations only when the other is not active.

For security and low-power purposes, there can be a stand-off level of integration with a mostly separate system device. I am not sure what parts that may interact with the foreground functionality can straddle that divide.

edit: grammar
 
And will that one Zen core be able to operate while in standby without the fans having to be on? I consider that an important feature.
Like when you're watching Netflix or on the dashboard? Well, the PS4 and 360 certainly have fans spinning, so I don't see why this would change for next gen regardless.

Usually I end up getting a later model console near the end of the gen which I only game on and nothing else so for me it wouldn't matter.
 
Generally speaking, you can't wait on technology. There's always something better 12-18 months away, and sometimes if you wait, you'll end up waiting 3 years for it to actually arrive.

I can understand being caught behind a transition like that when we're talking about a vague "coming soon" proposition, but Nvidia are already shipping Turing GPUs. I can't fathom AMD not having a similar solution ready for market by 2020.
 