Current Generation Hardware Speculation with a Technical Spin [post GDC 2020] [XBSX, PS5]

There needs to be at least 16 GB of virtual memory per game, however. XSX can handle up to 5-6 games in total before needing to drop one off? So that's approximately 100 GB; perhaps it's not so coincidental how that number came about.

Does there? That seems wasteful; most data is only in memory to feed the GPU, and I would guess a large proportion is unchanged from the SSD. You're saving duplicate data only to load it back from a separate space?

If they can track blocks of changed or new data, or the inverse with known unchanged data, they could reduce the footprint when saving this off. It's all coming back from the SSD, so why duplicate that data?

More complicated and prone to error, so possibly no good for BC titles, but it would improve performance and reduce the storage required.

Would it be that hard in the grand scheme of things?
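A toy sketch of the block-tracking idea from the post above, in Python. Real hardware would lean on MMU dirty bits rather than hashing, and the 64 KiB block size and all function names here are invented for illustration; it assumes RAM and the install data cover the same address range.

```python
import hashlib

BLOCK = 64 * 1024  # dirty-tracking granularity; an arbitrary choice for this sketch


def block_hashes(buf: bytes) -> list[bytes]:
    """Hash each fixed-size block so unchanged blocks can be identified."""
    return [hashlib.sha256(buf[i:i + BLOCK]).digest()
            for i in range(0, len(buf), BLOCK)]


def suspend_image(ram: bytes, install_hashes: list[bytes]) -> dict[int, bytes]:
    """Save out only blocks that no longer match the install data on the SSD.

    Anything that still matches can be re-read from the install on resume,
    so only the delta needs to be written."""
    image = {}
    for idx, h in enumerate(block_hashes(ram)):
        if idx >= len(install_hashes) or h != install_hashes[idx]:
            image[idx] = ram[idx * BLOCK:(idx + 1) * BLOCK]
    return image


def resume(install: bytes, image: dict[int, bytes]) -> bytes:
    """Rebuild RAM from the install data plus the saved dirty-block overlay."""
    ram = bytearray(install)
    for idx, data in image.items():
        ram[idx * BLOCK:idx * BLOCK + len(data)] = data
    return bytes(ram)
```

The point is just that resume can reconstruct memory from the install plus a (hopefully small) dirty-block overlay, instead of duplicating everything.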
 
Direct memory management is always preferable in my opinion, and it's really based on specific developer needs.


There is an upper limit. (I can only speak for PS4.)


Memory on consoles is micromanaged to an absurd degree (for the games' benefit, of course), which is why the game patch process on PS4 is ridiculously long.
Thanks, I was also trying to read between the lines on the API thread, but I wasn't sure at the time.
Does there? That seems wasteful; most data is only in memory to feed the GPU, and I would guess a large proportion is unchanged from the SSD. You're saving duplicate data only to load it back from a separate space?

If they can track blocks of changed or new data, or the inverse with known unchanged data, they could reduce the footprint when saving this off. It's all coming back from the SSD, so why duplicate that data?

More complicated and prone to error, so possibly no good for BC titles, but it would improve performance and reduce the storage required.

Would it be that hard in the grand scheme of things?
You need to dump out all of the contents in memory, alongside possibly anything being held in virtual memory. From a security perspective, that seems like an open attack vector: spilling the contents of memory onto the SSD looks like an easier route for people looking to cheat, so I would assume the process is willing to take on some inefficiency for the sake of security.
But you are right; it might be entirely wasteful to do it that way.

At the very least you’re gonna want a save state and a checksum. Maybe some encryption and some compression.
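A minimal sketch of the "save state plus checksum" idea, assuming a keyed MAC rather than a bare checksum, since a plain hash could simply be recomputed by someone editing the image on the SSD. All names are hypothetical; a real console would keep the key in hardware and likely encrypt the payload as well.

```python
import hashlib
import hmac
import zlib

# Hypothetical device-unique key; a real console would keep this in hardware.
CONSOLE_KEY = b"\x00" * 32


def write_save_state(ram: bytes) -> bytes:
    """Compress the memory dump and append a keyed MAC over the payload."""
    payload = zlib.compress(ram, level=1)  # fast compression to cut SSD writes
    tag = hmac.new(CONSOLE_KEY, payload, hashlib.sha256).digest()
    return payload + tag


def read_save_state(blob: bytes) -> bytes:
    """Verify the MAC before resuming; reject tampered or corrupted images."""
    payload, tag = blob[:-32], blob[-32:]
    expected = hmac.new(CONSOLE_KEY, payload, hashlib.sha256).digest()
    if not hmac.compare_digest(tag, expected):
        raise ValueError("suspend image failed integrity check")
    return zlib.decompress(payload)
```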
 
So outside of throughput, I'm curious to see how Xbox is handling all of this. It doesn't seem like they are allocating a large space for virtual memory; they are just using the game files themselves as _virtual memory_. There needs to be at least 16 GB of virtual memory per game, however. XSX can handle up to 5-6 games in total before needing to drop one off? So that's approximately 100 GB; perhaps it's not so coincidental how that number came about.

Wait..What?

I just assumed each game gets a dedicated page file that has a 100 GB limit.

And this has nothing to do with switching between apps, where the game instance has its memory in RAM compressed and placed onto the SSD. If you are going to have a decompressor, then why not a compressor? Data compression minimizes write amplification and extends the life of the drive. Without a compressor you also minimize the utility of the decompressor.
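To put rough illustrative numbers on the write-amplification point (these are assumptions, not confirmed specs): if roughly 13.5 GB of game-visible RAM compresses at 2:1 on the way out, each suspend writes about 6.75 GB instead of 13.5 GB, halving the flash wear per suspend/resume cycle.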
 
Wait..What?

I just assumed each game gets a dedicated page file that has a 100 GB limit.

And this has nothing to do with switching between apps, where the game instance has its memory in RAM compressed and placed onto the SSD. If you are going to have a decompressor, then why not a compressor? Data compression minimizes write amplification and extends the life of the drive.
Yea, that was an inaccurate thought. But what about virtual memory? If each game is allotted 100 GB of VM, then what do they do with that?
 
It lets the game have an idea of where to grab assets quickly?

There are games that do have >100 GB install sizes, though, so I'm a little puzzled.
 
It lets the game have an idea of where to grab assets quickly?

There are games that do have >100 GB install sizes, though, so I'm a little puzzled.
Indeed. Dobwal is probably right here. Page file of 100 GB, but the contents don't need to shift around on the drive. They stay put. No duplication. Only memory contents need to be dumped out if required. Virtual memory may still stay small, i.e. 4 GB instead of this massive amount, but the page file supports directly accessing the game files.

We know that the CPU has a dedicated 6 GB workspace for its main utilities. So perhaps that much is actually dumped out during a game shift.
 
Yea, I think there's some misunderstanding going on here.
It even said it in the quote: the virtual memory maps to the game install.
It's just another option/way to access the game data.
You have as much space to save out as you would have even without virtual memory.

For example, instead of having to load the data into memory, it's mapped to memory, so you just access it directly like you would GDDR6 memory. It would just be very slow in comparison.

So you have fast, slow, and slowest memory address locations. The slowest is just generally the game package.
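A rough illustration of that "mapped, not loaded" access pattern using plain OS file mapping; this is not the console API, just the general mechanism, and the file name is made up.

```python
import mmap


def map_game_package(path: str) -> mmap.mmap:
    """Map the installed package read-only into the process address space.

    Reads then look like ordinary memory accesses; pages that aren't
    resident fault and get filled from the SSD -- the 'slowest tier'."""
    with open(path, "rb") as f:
        return mmap.mmap(f.fileno(), 0, access=mmap.ACCESS_READ)


pkg = map_game_package("game.pkg")   # hypothetical package file
asset = pkg[0x10000:0x20000]         # plain indexing, no explicit file read
```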
 
Yea, that was an inaccurate thought. But what about virtual memory? If each game is allotted 100 GB of VM, then what do they do with that?

To a game, it basically looks like it sees up to 113.5 GB of physical memory.

Lets the game have an idea of where to grab assets quickly?

Although there are games that do have >100GB install sizes, so I'm a little puzzled.

The whole game doesn't necessarily have to fit inside the page file. The page file is presented as part of a virtual memory space, so the application sees one big chunk of RAM. Windows already does this with system RAM when gaming, unless you disable it. Also, the page file tends to be a lot smaller, as it's typically restricted to 1.5x-4x the size of physical memory.
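For what it's worth, the arithmetic lines up with the 113.5 GB figure quoted earlier in the thread: assuming roughly 13.5 GB of the 16 GB physical pool is game-visible, adding a 100 GB per-title page file gives 13.5 + 100 = 113.5 GB of addressable space. (The 1.5x-4x rule of thumb is a Windows sizing default, not a hard architectural limit.)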
 
I would take that with a pinch of salt, like usual. If true, it would be a first.
No it wouldn't. They're using the same metric as in the previous comparison of exactly this kind, which was GCN vs RDNA (more specifically, Vega 64 vs RX 5700 XT in Division 2 at 1440p), and that metric actually downplayed how much better the 5700 XT's perf/watt is on average compared to Vega 64.
 
That was from GCN to RDNA. A 50% IPC increase in ideal situations, perhaps.
It's performance per watt, not IPC; improved IPC is just one contributor.
And like I said, it's downplaying the actual difference. They achieved and advertised a 50% increase, while according to TPU numbers at the same resolution, the 5700 XT actually has 62% better perf/watt than Vega 64.

So what exactly is misleading in the first numbers, and what would make you think that, using the exact same metric again, the new numbers would be misleading? Unless you count a product being actually better than advertised as misleading, of course.
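For anyone wanting to sanity-check numbers like that: relative perf/watt is just (FPS_B / W_B) / (FPS_A / W_A). With made-up round figures, a card delivering 1.22x the frame rate at 0.75x the board power works out to 1.22 / 0.75 ≈ 1.63, i.e. about 63% better perf/watt, which is why both the performance side and the power side feed into the result.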
 
Yea that was an inaccurate thought. But what about virtual memory? If each game is allotted 100GB of VM, then what do they do with that ?

I believe this is related to Sampler Feedback Streaming. The up-to-100 GB pool is allocated at install time, and the contents of every file/texture within it are known and available to the program.

It's similar to virtual memory in that you can pull partial file/texture fragments directly from that pool, but dissimilar in that I don't believe the game (or even the OS) can alter what is there, as knowing the location of everything at all times is key to the smart streaming tech, IMO.

Regards,
SB
 
I believe this is related to Sampler Feedback Streaming. The up-to-100 GB pool is allocated at install time, and the contents of every file/texture within it are known and available to the program.

It's similar to virtual memory in that you can pull partial file/texture fragments directly from that pool, but dissimilar in that I don't believe the game (or even the OS) can alter what is there, as knowing the location of everything at all times is key to the smart streaming tech, IMO.

Regards,
SB

Yah, I think their BCPack format is basically designed for virtual texture streaming. Textures stored as compressed tiles or something.
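A rough sketch of how a fixed, read-only tile pool plus sampler feedback could drive residency, per the two posts above. This is not Microsoft's actual API; the class, the table layout, and the 64 KiB tile size are all invented for illustration.

```python
# All names, the table layout, and the tile size are invented for this sketch.
TILE = 64 * 1024


class TexturePool:
    """Fixed, read-only pool laid down at install time.

    Every tile's offset is known up front, so residency decisions never
    have to search for data -- they just read from a known location."""

    def __init__(self, pool_file, tile_table):
        self.pool = pool_file      # open handle to the install-time pool
        self.table = tile_table    # (texture_id, mip, tile_xy) -> byte offset
        self.resident = {}         # tiles currently held in RAM

    def request(self, texture_id, mip, tile_xy):
        """Pull in a tile that sampler feedback reported as actually sampled."""
        key = (texture_id, mip, tile_xy)
        if key not in self.resident:
            self.pool.seek(self.table[key])
            self.resident[key] = self.pool.read(TILE)
        return self.resident[key]
```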
 
Oh.. so it might be better... that sounds nice. I was mainly thinking in terms of the console CPUs being weaker 1:1, having to account for design constraints desktops don't... the opposite is something I'm not complaining about.

I assume the improvement was meant in comparison to the current consoles. Next to desktop parts, from what I've seen, these are pretty standard Zen 2s with no boost, maybe reduced cache, and possibly less optimal memory access.

In terms of clocks, it's a little slower than a Ryzen 3700X without boost.
 
I assume the improvement was meant in comparison to the current consoles. Next to desktop parts, from what I've seen, these are pretty standard Zen 2s with no boost, maybe reduced cache, and possibly less optimal memory access.

In terms of clocks, it's a little slower than a Ryzen 3700X without boost.

I see, so it's a mixed bag.
 
Nah, it's a compromise, if you will, compared to the full desktop CPU. You can't expect that to land in a small console/APU like that :)

Some people are saying they are optimizing the CPUs for faster performance without the overhead necessary for desktop and PC components; others are saying things like cache will be reduced to compensate for the APU console design, in addition to bandwidth resource contention.

Seems like a mixed bag of positives and negatives on these consoles to me.
 
Some people are saying they are optimizing the CPUs for faster performance without the overhead necessary for desktop and PC components; others are saying things like cache will be reduced to compensate for the APU console design, in addition to bandwidth resource contention.

Seems like a mixed bag of positives and negatives on these consoles to me.

Of course. They strip the things that have not much impact, if any, on gaming performance, but they also needed to reduce clock speeds and cache sizes. They did the best they could; it's probably the perfect fit for these consoles. They won't be as fast as full-fat desktop CPUs: first, you can't expect that, and second, it probably wasn't needed either. If heat, die size and cost weren't a problem, they could have slapped in a 3950X or something for a full 16c/32t at 4 GHz or higher.
 
Of course. They strip the things that have not much impact, if any, on gaming performance, but they also needed to reduce clock speeds and cache sizes. They did the best they could; it's probably the perfect fit for these consoles. They won't be as fast as full-fat desktop CPUs: first, you can't expect that, and second, it probably wasn't needed either. If heat, die size and cost weren't a problem, they could have slapped in a 3950X or something for a full 16c/32t at 4 GHz or higher.

I know... that's what I originally suspected, that performance would be somewhat inferior to desktop chips, but some disagreed, so..
 