Current Generation Hardware Speculation with a Technical Spin [post GDC 2020] [XBSX, PS5]

SFS on the Series consoles is hardware-exclusive. SFS is approximately three times more efficient than normal SF on PCs. Effectively, the XSX requires a third as much bandwidth to move textures. This leaves plenty of extra bandwidth for special effects, ray tracing, etc.
From what I understand, SFS shouldn't offer any VRAM bandwidth savings beyond regular Sampler Feedback. Instead, SFS adds I/O bandwidth savings on the SSD side on top of the VRAM savings Sampler Feedback already provides. So I don't think it's any more efficient than Sampler Feedback on PC... it just adds additional functionality in the form of I/O throughput efficiency.
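To put the distinction in concrete terms, here's a minimal toy model (all numbers are illustrative assumptions, not measured from any real title) of why feedback-driven streaming helps: only the tiles the sampler actually reported needing get loaded, instead of whole mip levels.

```python
# Toy model: full mip chain resident vs. only the tiles sampler feedback
# flagged as needed. All figures are made up for illustration.

TILE_BYTES = 64 * 1024  # standard 64 KiB tiled-resource tile

def mip_chain_tiles(tiles_x, tiles_y):
    """Total tiles across a mip chain, halving each dimension per level."""
    total = 0
    while tiles_x >= 1 and tiles_y >= 1:
        total += tiles_x * tiles_y
        tiles_x, tiles_y = tiles_x // 2, tiles_y // 2
    return total

full_chain = mip_chain_tiles(64, 64)   # hypothetical texture, 64x64 tiles at mip 0
needed     = int(full_chain * 0.05)    # assume feedback says ~5% of tiles are visible

print(f"whole mip chain loaded : {full_chain * TILE_BYTES / 2**20:6.1f} MiB")
print(f"feedback-driven loading: {needed * TILE_BYTES / 2**20:6.1f} MiB")
```

The same ratio applies whether the saving shows up as VRAM residency and bandwidth (plain Sampler Feedback) or as SSD reads avoided (the streaming side of SFS), which is the distinction being made above.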
 
Yeah, I guess in principle. But when your cache is 128MB, it's not entirely clear how it works in conjunction with L2 and L1.
Maybe something like: keep L0/L1/L2 etc. unchanged, but instead of having L2 talk to the memory subsystem, it talks to the 128MB L3, and each line in the L3 is 4KB (matching page size) instead of 64 or 128 bytes.
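For scale, a quick back-of-the-envelope on what that speculated layout would imply; the 4 KB line size is just the guess above, not anything confirmed:

```python
# How many lines a 128 MB last-level cache would need to track at different
# line sizes. Purely speculative; nothing here is confirmed hardware.

CACHE_BYTES = 128 * 2**20

for line_bytes in (64, 128, 4096):
    lines = CACHE_BYTES // line_bytes
    print(f"{line_bytes:4d}-byte lines -> {lines:>9,} lines of tag/state to track")
```

Page-sized lines would cut the tag and coherence bookkeeping by a factor of 32-64 versus typical 64/128-byte lines, which is one plausible reason to decouple such a cache from the existing L1/L2 line size.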
 
No, Sampler Feedback and Sampler Feedback Streaming can be a multiplier for DRAM size and bandwidth versus not using Sampler Feedback at all. Sampler Feedback Streaming as used by the XSX is just a refinement of the Sampler Feedback present in all DX12U-capable hardware: it smooths over texture mip transitions in hardware when the required mip isn't transferred into video memory in time for the completed frame.
SFS does a couple more things besides the transition filtering.
That's why I wondered whether the figure actually related to those added benefits, which doesn't seem to be the case.
I personally don't expect any particular performance gains compared to DX12U SF, but I'll be pleasantly surprised if I'm wrong.
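As a purely conceptual illustration of that "smooth over the transition" idea (an assumed toy model, not the actual hardware filter): while the wanted mip isn't resident yet, sampling is clamped to the finest mip that is resident, then faded toward the wanted one once it arrives, so there's no visible pop.

```python
# Toy sketch of mip-transition smoothing. Lower mip index = finer level.
# Not the real hardware algorithm; just the general idea.

def effective_mip(wanted_mip: float, fallback_mip: float, fade: float) -> float:
    """
    wanted_mip:   finer mip the sampler asked for this frame
    fallback_mip: coarser mip that is already resident in memory
    fade:         0.0 -> still showing the fallback, 1.0 -> fully on the wanted mip
    """
    clamped = max(wanted_mip, fallback_mip)          # can't show detail that isn't there yet
    return clamped + (wanted_mip - clamped) * fade   # ease toward the finer mip as it streams in

# Example: shader wants mip 1, only mip 3 is resident, transition half done.
print(effective_mip(1.0, 3.0, 0.5))  # -> 2.0, halfway between fallback and target
```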
 
Quick question for you folks. We've seen speculation regarding AMD's new cache setup, and while looking at the Series X, I noticed that its bus is 320-bit. The 6800 XT is 256-bit, so why did Microsoft choose to go with a 320-bit bus? This has got me thinking about whether or not the Series X SoC uses AMD's new cache architecture. The 6800 XT, with more CUs, requires less bandwidth than the Series X and its 560 GB/s (10 GB GDDR6)? Am I off base, or does something seem amiss in their design?

The 6800's 256-bit bus is dedicated solely to the GPU. The XSX's 320-bit bus is not; it feeds a unified memory pool shared with the CPU.
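The bus-width question mostly comes down to simple bandwidth arithmetic. Using the publicly stated memory speeds (16 Gbps GDDR6 on the 6800 XT, 14 Gbps on the XSX) and the known chip layout:

```python
# GDDR6 peak bandwidth in GB/s = (bus width in bits / 8) * data rate in Gbps.

def gddr6_gb_per_s(bus_bits: int, gbps: float) -> float:
    return bus_bits / 8 * gbps

print("RX 6800 XT, 256-bit @ 16 Gbps:", gddr6_gb_per_s(256, 16), "GB/s")  # 512
print("XSX 10 GB,  320-bit @ 14 Gbps:", gddr6_gb_per_s(320, 14), "GB/s")  # 560
print("XSX  6 GB,  192-bit @ 14 Gbps:", gddr6_gb_per_s(192, 14), "GB/s")  # 336 (only 6 of the 10 chips)
```

The 6800 XT gets away with the narrower bus partly because its 128 MB Infinity Cache boosts effective bandwidth, while the XSX's wider bus also has to feed the CPU, which is the point above.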
 
Not a megaton. We've had general console specs for some time; this is merely a further refinement of that. We've also known each has its own set of customizations.
 
I suspect Sony would just continue with their own solution instead of using VRS....but what would they have instead of SFS?
 

I dunno. Maybe, maybe not. I figured Sony wouldn't have something called SFS exactly, because that's MS's nomenclature for that particular technology, which would be their own implementation. It's not that Sony may not have their own version of it, but if so, they'd likely have indicated it back in March, when it was pertinent.

Thinking on it for a bit, though, one thing that does stand out to me is how deeply ingrained DX12U is into the RDNA2 architecture. It seems to go well beyond AMD simply making their product and MS then trying to ensure their stuff is as compatible as possible afterwards. No, this looks like a deep collaboration between them from the get-go. There were already rumors to that effect (to be fair, there were also rumors that Sony co-designed Navi along with AMD), but some of the stuff mentioned today shows it's deeper than first thought.

Like, it's to the point where I now see that whatever customizations Sony has done on their GPU might have been out of sheer necessity, to make sure they had the silicon needed for their API stacks, which clearly aren't DX12-based (they have their own APIs). Maybe there's a chance AMD will spin off an RDNA GPU card in the future that is more Vulkan-compatible, if what they're putting out right now turns out not to be too Vulkan-friendly?
 
How's this a megaton when Cerny stressed that their GPU had custom features?
It's not really about the custom features; they both have those.
It's more about what isn't included that is part of RDNA2.

Keep in mind that it's possible Sony didn't think that feature was worthwhile in a console, so they didn't even want it.
We don't actually know what it is (or they are) yet, though.
 
A full RDNA2 GPU would mean infinity cache. XSX has no infinity cache, ergo it's not full RDNA2.
Indeed, they are based on RDNA 2, but custom means you can add or remove blocks.

In this case, it looks like they either opted to remove infinity cache, or if it's present it's smaller.
Feature-wise, aside from infinity cache, which is more of a bandwidth augmentation, it's got all the same features.
 