Sony ID Buffer *spawn*

I think the general consensus is the lack of bandwidth and RAM relative to the compute increase. The PS4 Pro is a good 1080p console; the Xbox One X is a good 4K console.

That, combined with a reliance on the ID buffer to attain higher resolution with CB. Except almost no one wants to use the ID buffer (as implemented on the PS4 Pro) for CB, because it was a pain to implement and there are better reconstruction methods that developers prefer to use. Basically, the ID buffer was trying to make up for shortcomings in the hardware, but it was too difficult to use with too little upside for the time investment, especially for multiplatform developers.

Regards,
SB
 
That, combined with a reliance on the ID buffer to attain higher resolution with CB. Except almost no one wants to use the ID buffer (as implemented on the PS4 Pro) for CB, because it was a pain to implement and there are better reconstruction methods that developers prefer to use. Basically, the ID buffer was trying to make up for shortcomings in the hardware, but it was too difficult to use with too little upside for the time investment, especially for multiplatform developers.
You're going to need some pretty big references to support such wild blanket statements.
 
That, combined with a reliance on the ID buffer to attain higher resolution with CB. Except almost no one wants to use the ID buffer (as implemented on the PS4 Pro) for CB, because it was a pain to implement and there are better reconstruction methods that developers prefer to use. Basically, the ID buffer was trying to make up for shortcomings in the hardware, but it was too difficult to use with too little upside for the time investment, especially for multiplatform developers.

Regards,
SB

Sources please?

Like actual developers on record stating such issues. I can't remember any articles or videos from Digital Foundry or any other enthusiast game-tech site mentioning such issues with the PS4's CBR method among developers. What you're stating sounds like a widespread issue.
 
Like actual developers on record stating such issues. I can't remember any articles or videos from Digital Foundry or any other enthusiast game-tech site mentioning such issues with the PS4's CBR method among developers. What you're stating sounds like a widespread issue.
I am not sure how widespread it is but this is what I know.

It was avoided on Horizon Zero Dawn because the ID Buffer was too expensive for them, since it is in fact full resolution (4K) (they call them something like helper/aid buffers).

IMO, the temporal-accumulation style of image reconstruction is currently the least visually intrusive (Spider-Man, UE4, Ubisoft games). It does not create saw-tooth edges like checkerboarding does when it fails; it just leaves the edges looking lower resolution.
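To make the "fails softly" point concrete, here is a toy sketch of the temporal-accumulation idea (my own illustration with made-up values, not any shipped engine's resolve): jittered samples are exponentially accumulated into a history buffer, and wherever history is rejected the pass simply keeps the current lower-resolution sample, so failures look blurry rather than saw-toothed.

```python
import numpy as np

def temporal_accumulate(current, history, valid, alpha=0.1):
    """Exponential moving average toward the current frame where history
    is accepted; where it's rejected, fall back to the current sample."""
    return np.where(valid, history + alpha * (current - history), current)

cur = np.full((2, 2), 1.0)    # this frame's (lower-res) shading
hist = np.zeros((2, 2))       # accumulated history
valid = np.array([[True, True], [True, False]])  # bottom-right rejected
out = temporal_accumulate(cur, hist, valid)
# accepted pixels move 10% toward the new sample; the rejected pixel
# simply takes the current value, so failure degrades softly
```

Real implementations add reprojection and neighbourhood clamping on top, but the fallback behaviour is the same.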
 
I am not sure how widespread it is but this is what I know.

It was avoided on Horizon Zero Dawn because the ID Buffer was too expensive for them, since it is in fact full resolution (4K) (they call them something like helper/aid buffers).

IMO, the temporal-accumulation style of image reconstruction is currently the least visually intrusive (Spider-Man, UE4, Ubisoft games). It does not create saw-tooth edges like checkerboarding does when it fails; it just leaves the edges looking lower resolution.

Do you know of any other PS4 games or developers that purposely avoided Sony's CBR method (ID buffer)?

This is interesting; I'd never heard of Sony's CBR method being a pain in the ass. I've mostly heard praise for it, as with God of War and Detroit: Become Human. News to me...
 
I am not sure how widespread it is but this is what I know.

It was avoided on Horizon Zero Dawn because the ID Buffer was too expensive for them, since it is in fact full resolution (4K) (they call them something like helper/aid buffers).

IMO, the temporal-accumulation style of image reconstruction is currently the least visually intrusive (Spider-Man, UE4, Ubisoft games). It does not create saw-tooth edges like checkerboarding does when it fails; it just leaves the edges looking lower resolution.

The ID Buffer operates at the same resolution as the Z-Buffer, so it's only 4K when the internal rendering resolution is 4K:
Mark Cerny said:



Furthermore, where did you see that Guerrilla avoided checkerboard in H:ZD?
The game is using checkerboard:
Pro and HDR support was integrated into Horizon relatively late into the game's six-year development, and Guerrilla experimented with several higher resolution strategies, trying them out side-by-side before settling on 2160p checkerboard.



AFAICT, ID buffer is just the fast extraction and availability of an additional variable that says which 3D object that pixel belongs to.
It makes sense that this data would be useful for reconstruction techniques, and there's no reason to believe its implementation is slow.
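To illustrate why that per-pixel object ID is useful for reconstruction, here is a toy sketch (my own illustration, not Sony's implementation): reprojected history is accepted only where the object ID matches the current frame's ID, which suppresses ghosting at object boundaries.

```python
import numpy as np

def resolve_with_id(current, history, cur_ids, hist_ids, blend=0.5):
    """Blend in reprojected history only where the per-pixel object ID
    matches; elsewhere keep the freshly shaded sample (no ghosting)."""
    same_object = cur_ids == hist_ids
    out = current.copy()
    out[same_object] = ((1 - blend) * current[same_object]
                        + blend * history[same_object])
    return out

cur  = np.zeros((2, 2))              # this frame's shading
hist = np.ones((2, 2))               # reprojected previous frame
ids_now  = np.array([[7, 7], [7, 3]])
ids_prev = np.array([[7, 7], [7, 9]])  # bottom-right changed object
out = resolve_with_id(cur, hist, ids_now, ids_prev)
# matching pixels blend 50/50; the mismatched pixel keeps its current value
```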
 
Do you know of any other PS4 games or developers that avoided Sony's CBR method (ID buffer) purposely?

This is interesting, I never heard Sony's CBR method being a pain in the ass. Mostly heard praises for it - like God of War and Detroit: Become Human. News to me...
IIRC, Spider-Man comes to mind.
 
The ID buffer will most likely not make a return for the PS5, due to its limited success.

I'm not sure how helpful the ID buffer is, but I can understand why few developers would want to bother extracting its functionality when they can make use of the more straightforward horsepower increases and just bump the resolution up to 1440p-1800p.

If it's helpful, I would expect to see it in both the PS5 and Scarlett. And actually get used.
 
The ID Buffer operates at the same resolution as the Z-Buffer, so it's only 4K when the internal rendering resolution is 4K:


Furthermore, where did you see that Guerrilla avoided checkerboard in H:ZD?
The game is using checkerboard:




AFAICT, ID buffer is just the fast extraction and availability of an additional variable that says which 3D object that pixel belongs to.
It makes sense that this data would be useful for reconstruction techniques, and there's no reason to believe its implementation is slow.
It is in their presentation on it. No ID buffer, but still a checkerboard pattern (it's not as if a checkerboard pattern has, requires, or needs such an ID buffer).
"Without native-res hints": they go over why in the course/slide notes.
[Slide images: 11rejf0.png, 12bfju1.png]
 
Sounds like the ID Buffer is being treated somewhat like tiled resources: there is a hardware implementation, but developers would rather use their own due to flexibility and control over it. I recall Gears 4 having a Tiled Resources option (one of the very few games that did); was that feature removed for Gears 5?
 
That makes zero sense to me. The ID buffer generates polygon maps 'for free', which can then be used for anything: per-object blurring and outlining, or preserving edges in reconstruction techniques. I don't see any obvious downsides, assuming it takes up next to no silicon and processing; indeed, it was included to save devs having to do this with shaders themselves. If it's not being used, that's likely because devs want a one-size-fits-all solution for their cross-platform rendering. Currently games can filter the Z-buffer to find edges, and as that and similar techniques work across all platforms, it's the easiest common solution.

The ID buffer itself sounds like a good idea to me. Knowing which object each pixel belongs to is valuable info.
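As a sketch of the "filter the Z-buffer to find edges" fallback mentioned above (a minimal depth-gradient edge detector of my own, with an arbitrary threshold, not any shipped engine's code):

```python
import numpy as np

def depth_edges(depth, threshold=0.1):
    """Mark pixels whose depth differs sharply from a neighbour's,
    i.e. likely object silhouettes."""
    dx = np.abs(np.diff(depth, axis=1, prepend=depth[:, :1]))
    dy = np.abs(np.diff(depth, axis=0, prepend=depth[:1, :]))
    return (dx > threshold) | (dy > threshold)

# Toy depth map: a near object (depth 0.2) over a far background (1.0).
d = np.full((6, 6), 1.0)
d[2:4, 2:4] = 0.2
edges = depth_edges(d)
# edges is True along the object's silhouette, False in flat regions
```

Unlike an ID buffer, this cannot distinguish two different objects at similar depths, which is exactly the kind of case per-pixel IDs resolve.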
 
If it's not being used, that's likely because devs want a one-size-fits-all solution for their cross-platform rendering.
But we don't see this happening.
We know Spider-Man did not leverage the ID Buffer, and HZD didn't either; both are Sony exclusives.
Note I'm not saying there's something wrong with the ID Buffer, I just don't know why it isn't being used that much.

edit:
From DF HZD Article:
"There are different ways to do checkerboarding as well," adds Giliam, who told us that they 'rolled their own' solution as opposed to using Sony's reference model. "You can have more information per pixel, or less information per pixel when rendering checkerboarding and depending on how much information you have, you can go for different checkerboard resolve techniques. We came up with one that doesn't need a lot of extra data at the per-pixel level and that gave us some performance boosts as well in the rendering of the whole geometry and the lighting pass."

So this does have similar parallels to the debate of custom virtual texturing methods vs tiled resources.
 
Detroit, GT Sport, and God of War did use the ID buffer, and they look great. So it's certainly being used on some of the best-looking games of the generation.

If you look at subjective lists of "best-looking games on PS4 Pro", there's a lot of checkerboard there. Whether they used the stock reference code or reimplemented another checkerboard resolve is not that important; also, the reference implementation is not static code that never gets improved.

The ID buffer is free in terms of processing, but it takes some bandwidth. Since the size of that buffer will not really change next gen, it might be even more useful on the PS5, since the main reason some didn't use it was the bandwidth cost on the Pro.
 
Why is it being generally assumed that whenever a developer uses a modified checkerboard and/or a custom reconstruction they're not using data from the ID buffer?

The only valid motive I'd see for it not being adopted by devs is if they're using techniques that will work on all platforms even though they're more expensive (e.g. using the Z-buffer) or lower quality (e.g. edge detect), like @Shifty Geezer suggested. But that doesn't really apply to 1st- and 2nd-party games.
 
But we don't see this happening. We know Spider-Man did not leverage the ID Buffer, and HZD didn't either; both are Sony exclusives.
Spider-Man was created before Insomniac went first party, so they possibly developed a technique that was platform agnostic. And the base PS4 hasn't got an ID buffer, so Insomniac would no doubt be looking at solutions that work for both.

As for HZD...
From DF HZD Article:
I don't see that saying they did or didn't use the ID buffer. They didn't use Sony's reference algorithm but rolled their own, coming up with a solution that doesn't need a lot more info per pixel. An ID per pixel isn't a lot of info; how do we know what info they are using per pixel?
 
16 bits per pixel? That'd range from 125 MB/s for 1080p30 to 1GB/s for 4K60.
Wow, it's really not a lot. Maybe when it's read back it increases the data per pixel and impacts the caches? Why were they saying it has a cost to use? I thought they meant bandwidth, but obviously that doesn't add up.
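The figures quoted above can be sanity-checked directly; a quick back-of-the-envelope calculation, assuming a 16-bit ID per pixel written once per frame:

```python
# Bandwidth cost of writing a 16-bit-per-pixel ID buffer once per frame.
def id_buffer_bandwidth_mb_s(width, height, fps, bytes_per_pixel=2):
    """Return MB/s needed to write the ID buffer every frame."""
    return width * height * bytes_per_pixel * fps / 1e6

print(id_buffer_bandwidth_mb_s(1920, 1080, 30))  # ~124 MB/s at 1080p30
print(id_buffer_bandwidth_mb_s(3840, 2160, 60))  # ~995 MB/s at 4K60
```

This counts only the write; reading the buffer back during the resolve would roughly double the traffic, and it still comes out small next to the Pro's ~218 GB/s of total bandwidth.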
 
I guess it's the same reason devs are forever trying to squeeze more info into fewer bits, packing the data in clever ways. Here's a typical sort of G-buffer:

[Image: K2+GBuffer.png]


That's from KZ2. Maybe the ID buffer can't be 16 bits and has to be 32 bits to fit the memory system of the GPU? At which point it's adding 20% to the G-buffer requirements. Even if a 16 bit buffer can be read and used, that's 10% more.
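The 10%/20% figures follow if the base G-buffer is around 160 bits per pixel, e.g. five 32-bit render targets, which is roughly what a KZ2-style layout implies. That base size is an assumption for illustration:

```python
# Relative G-buffer growth from adding a per-pixel ID channel.
# 160-bit base (five 32-bit render targets) is an illustrative assumption.
GBUFFER_BITS = 5 * 32

def id_overhead_pct(id_bits, gbuffer_bits=GBUFFER_BITS):
    """Percentage growth of per-pixel storage from adding an ID of id_bits."""
    return 100.0 * id_bits / gbuffer_bits

print(id_overhead_pct(16))  # 10.0 -> a 16-bit ID adds 10%
print(id_overhead_pct(32))  # 20.0 -> a 32-bit ID adds 20%
```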
 
Spider-Man was created before Insomniac went first party, so they possibly developed a technique that was platform agnostic. And PS4 hasn't got an ID buffer either so Insomniac would no doubt be looking at solutions that work for both.

As for HZD...
I don't see that saying they did or didn't use the ID buffer. They didn't use Sony's reference algorithm but rolled their own, coming up with a solution that doesn't need a lot more info per pixel. An ID per pixel isn't a lot of info; how do we know what info they are using per pixel?
I believe the slides that @Dictator posted speak to it. Native Res Hints are the 'ID Buffer', if I understand correctly.
Sony's reference algorithm uses Native Res Hints; first bullet point:
  • Depth Buffer
  • Triangle Buffer
  • Alpha test coverage

The custom algorithm is the one that they went with (below), because native-res hints were too expensive.
The slide below details their custom solution a bit further.

As for developers pushing the envelope. I remember a very taxing console discussion over AF settings and consoles not being able to hit 16xAF even though it should be 'free'.

edit: some shoddy detective work that needs vetting:
For the source, you can read articles about how the PS4 upscales with the checkerboard thing. Essentially it has a "full resolution" ID buffer, which holds the exact triangle number and draw-call number as a texture. This is FULL resolution, so complete 4K. Meanwhile the shading is actually done on a half-width buffer, with the "sampling" of each pixel shifted a bit to the left or to the right. Then you upscale it in a pixel shader to full resolution and temporally anti-alias it with the last frame. When you do the TAA for the upsample, you normally use that ID buffer, because it allows you to minimize ghosting massively (only "bleed" the pixels that are from the same triangle or draw call).
Going to need to find the original presentation on the ID Buffer, as opposed to this verbose statement.
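Reading that description literally, the core layout can be sketched like this (a toy model: simple alternating-column interleave of two half-width frames, ignoring the quincunx sample offsets and reprojection of the real technique):

```python
import numpy as np

def checkerboard_resolve(half_now, half_prev):
    """Interleave two half-width frames into one full-width image:
    columns shaded this frame alternate with columns carried over
    (in practice, reprojected) from the previous frame."""
    h, w = half_now.shape
    full = np.empty((h, 2 * w), dtype=half_now.dtype)
    full[:, 0::2] = half_now   # columns shaded this frame
    full[:, 1::2] = half_prev  # columns from the previous frame
    return full

a = np.ones((2, 2))   # current frame's half-width shading
b = np.zeros((2, 2))  # previous frame's half-width shading
img = checkerboard_resolve(a, b)  # shape (2, 4), columns alternate 1, 0
```

The ID buffer's role in the real resolve is in choosing which previous-frame pixels are safe to carry over, which this sketch omits.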
 