PBR

So, I might as well ask: with the PBR pipeline generally getting video games closer to CGI - do textures need to be specifically made for PBR? Or is it also possible to have procedurally generated textures that interact properly with a PBR pipeline?

PBR is a buzzword in gaming. It's the real thing in CGI. PBR materials are basically materials that are physically plausible. To be legitimately physically plausible, you need a) lights that have area and b) materials/lights that behave according to some probability. In games, they just use the distribution model (i.e. the portion that describes the microfacet distribution of the material you are trying to mimic) with non-realistic lights (i.e. spot, point, directional) and no probability. Textures aren't what achieves PBR; that's just art.
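For context, the "distribution model" in question is typically the GGX/Trowbridge-Reitz microfacet normal distribution that most modern engines use for specular. A minimal sketch (function name and remapping choice are my own, not any particular engine's code):

```python
import math

def ggx_ndf(n_dot_h: float, roughness: float) -> float:
    """GGX/Trowbridge-Reitz normal distribution term D(h).

    Gives the relative concentration of microfacet normals around the
    half-vector h; roughness in (0, 1], n_dot_h = dot(N, H).
    """
    a = roughness * roughness           # common Disney-style remap: alpha = roughness^2
    a2 = a * a
    denom = n_dot_h * n_dot_h * (a2 - 1.0) + 1.0
    return a2 / (math.pi * denom * denom)

# A smooth surface concentrates energy much more tightly around the normal:
print(ggx_ndf(1.0, 0.1) > ggx_ndf(1.0, 0.9))  # True
```

This single statistical term is what games mostly keep from the full probabilistic microfacet theory; the stochastic sampling around it is dropped.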
 
So, I might as well ask: with the PBR pipeline generally getting video games closer to CGI - do textures need to be specifically made for PBR? Or is it also possible to have procedurally generated textures that interact properly with a PBR pipeline?

A material as you see it is the sum of various interactions of light. For the computer, and for simplicity, we decompose this black-box transformation into a handful of the most important observable features and try to find an approximate formula that is a sufficiently good look-alike. There is no large-scale light/surface interaction model that is correct for all cases; there are only correct wave-based light-surface interaction models at wavelength scale, and particle-based light-surface interaction models are insufficient as well.

The result is not really physical; it is arguably "based" on physics, but not on physical attributes - rather on observations of physical behaviour. Take specularity and specular color, for example. Their causes lie in the atomic properties of the surface being hit: light bouncing from atom to atom in a possibly mixed material/molecule, light waves entering the electric field of an atom and exchanging energy, light being an electromagnetic particle/wave, and so on. But specular maps don't contain lists of which atoms, in which quantities and arrangements, are hit. Instead, the model-maker looked at observations of light-surface interactions on a very, very large scale - millions of times larger than atomic scale, and thus totally "blurred" - and parameterized the formulas so that the observations (and not the causal attributes) can be fed in, resulting in a plausible reproduction of observable material responses. (The Fresnel curve, for example, is a plot of light-surface interaction showing the change in wavelength and the amount of absorption and retro-reflection over the angle - a lot of stuff thrown into one value.)
As such, none of our computer models are based on physical attributes; they are based on reproducing observations of the real world. And to be clear, all the non-artistic models (Phong, Blinn-Phong, Oren-Nayar, Cook-Torrance, He, Ward, Stam, Ashikhmin, Marschner, we could go on...) are physically based in this sense. What distinguishes them is the scope of representable materials and the input data. For some, the input data has to be fitted against reference images because it's some fantasy representation; for others, you can take measurements and use the data directly without anything else.
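A concrete example of parameterizing a formula to fit observations rather than causes is Schlick's approximation of the Fresnel curve mentioned above; a tiny sketch:

```python
def fresnel_schlick(cos_theta: float, f0: float) -> float:
    """Schlick's approximation: reflectance rises toward 1 at grazing angles.

    f0 is the observed reflectance at normal incidence (roughly 0.04 for
    common dielectrics like plastic). No atomic-scale physics is involved;
    the curve is simply fitted to measured behaviour.
    """
    return f0 + (1.0 - f0) * (1.0 - cos_theta) ** 5

print(fresnel_schlick(1.0, 0.04))  # head-on: just f0, i.e. 0.04
print(fresnel_schlick(0.0, 0.04))  # grazing: approaches 1.0
```

The input (f0) is a large-scale measurement, not a causal attribute - exactly the kind of observation-driven parameterization described above.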
There are very few wave-based models - He's, for example. Most of the others just invent something more or less sensible; Torrance invented the probability-based micro-surface model, but it's extremely far from physical reality - not so far, though, that it can't look believable.

Using the term PBR is meaningless; no scientist ever uses it, and it gets in the way of understanding more than it helps. If you think micro-surface models get raytracers near reality, you are mistaken. Only wavelength-scale BSSRDFs can do that. BTFs could do it from shot to shot, but then BTFs are basically real-world captures over all parameters of a render equation.

Now to answer your question(s):
Utilizing more complex models in games helps trail the even more complex models used in offline rendering, and using much more complex models helps trail reality. When you model atoms and waves, the imagery you get is, in fact, the real stuff. Let philosophy decide whether atoms living in a computer and the ones outside are equally "real" (is a simulation of reality less real than reality?).
And yes, you need to author specifically for one model or the other. More often than not you can't translate the data from one model to another, because most models are not sub- or super-sets of one another; pretty much every model has something another can't represent, so you can't translate reliably all the time.
When a model is based on statistics and probability functions, it's relatively easy to invent a procedure that produces meaningful data, so that is very much possible. Other data inputs, especially the ones which require fitting to imagery, don't lend themselves easily to procedural generation.
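Since statistical inputs like roughness are just parameters in a known range, generating them procedurally is straightforward. A toy illustration (purely hypothetical helper, not any engine's API):

```python
import random

def procedural_roughness_map(size: int, seed: int = 0) -> list:
    """Generate a size x size roughness map in [0, 1] procedurally.

    Any noise function would do; the only real requirement is that the
    output lands in the parameter range the shading model expects.
    """
    rng = random.Random(seed)
    base = rng.uniform(0.3, 0.6)   # material-wide average roughness
    return [[min(1.0, max(0.0, base + rng.uniform(-0.2, 0.2)))
             for _ in range(size)] for _ in range(size)]

tex = procedural_roughness_map(4)
assert all(0.0 <= v <= 1.0 for row in tex for v in row)
```

Contrast this with image-fitted inputs (measured BTF data, say), where there is no simple procedure that generates valid samples.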

TL;DR: Yes, yes.
 
So, I might as well ask: with the PBR pipeline generally getting video games closer to CGI - do textures need to be specifically made for PBR? Or is it also possible to have procedurally generated textures that interact properly with a PBR pipeline?
You need to use textures specific for PBR but they can be procedurally generated.

PBR is a buzzword in gaming. It's the real thing in CGI. PBR materials are basically materials that are physically plausible. To be legitimately physically plausible, you need a) lights that have area and b) materials/lights that behave according to some probability. In games, they just use the distribution model (i.e. the portion that describes the microfacet distribution of the material you are trying to mimic) with non-realistic lights (i.e. spot, point, directional) and no probability. Textures aren't what achieves PBR; that's just art.
Kinda. CryEngine uses image-based lighting for the ambient lighting (diffuse and specular), which is basically empirically acquired light sources. No analytical ones though, but recently DICE published a paper about the new features of PBR Frostbite, and they do include both IBL (which can be updated on a per-frame basis, oh yeah) and area lights (sphere, disk, tube), and both use physically based lighting units for maximum realism. I'm hoping they show this off at GDC.

Then again it's called "physically based rendering", not "physically exact rendering", so you have to give them some leeway :LOL:
 
Kinda. CryEngine uses image-based lighting for the ambient lighting (diffuse and specular), which is basically empirically acquired light sources.

But those aren't sampled according to a statistical probability. It's just encoded spherical harmonics (low-res) with regular light probe lookups.
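To make the "encoded spherical harmonics" concrete: a common irradiance-probe lookup just evaluates nine SH coefficients per color channel in the direction of the surface normal. A sketch, with the constants from Ramamoorthi and Hanrahan's irradiance environment map formulation (function name is my own):

```python
def sh_irradiance(coeffs, n):
    """Evaluate 2nd-order (9-coefficient) SH irradiance in direction n.

    coeffs: nine scalars L00..L22 for one color channel; n: unit normal
    (x, y, z). Note there is no sampling here at all - the probe is a
    closed-form lookup, not a statistical estimate.
    """
    x, y, z = n
    c1, c2, c3, c4, c5 = 0.429043, 0.511664, 0.743125, 0.886227, 0.247708
    L00, L1m1, L10, L11, L2m2, L2m1, L20, L21, L22 = coeffs
    return (c4 * L00
            + 2.0 * c2 * (L11 * x + L1m1 * y + L10 * z)
            + c3 * L20 * z * z - c5 * L20
            + c1 * L22 * (x * x - y * y)
            + 2.0 * c1 * (L2m2 * x * y + L2m1 * y * z + L21 * x * z))

# A probe with only the constant band lights every direction equally:
print(sh_irradiance([1.0] + [0.0] * 8, (0.0, 0.0, 1.0)))  # 0.886227 (= c4)
```

The deterministic nature of this lookup is exactly why it isn't "sampled according to a statistical probability".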

No analytical ones though, but recently DICE published a paper about the new features of PBR Frostbite, and they do include both IBL (which can be updated on a per-frame basis, oh yeah) and area lights (sphere, disk, tube), and both use physically based lighting units for maximum realism.

And the area lights still won't be using multiple importance sampling or producing shadows whose softness varies with distance. However they implement it, it's not really physically "plausible" unless they cast rays according to some statistic. ;)
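For reference, the multiple importance sampling mentioned here weights samples from competing sampling strategies (light sampling vs. BRDF sampling); Veach's balance heuristic is the standard choice. A minimal sketch (names my own):

```python
def balance_heuristic(pdf_a: float, pdf_b: float) -> float:
    """Veach's balance heuristic weight for a sample drawn from pdf_a.

    Combining light sampling and BRDF sampling with weights like this is
    what lets an unbiased path tracer handle area lights robustly - both
    tiny bright lights and broad glossy lobes stay low-variance.
    """
    return pdf_a / (pdf_a + pdf_b)

# The weights for the two strategies always sum to one:
w = balance_heuristic(0.7, 0.3)
print(w + balance_heuristic(0.3, 0.7))  # 1.0
```

A rasterizer's analytic area-light approximation skips all of this, which is the point being made.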
 
But those aren't sampled according to a statistical probability. It's just encoded spherical harmonics (low-res) with regular light probe lookups.
They're HDR cubemaps actually:

hdr_01_by_ocasm-d4fpvbm.png


And the area lights still won't be using multiple importance sampling or producing shadows whose softness varies with distance. However they implement it, it's not really physically "plausible" unless they cast rays according to some statistic. ;)
Oh yeah, at least in Frostbite they're faking area shadows by basically blurring the hell out of them, so no variable penumbra. Some guy modified the shader code in Crysis to produce variable-penumbra shadows, but Crytek didn't pick it up for their next engine iterations:

iLsxZsgWfYOnt.jpg

iRxJ5q9TyD0qQ.jpg
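The variable penumbra being discussed falls out of a simple similar-triangles relation, as used in percentage-closer soft shadows (PCSS); a rough sketch (names hypothetical):

```python
def penumbra_width(light_size: float, d_blocker: float, d_receiver: float) -> float:
    """PCSS-style penumbra estimate from similar triangles.

    The further the receiver is behind the blocker, the wider (softer)
    the shadow edge; a receiver touching the blocker gets a hard edge.
    Distances are measured from the light along its axis.
    """
    return light_size * (d_receiver - d_blocker) / d_blocker

# Receiver just behind the blocker -> nearly hard; far behind -> soft:
print(penumbra_width(1.0, 5.0, 5.5) < penumbra_width(1.0, 5.0, 20.0))  # True
```

A uniform screen-space blur ignores this relation entirely, which is why the result doesn't look like a real area-light shadow.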
 
Post 1: Everything above this post was copied from another thread.

First of all, I hate it when threads are closed before I can add my opinions ;) but I also think this is a subject worthy of its own discussion.
Also, keep in mind that I'm no expert at this, being a models guy instead of a textures/shaders one.

I believe PBR is a great invention despite the fact that it's still not completely correct or plausible.

Previous implementations of shading and lighting offered a lot of freedom to artists, because all the various parameters of a shader were completely independent and thus everything could be tweaked. Since VFX has always been about the final image, everyone was looking for shortcuts to get there as fast as possible.
The usual result of this approach was that an asset only worked well in the shots its parameters were tweaked for, and usually broke down under different lighting conditions. The solution was to create unique shaders and such for each of the different sequences, which meant a lot of artist time was required across the entire show.
The same was more or less true for games, although the budgetary restrictions usually meant that assets didn't look realistic in any setting at all instead.

The most important element of PBR is that it introduces certain limitations: things like energy conservation, where the reflected light - the sum of the specular and diffuse components - cannot be greater than the amount of light hitting the surface. Cheating is no longer allowed; there's a smaller number of parameters to tweak, with several restrictions. It's impossible to boost the speculars, AO can't be painted or baked into the diffuse/albedo maps, and so on.

The result is that artists have just one setting for all the parameters, which has to work in all possible conditions; but in the end this cuts down the man-hours spent on an asset while producing superior and more predictable results. It should also usually lead to more realistic looks. However, this doesn't automatically mean everything is going to be realistic either; you can still build materials that look wrong ;)
 
@ShiftyGeezer :) can you bring back the PBR posts from VFX, ethatron and sco field from the bearded thread here? I didn't know it closed!
 
I guess we'll see how it goes... I've not been particularly enthused about their design directions in a number of areas.

On topic, but I was hoping you could answer me on this one, since Josh Holmes confirmed Halo 5 is using PBR.

But how can you tell a game employs PBR just by looking at some in-game screenshots? What are some things to look for? :)
 
Ray tracing, 'nuff said - burn your bag of rasterization tricks in a fire and embrace Monte Carlo simulation! :p
Yes, PBR is a great move toward photo-realism, as was getting gamma correct not so long ago (or maybe still today).
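Since Monte Carlo simulation came up: the whole idea is estimating the rendering integrals with random samples weighted by their probability density. A toy, self-contained estimate of a hemisphere integral (names my own):

```python
import math
import random

def mc_cosine_integral(samples: int, seed: int = 1) -> float:
    """Monte Carlo estimate of the integral of cos(theta) over the hemisphere.

    Uniform hemisphere sampling (cos(theta) uniform in [0, 1)) has pdf
    1/(2*pi); the analytic answer is pi, and the estimate converges there
    as sample count grows. This randomness is exactly what 'Monte Carlo
    simulation' means in a path tracer.
    """
    rng = random.Random(seed)
    total = 0.0
    for _ in range(samples):
        cos_theta = rng.random()              # uniform z -> uniform hemisphere direction
        total += cos_theta * 2.0 * math.pi    # f / pdf, pdf = 1/(2*pi)
    return total / samples

print(abs(mc_cosine_integral(200_000) - math.pi) < 0.05)  # True: close to pi
```

The same estimator structure, with the BRDF and incoming radiance in place of cos(theta), is the core loop of a path tracer.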
 
Couldn't you perform an analysis of static elements of your scene, and calculate a map/tree of groupings of geometry that were likely to have rays bouncing between them? That might allow you to know which core/CU to transfer a ray to depending on where and at what angle the ray exited/entered a subsection of the total volume.
 
Couldn't you perform an analysis of static elements of your scene, and calculate a map/tree of groupings of geometry that were likely to have rays bouncing between them? That might allow you to know which core/CU to transfer a ray to depending on where and at what angle the ray exited/entered a subsection of the total volume.

Monte Carlo = random. :)
 
On topic, but I was hoping you could answer me on this one, since Josh Holmes confirmed Halo 5 is using PBR.

But how can you tell a game employs PBR just by looking at some in-game screenshots? What are some things to look for? :)

We have a thread on PBR to read that would be way more accurate than the answer I'm about to give you, but generally I would describe it as looking accurate and consistent. When you look at games like The Order, Ryse, DriveClub, Alien Isolation, or the Fox Engine games (there are others too), the materials appear closer to how they should appear in real life: how the light travels around and through the object based upon its ability to reflect and refract light. Someone has done some calculations ahead of time to put some values into the textures, or so I understand.

So if you look at the different concretes, marbles, quartz, and woods in Ryse, for instance, and you say "yeah, that's what it should look like with that much light shining on it", then it's likely PBR. The same applies to other games; DriveClub, for instance, does a great asphalt, the materials on the cars are correct, and so forth. PBR has been giving gamers the next-gen visuals that people associate next gen with.
 
This one is coming out soon and appears to be running PBR, though I guess that's in line with just running UE4.
 
We have a thread on PBR to read that would be way more accurate than the answer I'm about to give you, but generally I would describe it as looking accurate and consistent. When you look at games like The Order, Ryse, DriveClub, Alien Isolation, or the Fox Engine games (there are others too), the materials appear closer to how they should appear in real life: how the light travels around and through the object based upon its ability to reflect and refract light. Someone has done some calculations ahead of time to put some values into the textures, or so I understand.

So if you look at the different concretes, marbles, quartz, and woods in Ryse, for instance, and you say "yeah, that's what it should look like with that much light shining on it", then it's likely PBR. The same applies to other games; DriveClub, for instance, does a great asphalt, the materials on the cars are correct, and so forth. PBR has been giving gamers the next-gen visuals that people associate next gen with.

This one is coming out soon and appears to be running PBR, though I guess that's in line with just running UE4.

Oh nice! Apologies, I didn't know we had a thread based on PBR. :(

Thanks a lot for the explanation! Unless I'm looking at screens like the one below of The Order: 1886 (credit to v1ncelis from NeoGAF for the screenshot), it's hard for me to pinpoint whether a game utilizes PBR or not. I asked in the Halo 5 thread because some users were unsure whether it used it; then Josh Holmes came out on Twitter and said it does. And I'm playing the beta asking myself... where is it again? lol

Guess I need to train my eyes to catch the things you pointed out, like materials matching real life and the lighting. Thanks again! :)

theorder_1886_2015022wbo0h.jpg
 
Oh nice! Apologies, I didn't know we had a thread based on PBR. :(

Thanks a lot for the explanation! Unless I'm looking at screens like the one below of The Order: 1886 (credit to v1ncelis from NeoGAF for the screenshot), it's hard for me to pinpoint whether a game utilizes PBR or not. I asked in the Halo 5 thread because some users were unsure whether it used it; then Josh Holmes came out on Twitter and said it does. And I'm playing the beta asking myself... where is it again? lol

Guess I need to train my eyes to catch the things you pointed out, like materials matching real life and the lighting. Thanks again! :)

theorder_1886_2015022wbo0h.jpg
There is another thread on the Mizuguchi engine, and that is a pretty good showcase of it as well.
 
I don't know if you can necessarily see whether a game is using physically based rendering or not. PBR doesn't mean a game is going to look realistic; it just helps the artists keep things consistent. You can still do very stylized and unrealistic materials and lighting. Conversely, you can make materials look just as realistic without it, it's just a lot more work for the artists - or that's how I understand it.
 
That's exactly the thing - PBR is not something you can recognize, because theoretically one could get very realistic results even with all kinds of cheating going on: different shader variations for different environments, and so on.

PBR's main advantage is trading math and complexity for artist hours. There are far fewer knobs in the shader, with more limitations on the settings, but the one final setting should work in all possible circumstances and can be re-used for all similar materials, and so on. Once the artists are trained, they should become a lot more productive and create generally higher-quality content as well.
 
Also, I don't like that the gaming industry uses the term when the result is actually not really physically plausible at all. For example, their Kajiya hair model is far from plausible, and they use a Lambertian model for their diffuse material - again, not even realistic. Their only plausible BRDF is the GGX model used for specular, and even then the reflections don't follow that model (and they should).

rfxU_022.png


rfxU_021.png


Notice how ours follows the reflection when the highlight is anisotropic.

So this game, like most others, isn't even close to having fully physically plausible shaders covering the whole rendering pipe (i.e. reflections, hair, AO, SSS, etc.).
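For what it's worth, the anisotropic highlight behaviour being pointed out falls out of the anisotropic form of GGX; a sketch of the distribution term in a Burley-style parameterization (function name is mine):

```python
import math

def ggx_aniso_ndf(h_t: float, h_b: float, h_n: float,
                  alpha_t: float, alpha_b: float) -> float:
    """Anisotropic GGX distribution term.

    (h_t, h_b, h_n) is the half-vector in the tangent/bitangent/normal
    frame; alpha_t != alpha_b stretches the highlight along one axis,
    which is what makes brushed metal or hair highlights anisotropic.
    """
    d = (h_t / alpha_t) ** 2 + (h_b / alpha_b) ** 2 + h_n ** 2
    return 1.0 / (math.pi * alpha_t * alpha_b * d * d)

# With alpha_t == alpha_b this reduces to isotropic GGX; at normal
# incidence the peak is 1 / (pi * alpha_t * alpha_b):
print(ggx_aniso_ndf(0.0, 0.0, 1.0, 0.25, 0.25))
```

If reflections are traced (or filtered) against this same distribution, the reflected image stretches along with the highlight; a plain mirror-direction cubemap lookup can't do that, which is the discrepancy shown in the screenshots.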
 