SIGGRAPH 2016

Clukos

Four presentations in this year's SIGGRAPH next week:



http://advances.realtimerendering.com/s2016/index.html
 
Many interesting things planned, will take time to read everything and even longer to "digest" them ;)
 
Yep, clearly, and outside gaming there's a lot of news and presentations too for VFX, render engines...
 
More CGI/VFX oriented:

Position-Normal Distributions for Efficient Rendering of Specular Microstructure
http://cseweb.ucsd.edu/~ravir/glints.pdf

http://www.digitaltrends.com/computing/researchers-rendered-light-metal-wood-glints/

The method takes an uneven, detailed surface and breaks each of its pixels down into pieces that are covered in thousands of microfacets, which are light-reflecting points that are smaller than pixels. A vector that’s perpendicular to the surface of the material is then computed for each microfacet, aka the point’s “normal.” This “normal” is used to figure out how light actually reflects off the material.

According to Ramamoorthi, a microfacet will reflect light back to the virtual camera only if its normal resides “exactly halfway” between the ray projected from the light source and the ray that bounces off the material’s surface. The distribution of the collective normals within each patch of microfacets is calculated, and then used to figure out which of the normals actually are in the halfway position.
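That "exactly halfway" condition is just the standard half-vector test. A minimal sketch of it (hypothetical vectors and tolerance, not from the paper):

```python
import numpy as np

def normalize(v):
    v = np.asarray(v, float)
    return v / np.linalg.norm(v)

def reflects_to_camera(facet_normal, light_dir, view_dir, tol_deg=1.0):
    """True if this microfacet's normal lies (approximately) halfway
    between the direction to the light and the direction to the camera."""
    h = normalize(normalize(light_dir) + normalize(view_dir))  # half vector
    cos_angle = np.clip(np.dot(normalize(facet_normal), h), -1.0, 1.0)
    return np.degrees(np.arccos(cos_angle)) <= tol_deg

# A facet whose normal equals the half vector reflects the light:
l = [0.0, 0.6, 0.8]   # toward the light
v = [0.0, -0.6, 0.8]  # toward the camera
h = normalize(normalize(l) + normalize(v))  # here: straight up, (0, 0, 1)
print(reflects_to_camera(h, l, v))                # True
print(reflects_to_camera([0.0, 1.0, 0.0], l, v))  # False
```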


Ultimately, what makes this method faster than the current rendering algorithm is that it uses this distribution system instead of calculating how light interacts with each individual microfacet. Ramamoorthi said that it’s able to approximate the normal distribution at each surface location and then compute the amount of net reflected light easily and quickly. In other words, expect to see highly-realistic metallic, wooden, and liquid surfaces in more movies and TV shows in the near future.
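The speedup, then, comes from evaluating a fitted distribution at the half vector instead of looping over every facet. A much-simplified stand-in for the paper's 4D position-normal Gaussian mixtures (single Gaussian, made-up width):

```python
import numpy as np

def normalize(v):
    v = np.asarray(v, float)
    return v / np.linalg.norm(v)

def patch_reflectance(facet_normals, light_dir, view_dir, sigma=0.05):
    """Approximate a patch's reflection toward the camera by fitting a
    Gaussian to its microfacet normals and evaluating that density at
    the half vector, instead of testing every facet individually."""
    h = normalize(normalize(light_dir) + normalize(view_dir))
    mean_n = normalize(np.mean(facet_normals, axis=0))
    # Density of normals near h, modeled as a Gaussian in angle from the mean:
    angle = np.arccos(np.clip(np.dot(mean_n, h), -1.0, 1.0))
    return np.exp(-0.5 * (angle / sigma) ** 2)

# A patch of nearly up-facing normals under a symmetric light/view setup:
rng = np.random.default_rng(0)
normals = np.array([normalize([0.02 * x, 0.02 * y, 1.0])
                    for x, y in rng.normal(size=(1000, 2))])
r = patch_reflectance(normals, [0.0, 0.6, 0.8], [0.0, -0.6, 0.8])
print(round(r, 3))  # close to 1: most normals sit near the half vector
```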
 
Just for sake of completeness: http://kesen.realtimerendering.com/sig2016.html

edit - BTW @sebbbi I saw your tweet about virtual texturing with depth only, using volume-encoded UV maps... wouldn't that require 3 passes? Would it be worth it?
edit2 - or rather, an extra pass: one pass to lay down Z, one pass to ascertain the visible portions of the textures, and finally a third pass or a compute shader to shade pixels.
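For what it's worth, the pass structure described above could be sketched roughly like this; a hypothetical CPU-side illustration of a virtual-texturing feedback loop (the page size and helpers are made up, this is not sebbbi's actual technique):

```python
import numpy as np

# Sketch of the three passes discussed above:
# 1) depth pre-pass (omitted), 2) feedback pass recording which texture
# pages are visible, 3) shading pass sampling only resident pages.

PAGE = 16  # texels per virtual-texture page side (made-up value)

def feedback_pass(visible_uvs, tex_size):
    """Pass 2: collect the set of virtual-texture pages touched by the
    UVs of visible (depth-tested) pixels."""
    texels = np.clip((visible_uvs * tex_size).astype(int), 0, tex_size - 1)
    return {tuple(t) for t in texels // PAGE}

def shade_pass(visible_uvs, resident_pages, tex_size):
    """Pass 3: shade only pixels whose page is resident (a real
    implementation would fall back to a lower mip otherwise)."""
    texels = np.clip((visible_uvs * tex_size).astype(int), 0, tex_size - 1)
    return [tuple(t // PAGE) in resident_pages for t in texels]

uvs = np.array([[0.1, 0.1], [0.9, 0.9]])
pages = feedback_pass(uvs, tex_size=256)    # pages requested this frame
print(len(pages))                           # 2 distinct pages
print(shade_pass(uvs, pages, tex_size=256)) # [True, True] once streamed in
```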
 
Confirmation that TAA in UC4 is combined with a sharpen filter, and the whole system probably costs less than 2 ms on PS4... No cost is given for motion vectors.
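Pairing TAA with a sharpen makes sense, since the temporal blend softens the image. A minimal sketch of the idea (simple exponential blend plus an unsharp mask; not UC4's actual filter, and neighborhood clamping is omitted):

```python
import numpy as np

def taa_resolve(history, current, alpha=0.1):
    """Exponential blend of the current frame into the history buffer,
    the core of temporal AA (clamping/reprojection omitted)."""
    return (1.0 - alpha) * history + alpha * current

def sharpen(img, amount=0.5):
    """Unsharp mask: boost the difference between the image and a 3x3
    box blur, a cheap way to counter TAA's softening."""
    padded = np.pad(img, 1, mode='edge')
    blur = sum(padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
               for dy in range(3) for dx in range(3)) / 9.0
    return img + amount * (img - blur)

frame = np.zeros((5, 5))
frame[2, 2] = 1.0  # a single bright pixel
resolved = taa_resolve(np.zeros_like(frame), frame, alpha=0.5)
out = sharpen(resolved)
print(out[2, 2] > resolved[2, 2])  # True: the dulled peak is boosted back up
```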
 
Impressive work by the artists on volumetric-based materials; all fabrics were done by hand because it takes less memory than scanned fabrics.
 
Winner of the SIGGRAPH 2016 Award for Best Real-Time Graphics and Interactivity, this scene based on Ninja Theory’s upcoming game, Hellblade: Senua’s Sacrifice, was shot, edited and rendered to final quality in minutes, a process that would normally take weeks or months.

This real-time cinematography project was developed by Epic Games, Ninja Theory, Cubic Motion & 3Lateral, with additional support from House of Moves, IKinema, NVIDIA and Technoprops.

http://www.guru3d.com/news-story/video-real-time-cinematography-in-unreal-engine-4.html
 
There could be a limitation in the graphics engine that simplifies the mapping of the captured data. Offline methods produce much more convincing results.
 
Epic's work on bringing Aggregate buffers to UE4 is very interesting and even a bit ambitious. I think that kind of approach will become very popular eventually.
 