I was kind of expecting more discussion here. I mean, the voxel cone tracing algorithm is awesome. It's clever and efficient and now it's implemented in what's probably the most licensed game engine. Seems like a big deal to me.
Anyway, there are a few things here that haven't come up yet that I'm curious about people's impressions of:
1) Tessellation. The mountains scream it and the main character's spikes are an obvious candidate, but are these displacement maps on conventional meshes? Are they using any higher-order surfaces in general? It doesn't look like it to me.
2) Textures. There isn't a mind-blowing amount of texture detail here, so it's hard to say if there's any virtualized texture tech being used - at least nothing beyond UE3.
3) AA. No jaggies visible to me in either video. Most likely MSAA but the shader aliasing looks solid as well. It's hard to say from a youtube video.
4) Light emitting particles. I've looked at this a few times and I can't really figure out if the particles are emitters in the GI solution. The room with the 'ice' particles does turn blue but it's hard to say if that's light from the sphere that's generating them or the particles themselves. The mass of them in the 'fire' element doesn't seem to be reflected in the lighting around the room to me.
5) How is that lava flow being handled? Looks like a pretty solid (not perfect but solid) fluid sim to me. Pretty cool if it's generalized and they can just drop a fluid in there like that.
That's about all I got.
Didn't want to start a new thread for this:
This video is from a Square Enix real-time tech demo demonstrating the capabilities of their new "Luminous" engine.
1) The polycount is shockingly low in this particular demo. Just look closely at the frame buffer grabs provided by NVIDIA or even the first wireframe shots posted in the Wired unveiling article a few weeks ago.
You will also notice that there's no AO (Ambient Occlusion) at all in the demo.
Remember that the voxel structure is pre-filtered. Now how did they achieve it at acceptable performance (guessing around 30fps)? Maybe by trading off precision/quality for performance. The other presentation on NVIDIA's site does mention, when talking about the glossy reflections, that lowering the resolution/precision/amount of specular reflection cones gives a performance increase. Perhaps that was one way they achieved it for the UE4 Elemental demo.
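That precision/performance trade-off falls straight out of how cone tracing marches the pre-filtered voxel pyramid: wider cones take bigger steps and read coarser mips, so fewer/wider cones is directly cheaper. Here's a minimal sketch of the idea in Python; `sample_voxel` is a hypothetical stand-in for a quadrilinearly filtered 3D texture fetch, and the names/constants are illustrative, not Epic's actual code:

```python
import math

def trace_cone(sample_voxel, origin, direction, aperture, max_dist,
               base_voxel_size=1.0):
    """Accumulate radiance along one cone through a pre-filtered voxel
    mip pyramid. sample_voxel(pos, lod) is a hypothetical stand-in for a
    filtered 3D texture fetch returning (rgb, alpha) at that mip level."""
    color = [0.0, 0.0, 0.0]
    occlusion = 0.0
    dist = base_voxel_size
    while dist < max_dist and occlusion < 0.95:
        # Cone footprint grows with distance; read a coarser mip to match.
        diameter = max(base_voxel_size, 2.0 * math.tan(aperture) * dist)
        lod = math.log2(diameter / base_voxel_size)
        pos = [o + d * dist for o, d in zip(origin, direction)]
        rgb, a = sample_voxel(pos, lod)
        # Front-to-back alpha compositing.
        w = (1.0 - occlusion) * a
        color = [c + w * s for c, s in zip(color, rgb)]
        occlusion += w
        # Step size scales with the cone width: wide cones finish fast.
        dist += diameter * 0.5
    return color, occlusion
```

The performance knob is visible in the loop: a larger `aperture` means larger steps and coarser mips, so fewer iterations per cone, and using fewer cones per pixel for the specular lobe multiplies that saving again.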
As I said, that was a disappointment to me, but on another watch today I think there might be some displacement mapping going on in the lava, which seems to have true geometric features, so maybe that's where it is. Again, Epic is using tessellation on things where artifacts end up less noticeable.
Well, there is no actual "Ambient Occlusion" to speak of, but their voxel GI naturally occludes their dynamic ambient light bounces, so you could say they have some low-frequency, world-space, directional ambient occlusion. But I do believe adding a good HD SSDO solution to complement their real-time radiosity would have raised the lighting quality in that demo a good notch, the same way Crytek's SSAO turns their even lower-frequency light propagation volumes (three cascades of a 32x32 grid around the camera) into something that truly fits in well with the scene.
If well done, it could make it hard to tell where the high-detail screen-space trick ends and the softer, world-space-correct occlusion begins.
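One hypothetical way to hide that seam is a simple crossfade driven by the occlusion sample radius: screen-space AO dominates at small radii where it's accurate, and the voxel term takes over as the radius grows past what screen space can capture. This is just a sketch of the heuristic, not anything Epic or Crytek have published:

```python
def blended_occlusion(ssao, voxel_ao, sample_radius, crossover_radius=0.5):
    """Crossfade a high-frequency screen-space AO term with the softer
    world-space occlusion a voxel GI solution already provides.
    sample_radius and crossover_radius are in world units; both AO terms
    are in [0, 1]. Names and formula are illustrative only."""
    # Blend weight ramps from 0 (pure SSAO) to 1 (pure voxel occlusion).
    t = min(max(sample_radius / crossover_radius, 0.0), 1.0)
    return (1.0 - t) * ssao + t * voxel_ao
```

Below the crossover radius the detailed screen-space result wins; beyond it the world-space term does, so neither technique's weakness shows at the transition.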
1) The polycount is shockingly low in this particular demo. Just look closely at the frame buffer grabs provided by NVIDIA or even the first wireframe shots posted in the Wired unveiling article a few weeks ago.
2) Indeed, texture quality is also surprisingly low.
3) Jaggies are everywhere if you watch the direct-feed video provided by NVIDIA (.MOV file on GeForce.com) and the original unretouched screenshots.
4) The particle system is all over the place IMO. First, it's fairly apparent that what was shown was nothing more than NVIDIA's APEX Turbulence (http://developer.nvidia.com/apex-turbulence), which was already showcased at GDC 2012 in UE3. The other physics-based effects were also done via PhysX, according to Tim Sweeney. So not really Epic tech here, it seems, unfortunately. You will also notice that there's no AO (Ambient Occlusion) at all in the demo. It seems light emitted by the particles has no effect on the GI solution right now.
5) Once again: most probably NVIDIA PhysX.
Timothy Lottes on his blog said: "FYI TXAA is not in any of the screens or videos for UE3 or UE4. But you will be seeing TXAA soon..."
What bothers me more than the low texture resolution is the fact that it's 2012 and we're still using linear interpolation for texture filtering.
What do you want to be using? Bicubic? That's obviously only going to make much difference for mag filtering, which is only relevant when you *are* using low texture resolution.
I'm not really disagreeing (although people can do it now with 4 bilinear taps), but I just found it odd that you said you didn't care about low res textures but wanted better mag filtering... obviously higher res textures would remove the need for mag filtering so presumably that should be an acceptable solution to you as well.
Ah ok, fair enough. I guess what I meant was that the combination of low-resolution textures and linear interpolation is what gets me. I think higher-resolution textures would help, but I've never seen a game where I couldn't get close enough to something to see artifacts from linear filtering.
I don't see low resolution textures going anywhere for a while, and even if you have 2k textures on everything it's still nice to have a decent edge case. Also I hate aliasing, and linear filtering has a poor frequency response for magnification, so it would be nice to see magnification done with higher order interpolation.
Neat demo of Kismet in Unreal 4.
That's Mark Rein.