New Reality Engine Footage

DjordySeelmann said:
PRT is scalable to deformable objects. Moreover, Doom 3 uses stencil shadows, not shadow maps or shadow volumes.

Shadow maps and shadow volumes to compete with PRT? Try scaling shadow mapping and shadow volumes to a huge outdoor environment; the lack of fillrate will kill you. PRT is absolutely the best solution for 'real-time' radiosity at this moment.

I'd be very interested in hearing more about how this can be done. I haven't seen any research on providing actual radiance transfer (in the Peter-Pike definition) on even dynamic objects. I'm unsure what you could be doing for the deformable case.
 
squarewithin said:
XxStratoMasterXx said:
HDR etc... is all in the D3 engine, and the shadowing is better in D3, so I'm guessing you're correct :)

Doom 3 didn't have HDR. Carmack said he experimented with a render path for it but never included it in the game.

The engine does. It's usable, according to Carmack. Also, to the person who said D3 uses stencil shadows not stencil volumes...well STENCIL VOLUMES ARE STENCIL SHADOWS!

Anyway, point is, this engine isn't too impressive.
 
He is the engine's developer, man, so be nice.

Anyway, I think he is trying to say he didn't know PRT could be done dynamically, let alone on deformable objects, in real time. Thus he's asking how you implemented it, or, to put it simply, how you did it.
 
XxStratoMasterXx said:
squarewithin said:
XxStratoMasterXx said:
HDR etc... is all in the D3 engine, and the shadowing is better in D3, so I'm guessing you're correct :)

Doom 3 didn't have HDR. Carmack said he experimented with a render path for it but never included it in the game.

The engine does. It's usable, according to Carmack. Also, to the person who said D3 uses stencil shadows not stencil volumes...well STENCIL VOLUMES ARE STENCIL SHADOWS!

Anyway, point is, this engine isn't too impressive.

Points you've raised so far don't seem to support that conclusion. Moreover, Carmack probably dropped the HDR because it didn't do what he expected from it, and so he left it out. For a fair number of developers, this alone is already a reason not to use HDR in combination with the D3 engine, and I suspect the D3 engine is not optimized for its use, since it wasn't utilized by Doom 3 itself.

And did you ever hear me say that D3 uses stencil volumes? I said stencil shadows, seriously. Not that it makes a difference.

I didn't implement LDPRT; I'm just a level designer, but the link I provided you with should be enough to get an idea of how LDPRT works. Since LDPRT has been supported since the latest DX 9.0 SDK, it shouldn't be too hard to get it implemented. Thanks for the clarification, Xenus.
 
DjordySeelmann said:
Concerning the LDPRT (Local Deformable PRT):
http://msdn.microsoft.com/library/d...ialsandsamples/samples/localdeformableprt.asp

I'm not sure what you mean by "I haven't seen any research on providing actual radiance transfer (in the Peter-Pike definition) on even dynamic objects." Could you explain?

Sorry, should have been clearer. They summed it up here:

LDPRT differs from standard PRT by employing a single coefficient per spherical harmonic (SH) order; this ensures that the transfer function has radial symmetry around the normal. This radial symmetry allows for very quick rotations and makes LDPRT suitable for use on animated meshes.

By enforcing radial symmetry it carries no directional information about the light source, compressing the SH coefficients from n^2 to n. Also, it's only locally deformable, so one wing can't shadow the other.
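If it helps make that concrete, here's a minimal sketch of the idea in C++ (my own illustration, not Reality's code and not the SDK sample; the order-3 cutoff and all names are mine): with radial symmetry, each SH band needs only one stored coefficient, and "rotating" the transfer to the animated normal is just a per-band scale of the SH basis evaluated at that normal.

[code]
// Minimal LDPRT-style shading sketch (illustrative only; not Reality's code
// and not the SDK sample). One zonal-harmonic coefficient per band is stored,
// and "rotating" it to the animated normal is a per-band scale of the SH basis.
#include <cstdio>

// Real spherical-harmonic basis, bands 0..2, at unit direction (x, y, z).
void EvalSH9(float x, float y, float z, float sh[9]) {
    sh[0] = 0.282095f;                          // band 0
    sh[1] = 0.488603f * y;                      // band 1
    sh[2] = 0.488603f * z;
    sh[3] = 0.488603f * x;
    sh[4] = 1.092548f * x * y;                  // band 2
    sh[5] = 1.092548f * y * z;
    sh[6] = 0.315392f * (3.0f * z * z - 1.0f);
    sh[7] = 1.092548f * x * z;
    sh[8] = 0.546274f * (x * x - y * y);
}

// zh[3]: one transfer coefficient per band (3 values instead of 9).
// lightSH[9]: the distant lighting projected into SH.
float ShadeLDPRT(const float zh[3], const float lightSH[9],
                 float nx, float ny, float nz) {
    static const float kBandScale[3] = {3.544908f, 2.046653f, 1.585331f}; // sqrt(4*pi/(2l+1))
    float sh[9];
    EvalSH9(nx, ny, nz, sh);
    float result = 0.0f;
    int i = 0;
    for (int l = 0; l < 3; ++l)
        for (int m = -l; m <= l; ++m, ++i)
            result += zh[l] * kBandScale[l] * sh[i] * lightSH[i];
    return result;
}

int main() {
    float zh[3]      = {0.9f, 0.4f, 0.1f};                         // made-up transfer
    float lightSH[9] = {1.0f, 0.0f, 0.6f, 0.2f, 0, 0, 0.1f, 0, 0}; // made-up light probe
    std::printf("%f\n", ShadeLDPRT(zh, lightSH, 0.0f, 0.0f, 1.0f)); // normal = +Z
    return 0;
}
[/code]

That per-band scale is what makes it cheap enough for animated meshes; the price, as noted above, is that the transfer can't encode anything that isn't radially symmetric about the normal.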

Part of the problem is that the term PRT has grown to mean the set of all things relating to spherical harmonics, ambient occlusion, and diffuse interreflection. If a technique does any of those things, people call it PRT, even if it bears no real resemblance to the original SIGGRAPH paper.

Also, don't (whoops) interpret me as attacking the quality of your work, just the somewhat ambiguous description of what you are doing.
 
Doom 3 can support day and night cycles. It's been done. Outdoor areas have also been done in the Doom 3 engine, and they look stunning.
The content wasn't designed for HDR; that's why he didn't enable it. It's there, and it can work.
No, Reality doesn't have "real-time radiosity". No engine will have "real-time radiosity" for a while.
And Doom 3 doesn't require a recompile AND its editor is what-you-see-is-what-you-get. The website you linked to is erroneous.
Also, how the hell can Doom 3 be architected for DX9? It's OpenGL! Now, D3 supports fragment programs, and you can program whatever you really want in terms of "DX9" stuff.

Most of the stuff on that comparison website is TOOLS, not the actual engine.

I'm sorry, but this engine isn't really superior to Doom 3, and CERTAINLY not Unreal Engine 3.
Heh, however I don't think you'll agree with me since you're on the dev team.
 
Reality Engine supports more than just PRT for shadowing. It makes use of PCF with sampling for direct lights (with soft edges) and PRT for indirect soft shadowing and lighting. As to how well DPRT stands up compared to UE3's lighting/shadowing model, read the post below.

TimJohnson said:
Hey, I'm the Engine Lead at Artificial Studios.

I must admit we're guilty of laziness in that lobby scene. The lobby was something we threw in at the last minute to demonstrate some nice bump mapping, parallax, and HDR. I believe the textures were from an old ATI demo, but Humus had done a good job of showing them off in his demo, and our artists hadn't quite grasped the concept of good parallax maps, so, being out of time, we just threw them in. However, those were freeware, and absolutely everything else in all our demos was made in-house.

As for our demos: the Mansion demo was mostly finished in 2003, when the best card we could get was a 9700. We squeezed out every possible feature we could on this card, but of course it's impossible for a 2003 demo to rival a 2004 UE3 demo when they are running a native ps3.0, DX9-minimum pipeline! Furthermore, we've got a very small art team compared to Epic, so it's hard to compete on raw presentation.

Our feature set matches UE3 quite closely these days. In addition, we have some things they do not. For example, we have adopted PRT as our core pipeline, with heavy tool support. Absolutely everything in the outside world now uses this for the base pass, with our per-pixel effects on top. The Backyard demo is the first example of this. Seeing it in motion is quite something: all interreflections, subsurface scattering, and soft shadows transition as day moves to night, the sun goes down, and the moon comes up.

Epic, on the other hand, have stuck to lightmaps multiply blended with per-pixel lighting for their core architecture. I believe this is a case of legacy technology rather than the demands of the 2006 timeline. We can scale our PRT scenes better than we could our per-pixel scenes! The Backyard demo actually runs decently on a GeForce 3. I'm working on normal-mapped LDPRT for our character demo. Keep an eye out for that ;-)


We also have some really cool content creation advancements, such as our networked level editor. Anyone can jump on the server and modify the level then play a networked game at the click of a button.

Do we know what resolution that video was made at on an R9700? Any AA? AF? Tunings on the R9700 just for the purpose of the video? CPU?

The scene ran at ~35-40fps with FP16 HDR at 800x600 on a 9700 Pro, with a 2GHz Athlon. The video system specs were slightly different; the capture software takes a huge hit. We had an insane amount of per-pixel lights lying around, nothing was static, and we applied rigid body physics to everything we could. Was quite a stress test ;-)
 
Tools determine about 50% of a developer's decision on which engine to license. While there's a wide array of engines supporting the latest techniques, tools make the largest difference today.

You don't create content specifically for HDR, except where it concerns overbright features of textures, and that is certainly not the reason HDR is being used and developed.

Now this is why PRT can be called real-time (to a certain degree, as you see):
[Image: PRT_DataFlow.png]


So that can surely be interpreted as real-time radiosity: the calculation of the shadows is a real-time process. More can be read about it here.
And of course, don't forget the spherical harmonics side of PRT.
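To make that data flow concrete, here is a rough sketch (my own illustration of the diagram's split, not code from the SDK or from Reality; all names and numbers are made up): the transfer simulation is the offline part, while projecting the current lighting into SH and doing the per-vertex dot products is what runs every frame.

[code]
// Illustrative sketch only (my names, not the SDK's or Reality's): the part of
// PRT that is "real-time" is the per-frame lighting projection and the
// per-vertex dot product; the transfer simulation itself is an offline step.
#include <array>
#include <cstdio>

using SH9 = std::array<float, 9>;

// OFFLINE (level compile): a simulator bakes one transfer vector per vertex,
// encoding self-shadowing and interreflection. Stubbed here with made-up data.
SH9 PrecomputedTransferForVertex() {
    return {0.8f, 0.1f, 0.3f, 0.0f, 0.0f, 0.0f, 0.05f, 0.0f, 0.0f};
}

// PER FRAME: project the current sun/sky into SH (cheap), then shade each
// vertex with a 9-term dot product (also cheap). This is the real-time part.
float Shade(const SH9& transfer, const SH9& lightSH) {
    float r = 0.0f;
    for (int i = 0; i < 9; ++i) r += transfer[i] * lightSH[i];
    return r;
}

int main() {
    SH9 transfer  = PrecomputedTransferForVertex();   // loaded from disk in practice
    SH9 duskLight = {1.2f, 0.0f, 0.4f, 0.2f, 0.0f, 0.0f, 0.1f, 0.0f, 0.0f}; // made-up probe
    std::printf("shaded value at dusk: %f\n", Shade(transfer, duskLight));
    return 0;
}
[/code]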

The pros of Doom 3 supporting day/night cycles are quickly defeated: without any kind of real-time radiosity, having a day/night cycle is technically useless, since there's no accurate lighting.

I haven't seen any Doom 3 WYSIWYG editor yet; do you have a link for me?

Hehe, seems you've found Tim's post ;)
 
1. I don't like their use of C# as a scripting language (MS-only, not portable).
2. Sounds like a very inefficient engine for complex environments.
3. I don't see implementing PRT as a basic algorithm as being overly significant, since it still can only work on static geometry (sure, you might be able to do some very limited animation, but nothing like characters moving around). PRT results in long level compile times. And no, it's not real-time radiosity (it wouldn't include the word pre-computed if it was!).
4. The demo videos posted that I saw (game demo and mansion) really didn't show any of these things to any significant effect. The Game demo appeared to only have shadowing computed from directly above (which means that shadows were probably just projected geometry, not stencil shadows or anything like that). The mansion demo mentioned soft shadowing, but did not actually show any soft shadowing.

Anyway, I just don't buy that this engine can really be a strong competitor to Unreal 3, or other big-name engines, due to the support Epic and other companies can offer, and their experience in making games.
 
DjordySeelmann said:
You don't create content specifically for HDR, except where it concerns overbright features of textures, and that is certainly not the reason HDR is being used and developed.

You'd better, or you'll get lots of quantization errors :p
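To show what I mean by quantization errors, here's a toy example (my own, with made-up values): if source content with overbright values gets authored and stored the usual 8-bit LDR way, everything above 1.0 is clamped and the rest is quantized to 256 steps, which is exactly the banding and blow-out you see when such content is later fed into an HDR pipeline.

[code]
// Toy illustration (my own, values made up): storing an overbright source
// value in an 8-bit LDR texel clamps and quantizes it, which is why content
// authored without HDR in mind tends to band or blow out once HDR is enabled.
#include <algorithm>
#include <cmath>
#include <cstdio>

// Encode to an 8-bit texel the LDR way: clamp to [0,1], quantize to 256 steps.
unsigned char EncodeLDR(float v) {
    float clamped = std::min(std::max(v, 0.0f), 1.0f);
    return static_cast<unsigned char>(std::lround(clamped * 255.0f));
}
float DecodeLDR(unsigned char t) { return t / 255.0f; }

int main() {
    float sources[] = {0.5f, 1.0f, 2.5f, 8.0f};   // 2.5 and 8.0 are "overbright"
    for (float s : sources) {
        float back = DecodeLDR(EncodeLDR(s));
        std::printf("source %.2f -> stored %.4f (error %.4f)\n", s, back, s - back);
    }
    // A half-float (FP16) target keeps the overbright range instead of clamping it.
    return 0;
}
[/code]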

DjordySeelmann said:
Now this is why PRT can be called real-time (to a certain degree, as you see):

So that can surely be interpreted as real-time radiosity: the calculation of the shadows is a real-time process. More can be read about it here.
And of course, don't forget the spherical harmonics side of PRT.

No, that's real-time display of precomputed data. That's why it's called precomputed radiance transfer. You aren't creating that data in real-time.

But in the LDPRT you cited, you aren't using it for shadow generation. You said that yourself. The point of radiosity, photon maps, and other GI techniques is that you have a unified solution to illumination and reflection, not a cobbled-together collection of shadow maps, stencil shadows, PRT, and whatever else you can find, like current (and next-gen) engines do.

And I fail to see how simply invoking spherical harmonics gives you real-time radiosity.

And never mind the fact that PRT isn't radiosity in the first place.
 
1. I don't like their use of C# as a scripting language (MS-only, not portable).

What C# offers compared to "older" scripting languages far outweighs the two disadvantages you name, and I wouldn't see MS-only as a disadvantage myself; just look at the technology they've brought us over the last couple of years. Anyway, C# offers: almost instant compile times, automatic memory management, heavy IDE integration and IntelliSense, an industry-standard language you can easily hire people to use, a massive support library already out there for game-related functionality, proper debugging facilities (unlike most scripting languages), and a built-in compiler allowing artists or anyone else to edit scripts in Notepad and run the game.

I don't see how any other scripting language could outweigh that.

2. Sounds like a very inefficient engine for complex environments.
The fact that all the demos you've seen in the videos run fine on mid-range hardware suggests you're dealing with the opposite.

3. I don't see implementing PRT as a basic algorithm as being overly significant, since it still can only work on static geometry (sure, you might be able to do some very limited animation, but nothing like characters moving around). PRT results in long level compile times. And no, it's not real-time radiosity (it wouldn't include the word pre-computed if it was!).

In my experience, compiling PRT data takes less time than generating static lightmaps. PRT indeed can't be used for characters moving around; however, using other techniques for lighting and shadowing them is appropriate as well. For an explanation of why I think PRT can be interpreted as real-time lighting (again, to a certain degree), you can look at my post above. Moreover, something like PRT, whether you consider it real-time or not, is always better than static lightmaps in my opinion.

4. The demo videos posted that I saw (game demo and mansion) really didn't show any of these things to any significant effect. The Game demo appeared to only have shadowing computed from directly above (which means that shadows were probably just projected geometry, not stencil shadows or anything like that). The mansion demo mentioned soft shadowing, but did not actually show any soft shadowing.

Firstly, the Mansion demo shows technology from 16 months ago; the features mentioned on the website are what Reality supports today. In the game demo, characters, physics objects, and vehicles have soft drop shadows, but that's it; most of the other spotlights you see project dynamic shadows as well. Since shadow maps are really (fillrate) expensive, you can't use them on everything, which is why we use drop shadows on most "free-moving" objects at the moment. For instance, take the UE3 video: there was one dynamic shadow projector, and that was the oil lamp moving around through those caverns.
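For what it's worth, here's a toy sketch of why softened shadow maps eat fillrate (my own illustration, not Reality's shader; the map and values are made up): percentage-closer filtering takes several depth-map samples per shaded pixel, so even a small 3x3 kernel multiplies the shadow lookups by nine.

[code]
// Hedged sketch (not Reality's code): percentage-closer filtering takes
// several depth-map samples per shaded pixel, which is where the fillrate
// cost of soft shadow maps comes from.
#include <cstdio>

const int kMapSize = 4;
// Toy depth map: distance from the light, 1.0 = nothing blocking.
float g_shadowMap[kMapSize][kMapSize] = {
    {1.0f, 1.0f, 0.4f, 0.4f},
    {1.0f, 1.0f, 0.4f, 0.4f},
    {1.0f, 1.0f, 1.0f, 1.0f},
    {1.0f, 1.0f, 1.0f, 1.0f},
};

// Fraction of the 3x3 neighborhood that is lit for a pixel at (u, v) whose
// depth from the light is pixelDepth. 0 = fully shadowed, 1 = fully lit.
float PCF3x3(int u, int v, float pixelDepth) {
    int lit = 0, taken = 0;
    for (int dv = -1; dv <= 1; ++dv) {
        for (int du = -1; du <= 1; ++du) {
            int su = u + du, sv = v + dv;
            if (su < 0 || sv < 0 || su >= kMapSize || sv >= kMapSize) continue;
            ++taken;
            if (pixelDepth <= g_shadowMap[sv][su]) ++lit;  // sample passes the depth test
        }
    }
    return taken ? static_cast<float>(lit) / taken : 1.0f;
}

int main() {
    // A pixel near the shadow edge gets a fractional (soft) value.
    std::printf("soft shadow factor: %.2f\n", PCF3x3(1, 0, 0.7f));
    return 0;
}
[/code]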

The Backyard tech demo shows you what PRT does, and since you haven't seen it yet, I really recommend that you do so to get a better idea of what Reality is capable of. Link

Anyway, I just don't buy that this engine can really be a strong competitor to Unreal 3, or other big-name engines, due to the support Epic and other companies can offer, and their experience in making games.

Nonsense; if that were true, the firms you refer to would have turned into large IBM-like firms already. Still, to convince potential licensees that we do offer the support they'd at least expect when licensing an engine, they can enter our 60-day Evaluation program.
 
squarewithin said:
DjordySeelmann said:
Now this is why PRT can be called real-time (to a certain degree, as you see):

So that can surely be interpreted as real-time radiosity: the calculation of the shadows is a real-time process. More can be read about it here.
And of course, don't forget the spherical harmonics side of PRT.

No, that's real-time display of precomputed data.

Doesn't the same apply to lightmaps?

But in the LDPRT you cited, you aren't using it for shadow generation. You said that yourself. The point of radiosity, photon maps, and other GI techniques is that you have a unified solution to illumination and reflection, not a cobbled-together collection of shadow maps, stencil shadows, PRT, and whatever else you can find, like current (and next-gen) engines do.

Actually, illumination and reflection of light rays is exactly what PRT simulates. Shadow maps and stencil shadows don't have anything to do with PRT or radiosity, so I wonder why you mention them (I didn't call them parts of PRT, if that's what you think). The PRT data calculated on a mesh is simply compiled to a file which accompanies the mesh on which it is calculated; there's no point in packing all PRT files into one, since you'd have to recompile the whole level again when you alter the PRT properties or the position of an object. Coupling the PRT file with the mesh file has the advantage that when you change the properties or position of an object, PRT only needs to be recompiled on one object, and not a whole level. All of this is not what I call a cobbled-together collection of files, and if it were, it would be very inefficient.
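A rough sketch of that per-mesh coupling (purely my own illustration of the idea, not Reality's file format; every name here is hypothetical): the PRT data lives next to its mesh and is rebuilt only when that mesh's geometry or PRT properties change.

[code]
// Illustrative only: the per-mesh PRT coupling described above, expressed as a
// tiny cache. File layout, the hashes, and RebuildPRT are all hypothetical.
#include <cstdio>
#include <string>

struct Mesh {
    std::string name;
    unsigned    geometryHash;   // changes when vertices move
    unsigned    prtPropsHash;   // changes when PRT settings change
};

struct PRTFile {
    unsigned geometryHash;
    unsigned prtPropsHash;
    // ... per-vertex transfer vectors would live here ...
};

// Rebuild the PRT data for ONE mesh only; the rest of the level is untouched.
PRTFile RebuildPRT(const Mesh& m) {
    std::printf("recompiling PRT for %s\n", m.name.c_str());
    return {m.geometryHash, m.prtPropsHash};
}

PRTFile LoadOrRebuild(const Mesh& m, const PRTFile& cached) {
    bool stale = cached.geometryHash != m.geometryHash ||
                 cached.prtPropsHash != m.prtPropsHash;
    return stale ? RebuildPRT(m) : cached;
}

int main() {
    Mesh statue{"statue.mesh", 0xBEEF, 0x0042};
    PRTFile onDisk{0xBEEF, 0x0041};    // PRT properties changed since last bake
    LoadOrRebuild(statue, onDisk);     // only this one mesh gets recompiled
    return 0;
}
[/code]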

To clear something up about SH and how it relates to PRT:

PRT using low-order SH basis functions has a number of advantages over typical diffuse (N • L) lighting. Area light sources and global effects such as interreflections, soft shadows, self shadowing, and subsurface scattering can be rendered in real time after a precomputed light transport simulation. CPCA allows the results of the simulator to be compressed so the shader does not need as many constants or per-vertex data.

http://msdn.microsoft.com/library/d...phics/tutorialsandsamples/samples/prtdemo.asp
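For anyone wondering what the CPCA mentioned in that quote buys you, here's a toy sketch (my own, not the D3DX implementation; cluster counts and values are made up): vertices are grouped into clusters, each vertex stores only a cluster index plus a few PCA weights, and the full transfer vector is rebuilt at shading time as the cluster mean plus a weighted sum of that cluster's PCA directions.

[code]
// Toy illustration of CPCA reconstruction (not the D3DX implementation):
// per vertex we keep a cluster id and a few PCA weights; the full transfer
// vector is rebuilt as cluster mean + weighted sum of that cluster's basis.
#include <cstdio>

const int kSH  = 9;   // order-3 SH transfer vector
const int kPCA = 2;   // PCA directions kept per cluster (toy number)

struct Cluster {
    float mean[kSH];
    float basis[kPCA][kSH];
};

struct CompressedVertex {
    int   cluster;
    float weights[kPCA];   // only kPCA floats stored per vertex instead of kSH
};

// Rebuild the full transfer vector: cluster mean + weighted PCA directions.
void Reconstruct(const CompressedVertex& v, const Cluster clusters[], float out[kSH]) {
    const Cluster& c = clusters[v.cluster];
    for (int i = 0; i < kSH; ++i) {
        out[i] = c.mean[i];
        for (int p = 0; p < kPCA; ++p)
            out[i] += v.weights[p] * c.basis[p][i];
    }
}

int main() {
    Cluster clusters[1] = {{
        {0.8f, 0.1f, 0.2f, 0, 0, 0, 0, 0, 0},     // cluster mean
        {{0.1f, 0, 0, 0, 0, 0, 0, 0, 0},          // PCA direction 0
         {0, 0.1f, 0, 0, 0, 0, 0, 0, 0}},         // PCA direction 1
    }};
    CompressedVertex v = {0, {0.5f, -1.0f}};
    float t[kSH];
    Reconstruct(v, clusters, t);
    std::printf("reconstructed transfer[0..2]: %.2f %.2f %.2f\n", t[0], t[1], t[2]);
    return 0;
}
[/code]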
 
DjordySeelmann said:
Doesn't the same apply to lightmaps?

Yes, but I didn't equate them to real-time radiosity.

PRT obviously does more than mere lightmaps, but it's not real-time radiosity. We might be arguing around the same point, but I somehow doubt it. My contention is with your claim that you're performing real-time radiosity, or performing PRT (in the original definition of a full SH basis for use with ambient occlusion and diffuse interreflection) on dynamic scenes or deformable objects.

And, I'm well aware of what SH and PRT are.

DjordySeelmann said:
Actually, illumination and reflection of light rays is exactly what PRT simulates. Shadow maps and stencil shadows don't have anything to do with PRT or radiosity, so I wonder why you mention them (I didn't call them parts of PRT, if that's what you think).

You did:

DjordySeelmann said:
Now this is why PRT can be called real-time (to a certain degree, as you see):
So that can surely be interpreted as real-time radiosity: the calculation of the shadows is a real-time process.

And,

DjordySeelmann said:
The PRT data calculated on a mesh is simply compiled to a file which accompanies the mesh on which it is calculated; there's no point in packing all PRT files into one, since you'd have to recompile the whole level again when you alter the PRT properties or the position of an object. Coupling the PRT file with the mesh file has the advantage that when you change the properties or position of an object, PRT only needs to be recompiled on one object, and not a whole level. All of this is not what I call a cobbled-together collection of files, and if it were, it would be very inefficient.

But the PRT information for that object only describes its interaction with itself. It's an offline process (hence the simulation). It contains no information about how other objects in the scene affect it. It's for illumination via an environment map. You can render scene data into that map for important objects, but it doesn't scale to the general case.

Furthermore, the LDPRT demo you referred to only has subsurface scattering for homogeneous materials via a depth texture. The information they compute is only local data, i.e. how a given surface patch interacts with its immediate neighbors. It contains no information about the global state of the model for a given pose. That is why one wing of the bat does not shadow the other wing in the demo: that interaction isn't local.

Finally, I'm quite aware of how SH and PRT relate to each other. Illumination is integration over a hemisphere. SH takes a hemisphere and projects it onto a set of basis functions. PRT uses those bases to represent various illuminants and masks in the scene, thus compressing an integration into a vector multiplication, given (reasonable) constraints such as distant illumination.

You still aren't doing real-time radiosity or full dynamic/deformable PRT.
 
You still aren't doing real-time radiosity

You guys love to nit-pick, don't you! We've never claimed we were, except in the context of PRT. How detailed would you like us to get in a marketing blurb that everyone has to understand? We've never claimed to be using anything more than the well-documented techniques for PRT.

I don't like their use of C# as a scripting language

Tools and productivity-enhancing technologies will be the selling points for the next generation of engines. Manpower is the most expensive resource for any development studio, and by building on Microsoft's .NET platform and the C# language we move a step closer than our competitors to achieving improved productivity goals. It's the age-old debate: 10 years ago we'd probably have been arguing assembly vs. C vs. C++, but now that processors are fast enough, the field of argument has shifted from speed to compatibility.

Additionally, while I believe we can hold steady in a tit-for-tat comparison with any next-generation engine, bear in mind that UE3 is far above $500K royalty-free. This is significantly out of reach for most game developers (a field where profits are already razor-thin, if not negative). The proof is in their single licensee. We're not attempting to conquer the behemoth bloatware market; we're providing enabling technologies to reduce developer cost and increase productivity - this is where the successful market for next-generation solutions lies.

BTW - One of our licensees just posted a few new shots. Better than UE3 media? No, but this was done in three weeks with two artists who were using Reality for the first time ever.
 
TimJohnson said:
You guys love to nit-pick, don't you! We've never claimed we were, except in the context of PRT. How detailed would you like us to get in a marketing blurb that everyone has to understand? We've never claimed to be using anything more than the well-documented techniques for PRT.

Yeah, I got a bit more worked up than I meant to on that. My original point was just to clarify something that seemed either erroneous or vague. I wasn't attacking the quality of the engine. I took DjordySeelmann as simply trying to lecture me on my lack of understanding of PRT, instead of clarifying the points I was inquiring about. My bad.
 
DjordySeelmann said:
None of the top-tier engines supports a day/night cycle, and now I'm just talking about features. And all this on current mid-range hardware.

Check this out http://www.doom3world.org/phpbb2/viewforum.php?f=57&sid=997c0b296e6e6aa15e2b7205e25ac058

The test build 2 video shows off the day/night cycle; test 3 is more complete, but the video doesn't have it. You can download the actual content, though, and get the day/night cycle.

Edit: I see later you did suggest that D3 supports a day/night cycle, but discounted it b/c you disliked the implementation or some such thing...
 