A question about the Unreal Engine 3 demo shown at E3.

sonix666 said:
Yes, tell us, because that is the first time I ever heard that.
Me, too. The Unreal Engine 2.0 is certainly not going to receive such an enhancement. Why bother when the current material system and materials themselves are not at all set up for it? It'd be quite an extreme update indeed. Also, I think it was mentioned that SM 3.0 is not currently in the Unreal Engine 3.0, either. It's simply really long PS 2.0 shaders.
 
Well, I'm sure UE2 can support VS/PS 3.0, because it's pretty trivial to add such support. So it seems pretty apparent that future games released using UE2 technology will use SM3, but this has nothing to do with Epic and their current games.

Ostsol said:
Also, I think it was mentioned that SM 3.0 is not currently in the Unreal Engine 3.0, either. It's simply really long PS 2.0 shaders.
It was not in the UE3 that was demoed at GDC. I'm not sure what advancements have been made since then.
 
We probably won't be seeing it in UT2004, but I'd be willing to bet we'll see it in a future game from them based off the UE2 engine. If they don't then that just proves how useful it is.
 
ANova said:
We probably won't be seeing it in UT2004, but I'd be willing to bet we'll see it in a future game from them based off the UE2 engine. If they don't then that just proves how useful it is.
I'm sure we will see SM3 in a future game based on the UE2 engine, but I see no reason why this has anything to do with Epic being "influenced by nVidia."
 
ShePearl said:
The Unreal Engine 3 demo at E3, was it running on NV40 or R420? (Or on both, perhaps?)

If both: any comments on whether either of them ran it better than the other?

Cheers.

I saw the demo at E3 twice. It was in NVIDIA's booth and running on the 6800 ultra. It was not always perfectly smooth and you could sometimes see tearing in the image (they had vsync off), but it was easily the most amazing thing that I saw at the show.

I didn't see UE3, or even any of ATI's own R420 demos, in the ATI booth. They were featuring Half Life 2.

I have a feeling that if it does run on R420, it would actually be slower because they'd have to work around the lack of FP blending support. This requires ping-ponging two different FP buffers as render targets and re-running them through the pipeline to do the color sums in a fragment program. This is expensive in terms of memory, bandwidth, and switching rendering contexts.
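To make the ping-pong concrete, here's a toy CPU-side sketch of the data flow (all names are made up for illustration; on real hardware each pass would be a render-to-texture):

Code:
#include <algorithm>
#include <vector>

// Emulating "dst += light" without blend support: read the running sum
// from one FP buffer, add this pass's light, write to the other buffer.
void accumulateLights(std::vector<float>& bufA, std::vector<float>& bufB,
                      int numLights)
{
    std::vector<float>* src = &bufA; // previously accumulated light
    std::vector<float>* dst = &bufB; // render target for this pass
    for (int light = 0; light < numLights; ++light) {
        for (size_t px = 0; px < src->size(); ++px) {
            float contribution = 0.1f; // stand-in for the lighting math
            (*dst)[px] = (*src)[px] + contribution; // the "color sum"
        }
        std::swap(src, dst); // ping-pong: this pass's output feeds the next
    }
}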
 
This requires ping-ponging two different FP buffers as render targets and re-running them through the pipeline to do the color sums in a fragment program.
This is one thing I don't get. Let's consider HDR Rendering, for example. The whole "process" of HDR Rendering can be summarized in these steps:

* Render the scene into a floating point buffer
* Make a copy of this buffer, suppressing LDR color values
* Blur it
* Composite it with the original buffer with tone mapping

So where's the "ping-ponging"? Color accumulation isn't being done anywhere.
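For what it's worth, the per-pixel math behind steps 2 and 4 boils down to something like this (a toy sketch; the bright-pass threshold and the Reinhard-style operator are just placeholder choices):

Code:
#include <algorithm>

// Step 2: keep only the HDR energy, suppressing LDR values
// (the 1.0 threshold is arbitrary).
float brightPass(float hdr) { return std::max(hdr - 1.0f, 0.0f); }

// Step 4: composite the blurred bloom with the original buffer and
// tone-map the sum back into displayable [0, 1) range.
float toneMap(float hdr, float bloom)
{
    float c = hdr + bloom;
    return c / (1.0f + c); // simple Reinhard-style operator
}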
 
poly-gone said:
...they'd have to work around the lack of FP blending support. This requires ping-ponging two different FP buffers as render targets and re-running them through the pipeline to do the color sums in a fragment program.
This is one thing I don't get. Let's consider HDR Rendering, for example. The whole "process" of HDR Rendering can be summarized in these steps:

* Render the scene into a floating point buffer
* Make a copy of this buffer, suppressing LDR color values
* Blur it
* Composite it with the original buffer with tone mapping

So where's the "ping-ponging"? Color accumulation isn't being done anywhere.
If you're doing multiple passes (like Doom 3), then you do color accumulation in step one. If you want transparent surfaces like windows, flames, etc., you also need FP blending in step 1.

The other steps look like bloom lighting, not a necessary part of HDR.
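Put differently, step 1 itself is where the blending happens. The two framebuffer blend ops it relies on, written out as plain per-channel math (sketch only):

Code:
// GL_ONE, GL_ONE: additive accumulation of each lighting pass.
float accumulate(float dst, float src) { return dst + src; }

// GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA: layering a transparent surface.
float alphaBlend(float dst, float src, float srcAlpha)
{
    return src * srcAlpha + dst * (1.0f - srcAlpha);
}

// Without FP blending, neither op can run in the framebuffer at float
// precision, hence the ping-pong workaround described above.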
 
Chalnoth said:
ANova said:
We probably won't be seeing it in UT2004, but I'd be willing to bet we'll see it in a future game from them based off the UE2 engine. If they don't then that just proves how useful it is.
I'm sure we will see SM3 in a future game based on the UE2 engine, but I see no reason why this has anything to do with Epic being "influenced by nVidia."

I'd call advertising for nvidia in your games and using nvidia's hardware exclusively for demos influenced. They haven't said a word on how UE3 runs on the X800. Maybe they're hiding something? ;)
 
If you want transparent surfaces like windows, flames, etc., you also need FP blending in step 1.
Hmm... transparency would require multipass rendering, but not blending (and so, not more than one render target), since you could do that in the pixel shader without any overhead.

If you're doing multiple passes (like D3) then you do color accumulation in step one.
With the higher shader models, even the most complex shaders can be combined into a single pass, except things like shadow mapping or dynamic reflection mapping.

The other steps look like bloom lighting, not a necessary part of HDR.
The last step is, of course, the most important part and very much necessary ;)

So, with modern programmable hardware, blending isn't all that important. Filtering, on the other hand, is essential.
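The shader-side "blend" described above would look roughly like this (a toy sketch; the scene parameter stands in for the previously rendered buffer bound as a texture):

Code:
// Compositing a transparent surface in the fragment program instead of
// the blend unit: the scene rendered so far is sampled as a texture.
struct Rgba { float r, g, b, a; };

Rgba shadeTransparent(Rgba surface, Rgba scene)
{
    Rgba out;
    out.r = surface.r * surface.a + scene.r * (1.0f - surface.a);
    out.g = surface.g * surface.a + scene.g * (1.0f - surface.a);
    out.b = surface.b * surface.a + scene.b * (1.0f - surface.a);
    out.a = 1.0f;
    return out; // written straight to the render target, no blend needed
}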
 
ANova said:
I'd call advertising for nvidia in your games and using nvidia's hardware exclusively for demos influenced. They haven't said a word on how UE3 runs on the X800. Maybe they're hiding something? ;)

I remember discussions on the Atari UT2K3/4 message boards regarding the advantages of using ATI cards, only for Mark Rein (marketing/business guy) to pop in just to tell us how happy they were with the performance of Nvidia's NV3x cards. :rolleyes:
 
ANova said:
We probably won't be seeing it in UT2004, but I'd be willing to bet we'll see it in a future game from them based off the UE2 engine.

No, you actually said: "The fact that they're adding SM3.0 support to a game that doesn't even heavily use DX8 should speak volumes." You stated it was a FACT that they were adding SM3 support to a game (by implication UT2004), when you are totally and utterly wrong. Trying to wriggle out of it by changing your statement just makes you look more ridiculous.

ANova said:
If they don't then that just proves how useful it is.
Oh, I see. If they do add it they are damned, and if they don't they are damned too, eh? What a ridiculous and laughable argument. The fact that they are not adding it to UT2004 in no way makes it 'unuseful', it's just that it wouldn't be practical, useful or feasible to add it in any significant form to the current engine given it hardly uses PS1 at the moment. This in no way means it won't be useful in Unreal Engine 3 which makes extensive use of shaders.

ANova said:
I'd call advertising for nvidia in your games and using nvidia's hardware exclusively for demos influenced.
Virtually every game you buy has Nvidia's logo on it somewhere. It's called advertising. The marketing guys give them big money and they stick the advert on the box. Welcome to the world of capitalism.

ANova said:
They haven't said a word on how UE3 runs on the X800. Maybe they're hiding something? ;)
First, at the time of the first demo back in March, the only hardware they had that could run it was the NV40, since ATI hadn't given them an X800.

Second, why on earth should they say how well it runs on either card, when the performance is irrelevant given that the engine is still in development and games based on it are two years away? Who needs to know, and why? By the time games based on it are released, both cards will be redundant anyway.

Sadly, whilst trying to insinuate Epic are biased, you in fact do nothing more than show your own bias towards ATI, which totally blinds you to reality. These forums really don't need any more people like you. Sorry.
 
Bouncing Zabaglione Bros. said:
I remember discussions on the Atari UT2K3/4 message boards regarding the advantages of using ATI cards, only for Mark Rein (marketing/business guy) to pop in just to tell us how happy they were with the performance of Nvidia's NV3x cards. :rolleyes:
Perhaps, just perhaps, that is because the NV3x cards ran UT2003 fine? The NV3x cards only did badly in games that heavily used DX9 features, which wasn't the case with UT2003. So what is wrong with them telling people with NV3x cards that UT2003 will run fine on them? Should their marketing guy be lying and saying it doesn't? Are you mad?

Check out the UT2003 benchmarks on Anandtech. You'll see that the FX5700 Ultra actually beats the 9600XT and the FX5600 Ultra beats the 9600 Pro. Performance is fine for that game.
 
ANova said:
I'd call advertising for nvidia in your games and using nvidia's hardware exclusively for demos influenced. They haven't said a word on how UE3 runs on the X800. Maybe they're hiding something? ;)
1. They didn't use nVidia hardware exclusively for any demos.
2. They aren't in the business of advertising products from other companies, particularly not before launch.
 
Chalnoth said:
2. They aren't in the business of advertising products from other companies, particularly not before launch.

And how would you interpret their presence at the 6800 launch event? They were there to sell cookies?
 
This is one thing I don't get. Let's consider HDR Rendering, for example

I think a lot of the posters above don't realize that accumulation is blending. Anytime you have a multi-pass algorithm that needs to access the previously-rendered-to buffer for some equation, you can either choose to blend (if supported) or copy to texture and re-run through the rendering pipe to do the math you want.

Here's an example of a simplified Doom3/UE3 rendering pipe using high dynamic range, assuming that FP blending works. Stencil shadows could be replaced with shadow maps if desired; the basic structure would be the same.

Code:
set FP color buffer as render target
clear depth buffer and color buffer
turn off color writes
change depth test to GL_LESS
render scene to depth buffer
turn on color writes
change depth test to GL_EQUAL
enable blending with blend function GL_ONE, GL_ONE

Loop over lights
{
  clear stencil buffer
  render stencil shadows for current light
  render scene as lit by current light
}

turn off depth writes
change blend function to something like GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA
repeat loop over lights with depth-sorted translucent objects
Do HDR processing on FP buffer and write result to screen back buffer

If your card doesn't support blending at the color precision you wish to render at, your loop changes to something like this:

Code:
loop over lights
{
  Bind FP buffer 1 as render target
  clear stencil buffer
  render stencil shadows for current light
  bind FP buffer 2 as texture
  render scene as lit by current light, adding in the light already
    stored in FP buffer 2 (in the fragment program, each pixel looks up
    FP buffer 2 and adds it to the result of lighting the geometry)
  swap pointers to FP buffers 1 and 2
}

and you'll have to pre-render the z buffer into both FP targets. So, to get around the lack of FP blending, we have to do twice the z buffer work, waste on the order of 12 MB for an extra FP buffer (for example, a 1024x768 four-channel FP32 buffer is 1024 * 768 * 16 bytes = exactly 12 MB), complicate our code, complicate our fragment program, waste a texture unit, and swap render targets multiple times per frame.

FP blending is really important for efficiently doing HDR with multiple light sources.
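For contrast with the workaround above, here's the same accumulation when the blend unit can do the add (a CPU analogy of GL_ONE, GL_ONE blending; the per-light contribution is a stand-in):

Code:
#include <vector>

// One FP buffer, each light added in place: no second buffer, no swap,
// no extra z pass. This is what the blend unit does for free.
void accumulateWithBlending(std::vector<float>& accum, int numLights)
{
    for (int light = 0; light < numLights; ++light)
        for (size_t px = 0; px < accum.size(); ++px)
            accum[px] += 0.1f; // stand-in for this light's contribution
}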
 
Diplo said:
Bouncing Zabaglione Bros. said:
I remember discussions on the Atari UT2K3/4 message boards regarding the advantages of using ATI cards, only for Mark Rein (marketing/business guy) to pop in just to tell us how happy they were with the performance of Nvidia's NV3x cards. :rolleyes:
Perhaps, just perhaps, that is because the NV3x cards ran UT2003 fine? The NV3x cards only did badly in games that heavily used DX9 features, which wasn't the case with UT2003. So what is wrong with them telling people with NV3x cards that UT2003 will run fine on them? Should their marketing guy be lying and saying it doesn't? Are you mad?

Check out the UT2003 benchmarks on Anandtech. You'll see that the FX5700 Ultra actually beats the 9600XT and the FX5600 Ultra beats the 9600 Pro. Performance is fine for that game.
I wonder how performance would stack up if you used 43.45 for benchmarking. Or have we all forgotten and decided that it's A-OK now?
 
Ostsol said:
And what reason have they given you to not trust them? Please don't say it's because of TWIMTBP...

Seeing as they are a business, Epic obviously has to coordinate what they say. Offhand remarks about hardware are not exactly a good idea when you depend on those companies in part for your living. In this sense they'd be stupid to bash either company's hardware, although they do seem closer than most developers to NV through the TWIMTBP marketing... which is kind of questionable behavior on their part, but they certainly aren't to the point of having ATI fans swear them off...

Just guessing, but my bet is that this new-gen ATI hardware is going to be the shader leader... NV didn't have the speed last generation, and I don't think they've gained it... I'm curious as to whether PS/SM 3.0 even makes sense at this point; extra instructions just mean more cost and slower performance, so is it a benefit that can be used yet? No one has pushed it, so it can't really be said, but it just seems likely to me that high-speed 2.0 is going to do better right now than medium-speed 2.0/3.0.
 
AlphaWolf said:
Chalnoth said:
2. They aren't in the business of advertising products from other companies, particularly not before launch.
And how would you interpret their presence at the 6800 launch event? They were there to sell cookies?
To show off their new engine. There were other game developers there, too.
 