New dynamic branching demo

Humus

As I promised in the Far Cry thread, I made a demo showing the dynamic branching technique on non-PS3.0 hardware.

[screenshot: dynamicbranching.jpg]


http://esprit.campus.luth.se/~humus/

The performance improvement with early-out enabled is in the range of 2-4x over running without it.
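
For those wondering how it works under the hood: it's a two-pass early-out using the stencil buffer. Here's a rough sketch of the idea (not the demo's actual code; the shader-binding helpers below are just placeholders):

```cpp
#include <GL/gl.h>

// Placeholder stand-ins for the demo's own code:
void bindConditionShader();  // cheap pass: kills pixels where the branch is not taken
void bindLightingShader();   // expensive per-pixel lighting pass
void drawScene();

// Two-pass "dynamic branching" via early stencil rejection.
void drawLightWithEarlyOut()
{
    glEnable(GL_STENCIL_TEST);

    // Pass 1: evaluate the branch condition with a very cheap shader.
    // Pixels that shouldn't take the expensive path are killed in the
    // shader, so only the remaining pixels write stencil = 1.
    glColorMask(GL_FALSE, GL_FALSE, GL_FALSE, GL_FALSE);
    glStencilFunc(GL_ALWAYS, 1, ~0u);
    glStencilOp(GL_KEEP, GL_KEEP, GL_REPLACE);
    bindConditionShader();
    drawScene();

    // Pass 2: run the expensive shader only where stencil == 1.
    // Hardware with early stencil rejection culls everything else
    // before the pixel shader ever runs, which is where the win comes from.
    glColorMask(GL_TRUE, GL_TRUE, GL_TRUE, GL_TRUE);
    glStencilFunc(GL_EQUAL, 1, ~0u);
    glStencilOp(GL_KEEP, GL_KEEP, GL_KEEP);
    bindLightingShader();
    drawScene();

    glDisable(GL_STENCIL_TEST);
}
```

The condition pass is nearly free, so the more pixels the expensive shader can skip, the bigger the speedup.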
 
I LOVES IT!!!! :D

The difference in performance is remarkable; for me it was the difference between unplayable and smooth.
 
Nice .. GJ man .... i have one thing to say now ... no doubt ... !@#$ NV this round again ! 8)

RainZ
 
This is great and all Humus, but what really matters is not whether a homebrew app with spinning lights delivers on performance; what matters is how the games perform and look. There can be all the SM2.0 tricks in the world, but if devs don't use them it really doesn't matter. With the talk of the upcoming FarCry 1.2 patch having SM3.0-only visual effects, stuff like this is fun in theory but more useful if games actually implement it, and you have to wonder how well ATI is doing in that department. I mean, if the next big game was "Humus: The Game" then we might have something significant to talk about.

AFAIK it's not anything new either, so I'm not sure why this is being made such a big deal of, except to attempt to distract attention from and detract from Nvidia's more advanced Shader Model 3.0 support, whose game support is getting larger by the day. Given that you work for ATI, it's kind of hard not to make this connection. Sorry to say, but this sounds like FUD, in a hard "ATI needs to detract from the competitor's features because they are behind" way. Other comments you've made, like "Nvidia can consider themselves pwned," talking about raining on Nvidia's parade, etc., are also telling of your stance and of what brand you are probably optimizing for when you write those demos.

In the end though, if people do buy an ATI card and are disappointed because SM3.0 does offer things that SM2.0 doesn't in actual games, and users do end up missing out on visual effects because they own an X800 card, will you step up to the plate and take responsibility for making misleading claims that SM2.0 will look and perform similarly to SM3.0? Or are you going to let the blame fall on the developer and Nvidia? Again, since you now work for ATI these questions come to mind.
 
With the talk of the upcoming FarCry 1.2 patch having SM3.0-only visual effects

It doesn't have SM3.0-only visual effects. It has effects they chose to implement on SM3.0 only, whereas they could have implemented them on PS2.0 hardware.

It's a TWIMTBP title.
 
On the one hand, you have a point- ATI should work with developers more to get PS2.0 to be used in such a way. They seem to be WAY too passive on this, and there are far too many TWIMTBP games coming out. It's sad, really, and I wish they'd take the initiative. I hope HL2 changes the situation a bit.

But on the other hand, part of the blame for situations like this does fall on Nvidia, because they are misleading in their marketing. PS3.0 doesn't offer anything useful that can't be done in PS2.0. Effectively, they are overhyping it because the competition is too close for comfort on the PS2.0 level.
 
jvd said:
With the talk of the upcoming FarCry 1.2 patch having SM3.0-only visual effects

It doesn't have SM3.0-only visual effects. It has effects they chose to implement on SM3.0 only, whereas they could have implemented them on PS2.0 hardware.

Crytek apparently chose not to spend the time and money implementing an emulation on SM2.0 hardware for the new effects.

But that doesn't matter. The point is if you have an X800 card it looks like you may not be able to see the visual effects because they are SM3.0 only. Telling yourself that SM2.0 can emulate the same thing isn't going to make the visual effects magically appear on the screen.

Much like Nvidia did with the FX series, ATI has left themselves wide open to things such as this by not adequately supporting the latest shader technologies.

Devs have a choice. They can either code for SM3.0 and spend extra time coding and testing an SM2.0 emulation of those effects, or they can just code SM3.0. SM3.0 will be useful on the 6800 and all future Nvidia/ATI cards. An SM2.0 emulation will likely be too slow for last-generation (98xx/59xx) cards, and irrelevant for future-generation cards. So by spending the time to develop SM2.0 emulation of SM3.0 effects, the developer is essentially spending that time and money developing for a single video chipset - the R420/X800. Is it worth it to develop for a single chipset? It's up to ATI to convince the developer to do so, much like Nvidia was able to get NV3x paths into a number of games last year and this year.
 
Ruined said:
will you step up to the plate and take responsibility for making misleading claims that SM2.0 will look and perform similarly to SM3.0?
I don't think he is claiming that SM2.0 will look and perform similarly to SM3.0, just that it can look and perform similarly.
 
On my 6800 non-Ultra I get 37 FPS with dynamic branching set to true. With dynamic branching set to false it's 40 FPS.
 
Thowllly said:
Ruined said:
will you step up to the plate and take responsibility for making misleading claims that SM2.0 will look and perform similarly to SM3.0?
I don't think he is claiming that SM2.0 will look and perform similarly to SM3.0, just that it can look and perform similarly.

The problem is, if you say that it can, many will assume that it will. And if it doesn't some or most of the time, then that is misleading.
 
Crytek apparently chose not to spend the time and money implementing an emulation on SM2.0 hardware for the new effects.

Who said SM2.0 had to emulate them?

SM2.0 can do them just fine. SM2.0 does HDR fine in Half-Life 2.

Telling yourself that SM2.0 can emulate the same thing isn't going to make the visual effects magically appear on the screen.

No. Telling myself that I shouldn't support TWIMTBP titles that purposely lower quality on other cards to make Nvidia look better explains a lot.

Much like Nvidia did with the FX series, ATI has left themselves wide open to things such as this by not adequately supporting the latest shader technologies.

Maybe so. But the things not included in Far Cry for PS2.0 are not the fault of PS2.0; it is Nvidia not wanting them to be there, since they don't have any cards on the market that would benefit from them.

Would you be happy if Valve said, "From now on we are only going to let the 6800 Ultras run Shader Model 1.1 paths"?

Because that is the same thing that CryTek is doing right now.
 
ChrisRay said:
On my 6800 non-Ultra I get 37 FPS with dynamic branching set to true. With dynamic branching set to false it's 40 FPS.

Yeah, this demo doesn't work properly on my GeForce FX either, just like Humus' last demo. It seems they are being optimized/coded specifically for ATI cards now.
 
jvd said:
Would you be happy if Valve said, "From now on we are only going to let the 6800 Ultras run Shader Model 1.1 paths"?

Because that is the same thing that CryTek is doing right now.

That's not at all what CryTek is doing, and that is a horribly faulty analogy. The 6800 supports Shader Model 2.0, which Half-Life 2 uses - making the 6800U run HL2 in 1.1 would mean that Valve would disallow Nvidia cards from running a shader model they can run just fine.

On the other hand, CryTek coded additional visual effects for Shader Model 3.0, and chose *not* to code additional emulated fallback SM2.0 shaders for these specific effects. Not implementing the effects in Far Cry is simply CryTek not spending the time/money to do so - if ATI supported Shader Model 3.0 they could display the effects fine, but they don't.

A better analogy would be "How would you like it if Valve didn't include an NV3x path in Half-Life 2," because that is a similar case where one card can get similar effects to another card if additional time-consuming coding is done. And my response to that is that it's up to Nvidia to convince the dev that it's worthwhile to add this for their cards, because it is something that is not necessary, just like writing fallback shaders for every single visual effect for ATI cards that do not support the latest shader technology is not necessary.
 
I second what Ruined said. I'd also like to add that this isn't dynamic branching and it isn't a generic replacement for such. It is a special case which requires switching a bunch of render states and sending all the geometry through the pipe again. It's a hack.

I'm not saying it's not a nice technique, but why do you try to mislead people and say that it is more than it is, and that NVIDIA is "pwned"? If this is true then I expect ATI will be pushing this technique in the future and scrapping development of any SM 3.0 hardware now that you've single-handedly made it obsolete, right?
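
To put the difference concretely, here's a toy CPU-side model (purely illustrative, nothing to do with the actual demo or real GPU code):

```cpp
#include <cstddef>
#include <vector>

// Toy stand-ins; real shading and branch tests would go here.
struct Pixel { float value = 0.0f; };
bool  condition(const Pixel&)        { return true; }
float expensiveShading(const Pixel&) { return 1.0f; }

// What SM3.0 dynamic branching does, conceptually: one pass,
// with the branch decided per pixel inside the shader.
void shadeWithRealBranching(std::vector<Pixel>& pixels)
{
    for (Pixel& p : pixels)
        if (condition(p))
            p.value = expensiveShading(p);
}

// What the stencil trick does instead: two full passes, with the
// stencil buffer standing in for the branch condition. Everything
// goes through the pipe twice, plus a render-state switch in between.
void shadeWithStencilEarlyOut(std::vector<Pixel>& pixels)
{
    std::vector<bool> stencil(pixels.size());
    for (std::size_t i = 0; i < pixels.size(); ++i)  // pass 1: cheap condition
        stencil[i] = condition(pixels[i]);
    for (std::size_t i = 0; i < pixels.size(); ++i)  // pass 2: expensive path
        if (stencil[i])
            pixels[i].value = expensiveShading(pixels[i]);
}
```

The second version only wins when the expensive path is costly enough to pay for the extra geometry pass and the state changes; true per-pixel branching carries no such overhead.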
 
Great demo Humus,

On my 9800 Pro (Catalyst 4.5) this method gives an increase from 40 FPS to 125 FPS (that's about a 213% speed increase!)
 
ChrisRay said:
On my 6800 non-Ultra I get 37 FPS with dynamic branching set to true. With dynamic branching set to false it's 40 FPS.

Just for comparison.

9700 Pro: 140 FPS true, 43 FPS false

Fullscreen 1280x960: 95 FPS true, 26 FPS false
 