New dynamic branching demo

Well done, Humus.

For those pooh-poohing his parade: he has simply shown that you can take an effect and SPEED it up on a large installed base of hardware. One would think that developers want their cool effects to run with maximum performance on ALL cards. It shows how well the engine is built, which, as someone mentioned, is what Crytek is selling as well.

IMHO, the greatest reward in programming is figuring out something others never thought possible. Humus has taken what appeared to be an SM2.0 limitation and solved it.

Congrats
 
Looks like the current HLSL compiler doesn't favor dynamic branching; it just generated SM2.0-style code even though I was using the ps_3_0 profile. :cry:
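
For anyone who wants to reproduce this, the shader below is a minimal sketch of the pattern in question (all names are hypothetical). Compiling it with fxc under the ps_3_0 profile and inspecting the disassembly shows whether you got a real if/else block or flattened, predicated code; the compiler is allowed to do either.

```hlsl
// ps_3_0 pixel shader with a data-dependent branch. The HLSL compiler
// may flatten this into straight-line cmp/lerp-style code rather than
// emitting an actual if ... endif, which defeats the intended
// early-out. lightPos and lightRangeSq are hypothetical uniforms.
float3 lightPos;
float  lightRangeSq;

float4 main(float3 worldPos : TEXCOORD0,
            float3 normal   : TEXCOORD1) : COLOR
{
    float3 L = lightPos - worldPos;
    float4 color = 0;
    if (dot(L, L) < lightRangeSq)   // dynamic, per-pixel condition
    {
        // expensive path, wanted only for in-range pixels
        color = saturate(dot(normalize(normal), normalize(L)));
    }
    return color;
}
```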
 
Shame he's advertising it in a way that can be considered not-so-objective. Look at the results it created on these boards. A tiny little bit more professionalism on his part could do wonders to calm down the fanboys on here, both ATI's and nVidia's. And let's not forget about the good old PVR fanboys.
 
It's not really solved except in one specific case, though. A PS3.0 version of the same rendering algorithm would be able to do all the lighting application in one pass, while still not doing the lighting calculations that don't need to be done. With Humus's algorithm you'll be sending geometry data to the video card many more times than you would with PS3.0, which may become the limiting factor in some situations.
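
For illustration, the PS3.0 single-pass version being described might look roughly like this; a sketch only, with NUM_LIGHTS, lightPos and lightRangeSq as assumed names:

```hlsl
// ps_3_0: all lights applied in one pass, so the mesh is submitted
// only once. The per-pixel branch skips the expensive math for
// lights that cannot reach the pixel. All names are hypothetical.
#define NUM_LIGHTS 4
float3 lightPos[NUM_LIGHTS];
float  lightRangeSq[NUM_LIGHTS];

float4 main(float3 worldPos : TEXCOORD0,
            float3 normal   : TEXCOORD1) : COLOR
{
    float4 color = 0;
    for (int i = 0; i < NUM_LIGHTS; i++)
    {
        float3 L = lightPos[i] - worldPos;
        if (dot(L, L) < lightRangeSq[i])   // skip out-of-range lights
            color += saturate(dot(normalize(normal), normalize(L)));
    }
    return color;
}
```

The stencil approach reproduces the same per-pixel skipping, but it needs extra passes per light, which is where the additional geometry traffic comes from.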
 
london-boy said:
Shame he's advertising it in a way that can be considered not-so-objective. Look at the results it created on these boards. A tiny little bit more professionalism on his part could do wonders to calm down the fanboys on here, both ATI's and nVidia's. And let's not forget about the good old PVR fanboys.
Yeah, so? ;)

"We're putting out fires with gasoline,
putting out fires with gasoline!"
 
reever said:
And when developers talked about the benefits of SM3.0 they always mentioned branching and its main purpose, which always seemed to be helping with this per-pixel lights problem. Now it's possible without SM3.0, albeit using a trick, so where is the problem?

This "trick" is just something that has a similar result like dynamic branching. But it does only work with this shader.

So I don't understand why so many guys think that this shows anything important.

It's a tiny optimization for a specific task, nothing comparable to dynamic branching!
 
But it only works with this shader.

And? Here's a quote from Crytek themselves:

Q: 7) What aspects of the screenshots seen at the launch event are specific examples of the flexibility and power of Shader 3.0?

A: In the current engine there is no visible difference between PS2.0 and PS3.0. PS3.0 is used automatically for per-pixel lighting, depending on some conditions, to improve rendering speed.

Now doesn't this trick do what they want to do? Certainly, if they needed branching for more than that, they wouldn't have mentioned just this one thing.
 
trinibwoy said:
I'm not really trying to justify anything, and your analogy is poor at best :) Far Cry does support the highest shader model available on ATI cards: 2.0. Running a 2.0 card at 1.1 is in no way analogous to this situation.

I'm not talking about dynamic branching, geometry instancing, or whatever feature card Y doesn't support. I'm talking generally about what CryTek might NOT let some consumers enjoy, like offset mapping. We know nVidia tried to pass those pictures off as SM3.0 when even an nVidia engineer conceded in an interview that those effects were possible in 2.0. And if it's a matter of instruction count, there's always 2.0a and 2.0b.

Similar situation, wrong application. I don't think you can equate a benchmark to these extra Far Cry features. Please realize that these features are in a patch and are not what you 'paid for', so to speak. If you thought Far Cry was awesome before and were willing to buy it, why would you change your mind now? The game hasn't changed.

When I buy a game I expect it to either be bug-free (ah! Now there's a thought!) or to download patches. This isn't a separate feature; it's part of a patch, and thus it ends up being part of the "package" you buy. As such, when the patch offers feature A for card Z and leaves out card Y when it's perfectly capable of using it (or would be with a small amount of work), that's when I know the developers want to earn their living not by making the games consumers want but by producing tech demos for whichever IHV pays the most. That's perfectly fine, but I prefer to spend my money on a developer that cares for consumers (all of them).

Another example: when Gabe said nVidia cards wouldn't have AA (and then HDR), there was an outcry from the nVidia fans (and rightfully so). Eventually these seemingly "difficult" obstacles were overcome. Another example: Neverwinter Nights. Shiny water should have been available on ATi cards, but because Bioware apparently only likes their consumers with nVidia cards, they lost my $50 (btw, for context: I've never owned an ATi card, and my last two cards and my current one are nVidia cards).

What I'm trying to show is that all situations are different but they all come down to this: when a developer can provide the same (or a very similar) experience to all consumers, he should do so, and everyone should encourage this behaviour, because today it's the brand of cards you hate, but tomorrow it's you who's left out in the rain.
 
Humus said:
This technique can replace most common dynamic code paths, or at least those that we will see in the near future. It won't do while-statements in the general case. But it will do any kind of if-statement, including nesting, and only run the correct path. For performance optimization, this is the most common use of dynamic branching. So a technique that can replace that is IMHO significant.

Humus, correct me if I'm wrong, but your technique can only emulate an "if-then-else" statement when the output colour is set either in the "then" or "else" block, or any nesting thereof. These are the renderstates that Zeno was alluding to. However, it cannot handle computing intermediate variables (not unless you use a very nasty hack with MRTs, AFAICS), which is what dynamic branching is really about.

edit: typo
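
For readers who haven't looked at the demo: as described in this thread, the technique boils down to early stencil rejection. A minimal sketch follows, not Humus's actual source; the in-range test and all names are assumptions:

```hlsl
// Pass 1, the "condition" pass (ps_2_0): a cheap shader evaluates the
// branch condition and clip()s away the pixels that should take the
// expensive path. Pixels that survive are tagged in the stencil
// buffer via render states (color writes disabled, stencil pass op =
// REPLACE, ref = 1). lightPos and lightRangeSq are hypothetical.
float3 lightPos;
float  lightRangeSq;

float4 psCondition(float3 worldPos : TEXCOORD0) : COLOR
{
    float3 L = lightPos - worldPos;
    // Discard in-range pixels so only out-of-range ones get tagged.
    clip(dot(L, L) - lightRangeSq);
    return 0;
}

// Pass 2: the expensive lighting shader is drawn with the stencil
// test set to reject tagged pixels. With early stencil rejection the
// shader never executes for them, which is the payoff a per-pixel
// dynamic branch would otherwise provide.
```

This also makes Mordenkainen's caveat concrete: the stencil picks which complete shader runs at each pixel, so every "branch" has to end in a final output; there is no cheap way to branch around an intermediate computation inside a single shader.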
 
Mordenkainen said:
When I buy a game I expect it to either be bug-free (ah! Now there's a thought!) or to download patches. This isn't a separate feature; it's part of a patch, and thus it ends up being part of the "package" you buy.

Yes, but most patches are there to resolve issues with the game as shipped; I don't think the SM3.0 patch qualifies as such. Did you purchase the game expecting them to add displacement mapping in a future patch? You must have pretty high expectations for your games ;)

Another example: when Gabe said nVidia cards wouldn't have AA (and then HDR), there was an outcry from the nVidia fans (and rightfully so). Eventually these seemingly "difficult" obstacles were overcome.
Yes, this is an analogous situation, and hopefully ATI card owners will get the opportunity to take advantage of the new features.

Another example: Neverwinter Nights. Shiny water should have been available on ATi cards, but because Bioware apparently only likes their consumers with nVidia cards, they lost my $50 (btw, for context: I've never owned an ATi card, and my last two cards and my current one are nVidia cards).

You expect us to believe that you own an Nvidia card but you didn't purchase a game because it didn't have 'shiny water' on ATI cards? Yeah right.

What I'm trying to show is that all situations are different but they all come down to this: when a developer can provide the same (or a very similar) experience to all consumers, he should do so, and everyone should encourage this behaviour, because today it's the brand of cards you hate, but tomorrow it's you who's left out in the rain.

Well for one, anybody who 'hates' a video card brand is a retard. And secondly, you are right: developers should provide the same experience during game development. There are some who will disagree, but I don't exactly see that stance extending to add-on features. Would Crytek have even given us those features if it wasn't for nVidia? Shouldn't nVidia benefit from their active involvement in getting them out there? If Crytek had wanted displacement mapping in Far Cry at the outset, wouldn't it have been included in the shipping version in PS2.0 guise?
 
Another example: when Gabe said nVidia cards wouldn't have AA (and then HDR), there was an outcry from the nVidia fans (and rightfully so). Eventually these seemingly "difficult" obstacles were overcome.
I think that's because R3xx had a part of SM3.0 called centroid sampling. AA didn't work on NV3x because HL2 uses lightmaps (a lot of them).
 
pat777 said:
Another example: when Gabe said nVidia cards wouldn't have AA (and then HDR), there was an outcry from the nVidia fans (and rightfully so). Eventually these seemingly "difficult" obstacles were overcome.
I think that's because R3xx had a part of SM3.0 called centroid sampling. AA didn't work on NV3x because HL2 uses lightmaps (a lot of them).
Lightmaps weren't the issue; texture packing was.

-FUDie
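
For context on the artifact: with multisampling, attributes are normally interpolated at the pixel center, which for edge pixels can lie outside the triangle, so a packed lightmap atlas ends up sampling texels from a neighbouring chart. Centroid sampling moves the interpolation point inside the covered area. A hypothetical sketch, using the _centroid semantic suffix that D3D9 HLSL exposes on supporting hardware:

```hlsl
// Sampling a packed lightmap atlas under MSAA. The _centroid suffix
// asks the hardware to interpolate uv at the centroid of the covered
// samples rather than the pixel center, so edge pixels cannot wander
// into a neighbouring atlas chart. lightmapAtlas is a made-up name.
sampler lightmapAtlas;

float4 main(float2 uv : TEXCOORD0_centroid) : COLOR
{
    return tex2D(lightmapAtlas, uv);
}
```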
 
Demo trying to provide a hack and some FUD

rainz said:
digitalwanderer said:
I can't believe the number of attacks a man gets for sharing some knowledge around here anymore. :(

True

It's really not about sharing knowledge. That's cool, and that's great.

I have an ATI 9800 Pro in my machine right now as I type this, but I feel ATI has fallen into the same traps as Nvidia this generation.

ATI had Nvidia down for the count, only to basically let them go and regain ground.

ATI didn't have Shader Model 3.0, which isn't an Nvidia feature; it's a DirectX 9 feature. Tons of moronic ATI fanboys think SM3.0 is an Nvidia feature, and since ATI doesn't currently have it, they decide they don't need it and then try to add in hacks to make it look like they have the feature via SM2.0.

I don't hate ATI, I own their products, but it makes me angry to see ATI fanboys doing the same things the Nvidia fanboys have done (which is make up crap out of a hack to prove they don't need SM3.0, when in fact it's a DX9 feature that should be in ATI video cards to begin with).

It's not about knowledge so much as it's FUD that SM3.0 is not needed, when it is, and no hacks will prove otherwise. It's a damn shame that ATI, with its leadership over the years, is letting Nvidia lead in DX9 technology.

SM3.0 is needed for the future and current development of games, and it is NOT an Nvidia-ONLY feature but one that all state-of-the-art, hardcore-marketed video cards should have. Since ATI doesn't want to support SM3.0 until sometime next year, that strategy is a poor one.

All of this crap is why I moved away from Nvidia, and now ATI seems to be doing the same kind of things while Nvidia's products are looking much better in my eyes. That's pathetic, since Nvidia was down for the count and ATI was supposed to be the forward-thinking one.
 
Re: Demo trying to provide a hack and some FUD

Proforma said:
I have an ATI 9800 Pro in my machine right now as I type this, but I feel ATI has fallen into the same traps as Nvidia this generation.

ATI had Nvidia down for the count, only to basically let them go and regain ground.

ATI didn't have Shader Model 3.0, which isn't an Nvidia feature; it's a DirectX 9 feature. Tons of moronic ATI fanboys think SM3.0 is an Nvidia feature, and since ATI doesn't currently have it, they decide they don't need it and then try to add in hacks to make it look like they have the feature via SM2.0.

I don't hate ATI, I own their products, but it makes me angry to see ATI fanboys doing the same things the Nvidia fanboys have done (which is make up crap out of a hack to prove they don't need SM3.0, when in fact it's a DX9 feature that should be in ATI video cards to begin with).

It's not about knowledge so much as it's FUD that SM3.0 is not needed, when it is, and no hacks will prove otherwise. It's a damn shame that ATI, with its leadership over the years, is letting Nvidia lead in DX9 technology.

SM3.0 is needed for the future and current development of games, and it is NOT an Nvidia-ONLY feature but one that all state-of-the-art, hardcore-marketed video cards should have. Since ATI doesn't want to support SM3.0 until sometime next year, that strategy is a poor one.

All of this crap is why I moved away from Nvidia, and now ATI seems to be doing the same kind of things while Nvidia's products are looking much better in my eyes. That's pathetic, since Nvidia was down for the count and ATI was supposed to be the forward-thinking one.
Fair enough, but you have to remember that the R420 IS just a refresh part for ATi and that they are going to be releasing the R500 a lot sooner than anyone expects. ;)

nVidia's on a new chip design, ATi is on their refresh... next round nVidia will be on their refresh and ATi will be releasing their fresh chip design.

There is a balance to it, patience grasshopper. ;)
 
Mordenkainen said:
As such, when the patch offers feature A for card Z and leaves out card Y when it's perfectly capable of using it (or would be with a small amount of work), that's when I know the developers want to earn their living not by making the games consumers want but by producing tech demos for whichever IHV pays the most. That's perfectly fine, but I prefer to spend my money on a developer that cares for consumers (all of them).

The work Crytek are doing could be future engine work that can be retrofitted fairly easily.

It would be a major mistake, looking forward, to see PS3.0 as an investment in NVIDIA only... ATI may be a little slow in joining the SM3.0 party, but that doesn't mean they're not coming and bringing lots of beer ;)

There is no future in PS2.0; it's now legacy support. All future graphics research should be targeting PS3.0+.
 
Re: Demo trying to provide a hack and some FUD

Proforma said:
I don't hate ATI, I own their products, but it makes me angry to see ATI fanboys doing the same things the Nvidia fanboys have done (which is make up crap out of a hack to prove they don't need SM3.0, when in fact it's a DX9 feature that should be in ATI video cards to begin with).

Hacks? I think you mean fallback methods, which are an extremely helpful thing with shaders, as customers' video card levels will always be extremely varied. There is no such thing as a single target platform with PCs, so the more that CAN be covered, the better.
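
As a concrete (and hypothetical) illustration, D3DX effect files make this kind of fallback natural: one effect can carry several techniques compiled against different shader models, and the application validates them at startup and picks the best one the card accepts.

```hlsl
// Two techniques for the same effect; the app calls
// ID3DXEffect::ValidateTechnique and uses the first one that passes.
// envMap, psShiny20 and psShiny14 are made-up names for illustration.
sampler envMap;

float4 psShiny20(float2 uv : TEXCOORD0) : COLOR
{
    // fuller-featured version for SM2.0 cards
    return tex2D(envMap, uv);
}

float4 psShiny14(float2 uv : TEXCOORD0) : COLOR
{
    // simplified fallback for older SM1.4 cards
    return tex2D(envMap, uv) * 0.9;
}

technique ShinyWater_20 { pass P0 { PixelShader = compile ps_2_0 psShiny20(); } }
technique ShinyWater_14 { pass P0 { PixelShader = compile ps_1_4 psShiny14(); } }
```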

Nice job spamming this thread with irrelevant personal opinions, people... heh, maybe if there were more active card makers I wouldn't have to venture near such polarized conversations just to find out more about technology and games... sheesh
 
Re: Demo trying to provide a hack and some FUD

Proforma said:
SM3.0 is needed for the future and current development of games, and it is NOT an Nvidia-ONLY feature but one that all state-of-the-art, hardcore-marketed video cards should have. Since ATI doesn't want to support SM3.0 until sometime next year, that strategy is a poor one.
Rubbish; it's features vs. speed, NV40 vs. R420. Which is better depends on who you are and what you do.

ATI made the choice that games would look better and run faster through a pure speed increase, whereas NVIDIA are hoping the extra features make up for the drop in speed.

Who's right? Both, probably... If you want to run current and near-future games (all games), an R420 is hard to beat without going SLI, but SM3.0 may make a few games faster or a bit prettier, though only with new code that many games won't have.
 