New dynamic branching demo

Ruined said:
digitalwanderer said:
I cannot believe the amount of attacks a man gets for sharing some knowledge around here anymore. :(

I don't mind the knowledge; I think it's just being presented in a misleading way and could be getting people's hopes up for something that very well (likely) might not follow through in real-life games. Also, an ATI employee is making statements like "nvidia is pwned" and "raining on nvidias parade," and both of his recent demos do not work properly on Nvidia cards. It seems more like FUD to me, an attempt to detract from SM3.0. And now that Humus works for ATI, no matter what he or anyone else says, that makes him biased towards them.
You might want to examine the target audience of Humus' demos. They're really meant for people like those at Beyond3D, who are interested in the possibilities the various technologies can provide and the different ways in which this can be accomplished. They're not necessarily meant for the average gamer, who simply cares about the results and couldn't care less about how it's done.

I recall a previous demo Humus made that rendered reflecting ripples in water. A few people bashed it, saying it was nothing new and that the effect could be done by a GeForce 2. However, they missed the entire point of the demo: to show that the effect could be done almost entirely in pixel shaders. It was a different way of doing things, and that's what made it interesting.

The same thing goes for this demo. It shows a different method for achieving the same effect that another technology is being specifically marketed for. Of course, like all things in game development, just because other methods exist does not mean developers will use them. It's just a fact of life that people have to deal with. However, this demo shows that just because a piece of hardware does not support a feature (dynamic branching), it doesn't mean there aren't other methods that achieve the same ultimate goal.
 
The Baron said:
The reasons are important. You are strongly implying that there is some technical reason why SM2.0 could not do these effects. In fact, there is no technical reason why the same visuals could not be supported under SM2.0. The higher clock rate of NV420 would go a long way towards offsetting any efficiencies from using SM3.0 over SM2.0.
My God, it's full of stars.

nVIDIA is still using SM 2.0 for the NV420? OMG, nVIDIA has really fallen behind in features. The R4200 can render a scene with full GI.
 
Ruined said:
jvd said:
right, because it can't be a driver problem?

If it was, there was obviously no attempt to work around it, test it, or get it working. His last demo also had problems on Nvidia cards. This furthers the idea of him being ATI biased ;)
Either that, or he simply doesn't have an Nvidia card handy.
 
DaveBaumann said:
jvd said:
Hardocp, this site, elite baters and all were correct. SM3.0 would only offer performance increases.

Ummm, while I can't remember every one of our site's comments, I don't believe we've ever explicitly stated such a thing.

sorry dave, i was referring to what he said. he wrote it like that, so i responded with it written like that ;)
 
Xmas said:
jvd said:
hdr can be done on the r3x0 and r42x. It's been proven by Half-Life 2.

It may not do fp16 blending, but it can do hdr, so why not include it?
Maybe because "HDR" is not a single, well-defined effect, but actually a whole range of things (no pun intended). Radeon cards can do some of it, but not everything. NV40 can do some more.

true, but why stop ati users from getting some of the hdr? There can only be 1 reason.
 
This whole argument has gotten out of control. I believe the title of this thread is "Dynamic Branching Demo," not "SM3.0 Emulation Demo." All the guy is saying is that dynamic branching can actually be done on ATI cards. I have not downloaded the demo, nor do I intend to. I really don't care at this point. It's getting tiresome to read a thread in this forum because of the bickering that goes on between you guys. This is almost worse than reading a sports forum.

The 6800 and X800 series of cards will be obsolete in the blink of an eye. Too much stress, people!!! Enjoy your cards. Hip Hop Hooray, the Nvidia Nvidiots will be able to exclusively run one game with slightly better visual effects than the ATI Fanatics for a short period of time. Enjoy your 6-8 months of supremacy in one game before the next generation comes out and makes you feel small again.

ATI fan-boy, you think I am? I couldn't care less about them either; they don't sign my paycheck, wash my car, or walk my dog, but they can kiss my ... Yes, in my opinion the r400 is a re-badged r300 with a bigger tail pipe. Arguing about crap like this is so trivial, with so little at stake, considering that these cards perform relatively the same. You see, it's all relative: while you Nvidiots are looking at the ATI fanboys in the rear-view mirror and telling them to eat your dust, in 6-10 months when you get to the finish line I'll be waiting there in an NV50/R500 and asking you: what took so long???
 
jvd said:
Xmas said:
jvd said:
hdr can be done on the r3x0 and r42x. It's been proven by Half-Life 2.

It may not do fp16 blending, but it can do hdr, so why not include it?
Maybe because "HDR" is not a single, well-defined effect, but actually a whole range of things (no pun intended). Radeon cards can do some of it, but not everything. NV40 can do some more.

true, but why stop ati users from getting some of the hdr? There can only be 1 reason.
No, there can actually be other reasons.
If the scene you want to render with a high dynamic range contains transparent objects or decals that should be blended, you need alpha blending on an HDR frame buffer format, or you need to find some workaround, e.g. ping-ponging buffers, which can be very expensive, so you might have to put arbitrary restrictions on your scene. Or you render those things incorrectly, which isn't really an option IMO.
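To make it concrete, ping-ponging looks roughly like this on the application side. This is just a sketch in OpenGL-flavoured C++, and names like hdrFbo, manualBlendProgram, Batch and drawBatch are made-up placeholders, not code from any real demo:

Code:
    #include <utility>  // std::swap

    // Two float render targets, created and attached elsewhere. Without
    // blending support on the HDR format, the shader must read the previous
    // contents as a texture and compute src*a + dest*(1-a) itself.
    GLuint hdrFbo[2], hdrTex[2];
    int src = 0, dst = 1;

    for (const Batch& batch : transparentBatches) {     // hypothetical batch list
        glBindFramebuffer(GL_FRAMEBUFFER, hdrFbo[dst]);  // write to the other buffer
        glBindTexture(GL_TEXTURE_2D, hdrTex[src]);       // previous result as input
        glUseProgram(manualBlendProgram);                // does the blend in the shader
        drawBatch(batch);                                // hypothetical draw call
        std::swap(src, dst);                             // ping-pong for the next batch
    }

Every blended batch costs a render target switch and a dependent texture read, which is why this gets expensive fast and why you might end up restricting the scene instead.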


ZenThought's posting is very interesting.
 
Wow, quite a response to this thread. :oops:

Evildeus said:
Would be interesting to see an SM3.0 demo to back up your claims. Hope someone does that.

I'm interested too. The demo is fairly simple. It should be easy for someone to just comment out the early-out code and make some changes to the lighting shader to take advantage of ps3.0.
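The ps3.0 version of the lighting shader would boil down to something like this. Just a sketch of the idea, not tested code, with made-up variable names (here an HLSL fragment embedded in a C++ string, the way a demo might ship it):

Code:
    // Hypothetical ps3.0-style lighting fragment: with real dynamic
    // branching, the early-out happens inside a single pass instead of
    // through a separate masking pass.
    const char* ps30LightingSketch = R"(
        float3 lightVec = lightPos - worldPos;
        float  atten = saturate(1.0 - dot(lightVec, lightVec) * invRadiusSqr);
        if (atten <= 0.0)               // dynamic branch: out of range, skip
            return float4(0, 0, 0, 0);
        // ... the expensive per-pixel lighting only runs past this point ...
    )";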
 
Guys, stop bashing Humus and let's just ask him for an explanation of this.

Humus, do you have any explanation as to why your technique is actually running slower on NV40-variant graphics cards, and do you intend to investigate the issue?

That would be appreciated.

Chris
 
Ruined said:
This is great and all, Humus, but what really matters is not whether a homebrew app with spinning lights delivers on performance; what matters is how games perform and look. There can be all the SM2.0 tricks in the world, but if devs don't use them, it really doesn't matter. With the talk of the upcoming FarCry 1.2 patch having SM3.0-only visual effects, stuff like this is fun in theory but more useful if games actually implement it, and you have to wonder how well ATI is doing in that department. I mean, if the next big game were "Humus: The Game," then we might have something significant to talk about.

If you're totally uninterested in the technical aspects of graphics, then you're on the wrong forum. Go browse nvnews.net or something. I showed a technique that implements dynamic branching with standard API features. New or not, who cares? It works, and it's mighty fast. It's another tool developers may find useful; if not, well, at least they had another option.
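For anyone who'd rather see the idea than dig through the source: it boils down to a two-pass stencil mask, roughly like below. This is a simplified sketch, not the demo's literal code, and cheapClassifyProgram, expensiveLightingProgram and drawScene are made-up names:

Code:
    // Pass 1: cheap classification. Mark the pixels that actually need the
    // expensive lighting (e.g. those within range of the light) with stencil = 1.
    glClearStencil(0);
    glClear(GL_STENCIL_BUFFER_BIT);
    glEnable(GL_STENCIL_TEST);
    glStencilFunc(GL_ALWAYS, 1, ~0u);
    glStencilOp(GL_KEEP, GL_KEEP, GL_REPLACE);
    glColorMask(GL_FALSE, GL_FALSE, GL_FALSE, GL_FALSE); // no colour writes yet
    glUseProgram(cheapClassifyProgram); // trivial shader that kills out-of-range pixels
    drawScene();

    // Pass 2: the full lighting shader, but only where stencil == 1.
    // Early stencil rejection (where the hardware supports it) kills the
    // rest before the expensive shader ever runs. That's the "branch".
    glStencilFunc(GL_EQUAL, 1, ~0u);
    glStencilOp(GL_KEEP, GL_KEEP, GL_KEEP);
    glColorMask(GL_TRUE, GL_TRUE, GL_TRUE, GL_TRUE);
    glUseProgram(expensiveLightingProgram);
    drawScene();

The first pass is almost free (no colour writes, trivial shader), so whenever a decent fraction of pixels gets rejected, the saved shading work dominates.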

Ruined said:
Shader Model 3.0 support, whose game support is getting larger and larger by the day.

Currently zero. So I guess it's even between this technique and ps3.0 so far.

Ruined said:
telling of your stance and what brand you are probably optimizing for when you write those demos.

This demo has no vendor-specific optimization. It's open source. Check for yourself, I got nothing to hide.

Ruined said:
In the end though, if people do buy an ATI card and are disappointed because SM3.0 does offer things that SM2.0 doesn't in actual games, and users do end up missing out on visual effects because they own an X800 card, will you step up to the plate and take responsibility for making misleading claims that SM2.0 will look and perform similarly to SM3.0? Or are you going to let the blame fall on the developer and Nvidia? Again, since you now work for ATI these questions come to mind.

I've never said this was equivalent to ps3.0. I've stated plainly and simply what this technique does. Then it's up to the consumer to make an educated purchase decision. Would you think it better if we just forgot that there are ways to implement dynamic branching without ps3.0, just so nVidia can sell more chips while ATi cards run at speeds far below what they're capable of?

Were you as angry when nVidia said that ps1.4 wasn't needed since you could just do it with multipass in ps1.1? Using ps1.1 to implement a ps1.4 effect is a valid way of doing things. Of course, ATI told developers how nice and convenient ps1.4 was, while nVidia told developers how useless it was and that everything could just be done in multipass anyway. This is how things work in the real world. Tell me a company that doesn't work that way; I'll write down its name so I don't forget it when it goes out of business.
Or to quote Dilbert: "If you don't like it, try communism."
 
Ruined said:
Telling yourself that SM2.0 can emulate the same thing isn't going to make the visual effects magically appear on the screen.

Nor does nVidia telling us how great 3.0 is make graphical effects appear on screen. This discussion is entirely about what's possible, not about what CryTek are up to at the moment.
 
Thowllly said:
I don't think he is claiming that SM2.0 will look and perform similarly to SM3.0, just that it can look and perform similarly.

Right. Not always, but in many situations.
 
I don't know if it's only me.
This demo can't run on my FX5600.

The FPS counter jumps between 1 and 349865273094627, and the animation is like a slideshow.

I'm currently using an Albatron FX5600, driver 61.71, and DX9.0b.
 
Does anyone know where NV3x and NV40 apply the stencil kill? At the front end of the pixel pipeline or the back end?
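One way I can think of to probe it: set the stencil test to reject every fragment and draw a fullscreen quad with a deliberately expensive shader. If the draw is nearly free, the kill must happen at the front end, before shading; if it costs about as much as with the stencil test off, it's at the back end. A rough sketch, where veryExpensiveProgram, drawFullscreenQuad and getTimeSeconds are hypothetical helpers:

Code:
    glClearStencil(0);
    glClear(GL_STENCIL_BUFFER_BIT);
    glEnable(GL_STENCIL_TEST);
    glStencilFunc(GL_NEVER, 0, ~0u);     // reject every fragment
    glStencilOp(GL_KEEP, GL_KEEP, GL_KEEP);

    glUseProgram(veryExpensiveProgram);  // hypothetical long ps2.0-class shader
    double t0 = getTimeSeconds();        // hypothetical timer
    drawFullscreenQuad();                // hypothetical helper
    glFinish();                          // wait for the GPU to finish
    double elapsed = getTimeSeconds() - t0;
    // elapsed near zero      => early (front-end) stencil kill
    // elapsed near full cost => late (back-end) kill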
 
Richteralan said:
I don't know if it's only me.
This demo can't run on my FX5600.

The FPS counter jumps between 1 and 349865273094627, and the animation is like a slideshow.

I'm currently using an Albatron FX5600, driver 61.71, and DX9.0b.
Well, have you been able to run any ps2.0 demos? You see, this demo runs on DX9, and you need a DX9 card to run it at all. I guess you should ask NVDA why...
 
Ruined said:
The problem is, if you say that it can, many will assume that it will. And if it doesn't some or most of the time, then that is misleading.

Are you as mad at nVidia when they say ps3.0 can improve visuals? How many are misled to believe that that's across all games, or even just that it will be common in future games?
 
karlotta said:
Richteralan said:
I don't know if it's only me.
This demo can't run on my FX5600.

The FPS counter jumps between 1 and 349865273094627, and the animation is like a slideshow.

I'm currently using an Albatron FX5600, driver 61.71, and DX9.0b.
Well, have you been able to run any ps2.0 demos? You see, this demo runs on DX9, and you need a DX9 card to run it at all. I guess you should ask NVDA why...

Well, thanks for your reply.
Firstly, I DO have a DX9 card: a GeForce FX5600, which I know is quite slow at PS2.0 but still CAN do PS2.0.
Secondly, I installed Microsoft DirectX 9.0b.
Thirdly, if my card didn't support PS2.0 at all, the program should refuse to run, not run with strange behaviour.
 
Humus said:
Ruined said:
It seems they are being optimized/coded specifically for ATI cards now.

You're free to put up any specifics you find here ...

Well, I did ask earlier whether you have any idea why your technique is decreasing performance on Nvidia cards.

That would be a start.

Chris
 
Mr. Travis said:
holy crap... :oops:

9500 Pro - 1024x768

false - 20-30 fps avg
true - 70-100 fps avg


million dollar question... could this be applied to current games' lighting, like Far Cry or the forthcoming Doom 3, for a speed-up?

IMO this only demonstrates how poor ATi drivers still are.
The reason nVidia cards don't see such an improvement is that they can already handle the more complex case more efficiently, and clumsy optimizations such as this only slow them down unnecessarily.
 