New dynamic branching demo

Chalnoth said:
It's not really solved except in one specific case, though. A PS3 version of the same rendering algorithm would be able to do all the lighting application in one pass, while still not doing the lighting calculations that don't need to be done. With Humus' algorithm you'll be sending geometry data to the video card many more times than you would with PS3, which may become the limiting factor in some situations.

This demo only implements one case, but the technique isn't specific to it. It was mentioned in another thread, for instance, that you could do soft shadows with PS 3.0 and only take the 64 samples when you detect you're in the penumbra. That could just as well be implemented with this technique. And so much more.

Regarding high-vertex-count situations: you can solve that by rendering position to a texture and using that to render the ifs, sort of a deferred shading thing. That way you only need to draw a fullscreen quad. It will take a little more from the fragment shader, but it eliminates the vertex shader bottleneck.
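As a rough sketch of that idea (everything here is hypothetical CPU-side Python, not actual GPU code; a real implementation would render world-space position into a texture and do the stencil write from a fullscreen quad):

```python
# Simulate the "deferred if" trick: geometry is rasterized once into a
# position buffer; every subsequent if pass reads that buffer from a
# fullscreen quad instead of re-submitting the scene geometry.

def bake_position_buffer(width, height):
    """Stand-in for the geometry pass: one world-space position per pixel."""
    return [[(x * 0.5, y * 0.5, 0.0) for x in range(width)]
            for y in range(height)]

def if_pass(positions, light_pos, radius):
    """Fullscreen-quad pass: flag pixels where the branch condition holds
    (here: pixel lies within the light's radius), as a stencil write would."""
    lx, ly, lz = light_pos
    mask = []
    for row in positions:
        mask_row = []
        for px, py, pz in row:
            d2 = (px - lx) ** 2 + (py - ly) ** 2 + (pz - lz) ** 2
            mask_row.append(d2 < radius * radius)
        mask.append(mask_row)
    return mask

positions = bake_position_buffer(8, 8)          # one geometry pass, reused
lit = if_pass(positions, (0.0, 0.0, 0.0), 2.0)  # cheap per-light quad pass
print(sum(m for row in lit for m in row))       # pixels taking the "then" branch
```

Each additional light then only costs another fullscreen-quad pass over the baked buffer, which is the point: the vertex load stays constant no matter how many if passes are run.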
 
Humus said:
And no, it's not about any "rare lighting situation". Just because I only produced one demo doesn't mean it's the only situation where it would work. It works with any form of if-statement structure, nested or not. It will likely be able to implement 90% of the PS 3.0 usage we'll see in the next six months.
That's all well and good, but it's still using more framebuffer traffic and much more geometry processing than a PS 3.0 version of the same thing.

If you have a lot of branching situations in PS 3.0, you just end up with the added latency of branching. If you have a lot using the algorithm you're talking about, the number of passes executed could explode.
 
Humus said:
It's not a hack, and it's not useless, and it should work fine with stencil shadows if you're careful.
Are you sure, or are you supposing? And what are the performance consequences if it works?
 
HAL-10K said:
This "trick" is just something that gives a similar result to dynamic branching. But it only works with this shader.

Gaaaahhhh!!!! Will people stop echoing this lie?
 
Re: Demo trying to provide a hack and some FUD

DaveBaumann said:
Proforma said:
1) point number one Nvidia usually names their refresh parts NVx5 right?
Well, now they are naming it the NV48 (which is closer to the NV50 in numbering), and the NV45 is currently a PCI Express version of the NV40.

Probability, at the moment, is that DX Next won't be available until 2006/7.

In the last NVIDIA analyst conference they already noted that the low-end NV4x chips would likely last two to three years, while there will be another high-end architecture in that time. This fits with the DX Next timescales and suggests that the next high-end architecture from NVIDIA is likely to be another, ostensibly, SM3.0 part (with, presumably, a bunch of other enhancements).

As for the "right decision" about what to support, ATI and NVIDIA are playing two completely different games. After two years of building up their brand and endearing themselves to gamers, ATI are now chasing the OEMs. The choice to support PS 2.0 for another generation, apart from cost reasons (which is a valid point), is one reason they have managed to churn out three ASICs in time for PCI Express. NVIDIA is in a situation where it needs to repair its brand, so they have chased the technology rather than PCI Express, re-endearing themselves to the high end and to gamers, at the cost of their OEM business for the time being.

The financials are currently bearing out that ATI's choice was the correct one for their business in the short term: they are seeing their largest revenues, forecasting more for next quarter than NVIDIA, and basically taking all the Tier-1 OEM positions that use PCI Express. If ATI's choice is going to show problems from a business perspective, it will happen six or so months down the line, when the OEMs have more native PCI Express options.

Yeah, but of course. We have to use the latest bus technology, with all of those tons of motherboards out there that are currently using PCI Express. :rolleyes:

We also know that PCI Express is the future, and of course it's using Shader 2.0 technology, which we know beats Shader 3.0 technology. :rolleyes:

Well, I love ATI, but these kinds of decisions make me wonder why they even try.

Why not just give Nvidia the market right now? Screw the future!
 
Re: Demo trying to provide a hack and some FUD

Proforma said:
So I know what you're thinking, but I think you need to rethink what you are thinking. I understand your level of thinking, but it may not be true.

Again, read my previous post, NVIDIA have already stated that they will have another high end architecture while the rest of the NV4x line will last through to DX Next.
 
Drak said:
Humus, correct me if I'm wrong, but your technique can only emulate an "if-then-else" statement when the output colour is set either in the "then" or "else" block or any nesting thereof. These are the renderstates that Zeno was alluding to. However, it cannot handle computing intermediate variables (not unless you use a very nasty hack of MRTs AFAICS) which is what dynamic branching is really about.

edit: typo

You could render intermediate values to a render target and use them in all shaders thereafter. It'll cost some, but if there are enough shared parts it would be a gain.
 
Why not just give Nvidia the market right now? Screw the future!

Uh huh, let's just take away the number one thing that makes nVIDIA try so hard. Remember when Intel was dominant? Prices flew sky high, and Prescott was a failure.
 
Evildeus said:
I thought it was "occlusion", but I may be off base.

Yeah, but it doesn't make sense. I don't get what he means by "implement occlusion" in the shader.
 
Re: Demo trying to provide a hack and some FUD

Proforma said:
Uhm, no. The first number has always been their generational one; the nV48 is still gonna be an nV4x-series chip.

That's how it's been previously, but without a video card in the high-end hardcore sector of the market for the next year, and without a change to DirectX until DirectX 10 in late 2005/early 2006, this can change, you know.
Wavey said:
Probability, at the moment, is that DX Next won't be available until 2006/7.
I trust Dave's prediction a lot more than yours.

The problem is that you are sticking to ideas that have been with Nvidia since the beginning, I think, where the NVx0 is always the next-generation video card and the NVx5 is always the refresh, but this can change, like I said.

When is the last time that Nvidia went a year without two new video cards for the hardcore market (one for the next generation and one for the refresh)?

The answer is never, yet if that roadmap is correct, the NV48 will be the first hardcore-market video chip that's spread over a one-year period, which, going by the past, would not happen.

So I know what you're thinking, but I think you need to rethink what you are thinking. I understand your level of thinking, but it may not be true.

Remember that DirectX used to get a new version every year, and that thinking has changed. DX 8 went to every two years; now, with DX 9, it's three years.

Not everything stays the same as time goes on; things change, and whatever structures apply in the video card market today may not apply in the future.
I don't think I need to rethink a bloody thing about nVidia at this time, but I will if the situation warrants it... and unfortunately nVidia hasn't done a single thing to change my opinion of them. :(

Komb-Why am I not surprised that as a staff member of Rage3D you've never heard of the R500? ;)
("What bitter, me bitter?")
 
Re: Demo trying to provide a hack and some FUD

DaveBaumann said:
Proforma said:
So I know what you're thinking, but I think you need to rethink what you are thinking. I understand your level of thinking, but it may not be true.

Again, read my previous post, NVIDIA have already stated that they will have another high end architecture while the rest of the NV4x line will last through to DX Next.

I don't doubt that, but what is ATI going to have that blows away Nvidia's architecture? They can't have any significant technology beyond something like better anti-aliasing, or something that does not require a DX Next upgrade.

Nvidia's technology will still be based on the NV40 technology until DirectX Next, but they can speed it up to the point of not releasing any more high-end cards besides the NV48.

That's why I said that ATI is missing shader technology, which is really a stupid move. I thought they were going to continue breaking new ground with Shader 3.0, instead of making the 9800 Pro, 9800 XT, X800, etc., all refreshes of a three-year-old architecture.

Stupid, stupid, stupid. That will just confuse the market now. Um, where can I buy a Shader 3.0 video card? Well, ATI doesn't support it for at least two full generations.

ATI is really disappointing me. I thought they could stand up against Nvidia; now I see their technology lead is just a brief thing and nothing more.

I really thought that ATI was better than Nvidia, but boy, have I been misled. They are both moronic companies and make stupid decisions, like Nvidia did with the GeForce 4 MX video cards. What a crock of crap that was.

Why do good companies have to make such backwards decisions?
 
Re: Demo trying to provide a hack and some FUD

Proforma said:
[...] I really thought that ATI was better than Nvidia, but boy, have I been misled. They are both moronic companies and make stupid decisions, like Nvidia did with the GeForce 4 MX video cards. What a crock of crap that was.

Why do good companies have to make such backwards decisions?
Why are you posting in a 3D thread with ATI and nVIDIA when you don't like either company?
 
Re: Demo trying to provide a hack and some FUD

digitalwanderer said:
[...] I don't think I need to rethink a bloody thing about nVidia at this time, but I will if the situation warrants it... and unfortunately nVidia hasn't done a single thing to change my opinion of them. :(

That's fine, trust Dave; his opinions change quite a bit, though. I wonder if he'll think the same six months from now. Wanna make a bet?

I don't know jack, I admit that, but I don't think Dave knows as much as he lets on, either. He changes his info quite often.
 
Evildeus said:
Humus said:
It's not a hack, and it's not useless, and it should work fine with stencil shadows if you're careful.
Are you sure, or are you supposing? And what are the performance consequences if it works?

It should work. You can render the if pass first, which will stencil out the area that's lit, but set it to, for instance, 16 instead of 1. Then do shadowing as usual, but use a stencil write mask to preserve the upper four stencil bits. Then do lighting the usual way, using 16 as the stencil reference value. Now fragments will be culled unless they are both lit in the if-statement sense and unshadowed in the shadowing sense.

I don't see any additional overhead as such by combining the two.
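A minimal simulation of that bit layout (hypothetical values; this assumes an 8-bit stencil buffer with bit 4 reserved for the if result and the lower four bits left free for shadow-volume counting):

```python
# Simulate combining the stencil "if" trick with stencil shadow volumes:
# the if pass writes 16 (bit 4), shadow volumes count in the lower 4 bits
# behind a 0x0F write mask, and the lighting pass only shades fragments
# whose stencil equals exactly 16 (branch taken AND shadow count zero).

IF_BIT = 16          # value written by the if pass (instead of 1)
SHADOW_MASK = 0x0F   # write mask preserving the upper 4 stencil bits

def if_pass(stencil, condition):
    """Write IF_BIT wherever the branch condition holds."""
    return [IF_BIT if c else s for s, c in zip(stencil, condition)]

def shadow_pass(stencil, shadow_increments):
    """Shadow-volume counting, masked to the lower 4 bits so the if result
    survives. (Counts past 15 would spill into the if bit, which is the
    overflow limitation raised later in the thread.)"""
    out = []
    for s, inc in zip(stencil, shadow_increments):
        low = (s + inc) & SHADOW_MASK      # counting confined to low bits
        out.append((s & ~SHADOW_MASK) | low)
    return out

def lighting_pass(stencil):
    """Shade only where stencil == 16: lit by the if AND unshadowed."""
    return [s == IF_BIT for s in stencil]

stencil = [0, 0, 0, 0]
condition = [True, True, False, True]   # per-fragment if result
shadows = [0, 2, 0, 0]                  # net shadow-volume count per fragment

stencil = if_pass(stencil, condition)
stencil = shadow_pass(stencil, shadows)
print(lighting_pass(stencil))           # [True, False, False, True]
```

Only the first and last fragments get shaded: the second fails because it is in shadow (low bits nonzero), the third because the if condition never held.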
 
Re: Demo trying to provide a hack and some FUD

pat777 said:
Proforma said:
[...] Why do good companies have to make such backwards decisions?
Why are you posting in a 3D thread with ATI and nVIDIA when you don't like either company?

I want to buy a nice video card that I can develop software for via shaders. I am angry because both seem to be run by a bunch of rednecks.
 
Humus said:
It's not a hack, and it's not useless, and it should work fine with stencil shadows if you're careful.

It is a hack. This does not replace if statements at all. And one thing it does do is mess up and complicate the renderer.

The solution you presented for shadow volumes may lead to problems if the shadow passes overflow the lower four stencil bits. Then you would have to use only 3 bits for the if statements, or just 2, etc.

This technique brings more problems than the one it is trying to solve, which is simply the lack of a hardware implementation of PS 3.0... Then, simply, why bother if ATI is making a card with PS 3.0? Or probably not? :?

Humus said:
Sigma said:
Another thing I would like to know Humus is why did you forget to implement any kind of occlusition to the part where it uses branching. That does shift the results a lot because the stencil is the "occlustion"...

Implement what?

:oops: Sorry! Occlusion. I mean occlusion...
 
pat777 said:
Why not just give Nvidia the market right now? Screw the future!

Uh huh, let's just take away the number one thing that makes nVIDIA try so hard. Remember when Intel was dominant? Prices flew sky high, and Prescott was a failure.

No, I want ATI to be competitive, and right now they aren't. It's pretty much that simple. They had Nvidia down for the count, and now they are just going to let them catch back up, and maybe even surpass them, while they spread their resources too thin, just like Nvidia did.

They are losing focus and not learning a damn thing from what Nvidia has done. That's why I am angry.
 
Re: Demo trying to provide a hack and some FUD

Proforma said:
I don't know jack, I admit that, but I don't think Dave knows as much as he lets on, either. He changes his info quite often.

Errr, we are dealing with future roadmaps, which have a tendency not to stay static. For instance, DX Next has always been tied to Longhorn's release, initially expected in 2005/06 but now likely not until 2006/07, and there are now suggestions that DX Next may not appear until after Longhorn's release.
 
Re: Demo trying to provide a hack and some FUD

Proforma said:
pat777 said:
[...] Why are you posting in a 3D thread with ATI and nVIDIA when you don't like either company?
I want to buy a nice video card that I can develop software for via shaders. I am angry because both seem to be run by a bunch of rednecks.
Well, mistakes happen in every company. If you want a good video card, you should like companies like nVIDIA and ATI. They make very good video cards.
 