New dynamic branching demo

K.I.L.E.R said:
Coders code for the lowest common denominator and if they have time they knock in some extras like PS 2.0 or whatnot.
That's the point. Which company do you think will have the first low-end SM3 part?
 
Chalnoth said:
K.I.L.E.R said:
Coders code for the lowest common denominator and if they have time they knock in some extras like PS 2.0 or whatnot.
That's the point. Which company do you think will have the first low-end SM3 part?

Low end PS3.0?

Like the 5200 is low end PS2.0?
 
That's a completely different scenario. Just pay attention to the 5800's PS2 performance and you'll see what I mean.
 
Chalnoth said:
That's a completely different scenario. Just pay attention to the 5800's PS2 performance and you'll see what I mean.

Why is it completely different? They are both going to be underpowered cards. How do you think hardware like this will perform by the time SM3.0 is used for special visual effects in games? That would be like complaining that a GFMX can't run PS2.0 shaders in Far Cry, even though it wouldn't be powerful enough to run them anyway.
 
Chalnoth said:
K.I.L.E.R said:
Coders code for the lowest common denominator and if they have time they knock in some extras like PS 2.0 or whatnot.
That's the point. Which company do you think will have the first low-end SM3 part?

AFAIK the lowest end would be a Geforce 2 nowadays.
It has nothing to do with the lowest-end PS 3.0 card.

The lowest-end shader card would be an NV20, wouldn't it?
 
Eronarn said:
Why is it completely different? They are both going to be underpowered cards. How do you think hardware like this will perform by the time SM3.0 is used for special visual effects in games? That would be like complaining that a GFMX can't run PS2.0 shaders in Far Cry, even though it wouldn't be powerful enough to run them anyway.
Sure, it wouldn't be powerful enough to do it at 1600x1200, but that's just resolution. People don't buy low-end video cards to play games at high resolutions. That doesn't mean they won't be able to play them with most effects enabled.
 
K.I.L.E.R said:
AFAIK the lowest end would be a Geforce 2 nowadays.
It has nothing to do with the lowest-end PS 3.0 card.

The lowest-end shader card would be an NV20, wouldn't it?
Obviously you can't design a game that will work on everybody's machine today. Some people still have machines using the original Pentium. But you can reasonably require some minimum featureset once that featureset can be had at a low price point, and achieves decent market saturation.

How do you think ATI's not supporting SM3 will affect the market saturation of SM3 parts?
 
Chalnoth said:
Eronarn said:
Why is it completely different? They are both going to be underpowered cards. How do you think hardware like this will perform by the time SM3.0 is used for special visual effects in games? That would be like complaining that a GFMX can't run PS2.0 shaders in Far Cry, even though it wouldn't be powerful enough to run them anyway.
Sure, it wouldn't be powerful enough to do it at 1600x1200, but that's just resolution. People don't buy low-end video cards to play games at high resolutions. That doesn't mean they won't be able to play them with most effects enabled.

I'd hardly call a $300 card 'low end'. By the time it drops in price to become low-end, ATI will already have PS3.0 cards out.
 
Eronarn said:
I'd hardly call a $300 card 'low end'. By the time it drops in price to become low-end, ATI will already have PS3.0 cards out.
Neither would I. I'm talking about the $100 range, and I'm certainly not talking about the NV40 here, but the rest of the generation.
 
Chalnoth said:
Neither would I. I'm talking about the $100 range, and I'm certainly not talking about the NV40 here, but the rest of the generation.

Yes, but how long until that comes out? I haven't heard anything about it so far.
 
Roadmaps I've seen place the release date of the rest of the NV4x lineup in the fall (in about 3-6 months), with ATI not releasing the SM3 R500 until next year.
 
Chalnoth said:
Roadmaps I've seen place the release date of the rest of the NV4x lineup in the fall (in about 3-6 months), with ATI not releasing the SM3 R500 until next year.

That's not too much of a problem, then. Both should have PS3.0 out well before PS3.0 is required for the 'full' experience in a game.
 
Sure, if you're only looking in the short-term. Not having SM3 parts from all manufacturers now will slow the adoption of SM3 parts as a whole, and will therefore slow the uptake of SM3 in games.

So sure, we won't see much SM3 use until ATI also supports it, but mostly because SM3 adoption can't really take off until ATI supports it too (unless, by some magical set of circumstances, nVidia swallows up 90% of the non-integrated graphics business with the NV4x).
 
Humus - thanks for answering my two questions. It's great to see good techniques developing to translate Shader Model 3.0 techniques into even faster Shader Model 2.0 techniques.

Dave et al,

It strikes me there should be several very interesting thought pieces here (or in 3D Architecture) on how effectively SM 3.0 can be translated into SM 2.0 or SM 1.0 routines, because all these shader model debates really need better perspective.

If it turns out SM 3.0 can distinctly offer (i.e. effects you simply can't do in a lower shader model) only a scant handful of visual effects, it's all a big fuss about nothing, because any game developer really has to offer fallbacks for SM 3.0 -> SM 2.0 -> SM 1.0 etc.

What we really need is a much better appreciation of how hard this is!
 
Well, these are my thoughts on it.

As has already been shown, SM2.0 can emulate SM3.0 pretty well, as long as the devs code for it. So, people with SM2.0 cards can use some SM3.0 stuff. I think that by the time the non-reproducible effects (or ones that would slow things down too much) become required for certain games, the R420 and NV40 cards will be too slow to run them at those settings anyway. It's similar to the FX5200: it has the capability for SM2.0, just not the speed.

So, in the end, I don't think it's important as long as ATI works to get devs to make code usable on PS2.0 parts - which will benefit not just ATI, but Nvidia too. Will this slow the adoption of PS3.0 in games? Not really. It will probably be used in ways that PS2.0 can emulate for quite a while yet, so we'll just see SM3.0-capable cards running games faster than their non-SM3.0 counterparts.
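For what it's worth, the kind of emulation being talked about here (as in Humus' demo: a cheap first pass flags which pixels need the expensive effect, then later passes skip the rest) can be sketched with a toy model. Everything below is illustrative - the pixel pattern, the shader body, and the counts are made up, not taken from the demo:

```python
# Toy model of the SM2.0 "flag then skip" trick: a cheap first pass marks
# which pixels need the expensive effect (like writing a stencil mask),
# and a second pass runs the heavy shader only where the mark is set.
# Shader bodies and the flagged-pixel pattern are invented for illustration.

WIDTH, HEIGHT = 8, 8

def needs_effect(x, y):
    """Cheap per-pixel test (pass 1): does this pixel need the effect?"""
    return (x + y) % 4 == 0          # pretend ~25% of pixels qualify

def expensive_shader(x, y):
    """Stand-in for the costly shader (e.g. a long lighting loop)."""
    return sum((x * y + i) % 7 for i in range(32))

# Pass 1: build the mask over the whole screen (every pixel pays the cheap test).
mask = [(x, y) for x in range(WIDTH) for y in range(HEIGHT) if needs_effect(x, y)]

# Pass 2: heavy shader only on flagged pixels; the rest are rejected early,
# the way early stencil rejection skips masked-out pixels on real hardware.
framebuffer = {p: expensive_shader(*p) for p in mask}

print(f"expensive invocations: {len(framebuffer)} of {WIDTH * HEIGHT} pixels")
```

The point of the sketch is just that the expensive shader runs on 16 of the 64 pixels instead of all of them, which is the saving the SM2.0 path is after.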
 
Chalnoth said:
Sure, if you're only looking in the short-term. Not having SM3 parts from all manufacturers now will slow the adoption of SM3 parts as a whole, and will therefore slow the uptake of SM3 in games.

So sure, we won't see much SM3 use until ATI also supports it, but mostly because SM3 adoption can't really take off until ATI supports it too (unless, by some magical set of circumstances, nVidia swallows up 90% of the non-integrated graphics business with the NV4x).

We had the same problem with Nvidia not supporting SM2.0. First they were late with NV30, then we had to wait for NV35, and then we found it was a very poor SM2.0 part. As you know, several developers did not use SM2.0 or DX9 because of Nvidia's lack of capable support.

I'd make the argument that Nvidia's retardation of the SM2.0 market is what really slowed the adoption of SM3.0. If developers had been encouraged to support SM2.0 over the last couple of years, the jump to SM3.0 would be much easier and much better supported, much more quickly - the developers would be ready. Instead, Nvidia (one of the two biggest players on the market) actively discouraged developers from supporting SM2.0.

As it is I expect that in the normal course of events we are back at the same position as we were a couple of years ago - waiting a year or two for games to arrive using the new SM.

This might only be changed in a few high profile games because of the way that Nvidia seems to be trying to spend huge sums of money, effectively commissioning developers to write them SM3.0 code.
 
(Yea, I'm very late to enter this thread)

Devs have a choice. They can either code for SM3.0, and spend extra time coding and testing an SM2.0 emulation of those effects, or they can just code SM3.0.

It seems like there is a third option. From what I understood, Far Cry was written with SM2.0, and then it was later optimized by collapsing multiple passes into one, using conditional branching on SM3.0 (I suppose the multiple passes are sort of like Humus' demo, where pixels are somehow flagged for further processing).

The results are quite interesting. The 6800 gets a slight performance boost, but ATi has the faster SM2.0 cards, so they manage to keep up quite well. And if Humus wants to demonstrate that point, I understand fully. If he wants to claim that this SM2.0 trick is as good as SM3.0, I would disagree. Multiple passes still mean that you have to process the geometry multiple times, and every pass still has to touch all of the pixels again. So you have more overhead. Then again, dynamic branching also has overhead.
So I am willing to believe that an SM2.0 card can beat an SM3.0 card in some cases (it just has to be faster than the SM3.0 card, basically, and ATi's cards are), but I am quite sure there are also cases where SM3.0 can outperform SM2.0 considerably (for example, scenes with a high poly count, or relatively few pixels flagged compared to the total number of pixels rendered).
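That tradeoff can be put into rough numbers with a back-of-the-envelope cost model. Every constant below is invented purely for illustration; only the shape of the result matters: the multipass route pays the geometry cost twice plus a small reject cost per skipped pixel, while the single-pass SM3.0 route pays a per-pixel branch overhead once.

```python
# Back-of-the-envelope cost model: "multipass SM2.0 early-out" versus
# "single-pass SM3.0 dynamic branching". All constants are invented;
# only the direction of the comparison is the point.

PIXELS = 1_000_000        # pixels shaded per frame
EXPENSIVE = 100           # cost of the heavy shader per pixel
CHEAP = 5                 # cost of the flagging test per pixel
REJECT = 1                # cost of an early-rejected pixel in pass 2
BRANCH = 8                # per-pixel dynamic-branch overhead on SM3.0

def multipass_cost(geometry_cost, flagged_frac):
    """SM2.0 route: pay geometry twice, flag everything, reject the rest."""
    flagged = PIXELS * flagged_frac
    pass1 = geometry_cost + PIXELS * CHEAP
    pass2 = geometry_cost + flagged * EXPENSIVE + (PIXELS - flagged) * REJECT
    return pass1 + pass2

def sm3_cost(geometry_cost, flagged_frac):
    """SM3.0 route: one pass; every pixel pays the test plus the branch,
    and only flagged pixels pay the heavy path."""
    flagged = PIXELS * flagged_frac
    return geometry_cost + PIXELS * (CHEAP + BRANCH) + flagged * EXPENSIVE

# Low poly count, many flagged pixels: the multipass trick holds up fine.
print(multipass_cost(1_000_000, 0.5) < sm3_cost(1_000_000, 0.5))       # True

# Very high poly count, few flagged pixels: the second geometry pass hurts.
print(multipass_cost(50_000_000, 0.05) > sm3_cost(50_000_000, 0.05))   # True
```

Which matches the argument above: with heavy geometry or few flagged pixels, reprocessing the scene per pass costs more than the branch overhead, and SM3.0 pulls ahead.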
 
Yes, PS 1.x has penetrated the market nicely; that's why it's supported in the majority of games (though not all - games still come out using no shaders at all).

Ati not supporting PS 3.0 will not hurt the market.

Like I've said above, PS 2.0 still isn't anywhere close to being even halfway utilised, and until that happens video cards could come out using SM 500.0 and it would still go partially used or completely unused.

When the majority of developers add support for a given PS version, regardless of which chips support it, then IHVs will start adding it, as they will see that it is necessary/nice/useful.

PS 3.0 is a good thing - any advance is a good thing - but devs should get more time to add these features and have their games support them from the ground up, or through patches (though preferably the former).

If that were the case, then ATi might have changed their strategy.


Chalnoth said:
K.I.L.E.R said:
AFAIK the lowest end would be a Geforce 2 nowadays.
It has nothing to do with the lowest-end PS 3.0 card.

The lowest-end shader card would be an NV20, wouldn't it?
Obviously you can't design a game that will work on everybody's machine today. Some people still have machines using the original Pentium. But you can reasonably require some minimum featureset once that featureset can be had at a low price point, and achieves decent market saturation.

How do you think ATI's not supporting SM3 will affect the market saturation of SM3 parts?
 
Scali said:
So I am willing to believe that an SM2.0 card can beat an SM3.0 card in some cases (it just has to be faster than the SM3.0 card, basically, and ATi's cards are), but I am quite sure there are also cases where SM3.0 can outperform SM2.0 considerably (for example, scenes with a high poly count, or relatively few pixels flagged compared to the total number of pixels rendered).

That's the rub. By leaving out SM3.0, ATI is able to build a faster SM2.0 chip. By including SM3.0, Nvidia is not able to clock their cards as fast. The NV40's performance is lower because of SM3.0, so the question is whether the efficiencies of SM3.0 can make up for the loss in speed Nvidia has committed to by having to run the extra 60 million SM3.0 transistors.

Personally, I don't believe that Nvidia's choice has yet been proven correct. It relies on shader-heavy games and developer support during the life of the card (12-24 months). It relies on the NV40 (and derivatives) having the SM3.0 performance efficiencies to be able to bridge that gap. In the meantime, ATI is enjoying a speed advantage on all near/current/future games, including any SM2.0 code.

Effectively, Nvidia has elected to take a speed hit on current/near future games (compared to ATI), while gambling that they can get that loss back using SM3.0 over the next year or so.
 
K.I.L.E.R said:
SM 3.0 is great.

SM 2.0 has yet to be fully utilised by the industry.

I see more games out using SM 1.x than anything.

Bravo. Most sane post in this thread by far.
 