New dynamic branching demo

digitalwanderer said:
I can not believe the amount of attacks a man gets for sharing some knowledge around here anymore. :(

I don't mind the knowledge; I think it's just being presented in a misleading way and could be getting people's hopes up for something that very well (likely) might not pan out in real-life games. Also, an ATI employee making statements like "nvidia is pwned" and "raining on nvidias parade," plus the fact that both of his recent demos do not work properly on Nvidia cards, seems more like FUD to me, an attempt to detract from SM3.0. And now that Humus works for ATI, no matter what he or anyone else says, that makes him biased towards them.
 
digitalwanderer said:
I can not believe the amount of attacks a man gets for sharing some knowledge around here anymore. :(

You and me both. Sad when someone comes up with something, and people attack him for being creative.
 
Ruined said:
... just like writing fallback shaders for every single visual effect for ATI cards and Nvidia cards that do not support the latest shader technology is not necessary.

Quote corrected.

They're shafting their own customers, not just ATI's.
 
joe emo said:
digitalwanderer said:
I can not believe the amount of attacks a man gets for sharing some knowledge around here anymore. :(

You and me both. Sad when someone comes up with something, and people attack him for being creative, and thinking.

Err, Humus didn't invent this technique, he just made a demo using it.
 
Eronarn said:
Ruined said:
... just like writing fallback shaders for every single visual effect for ATI cards and Nvidia cards that do not support the latest shader technology is not necessary.

Quote corrected.

They're shafting their own customers, not just ATI's.

Not true. The SM3.0 effects would most likely run too slow on any SM2.0 card with the exception of the X800 series, going by the performance of the beta patch at E3. So if they did write fallback emulation for the SM3.0 effects, they would probably only be playable on ATI X800 cards. How worthwhile it is to code for one card series is questionable.
 
digitalwanderer said:
I can not believe the amount of attacks a man gets for sharing some knowledge around here anymore. :(

It's the Nvidits bashing Humus. They think because he works for ATI that he is spreading FUD and is very biased.

Come on guys, it's the good old Humus we all know, so stop bashing him.

Do y'all want me to start bashing you for owning an Nvidia vcard? That's stupid!

It looks to me like Humus is exposing the false statements in Nvidia's FUD.
 
AlphaWolf said:
Ruined said:
Not true. The SM3.0 effects would most likely run too slow on any SM2.0 card with the exception of the X800 series.

got proof?

At E3, with the beta patch/beta drivers, the SM3.0 patch was just about acceptable (30-60 fps) on a fast CPU and 6800U setup at 1280x1024. While that performance will probably improve quite a bit in the final patch, I doubt it will improve to the point where an FX series or 9800 card would be able to run the effects at a playable rate. Heck, FX series and 9800 cards have problems running Far Cry as is, never mind adding new effects.

I could ask the same thing - do you have proof that a game as complex as Far Cry would work? Simple offset mapping demos have been made for SM2.0, but nothing on the scale of Far Cry.
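For reference, the per-pixel math those simple offset mapping demos do is roughly a one-line scale/bias parallax offset. Here's a plain C sketch just to show the amount of work involved; the names and the two constants are made up for illustration, not taken from any particular demo or from Far Cry:

Code:
/* Plain C illustration of the usual scale/bias parallax ("offset mapping")
 * formula a SM2.0 pixel shader would evaluate per pixel. */
typedef struct { float x, y; } vec2;
typedef struct { float x, y, z; } vec3;

/* uv:      original texture coordinate
 * height:  height-map sample at uv (0..1)
 * view_ts: view direction in tangent space */
static vec2 parallax_offset(vec2 uv, float height, vec3 view_ts)
{
    const float scale = 0.04f;   /* assumed tweakable constants */
    const float bias  = -0.02f;

    float h = height * scale + bias;       /* remap the height sample */
    vec2 shifted = {
        uv.x + h * view_ts.x / view_ts.z,  /* slide the texcoord along the */
        uv.y + h * view_ts.y / view_ts.z   /* projected view direction     */
    };
    return shifted;   /* the shader would sample diffuse/normal maps here */
}

The question isn't whether that handful of instructions fits in SM2.0 (it does), it's whether that kind of per-pixel cost scales from a single demo scene to a whole game.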
 
You doubt the effects would run at acceptable speed on R3xx and NV3x based HW. Your personal doubts are hardly indicative of anything factual.
 
{Sniping}Waste said:
It's the Nvidits bashing Humus.

Better to come up with a good argument than call names.

They think because he works for ATI that he is spreading FUD and is very biased.

Of course he's biased because he works at ATI. That's a given. Doesn't mean he will spread FUD, but in this case it kinda looks that way to me.
 
joe emo said:
You doubt the effects would run at acceptable speed on R3xx and NV3x based HW. Your personal doubts are hardly indicative of anything factual.

It looks like Crytek may have had some doubts too, otherwise they probably would have implemented the SM2.0 emulation :)
 
Ruined said:
It looks like Crytek may have had some doubts too, otherwise they probably would have implemented the SM2.0 emulation :)

That's working from an assumption that all game development decisions are based on technical merits rather than the business/marketing/money side of things. In an ideal world, game development would focus entirely upon sound technical decisions, but we don't live in an ideal world.
 
Dio said:
Isn't that a circular argument?

Yes.

"Crytek didn't implement SM2.0 emulation because it wouldn't run fast!"

"SM2.0 obviously doesn't run fast because they chose not to emulate it!"

:rolleyes:
 
digitalwanderer said:
I can not believe the amount of attacks a man gets for sharing some knowledge around here anymore. :(

The technique is fine, Dig, and I would congratulate him for it if it were presented as "New technique to reduce per-fragment calculation while rendering multiple lights under SM 2.0".

What gets me is that it comes with this "SM 3.0 is 0wn3d!" junk which is quite misleading. It makes Humus (who has earned my respect over the years for his personal work and his contribution on other forums like opengl.org) look intellectually dishonest. I was just disappointed by that.
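To be clear about what the technique actually is (as I understand it from the demo's description), it's a two-pass early-out: a cheap pass marks in the stencil buffer the pixels a light can actually affect, then the expensive lighting pass only runs where the stencil test passes, so early stencil rejection skips the heavy per-pixel math. A rough OpenGL-style sketch, with the shader binds left as comments and draw_scene() as a placeholder (none of this is code from the actual demo):

Code:
#include <GL/gl.h>

/* Placeholder: submits the scene geometry. */
static void draw_scene(void)
{
    /* ... */
}

/* Renders one light in two passes, assuming the stencil buffer was
 * cleared to 0 beforehand.
 * Pass 1: a cheap shader kills (e.g. alpha-tests away) every pixel the
 *         light can't reach, so only surviving pixels write stencil = 1.
 * Pass 2: the expensive per-pixel lighting shader runs with the stencil
 *         test set to EQUAL 1; early stencil rejection culls the masked
 *         pixels before the heavy shader executes. That skip is the
 *         "dynamic branch". */
static void render_one_light(void)
{
    glEnable(GL_STENCIL_TEST);

    /* Pass 1: masking pass, no colour writes. */
    glColorMask(GL_FALSE, GL_FALSE, GL_FALSE, GL_FALSE);
    glStencilFunc(GL_ALWAYS, 1, 0xFF);
    glStencilOp(GL_KEEP, GL_KEEP, GL_REPLACE);
    /* bind the cheap range-test shader here */
    draw_scene();

    /* Pass 2: full lighting, only where stencil == 1. */
    glColorMask(GL_TRUE, GL_TRUE, GL_TRUE, GL_TRUE);
    glStencilFunc(GL_EQUAL, 1, 0xFF);
    glStencilOp(GL_KEEP, GL_KEEP, GL_KEEP);
    /* bind the expensive per-pixel lighting shader here */
    draw_scene();

    glDisable(GL_STENCIL_TEST);
}

Whether the extra geometry pass ends up cheaper than a real SM3.0 branch obviously depends on the scene, which is exactly the part the "pwned" framing glosses over.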
 
John Reynolds said:
That's working from an assumption that all game development decisions are based on technical merits rather than the business/marketing/money side of things. In an ideal world, game development would focus entirely upon sound technical decisions, but we don't live in an ideal world.

So you agree with the statement that, although the FX 59xx series and R98xx series both already struggle with Far Cry, adding even more intensive visual effects somehow won't degrade the performance even further (probably to an unplayable level)?

Doesn't make much sense to me. Crytek had to convert a bunch of 2.0 shaders to 1.1 to get the game to run well on last generation's cards, and you think they are gonna run 3.0-emulated/more intense shaders well when they can't even do full 2.0 as it stands? Not exactly the best logic IMO.
 
Ruined said:
Doesn't make much sense to me. Crytek had to convert a bunch of 2.0 shaders to 1.1 to get the game to run well on the FX series, and you think it's gonna run 3.0-emulated shaders well? Not exactly the best logic IMO.

Just because the FX series has poor PS2.0 performance does not mean the 9xxx series' PS2.0 is affected. They run Far Cry pretty well, actually. As for graphics settings, well, I know that I'd rather have offset mapping and the like than a higher res (I play at 1024 max since I have a 15in monitor).
 