New dynamic branching demo

Eronarn said:
Ruined said:
Doesn't make much sense to me. Crytek had to convert a bunch of 2.0 shaders to 1.1 to get the game to run well on the FX series, and you think it's gonna run 3.0-emulated shaders well? Not exactly the best logic IMO.

The FX series having poor PS2.0 performance doesn't mean the 9xxx series does; they run Far Cry pretty well, actually. As for graphics settings, well, I know that I'd prefer to have offset mapping and the like over a higher res (I play at 1024 max since I have a 15in monitor).

Even at 1024x768 with 4x AA/8x AF, the 9800 scores ~30fps average in Far Cry. I doubt you'd be able to run the game playably with offset mapping. Those new shaders were a significant performance hit for the 6800 even in SM3.0 in the beta patch. They should be faster in the final patch, but I doubt fast enough to be good for old cards.
 
Ruined said:
Doesn't make much sense to me. Crytek had to convert a bunch of 2.0 shaders to 1.1 to get the game to run well on last generation's cards, and you think they are gonna run 3.0-emulated/more intense shaders well when they can't even do full 2.0 as it stands? Not exactly the best logic IMO.

Once a game has been released you've made the majority of the money you are going to make from it. Patches are usually released purely for compatibility purposes; you don't spend significant development time creating new features, as that time is usually spent on your next project, the one that will earn revenue from actual sales. It's already been demonstrated that the 6800 can run the Radeon path fairly easily, so if you were just patching to get the 6800 running the best the game already has, why would you not just alter the title a little in order to do this?
 
nice demo :)

...sounds kind of dumb for Crytek to implement SM3.0 effects, seeing as how less than 1% of the market has the ability to display them. They'd probably get more wows if they went for 2.0. The game is still mostly SM1.x anyhow....
 
DaveBaumann said:
Once a game has been released you've made the majority of the money you are going to make from it

You are forgetting that Far Cry is based on a marketable graphics engine. By making a big technology splash with SM3.0 and their engine, they move their DX9 engine into a better position in the market to compete with Half-Life 2's DX9 engine, which is only debuting with SM2.0. By showcasing all these new technologies they can catch the eyes of devs and publishers before Half-Life 2 even gets out of the gate.
 
DaveBaumann said:
And you don't want marketable engines to support decent fallbacks?

Far Cry supports 1.1/2.0/3.0, what are you talking about? It's up to the programmer to decide what shaders they want to use where.

And TBH it does move more units. I just ordered a new 6800GT card, as did my friend. I bought Far Cry, and he is planning to as well, just to check out the SM3.0 effects. I probably wouldn't have bought it otherwise, at least not until it was real cheap. So I think it increases sales by being tied to a product launch, too. Far Cry was the SM3.0 game everyone saw at the NV40 launch, so it's almost a must-have game for NV40 buyers.
 
Ruined said:
Eronarn said:
Ruined said:
Doesn't make much sense to me. Crytek had to convert a bunch of 2.0 shaders to 1.1 to get the game to run well on the FX series, and you think it's gonna run 3.0-emulated shaders well? Not exactly the best logic IMO.

The FX series having poor PS2.0 performance doesn't mean the 9xxx series does; they run Far Cry pretty well, actually. As for graphics settings, well, I know that I'd prefer to have offset mapping and the like over a higher res (I play at 1024 max since I have a 15in monitor).

Even at 1024x768 with 4x AA/8x AF, the 9800 scores ~30fps average in Far Cry. I doubt you'd be able to run the game playably with offset mapping. Those new shaders were a significant performance hit for the 6800 even in SM3.0 in the beta patch. They should be faster in the final patch, but I doubt fast enough to be good for old cards.

...so because it runs at slower frame rates at those settings, it can't manage it at all under any? I'd rather have offset mapping than AA! I usually run things at 1024x768, max settings for textures and the like, 2x AA (w/ TAA), and 4x AF. The appearance of models, textures and the like is much more important to me than AA.
 
No, making engine features that only support one shader model is not a selling point, since it restricts their usage and makes licensees' lives more difficult. For instance, there are some elements in UnrealEngine3 that will require PS3.0 for instruction length; however, they can automatically be compiled to the PS2.a and PS2.b models, and fallbacks are provided to enable the same effect on PS2.0 hardware that can't natively support those shader lengths.

If you have a potential fallback then you want to sell that fallback as part of the engine.
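To make the fallback idea concrete, here's a rough sketch of profile-stepping with the D3DX HLSL compiler. This is only an illustration; "lighting.hlsl" and "PixelMain" are made-up placeholders, not anything from an actual engine:

[code]
#include <d3dx9.h>

// Try the pixel shader profiles from most to least capable. A long shader
// that blows past ps_2_0's instruction limit may still fit ps_2_a or ps_2_b.
LPD3DXBUFFER CompileWithFallback()
{
    const char* profiles[] = { "ps_3_0", "ps_2_b", "ps_2_a", "ps_2_0" };
    for (int i = 0; i < 4; ++i)
    {
        LPD3DXBUFFER code = NULL;
        LPD3DXBUFFER errors = NULL;
        HRESULT hr = D3DXCompileShaderFromFile("lighting.hlsl", NULL, NULL,
                                               "PixelMain", profiles[i], 0,
                                               &code, &errors, NULL);
        if (errors)
            errors->Release();
        if (SUCCEEDED(hr))
            return code;   // first profile the shader fits in
    }
    return NULL;           // caller drops to a hand-written simpler effect
}
[/code]

A real engine would pick the profile from the device caps rather than brute-forcing compiles, but the point stands: one HLSL source, several compile targets, plus hand-written fallbacks where even 2.0 can't express the effect.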
 
DaveBaumann said:
No, making engine features that only support one shader model is not a selling point, since it restricts their usage and makes licensees' lives more difficult.

That's great, but as I stated above, Far Cry supports SM1.1, SM2.0, and SM3.0, so your point is irrelevant. Devs should support old technology, but they shouldn't be required or expected to emulate every single effect for older cards... If you want the new effects, buy the new technology. If you don't care, keep the old technology.
 
DaveBaumann said:
No, read it again and see the relevance.

Considering Unreal3 doesn't come out until 2006, I don't think Crytek has anything to worry about competing with its engine there.

Right now, the only SM3.0 game that Far Cry has to compete with is... that's right, there are none. :)
 
Ruined said:
Eronarn said:
Ruined said:
... just like writing fallback shaders for every single visual effect for ATI cards and Nvidia cards that do not support the latest shader technology is not necessary.

Quote corrected.

They're shafting their own customers, not just ATI's.

Not true. The SM3.0 effects would most likely run too slow on any SM2.0 card with the exception of the X800 series, based on the performance of the beta patch at E3, so if they did write fallback emulation for the SM3.0 effects, they would probably only be playable on ATI X800 cards. How worthwhile it is to code for one card series is questionable.

Looks like you are a little angry because Humus found a way to "turn around" all this SM3.0 stuff that Nvidia is promoting with the 6800 cards...
Maybe you would be a little happier if X800 cards didn't run SM3.0 stuff, right? That would justify your choice of a 6800.
Stop bashing Humus and go play some games with your card; this is really great news for all of us poor guys who don't have the money to buy a top card. I hope this works on FX and R3xx cards.
Thanks Humus.
 
Ruined said:
John Reynolds said:
That's working from an assumption that all game development decisions are based on technical merits rather than the business/marketing/money side of things. In an ideal world, game development would focus entirely upon sound technical decisions, but we don't live in an ideal world.

So you agree with the statement that although the FX 59xx series and R98xx series both already struggle with Far Cry, adding even more intensive visual effects will somehow not degrade performance even more (probably to an unplayable level)?
Now this is what qualifies as 100% unsubstantiated FUD. The R98xx series does not struggle in Far Cry, while the FX59xx definitely does, and on top of that at lower quality. My 9800 Pro was just as playable at 1024x768 with max settings as my X800 Pro is now with the same settings at 1280x1024. If you bother to do the math, you will see that upping the res one notch required about 66% more fillrate from my card to display the same scene. Very close to the difference between these two cards in most benches.
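For anyone who doesn't want to do that math in their head, here are the pixel counts behind the fillrate figure (a trivial check, nothing card-specific):

[code]
#include <stdio.h>

int main()
{
    const double lo = 1024.0 * 768.0;   //   786,432 pixels per frame
    const double hi = 1280.0 * 1024.0;  // 1,310,720 pixels per frame
    // hi/lo = 1.666..., i.e. roughly two-thirds more pixels to fill
    printf("%.0f%% more fillrate needed\n", (hi / lo - 1.0) * 100.0);
    return 0;
}
[/code]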
Now of course you're going to say I'm crazy, but the 9800XT actually outperformed the 6800U in several instances in Far Cry. That doesn't sound like a card that can't run SM 2.0 well.
16x12 no aa/af, avg. fps: 9800XT=33.2 (20 low), 6800U=38.6 (16 low)
16x12 4xaa/16af, avg. fps: 9800XT=17.2 (10 low), 6800U=15.5 (8 low)
http://www.hardocp.com/article.html?art=NjExLDg=

So what's the bottom line? If the Crytek patch rumours are true, there is only one reason - $$$$. Nothing else, no need to come up with any crazy theories.
 
Dave, you're talking to a wall. Ruined will keep on going even when there's proof that goes against him. Arguing with him will only lower yourself to his level, and I really don't want that to happen.
 
I would like to know why you claim these things need to be emulated?

I only know of one that needs to be emulated, and that's dynamic branching. It would speed up not only R3x0 and R42x but also NV3x hardware in Far Cry.
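For reference, the trick in the demo amounts to a two-pass stencil early-out: a cheap pass marks which pixels actually need the expensive shader, and the real pass is stencil-tested so early stencil rejection skips everything else. A rough D3D9-style sketch of the idea follows; the demo itself is OpenGL, and every name here is a hypothetical placeholder:

[code]
#include <d3d9.h>

void DrawScene(IDirect3DDevice9* dev);   // hypothetical: issues the draw calls

// Two-pass stencil early-out, standing in for per-pixel dynamic branching.
void DrawWithStencilEarlyOut(IDirect3DDevice9* dev,
                             IDirect3DPixelShader9* cheapConditionPS,
                             IDirect3DPixelShader9* expensiveLightingPS)
{
    // Pass 1: the cheap shader outputs alpha = 1 where the "branch" is taken
    // (e.g. the pixel is inside the light radius) and 0 elsewhere. Alpha test
    // kills the zeros, so stencil gets tagged only where work is needed.
    dev->SetRenderState(D3DRS_COLORWRITEENABLE, 0);            // no color yet
    dev->SetRenderState(D3DRS_STENCILENABLE, TRUE);
    dev->SetRenderState(D3DRS_STENCILFUNC, D3DCMP_ALWAYS);
    dev->SetRenderState(D3DRS_STENCILPASS, D3DSTENCILOP_REPLACE);
    dev->SetRenderState(D3DRS_STENCILREF, 1);
    dev->SetRenderState(D3DRS_ALPHATESTENABLE, TRUE);
    dev->SetRenderState(D3DRS_ALPHAFUNC, D3DCMP_GREATER);
    dev->SetRenderState(D3DRS_ALPHAREF, 0);
    dev->SetPixelShader(cheapConditionPS);
    DrawScene(dev);

    // Pass 2: the expensive shader runs only where stencil == 1. Hardware
    // with early stencil reject skips the rest before shading ever starts.
    dev->SetRenderState(D3DRS_COLORWRITEENABLE, 0x0F);
    dev->SetRenderState(D3DRS_ALPHATESTENABLE, FALSE);
    dev->SetRenderState(D3DRS_STENCILFUNC, D3DCMP_EQUAL);
    dev->SetRenderState(D3DRS_STENCILPASS, D3DSTENCILOP_KEEP);
    dev->SetPixelShader(expensiveLightingPS);
    DrawScene(dev);
}
[/code]

Whether the second pass actually gets cheaper depends entirely on the hardware rejecting stenciled-out pixels before the shader runs, which would explain why the demo reportedly loses ~10% on the FX series.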

HDR can be done on the R3x0 and R42x. It's been proven by Half-Life 2.

They may not do FP16 blending, but they can do HDR, so why not include it?
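And for what it's worth, one way to get HDR-range values out of a plain 8-bit-per-channel target without FP16 blending is a shared-exponent encoding along these lines. A CPU-side illustration of the idea only, not necessarily what any shipped game does:

[code]
#include <math.h>

struct RGBA8 { unsigned char r, g, b, a; };

// Pack an HDR color (non-negative channels) into RGBA8 by storing a shared
// exponent in alpha, so values above 1.0 survive an 8-bit render target.
RGBA8 encodeRGBE(float r, float g, float b)
{
    float m = fmaxf(r, fmaxf(g, b));   // brightest channel picks the exponent
    int e = 0;
    frexpf(m, &e);                     // m == mantissa * 2^e, mantissa in [0.5, 1)
    float scale = ldexpf(255.0f, -e);  // 255 / 2^e
    RGBA8 out = { (unsigned char)(r * scale), (unsigned char)(g * scale),
                  (unsigned char)(b * scale), (unsigned char)(e + 128) };
    return out;
}

// Undo the shared exponent for one channel.
float decodeChannel(unsigned char c, unsigned char a)
{
    return (c / 255.0f) * ldexpf(1.0f, a - 128);
}
[/code]

The obvious cost is that you can't meaningfully alpha-blend such a target, which is exactly the trade-off being argued about here.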

It's the same reason the Tiger Woods game wouldn't allow certain resolutions to be picked on ATI hardware, even though ATI hardware would run them just as fast as Nvidia hardware.

It's called money.

I have no problem with that. If they want to do it, that's fine. I just won't spend my money on the games that do it.

But to compare that situation to another?

At least with Half-Life 2 we had Doom 3, 3DMark, and Tomb Raider telling us the same thing.
 
{Sniping}Waste said:
Dave, you're talking to a wall. Ruined will keep on going even when there's proof that goes against him. Arguing with him will only lower yourself to his level, and I really don't want that to happen.

Gotta agree with this; this argument seems to be descending into "No, it's not that way. Because I think so."
 
Bob3D said:
Looks like you are a little angry because Humus found a way to "turn around" all this SM3.0 stuff that Nvidia is promoting with the 6800 cards...

You consider releasing a simple graphical demo and having no game in existence showing any support whatsoever of this technique a "turnaround"? :LOL:

Maybe you would be a little happier if X800 cards didn't run SM3.0 stuff, right?

Well since the X800 is only SM2.0 they can't run SM3.0, that's right ;)

That would justify your choice of a 6800.

Ah... the "justification" argument. The same can be reversed. Maybe some are hoping their X800PRO/XT cards will not be outdated by being unable to display the latest eye candy less than a month after they bought them, and will grab onto anything that supports the notion that the X800 is just as advanced as the 6800 in shaders. In fact, that would be a much more likely "justification" than any I would make.

Stop bashing Humus and go play some games with your card; this is really great news for all of us poor guys who don't have the money to buy a top card. I hope this works on FX and R3xx cards.
Thanks Humus.

We'll see if anything actually comes of it in real-life games. Considering his demo makes the FX series run 10% *slower* when the "branching" is enabled, it's not looking to be all that helpful to anything but ATI cards - that is, if it ever actually gets implemented in a real game.
 
{Sniping}Waste said:
Dave, you're talking to a wall. Ruined will keep on going even when there's proof that goes against him. Arguing with him will only lower yourself to his level, and I really don't want that to happen.

:LOL:
 
Ruined said:
Bob3D said:
Looks like you are a little angry because Humus found a way to "turn around" all this SM3.0 stuff that Nvidia is promoting with the 6800 cards...

You consider releasing a simple graphical demo and having no game in existence showing any support whatsoever of this technique a "turnaround"? :LOL:

Remember, PS3.0 itself is a simple graphical demo having no game in existence. ;)

Maybe you would be a little happier if X800 cards didn't run SM3.0 stuff, right?

Well since the X800 is only SM2.0 they can't run SM3.0, that's right ;)

What, so now ATI is a villain if its products can do better than people thought they could? THEY ARE SUCH EVIL PEOPLE, I TELL YOU!


That would justify your choice of a 6800.

Ah... the "justification" argument. The same can be reversed. Maybe some are hoping their X800PRO/XT cards will not be outdated by being unable to display the latest eye candy less than a month after they bought them, and will grab onto anything that supports the notion that the X800 is just as advanced as the 6800 in shaders. In fact, that would be a much more likely "justification" than any I would make.

Yes, you can use that as a justification. But here's the deal: it would be wrong. People already knew it wouldn't support PS3.0. They consciously chose not to get it, knowing that they might miss out on stuff. 6800 purchasers thought they would be getting exclusive stuff - which they may not be, since the effect is faster in PS2.0.

Stop bashing Humus and go play some games with your card; this is really great news for all of us poor guys who don't have the money to buy a top card. I hope this works on FX and R3xx cards.
Thanks Humus.

We'll see if anything actually comes of it in real-life games. Considering his demo makes the FX series run 10% *slower* when the "branching" is enabled, it's not looking to be all that helpful to anything but ATI cards - that is, if it ever actually gets implemented in a real game.

Hey, news flash, the FX series sucks. I'd blame it on the card much sooner than the designer of the program.
 