Question re: GFFX, D]|[, HL2, Radeon and mods.....

jimbob0i0

Newcomer
Well, we all know how 'well' the GFFX copes with HL2 code when it isn't specially 'optimised', and the same goes for D]|[ code, while the Radeon runs either just fine.

Here's a question I hope someone can answer for me as I'm not certain...

Will id be supporting mods of any kind for D]|[? After all, they are presumably going to license the engine once it's complete. If they are, will those mods allow shaders written by the modding community, as Gabe has said will be possible with HL2?

You can probably see what I'm getting at here... those shaders will no longer be specially coded with INT12/FP16 but will most likely follow the standard ARB2 path instead. Mods for HL2 will undoubtedly follow the DX9 standard.
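
(For anyone unfamiliar with what "specially coded with INT12/FP16" means in practice, here's a minimal C++ sketch of the same trivial fragment program written once for the generic ARB2 path and once for an NV3x-specific path. The program strings and names are made up for illustration, not actual Doom ]|[ or HL2 code.)

[code]
// Illustrative sketch only: the same "texture modulated by vertex colour"
// effect, once as a generic ARB2 fragment program and once as an
// NV3x-specific NV_fragment_program that leans on half-precision
// (H-suffixed) instructions.  Fixed-point fx12 (X-suffixed) instructions
// are the "INT12" half of the same story.

// Generic ARB2 path: full precision, runs on any DX9-class card.
static const char *arb2FragmentProgram =
    "!!ARBfp1.0\n"
    "TEMP texel;\n"
    "TEX texel, fragment.texcoord[0], texture[0], 2D;\n"
    "MUL result.color, texel, fragment.color;\n"
    "END\n";

// NV3x path: same math, but the temporary and the colour output are FP16,
// so the FX avoids its full-precision performance penalty.
static const char *nv30FragmentProgram =
    "!!FP1.0\n"
    "TEX H0, f[TEX0], TEX0, 2D;\n"
    "MULH o[COLH], H0, f[COL0];\n"
    "END\n";

// The engine picks one depending on which renderer path the card gets.
// A mod author writing only the generic version leaves NV3x on the slow path.
const char *selectFragmentProgram(bool useNv30Path)
{
    return useNv30Path ? nv30FragmentProgram : arb2FragmentProgram;
}
[/code]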

So the GFFX may be faster in the standard game on the hand-coded, inferior NV path in Doom ]|[, and may *almost* be able to keep up with the R3x0 in HL2 on its own path, but what about when the community gets beyond the stock games and starts releasing its own content?

If NV's NV3x line hasn't been withdrawn by then, imagine what will happen when peeps can't play the new versions of DoD, CS, etc....

Any insights at all?
 
That's another reason why relying on shader replacement (be it at the driver level or in-game) is a no-go: it multiplies the amount of work required every time an update ships...

If something as big as CS comes out with new shaders for HL2 or D3, then NV will have no choice but to butcher^H^H^H^H^H^H^Hoptimize the shaders at the driver level... Especially since most mod-makers don't run businesses, don't have to support their mods if they don't perform well on certain hardware, and probably won't be shy about explaining why the performance sucks on said hardware...
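
(To make that concrete, driver-level shader replacement boils down to something like the C++ sketch below. The names and structure are invented, not actual driver code, but it shows why every new or tweaked shader from a mod means more hand-tuning work and a bigger driver download.)

[code]
// Hypothetical sketch of driver-level shader replacement (all names invented).
// The driver fingerprints incoming shader text and, when it recognises a
// shader from a popular title, silently swaps in a hand-tuned version.
#include <functional>
#include <string>
#include <unordered_map>

// Table of hand-tuned replacements, keyed by a hash of the original source.
// A game patch or a mod that ships a slightly different shader misses this
// table and falls back to the slow generic path -- which is why the table
// (and the driver) only ever grows.
static const std::unordered_map<std::size_t, std::string> replacements = {
    // { hashOf(originalHl2WaterShader),      handTunedFp16Version },
    // { hashOf(originalD3InteractionShader), handTunedFx12Version },
};

std::string compileOrReplace(const std::string &appShaderSource)
{
    std::size_t key = std::hash<std::string>{}(appShaderSource);
    auto it = replacements.find(key);
    if (it != replacements.end())
        return it->second;      // recognised: substitute the hand-tuned shader
    return appShaderSource;     // unknown (e.g. a mod's shader): compile as-is
}
[/code]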

But then again, the beta Detonators 61.83 will probably be there/not there (depending on whether you want to benchmark them with blinders on or hold NV accountable for the IQ drop in them), and offer some automagic shader optimisation...
 
I can foresee a day when Nvidia users are downloading 200MB driver updates due to all of the hand-written shader optimizations. It kills me that they tout their new card as cinematic-rendering, DX9-supporting hardware, but then lambast developers for using PS 2.0 when 1.4 will do just fine??

Well heck, I'm not saying anything that hasn't been stated by hundreds of others... You just gotta ask yourself WTF are they thinking??? ATI didn't do this when the 8500 flopped against the GeForce 4... :oops:
 
Perhaps Nvidia is thinking about branching into game development. ;)

What would really be cool, and hideously impractical, would be an in-driver profiler that could be turned on by the user to improve performance over time. It seems to work pretty well for high-end processors, and it could be a sight better than an automatic optimizer that throws away everything it learns at the end of a run.
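
(Roughly what I mean, as a C++ sketch; the file format, names and the threshold are entirely invented. The only point is that the profile data survives across runs instead of being thrown away.)

[code]
// Hypothetical "persistent in-driver profiler": record how much GPU time each
// shader costs during play, save it on exit, and on the next run only spend
// recompilation effort on the shaders that actually hurt.
#include <fstream>
#include <string>
#include <unordered_map>
#include <utility>

struct ShaderStats { double gpuMillis = 0.0; long uses = 0; };

class PersistentProfiler {
    std::unordered_map<std::string, ShaderStats> stats_;  // keyed by shader hash/name
    std::string path_;
public:
    explicit PersistentProfiler(std::string path) : path_(std::move(path)) {
        std::ifstream in(path_);
        std::string key; ShaderStats s;
        while (in >> key >> s.gpuMillis >> s.uses)   // reload what earlier runs learned
            stats_[key] = s;
    }
    void record(const std::string &key, double millis) {
        ShaderStats &s = stats_[key];
        s.gpuMillis += millis;
        ++s.uses;
    }
    bool worthRecompiling(const std::string &key) const {
        auto it = stats_.find(key);                  // invented threshold: only bother
        return it != stats_.end() && it->second.gpuMillis > 100.0;  // with costly shaders
    }
    ~PersistentProfiler() {                          // persist for the next run
        std::ofstream out(path_);
        for (const auto &kv : stats_)
            out << kv.first << ' ' << kv.second.gpuMillis << ' ' << kv.second.uses << '\n';
    }
};
[/code]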
 