One last time: does UT2 utilize vertex shaders in any way?

Grall said:
jb,

Epic's customers, that's EXACTLY what I'm talking about when I say they're holding back software development.

Since Sweeney won't do the job others are already doing, we're going to see a string of licensed games, all of which lack hardware acceleration of these features. No wonder the host CPU gets strained so hard, considering the polygon densities in UT2!

I don't understand why you think systems would be even more bogged-down by off-loading work onto the GPU. That's just plain backwards thinking!
*G*


Again, you don't seem to see the bigger picture here. The core features of the engine have been frozen for a LONG time now, because Epic had to deliver a working engine to these developers. And do you think those developers can just crank out a game in a few weeks? How long do you think America's Army has been in development? Right, a long time. It takes time to implement PS/VS support properly. So you want a quick last-minute hack? UT2k3 is the first engine to really push a TnL unit, so I really wouldn't call that holding back software. If anyone is holding back software, it's nV with their crappy DX7 GF4mx cards. Developers code to the baseline, and thanks to nV the DX7 baseline lives on :(

And yes, in non-bot matches your video card is the bottleneck. So requiring DX8 pixel shaders, which non-DX8 hardware would have to fall back on slow emulation for, would slow people's PCs down more than having the game only require DX7 hardware does.
 
Grall said:
jb,

Epic's customers, that's EXACTLY what I'm talking about when I say they're holding back software development.

Since Sweeney won't do the job others are already doing, we're going to see a string of licensed games, all of which lack hardware acceleration of these features. No wonder the host CPU gets strained so hard, considering the polygon densities in UT2!

I don't understand why you think systems would be even more bogged-down by off-loading work onto the GPU. That's just plain backwards thinking!
I, too, have trouble following this argument. The engine DOES support both vertex shaders and pixel shaders; it's up to the development team to decide whether they want or need to use them. How can that hold back software development? The possibility is there. Epic simply didn't feel it was necessary for UT2003, and that's that. It was a design decision, not an engine limitation.

ta,
-Sascha.rb
 
We didn't find any use for them in UT2003. Basically, all you realistically need vertex shaders for on DX8 cards is to set up pixel shaders. As we're happy with the DX7 blending approach for UT2003, there was no real need for either pixel or vertex shaders. We do use pixel shaders for terrain rendering, but that's just a minor optimization; the DX7 blending fallback is almost as fast and looks 100% identical.

-- Daniel, Epic Games Inc.

nggalai said:
didn't feel it necessary for UT2003, that's that. It was a design decision, not an engine limitation.
 
jb said:
If anyone is holding back software, it's nV with their crappy DX7 GF4mx cards. Developers code to the baseline. Thanks to nV the DX7 baseline lives on :(

While true, you can't lay the blame solely on nVidia. ATI wasn't much faster in putting out an entry-level DX8 card.

However, it really looks like nVidia will correct this mistake (and then some) by releasing an entry-level DX9 part early next year, and even an integrated NV3x nForce sometime next year.

To me, it really seems like an odd choice on nVidia's part. That is, they were very quick to release the GeForce2 MX after the original GeForce came out, but it doesn't look like they'll ever release an entry-level NV2x part. I'd be curious to find out the reasons for this. Did nVidia only realize after the fact that skipping an entry-level NV2x part was a bad decision? Or have they felt since the inception of the NV2x core that it wasn't a good core for entry-level parts (whereas the NV3x core is, for some reason)?

Update: Oh, and by the way, even if nVidia had released an entry-level DX8 part around the time of the GeForce4 MX launch instead, then going by the GeForce2 MX's launch timing, we'd still be about 1.5-2 years away from any real DX8 games. Right now it looks like instead of being 1.5-2 years away from a real DX8 game, we're 2.5-3 years away from a real DX9 game. (Without a low-end DX8 card from nVidia, we may not see more than a tiny handful of fully-DX8 games.)
 
Perhaps the advent of the Xbox will bring DX8-level games out faster? Develop the game for the Xbox, then expend a little more effort to port it to the PC.

--|BRiT|
 
However, it really looks like nVidia will correct this mistake (and then some) by releasing an entry-level DX9 part early next year, and even an integrated NV3x nForce sometime next year.

Ahem.

"Regardless of whether or not nVidia has been able to leak roadmaps for parts based on DX9 tech, it doesn't mean a damned thing until they release a card with DX9 capability."
 
Just out of curiosity, how large is the installed user base of DX7-level cards capable of playable (reasonable) framerates in UT2K3?
 
Chalnoth said:
While true, you can't hold the blame solely on nVidia. ATI wasn't much faster putting out an entry-level DX8 card.

However, it really looks like nVidia will correct this mistake (and then some) by releasing an entry-level DX9 part early next year, and even an integrated NV3x nForce sometime next year.

I think ATI has been quite a bit faster than nVidia in getting out an entry-level DX8 card, since the 9000 has been out for a few months and nVidia has yet to even announce anything of the sort. You should also give ATI credit for producing the only mobile DX8 part.

Also, ATI is due to announce a mainstream (though not entry-level) DX9 part soon. I don't know how you've decided that the NV31 is going to be an entry-level part, rather than something in the $200 - $300 range.

If nVidia does release an NV3x integrated nForce next year, I will be quite impressed.
 