R200 & RV250 to kick NV25 in next year's games?

jb said:
DT,

why do you call it a lame a$$ game? I thought a die hard UT fan like you would be looking forward to it.

because obviously it's been optimised for nVidia cards against the 8500 8)

j/k
 
Randell said:
Dolemite said:
50+ FPS at medium detail on a GF2MX at 800x600. Very playable on high detail at the same resolution or medium detail up to 1280x1024 on an indoor map.

How exactly is that forcing the upgrade path?

Well I don't call ~23fps on high detail playable for an 'ONLINE' game.

I see UT2003 forcing upgrades because people aren't used to turning detail settings and resolutions down as low as what Anand describes as 'medium'. Once they have to do that to get playable framerates online, and can see from screenshots or even friends' computers what the higher detail settings look like, people with GF2MXs and less will upgrade.

Geforce2MX400 DM-Antalus 800x600 Medium Detail: 52.5 FPS
Geforce2MX400 DM-Asbestos 800x600 High Detail: 46.5 FPS
Geforce2MX400 DM-Asbestos 1280x1024 Medium Detail: 52.5 FPS

Those numbers are very respectable considering that the GF2MX400 turns in almost the same FPS in the original UT. If 50+ fps was playable for UT, it'll be playable for UT2003. If the GF2MX folks are used to turning down the detail a bit (and they'd have to be, if they've touched JK2, RTCW, etc) I'm sure they'll be ready to do it for UT2003.

Sorry guys but I just don't see UT2003 being the game to upgrade for this year.
 
Asbestos is a small indoor 2v2 map. On any of the outdoor maps, or the larger CTF arenas, especially maps with precipitation, the GF2MX will be pulling a 20-30fps average at 800x600/high detail. That is not playable online, especially with a fair number of people on better cards/machines 'owning' you because of hardware. Plus a lot of people have the MX200, not the MX400, so their position will be worse.

I can't see people going, 'oh look, Asbestos is on, I'll jump in and play... oh no, server rotation, this next map (Antalus) is unplayable, I'll come off' for long without wanting to upgrade.

Plus we don't really know what Anand's 'medium detail' actually looks like, but the GF2MX400 still only pulls 34.6 fps at medium settings on Antalus at 1024x768.

All this is on an XP2000 - people with 1GHz CPUs and GF2MXs will be really screwed (all IMO, of course).
 
Speaking from experience, with over 1000 hours of UT matches under my belt, Randell is correct: if you're pulling 30-40 fps in a multiplayer game with 8 bots running around, you are simply target practice. If the UT2003 demo is anything like the leaked version I have, that is NOT an indication of a real online game (I've said this many times); for one thing the bot count is very low, and a much more realistic number would be 20.
Last night I played a match with 12 players and was getting 145 fps avg, which is very good; this server capped the players at 12, which is unusual.

JB:
I am a huge UT fan but was very disappointed they chose a DX7 engine again, although they made the engine scalable for future upgrades.
DX8 cards have been out for two years and I was hoping for an engine similar to NWO...
That said, the Unreal engine was IMO the best-looking engine of its era, so hopefully they pull off the same magic. IMO Quake 3 was the same old palette of browns and dark colors and was getting old, not to mention the poor physics and gameplay...
--------------------------------------------------------------------------------
NWO

- Full 32bit rendering as default
- DVA technology for efficient, exact visibility processing
- Efficient graphics/texture compression/decompression, without any quality loss
- Multiple graphics processing pipelines, distributable over multiple processors
- Renders bezier patches, NURBS surfaces and polygons
- Supports hardware T&L for increased throughput
- Unique surface technology, blends unlimited number of textures
- High quality decal support; decals affect bumpmapping and lighting
- True bumpmapping technology, real-time calculated
- True real-time dynamic lighting, no light maps
- Full real-time Phong/Blinn and Metal shading
- Real-time calculated full quality specular highlights
- Omni, spots, directional and volumetric light-sources
- Dynamic real-time shadows, full quality, sharp or soft
- 24bit RGB true-color lighting, 48bit internal blends
- Light-DVA for fast, exact extraction of light / shadow receivers
- Fully dynamic/deformable geometry, no architectural limits
- High speed, high quality rendering of blended and skinned geometry
- Advanced particle/meta-particle rendering and blending
- Skeletal animation with constraints and expressions
- Real-time full forward/inverse kinematics
- Advanced motion blending / smoothing
- Highly sophisticated dynamics subsystems
- Full hardware accelerated rendering
 
DT,

I asked Tim about that, and more or less his answer was that there are not enough DX8 cards out there - DX7 cards are the majority of what people have. I know that any hardcore gamer has upgraded to a GF3/GF4 or 8500 class of card, but a larger majority have NOT, and that's the issue. In fact, at the mod summit I was at, Epic gave away about 20 GF4/8500s to members of the mod teams. I was surprised that only a few of them had a GF3 or higher - and these are the leading mod teams, no less.

Trust me, those leaked betas are pure crap compared to the final game. I had a chance to play it, and not once did I think, 'man, this sucks, no DX8 effects'. I am glad that some engines push the envelope; UT2003 is not going to set many records for being first to do it. Also remember that the tech for this game was created before large numbers of GF3/GF4/8500 cards were out, and they have a bunch of licensees for this engine, so they had to finalize the design set (DX stuff) a long time ago. Say what you will of Tim, but after having a chance to talk to him one-on-one, he really is a sharp guy that, believe it or not, 'knows his stuff' :)
 
nVidia really DID NOT help the PC world with the release of the stupid GeForce4 MX. Even if they're quite good for the price/perf, I was really pissed at them for reusing the old DX7 core (with LMA and stuff added, OK, but...) yet again...

Too bad.
 
Randell & DT,

You guys are still thinking in enthusiast mode. Unfortunately, hardware enthusiasts and hardcore gamers do not drive the industry. To the normal game buying public, 50 FPS at medium detail at 800x600 is very playable. That's 50 FPS on a large, outdoor map with lots of overdraw and 120 FPS with the same settings on an indoor map. If the Geforce2 MX that came with Joe Average's Gateway will run UT2003 that well, then he's not going to upgrade.
 
TBH I couldn't give a fig that UT2003 has no PS effects. The effects I've seen - reflective surfaces, billowing smoke, detailed textures everywhere, moving shadows from trees, detailed models, rag-doll physics - were all so much better than I was expecting, and better than elsewhere, where the same old 'Q3A engine' look is replicated.

Question for the coders - what does full cubemap support give you in terms of effects?
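To sketch an answer to the cubemap question: a cube map is sampled with a raw 3D direction vector, so per-pixel reflections and environment lighting on arbitrarily curved surfaces come almost for free, with no per-mirror render passes. Roughly, in plain C (illustrative code, not from any engine):

```c
#include <assert.h>
#include <math.h>

/* A cube map is indexed by a 3D direction; reflections on curved
   surfaces only need the reflected view vector per pixel. */
typedef struct { float x, y, z; } Vec3;

/* Reflect incident view vector i about unit surface normal n:
   r = i - 2(n.i)n */
static Vec3 reflect_vec(Vec3 i, Vec3 n) {
    float d = i.x * n.x + i.y * n.y + i.z * n.z;
    Vec3 r = { i.x - 2.0f * d * n.x,
               i.y - 2.0f * d * n.y,
               i.z - 2.0f * d * n.z };
    return r;
}

/* The hardware lookup picks the cube face whose axis dominates the
   vector: 0..5 = +X, -X, +Y, -Y, +Z, -Z. */
static int cube_face(Vec3 v) {
    float ax = fabsf(v.x), ay = fabsf(v.y), az = fabsf(v.z);
    if (ax >= ay && ax >= az) return v.x > 0.0f ? 0 : 1;
    if (ay >= az)             return v.y > 0.0f ? 2 : 3;
    return v.z > 0.0f ? 4 : 5;
}
```

Because the lookup is just a direction, the same cube map also does sky reflections, chrome highlights, and rough ambient lighting without any extra geometry passes.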

Back to 'is it a game to upgrade for?' - the consumer 3D graphics industry is about enthusiasts: not necessarily hardware enthusiasts, but gaming enthusiasts. Having to turn down all the effects and lower the resolution to get a playable game, and still not draw enough fps to compete easily online, will make people want to upgrade. I see it as the straw that broke the camel's back. Several games have chugged badly on less capable systems recently - Morrowind, GTA3, Dungeon Siege, NWN, etc. Throw in the second/third most anticipated game (after Doom3 and Unreal 2), one that needs a good system to look and play well, and people will want to upgrade their graphics cards.

My mate, who is still running a PIII-500 and a TNT2 Ultra, saw the demo and has asked me to spec an upgrade for him for later in the year.
 
I honestly hope you're right, but I just don't see it playing out that way.

For one thing, I know the development community has real data on this sort of thing and that's how they determine their baseline system.

Just take a look at the MadOnion "Users Choice" chart:

http://gamershq.madonion.com/hardware/halloffame/

Geforce2MX200/MX400/GTS/ti/Pro cards make up 30%. Clearly shader-capable cards are in the minority. And remember, Mr. Joe Average isn't running 3DMark, these are your hardcore gamers, many of whom still haven't upgraded beyond the GF2 level.

If people are going to get similar framerates in UT2003 as they did in UT, which looks to be the case at the medium detail settings, then I don't think they're going to rush out and buy a new video card. IMHO...
 
The reason those cards get those frame rates is that the engine is DX7, which IMO is disappointing considering what Carmack is doing. I always looked at Epic (which is quite small) as the mini-id Software, as they put out good games that pushed the envelope. :(
 
Dolemite said:
Just take a look at the MadOnion "Users Choice" chart:

http://gamershq.madonion.com/hardware/halloffame/

Geforce2MX200/MX400/GTS/ti/Pro cards make up 30%. Clearly shader-capable cards are in the minority. And remember, Mr. Joe Average isn't running 3DMark, these are your hardcore gamers, many of whom still haven't upgraded beyond the GF2 level.

The thing is that a high-quality killer app might force some of these people to upgrade. Another thing that baffles me is that in 1998 there was more support for 3D accelerators among developers than there is for DX8+ today, despite DX8+ cards today greatly outnumbering all the 3D accelerator cards back then.
 
Not to bag on Epic and Sweeney, but he seems to push the envelope about 2 years too late.

Of course, I've done nothing but push peoples' buttons so I can't say that with much to back my mouth up. ;)
 
Let's put it this way ...

PS 2.0 is an improved PS 1.4: more features and capabilities but, most importantly, much easier to work with. This is one of the reasons game developers never used DX8.1 features in their engines - it's too hard to program for and with DX8.1.

OK... so let's say a game developer won't necessarily make a game that requires PS 2.0-capable cards, but will make one for PS 1.3 & 1.4 - for the GF3 & 4 series and for the R200 - just because it's easier to work with.

That's the general attitude in game developers' minds: use the best graphics features in the easiest way, while keeping most current and near-future cards capable of playing the game.
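That attitude is easy to picture in code: engines of the period query the card's highest supported pixel shader version (Direct3D packs it as major/minor, cf. the D3DPS_VERSION macro) and pick the broadest path that fits. A minimal sketch in plain C - the macro, names, and tiering policy here are illustrative, not any real engine's:

```c
#include <assert.h>
#include <string.h>

/* Pack a pixel shader version as (major, minor), the same idea as
   Direct3D's D3DPS_VERSION macro (this macro is illustrative). */
#define PS_VERSION(maj, min) (((maj) << 8) | (min))

/* Given the highest PS version the card reports, pick a render path:
   an optional PS 1.4 path, a PS 1.1 baseline, and a fixed-function
   DX7 fallback for GF2/GF4MX-class hardware. */
static const char *pick_render_path(int caps_version, int have_ps14_path) {
    if (have_ps14_path && caps_version >= PS_VERSION(1, 4))
        return "ps_1_4";
    if (caps_version >= PS_VERSION(1, 1))
        return "ps_1_1";
    return "fixed_function";
}
```

Note that a card reporting PS 2.0 still ends up on the 1.4 path here, which is exactly the 'best features, least effort' trade-off described above.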
 
It occurs to me that Epic may want UT2K3Bench to show higher numbers, in order to encourage people to buy the game (or, more specifically, not to discourage them). Thus, I still value the more demanding Unreal Performance Test as more indicative of future performance. I'd consider a multiplayer FPS benchmark valid only if it were recorded with 16+ players per map, not 2v2 as Randell said Asbestos was.

Without pictures, though, I don't really consider benchmark numbers valid. And this UT2K3Bench smells too much like marketing, even though I tend to think Anand is an honest guy. I'll wait till a demo is actually released to start paying real attention.
 
Correction - I double-checked: Asbestos is smallish but is 4v4. However, other indoor maps that are 2v2 seem bigger :)

TBH, if UT2003 is scalable enough to be playable for average Joe Gamer on a PIII-700 and a GF2MX, yet scalable in effects and IQ to tax a 2GHz+ P4 and GF4 Ti, then all credit to Tim Sweeney and his team IMO.

Of course Gf2MX people who upgrade may move to a Gf4MX :rolleyes:
 
I don't think developers target minor revision APIs. They target major revisions. PS2.0 is a major revision. PS1.4 isn't. I doubt many developers are going to write code specifically for 1.4 if they can avoid it. They will target PS1.1, then PS2.0, and then, company resources permitting, do a 1.4.

Basically, developers go where the money is, and bet on where they think the standard platform will be by the time they finish development. Any sufficiently large base of pixel shading capability in the future is either going to be PS1.1 (GF3/GF4/Radeon 8500/R300/NV30/Parhelia/P10/Xabre) or PS2.0 (R300 or NV30). 1.4 represents a tiny minority of the overall market. Basically, optimizing a game engine around 1.2/1.3/1.4 is a waste of time.

2.0 shaders won't automagically "gracefully degrade" into 1.4 shaders, so supporting 1.4 will take a lot of work on the part of artists, designers, and coders to come up with workarounds. And 1.1 shaders won't automagically run or look better on 1.4. Either way, there's not much incentive to support a stop-gap API that will be forgotten in a year when all new cards are DX9.
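One concrete reason there is no "graceful degradation": each PS version caps how many instructions a shader program may use, so a 2.0-sized effect simply doesn't fit in 1.4 and has to be rewritten or faked. A toy check in C - the limits below are the commonly cited DX8/DX9 figures (PS 1.4 counted across its two phases) and are meant as illustration only:

```c
#include <assert.h>

/* Per-version instruction budgets: a shader that exceeds them cannot
   be compiled down, only redesigned. Figures are the commonly cited
   DX8/DX9 limits, shown for illustration. */
typedef struct { const char *ver; int max_arith; int max_tex; } PsLimits;

static const PsLimits PS_1_1 = { "1.1",  8,  4 };
static const PsLimits PS_1_4 = { "1.4", 16, 12 };  /* 8 arith + 6 tex per phase */
static const PsLimits PS_2_0 = { "2.0", 64, 32 };

/* Would a shader needing this many arithmetic and texture
   instructions fit the given version? */
static int shader_fits(int arith, int tex, PsLimits lim) {
    return arith <= lim.max_arith && tex <= lim.max_tex;
}
```

A hypothetical 40-instruction 2.0 effect fails the 1.4 check outright, which is the "lots of work for artists and coders" the post describes: the fallback is a different, simpler effect, not the same shader.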

If you currently own an R200 derivative, just bite the bullet and go for the R300, and stop worrying about some future game making your current card more valuable. Accept the upgrade treadmill, and that your current card will rapidly depreciate in value over the next 12 months.

Once you get over the notion that a video card is an "investment", you won't stress out about 1.4 PS being "wasted".
 
You're the coder so .... :)

Anyway, PS 1.4-capable cards are not so few: the R8500 does it, Parhelia does it, and now Xabre will do it too.

That leaves just VIA/S3 and nVidia's GF3 & 4 without PS 1.4.

Also... how about those 100 games with special ATi optimisations...?

And just look at how Parhelia benefits from PS 1.4 in NWO!
 
Anyway, PS 1.4-capable cards are not so few: the R8500 does it, Parhelia does it, and now Xabre will do it too.

Parhelia and Xabre are PS1.3 capable, not PS1.4.
 
Yep, and this lame-ass labelling system for DirectX compliance has got to stop. In the future I hope Microsoft includes ONE pixel shader revision per DirectX revision, i.e. DX10 will be PS 3.0. None of this PS 2.1, 2.2, 2.25 :rolleyes: - the card is PS 3.0, so it is DX10 compliant.
 