Why are games so far behind 3D technology?

It's the other 40% that don't even meet the DX7 spec.

Just out of interest, what do you consider to be 'the DX7 spec'? Do you mean a card must support every feature allowed by DX7 in hardware? If not, then which features present in DX7 do you consider not to be needed for a card to support the DX7 spec?
 
Teasy said:
Just out of interest, what do you consider to be 'the DX7 spec'? Do you mean a card must support every feature allowed by DX7 in hardware? If not, then which features present in DX7 do you consider not to be needed for a card to support the DX7 spec?

Complete hardware support of DX7, yes, that's what I was trying to get at.
 
FWIW, Blizzard has probably made way more money in game sales than id, just because their games will run on anything. But not only that: despite the fact that they don't "push the envelope", they also have developers who know what they are doing. Blizzard will never release a game that's a buggy PoS (NWN, for example), and they'll take as long as necessary to guarantee that.

Johnny Rotten said:
Teasy said:
Just out of interest, what do you consider to be 'the DX7 spec'? Do you mean a card must support every feature allowed by DX7 in hardware? If not, then which features present in DX7 do you consider not to be needed for a card to support the DX7 spec?

Complete hardware support of DX7, yes, that's what I was trying to get at.

Not even GF1 and GF2 are DX7 then, because they don't support EMBM.
 
True, but there has been a clear move away from EMBM (in fact it never really 'caught on' in the first place) by the major players towards dot3 blending in general, which IS supported by Radeon/GeForce-class chips.
 
Johnny Rotten said:
True, but there has been a clear move away from EMBM (in fact it never really 'caught on' in the first place) by the major players towards dot3 blending in general, which IS supported by Radeon/GeForce-class chips.

What was supposed to be better about EMBM anyway? And I wonder if it never caught on because Nvidia didn't support it until GF3+ (GF3 and later do support it, don't they?).
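For what it's worth, here's roughly what the two techniques look like to a developer (a minimal sketch, assuming a D3D7-style device; the function names and surrounding setup are made up for illustration, but the texture-stage states are the standard DX SDK ones). EMBM uses a signed (du,dv) bump map to perturb the coordinates used to read an environment map, which is why it was pitched at shiny/rippled surfaces, while dot3 dots an RGB-encoded normal map against a light vector for per-pixel diffuse lighting.

Code:
#include <d3d.h>   // DirectX 7 interfaces

// Dot3: the normal map's RGB encodes a surface normal; the light vector is
// range-compressed into the vertex diffuse color, and the stage computes N.L.
void SetupDot3Stage(LPDIRECT3DDEVICE7 dev, LPDIRECTDRAWSURFACE7 normalMap)
{
    dev->SetTexture(0, normalMap);
    dev->SetTextureStageState(0, D3DTSS_COLOROP,   D3DTOP_DOTPRODUCT3);
    dev->SetTextureStageState(0, D3DTSS_COLORARG1, D3DTA_TEXTURE);
    dev->SetTextureStageState(0, D3DTSS_COLORARG2, D3DTA_DIFFUSE);
}

// EMBM: the (du,dv) bump map in stage 1 perturbs the coordinates used to read
// the environment map in stage 2; a 2x2 matrix (D3DTSS_BUMPENVMAT00..11,
// not shown) scales the perturbation.
void SetupEmbmStages(LPDIRECT3DDEVICE7 dev,
                     LPDIRECTDRAWSURFACE7 baseTex,
                     LPDIRECTDRAWSURFACE7 bumpDuDv,
                     LPDIRECTDRAWSURFACE7 envMap)
{
    dev->SetTexture(0, baseTex);                                   // ordinary base texture
    dev->SetTextureStageState(0, D3DTSS_COLOROP,   D3DTOP_MODULATE);
    dev->SetTextureStageState(0, D3DTSS_COLORARG1, D3DTA_TEXTURE);
    dev->SetTextureStageState(0, D3DTSS_COLORARG2, D3DTA_DIFFUSE);

    dev->SetTexture(1, bumpDuDv);                                  // bump map stage
    dev->SetTextureStageState(1, D3DTSS_COLOROP,   D3DTOP_BUMPENVMAP);
    dev->SetTextureStageState(1, D3DTSS_COLORARG1, D3DTA_TEXTURE);
    dev->SetTextureStageState(1, D3DTSS_COLORARG2, D3DTA_CURRENT);

    dev->SetTexture(2, envMap);                                    // env map, read via perturbed coords
    dev->SetTextureStageState(2, D3DTSS_COLOROP,   D3DTOP_ADD);
    dev->SetTextureStageState(2, D3DTSS_COLORARG1, D3DTA_TEXTURE);
    dev->SetTextureStageState(2, D3DTSS_COLORARG2, D3DTA_CURRENT);
}

A GF1/GF2 can run the first function but would fail device validation on the second, which is the whole "not full DX7 in hardware" argument in a nutshell.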
 
Mr. Blue said:
This poses another question, however. If the gaming companies are lagging behind, then why accelerate the 3D technology as fast as it's going? It seems that only people who want the bleeding edge of technology will buy, and yet most people won't even "see" these features for many, many years! This is depressing.

Dot3 can be done with a GF1 card. It adds significantly to realism in a scene. Why not use it?

-M
Yes, it's a shame, but I think devs don't use DOT3 extensively because, in a real game, this kind of bump mapping makes the gameplay too slow on a GF1 card. Try "Evolva", "Sacrifice" or "GP4" and see what happens.
 
Nagorak said:
FWIW, Blizzard has probably made way more money in game sales than id, just because their games will run on anything. But not only that: despite the fact that they don't "push the envelope", they also have developers who know what they are doing. Blizzard will never release a game that's a buggy PoS (NWN, for example), and they'll take as long as necessary to guarantee that.

Very true, but part of that luxury is the joy of positive feedback. Making a mountain of cash is the best way to guarantee you can spend plenty of time on your next game.

Blizzard is a MUCH higher-cost operation than id, who still have only a handful of employees. Just go look at the credits on any Blizzard game. The core point is the key one: the best way to make that cash is to target the general gamer.

That said, nowadays Blizzard do push the envelope just a little - for 6-player Diablo 2 the K6-2 w/TNT in our office definitely wasn't up to it; and for Warcraft III I'm not sure how much fun you'd have with their minimum spec (P2-400 and TNT/Savage4/i810).

I think that's the way to go: it encourages people to upgrade towards the 'recommended' spec (which is any 32MB card, i.e. DX7-class) without castrating your sales at the same time.
 
Nagorak said:
Unfortunately 2D games like 'The Sims' sell insanely well because they can run on a 400 MHz Celeron. :(

I don't get it... why is it unfortunate?
It's totally unimportant whether a game is 2D or 3D... a game has to be fun. Everything else is irrelevant.

To be totally honest, my all-time favorite games are mostly 2D games.
 
mat said:
I don't get it... why is it unfortunate?
It's totally unimportant whether a game is 2D or 3D... a game has to be fun. Everything else is irrelevant.

To be totally honest, my all-time favorite games are mostly 2D games.

You honestly feel The Sims is "fun"? That's why it's unfortunate. It'd be one thing if the lack of graphics was made up for by great gameplay, but that hardly seems to be the case. All IMO, obviously. ;)
 
Complete hardware support of DX7, yes, that's what I was trying to get at.

In that case no chip has ever been DX7 spec; the Radeon came closest before the DX8-level chips.

Not even GF1 and GF2 are DX7 then, because they don't support EMBM.

True, but there has been a clear move away from EMBM (in fact it never really 'caught on' in the first place) by the major players towards dot3 blending in general, which IS supported by Radeon/GeForce-class chips.

But whether it's popular or not isn't really important. Dot3 and cube mapping aren't exactly popular yet either, and when they are used to any real degree they'll be used as part of the DX8 API (Doom 3, Unreal 2), not DX7. EMBM is still a feature in DX7, and GeForce 1 and 2 don't support it. Actually, if we go by your definition, GeForce 1 and 2 don't even support the DX6 spec, never mind DX7, because EMBM was introduced in DX6.

What I'm really trying to get at is that IMO you don't have to support all the features of an API, like DX7 for instance, to be a DX7-spec chip.
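And that's how it tends to work in practice: a DX7 title usually checks the individual caps bits it cares about at startup rather than assuming "DX7 card = every DX7 feature in hardware". A rough sketch of such a check, again assuming a D3D7-style device (the function name is made up; the caps structure and flags are the standard DX SDK ones):

Code:
#include <d3d.h>

// Query which DX7-era bump mapping ops the chip actually exposes in hardware,
// instead of assuming full coverage of the API.
bool QueryBumpMappingCaps(LPDIRECT3DDEVICE7 dev, bool* hasDot3, bool* hasEmbm)
{
    D3DDEVICEDESC7 caps;
    if (FAILED(dev->GetCaps(&caps)))
        return false;

    *hasDot3 = (caps.dwTextureOpCaps & D3DTEXOPCAPS_DOTPRODUCT3) != 0;
    *hasEmbm = (caps.dwTextureOpCaps & D3DTEXOPCAPS_BUMPENVMAP)  != 0;

    // A GeForce 1/2 reports dot3 but not EMBM; a Radeon (or a Matrox G400)
    // reports both. The game then picks its rendering path accordingly.
    return true;
}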
 
Murakami said:
Mr. Blue said:
This poses another question, however. If the gaming companies are lagging behind, then why accelerate the 3D technology as fast as it's going? It seems that only people who want the bleeding edge of technology will buy, and yet most people won't even "see" these features for many, many years! This is depressing.

Dot3 can be done with a GF1 card. It adds significantly to realism in a scene. Why not use it?

-M
Yes, it's a shame, but I think devs don't use DOT3 extensively because, in a real game, this kind of bump mapping makes the gameplay too slow on a GF1 card. Try "Evolva", "Sacrifice" or "GP4" and see what happens.
Is it true that on GeForce2 hardware the CPU has to calculate the shift for the DOT3 bump map, while on GeForce3 this task is handled by the GPU? (In Evolva, on GeForce1/2, DOT3 bump mapping loads the CPU by up to 50%.) Is this the reason behind the sporadic use of this feature in today's games? And do any of you know what the 3 different modes of bump mapping available in Evolva after applying the patch are? Are they all different DOT3 modes, or are some of them embossed? Thanks all.
 
A lot of you guys on this forum are so immersed in your DX9 world that you tend to be a little out of touch with the mindset of the average gamer.

Let's put it this way - the average gamer is nothing like any of us on this forum. The average gamer has never played Quake or Unreal, may or may not have Starcraft (but is extremely bad at it!), and plays "The Sims" and "NeoPets" as their primary games. They might also have some "Big Game Hunter" and "Championship Bass Fishing" games. The average gamer is completely clueless about tech. They might be a middle-school child, or a middle-aged middle-manager, but they can hardly tell the difference between PC and Mac, let alone even THINK about opening up that scary computer case and messing around with the Magical Silicon Components within.

The average gamer does not know what a videocard is. If you ask them, they will say "how do you know that?", and you will tell them to go to their display options, and they will ask you "So, is a TNT2 M64 good?". Occasionally you will spot someone complaining on a Warcraft 3 forum that their computer runs War3 very poorly, and why the fuck is that, it's a "fast" 1 GHz Celeron, while their friend's "slow" 700 MHz Athlon (with a GeForce3) runs at much higher resolutions no problem. These people are much more tech-savvy than the average gamer. The average gamer has never visited a gaming forum, and he wouldn't know how many MHz his computer has anyway. In fact, the average gamer would never even figure out that his computer is running Warcraft 3 poorly.

The average gamer can't tell the difference between 30 fps and 100 fps, and he also doesn't know that he can run games at resolutions higher than 640*480. I know several people IRL who claim that their 32 MB GeForce2 MX cards are quite powerful enough, because they can run every game they have at "very good-looking settings" and still be perfectly smooth. By this, they mean 640*480 with high graphics settings. The sad thing is that their GF2 MXs are a lot better than what some other people have, who are running with integrated Rage or TNT2 graphics (or even -gasp- Intel!).

We here tend to regard anything of GeForce2MX/Radeon7000 class as being totally inferior, the bottom of the pile, etc. To the average gamer, this is a very powerful videocard. No wonder the average gamer can't tell the difference between a GeForce4MX and a GeForce4Ti - they are both vastly more powerful than their current videocards, and at 640*480 there isn't a difference between them anyways!

Blizzard got a lot of complaints that Warcraft III's minimum system requirements were exorbitant, or that it runs too slowly on low-end computers. A lot of casual gamers regard Warcraft 3's graphics as the most advanced of any computer game they have (not a difficult task, considering it's being compared to The Sims...). Yet War3's graphics engine doesn't do anything that wasn't around in Quake 2 days. Does anyone really believe a DX8 game would be viable?

In any case, the technological lag between high-end computers and "average gamer" computers is really not as bad as you think. The major reason behind the lag is just that computer graphics are advancing at such a fast rate that the casual user cannot stay on the cutting edge. All things considered, the average gamer's computer today is still better than the highest-end computers of 5 years ago. 5 years from now, even "The Sims" should have "Unreal Tournament 2003"-level graphics.
 
Just to give you an idea of what the average gamer has: when I was at the UT2003 Mod Summit a month or so ago, before Mark Rein handed out GF4/8500 cards to the group, he asked what people were currently using. Out of 24 of us, there were only 3 with GF4/8500-class cards and 5 with GF3-class; the rest were on GF2-class hardware. And these were team members from UT's top mods (SF, INF, SAS, Chaos, Gods, Jailbreak, UF, Tac Ops, etc.). So even the people making the top mods are mostly on GF2-class hardware. I still have one team member using a TNT2 :eek:
 
Sounds like the video card designers themselves should release a game with their new technology. If the R300 shipped with a game that used its technology, it would definitely raise the standard of what a game could be like. Once people become aware of what a feature really means, especially if it is superior technology, they will want it much more and will be willing to pay for it. I am a firm believer that Nvidia, ATI or whoever should maintain an updated game engine for developers to use, modify and experiment with for each generation of cards they produce. I think that would help solve this problem. A lot of time, research and effort goes into 3D card designs that end up never being used.
 
noko said:
A lot of time, research and effort goes into 3D card designs that end up never being used.
I agree. JC said he will use DX9-level cards as the base level for his next-generation engine, so the DX8-level technology will not be used by id Software.
 
Simon F said:
Murakami said:
Is it true that on GeForce2 hardware the CPU has to calculate the shift for the DOT3 bump map,
Shift? I think you might be confusing dot product bump mapping with embossing. The latter is what you use if you don't have anything else :)
(Bump map comparison page (or "mirror"))
Sorry for the mistake. My thought was, speaking about DOT3: "Each time a triangle of the bump mapped object is moved or rotated the normals of the normal map need to be transformed again, because their direction changed in relation to the light source or the view port. This transforming of the normal map used to be done in software and thus by the CPU, which made the few games that supported dot3 bump mapping (as e.g. Evolva) rather slow once dot3 bump mapping was enabled.
The Vertex Shader is able to compute the vectors required for the transform of the normal maps, using the so-called 'texture space'. This procedure is also called 'per-vertex dot3-setup'. Further down in the 3D-pipeline the Pixel Shader of GeForce3 transforms the normal map by the vectors supplied by the Vertex Shader to create the normals that are required to display the bump mapping effect. With GeForce3 this whole procedure does not require the CPU, which makes dot3 bump mapping an effect that can be used without a major impact on game performance." from Tom's Hardware.
What do you say? And what about the bump mapping in Evolva? Thanks all.
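To make that concrete, the per-vertex work Tom's is describing looks roughly like this (a sketch of the usual approach, not Evolva's actual code; the vertex layout and function name are made up for illustration). On GF1/GF2-class cards this runs on the CPU for every vertex, every frame; a GeForce3 vertex shader performs the same three dot products on the GPU instead.

Code:
#include <cmath>
#include <cstdint>

struct Vec3 { float x, y, z; };

// Hypothetical vertex layout: position plus a per-vertex "texture space" basis.
struct Vertex {
    Vec3     pos;
    Vec3     tangent, binormal, normal;  // assumed unit length
    uint32_t diffuse;                    // packed tangent-space light vector
};

static float Dot(const Vec3& a, const Vec3& b)
{
    return a.x * b.x + a.y * b.y + a.z * b.z;
}

void PackTangentSpaceLight(Vertex* verts, int count, const Vec3& lightPos)
{
    for (int i = 0; i < count; ++i) {
        Vertex& v = verts[i];

        // Light direction from the vertex, in object space.
        Vec3 L = { lightPos.x - v.pos.x, lightPos.y - v.pos.y, lightPos.z - v.pos.z };
        float len = std::sqrt(Dot(L, L));
        if (len > 0.0f) { L.x /= len; L.y /= len; L.z /= len; }

        // Rotate the light vector into the vertex's tangent space
        // (the "per-vertex dot3 setup").
        Vec3 T = { Dot(L, v.tangent), Dot(L, v.binormal), Dot(L, v.normal) };

        // Range-compress from [-1,1] to [0,255] and pack into the diffuse
        // color, which the D3DTOP_DOTPRODUCT3 stage dots with the normal map.
        uint32_t r = uint32_t((T.x * 0.5f + 0.5f) * 255.0f);
        uint32_t g = uint32_t((T.y * 0.5f + 0.5f) * 255.0f);
        uint32_t b = uint32_t((T.z * 0.5f + 0.5f) * 255.0f);
        v.diffuse = 0xFF000000u | (r << 16) | (g << 8) | b;
    }
}

Doing that for tens of thousands of vertices every frame is where the sort of CPU overhead Evolva shows comes from.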
 