Why are games so far behind 3D technology?

All:

It's good to post here again after such a long break from gaming. I've heard talk from one of our rendering-software customers about experimenting with this new next-gen hardware to help render scenes for our movies.

Anyway, I became interested in this and would like to know why today's games still haven't made the basic new features of the 2nd- and 3rd-generation boards mainstream. Dot3 has been out for ages, and yet no one seems to adopt it on a consistent basis. 'The Thing' and 'DOOM3' seem to be the only interesting games to support it.

Almost every game that comes out (be it PS2, Xbox, GameCube or PC) still uses basic 2-pass multitexturing, low polygon counts (well, PS2 games are somewhat higher) and a basic Phong lighting model for shading.
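(For anyone who wants the concrete version: by "basic Phong" I mean a diffuse N.L term plus a specular (R.V)^n term, as in this toy sketch; the coefficients are purely illustrative, not any particular game's values.)

```python
import math

def normalize(v):
    m = math.sqrt(sum(c * c for c in v))
    return [c / m for c in v]

def phong(n, l, v, kd=1.0, ks=0.5, shininess=32):
    # Classic fixed-function Phong (ambient term omitted for brevity).
    # n = surface normal, l = direction to light, v = direction to viewer.
    n, l, v = normalize(n), normalize(l), normalize(v)
    ndotl = max(0.0, sum(a * b for a, b in zip(n, l)))
    # Reflect l about n: R = 2(N.L)N - L
    r = [2.0 * ndotl * nc - lc for nc, lc in zip(n, l)]
    rdotv = max(0.0, sum(a * b for a, b in zip(r, v)))
    # Kill the specular term on back-facing light, a classic gotcha.
    spec = ks * (rdotv ** shininess) if ndotl > 0.0 else 0.0
    return kd * ndotl + spec
```

With light and viewer head-on along the normal this gives kd + ks; with the light behind the surface it gives zero.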

Why the lag?

Soon, the film companies will be using these features through Cg and will be taking advantage of the most advanced features right away.

-M
 
Two reasons:
1. Long development cycles for games (1.5 to 5 or 6 years, and some seem to take forever :LOL:).
2. Installed base: game developers mainly want their games to sell, so they aim for the masses, and sadly the masses aren't even at GF2 Ultra or GF4 MX class; they're more like GF1 or GF2 MX.
Hope it helps :D
 
Hi Mr. Blue

The film companies don't have to worry about an installed base, and they have much larger budgets.
 
Because games on the bleeding edge of technology generally don't sell well at retail. They only sell in the bundle market.

Some companies made a big success of this a few years back (Rage, particularly, with Incoming and Expendable being two of the biggest sellers in the world if you count numbers of units shipped) but that market is vastly smaller now. It's a long way from when you couldn't buy a video card without MechWarrior 2 being included with it.

Carmack has a big enough name that he can sell Doom3 next year, but even that is minimum-specced at the GeForce and Radeon (i.e. the DX7 class hardware, not the DX8), which are how many years old now? Even then I'd be a bit surprised if Doom3 makes the top ten list next year...

I think the statements about the consoles are somewhat inaccurate. PS2 in particular seems to be slipping behind to me, as you'd expect from its relatively limited specification (and/or the difficulty of extracting the best from it) next to the Gamecube and XBox. The latter seems well ahead (in technology terms) - as a fixed DX8 platform the games developers can target it better.
 
This poses another question, however. If the gaming companies are lagging behind, then why accelerate 3D technology as fast as it's going? It seems that only people who want the bleeding edge of technology will buy, and yet most people won't even "see" these features for many, many years! This is depressing.

Dot3 can be done with a GF1 card. It adds significantly to realism in a scene. Why not use it?
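(And it really is a cheap feature: Dot3 just means the combiner takes a per-pixel dot product between a normal stored in a texture and the light vector. A Python sketch of what that one hardware stage computes, not real driver code:)

```python
def expand(c):
    # Undo the [0, 1] color encoding of a normal map texel:
    # n = 2*(c - 0.5), giving a component in [-1, 1].
    return 2.0 * (c - 0.5)

def dot3(normal_texel, light_dir):
    # normal_texel: (r, g, b) sample from a tangent-space normal map.
    # light_dir: unit light vector (the hardware receives it as an
    # interpolated vertex color, also range-compressed).
    n = [expand(c) for c in normal_texel]
    d = sum(nc * lc for nc, lc in zip(n, light_dir))
    return max(0.0, d)  # clamp negatives, as the combiner stage does
```

A flat texel of (0.5, 0.5, 1.0) decodes to the straight-up normal (0, 0, 1), so a light along that axis gives full intensity.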

-M
 
Ironically the gaming industry is bigger in terms of money than the film industry, but there are also a lot more developers.
 
If the gaming companies are lagging behind, then why accelerate 3D technology as fast as it's going? It seems that only people who want the bleeding edge of technology will buy, and yet most people won't even "see" these features for many, many years!

It's due to the catch-22 that the bleeding edge is always necessary for technological advancement. HDTV has been scraping along for years with a minuscule user base and very few high-definition shows, but without that "scraping along" phase HDTV would never have come to exist at all.

Likewise, even though no one uses DX8 features until DX9 cards are out, that's only because the developers have to wait until enough people have DX8 cards. If videocard makers tried to wait until developers were making DX8 games before putting out DX8 cards, developers would never make DX8 games.

Developers have long since learned their lesson with regard to graphical excess. Games like "Giants" didn't sell, because the vast, overwhelming majority of people didn't have GeForce-level graphics; they still had their Voodoo2s and generic "PCI Video Cards". On the other hand, games like "The Sims", with butt-ugly graphics and no real gameplay to speak of, sell countless gazillions of copies, because they are accessible to everybody. Only John Carmack is willing to push the graphical envelope anymore...
 
I believe that well-known/influential game developers like id and Epic have the power to accelerate the adoption of new graphics technologies. I just wish they'd exercise that power more.

Everyone makes the assumption that gamers only buy games that their hardware can handle; I argue that if the game is good enough, gamers will buy the hardware just so they can play the game. E.g. what if id were to target Doom3 at GeForce3/Radeon 8500 as a *minimum*: require hardware T&L and DX8 pixel/vertex shaders and use them extensively, with no 'NV10 codepath' (which Carmack has mentioned in his .plan). Then deliver a game that is so groundbreaking, so awe-inspiring, so freakin' unbelievable, that every gamer just has to own it. This is the sort of scenario that I believe would cause a mass migration to upgrade video cards.

Why have PC/hardware sales been stagnating lately? Why is the whole industry hurting? It's the lack of a new "killer app" that requires users to move up to the next stage. If we keep catering to Joe Average with his Celeron 700 and TNT2, dumbing down our latest and greatest games so that we don't disappoint him, the whole industry is headed nowhere.

If I were the CEO of Nvidia/Intel/ATI/AMD/VIA, watching my sales slow to a trickle simply because there's no compelling reason for anyone to upgrade their PC, I'd be flying over to Mesquite, Texas to talk to Mr. John Carmack. I'd offer him incentives to drop all pre-DX8 support from Doom3. I'd give the man a new Ferrari to put in realistic AI and physics that require at least a 1 GHz P3/Athlon.

It would be a ballsy move for id, but one which I think would pay off for everyone. Doom3 is the most hyped game in history, everyone knows about it. If Doom3 isn't the killer app the industry has been waiting for, I don't know what possibly could be. If the game is good enough, people will upgrade to play it. It would give the whole hardware industry a much needed kick in the @ss. Then all game developers could take advantage of the increased hardware baseline.

Come on, id! Show a little leadership.
 
SteveG said:
It would be a ballsy move for id, but one which I think would pay off for everyone. Doom3 is the most hyped game in history, everyone knows about it. If Doom3 isn't the killer app the industry has been waiting for, I don't know what possibly could be. If the game is good enough, people will upgrade to play it. It would give the whole hardware industry a much needed kick in the @ss. Then all game developers could take advantage of the increased hardware baseline.

Come on, id! Show a little leadership.

If Doom3 is as good a game as the "killer app" you're craving, then people won't be satisfied playing it on their Celerons and TNT2s anyway.
 
Ichneumon said:
If Doom3 is as good a game as the "killer-app" you're craving, then people will not be satisfied with their celerons and TNT2s to play it anyway.

Ok, maybe Doom3 won't be playable on a TNT2, but it is targeted at the original GeForce (NV10); Carmack has mentioned as much in his .plan. I think GeForce and GeForce2 are going way too far back. Cut support off at the GeForce3.
 
Question...

What efforts are being made in the direction of more scalable graphics engines? I mean, one that would minimize the effort needed to include both high- and low-poly geometry, and that had a good material system with a representation of each material for, say, DX7 and DX9 classes of 3D card.
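(Nothing in principle stops an engine from doing this today. A toy sketch of such a tiered material lookup; the tier names and material descriptions are purely illustrative, not from any shipping engine:)

```python
# Each material carries one representation per hardware class, and the
# engine picks the richest one the installed card can actually run.
TIER_ORDER = ["DX6", "DX7", "DX8", "DX9"]  # worst to best

def pick_representation(material, card_tier):
    """Return the best representation the card supports, falling back
    to lower tiers when a high-end one isn't available."""
    limit = TIER_ORDER.index(card_tier)
    usable = [t for t in material if TIER_ORDER.index(t) <= limit]
    if not usable:
        raise ValueError("no fallback representation for this card")
    return material[max(usable, key=TIER_ORDER.index)]

# A material authored with two representations: a DX7 fallback and a
# DX9 shader path. A DX8 card silently gets the DX7 version.
stone = {
    "DX7": "base texture + lightmap",
    "DX9": "per-pixel dot3 + gloss via pixel shader",
}
```

The artist-side cost is still real (two sets of assets per material), which is presumably why few engines bother.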
 
SteveG said:
Cut support off at the GeForce3.

Why would you want to cut off 95% of your customers?
If you release a DX8-only game, no one will be able to play it (well, except some freaks who upgrade their PC every 5 sec ;))
 
Almost every game that comes out (be it PS2, Xbox, GameCube or PC) still uses basic 2-pass multitexturing, low polygon counts (well, PS2 games are somewhat higher) and a basic Phong lighting model for shading.

I agree with you in that the massive majority of PC games seem to use very basic texturing effects and very low polygon counts (30-40,000 polys per frame). Even the exceptions to that, like Morrowind, still use only around 100,000 polys per frame at best. But you're way off with those comments on consoles. You're wrong in thinking that:

1: Console games in general on the big three consoles are anywhere near as low poly as PC games in general.

2: Console games in general on the big three consoles are only using basic texturing like most PC games.

3: PS2 is the console pushing better texturing or higher polygon counts.

Xbox has quite a few games with extensive use of bump mapping, for instance DOA3 and Halo, and a few of its games are up in the 200,000 polys-per-frame range, maybe one or two even higher, and that should increase by a lot soon. GameCube also has a few games that use bump mapping quite well, and one in particular that uses it extensively (high-quality bump mapping basically everywhere!): Rogue Leader. The same game is pushing around 250-300,000 polys per frame at times, with between 5 and 8 texture layers, and I'm sure that future GameCube games will have even higher polygon counts. PS2 as of yet hasn't shown real bump mapping. Its games are quite high poly, though; its best-looking games maybe even come close to Xbox and GameCube's best-looking games (in polycounts, that is, not texturing).
 
Mr. Blue said:
This poses another question, however. If the gaming companies are lagging behind, then why accelerate 3D technology as fast as it's going? It seems that only people who want the bleeding edge of technology will buy, and yet most people won't even "see" these features for many, many years! This is depressing.
Gaming companies can't help how fast 3D companies innovate. Conversely, 3D companies can't help how fast gaming companies adopt the latest 3D innovations.

That came out sounding a bit of a "Duh!", huh? :)

Dot3 can be done with a GF1 card. It adds significantly to realism in a scene. Why not use it?
You're talking about released games, right? Well, there are still Voodoo3s and Voodoo5s out there :)

It's a matter of risk-taking and priorities by gaming companies, as simple as that. I believe Epic and id are thinking of the GeForce2 as the lowest, widest base they'll consider for their next games. Tim Sweeney's post-UT2003/Unreal2 engine will concentrate on key DX9 areas, though, as an example.
 
SteveG said:
I believe that well-known/influential game developers like id and Epic have the power to accelerate the adoption of new graphics technologies. I just wish they'd exercise that power more.

Everyone makes the assumption that gamers only buy games that their hardware can handle; I argue that if the game is good enough, gamers will buy the hardware just so they can play the game. E.g. what if id were to target Doom3 at GeForce3/Radeon 8500 as a *minimum*: require hardware T&L and DX8 pixel/vertex shaders and use them extensively, with no 'NV10 codepath' (which Carmack has mentioned in his .plan). Then deliver a game that is so groundbreaking, so awe-inspiring, so freakin' unbelievable, that every gamer just has to own it. This is the sort of scenario that I believe would cause a mass migration to upgrade video cards.

Why have PC/hardware sales been stagnating lately? Why is the whole industry hurting? It's the lack of a new "killer app" that requires users to move up to the next stage. If we keep catering to Joe Average with his Celeron 700 and TNT2, dumbing down our latest and greatest games so that we don't disappoint him, the whole industry is headed nowhere.

If I were the CEO of Nvidia/Intel/ATI/AMD/VIA, watching my sales slow to a trickle simply because there's no compelling reason for anyone to upgrade their PC, I'd be flying over to Mesquite, Texas to talk to Mr. John Carmack. I'd offer him incentives to drop all pre-DX8 support from Doom3. I'd give the man a new Ferrari to put in realistic AI and physics that require at least a 1 GHz P3/Athlon.

It would be a ballsy move for id, but one which I think would pay off for everyone. Doom3 is the most hyped game in history, everyone knows about it. If Doom3 isn't the killer app the industry has been waiting for, I don't know what possibly could be. If the game is good enough, people will upgrade to play it. It would give the whole hardware industry a much needed kick in the @ss. Then all game developers could take advantage of the increased hardware baseline.

Come on, id! Show a little leadership.

I'm sorry, but you people who think this are just living in lala land. I guess it's easy to make stupid suggestions when you're not the one standing to lose a lot of money on it. :rolleyes:

id and Epic have to worry about their games selling just like everyone else, maybe more so because of their engines. On the other hand, since they are also selling engines, they can afford (and have to) push the envelope a little bit more. In a way, Doom3 and UT2003 will just be big tech demos of their technology. Unfortunately, if they reach too far they suffer a double whammy: first, they make little money on the game itself, and second, no one will be interested in licensing their engines for their own games (so the engine business makes no money either).

Saying that people will upgrade to play these games if they're good enough is just ridiculous. If you honestly believe that, then you're totally out of touch with the general public. Those who are into computers will usually be upgraded to the point where these games run well, or will upgrade so they do. Most people, however, will not upgrade their machines to play these games. They'll still buy them even if they run crappily on their machines. And while we'll never see these people playing online in any way that could be described as competitive, they pay just as much for the game as hardcore gamers (maybe more on average, since none of them have the know-how to pirate the game, etc.). There are also a lot more of these "lamers" than you'd believe, and they probably account for most of the money Epic and id make.

So if you think id and Epic can just throw caution to the wind and program a completely cutting edge game, you're just smoking crack.

By the way, if the CEO of Nvidia flew over to Mesquite, Texas and offered John Carmack a Ferrari to drop all pre-DX8 support, he'd probably laugh in his face. He'd stand to lose far more money by doing that than he'd gain by accepting the bribe. The only other option would be for Nvidia to actually buy id up, and I have a feeling they are not at all interested in being bought. After all, they've been very successful by doing smart things like supporting a wide range of hardware.
 
Conversely, 3D companies can't help how fast gaming companies adopt the latest 3D innovations

I think they can. They could, for example, skip the stupid GeForce4 MX and Radeon 7500 (those two cards sell a lot, unfortunately) and do low-cost DX8 parts only.

ATi is going to release the R9000, which is at least a DX8 part (too late, the damage has been done, though); nVidia is probably going to release DX7 AGP 8X cards again. So yes, IMO the 3D companies play a big role in the stall of the PC game industry.
 
I agree with you in that the massive majority of PC games seem to use very basic texturing effects and very low polygon counts (30-40,000 polys per frame). Even the exceptions to that, like Morrowind, still use only around 100,000 polys per frame at best.

I think you're too optimistic: 30-40,000 polys per frame would be great for a PC game!
It's more like 5-10,000 polys per frame, unfortunately.
With the game Shogo, you had the option to display the number of polys drawn per frame. It never exceeded 2,000 polys :eek:
The max I had was 30,000 polys/s on a P2 300 with a TNT1 @ 800x600x16bit.
Developers tend to greatly exaggerate the number of polys/s.

Does someone here know of games where you can display the number of polys drawn?
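(Side note: polys per frame and polys per second differ by a factor of the frame rate, which makes these quoted numbers easy to mix up. The conversion, trivially:)

```python
def to_polys_per_second(per_frame, fps):
    # Throughput = scene complexity x frame rate.
    return per_frame * fps

def to_polys_per_frame(per_second, fps):
    # Inverse: how complex a scene a given throughput sustains.
    return per_second / fps
```

So a quoted "30,000 polys/s" at 30 fps is only 1,000 polys per frame, while 30,000 polys per frame at 60 fps would mean 1.8 million per second.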
 
Lessard said:
I think you're too optimistic: 30-40,000 polys per frame would be great for a PC game! It's more like 5-10,000 polys per frame, unfortunately.

No, there definitely are games with an average of around 30K polygons per frame, Aquanox and Comanche4 for example. Though we're still not at the 100K mark advertised as possible with the first T&L cards ;-)

Does someone here know of games where you can display the number of polys drawn?

Try using 3D-Analyze (http://www.tommti-systems.com/main-Dateien/files.html) with its frame + polygon counter.
 
Lessard said:
I agree with you in that the massive majority of PC games seem to use very basic texturing effects and very low polygon counts (30-40,000 polys per frame). Even the exceptions to that, like Morrowind, still use only around 100,000 polys per frame at best.

I think you're too optimistic: 30-40,000 polys per frame would be great for a PC game!
It's more like 5-10,000 polys per frame, unfortunately.
With the game Shogo, you had the option to display the number of polys drawn per frame. It never exceeded 2,000 polys :eek:
The max I had was 30,000 polys/s on a P2 300 with a TNT1 @ 800x600x16bit.
Developers tend to greatly exaggerate the number of polys/s.

Hell, Shogo is a pretty old game. Quake3 uses around 10K polygons per frame, sometimes higher, sometimes lower. GTA3 uses 30-60K polygons per frame.

Why are PC games' polycounts so low? I mean, it wouldn't take a monster gaming rig to match Xbox specs: a 1 GHz processor with a GF4 Ti4200 or R8500 should be as good as an Xbox.
 
There is also another problem: old cards with bad drivers. Every card that has a DX6-level driver will still be able to run under the HAL in DX8. But MS could raise the bar in DX9 if they required a DX7-level driver for the HAL. That would simply eliminate most of the old cards, since they would not be able to run under the HAL on DX9 interfaces...
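(For shader features the gating already works this way: the DX8 caps structure reports the supported pixel shader version as a packed DWORD, using the same encoding as the D3DPS_VERSION macro in d3d8.h. A Python sketch of that comparison; in real code you'd read the value out of D3DCAPS8 rather than construct it yourself:)

```python
def ps_version(major, minor):
    # Same packing as the D3DPS_VERSION macro in d3d8.h:
    # 0xFFFF0000 | (major << 8) | minor
    return 0xFFFF0000 | (major << 8) | minor

def supports(caps_ps_version, major, minor):
    # True if the driver reports at least pixel shader major.minor.
    return caps_ps_version >= ps_version(major, minor)
```

A card reporting ps 1.4 passes a 1.1 requirement; a ps 1.1 card fails a 1.4 requirement, and the app falls back or refuses to run.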
 