GPU tech demo and comparable games

Simon82

Newcomer
Hi,

looking at the various tech demos each company releases when presenting a new GPU, and at the solid 30fps they hit with graphics far beyond other titles that need much more power to render lower IQ in the end, don't you feel a little bad? :smile:
I know that a real game has a lot of real computing work beyond rendering the 3D image (all the CPU tasks for physics, AI, and everything involved in actually letting you interact with the game, etc.), but... damn... even without reopening the discussion about a simple, realistically globally illuminated scene in a plain cube with a "simple" light, tech demos like Ruby or the one on the street with the taxi reach a complexity far beyond other modern games.

Sure, the fact that you just sit down and watch lets them use some clever tricks to give you the sense of "cinematic graphics", but some scenes look so good that other games, even in their unskippable third-person cinematic scenes, can't match them.

What do you think? Should ATI and Nvidia make a little game, so for once we can see whether their cards are worth the money they cost? :p
 
Sure, game studios could put something as amazing as these demos into their games, if they want to spend more than five years making one game.
 
Sure, game studios could put something as amazing as these demos into their games, if they want to spend more than five years making one game.

If they used controlled camera angles, wouldn't it be possible (financially/time-wise) to achieve? (e.g. Need for Speed 1 on PS1 & 3DO, or Crash Bandicoot on PS1)

I would not mind a limited viewpoint in certain genres to achieve much higher visual fidelity.
 
The Two Towers and The Return of the King used controlled camera angles like you describe. While the games looked good, they weren't better than other high-quality games out at the time.
 
Hi,

looking at the various tech demos each company releases when presenting a new GPU, and at the solid 30fps they hit with graphics far beyond other titles that need much more power to render lower IQ in the end, don't you feel a little bad? :smile:
I know that a real game has a lot of real computing work beyond rendering the 3D image (all the CPU tasks for physics, AI, and everything involved in actually letting you interact with the game, etc.), but... damn... even without reopening the discussion about a simple, realistically globally illuminated scene in a plain cube with a "simple" light, tech demos like Ruby or the one on the street with the taxi reach a complexity far beyond other modern games.

Sure, the fact that you just sit down and watch lets them use some clever tricks to give you the sense of "cinematic graphics", but some scenes look so good that other games, even in their unskippable third-person cinematic scenes, can't match them.

What do you think? Should ATI and Nvidia make a little game, so for once we can see whether their cards are worth the money they cost? :p

Problem is it will never happen..

These tech demos are done by teams of dedicated engineers who are very familiar with their own h/w and can push it to the limits without worrying about compatibility issues.. After all, you're going to own a GeForce 8 if you're running a GeForce 8 tech demo, right?

Thing is, with PC games, developers have to cater for the lowest common denominator AND are restricted to using graphics computations/tricks that aren't dependent on one specific hardware implementation..
That means that even with the most powerful graphics card on the market, game developers will never push it to the limits of what it's capable of, because doing so would restrict compatibility and shrink the target audience (e.g. only people with a specific gfx card could play the game)..
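
To give a rough idea of what that "lowest common denominator" coding looks like in practice, here's a hypothetical C++ sketch (not taken from any real engine): the renderer asks the driver what it supports and quietly drops to a plainer technique when a feature is missing.

```cpp
// Hypothetical sketch: pick a rendering path the installed GPU can actually
// run, assuming an OpenGL context is already current. Names are illustrative.
#include <cstring>
#include <GL/gl.h>

// True if the driver advertises the given extension. (Simplified: a robust
// check would tokenise the extension string instead of a substring search.)
bool HasExtension(const char* name)
{
    const char* all = reinterpret_cast<const char*>(glGetString(GL_EXTENSIONS));
    return all != nullptr && std::strstr(all, name) != nullptr;
}

enum class ShadowPath { HardwareShadowMaps, ProjectedBlobs };

// Prefer hardware shadow mapping where the extension exists; otherwise fall
// back to the blob shadows that practically every card can manage.
ShadowPath PickShadowPath()
{
    if (HasExtension("GL_ARB_shadow"))
        return ShadowPath::HardwareShadowMaps;
    return ShadowPath::ProjectedBlobs;
}
```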

Besides, the PC games market is small enough without developers restricting their target audience further with such things..
 
The realtime tech demos that were done for the PS2, shown in March 1999 and at E3 1999, were rivaled and surpassed by actual gameplay graphics in PS2 games.
 
Problem is it will never happen..

These tech demos are done by teams of dedicated engineers who are very familiar with their own h/w and can push it to the limits without worrying about compatibility issues.. After all, you're going to own a GeForce 8 if you're running a GeForce 8 tech demo, right?

Thing is, with PC games, developers have to cater for the lowest common denominator AND are restricted to using graphics computations/tricks that aren't dependent on one specific hardware implementation..
That means that even with the most powerful graphics card on the market, game developers will never push it to the limits of what it's capable of, because doing so would restrict compatibility and shrink the target audience (e.g. only people with a specific gfx card could play the game)..

Besides, the PC games market is small enough without developers restricting their target audience further with such things..
Well, but you're not considering one thing. To develop a tech demo, ATI or Nvidia engineers must (I think) still build on DirectX or OpenGL, so the demo (as some cool hacks have demonstrated) could "theoretically" run on other compatible hardware, maybe minus a few proprietary API calls. That makes me think that game developers, even with many more tasks to handle, could find similarly smart ways of optimizing a game to reach the same level of quality.
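
Just to illustrate what I mean (a hypothetical sketch, I don't know how the actual demos are written): even code that sticks to the public Direct3D 9 API can probe the installed card's caps and choose a path, instead of assuming one vendor's hardware.

```cpp
// Hypothetical Direct3D 9 sketch: a demo built on the public API can still ask
// the driver what the installed card supports rather than assuming one GPU.
#include <d3d9.h>

// True if the default adapter exposes Shader Model 3.0 pixel shaders, which a
// demo might need for its long single-pass lighting shader; otherwise it could
// fall back to a slower multi-pass version that runs on more hardware.
bool SupportsSM3(IDirect3D9* d3d)
{
    D3DCAPS9 caps = {};
    if (FAILED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps)))
        return false;
    return caps.PixelShaderVersion >= D3DPS_VERSION(3, 0);
}
```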
 
The realtime tech demos that were done for the PS2, shown in March 1999 and at E3 1999, were rivaled and surpassed by actual gameplay graphics in PS2 games.

This is a good point.. The PS2 has been around for so many years that it has probably been explored more deeply by those developers than by Sony's own people.
 
Well, but you're not considering one thing. To develop a tech demo, ATI or Nvidia engineers must (I think) still build on DirectX or OpenGL, so the demo (as some cool hacks have demonstrated) could "theoretically" run on other compatible hardware, maybe minus a few proprietary API calls. That makes me think that game developers, even with many more tasks to handle, could find similarly smart ways of optimizing a game to reach the same level of quality.

They could, but this is where the politics make it unfair..

Why should PC game developers waste their time delving into the deeper innards of the GFX hardware, learning and mastering the proprietary tech to the fullest, when chances are only a small percentage of their market will ever get to see the fruits of their labours..?

Then there's the fact that features of an ATI card won't transfer to an NVidia card of around the same performance rating.. In some cases such features may not even make it into the successor of the card from the same manufacturer. And since the game has to be able to scale across such a vast range of hardware configs, many of which are probably unknown to the developer, it seems like an incredible amount of work that could never truly be exhaustive and would never really benefit more than a small percentage of the customer base..
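
In practice the most a PC developer can realistically do is bucket whatever hardware it finds into a few quality tiers, something like this hypothetical sketch (not any particular engine), which is a long way from hand-tuning for one card the way a vendor demo can:

```cpp
// Hypothetical sketch of scaling across unknown hardware: instead of coding to
// one card's feature set, the engine reduces whatever it detects to a handful
// of quality tiers, and every tier has to look acceptable.
#include <cstdint>

struct GpuInfo {
    uint32_t vramMiB;           // reported video memory, in MiB
    uint32_t pixelShaderMajor;  // highest supported pixel shader model
};

enum class QualityTier { Low, Medium, High };

QualityTier PickTier(const GpuInfo& gpu)
{
    if (gpu.pixelShaderMajor >= 3 && gpu.vramMiB >= 512)
        return QualityTier::High;    // full shader path, high-res textures
    if (gpu.pixelShaderMajor >= 2 && gpu.vramMiB >= 256)
        return QualityTier::Medium;  // trimmed effects, smaller textures
    return QualityTier::Low;         // fixed-function-era fallback
}
```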
 