Imagine if NVidia coded a free S.O.T.A. game graphics engine

g__day

Imagine NVidia getting absolutely sick of waiting for software developers to code for their latest hardware, rather than for hardware that's two to three generations old.

Then some bright spark says: let's get an internal or external team of crack s/w developers to program a 3D game engine and release it free to game developers around the time NVidia releases their new cards.

That plus Cg might just take 6 - 12 months off everyone's game development cycle and boost NVidia's sales considerably.

Do any folk here imagine this could happen in the foreseeable future? How fed up do you believe NVidia is with waiting and waiting (like us) for games that can showcase their cards' full abilities?
 
But that's not the point...

You simply can't code your game around the bleeding edge of technology and make any money. It's a surefire way to kiss your A$$ goodbye before you even get started.

You have to code to the mainstream user and toss in a couple of fancy tidbits for the enthusiast market. Look at the games that sell millions. All of them will run on TNT2-level hardware. We are just now getting to the point where the GeForce/Radeon is the minimum standard.

Sure, a company *could* do what you are suggesting... but the time required to produce cutting-edge graphics/effects is at *least* 24 months. If only 100,000 people can even play the game, and only a portion of those will actually be interested in your specific type of game... your profit margin... well, there is no profit margin. You are operating in the hole just from writing the idea down on a piece of paper :)
 
Releasing a free, high-quality engine for modders to go wild with is a great idea, but current game developers are doing a pretty decent job of pushing current hardware. I tried MOH:AA at Best Buy the other day for the first time, on an Alienware 2.24GHz P4 (533 FSB) GF4Ti 4400, and it was choking at 1280x1024. Blame the game, blame the hardware--either way, there's room for improvement on both sides.

I'm all for higher-poly, fully-lit and -shadowed games, though. I don't care if they'd run at lower res, I still think they'd be more immersive experiences than less detailed 16x12 4xAA 8xAF games. I suppose we'll start seeing those in two years (when DX9-HW becomes standard fare).

Still, MOH:AA looked freakin' awesome. It's a nice jump from my ~30fps 8x6 CounterStrike gaming. :)
 
I think it's an interesting idea. Not sure about profitability, but if it convinced enough people to upgrade, it might break even eventually. Huge companies like ATI and nVidia could probably come up with a decent engine in, say, 6-8 months. It would have to be completely non-proprietary to really be accepted by developers and consumers, though. And to really be accepted by developers (who are looking to make a viable product anyway), the engine would have to be scalable down to mainstream hardware, which might remove some incentive to upgrade and thus any hope of it being a profitable undertaking. Hmmm...
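Something like this is roughly what I mean by "scalable" - just a sketch with made-up capability flags and names (not from any real SDK), showing how an engine could pick a render path all the way from TNT2/GF2-class cards up to DX9-class parts:

Code:
#include <cstdio>

// Hypothetical capability report - a real engine would query D3D caps bits or GL extensions.
struct GpuCaps {
    int  maxTextureUnits;    // e.g. 2 on a TNT2, 4 on a GeForce2/Radeon
    bool hasVertexShaders;   // DX8-class hardware (GeForce3/Radeon 8500) and up
    bool hasPixelShaders14;  // Radeon 8500 class
    bool hasPixelShaders20;  // DX9-class hardware
};

enum class RenderPath { FixedFunction, DX7Multitexture, DX8Shaders, DX9Shaders };

// Pick the fanciest path the card supports; everything below it acts as the fallback.
RenderPath choosePath(const GpuCaps& caps) {
    if (caps.hasPixelShaders20)                          return RenderPath::DX9Shaders;
    if (caps.hasVertexShaders || caps.hasPixelShaders14) return RenderPath::DX8Shaders;
    if (caps.maxTextureUnits >= 2)                       return RenderPath::DX7Multitexture;
    return RenderPath::FixedFunction;
}

int main() {
    GpuCaps gf2 = {4, false, false, false};  // roughly a GeForce2-class card
    std::printf("render path = %d\n", static_cast<int>(choosePath(gf2)));
    return 0;
}

The catch, as I said, is that the more gracefully it scales down, the less reason anyone has to upgrade.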
 
I don't think it'd be so much about profitability of the game itself, but as a selling point for the hardware.

I could easily imagine nVidia (or ATI...though nVidia is currently in a better financial position) funding, from start to finish, the development of a game. Unfortunately, that game would probably suck. But, it would almost certainly really show off the technology in a way no tech demo ever could.
 
Chalnoth said:
I don't think it'd be so much about profitability of the game itself, but as a selling point for the hardware.

I could easily imagine nVidia (or ATI...though nVidia is currently in a better financial position) funding, from start to finish, the development of a game. Unfortunately, that game would probably suck. But, it would almost certainly really show off the technology in a way no tech demo ever could.

No kidding; just because they could write a great-looking game doesn't mean it would actually be fun.
 
Remember that game they were using to pimp the original Geforce, "Experience"?

I know nVidia at least commissioned the DMZG portion of that game just to show off the hardware. Unfortunately, that game never came to fruition, for whatever reason.

Anyway, I think most GPU companies are targeting Doom 3 as the next big thing and the game that will finally use shaders in a way that's meaningful to gameplay. So long as they know that's coming down the pike in a year or so, I don't think they'll be in any position to come up with their own engine.

I think they've realized that traditional developer relations aren't working as far as getting devs to make use of their cards' advanced features, though. Witness Cg and RenderMonkey as proof of that. Let's just hope the HLSLs actually help developers, and aren't just fuel for f@nboy flame wars.
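Just to show what I mean about the HLSLs helping - a rough sketch below, where the shader source is Cg-flavored and compileForProfile() is a made-up stand-in for whatever the vendor runtime actually provides. The point is that one readable source can target whichever shader profile a card supports, instead of hand-writing each assembly variant:

Code:
#include <iostream>
#include <string>

// Illustrative Cg-style source embedded as a string: modulate a texture by the vertex color.
const std::string kModulateShader =
    "float4 main(float4 color : COLOR0,\n"
    "            float2 uv    : TEXCOORD0,\n"
    "            uniform sampler2D baseMap) : COLOR\n"
    "{\n"
    "    return tex2D(baseMap, uv) * color;\n"
    "}\n";

// Hypothetical stand-in: a real runtime would return a compiled program handle or errors.
bool compileForProfile(const std::string& src, const std::string& profile) {
    std::cout << "compiling " << src.size() << " bytes of shader source for "
              << profile << "\n";
    return true;
}

int main() {
    // Same source, several hardware profiles - that's the whole pitch of an HLSL.
    compileForProfile(kModulateShader, "ps_1_1");
    compileForProfile(kModulateShader, "ps_1_4");
    compileForProfile(kModulateShader, "ps_2_0");
    return 0;
}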
 
Yes, middleware companies (think the Unreal engine, the Doom engine, the Serious Sam engine, etc.) would really appreciate it if NVIDIA or anyone else stole their market share with a freebie.

ATI created the SUSHI engine, which was pretty close to a middleware engine of sorts; it was highly targeted at the hardware and was used to illustrate to developers how to create a sensible game engine from the hardware point of view. I am sure lots of developers would have loved the source code, but ATI never handed it out, possibly because it was in-house, non-cleaned-up code.

Writing a middleware engine is not just one task, it's ongoing development: people will want extra features, people will not figure out how a component works and will need help with it, there will be bugs, the engine won't be suitable for game type X or Y... etc. It would be a support nightmare, and all of this for free?

I think ATI, NVIDIA, etc. should concentrate on the companies that design middleware engines, support them to the max and convince them to put all the new stuff in. That sounds like a much better investment: it costs less time and less money, and it would not lead to questionable marketing like "Game developed using the NV Engine". Everyone will say: it's not going to run on ATI, or it will be suboptimal on ATI, etc... bad idea. And don't forget that writing a base engine is a lot of work; the new Unreal engine started from the old one, Carmack probably also re-used some of his old code, etc. Starting from scratch would probably mean several years before something sensible appears.

K~
 
A game engine is generally mistrusted unless it's from a tried-and-tested background or has at least one AAA title to validate the quality of the engine.
 
Alternatively, hardware vendors can offer direct support to the middleware vendors themselves. ATi told me at last year's ECTS that they had coded PS1.4 routines into the Lithtech engine themselves, and were pretty pumped about the state of NOLF2. I've not heard anything more about that since, so it's going to be interesting to see if NOLF2 will use PS1.4.
 
I'll just quote myself from another thread :p
I'm actually surprised that NV, for example, isn't licensing an engine already. They could hire some of their long-time partners with a respectable portfolio to do the job for them if all of their own developers are tied up (Crytek, Fun Labs, Vulpine??).

Vulpine Vision looks to be a killer product, btw. From a business perspective, I don't think NV would have much to lose with such a move, but the gains could be formidable.
Imagine how much more appealing the NV30 launch would be if even a semi-playable title with all the eye candy were launched at the same time. Hell, even a Tetris or whatever arcade title with innovative use of graphics features would do the trick.
I _don't_ want to see another Q3 bench-o-rama again.
 
Folks,

there is a lot more to a game engine than just the graphics: physics, AI, sound, ease of use, portability, and tools all have to be considered. How long is it going to take? You would need at least a year to make a good engine. Then tack on more time for the game development, as the two don't always coincide; after all, you can't make any maps until you have some of the game engine completed, for example. So 18 months after you started, you have a game, but by then the hardware has already changed. It's a good idea, but it's not really possible...
 
This is just one way of raising interest into what can be done to shorten the cycle between release of new hardware and release of games that utilise it.

The lack of any standard h/w and s/w map of the future, or roadmap, must surely hurt everyone. If there were consensus as to what the future looked like, people on both sides of the fence could start preparing for it sooner. At the moment it must feel like driving a car by staring in the rearview mirror and seeing what just happened - not very forward-thinking.

If NVidia and ATI were to jointly or collectively do something, it would have to be scalable across a wide range of cards - say GF2 and up at a minimum. It would have to be done in a way that does not threaten the major software houses, and it would have to be in addition to everything else they already do to help s/w houses.

Even if they just gave libraries of snazzy graphics routines for each of their cards this might help.

So too might holding brainstorming sessions with the major s/w houses to ask: how do we help you guys more - let's get wild for a day or two, then come back to reality and see what we can do to go further, faster.

It's been very interesting to see how folk think. I'll leave you with one constructive question:

If sometime in the not too distant future this current long lag problem is greatly solved - what factor or factors do you think will have most been responsible for this achievement?
 
If sometime in the not too distant future this current long lag problem is greatly solved - what factor or factors do you think will have most been responsible for this achievement?
Monopoly :)

Seriously,
If nothing is done to counter it, I fear this lag will only widen in the future.
I mean, to achieve greater realism in graphics you need a more complex graphics engine, which does seem to imply a longer development time for the engine (at least in man-years). Better physics, a greater level of interactivity, better AI, etc. also contribute to this.

Btw,
Does anybody know how many man-years (approximately) it took to develop a state-of-the-art FPS engine 10 years ago (say Wolfenstein/Doom), 5 years ago (say QuakeII), today (say Unreal2/DoomIII)? Can we use this info to predict how many man-years it will take to make a state-of-the-art FPS engine in let's say 5 years or 10 years from now?
 
But there's little reason to totally rewrite an engine today. Design time can be significantly reduced by leveraging code that's already been written.

Granted, the result won't be as well-optimized this way, but I think it will prove necessary. The most advanced engines will soon turn out to be the ones that build on older technology, not the ones that are built from scratch. If the original engine was built with enough foresight, there won't be much overhead in just upgrading it (and by overhead here I mean problems related to adding additional features, or performance problems).
 
Would it be easier for ATI to just buy one of the game companies, like Epic? So-called ATI driver bugs... solved. So-called nVidia driver bugs... new ones crop up with every Epic game patch. :)

Game devs could easily jump on the graphics engine bandwagon before the next-gen hardware has reached silicon. These same games could also utilize the latest hardware features in high res, all while supporting the MX line in low res. The game devs and the graphics companies could form a symbiotic relationship, with each making more money than ever before.
 
I think Carmack said his game engines were taking six months longer with each successive generation. Makes sense, as they're getting progressively more complex.

It was maybe 12 months for Doom, 18 for Quake, 24 for Q3?

I'd think development time would decrease with higher-level languages, but I really have no idea of how game dev differs from regular software dev.
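For what it's worth, if you take those very rough numbers at face value, the naive linear extrapolation the earlier question asked about is trivial - purely illustrative, since the inputs are guesses and man-years (team size x time) would grow much faster than calendar months:

Code:
#include <cstdio>

// Naive extrapolation of the pattern quoted above: ~12 months for a Doom-era engine,
// plus ~6 extra months per successive engine generation. Illustrative only.
int main() {
    const int baseMonths = 12;  // Doom-era engine, per the rough figures above
    const int perGen     = 6;   // extra months per generation
    for (int gen = 0; gen < 6; ++gen) {
        std::printf("engine generation %d: ~%d months\n", gen + 1, baseMonths + perGen * gen);
    }
    return 0;
}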
 
A "3D game engine" is different from a "3D rendering engine". There are many other considerations in creating a game engine as opposed to one that only deals with 3D rendering algorithms. I'll suppose you know which one is more difficult :).

You can stuff a primarily-3D-graphics engine with lots of (the latest and greatest) options, but to realize a game using such an engine will involve lotsa cutbacks. Once you add in the other stuff that is crucial to a game (like physics - which takes up a huge amount of CPU processing power - and sound and collision detection and multiplayer considerations, among others), you start dumbing down or ignoring all the fancy 3D stuff.

It just wouldn't work in terms of making a game, but it may - nay, will - work for technology demos.

Hellbinder's post basically tells it like it is - a game does not consist solely of graphics. If DOOM3, as an example, used all the features of the most advanced DX9-class video card (or, IOW, the latest the current API supports), plus all the stuff that has absolutely nothing to do with the GPU and everything to do with the CPU, we would gawk at the graphics but would never be able to judge such a "game" as, well, a game. IOW, the (expected and definite) low performance would not permit anything resembling a gaming experience.

But, yes, sure, someone could use all the latest 3D features in a game... but no one will buy it, because the requirements are far too high considering what we can actually buy at any given time. As such, it would be a waste of time for anyone to come out with a "game engine" that takes nothing into consideration other than graphics.
 
What is missing from this equation of the future is the involvement of some real gurus, when they appear, and how they sometimes single-handedly take things to a whole new level.

I have met one in 40 years: Bruce Ellis, now with AT&T. Bruce was a musician/composer by trade who so impressed K&R with his knowledge of C that they got him to write the optimising compiler for it in the mid-eighties. Bruce took 6 months off in our honours year at Uni to go work for Ma Bell - doing anything he liked. He liked Go-playing programs. On his second weekend in the USA he decided he wanted a symbolic debugger for C to further his work. A student from MIT had been working in this field for 2 years for his PhD and was starting to show results; Bruce started working on his own version on Friday night, and by Monday morning he had the crux of the thing fully operational.

He was like that all the time - true genius at work, someone who could do the incredible in no time flat if he wanted to. My final-year Computing Science class of 15 students won 5 University medals (awarded to the highest performer in an entire field of 30,000, for each of science, arts, engineering, economics and mathematics). Geniuses are out there - what a motivated one can do is literally astounding.

I am sure there are more folk like him in the world. John Carmack may be one; there may be others. If one or a few of these guys at the far outliers of the performance/intelligence bell curve get involved, we may have the answer a whole lot sooner than we hope.
 
LOL, if Nvidia released a graphics engine it'd probably suck ass. They might have decent engineers and driver programmers, but they don't know the first thing about making a game engine. Whatever... :rolleyes:
 