Imagine if NVidia coded a free S.O.T.A. game graphics engine

First, we are talking about HW companies supplying rendering engines *only*: no physics, no sound, no AI, no networking, no game logic.
Physics and sound engines (Havok, Karma, Miles) are out there and people are using them. Rendering engines are out there too, but for some reason everybody still wants to write their own.

Reverend, are you saying that there will be no CPU cycles left for other game elements when NVidia's rendering engine is running? I'd say there should be plenty left; that's the whole idea of a GPU, shifting the graphics processing from the CPU to the GPU.
NV (or ATI) should know best how to write rendering code so that the GPU's processing power is maximally utilized.

Also, development time shouldn't be much of a problem: with licensed graphics, physics and audio components, how long would it take to produce a racing game, for instance? I'd say less than a year for sure. And if there are significant advances in graphics hardware in the meantime, the rendering components can be upgraded with little effort near the end of development, provided the library interfaces were designed by the HW company with sufficient foresight.
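
Just to illustrate what I mean by "sufficient foresight", here's a rough C++ sketch of the kind of interface a HW company's rendering library could expose. All the names here (IRenderer, CreateRenderer, Mesh, Matrix4) are made up for the example, not any real NVidia or ATI API:

Code:
// Hypothetical stable interface shipped in the vendor's headers. The game only
// ever talks to this, never to the concrete implementation behind it.
class Mesh;      // game-side geometry, details don't matter here
class Matrix4;   // 4x4 transform

class IRenderer {
public:
    virtual ~IRenderer() {}
    virtual bool Init(void* windowHandle)                             = 0;
    virtual void DrawMesh(const Mesh& mesh, const Matrix4& transform) = 0;
    virtual void Present()                                            = 0;
};

// Factory exported from the vendor's DLL. As long as the interface version is
// honoured, dropping in a newer renderer DLL near the end of development is
// the entire "upgrade"; the game code itself doesn't change.
extern "C" IRenderer* CreateRenderer(int interfaceVersion);

The point being that the game-facing API stays fixed while the implementation behind it tracks the new hardware.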

Half a century ago, cars were mostly built by one company alone (some Italians still do it that way ;)), i.e. bodywork, engine parts, electronics, everything was built by the same company. Now we are seeing a LOT more specialization. I wonder how many different companies contribute to the production of a brand-new Mercedes, for instance? The final product has gotten a _LOT_ cheaper, too.
 
Reverend said:
You can stuff a primarily-3D-graphics engine with lots of (the latest-and-greatest) options, but realizing a game with such an engine will involve lotsa cutbacks. Once you add in the other stuff that is crucial to a game (like physics, which takes up a huge amount of CPU processing power, plus sound, collision detection and multiplayer considerations, among others), you start dumbing down or ignoring all the fancy 3D stuff.

There are lots of middleware engines that deal specifically with these types of elements; they can be bought off the shelf and used with any rendering engine (with a little work).

[Edit]: Ah, I see no_way already mentioned that.
 
Also, don't forget how saturated the market is now with good engines. The big boys (Epic, id, LithTech, Croteam) have all done a good job of getting many titles out using their engines (well, Croteam hasn't had that much luck... yet). It will be very hard to move them out, as their engines are well known and tested.

Also, integrating middleware engines can be a big pain, especially when you have two different middleware parts with two different ways to interface. Add a third and a fourth and it grows exceedingly complex, which is not what a developer wants to deal with.

Then don't forget about the tools a developer will want to use. They want a very simple tool set, not one made up of 20 different ones. You could kludge together a front end that makes use of all the different tools provided by the middleware engines, but that's not a fun thing to do...

I would rather see the video card companies focus on better drivers and fewer bugs than on doing engines...
 
no_way said:
First, we are talking about HW companies supplying rendering engines *only*: no physics, no sound, no AI, no networking, no game logic.

No nothing... You act like these are all separate components when in fact they are all integrated to various degrees.

You want to know the reason people write their own engine? Because the engines available don't suit their purposes!

Nvidia making a graphics engine is about the dumbest idea of all time. One or two developers would use it, and the rest would just ignore its existence. You think programmers would be so stupid as to use an engine that they know wouldn't work optimally with half the hardware out there? And even assuming that it supported other cards decently, the engine would still be sub-par and a rip-off.

There are already engines out there that you can license which include graphics as well as other stuff, including development tools. These engines are not easy to make. Notice how Carmack and other engine developers spend as much time writing a new engine as Nvidia spends creating new hardware?

Nvidia needs to focus on hardware or they're going to end up getting swept under the rug. If they try to move into engine development they'll not only end up spreading themselves too thin, they'll also end up with a crappy product. Let's leave software to the programmers who know what they're doing.
 
Chalnoth said:
But there's little reason to totally rewrite an engine today. Design time can be significantly reduced by leveraging code that has already been written.

Granted, you can't get things as well-optimized this way, but I think it will prove necessary. The most advanced engines will soon turn out to be the ones that build on older technology, not the ones built from scratch. If the original engine was built with enough foresight, there won't be much overhead in just upgrading (and by overhead here I mean problems with adding additional features, or performance problems).


Well, I think that's wrong. Unless you can travel forward in time, it is almost impossible to predict where the industry will be moving. Also, the baseline hardware standards are constantly moving; if you don't change your engine to reflect that, you're going to be stuck. John C. also seems to be redoing his engines every few years, and Tim S. said that he will start doing that too (in fact they have already started on the next one). Plus the new features alone can kill you.

Of course you can borrow code, but for the most part it should be new once a "decent" amount of time has passed since your last engine (i.e. 3 years or so).
 
Nagorak said:
No nothing... You act like these are all separate components when in fact they are all integrated to various degrees.
Various degrees? Er... okay, it kinda depends on how you write your code, but it's good coding practice to keep those things apart. Based on recent experience, I'm gonna say that with well-structured code and sufficient abstraction it's entirely possible to drop a different physics or rendering class into quite a large project in a couple of hours ;D It all depends, of course.
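To make concrete what "keeping those things apart" buys you, here's a toy sketch (class names are purely illustrative, not any real engine or middleware API):

Code:
// Each subsystem hides behind a small abstract interface.
class IRenderer { public: virtual ~IRenderer() {} virtual void RenderFrame()  = 0; };
class IPhysics  { public: virtual ~IPhysics()  {} virtual void Step(float dt) = 0; };

// Concrete implementations; imagine these wrapping D3D and a licensed physics lib.
class D3DRenderer  : public IRenderer { public: void RenderFrame()  { /* draw */ } };
class LicensedPhys : public IPhysics  { public: void Step(float dt) { /* simulate */ } };

class Game {
    IRenderer* renderer;
    IPhysics*  physics;
public:
    // The concrete classes are named in exactly one place, so replacing the
    // physics or rendering implementation doesn't ripple through the game code.
    Game() : renderer(new D3DRenderer), physics(new LicensedPhys) {}
    ~Game() { delete physics; delete renderer; }
    void Tick(float dt) { physics->Step(dt); renderer->RenderFrame(); }
};

Swap LicensedPhys for another class that implements IPhysics and the rest of the project never notices.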

You want to know the reason people write their own engine? Because the engines available don't suit their purposes!
BMW can't buy the electrical parts for their new Z8 model ;D off the shelf either, but they aren't building the xenon bulbs themselves either.
The software industry is already specialized in some areas; it has to move towards specialization in the gaming industry as well, otherwise development cycles will continue to grow.

You think programmers would be so stupid as to use an engine that they know wouldn't work optimally with half the hardware out there?
It's not the '80s anymore. Programmers are no longer the only people involved in game development, and good programming doesn't necessarily equate to good games. A much bigger emphasis is on overall design, good artwork and gameplay choices. The target market segment, and thus also the target hardware, is certainly not up to the programmers to decide.
If significant development time could be saved by using such an engine, at the expense of losing some percentage of potential customers because of unsupported hardware, it could still be a good business decision.

There are already engines out there that you can license which include graphics as well as other stuff, including development tools.
These engines are not easy to make.
Yes, because they are trying to deliver an almost complete game in an engine. The Q3 engine will do no good for somebody writing a Tetris clone or a space sim. But _just_ a rendering engine would.

Notice how Carmack and other engine developers spend as much time writing a new engine as Nvidia spends creating new hardware?
And that's the problem. NVidia, or actually NVidia's subcontractor, could start upgrading/rewriting the rendering engine at the same time they start working on new hardware features, not only after the hardware has hit the retail shelves.

Nvidia needs to focus on hardware or they're going to end up getting swept under the rug. If they try to move into engine development they'll not only end up spreading themselves too thin, they'll also end up with a crappy product. Let's leave software to the programmers who know what they're doing.
I'm not saying NVidia should do it themselves. With a good partnership it could work very well, though.
Microsoft, once a crappy operating-system development house, is now shipping the Xbox...

Edit: typos
 
This isn't a really horrid idea.
But I think that a game engine is really the wrong approach: NVidia should run an internal development house and just target much higher-spec hardware.
It'd be much like a console first party; in the short term the emphasis would be on quality and content, with the bottom line a secondary issue.
Engines don't impress people, and they don't sell NVidia hardware. Supplying a complete engine could knock maybe 6 months off your average development time; they'd move the curve a lot faster if they just used their wallet to let a developer target a much higher-spec box and include very high-quality assets.

FWIW, I personally think that "game engines" probably hurt a technically competent developer more than they help. This is especially true in the console space; at least in the PC space you're buying some degree of cross-platform compatibility for the money.
 
Seems to me it would make more sense for them to make their own API...

I heard lots of rumors they were planning on doing this (way back in the 3dfx vs. NVidia days), but I guess they dropped the idea.
 
Seems to me it would make more sense for them to make their own API...

There's really nothing wrong with DX8/9/10 etc. or OpenGL. In most cases it isn't the non-exposed features that are the problem; it's that developers have to aim at 3+ year old hardware in order to get a publisher and, of course, to make money.
 
I think that the nice thing about nVidia or ATI having an in-house development team would be that it would allow that team to focus completely on one platform. The game would be best bundled with all video cards of the generation it was designed for.

And it would be only one game, I should expect, in the first generation of "in house games," but that would expand later on if the idea turns out to be successful.

I think it'd be really neat, and it would finally allow us to see all the benefits of brand-new hardware from the beginning. Of course, there could be a downside: if these games end up being too successful, compatible games (which would have vastly inferior graphics) just couldn't compete, turning the PC into another console. But I really don't think that'll happen, as I doubt these "in-house" games will be of high enough quality to usurp games like those put out by Epic, id, Bioware, Black Isle, Blizzard, etc.
 
The entire http://developer.nvidia.com site is a step in the right direction :)
It's just that even if I take _everything_ from that site right now and make proper use of it (guidelines for rendering architecture, toolkits, etc.), by the end of my dev cycle I'd be behind the newest technology, because I don't know what's coming next (NV40?).
 
Wasn't Crytek's CryEngine originally a tech demo for the GeForce 3? I believe the demo was originally called X-Isle, and it evolved into the CryEngine, which seems to make extensive use of next-gen features.
 