Scalable graphics engine - in 15 months so much has changed!

Neeyik - now that is a good post!

But I am not saying h/w manufacturers should slow down - heck I'd prefer them to speed up!!! Wherever did you or anyone else get that idea?

I am saying hardware developers - but even more so software developers - should plan the future out a bit better than they seem to. I don't care too much if tic-tac-toe is the top-selling game this year; when this thread was launched Quake 3 was pretty big, and I think Doom 3 might just make an impression somewhere.

Cg and RenderMonkey are the first steps I've seen towards bridging this gap.

Why has this taken so long? I bet Nvidia and others are frustrated by how long it takes the s/w guys to optimise things for each generation of h/w.
 
Neeyik said:
"Thus there is a minimum cost at any given time in the evolution of the technology. At present, it is reached when 50 components are used per circuit. But the minimum is rising rapidly while the entire cost curve is falling (see graph below). If we look ahead five years, a plot of costs suggests that the minimum cost per component might be expected in circuits with about 1,000 components per circuit (providing such circuit functions can be produced in moderate quantities.) In 1970, the manufacturing cost per component can be expected to be only a tenth of the present cost.

The complexity for minimum component costs has increased at a rate of roughly a factor of two per year (see graph on next page). Certainly over the short term this rate can be expected to continue, if not to increase. Over the longer term, the rate of increase is a bit more uncertain, although there is no reason to believe it will not remain nearly constant for at least 10 years. That means by 1975, the number of components per integrated circuit for minimum cost will be 65,000.
"

I find it rather funny that so many people make a big deal out of something he said in 1965, when Moore himself said that the rate of increase was "uncertain". Marketing is such a wonderful thing!
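
For what it's worth, the arithmetic behind that 65,000 figure is easy to check. A quick sketch - assuming Moore's log-scale plot started from roughly 2^6 = 64 components in 1965 (rather than the 50 quoted for the cost minimum) and doubled every year:

#include <stdio.h>

int main(void)
{
    /* Assumed starting point: ~2^6 = 64 components in 1965,
       doubling every year as the quote describes. */
    long components = 64;
    for (int year = 1965; year <= 1975; ++year) {
        printf("%d: %ld components\n", year, components);
        components *= 2;
    }
    /* 1975 prints 65536, which Moore rounded to "65,000". */
    return 0;
}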

Thank god someone finally said it. ;)
 
Mulciber - Moore has said a few more recent things too.

For my first degree I researched parallel processing technology, in both hardware and software design. I have worked in the IT industry for over 20 years and I have never seen such a loss of potential.

Transistor counts - and their allocation to sub-processor components and the interconnects between those components - will be critical design issues for CPUs and GPUs over the next 4-6 years and beyond.

Hopefully this isn't a big surprise to anyone?
 
As said above, Doom 3 is the interesting one. It is going to need so much horsepower that, if it ships enough copies, it will drive the market somewhat towards the higher end.

Anyone want to bet on whether Doom3 will make the top 10 next year? (I'm actually uncertain myself - I'd say guaranteed top 20, but top 10 may be too tough for a game in the FPS market).
 
g__day said:
Randell -

Fair enough, but are you happy with the way it is today? Do you ever want to see it get a lot better? Don't you wish you could unlock that power sooner? Don't you feel the industry just isn't well optimised amongst the major power brokers?

If I thought like that I would only buy a console to game on :) Gameplay always beats graphics, and I understand that unlocking hardware potential takes time to do well. I would rather play fewer, greater games (in fact I do) than lots of crap but good-looking games. Umm, Aquanox/DroneZ anyone?

In fact, being behind the curve game- and hardware-wise is the most economical and sensible way to do it. Imagine you just get a PS1 now, never having owned anything beyond an Amiga before. You have MGS1, GT 1&2, Abe's Oddysee, FF7 etc. etc. to look forward to. To stay on the graphics capability curve you only have demos to play with, or the games I mentioned above.
 
Randell - sensible, but rather defeatist - not challenging the future to be better than today.

I don't mean this to be an attack against you or anyone else - more a challenge against complacency.

It is dangerous to argue by connecting the quality of gameplay to the quality of graphics in any release schedule - many boring, crappy games are still released today (or are in development) targeting low-end hardware - so what?

I understand that it takes a while to unlock the hardware's potential too; it's the scheduling of that time I hate with a vengeance - months to years after the hardware is released. Is this because:

1) NVidia and ATi are so secretive about what is coming out, when it will be released and how to interface to it, that nobody can develop for it until it is well and truly heading for obsolescence?

2) Only a select few sanctified, top-end developers like John Carmack are privy to what is intended in detail, but they hedge in case it isn't delivered in a satisfactory state?

3) Developers just aren't interested, figuring their games won't gain longevity because expected adoption rates for the technology - if games don't support it - will be so low as to render it irrelevant? (NB: today's self-fulfilling prophecy.)

4) Some other mystic misalignment of the stars?

Personally I'd be happier if the majority of folk really questioned whether it has to be this way - do we have to have such systemic disconnects between hardware and software developers, leaving potential to rot on the vine, so to speak?

My clear view (realistic or not) is that this is all very artificial, caused by a lack of standards and a lack of a consistent, well-communicated roadmap to the future, shared by all, that people can plan for in advance. In the future people will shake their heads and ask how so many people could be so dumb - why didn't they see the way to optimise competition better within a well-defined framework? Why didn't they all wake up and demand in a loud voice that the situation be improved much sooner than they eventually did? Why did they suffer for years before telling folk they should have done it right in the first place?

I am all for competition where it makes strategic sense - not just for competition's sake. You need to strike a balance between strategic competition and strategic co-operation.

I see the competition - I just don't see the appropriate level of co-ordination amongst the major stakeholders. For argument's sake, let's say it were done today, and had been in the past, the way I'd like - is anybody telling me they'd want to see it revert?

An alternate view (that might take years to eventuate):

Every one of the four major hardware developers agrees on what should be done - and more or less in what order, and to what overall schedule - each year, for say 3-5 years out. This view is widely socialised among hardware manufacturers, game developers and API developers (Microsoft and SGI).

Hardware manufacturers compete on how they achieve these ends - the milestones themselves are well defined.

There are clear specifications for how to interface to all features of current and planned hardware, and tests to see whether said hardware is present and what its performance levels are (a rough sketch of such a check follows after the dream).

Game engine developers similarly co-ordinate and compete. Their IP is the actual 3D algorithms required to match the hardware, present and scheduled. Where there is no need for strategic competition, they co-operate.

Hardware manufacturers continue with innovations such as Cg and RenderMonkey to make their hardware easier to use. They also donate libraries of powerful code routines for 3D effects to game developers, to further speed up game development.

Games are released and marketed as being entirely ready for the next 1, 2 or 3 rounds of technology - greatly improving their longevity and accelerating the demand for higher-end graphics cards.

Software releases speed up to broadly match the availability of hardware capability.

/end_dream :)
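
To make the "tests for hardware presence" point above concrete, here is a minimal sketch of the kind of runtime capability check I mean, using the extension mechanism OpenGL already has. The render-path functions are hypothetical stand-ins, not any real engine's API:

#include <string.h>
#include <GL/gl.h>

/* Hypothetical engine entry points - stand-ins for real render paths. */
void use_programmable_path(void);
void use_fixed_function_path(void);

/* Assumes a valid GL context has already been created by the
   windowing layer. A production check would also guard against
   one extension name being a prefix of another. */
static int has_extension(const char *name)
{
    const char *all = (const char *)glGetString(GL_EXTENSIONS);
    return all != NULL && strstr(all, name) != NULL;
}

void select_render_path(void)
{
    if (has_extension("GL_ARB_vertex_program"))
        use_programmable_path();    /* hardware vertex shaders present */
    else
        use_fixed_function_path();  /* fall back to fixed-function T&L */
}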
 
Just for nostalgia I resurrected this plea of mine from over a year ago, because with Half-Life 2 it seems my dream of a scalable graphics engine for leading-edge hardware is finally realised, thanks to Valve with Source and even Massive Development with Krass.

Apologies to Democoder and all the other zealots for the shots I fired around back then - and I laugh at my table of GPU improvements vs 3DMark benchmarks.

But the thrust of my passion, I feel, was on track - as best I could understand matters, then and now.

So I am a happy boy :)

/end resurrection section
 
g__day said:
Games are released and marketed as being entirely ready for the next 1, 2 or 3 rounds of technology - greatly improving their longevity and accelerating the demand for higher-end graphics cards.

Then the trend would be the other way around, and if this happened you would probably start screaming that you don't want to pay more for a game whose true potential would only be unleashed after 18 months.

You see, one side of this industry should be a bit ahead, pumping the driving force as a pacemaker in order to maintain progress. Naturally the driving force is supplied by the HW side, because they can afford it. ;)
 
g__day said:
That is a better argument - except you said "hardware comes before software" instead of, more precisely, "software comes 3 generations after hardware".

Why should this be acceptable to a $20 billion industry?

I don't understand your points...:) How would it be a $20B industry if it were not on its present path?

Do you feel that, say, the 20 leading graphics architects (think the John Carmacks of this world) and the 2-4 major video hardware manufacturers could sit down a few times each year and draw up a three-year plan?

In any other industry that would be taken almost for granted!!!

You ask these 20 guys/gals what, in priority order, they most want - you work out roughly when it can be delivered and whether it is cost-effective - you agree on an API and how to test for its existence/readiness, and away you go.

This industry seems to be very, very poorly optimised to me.

That's the cardinal flaw in your economic thinking. 3yr/5yr "plans" only work in free market economies for businesses which are heavily subsidized by government spending and don't primarily depend on producing "household" goods--e.g., companies that make submarines, strategic bombers, booster rockets and satellites, etc.

In the more "mundane" world of computer hardware and software we have the dynamic of "competition" to consider. It's the competition among companies which spurs one company to "best" another with new and improved hardware & software. Competition is the engine that drives this economy, and that's the reason the pace of development is so incredibly fast these days.

Back in the late 80's-early 90's, when the entire world-wide production of personal computers was ~10M a year, hardware and software development proceeded at a snail's pace compared to today. Corporate budgets were much smaller, etc. I can remember waiting long, agonizing periods for companies to come out with new boxes with new and better capabilities. Ditto--software. Some really good games at the time (2D of course) were often written solely by individuals--or by very small, comparatively under-financed groups of programmers (by today's standards). You probably think you would have been very happy in the PC environment of the late 80's and early 90's, when product development was very slow....;) But really, you probably wouldn't have been as happy as you think.

To put it another way, let's imagine that in 1999 AMD had never launched the K7 and had gone out of business instead. How long do you think it might have been before Intel marketed a 1GHz Pentium, and how much do you think you'd have had to pay for it?

What you see as inefficient is actually extremely efficient in getting new concepts from paper to silicon and into the market in very short spans of time. It's competition. It's how things work. Central planning as an economic system was disproven long ago as incredibly inefficient and easily corruptible. Vigorous competition doesn't eliminate inefficiency and corruption, of course, but it goes a long way toward mitigating them in the overall economy.
 
Guys, thanks - but note you are responding to points I raised over 14 months ago! As I said, I resurrected this thread because now we have scalable graphics engines - finally!

But as to the points about the industry's basis for selective competition vs collaboration and agreed standards, I still think there is an optimal balance to find. Competition can still be co-ordinated around standards that help game developers (look at Valve taking five times longer to optimise for NVidia, because its architecture and power don't lend themselves to being exposed and exploited through DX9).

WaltC, I am currently advising (on strategy) a company that turns over USD $25 billion and thinks in 30-year plans (Japanese, of course :) ). The key element, I find, is that you have to be careful in what you select as your basis of competition. Of course innovation and competition drive things - but so too do standards. You don't waste time competing where it's a distraction.

The main point of my thread was: isn't it time for scalable graphics engines - hallelujah - the first two are practically here! But thanks for your comments - I read them carefully and keep an open mind.
 
Oompa Loompa said:
You're asking for a level of scalability which is impractical. An engine which spans the pre-hardware T&L era, the DX7 fixed pipeline T&L era, the DX8 programmable pipeline era, and the DX9 floating point enhanced-programmability pipeline era just isn't in the cards.

Half-Life 2?
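
Valve is said to ship several code paths in Source - DX7-, DX8- and DX9-class - selected when the hardware is probed at startup. Roughly this shape (a sketch of the general technique, not Valve's actual code; all names here are illustrative):

/* One way an engine can span hardware eras: a table of function
   pointers per backend, chosen once after probing the hardware. */
struct render_backend {
    void (*draw_world)(void);
    void (*draw_water)(void);
};

/* Defined elsewhere - one implementation per hardware era. */
extern struct render_backend dx7_backend;  /* fixed-function T&L    */
extern struct render_backend dx8_backend;  /* programmable shaders  */
extern struct render_backend dx9_backend;  /* floating-point pixels */

struct render_backend *pick_backend(int shader_model)
{
    if (shader_model >= 2) return &dx9_backend;
    if (shader_model >= 1) return &dx8_backend;
    return &dx7_backend;
}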
 
g__day, I suggest that you try to develop a game. You obviously have no idea what game development entails.
 
g__day said:
...
But as to the points about the industry's basis for selective competition vs collaboration and agreed standards, I still think there is an optimal balance to find. Competition can still be co-ordinated around standards that help game developers (look at Valve taking five times longer to optimise for NVidia, because its architecture and power don't lend themselves to being exposed and exploited through DX9).

I think, though, that you are assuming a problem where one doesn't exist...;) Whose problem was it, other than Valve's, when it decided to do an nV3x code path? That was an elective decision Valve took because at the time it didn't know whether an optimized path would be better than a DX8.x path for nV3x--and nVidia was apparently insisting that it would be (otherwise I can't imagine Valve doing it--especially had nVidia said, "Nah, just run it on your DX8.1 code path and you'll be fine.")

So the problem here was Valve's, but it was a problem Valve elected to take on. What would you have done to "prevent" Valve from making that decision?

As to "co-ordination", it existed long before nV3x was finalized and shipped by nVidia and R300 was shipped by ATi--it could be found in the DX9 API--standards hammered out by M$ in collaboration with developers like Valve, IHV's like nVidia, ATi, and any other interested parties. The problem was not that there wasn't enough co-ordination, but rather that, in this case, nVidia chose to deviate from the standards of the API--either that, or nVidia simply wasn't able to design and ship any better DX9-class hardware. I don't see how such an event could possibly be avoided in a competitive environment.

WaltC, I am currently advising (on strategy) a company that turns over USD $25 billion and thinks in 30-year plans (Japanese, of course :) ). The key element, I find, is that you have to be careful in what you select as your basis of competition. Of course innovation and competition drive things - but so too do standards. You don't waste time competing where it's a distraction.

But isn't that precisely what the point is here relative to this situation? The standard is the API. One company got it right for hardware support, one didn't. IMO, the problem with nVidia in this particular case had absolutely nothing to do with the lack of a standard.

As to 30-year plans...Come on...quit pulling my leg...:) So what happens if in 2 years something major happens that is not addressable by the 30-year plan? It gets changed, is what happens, and so the "30-year plan" is reduced to a 2-year plan instead. Such things, unless they exist for companies who operate in a competitive vacuum, are mere guidelines of the most general sort and cannot be considered as inflexible doctrines or future predictors at all. Not even governments can be sure of 30-year plans.

Having said that...I will agree the Japanese are much better about taking the long view than many Western companies...:) But that's mainly in the sense that they can better see the outcomes of their present actions over a period of time, and that they plan accordingly.

However, competition among technology companies is so fierce and healthy today that any company attempting to work to a 30-year plan is committing hari-kari (please forgive my spelling...;)) It just wouldn't work in this business, because before you know it company X will be making and selling something that your long-view plan never anticipated--and you'd have to scrap your plan to remain competitive. If your plan has such great latitude that it doesn't need to be changed to adapt to changing market conditions, then it really wasn't a "plan" at all, IMO.

Surely you don't propose that M$ should adopt 5-year API planning...? Or something like it? All that would do is artificially slow down the pace of development and competition; more importantly, it wouldn't work in a competitive market. Somebody would come along with something else that introduced new feature support much faster, and all that would happen is that the slower standard would be replaced by the more nimble one. That's one of the chief problems with the ARB "committee"...*chuckle* They often move glacially, and as such D3D has come from behind to pass them right by in certain respects.

The main point of my thread was: isn't it time for scalable graphics engines - hallelujah - the first two are practically here! But thanks for your comments - I read them carefully and keep an open mind.

I think scalable engines are great for the developers who can successfully implement them. They also take a lot more work--but that's neither here nor there, really. I think the main thing about scalability is that it means a developer can sell his software immediately into the market to a varied audience, because his software can address a wide range of hardware which is presently in use. I don't think it means at all that scalable game engines will not receive near-term changes, improvements, or even replacements in some cases. It depends on the engine and its competition in the markets it serves, IMO.
 