Lightsmark - new realtime global illumination demo

How about WoW, EQI, Nintendo GBA, DS, Wii, the PS2, GoW2, System Shock2, PSP, Guild Wars, Final Fantasy XI ffs, etc...

I find it strange, given all the stats indicating that the vast majority of gamers are playing on outdated (at least one generation old) midrange or lower hardware, at resolutions of 1280x1024 or lower, sans HDR, with low-res textures, that these guys are supposedly all raging graphics whores.
Hang on, are you trying to make the case for including this guy's technique? Why would they care about RTGI if they have no problem with outdated graphics? Of what relevance are they in a thread like this?

Any time we talk about improving graphics in games, those graphics whores are the people funding the advancements.

As for anyone questioning SMM's criticisms regarding the practical use of this in a game, consider these points:
A) This technique primarily affects moving lights, and the effect is only visibly different from other techniques when the scene is indoors and the light casts dynamic shadows.
B) All the radiosity calculations are done on the CPU.
C) It uses CPU readback of all vertices contributing to and receiving GI (no game does CPU readback!).
D) It is very hard to integrate this technique into a modern graphics engine.

We're not just talking about a few less polys. We're talking about one or two orders of magnitude.
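To make point B concrete: a gathering-style radiosity bounce visits every patch pair, so the CPU cost grows quadratically with scene geometry. The sketch below is a minimal, hypothetical illustration, not the demo's actual code; the two-patch layout, the form-factor approximation (no visibility term), and all constants are my own assumptions:

```python
import math

def form_factor(pi, ni, pj, nj, aj):
    """Point-to-patch form factor approximation (no visibility term).

    The +aj in the denominator is the usual trick to avoid the
    singularity when patches get very close.
    """
    d = [pj[k] - pi[k] for k in range(3)]
    r2 = sum(c * c for c in d)
    if r2 == 0.0:
        return 0.0
    r = math.sqrt(r2)
    dn = [c / r for c in d]
    cos_i = max(0.0, sum(ni[k] * dn[k] for k in range(3)))
    cos_j = max(0.0, -sum(nj[k] * dn[k] for k in range(3)))
    return cos_i * cos_j * aj / (math.pi * r2 + aj)

def bounce(patches, radiosity):
    """One gathering pass: every patch collects from every other patch.

    This double loop is the point: O(n^2) CPU work per bounce, per frame.
    """
    out = []
    for i, (pi, ni, ai, rho_i, emit_i) in enumerate(patches):
        gathered = 0.0
        for j, (pj, nj, aj, _, _) in enumerate(patches):
            if i != j:
                gathered += form_factor(pi, ni, pj, nj, aj) * radiosity[j]
        out.append(emit_i + rho_i * gathered)
    return out

# Two facing unit patches one unit apart: a pure emitter and a grey receiver.
# Tuple layout: (position, normal, area, reflectance, emission).
patches = [
    ((0, 0, 0), (0, 0, 1), 1.0, 0.0, 1.0),   # emitter
    ((0, 0, 1), (0, 0, -1), 1.0, 0.5, 0.0),  # receiver
]
b = [emit for (_, _, _, _, emit) in patches]
for _ in range(3):
    b = bounce(patches, b)
print(b)  # receiver ends up with indirect light despite emitting nothing
```

Each bounce is O(n²) in patch count, so doubling the geometry quadruples the per-frame CPU work; that quadratic wall is where the order-of-magnitude polycount loss comes from.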
 
Mintmaster said:
Hang on, are you trying to make the case for including this guy's technique?
No. I was refuting SMM's implication that all gamers cared about was the latest/greatest graphics, and that not keeping up with the Joneses was "commercial suicide."
 
How about WoW, EQI, Nintendo GBA, DS, Wii, the PS2, GoW2, System Shock2, PSP, Guild Wars, Final Fantasy XI ffs, etc...

You haven't really done anything to prove SMM wrong by listing those games/consoles. Nobody buys a GBA game and expects 1M polygons; that platform's standards have been set and haven't been raised to that level. And as for PC games like WoW and Guild Wars... unless you can somehow prove that they make up more than 5% of games with sub-par graphics, you're still well within the (off-the-cuff) statistic he provided.
 
How about WoW, EQI, Nintendo GBA, DS, Wii, the PS2, GoW2, System Shock2, PSP, Guild Wars, Final Fantasy XI ffs, etc...
I'm not sure how I can explain market segmentation any more clearly... Or are you just going to pretend that such a thing can never matter? PSP is PSP, and is on a different curve from PS3. WoW is WoW, so don't talk as if it should be put in the same universe as Gears of War.

Secondly, there will never be a conceptual similarity between the continued subscriptions and/or ownership of an old product that hasn't diminished in value and the future sales of a new product that is comparable only to those older ones.

I find it strange, given all the stats indicating that the vast majority of gamers are playing on outdated (at least one generation old) midrange or lower hardware, at resolutions of 1280x1024 or lower, sans HDR, with low-res textures, that these guys are supposedly all raging graphics whores.
Again, it's because you're deliberately comparing the wrong things under the belief that it drives a point home. People who have old PCs don't buy new titles, so for people working on titles right now, they're a non-existent market. People who own a DS aren't the ones we'd worry about if we were working on a $40 million 360/PS3 title. And you'd be surprised how much graphics whore-ism there is even within lesser platforms -- it's not that obvious because the level is understandably lower, but it isn't lacking when put into context.

This culture of keeping up with the Joneses is an absurd falsification, much more likely perpetuated by misguided producers, IHVs trying to ensure they'll be able to pimp their next hardware, and guys designing million-dollar engines than by these "graphics crazed consumers".
You can believe that if it makes you feel better. The trend has existed for ages, and it was that trend that created the market for 3D accelerators and GPUs in the first place. It's not just graphics, either. Physics whore-ism is a reality as well, and HL2 drove the final nail in: we were doomed to enter the same spiral on that front. And once again, physics middleware proliferation and PPUs are products of the trend, not sources.

Most people like to believe that the consumer is more savvy these days, and I really don't think that's an accurate way of putting it. Being picky and being harder to please is not the same thing as being savvy.

Hell, even many of the high end consumers are far from mindless slaves to graphics... The first thing going through their mind when they pick up a shiny new copy of UT3: what useless stuff can I turn off to get a smoother framerate.
Pfft. That's just graphics whore-ism with different prioritization (i.e. framerate whoring comes first). It's not that they don't want those features. It's that they're specifically choosing to disable whatever is most superficial, so as to take the least away from the visuals while gaining framerate. Being willing to make small sacrifices is not the same thing as knowing better. If the case were a game that wasn't so dependent on fast action as UT, I don't think these same people would be as adamant about trimming away, either.
 
20 fps on this NVIDIA GeForce 8300 GS / Intel(R) Pentium(R) D CPU 3.00GHz :D
 
ShootMyMonkey said:
WoW is WoW, so don't talk as if it should be put in the same universe as Gears of War.
How about Everquest II then?

ShootMyMonkey said:
People who have old PCs don't buy new titles
I think you mean people who have old PCs don't have the opportunity to buy new titles (I guess you mean premier first-person shooters). I assure you, if I could lower UT3's settings to get a 30 fps minimum on my "old" system, I would already have it pre-ordered. There are, however, several other upcoming games I'm looking forward to which should perform adequately on my old PC.

ShootMyMonkey said:
You can believe that if you like if it makes you feel better.
Why would I need to feel better? You are the one whining about the way of the world...

Anyway, keep insulting your customers in public and blaming them for all your troubles; I'm sure it will pay off someday...
 
How about Everquest II then?
What about it? I fail to see how it's a meaningful counterexample.

I think you mean people who have old PCs don't have the opportunity to buy new titles (I guess you mean premier first-person shooters). I assure you, if I could lower UT3's settings to get a 30 fps minimum on my "old" system, I would already have it pre-ordered.
And you'd be part of a piddly minority which is more or less meaningless. Besides, UT is a vehicle for selling the engine, and it needs to be a graphical powerhouse for that reason anyway. You think companies would have licensed UE3 if Epic hadn't happened to show a really good-looking demo back in 2004?

Why would I need to feel better? You are the one whining about the way of the world...
And who's trying to claim that people are loftier than they are? You can't look at a crowd like the people here on B3D, who overall tend to have more than zero functioning brain cells, and treat that as comparable to the populace at large.

Anyway, keep insulting your customers in public and blaming them for all your troubles I'm sure it will pay off someday...
If the market at large would like to prove me wrong, then they need to do it with their wallets, not their mouths.
 
ShootMyMonkey said:
What about it? I fail to see how it's a meaningful counterexample.
Which doesn't surprise me...
ShootMyMonkey said:
And you'd be part of a piddly minority which is more or less meaningless.
Step 1: Tell the customer they don't matter.
Step 2: Profit!
ShootMyMonkey said:
Even otherwise, UT is a vehicle for selling the engine, and it needs to be a graphical powerhouse for that reason anyway. You think companies would have licensed UE3 if Epic didn't happen to show a really good-looking demo back in 2004?
I'm not saying it can't be a graphical powerhouse. I'm saying I would have appreciated it more if it could scale down to my level of hardware. Which is a better selling point:
a) Use our engine which will perform well on some machines.
b) Use our engine which will perform well on most machines.

As for your attitude towards the public... whatever makes you happy.

Anyway, enough thread derailment for me.
 
I'm not saying it can't be a graphical powerhouse. I'm saying I would have appreciated it more if it could scale down to my level of hardware. Which is a better selling point:
a) Use our engine which will perform well on some machines.
b) Use our engine which will perform well on most machines.
You're looking at it from very much the wrong perspective.

Neither of those criteria is of concern when you're in the market for middleware. Studios don't buy middleware licenses for the sake of the consumer. They buy them for their own workflow and production pipeline. "Whose machine will this run on?" is never part of the question, because studios who can afford these tools are likely to already have technology that can "perform well on most machines" as it is. And even if they didn't, looking backwards is cheap. For studios investing in future development, the driving questions are: "How many man-hours will this save me down the line? What would I gain by buying this as opposed to developing my own? What can they provide that is just too much of a pain for us to develop in-house?"

Stuff like UE3 catches the eye of those who are looking forward 2-3 years. A studio eyeing Crysis isn't going to think "Well, this isn't going to help me on the Wii, so screw it." They're looking at the transition into DX10 territory, and how it will affect art pipeline and workflow when that era comes around.

In any case, any delusions people might have about the marketability of a title (and more importantly, the marketability of a development team) still don't change the original point that RTGI of this scale is simply not ready for games. The sacrifices are many times greater than acceptable.
 
I don't think anybody actually knocked the guy's actual work. I was knocking his assertion that it's useful for games "right now" or even close to that. The demo itself is fine (as are a lot of so-called RTGI frameworks), but we're still years from the point of practicality.
 
FWIW, I thought the demo looked quite good, despite the dodgy "garage-style" calendar. :)

Wasn't that the main feature of the demo? :p

I don't think anybody actually knocked the guy's actual work. I was knocking his assertion that it's useful for games "right now" or even close to that. The demo itself is fine (as are a lot of so-called RTGI frameworks), but we're still years from the point of practicality.

Well, it wouldn't be out of the ordinary to design a game around technical limitations.
 
Well, it wouldn't be out of the ordinary to design a game around technical limitations.
Excellent... Let's trivialize it some more, then, why don't we?

An order of magnitude geometric complexity loss? Pfft... I remember when we only had 2000 polys per frame and had to walk 14 miles through 6 feet of snow and the prize money at a demoparty was 10 cents and a saltine cracker!
 
An order of magnitude geometric complexity loss? Pfft... I remember when we only had 2000 polys per frame and had to walk 14 miles through 6 feet of snow and the prize money at a demoparty was 10 cents and a saltine cracker!
I don't think anyone is going to that extreme or suggesting that we raytrace spheres for our whole scene (except Intel maybe ;)), but I think there is a valid point in there. For instance, I often shake my head at how technically unsophisticated Valve's Source engine is (it can run on DX7, for heaven's sake!), but they consistently produce such great art and polish that you get stuff like TF2, which, while very simple, looks great; I can't imagine that throwing more polygons or fancy lighting techniques at it would improve the experience a whole lot.

On the other hand, I think we all want to keep making progress in the area, as there's a lot of stuff that artists want to do, but can't yet in real-time.
 
I can't imagine that throwing more polygons or fancy lighting techniques would improve the experience a whole lot.
The problem isn't one of throwing more polygons, but of being forced to throw them away. And even then, you still don't break even (in terms of available frame time for physics, AI, animation, etc.) after throwing away 90% of what would otherwise be the available polycount.
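A back-of-envelope sketch of that frame-time argument, with entirely made-up but era-plausible numbers; the 1 µs-per-vertex CPU cost, the half-frame GI budget, and the 1M-vertex comparison scene are all assumptions for illustration, not measurements:

```python
# Rough, illustrative arithmetic: if the CPU GI solve touches every
# contributing vertex each frame, the vertex budget collapses.
frame_ms = 1000.0 / 60.0           # 60 fps frame budget
gi_cost_us_per_vertex = 1.0        # assumed CPU cost per vertex, per frame
budget_for_gi_ms = frame_ms * 0.5  # even granting GI half the frame...
max_vertices = int(budget_for_gi_ms * 1000.0 / gi_cost_us_per_vertex)
print(max_vertices)                # 8333 vertices fit in the GI budget
gpu_vertices = 1_000_000           # assumed GPU-driven scene of the era
print(gpu_vertices // max_vertices)  # roughly a two-orders-of-magnitude gap
```

Even with these charitable numbers, half the frame is gone before physics, AI, or animation run, and the scene is still two orders of magnitude smaller than what the GPU alone could push.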

On the other hand, I think we all want to keep making progress in the area, as there's a lot of stuff that artists want to do, but can't yet in real-time.
Of course. I've done RTGI frameworks myself back in my undergrad days (mine was more of a distribution raytracing approach than a radiosity approach), since it's something everyone aims for -- plus, academia tends to serve as a shelter from reality for most, so it's easy to fall into a pattern of naive presumptions about what defines feasibility -- "What do you mean it's not feasible! All it requires is a 128-node Origin cluster! You can find those by just dumpster diving!". But it's not as though I was espousing "RTGI is here! Realtime in your game! Impress your friends! Comes with 10 free vials of LSD!"

People seem to be leaving out that there are a number of scales of complexity that artists want to achieve that reach beyond geometry. And you'd be throwing some of that away as well, including the flexibility to have outdoor scenery, or scenery that lacks the density of sources needed for many bounces (sparse reflectors kill the illusion). Geometry is simply the big one that stands out in this case, since you're doing so much work per vertex that it becomes a major limiting factor. Making a hundred small sacrifices and a handful of enormous ones to get one feature that gets you 5% deeper into the uncanny valley is not a fair trade. Giving artists something new and dangerous to play with is walking the razor's edge, but taking something away is building your own gallows.

Now when you get to a point where more geometry is superfluous (and in a small subset of cases, we're there right now, though I wouldn't say we were there 10 years ago, which is the level we're talking about heading back to), then the question becomes one of CPU power. I wouldn't be surprised if future generations have no need to push massive scene complexity increases on a per-frame level (although overall sizes will surely increase). Maybe DX11 will mean something, maybe it won't. Maybe Larrabee will mean something, maybe it won't. That's a different thing from saying "it's ready now." And that's the crux of my problem.

On a side note, I'm sure we can all agree that there are still plenty of areas where the current state of direct lighting simulation is still not completely acceptable. :???:
 
An order of magnitude geometric complexity loss?

To be honest, I'd rather have a low-poly game with nice lighting than a high-poly game with poor lighting. Besides, you can use low-poly input and get good quality with DPS techniques; combine that with silhouette tessellation in the GS, and an order of magnitude less geometry can look every bit as good or better.
 
To be honest, I'd rather have a low-poly game with nice lighting than a high-poly game with poor lighting.
Everyone who says things like that invariably eats their words when it's actually put on the table. Let's at least stop pretending that we saw a game, and put this in the context of a tech demo that did nothing more than rendering.

In any case, wording it the way you did massively trivializes the sacrifices and just as massively blows the gain out of proportion. Yes, the lighting is better than ordinary, but it limits geometric complexity, scene complexity, physics, animation quality, interactivity, scene variety, scale, and scope. On top of which, if you paid attention to the failure cases, they're not out of the ordinary for an in-game level, high-poly or not (and Murphy's Law dictates that in any and all games, the failure cases will be many times more obvious).

And for all that, many studios can already cheat out at least some noticeable fraction of that lighting quality in ways that are better suited to their art pipelines and projects, at a very tiny fraction of the cost and complexity, with virtually none of the other sacrifices. Between the two options, the latter is several trillion times more marketable than the former. If I were to stretch acceptability to levels beyond an extreme, I might be able to say that the former would work for some cheap, downloadable XBLA-type budget game at the very most. And even then, the only reason it would work in that arena is because the public expects less graphically.

Secondly, you can make assertions like that all you want, but it's far, far too late now. Like I said, if I had seen this perform at this level on, say, a Dreamcast, I would have been very interested in applying it in practice.
 