nv30 - Photorealism?

eSa

Newcomer
Uhm. I was bored, so I thought I would roam around the web. Found a reference to this pic on nvnews:

http://www.trinisica.com/img/galler...ge=stilllife-translucent_21_2.jpg&w=713&h=451

Info page for the pic :

http://www.trinisica.com/sub_view_imgpage.asp?loc=3&img=stilllife-translucent

Basically it's a photorealistic image done with only basic 3D Studio Max scanline rendering... The catch is that it seems nVidia will use this pic to demo what the nv30 will be able to render in _real-time_. After all, it's a simple scene with straightforward high-quality texturing.

Now, I know there has been some speculation about the nv30, but I would still like to hear any ideas about what to expect. That is, if we have 120 million transistors, what can you put there?

Like how much would four floating-point vertex shaders eat from the silicon budget? It's obvious that the nv30 will implement the dx9 pipeline, which is fully floating point, right? But how much more will it be?

Or is it just a straightforward 4 vertex shaders, 4 pixel pipes with a total of 16 textures/pass, better HSR, support for internal 64-bit floating-point rendering and a boring 128-bit bus... ;)

In many ways nVidia has already built up the hype, like many times before: "Biggest thing in 3D gfx in the last ten years" etc. It would be rather lame if the nv30 turned out to be something like a souped-up gf4. Even with high-quality mpg decode.

Oh, and do you think the nv30 will have conditional statements + jumps in shaders? That is, will the pixel shaders be fully Turing-complete with floating-point support? :p This is not in the dx9 specs, but it is in OGL 2.0.
 
eSa said:
The catch is that it seems nVidia will use this pic to demo what the nv30 will be able to render in _real-time_.

I don't know about that. It looks like something that the boys at nvnews put together themselves. I guess they got bored too...
 
I would say we are still quite some way from photorealism in games. Even that simple scene that you linked to doesn't look photorealistic.
 
LeStoffer said:
eSa said:
The catch is that it seems nVidia will use this pic to demo what the nv30 will be able to render in _real-time_.

I don't know about that. It looks like something that the boys at nvnews put together themselves. I guess they got bored too...

The collage I posted yesterday was sent in by a visitor who said it was publicly available. But I have yet to hear back from him when I asked for the URL.
 
It's just a collection of some of the higher-quality 3D render work available on the web. The nv30 might be able to make some kind of near-reproduction where the same model is used in real time... but being able to reproduce the same quality of depth-of-field effects, antialiasing, raytracing, shadowing, etc. in real time...?

A lot of those images have also probably been touched up in Photoshop their fair share, which would make it even harder to *exactly* reproduce them in real time..
 
Hmm, that pic looks like it doesn't even require any advanced ray-traced lighting effects.. just high-polygon meshes (should be a cakewalk), high-res textures, one light source, specular highlights, a touch of gloss, soft shadows, depth-of-field blur, etc. (tell me if I'm missing anything major).

Then again, like someone said, this pic isn't exactly photorealistic itself, but pretty damn close for the most part. I wouldn't think pulling off something like this with the NV30 would be too trivial.
 

Or is it just a straightforward 4 vertex shaders, 4 pixel pipes with a total of 16 textures/pass, better HSR, support for internal 64-bit floating-point rendering and a boring 128-bit bus...


I think 4 vertex shaders, 8 pixel pipes/shaders and 16 textures/pass with loopback. Improved HSR with a higher discard rate, and a 256-bit memory interface. Fully floating-point pipelines with a float per component, i.e. 128-bit pixel color internally. The shaders will also feature jumps and conditional logic.
 
The NV30 should be an amazing GPU. They've had time to comb through all of 3dfx's technology by now and figure out what will have good synergy with Nvidia's own R&D. I'm sure both the R300 and NV30 will really be a big step up from the previous generation of GPUs.
 
The photo of Kaya in the collage is not identical to the CG one posted on the web. And if you've seen the Final Fantasy demo, you know that a very specialized demo can pull this look off.

I bet that collage is a collage of real demo pics from demos that Nvidia made by converting existing 3D scenes they could get from the net. Nvidia probably wanted some artwork they could release (do you think Squaresoft is going to allow them to include their art assets in an end-user downloadable demo? No way), so they went to independent 3D artists on the net and got their agreement to use the artwork.

The PR will say something like "We went out and took some average pre-rendered scenes on the net, used the new 3D Studio <-> Cg plugin to generate shaders, added some extra code, and the result is a pretty nice NV30 demo that looks pretty close to the original"


I highly doubt Nvidia would just take stock pre-rendered art and try to pass it off as realtime-rendered stuff without actually having some sort of demo that at least looks close enough already.
 
Didn't they make this claim about the GF3 (and even the GF2???)? I'm sorry, but Nvidia Marketing says a lot of stuff that isn't even close to being true.

I'm sure the NV30 will be a great chip, but I seriously doubt it will be any larger a step up than the GF3 was from the GF2. And it won't be photorealistic either.
 
So, this is basically so-called damage control / counter-propaganda against the r300 launch? nVidia spreads small bits of info to build up hype... Sounds very probable.

About the fruit scene, I don't see a single reason why you couldn't render it in real time with the nv30. It was originally done with _scanline rendering_: no raytracing or global illumination. It's just a few objects with high-quality textures. You can get away with very high-quality (for real-time, that is!) soft shadows by using plain shadow buffers several times with some kind of jittering. There was a paper about this.
A single light source with high-quality per-pixel lighting is easy too. What else? A gloss map, probably FSAA.
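To show what I mean by jittered shadow buffers, here's a rough CPU-side sketch of the idea in plain Python. This is my own illustration, not the actual algorithm from that paper: the function names, the sample count and the jitter radius are all made up, and the shadow buffer is just a 2-D list of depths.

```python
import random

def soft_shadow(shadow_map, u, v, fragment_depth, samples=16, radius=1.5):
    """Estimate shadow coverage by averaging several jittered depth
    comparisons against the shadow buffer (the more samples agree
    that the fragment is lit, the less shadowed it is).
    Returns 0.0 (fully shadowed) .. 1.0 (fully lit)."""
    h = len(shadow_map)
    w = len(shadow_map[0])
    lit = 0
    for _ in range(samples):
        # jitter the lookup position inside a small disc of texels
        ju = min(max(int(u + random.uniform(-radius, radius)), 0), w - 1)
        jv = min(max(int(v + random.uniform(-radius, radius)), 0), h - 1)
        if fragment_depth <= shadow_map[jv][ju]:
            lit += 1
    return lit / samples
```

In hardware you'd do the same thing by rendering the shadow buffer once and taking several offset lookups per pixel; near a shadow edge, some samples pass and some fail, which is exactly what gives you the soft penumbra.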

And I also have to disagree about the revolutionary vs. evolutionary part. I'm not an nVidia fan, but this time they really have a chance to make some major progress. If the pixel shaders indeed have "if statement + jump" and fully floating-point operations, we have here the first general computation unit available for per-pixel operations! It's also interesting that in the future you could use the GPU for so much more. With enough precision available, you could do all kinds of general computations. For some specific (not gfx-based!!!) problems you could indeed exceed a normal CPU's processing performance. nVidia actually had some discussion about this at a dev meeting recently.
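To make the "general computation unit" point concrete, here's a toy Python sketch of what I mean. It emulates a programmable pixel pipeline as "run this kernel once per pixel over some input textures"; the names (`run_kernel`, `kernel`) are invented for illustration, and real hardware would of course run the per-pixel work in parallel rather than in nested loops.

```python
def run_kernel(kernel, width, height, *input_textures):
    """Emulate a fully programmable pixel pipeline: run `kernel` once
    per pixel of the render target, feeding it the texels read from
    each input 'texture' (a 2-D list of floats) at that position."""
    out = [[0.0] * width for _ in range(height)]
    for y in range(height):
        for x in range(width):
            texels = [t[y][x] for t in input_textures]
            out[y][x] = kernel(x, y, texels)
    return out

# A non-graphics "shader": scale small values, clamp the rest.
# The data-dependent branch is exactly what needs "if + jump"
# support in the shader hardware.
def kernel(x, y, texels):
    v = texels[0]
    return v * 2.0 if v < 0.5 else 1.0
```

Once the per-pixel unit has floats and branching, the "textures" can hold arbitrary data and the "render target" is just the result array, which is why you could push some non-graphics problems through it.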
 
eSa said:
And I also have to disagree about the revolutionary vs. evolutionary part. I'm not an nVidia fan, but this time they really have a chance to make some major progress. If the pixel shaders indeed have "if statement + jump" and fully floating-point operations, we have here the first general computation unit available for per-pixel operations!

I don't know if you were referring to me, but this sounds totally evolutionary to me, not revolutionary whatsoever. You have the GF3, which first implemented limited pixel shader operations, and now the NV30 will expand on them. Just like the GF1 implemented the original T&L unit, introducing the concept of the GPU doing work instead of the CPU, and then the GF3 expanded on this idea with pixel shaders. If you went straight from the TNT to an NV30 part, that'd be revolutionary; going in steps like this is a clear evolution.

But revolutionary sounds better on the box. ;)
 
eSa said:
If the pixel shaders indeed have "if statement + jump"

This would be an interesting thing indeed, but it's not likely to happen.
Not in this generation anyway...
(It's not in DX9.)
 
DemoCoder said:
I bet that collage is a collage of real demo pics from demos that Nvidia made by converting existing 3D scenes they could get from the net.

I bet not: why on earth would this collage hit the internet [from nVidia] before their NV30/Cg PR launch? It doesn't make any sense, so I'm voting for a nice fake (although they may actually be able to do something like this in real time, of course).
 
eSa said:
So, this is basically so-called damage control / counter-propaganda against the r300 launch? nVidia spreads small bits of info to build up hype... Sounds very probable.

Not likely. Probably the nForce2.
 
:eek:
It's not the engine or the graphics card that will make this picture look good if rendered with the NV30. As eSa said, it might be technically possible, so we can rule out those barriers.

It's rather the artist.
IMHO, you (no specific person in mind, but it slowly seems this is the prevailing opinion) can't simply say "Look, only scanline! Scanline = AA, 1-3 omni lights, hi-res textures and hi-res geometry. We can do that too with our graphics card. Just add a shadow map here, a hi-res texture there, and there you go. Welcome, photorealistic images."

I don't know why, maybe someone here can explain it to me, but since the arrival of shaders I hear the term "photorealistic" more and more often in combination with graphics cards. But there has to be someone doing all the work: modeling the geometry instead of mapping it onto a box, simulating GI with more or fewer lights (until it's available in real time, and even then that is no guarantee of spectacular lighting), drawing the specular map, diffuse map, bump map, displacement map, and so on. It's a massive amount of work. And someone has to pay these people.

Assuming this is no problem at all, there is still something left on the road to photorealism: AA, caustics, refraction, reflection, DOF, motion blur. I've probably overlooked something here.
 
MikeC said:
The collage I posted yesterday was sent in by a visitor who said it was publicly available. But I have yet to hear back from him when I asked for the URL.

Yeah, first post the stuff, then check its authenticity. Very good journalism.
 
Well, AFAIK, Nvidia hasn't given this to Mike to post. It's going to be a VERY interesting week for me.
 