New raytraced shadows demo

Jallen said:
Humus said:
Yeah, I've heard that from other 6800 owners too. * Waits for Ruined to pull his usual rant *
No shadows would most likely mean that the shadow shader writes 1 to alpha for some reason.
Change line 59 in OpenGLApp.cpp to this:

Code:
		pf.alphaBits, 0,
otherwise you won't get an alpha channel in the pixelformat.

Hmm, that's obviously an error. Thanks for pointing it out. Still, that shouldn't matter unless the driver doesn't support WGL_ARB_pixel_format (and I'd be very surprised if NVIDIA didn't), since that descriptor is only used to create a temporary context which is later destroyed; a new context is then created using WGL_ARB_pixel_format, which properly sets the alpha:
Code:
		WGL_ALPHA_BITS_ARB,     pf.alphaBits,

Did this fix it for nVidia cards?

Edit: Uploaded a new version with this change. Tell me if it works or not now.
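For anyone following along, here's a rough sketch of the two paths being discussed. The pf.* fields match the snippets above; PixelFormat and the function names are stand-ins for whatever the framework actually uses, and the wglChoosePixelFormatARB pointer is assumed to have been fetched already (wglext.h provides the tokens and typedef):

Code:
#include <windows.h>
#include <GL/gl.h>
#include "wglext.h"  // WGL_*_ARB tokens, PFNWGLCHOOSEPIXELFORMATARBPROC

struct PixelFormat { int colorBits, alphaBits, depthBits; };  // stand-in

// Path 1: the classic descriptor used for the temporary context.
// Leaving cAlphaBits at 0 here is what Jallen's line-59 fix addresses.
int chooseClassicFormat(HDC hdc, const PixelFormat &pf)
{
    PIXELFORMATDESCRIPTOR pfd = {};
    pfd.nSize      = sizeof(pfd);
    pfd.nVersion   = 1;
    pfd.dwFlags    = PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL | PFD_DOUBLEBUFFER;
    pfd.iPixelType = PFD_TYPE_RGBA;
    pfd.cColorBits = (BYTE) pf.colorBits;
    pfd.cAlphaBits = (BYTE) pf.alphaBits;  // the fix: request destination alpha
    pfd.cDepthBits = (BYTE) pf.depthBits;
    return ChoosePixelFormat(hdc, &pfd);
}

// Path 2: the ARB attribute list used for the real context, where
// WGL_ALPHA_BITS_ARB properly requests the alpha channel.
int chooseARBFormat(HDC hdc, const PixelFormat &pf,
                    PFNWGLCHOOSEPIXELFORMATARBPROC wglChoosePixelFormatARB)
{
    const int attribs[] = {
        WGL_DRAW_TO_WINDOW_ARB, GL_TRUE,
        WGL_SUPPORT_OPENGL_ARB, GL_TRUE,
        WGL_DOUBLE_BUFFER_ARB,  GL_TRUE,
        WGL_PIXEL_TYPE_ARB,     WGL_TYPE_RGBA_ARB,
        WGL_COLOR_BITS_ARB,     pf.colorBits,
        WGL_ALPHA_BITS_ARB,     pf.alphaBits,
        WGL_DEPTH_BITS_ARB,     pf.depthBits,
        0
    };
    int format = 0;
    UINT count = 0;
    wglChoosePixelFormatARB(hdc, attribs, NULL, 1, &format, &count);
    return count ? format : 0;
}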
 
Ostsol said:
Speaking of NVNews... it's amazing the amount of flak Humus gets there when a demo doesn't perform better on NVIDIA hardware... (Also interesting how Ruined basically cuts and pastes his posts between forums.)

Tell me about it. I skipped posting this demo there this time around. Can't be bothered to deal with it another time.
 
Humus said:
Did this fix it for nVidia cards?
Yes, changing the pfd to include alpha bits makes it run correctly on my 6800.

Humus said:
Edit: Uploaded a new version with this change. Tell me if it works or not now.
Doesn't work for me; my manually modified version does.

I believe line 86 needs to be changed to this:

Code:
initEntryPoints(hPFwnd, pfd);
Changing that or the pfd fixes the problem for me.
 
Man, how long have I had that bug in my code? :oops: Can't believe it has worked anyway. I'm sending the wrong window handle, thus initializing the pixel format on my main window rather than on the temporary window. Later on I specify a pixel format for that main window again, which you can't do, so I guess the first selected format is the one that gets chosen, and the temporary window serves no purpose whatsoever. Thanks for spotting that.

I uploaded yet another version with that fix too. Now it should work, unless there were more bugs you fixed. :p
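For reference, a minimal sketch of the temporary-window dance being described. Only hPFwnd is taken from the posts above; everything else is illustrative, not the framework's actual code. SetPixelFormat may only be called once per window, hence the throwaway window:

Code:
#include <windows.h>
#include <GL/gl.h>
#include "wglext.h"

// A throwaway window absorbs the classic ChoosePixelFormat format; the real
// window stays untouched until wglChoosePixelFormatARB has picked a proper
// format for it.
PFNWGLCHOOSEPIXELFORMATARBPROC initEntryPoints()
{
    HWND hPFwnd = CreateWindow("STATIC", "PFrmt", WS_POPUP, 0, 0, 8, 8,
                               NULL, NULL, GetModuleHandle(NULL), NULL);
    HDC hdc = GetDC(hPFwnd);

    PIXELFORMATDESCRIPTOR pfd = {};
    pfd.nSize      = sizeof(pfd);
    pfd.nVersion   = 1;
    pfd.dwFlags    = PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL | PFD_DOUBLEBUFFER;
    pfd.iPixelType = PFD_TYPE_RGBA;
    pfd.cColorBits = 32;
    SetPixelFormat(hdc, ChoosePixelFormat(hdc, &pfd), &pfd);

    // A context must be current before wglGetProcAddress returns anything useful.
    HGLRC hglrc = wglCreateContext(hdc);
    wglMakeCurrent(hdc, hglrc);

    PFNWGLCHOOSEPIXELFORMATARBPROC choose = (PFNWGLCHOOSEPIXELFORMATARBPROC)
        wglGetProcAddress("wglChoosePixelFormatARB");

    // Tear the temporary window down; the main window's format is still
    // unset, so the ARB-selected format can now be applied to it.
    wglMakeCurrent(NULL, NULL);
    wglDeleteContext(hglrc);
    ReleaseDC(hPFwnd, hdc);
    DestroyWindow(hPFwnd);
    return choose;  // may be NULL on drivers without the extension
}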
 
Humus said:
Hmm, that's obviously an error. Thanks for pointing it out. Still, that shouldn't matter unless the driver doesn't support WGL_ARB_pixel_format (and I'd be very surprised if NVIDIA didn't), since that descriptor is only used to create a temporary context which is later destroyed; a new context is then created using WGL_ARB_pixel_format, which properly sets the alpha.
Nope, on the 61.77 drivers in Win2k, there is no support for WGL_ARB_pixel_format.
 
Runs fine with working shadows on my 6800GT. Full screen 1280x960 at a constant 75 fps (I can't get vsync to turn off in the demo, even though it's disabled in the driver settings).
 
Chalnoth said:
Humus said:
Hmm, that's obviously an error. Thanks for pointing it out. Still, that shouldn't matter unless the driver doesn't support WGL_ARB_pixel_format (and I'd be very surprised if NVIDIA didn't), since that descriptor is only used to create a temporary context which is later destroyed; a new context is then created using WGL_ARB_pixel_format, which properly sets the alpha.
Nope, on the 61.77 drivers in Win2k, there is no support for WGL_ARB_pixel_format.

Been supported in NVIDIA drivers for quite a while: http://www.delphi3d.net/hardware/extsupport.php?extension=WGL_ARB_pixel_format

How are you checking for support?
 
I used SiSoft's utility for checking extensions, which should just be looking at the entire extension string. I suppose it's possible that it's truncating the string, so I could probably run the code that I wrote in Linux for the same purpose...

Edit:
Ah, never mind. I'm willing to bet that SiSoft's utility is not using WGL_ARB_extensions_string to check for WGL extensions.
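For what it's worth, that would explain it: WGL extensions live in a separate string that has to be fetched through WGL_ARB_extensions_string, so a tool that only reads glGetString(GL_EXTENSIONS) will never see WGL_ARB_pixel_format. A sketch of the proper check (helper name made up, requires a current context):

Code:
#include <windows.h>
#include <GL/gl.h>
#include <string.h>

typedef const char *(WINAPI *PFNWGLGETEXTENSIONSSTRINGARBPROC)(HDC hdc);

// Returns true if 'name' appears as a complete token in the WGL
// extension string.
bool hasWGLExtension(HDC hdc, const char *name)
{
    PFNWGLGETEXTENSIONSSTRINGARBPROC wglGetExtensionsStringARB =
        (PFNWGLGETEXTENSIONSSTRINGARBPROC)
            wglGetProcAddress("wglGetExtensionsStringARB");
    if (wglGetExtensionsStringARB == NULL) return false;

    const char *ext = wglGetExtensionsStringARB(hdc);
    const size_t len = strlen(name);
    while ((ext = strstr(ext, name)) != NULL) {
        // Guard against substring hits like WGL_ARB_pixel_format_float.
        if (ext[len] == ' ' || ext[len] == '\0') return true;
        ext += len;
    }
    return false;
}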
 
Humus said:
I assume you installed DX9.0c before the official release.

Hm. No, actually I did not. :) I prefer to spare myself such hassle; since I didn't have a DX9 card until last week, there didn't seem to be any point in faffing with beta versions of something as critical as DX. Either I got the C version off Windows Update (which is where I get all my patches), or if it wasn't from there, it was off the retail Doom 3 CD... :)

Anyway, the raytracing demo was really cool, and it ran really fast too. Good job man. :)
 
Humus, I can't see the light source on my 6800NU, but I can see the light.

(screenshot attached: Screenshot00.jpg)
 
Humus said:
And I'm hardly even joking, kicking around those balls can be an incredibly fascinating thing. I've spent too much time doing that today when I should be tuning the code ... :p

tEd said:
how can i interact with the balls?


:LOL: :LOL:


Sorry.... SORRY I COULDN'T RESIST!!!!!!!!!
 
pat777 said:
Humus, I can't see the light source on my 6800NU, but I can see the light.
The light source doesn't show up on NVIDIA GPUs because the texture combine mode is set to modulate the primary color with the texture. NVIDIA's GLSL implementation aliases the primary color with generic vertex attribute 3, which in this app contains the binormal, and the last value in that vertex array is (0, -1, 0).

To fix the problem, set the primary color to (1, 1, 1) in MainApp::drawLight or change the texenv mode to GL_REPLACE before drawing the light.
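In code, the two workarounds would look something like this (a sketch; only MainApp::drawLight is from the actual source, and the geometry call is a hypothetical stand-in):

Code:
#include <GL/gl.h>

void MainApp::drawLight()
{
    // Workaround 1: pin the primary color so GL_MODULATE multiplies by white.
    glColor3f(1.0f, 1.0f, 1.0f);

    // Workaround 2: or bypass the primary color entirely; GL_REPLACE outputs
    // the texture color as-is, ignoring whatever the attribute aliasing left
    // in the primary color.
    glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_REPLACE);

    drawLightBillboard();  // hypothetical stand-in for the light geometry
}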
 
london-boy said:
Humus said:
And I'm hardly even joking, kicking around those balls can be an incredibly fascinating thing. I've spent too much time doing that today when I should be tuning the code ... :p

tEd said:
how can i interact with the balls?


:LOL: :LOL:


Sorry.... SORRY I COULDN'T RESIST!!!!!!!!!

You're familiar with interacting with the balls, eh? :D
 
Jallen said:
pat777 said:
Humus, I can't see the light source on my 6800NU, but I can see the light.
The light source doesn't show up on NVIDIA GPUs because the texture combine mode is set to modulate the primary color with the texture. NVIDIA's GLSL implementation aliases the primary color with generic vertex attribute 3, which in this app contains the binormal, and the last value in that vertex array is (0, -1, 0).

To fix the problem, set the primary color to (1, 1, 1) in MainApp::drawLight or change the texenv mode to GL_REPLACE before drawing the light.

THAT, however, is not my fault, but a bug in NVIDIA's drivers. The spec clearly states that there's no aliasing between generic and conventional attributes. But I added the fix anyway. The latest version should now work properly on NVIDIA too.
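To illustrate the aliasing being described, a sketch with made-up names ('program' and 'binormals' are not the demo's actual identifiers), assuming the GL 2.0 entry points have been loaded:

Code:
#include <GL/gl.h>
#include <GL/glext.h>  // GL 2.0 names; entry points assumed loaded

void setupBinormals(GLuint program, const float *binormals)
{
    // Per the spec, generic attribute 3 and the conventional primary color
    // are distinct, so this binding should be harmless.
    glBindAttribLocation(program, 3, "binormal");
    glLinkProgram(program);

    glVertexAttribPointer(3, 3, GL_FLOAT, GL_FALSE, 0, binormals);
    glEnableVertexAttribArray(3);
    // On the buggy drivers the two aliased, so after drawing the model the
    // fixed-function primary color held the last binormal, (0, -1, 0),
    // which then modulated the light's texture down to black.
}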
 