Cascades - NVIDIA's first DX10 demo available

Isn't that Cascades demo just one rock model with some water? Seems like a far cry from an actual game. It's actually pretty surprising how slow it runs...
 
Except that it's procedurally-generated rock, with the physics of a large number of water particles done on the graphics card.
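
To make that second point concrete: one common way D3D10 keeps particle physics entirely on the GPU is to run the simulation in a vertex/geometry shader and capture the results with stream output, ping-ponging between two buffers. Below is a minimal C++ sketch of one simulation step; the buffer names are illustrative, the particle shaders are assumed to be created and bound elsewhere, and this shows the general technique rather than how the Cascades demo specifically implements it.

[code]
// One simulation step of a GPU particle system using D3D10 stream output.
// Assumes: the particle vertex/geometry shaders are already bound, the pixel
// shader is NULL (nothing is rasterized in this pass), and srcBuf/dstBuf were
// created with D3D10_BIND_VERTEX_BUFFER | D3D10_BIND_STREAM_OUTPUT.
#include <d3d10.h>

void SimulateParticles(ID3D10Device* dev,
                       ID3D10Buffer* srcBuf,  // particles from the previous frame
                       ID3D10Buffer* dstBuf,  // receives the updated particles
                       UINT stride)           // size of one particle vertex
{
    UINT offset = 0;

    // Feed last frame's particles back in as ordinary point vertices.
    dev->IASetVertexBuffers(0, 1, &srcBuf, &stride, &offset);
    dev->IASetPrimitiveTopology(D3D10_PRIMITIVE_TOPOLOGY_POINTLIST);

    // Capture whatever the geometry shader emits into dstBuf instead of drawing it.
    dev->SOSetTargets(1, &dstBuf, &offset);

    // DrawAuto reuses the vertex count recorded by the previous stream-output
    // pass, so the CPU never needs to know how many particles are alive.
    // (The very first frame would seed the system with a normal Draw call.)
    dev->DrawAuto();

    // Unbind the stream-output target so dstBuf can be used as input next frame.
    ID3D10Buffer* nullBuf = NULL;
    dev->SOSetTargets(1, &nullBuf, &offset);

    // Next frame, swap srcBuf and dstBuf and repeat.
}
[/code]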
 
I think a more important question is: will G80 and R600 be fast enough to properly make use of any of the major innovations DX10 brings, or are they just high-speed DX9 chips with checkbox features? I mean, if that Cascades demo runs at 20 to 40 fps, I don't see G80 having the power to run an actual DX10 game.

Once more, I feel it's necessary to remind people that DX10 isn't about "new featureset" so much as it's about new efficiency enhancements. DX10 gives you better ways to get more done without as much CPU (and even GPU) overhead as before. The new cards have far more processing power than anything we've had before, and are being coupled with a far more efficient graphics API than we've had before. This isn't like DX8 -> DX9...

Now don't take this to mean that all DX10 cards will be monsters of performance. There are still "new" DX9 cards today that aren't monsters; there will always be room for low-end. However, I'm not worried at all about the G80 or R600 having "checkbox-only" featuresets for DX10.
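
As a concrete example of the kind of efficiency change being described, here is a rough C++ sketch of how per-object shader constants are handled in D3D10: they live in a constant buffer that is updated with one Map/Unmap and bound with one call, rather than being pushed register by register as in D3D9. The PerObject layout is illustrative, and the buffer is assumed to have been created elsewhere with D3D10_USAGE_DYNAMIC and CPU write access.

[code]
// Updating and binding per-object constants as a single block in D3D10.
#include <d3d10.h>

struct PerObject              // must match the cbuffer layout on the HLSL side
{
    float worldViewProj[16];  // 4x4 matrix
    float tint[4];
};                            // 80 bytes, a multiple of 16 as D3D10 requires

void DrawOneObject(ID3D10Device* dev, ID3D10Buffer* cbPerObject,
                   const PerObject& data, UINT indexCount)
{
    // One Map writes every per-object constant at once.
    void* mapped = NULL;
    if (SUCCEEDED(cbPerObject->Map(D3D10_MAP_WRITE_DISCARD, 0, &mapped)))
    {
        *static_cast<PerObject*>(mapped) = data;
        cbPerObject->Unmap();
    }

    // One bind exposes the whole block to the vertex shader (register b0).
    dev->VSSetConstantBuffers(0, 1, &cbPerObject);

    // Assumes the input layout, vertex/index buffers and shaders are already set.
    dev->DrawIndexed(indexCount, 0, 0);
}
[/code]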
 
That was very interesting. Thanks!

So, the way I understand this, this could be used in a real game to create a large world from a mathematical function without the CPU having to really care about it. Are those things that would typically consume a lot of CPU cycles, PCIe bandwidth and/or CPU memory? Will it free up the CPU to do other stuff in a significant way, or will the overall impact be fairly minimal?

Well, you could. The question is whether you want to do that to any larger extent. Procedurally generated art tends to look pretty bland. This is not exactly a new concept either; the difference is that you generate it on the GPU, and to some degree you could do this already in DX9. There's also the problem of physics and game feedback, which is not necessarily trivial for real games. DX10 will probably bring a number of games that use procedural geometry in one way or another, but I don't exactly think it'll replace artwork. Mostly I expect it to be used for certain effects like particle systems.
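
For what "a large world from a mathematical function, without the CPU caring" can look like in practice: D3D10 lets you issue a draw with no vertex buffer bound at all and derive every vertex from SV_VertexID inside the shader. The sketch below does that for a grid of points; GRID_SIZE, HeightField and gViewProj are illustrative names, and shader compilation and constant setup are omitted.

[code]
// Drawing a procedurally generated height-field grid with no vertex data at all.
#include <d3d10.h>

// HLSL for the vertex shader (compiled elsewhere with profile vs_4_0).
static const char g_proceduralVS[] =
    "#define GRID_SIZE 256                                               \n"
    "float4x4 gViewProj;                                                 \n"
    "                                                                    \n"
    "float HeightField(float2 p)                                         \n"
    "{                                                                   \n"
    "    // Any analytic function works here; this one is a placeholder. \n"
    "    return 8.0f * sin(p.x * 0.1f) * cos(p.y * 0.1f);                \n"
    "}                                                                   \n"
    "                                                                    \n"
    "float4 main(uint id : SV_VertexID) : SV_Position                    \n"
    "{                                                                   \n"
    "    // Recover grid coordinates purely from the vertex index.       \n"
    "    float2 xz  = float2(id % GRID_SIZE, id / GRID_SIZE);            \n"
    "    float3 pos = float3(xz.x, HeightField(xz), xz.y);               \n"
    "    return mul(float4(pos, 1.0f), gViewProj);                       \n"
    "}                                                                   \n";

void DrawProceduralGrid(ID3D10Device* dev)
{
    // No vertex buffer and no input layout: the shader invents the geometry.
    dev->IASetInputLayout(NULL);
    dev->IASetPrimitiveTopology(D3D10_PRIMITIVE_TOPOLOGY_POINTLIST);
    dev->Draw(256 * 256, 0);   // 256x256 points, all generated on the GPU
}
[/code]

A real terrain would draw triangles and add normals and texturing, but the CPU-side cost stays essentially the same: one draw call and no geometry upload.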

If I understand this correctly, instancing in DX9 is pretty limited in what it can do: you send over the definition of an object, then a list of parameters that slightly modify it to create different versions. But I think the type of modification is relatively fixed across all instances; it's not programmable. With DX10 you have a lot more flexibility in how you manipulate that object data.

I don't know what those limitations in DX9 would be; it's definitely programmable. Instancing does become more useful in DX10, but that's more thanks to the GS and other features than to any particular upgrade to instancing capabilities. Off the top of my head, the only new thing is the SV_InstanceID system value you can use in the shader.
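
For reference, this is roughly what an instanced draw looks like at the D3D10 API level, with the per-instance data in a second vertex stream and SV_InstanceID available in the vertex shader. The buffer names are illustrative, and the input layout (which marks the second stream's elements as D3D10_INPUT_PER_INSTANCE_DATA) is assumed to be created elsewhere.

[code]
// Drawing many copies of one mesh with a single D3D10 instanced draw call.
#include <d3d10.h>

void DrawInstancedMesh(ID3D10Device* dev,
                       ID3D10Buffer* meshVB,      // shared geometry
                       ID3D10Buffer* instanceVB,  // one small struct per instance
                       ID3D10Buffer* meshIB,
                       UINT indexCount, UINT instanceCount,
                       UINT meshStride, UINT instanceStride)
{
    ID3D10Buffer* vbs[2]     = { meshVB, instanceVB };
    UINT          strides[2] = { meshStride, instanceStride };
    UINT          offsets[2] = { 0, 0 };

    // Slot 0 advances per vertex, slot 1 per instance; which is which is
    // decided by the input layout's D3D10_INPUT_PER_INSTANCE_DATA elements.
    dev->IASetVertexBuffers(0, 2, vbs, strides, offsets);
    dev->IASetIndexBuffer(meshIB, DXGI_FORMAT_R32_UINT, 0);
    dev->IASetPrimitiveTopology(D3D10_PRIMITIVE_TOPOLOGY_TRIANGLELIST);

    // One call draws every copy. In the vertex shader, SV_InstanceID tells you
    // which copy is being processed, so the per-instance modification can be
    // as programmable as the shader wants.
    dev->DrawIndexedInstanced(indexCount, instanceCount, 0, 0, 0);
}
[/code]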
 
Seems to me an ideal application for procedural generation is foliage. Oblivion currently does this to some degree.
 
The primary difference being that it is generally frowned upon to show the former to the world in public. :???: Please use the more polite form of this in future. (elbows, noses, bellybuttons).

But that would take away its meaning. Besides, most of us here are adults, aren't we?
 
It's chugging like mad on the one I bought today (Asus 8800GTX).
Hmmm, odd. Mine's not overclocked or anything. Perhaps I just have a lower framerate tolerance than you. But on my system, though the framerate is somewhat low, there is no stuttering at all (well, unless I get a screen full of particles from the falling water, but that doesn't happen often).
 

I think I've tracked down the problem, actually. All my D3D performance is way too low (including 3DMark06). I think the problem is that my power supply doesn't provide enough juice for the 8800: when I tried swapping the two 6-pin connectors, the system simply refused to boot, so it looks like at least one of my lines is not good enough for the card. I had a spare PSU, but it lacks the SATA cables needed for my drives, so I'll go buy a new PSU on Thursday.
 
I swapped my PSU today for a Corsair 620W, and the framerates in GPU-limited situations basically quadrupled. :) Cascades runs very smoothly now.
 
I'm happy for Corwin_B of course, but I wonder what happened with his system.

Did the video card automatically clock the GPU down when it noticed the PSU didn't give enough power?
Then why didn't it give a proper notification of that? I don't want cards doing such things without notification.
 
There should be some warning from the hardware monitor that something's not quite right, so it's rather odd that never happened. Something for the driver team to take a look at; I'll point them here so they can see.
 

Where is the HW monitor? I turned off the motherboard HW monitor (very buggy; Asus makes some great hardware but lousy software, IMHO), but I don't know if the Detonator drivers include some sort of hardware monitoring feature (I'm running Vista x64). Card is an Asus EN8800GTX, and I'm using Detonators 100.65.
 
Yeah, it's part of the driver (or at least should be, even in Vista x64). If it's something they've moved out to nTune (it's called the NVIDIA Sentinel if I remember rightly) then that's cause for concern. Basic hardware diagnostics shouldn't be part of optional software that might not even work on your system.
 
"nVidia Monitor" now, and yes it's in nTune. I think I understand why NV made that choice, but I think experience has shown its the wrong one and it is time to backtrack.
 
Well, I have a lower-wattage PSU than 620W and a pretty high-end rig, and my framerates are consistently smooth, so it's more than likely a driver or graphics-settings issue. I must say the tech demo is very beautiful, and I'm intrigued by what it offloads to the GPU and how well.

I would be interested if someone could let me know how SLI would balance these calculations across the 2 cards... if indeed it does.
 
Not all power supplies are made equal, GM, not even those with similar ratings. Power supplies also do go bad from time to time.
 