New hair simulation demo

Amazing, Humus.
Your hair simulation looks quite similar to the dynamic hair in Poser 6.
But you do it in real time, while rendering even a single frame is damn slow in that software package.

Of course that balloon makes collision detection a lot easier, but still I wish I could use this in Poser :)
 
Ghost of D3D said:
I have found out that Humus works for ATI. Is that what you do at ATI, Humus, "pushing the envelope" of the latest ATI video cards? You have an abundance of demos on your site that I wish I had the time to check out!

Well, my demos aren't that much different from the samples I write for the ATI SDK. The biggest difference is that code that goes into the SDK is tested more and is much better commented and documented. ;)
 
Mendel said:
I wanted to say this to you before but only now I feel justified enough:
"Never let the coder do the art"

Pffft, what's wrong with a balloon? Perfect approximation of a head. ;)

No, seriously, better artist work would have been nice. I don't have the skills myself, though I could make something better than a balloon if I spent some time on it. Of course, that would also have meant harder work in the shader to get the collision detection right. I already had to spend some time shoehorning it into 64 instructions so it would run on the R300 as well. I think it ended up at 62 instructions in the end (only 45 when compiled as ps3.0, though).
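For readers wondering why a balloon makes the shader so cheap: constraining a hair vertex against a sphere is just one distance check and a push to the surface. The sketch below is my own illustration (names like `constrainToSphere` are mine, not from the demo's code) of that per-vertex constraint as it might run after the physics step.

```cpp
#include <cmath>

// Minimal sketch of a sphere collision constraint for a hair vertex.
// This is an illustration, not the demo's actual shader code: on a sphere
// the whole test is one distance compare plus a renormalization, which is
// why it fits comfortably in R300's 64-instruction pixel shader limit.
struct Vec3 { float x, y, z; };

Vec3 constrainToSphere(Vec3 p, Vec3 center, float radius)
{
    float dx = p.x - center.x;
    float dy = p.y - center.y;
    float dz = p.z - center.z;
    float dist = std::sqrt(dx * dx + dy * dy + dz * dz);
    if (dist >= radius || dist == 0.0f)
        return p;                    // already outside the head: untouched
    float scale = radius / dist;     // push the point out to the surface
    return { center.x + dx * scale,
             center.y + dy * scale,
             center.z + dz * scale };
}
```

A more detailed head mesh would need several such primitives (or a distance texture), which is where the extra shader work mentioned above would come from.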
Initially I was going to do a Pac-Man, but I couldn't find a good texture for it. I found this face texture, though, so I went with that instead.
 
Ali said:
Wow! ~100 fps at 1280x768 full screen on my laptop's Mobility Radeon 9700! I thought I was too far behind the curve 3D-wise to run your demos nowadays.

Very nice looking hair. Is there a way to increase and decrease the amount of hair the balloon head has? Changing its length could be fun too.

Ali

You can do it in code by modifying the STRAND_COUNT and STRAND_LENGTH #defines in App.h. There's no way to adjust it dynamically at runtime, though.
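For anyone poking at the source: the reply above names the two #defines in App.h. The values below are placeholders of my own for illustration; the demo's shipped defaults may differ.

```cpp
// App.h -- placeholder values, not the demo's actual defaults.
// Rebuild after editing; there is no runtime control for these.
#define STRAND_COUNT  2048   // number of hair strands
#define STRAND_LENGTH 16     // vertices per strand
```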
 
Ghost of D3D said:
I'm quite sure there's a reason you used the ATI icon for your post!

Well, R2VB is currently an ATI-only feature, and I've got an ATI board...

It doesn't look like there will be support for R2VB through OpenGL (at least for now), so I guess I'll just have to finish my DX renderer to play with it :)

btw, nice demo. I get ~1100 FPS here on my X1900XT (default viewport).
 
I think R2VB was once considered in OpenGL with superbuffers, but it was dropped for some reason. ATI had some demos showing R2VB functionality in OpenGL through their superbuffer implementation in the drivers.
 
tEd said:
I think R2VB was once considered in OpenGL with superbuffers, but it was dropped for some reason.
Probably it was dropped in favour of Vertex Fetch 4 (that may be a typo, since I can't remember well, sorry) ;) It's just my belief.

How can I run this demo on a laptop with a Mobility Radeon 9600 (NC6000)? Do I need to install .NET if I want to install only the driver, not the CCC? Sorry again if OT.
 
satein said:
Probably it was dropped in favour of Vertex Fetch 4 (that may be a typo, since I can't remember well, sorry) ;) It's just my belief.

How can I run this demo on a laptop with a Mobility Radeon 9600 (NC6000)? Do I need to install .NET if I want to install only the driver, not the CCC? Sorry again if OT.

.NET is only needed for the CCC, not the drivers.
 
tEd said:
I think R2VB was once considered in OpenGL with superbuffers, but it was dropped for some reason.

Yeah, it didn't go over so well with the ARB (no idea why, though). After many rounds and lots of delay, it was ultimately replaced by FBOs, which didn't have any R2VB capabilities.
 
satein said:
Probably it was dropped in favour of Vertex Fetch 4 (that may be a typo, since I can't remember well, sorry) ;) It's just my belief.

You're confusing Vertex Texture Fetch and Fetch4; the latter is totally unrelated. VTF could be used to implement the same thing, but it requires additional hardware, whereas R2VB only needs a driver update (assuming the card can use a linear memory layout for render targets).
 
Under OpenGL it is possible to do a kind of R2VB with the Pixel Buffer Object extension (which is not at all the same as the better-known Framebuffer Object!). However, since this approach requires a data copy (within the GPU's memory; a full CPU round trip should not be needed), it will offer a bit less performance than what should be attainable with more "direct" R2VB.
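That PBO approach can be sketched as the following call sequence (my own illustration, assuming ARB_pixel_buffer_object and a float RGBA render target that holds the computed vertex positions; context setup and error handling omitted):

```cpp
// "R2VB via PBO": after rendering the new vertex positions into a float
// framebuffer, pack them into a buffer object. With a pixel-pack PBO
// bound, the last glReadPixels argument is an offset into the buffer,
// and the copy can stay in video memory -- this is the extra copy
// mentioned above that "direct" R2VB avoids.
glBindBuffer(GL_PIXEL_PACK_BUFFER_ARB, vertexBuffer);
glReadPixels(0, 0, width, height, GL_RGBA, GL_FLOAT, (GLvoid*)0);
glBindBuffer(GL_PIXEL_PACK_BUFFER_ARB, 0);

// Rebind the very same buffer object as a vertex array and draw from it.
glBindBuffer(GL_ARRAY_BUFFER, vertexBuffer);
glVertexPointer(4, GL_FLOAT, 0, (const GLvoid*)0);
glDrawArrays(GL_POINTS, 0, width * height);
```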
 
arjan de lumens said:
Under OpenGL it is possible to do a kind of R2VB with the Pixel Buffer Object extension (which is not at all the same as the better-known Framebuffer Object!). However, since this approach requires a data copy (within the GPU's memory; a full CPU round trip should not be needed), it will offer a bit less performance than what should be attainable with more "direct" R2VB.

Very interesting, thanks :)
 
Just out of interest: how much input do you reckon your demos have had on the industry? I was just thinking there may have been some kind of connection...

I mean, for one, you had those very nice bump mapping demos a while back (maybe two years ago), when I probably still had a 9700.

Well, anyway, it seems such techniques are starting to get used elsewhere, but intriguingly I see it happening where ATI is involved. Namely, there is the new ATI Toyshop demo, with the taxi cab and the brick wall with the bump mapping. Did you have any input on that?

The other, more far-fetched case is that of some 360 games like Perfect Dark. They're using similar bump mapping, and I was thinking maybe they got some input from ATI, which got the idea from you.

Or maybe they independently thought it a good idea to use such technology.

Just guessing here :)

Anyways, I've enjoyed the demos over the years, even with the simplistic artwork, because they're usually focusing on a specific, interesting feature.
 
Mendel said:
How much input do you reckon your demos have had on the industry? I was just thinking there may have been some kind of connection...

Well, do I make a difference out there? Yes. How much? I have no idea, really. I know there are developers who download my demos and play around with them, mostly home coders, but big developers too. Occasionally I get a job offer in my mailbox; several have been from big, well-known game companies. I've also been in meetings with developers where someone would refer to "this Humus guy" without knowing I was in the room. :cool:

Mendel said:
Namely, there is the new ATI Toyshop demo, with the taxi cab and the brick wall with the bump mapping. Did you have any input on that?

None of the big demos have had any input from me. It's developed by another team.

Mendel said:
The other, more far-fetched case is that of some 360 games like Perfect Dark. They're using similar bump mapping, and I was thinking maybe they got some input from ATI, which got the idea from you.

Or maybe they independently thought it a good idea to use such technology.

Are you referring to any particular bump mapping effect, or just plain bump mapping? I certainly didn't invent bump mapping; it's like 25 years old or something.
 
Humus said:
Well, do I make a difference out there? Yes. How much? I have no idea really.
I have it on record that a game programmer I correspond with has mentioned your name (your real name, which I only recently found out) as someone who had given him ideas. Not sure if this means you "make a difference", but it is evidence that your work has been, and is, at the very least, noticed.

Just keep churning out those demos. Heck, to ensure your impact, make movies of demos that will only run using next-gen-to-be-released ATI hardware! :p

Occasionally I get a job offer in my mailbox; several have been from big, well-known game companies.
I'd be interested to know your reaction/response to such offers. As I understand it, you don't shape the next-gen ATI hardware; you show what it's capable of via demos (don't take this the wrong way, and if I'm wrong, I apologize). Whereas if you worked for a game developer, you might shape how games are made in the future (via some incredibly clever use of hardware/API features).

I've read a lot of reports of hardware guys migrating to software (game development) jobs and vice versa, but I've never read any really interesting reason why such folks did it.

Sorry for the OT.
 
Humus said:
Are you referring to any particular bump mapping effect, or just plain bump mapping? I certainly didn't invent bump mapping; it's like 25 years old or something.


I was referring to this kind of bump mapping :)
(attached image: selfshadowbumpmapping.jpg)


I used to do this test with friends who hadn't seen the demo before. I asked them: is that geometry or bump mapping? They would always say it's real polygons. I moved a bit closer to the surface and asked... and now? "Still polygons."

I actually had to move very carefully, to the point where I was only just barely on the correct side of the plane with the texture still visible, to convince them that it is indeed "just" bump mapping :)
 
Humus said:
I've also been in meetings with developers where someone would refer to "this humus guy" without knowing I was in the room. :cool:

That is cool.

The demos definitely have an industry-wide impact. I think that's mostly because you are an early implementer of a lot of contemporary features/techniques (3Dc, parallax mapping, alpha-to-coverage, etc.). It's just a shame that your most famous work will now be this balloon thing, simply because it's one of the ugliest creations ever to grace our screens.

;)
 