TressFX: A New Frontier of Realism in PC Gaming


The in-game built-in benchmark seems lopsided, but who cares about an in-game benchmark. That was to be expected, this being an AMD Gaming Evolved title.

Actual performance while playing the game is pretty much equal between vendors in that review, or in line with how the cards stack up in regular games, even with TressFX enabled.

Good to see that in practice TressFX is GPU-vendor agnostic. But the performance hit is really too severe for one head of hair.
 
Good to see that in practice TressFX is GPU-vendor agnostic. But the performance hit is really too severe for one head of hair.

It all depends on the user's tastes. Remember when 2x AA would virtually halve game performance making it run at 50% of the non-AA speed? Or when proper shadows used to crater performance? HDR, etc.?

I expect that, as with groundbreaking techniques before it, the performance hit will go down as we get more powerful hardware and developers discover shortcuts they can take.

That said, unlike shadows (which I almost always turn off or turn down for performance), I don't think I'd be turning this off. Hair in games has been bugging me for ages and ages now.

I just can't wait until video cards are powerful enough that all NPCs have this sort of hair as well.

Regards,
SB
 
The in-game built-in benchmark seems lopsided, but who cares about an in-game benchmark. That was to be expected, this being an AMD Gaming Evolved title.

Actual performance while playing the game is pretty much equal between vendors in that review, or in line with how the cards stack up in regular games, even with TressFX enabled.

Good to see that in practice TressFX is GPU-vendor agnostic. But the performance hit is really too severe for one head of hair.

I have to disagree with you on the performance-hit front. You can still play the game with TressFX on irrespective of which GPU you are using. You won't get a solid 60 fps for sure, but you can still very much enjoy the game. It's like saying that enabling AA/tessellation/DOF/whatever fancy effect in games eats up too much GPU time and is therefore useless. Who cares, if you still have playable framerates?

Honestly, I am surprised how much hate this technology is getting. It's vendor agnostic, usable on midrange GPUs, and innovative. I would really like to see how people would react if it had been implemented by Nvidia and not AMD. There was no moaning about PhysX performance in Batman even though it literally killed my framerates on a GTX 460.
I think it all boils down to brand strength and brand recognition. AMD has a lot to do in that regard. Building a more loyal fan/customer base should help too.

EDIT: and I have to totally agree with SB's post above.
 
PS: there has been quite a bit of unhappiness wrt PhysX for various reasons

True, but it was mostly about the fact that it was specific to NVIDIA, or that the non-PhysX game was needlessly deprived of some elements (e.g. flags/banners).

I don't remember many complaints about the performance impact on GeForces. And rightly so in my opinion, because the impact wasn't huge for a new feature.
 
True, but it was mostly about the fact that it was specific to NVIDIA, or that the non-PhysX game was needlessly deprived of some elements (e.g. flags/banners).

Yeah, some developers used it well and some didn't.

Metro 2033 for example didn't exclude features in the non-hardware accelerated PhysX (or was it CUDA versus non-CUDA, I can't remember now). They just pared back the effects or reduced their intensity. I didn't have a problem with that.

Batman: AA on the other hand had fog with hardware-accelerated PhysX and no fog otherwise. Um, yeah... Still not buying, and never will buy, the game (and possibly any other game by this developer) due to this and the whole antialiasing shenanigans.

Regards,
SB
 
True, but it was mostly about the fact that it was specific to NVIDIA, or that the non-PhysX game was needlessly deprived of some elements (e.g. flags/banners).

I don't remember many complaints about the performance impact on GeForces. And rightly so in my opinion, because the impact wasn't huge for a new feature.

I don't say the impact is not important; I think they could have reduced it...

http://www.pcper.com/reviews/Graphi...and-PhysX-Comparison-GTX-680-and-HD-7970/GPU-

GTX680 alone.
PhysX LOW : average = 146.5 fps
PhysX Med : average = 90.4 fps
PhysX High : average = 69.1 fps

Going from 146.5 fps to 69.1 fps... it's nearly half the fps. I think people don't complain because they don't want to complain about it. Many of the complaints are coming from Nvidia users, and those are mostly due to the crashes, tessellation problems, and bugs here and there (the drivers are really not optimised yet). So OK, it is not "only hair" that uses PhysX, but it is not like Borderlands 2 has a high graphics cost in itself, when you see it pushes nearly 150 fps without PhysX (on low).
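A quick script over the averages quoted above (just arithmetic on those three numbers, nothing from the article beyond them) shows the relative drop and, maybe more telling, the absolute frame time each setting costs:

```python
# Sanity-check the quoted GTX 680 averages: relative drop and the
# absolute frame-time cost of each PhysX setting versus Low.
fps = {"low": 146.5, "med": 90.4, "high": 69.1}

base = fps["low"]
for setting, value in fps.items():
    drop_pct = (1 - value / base) * 100          # % slower than Low
    cost_ms = 1000.0 / value - 1000.0 / base     # extra ms per frame
    print(f"{setting:>4}: {value:6.1f} fps  "
          f"({drop_pct:4.1f}% drop, +{cost_ms:.1f} ms per frame)")
```

Framed in milliseconds, PhysX High costs about 7.6 ms per frame here; a roughly fixed per-frame cost like that looks like a huge percentage at 146 fps but a much smaller one on a game that only runs at 40 fps to begin with.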

I remember it was the same for most games, where PhysX was just there for some banners and papers flying around. If we look at Batman, outside of the cloth and some other little things, there is not really much more than that.

 
http://www.pcper.com/reviews/Graphi...and-PhysX-Comparison-GTX-680-and-HD-7970/GPU-

GTX680 alone.
PhysX LOW : average = 146.5 fps
PhysX Med : average = 90.4 fps
PhysX High : average = 69.1 fps

Going from 146.5 fps to 69.1 fps... it's nearly half the fps. I think people don't complain because they don't want to complain. There are many Nvidia users on forums who will never complain about anything.

Yeah but 69 FPS is fine. If it's too low for you or your card is not that fast, you can always lower the settings, just like any other setting. The real problem is that you're stuck with very slow CPU physics on an AMD card.
 
True, but it was mostly about the fact that it was specific to NVIDIA, or that the non-PhysX game was needlessly deprived of some elements (e.g. flags/banners).

I don't remember many complaints about the performance impact on GeForces. And rightly so in my opinion, because the impact wasn't huge for a new feature.

On a GTX 285 (GTX 460-level hardware), framerates were cut in half with PhysX set to medium. Setting it to high would cost you about 60% or more of your original frame rate (I had occasional drops to 15-20 fps).
Link.

This is TWICE the performance hit of TressFX (according to the article linked earlier). Even Nvidia was recommending an additional GPU just for the PhysX calculations back then. I do remember I decided to turn the physics effects off because of the performance, even though they were nice. With Borderlands 2, a GTX 680 will lose about 70% of its original performance (there was a pcper article about PhysX performance).
The funny thing is that you are right that people were not complaining about performance with that feature turned on (at least I don't remember many complaints). Yet there are a lot of complaints about TressFX performance.

EDIT: I really need to type my posts faster :D
EDIT2: I think my math is off for the GTX 680, but 60% seems to be about right if you look at the minimum frame rates.
 


I don't say it is better, just that people don't complain because they don't want to. (At the same time, you can always keep the medium PhysX setting.)
 
Yeah but 69 FPS is fine. If it's too low for you or your card is not that fast, you can always lower the settings, just like any other setting. The real problem is that you're stuck with very slow CPU physics on an AMD card.

Well, the average fps is higher than in Tomb Raider, but the minimums are actually lower on the same class of hardware (if you compare the performance hit in both games).
 
PhysX has usually provided demonstrations that make one feel the need to upgrade their GPU. Same here with TressFX.
 
Metro 2033 for example didn't exclude features in the non-hardware accelerated PhysX (or was it CUDA versus non-CUDA, I can't remember now). They just pared back the effects or reduced their intensity. I didn't have a problem with that.

Batman: AA on the other hand had fog with hardware-accelerated PhysX and no fog otherwise. Um, yeah... Still not buying, and never will buy, the game (and possibly any other game by this developer) due to this and the whole antialiasing shenanigans.

Agreed on Metro 2033; they doubled the physics update rate with an Nvidia GPU. Would Nvidia even still allow that with new products? I didn't think they allowed PhysX to be used at all now on just the CPU or with non-Nvidia GPUs.
 
@SB - you would take hair over shadows? That's crazy talk man! Even bald ppl have shadows :LOL:

I will often sacrifice shadow sources or quality as my first means of recovering some fps, but turning them off completely is a throwback to 2002 or something. :p
 
@SB - you would take hair over shadows? That's crazy talk man! Even bald ppl have shadows :LOL:

The whole shadow thing goes back years and years. :) Back when I had to cut something out shadows was the one I could do without the most. Besides back a few years ago all shadows looked horrible, IMO.

Nowadays there are a few other things that get disabled first, DOF for example, but shadows are still high up there. Although now I usually just turn them down to low/medium and that's enough to keep me happy with framerates. :)

But yes, hair I would leave on. It's not quite on the level of annoyance that aliasing is, but it's mighty annoying to me (the "blocks of hair" thing).

I'd imagine that it wouldn't even have as much of a performance hit on most NPCs (who are usually men) as they "usually" won't have flowing locks of hair that require lots of physics and collision detection. :D

Regards,
SB
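The "physics and collision detection" that flowing hair needs can be sketched in miniature. Strand-based hair sims of the TressFX variety treat each strand as a chain of points advanced with Verlet-style integration plus segment-length constraints; the snippet below is a toy single-strand 2D version of that general idea (all constants are made up for illustration), not AMD's actual code:

```python
# Toy single-strand hair sim: Verlet integration of a chain of points
# with fixed segment-length constraints -- the general idea behind
# strand-based hair physics. Illustrative only, not AMD's TressFX code.
GRAVITY = (0.0, -9.8)   # m/s^2
SEG_LEN = 0.1           # rest length of each hair segment (m)
DT = 1.0 / 60.0         # one 60 fps frame
DAMPING = 0.98          # velocity damping, keeps the toy sim stable
ITERATIONS = 4          # constraint-relaxation passes per step

def step(points, prev_points):
    """Advance the strand one frame. points[0] is the pinned root."""
    new_pts = [points[0]]  # root stays attached to the scalp
    for (x, y), (px, py) in zip(points[1:], prev_points[1:]):
        # Verlet: new = pos + damping * (pos - prev) + a * dt^2
        nx = x + DAMPING * (x - px) + GRAVITY[0] * DT * DT
        ny = y + DAMPING * (y - py) + GRAVITY[1] * DT * DT
        new_pts.append((nx, ny))
    # Enforce segment lengths by relaxation, root to tip
    for _ in range(ITERATIONS):
        for i in range(len(new_pts) - 1):
            (x0, y0), (x1, y1) = new_pts[i], new_pts[i + 1]
            dx, dy = x1 - x0, y1 - y0
            dist = (dx * dx + dy * dy) ** 0.5 or 1e-9
            scale = SEG_LEN / dist
            # move only the child point so the root never drifts
            new_pts[i + 1] = (x0 + dx * scale, y0 + dy * scale)
    return new_pts, points

# Start with a horizontal strand and let it swing under gravity
strand = [(i * SEG_LEN, 0.0) for i in range(8)]
prev = list(strand)
for _ in range(120):  # two seconds of simulated time
    strand, prev = step(strand, prev)
print("tip after 2 s:", tuple(round(c, 3) for c in strand[-1]))
```

In a real game, thousands of strands each run a step like this on the GPU every frame, plus collision against the head and shoulders, which is where the frame-time cost comes from.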
 
The whole shadow thing goes back years and years. :) Back when I had to cut something out shadows was the one I could do without the most. Besides back a few years ago all shadows looked horrible, IMO.

Nowadays there are a few other things that get disabled first, DOF for example, but shadows are still high up there. Although now I usually just turn them down to low/medium and that's enough to keep me happy with framerates. :)

But yes, hair I would leave on. It's not quite on the level of annoyance that aliasing is, but it's mighty annoying to me (the "blocks of hair" thing).

I'd imagine that it wouldn't even have as much of a performance hit on most NPCs (who are usually men) as they "usually" won't have flowing locks of hair that require lots of physics and collision detection. :D

Regards,
SB

I was asking myself how it would look on different characters. Loose locks are a bit easier to do, say long, unattached hair... semi-short hair should be easy too, but with long hair it shouldn't be so simple to keep a hairstyle that looks good. Tomb Raider was surely a good choice to start with.
 
I'd imagine that it wouldn't even have as much of a performance hit on most NPCs (who are usually men) as they "usually" won't have flowing locks of hair that require lots of physics and collision detection. :D

Regards,
SB

Just wait until a JRPG uses TressFX… :p
 
I fear that no GPU could handle that level of hairiness. We need 14 nm. Maybe GPUs could be designed with hair in mind. UltraHair. Like double Z in a way. ;)
 