FC performance on ps3.0

Enbar said:
Ostsol said:
Well, texture sizes are completely irrelevant when it comes to shader versions

That statement is nearly correct, but not quite. With ps2.0 you have the math necessary to easily generate the third component of a tangent-space normal map from a two-component texture (this is the whole idea behind ATI's new two-component normal map texture compression). With this in mind, you could say that having more powerful pixel shaders enables you to use smaller textures at the same or better quality.
True, but unrelated to the post I was replying to, which mentioned larger textures as a result of using SM 3.0. As for a texture's size in memory, I agree with what you say, but only up to SM 2.0. I can't see how SM 3.0 would benefit any more than 2.0 from two-component vector maps.
 
You don't even need a pixel shader to do it on older NV chips. They supported it in the texture unit itself (D3DFMT_CxV8U8) which automatically calculates the Z component of the texel before it gets to the pixel shader.
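The reconstruction being discussed is simple: a unit-length tangent-space normal satisfies x² + y² + z² = 1, and since tangent-space normals point away from the surface, Z is the non-negative root. A minimal Python sketch of the math (illustrative only, not any particular shader's code):

```python
import math

def reconstruct_z(x, y):
    """Recover the Z component of a unit-length tangent-space normal
    stored as two components (X, Y). Tangent-space normals face away
    from the surface, so Z is the non-negative root; the clamp guards
    against rounding pushing 1 - x^2 - y^2 slightly negative."""
    return math.sqrt(max(0.0, 1.0 - x * x - y * y))

# A normal stored as (0.6, 0.0) unpacks to (0.6, 0.0, 0.8).
z = reconstruct_z(0.6, 0.0)
```

In a ps2.0 shader this is just a couple of arithmetic instructions per texel, which is why two-component storage is so cheap to support.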
 
Malfunction said:
What a joke, this thread. If you read what jAkUp said...

Didn't notice the first 2 days, but today I head over to the nVidia booth... check out FarCry.. and I notice it is running the new PS 3.0 patch on a 6800 Ultra. It is INCREDIBLE. The rocks, shadows, are absolutely stunning, it looks as if every single rock was rendered.

And... :oops:

It was kinda laggy occasionally.. but It was running 4xaa... so i think that might have been it.

Ummm... Nvidia and Crytek are still working out the driver bugs and game issues, and even so it was running well at 4xAA.

Obviously there was a large jump in visual quality compared to what he typically games with, I mean, he does own a 9800 and FarCry. :oops:

Oh ya almost forgot, he was also a beta tester for the freaking game. :oops:

jAkUp is straight up in my book.

Man, this "ATi is teh shiznit," is getting mighty old... :rolleyes:

Both cards are great and apparently SM 3.0 is just as great as 3Dc is, maybe even arguably better? :?

I wouldn't consider 'it felt like it ran at 25fps' to be good performance :?

also, later on, he said that it was probably 2x aa :oops:
 
Is the patch going to do the volumetric shadows (palms casting over the weapon, etc.) so we can see this with PS 2.0 and not only PS 3.0?
 
Pete said:
Without having read that NVN thread, the IQ he describes sounds exactly like that demoed at the 6800U launch--which was repeatedly described as a "SM2.0/3.0" mod by the Crytek dev presenting it--so I'd imagine an X800 would look similarly good.
 
Snarfy said:
Apparently it's the 'boat' level -- I have a 9800 XT, so I could do a good comparison if I could find the exact same spots, but I really care more about the performance. Anyone have a benchy of the NV40 on FarCry at 1280x1024, 1xAF, 4xAA?

Well, not that it really means anything considering we know nothing about the system, but...

Digit-Life did some FarCry benches with only AA (warning: it's one big page).

At 1280x1024 with 4xAA, no AF:
X800XT: 57.7 fps
6800U: 47.2 fps

At 1280x1024 with 4xAA + AF:
X800XT: 50.9 fps
6800U: 32.2 fps

They used the 60.72 drivers.
 
Snarfy said:
Malfunction said:
What a joke, this thread. If you read what jAkUp said...

Didn't notice the first 2 days, but today I head over to the nVidia booth... check out FarCry.. and I notice it is running the new PS 3.0 patch on a 6800 Ultra. It is INCREDIBLE. The rocks, shadows, are absolutely stunning, it looks as if every single rock was rendered.

And... :oops:

It was kinda laggy occasionally.. but It was running 4xaa... so i think that might have been it.

Ummm... Nvidia and Crytek are working out the driver bugs and game issues still, even still it was running well at 4xAA.

Obviously there was a large degree of visual quality compared to what he typically games with, I mean since he does own a 9800 and FarCry. :oops:

Oh ya almost forgot, he was also a beta tester for the freaking game. :oops:

jAkUp is straight up in my book.

Man, this "ATi is teh shiznit," is getting mighty old... :rolleyes:

Both cards are great and apparently SM 3.0 is just as great as 3Dc is, maybe even arguably better? :?

I wouldn't consider 'it felt like it ran at 25fps' to be good performance :?

also, later on, he said that it was probably 2x aa :oops:

Hmmm...
af was set at 1.. resolution i believe was 1280x1024. If I had to take a guess at the fps, I would say 25 average. I would assume it is kinda low because of the AA, since Farcry seems to take a fairly large hit when AA is used... the only way i knew it was 4xaa is because that is what it was set at in the game options.

All the game options were at very high, and the water at ultra high. The shadows cast by the trees looked really good.. they moved as the trees moved, and the whole environment looked better.

There was a buggy also present, I took a quick look at the buggy, and it seems to look the same as the buggy on my system. The only real difference I could notice were the shadows, and the gravel/sand.

He guesstimated, damn... can you get something correct in your lame ass post? :oops:

Out of that whole thread at nVnews, that was the only mention from him that he guessed 25 frames... :rolleyes:
 
Being a long-time PC game player, I can tell when the fps is below 30... it gets REAL choppy. I would assume that jAkUp felt the same thing.
 
Snarfy said:
Being a long-time PC game player, I can tell when the fps is below 30... it gets REAL choppy. I would assume that jAkUp felt the same thing.
Don't you mean shooter player? It's only ever noticeable when something is moving fast. How noticeable the framerate becomes is entirely dependent on the screen-space distance an object moves per frame. If that distance is small, 30 fps and 60 fps don't look much different.
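The point about screen-space distance per frame is easy to put numbers on (the speeds below are made-up examples for illustration): divide an object's on-screen speed by the framerate to get the size of the jump between consecutive frames.

```python
def pixels_per_frame(speed_px_per_sec, fps):
    """Screen-space distance an object moves between consecutive frames."""
    return speed_px_per_sec / fps

# Something crossing a 1280-pixel-wide screen in one second jumps
# roughly 43 px between frames at 30 fps -- large, visible steps.
fast_mover = pixels_per_frame(1280, 30)

# A slow pan at 64 px/s moves about 2 px per frame at 30 fps, so
# doubling to 60 fps only halves an already tiny step.
slow_mover = pixels_per_frame(64, 30)
```

This is why low framerates feel so much worse in fast shooters than in slower-moving games: the per-frame jumps are what you actually see.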
 
Snarfy said:
Being a long-time PC game player, I can tell when the fps is below 30... it gets REAL choppy. I would assume that jAkUp felt the same thing.
More probably it's a case of the minimum FPS being low. I'd call this a driver issue, one that should be fixed before long.
 
Chalnoth said:
Snarfy said:
Being a long-time PC game player, I can tell when the fps is below 30... it gets REAL choppy. I would assume that jAkUp felt the same thing.
More probably it's a case of the minimum FPS being low. I'd call this a driver issue, one that should be fixed before long.

Lol, this is funny...

when the card was launched and farcry iq sucked on the 6800, it was 'bad drivers'

when the 61.11 drivers were rushed out to take some wind out of ati's sails, and they had MORE BUGS than the previous version, it was 'bad drivers'

when we found out that nvidia was using application specific drivers in 61.11 with the Fartcry.exe incident, it was 'bad drivers'

when someone who ACTUALLY PLAYED far cry on ps3.0 says that it had a low fps, it was 'bad drivers'

notice a trend here?

Oh, and don't try any bullshi'ite about how the drivers only matter when the cards are released. Now is the time when people are formulating opinions about the products, and passing everything off on 'bad drivers' just doesn't make up for the cheating.

Also, don't try any 'ATI CHEATS TOO' crap.

Ati Catalyst 4.5 changelog said:
Caution: Forcing on Anti-Aliasing is now permitted. This will permit users to use Anti-Aliasing on games such as Madden 2004 (which users are requesting). Forcing on Anti-Aliasing (through the CATALYST™ control panel) in the following games will cause corruption. Anti-Aliasing is not compatible with the following games:

· Prince Of Persia: Sands of Time
· Splinter Cell / Pandoras Box
· Race Driver (TOCA)
· Crazy Taxi 3

The .exe entries in that dll file were just there to disable the ability to force on anti-aliasing... ooooh, so underhanded!
 
Snarfy said:
Lol, this is funny...

when the card was launched and farcry iq sucked on the 6800, it was 'bad drivers'
No. The image quality issues appeared with the 1.1 Farcry patch. That's a game issue, not a driver issue.

I don't think anybody should expect the performance of pre-launch drivers for a new architecture to be optimal.
 
Snarfy said:
when someone who ACTUALLY PLAYED far cry on ps3.0 says that it had a low fps, it was 'bad drivers'

notice a trend here?

jAkUp, the one who actually played FarCry, and the one you based this entire thread on, posted this yesterday:

http://www.nvnews.net/vbulletin/showthread.php?t=28898&page=2&pp=15

jAKUp said:
At first I wanted the x800Xt... but after playing FarCry with SM3.0... I was so impressed that I am willing to buy a 6800 Ultra if those effects aren't capable on an X800XT.

Looks like it wasn't so bad afterall :)
 
Doomtrooper said:
Why wouldn't offset mapping be possible on an X800? It is possible on an R300.

Same effect as what jAkUp saw below; PS 3.0 is not required.

Download the self-shadowing bump mapping demo from Humus.

http://esprit.campus.luth.se/~humus/3D/index.php?page=Direct3D

I don't think it's a question of whether the X800 is capable of doing it; it's how Crytek implements the feature in-game, and how fast it runs on the X800 based on the way they implement it. Remember, Nvidia is really pushing the FarCry SM3.0 expansion; there must be some advantage they are aware of based on the way it is coded.
 
The only reason I can think of that the PS3.0 effects will not be duplicated in PS2.0 is that they might want a single PS2.0 path written to the lowest common denominator. In other words, the PS3.0 path may, in addition to using dynamic branching, have more than 96 instructions in many shaders. The R420 could easily render a shader that uses more than 96 instructions, and multiple shaders could be created to offset the need for dynamic branching, but if the path isn't there, the increased instruction limit of the R420 will be irrelevant.
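The trade-off described above can be sketched in miniature (a hypothetical illustration, not Crytek's actual code): with dynamic branching, one shader handles any light count at runtime, while without it the engine must precompile a fixed variant per case and pick one on the CPU before drawing.

```python
# Hypothetical sketch: one dynamically-branching shader (SM 3.0 style)
# versus a set of precompiled fixed-count variants (SM 2.0 style).
# sum() stands in for the per-light lighting math.

def shade_dynamic(lights):
    """One shader: loops over however many lights exist at runtime."""
    return sum(lights)

# SM 2.0 style: one specialized "shader" per supported light count,
# selected on the CPU before the draw call.
variants = {
    1: lambda l: l[0],
    2: lambda l: l[0] + l[1],
    3: lambda l: l[0] + l[1] + l[2],
}

def shade_static(lights):
    return variants[len(lights)](lights)
```

Both paths produce identical results; the cost of the static approach is compiling and managing a variant per case, plus extra state changes, which is exactly the kind of work a developer might skip if they've committed to a single lowest-common-denominator PS2.0 path.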
 
Doomtrooper said:
Humus's demo runs over 110 fps on my machine; I'm really not concerned about performance.
A demo is a far cry from a real game. But yes, offset mapping is pretty cheap.
 