FC performance on PS 3.0

Snarfy

Newcomer
Apparently, someone at nVnews played Far Cry on an NV40 with PS 3.0, and it didn't perform well -- here's a link -- I'm not saying I believe him.

In any case, it's an interesting read, and according to him we'll have a video of him playing it pretty soon.

http://www.nvnews.net/vbulletin/showthread.php?t=28772

The most interesting parts, for those of us not blessed with broadband:

Didn't notice the first two days, but today I head over to the nVidia booth... check out Far Cry... and I notice it is running the new PS 3.0 patch on a 6800 Ultra. It is INCREDIBLE. The rocks and shadows are absolutely stunning; it looks as if every single rock was rendered. My friend took a video of me playing, and I will see if I can get that uploaded later. Here are a few pictures, though:

It was kinda laggy occasionally... but it was running 4xAA... so I think that might have been it.

AF was set at 1x... the resolution, I believe, was 1280x1024. If I had to take a guess at the fps, I would say 25 average. I would assume it is kinda low because of the AA, since Far Cry seems to take a fairly large hit when AA is used... the only way I knew it was 4xAA is because that is what it was set at in the game options.

All the game options were at very high, and the water at ultra high. The shadows cast by the trees looked really good... they moved as the trees moved, and the whole environment looked better.

A buggy was also present; I took a quick look at it, and it seems to look the same as the buggy on my system. The only real differences I could notice were the shadows and the gravel/sand.

Anyone have some benchies of an NV40 at 1280x1024, 1xAF, 4xAA so we can see how much of a performance hit that is? It seems a lot lower than most benchies I saw...

Edit: it wasn't showing the images, so here you go:

http://www.nvnews.net/vbulletin/attachment.php?attachmentid=5854

http://www.nvnews.net/vbulletin/attachment.php?attachmentid=5855
 
Snarfy said:
Apparently, someone at nVnews played Far Cry on an NV40 with PS 3.0, and it didn't perform well -- here's a link -- I'm not saying I believe him.

I don't really care if what he's saying is the truth or not. But let's wait until:

1: The patch is actually released
2: DirectX 9.0c isn't beta anymore

before making any judgements. Though perhaps 25 FPS at 1280x1024 with 4X FSAA isn't that bad in this case. After all, we have nothing to compare it with.
 
What would be interesting would be a comparison of the same thing with PS 2.0 and PS 3.0. *going to see the thread @NvN*
 
Apparently it's the 'boat' level -- I have a 9800 XT, so I could do a good comparison if I could find the exact same spots, but I really care more about the performance. Anyone have a benchy of the NV40 on Far Cry at 1280x1024, 1xAF, 4xAA?
 
Just saw that the poster over at nVnews is jAkUp, and I wanted to add that I've known him over there for a while and he struck me as a stand-up and honest guy. If he says something, I believe him.
 
AAAAAAAARRRRGGGHH!!!!!!!! :oops:

That thread started off great, but by the middle of page two it turned into more of a justification/rationalization than an exploration!

Damn it, I just HATE watching good sites die like that. :(
 
digitalwanderer said:
That thread started off great, but by the middle of page two it turned into more of a justification/rationalization than an exploration!

Not trying to defend any fanboyism here, but what did you expect them to explore with only two rather crappy screenshots to go from?
 
Evildeus said:
What would be interesting would be a comparison of the same thing with PS 2.0 and PS 3.0. *going to see the thread @NvN*

That's assuming the PS 2.0 and PS 3.0 code paths add the same amount of graphical detail, which I don't think will be the case.
Usually, if a company is trying to push something (i.e. PS 3.0), they do it by making things look better, which means longer shaders, larger textures, etc. In that case, comparing their performance like that is pretty useless.
Just look at the changes to the ATI Ruby shaders to make them work on other cards; in a lot (not all) of cases the shaders were dumbed down.
 
lyme said:
That's assuming the PS 2.0 and PS 3.0 code paths add the same amount of graphical detail, which I don't think will be the case.
Usually, if a company is trying to push something (i.e. PS 3.0), they do it by making things look better, which means longer shaders, larger textures, etc. In that case, comparing their performance like that is pretty useless.

I don't expect any major differences between the two paths as far as IQ goes. I'm thinking that Nvidia would rather show off performance than some minor differences in IQ at this point in time. How that turns out remains to be seen, though.

Just look at the changes to the ATI Ruby shaders to make them work on other cards; in a lot (not all) of cases the shaders were dumbed down.

AFAIK, the shaders aren't dumbed down on the NV3x, just on the R300, because of its shader instruction limits vs. the R420. The performance was not particularly good, though :)
 
Bjorn said:
Not trying to defend any fanboyism here, but what did you expect them to explore with only two rather crappy screenshots to go from?
*sigh*

Sorry, I guess it was just the way that a couple of posters there whom I remember as rather blatant fanboys seemed to jump all over anyone who suggested anything possibly negative about nVidia.

It just depresses me; that place for a brief shining bit was a bastion against the darkness, and now it's no different from any other fan site... it bums me out. :(
 
digitalwanderer said:
Sorry, I guess it was just the way that a couple of posters there whom I remember as rather blatant fanboys seemed to jump all over anyone who suggested anything possibly negative about nVidia.

Nothing wrong with defending Nvidia. It might actually be OK to do that sometimes, you know. And I happen to think that it was in that thread.

It just depresses me; that place for a brief shining bit was a bastion against the darkness, and now it's no different from any other fan site... it bums me out. :(

NVNews used to be a bastion against the darkness??
 
*sigh*

Sorry, I guess it was just the way that a couple of posters there whom I remember as rather blatant fanboys seemed to jump all over anyone who suggested anything possibly negative about ATI.

It just depresses me; that place for a brief shining bit was a bastion against the darkness, and now it's no different from any other fan site... it bums me out.
 
lyme said:
Usually, if a company is trying to push something (i.e. PS 3.0), they do it by making things look better, which means longer shaders, larger textures, etc. In that case, comparing their performance like that is pretty useless.
Well, texture sizes are completely irrelevant when it comes to shader versions, but yeah: longer and more complex shaders, implementing potentially more (and certainly a variable number of) lights thanks to dynamic branching, with more complex and realistic lighting equations, are definitely reasons to use SM 3.0.
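
(For anyone wondering what "a variable number of lights thanks to dynamic branching" means in practice, here's a minimal C sketch of the idea -- not actual shader or Crytek code, and all names are made up. Under SM 2.0, loop bounds effectively have to be fixed at compile time, so engines build one shader variant per light count; SM 3.0's dynamic branching lets a single shader loop over a count supplied at run time.)

    /* Hypothetical sketch, in C, mirroring per-pixel diffuse lighting. */
    #include <math.h>
    #include <stdio.h>

    typedef struct { float x, y, z; } Vec3;

    static float dot3(Vec3 a, Vec3 b) { return a.x*b.x + a.y*b.y + a.z*b.z; }

    /* SM 2.0 style: the light count is a compile-time constant, so an
       engine needs a separately compiled variant for each count. */
    #define NUM_LIGHTS_FIXED 2

    static float shade_sm20_style(Vec3 n, const Vec3 lights[NUM_LIGHTS_FIXED])
    {
        float intensity = 0.0f;
        for (int i = 0; i < NUM_LIGHTS_FIXED; ++i)  /* compiler can fully unroll */
            intensity += fmaxf(0.0f, dot3(n, lights[i]));
        return intensity;
    }

    /* SM 3.0 style: dynamic branching lets one shader loop over a light
       count supplied at run time (e.g. a constant register set per draw). */
    static float shade_sm30_style(Vec3 n, const Vec3 *lights, int num_lights)
    {
        float intensity = 0.0f;
        for (int i = 0; i < num_lights; ++i)        /* a real branch on the GPU */
            intensity += fmaxf(0.0f, dot3(n, lights[i]));
        return intensity;
    }

    int main(void)
    {
        Vec3 n = { 0.0f, 0.0f, 1.0f };  /* surface normal, straight up */
        Vec3 lights[3] = { {0.0f, 0.0f, 1.0f},
                           {0.707f, 0.0f, 0.707f},
                           {0.0f, 1.0f, 0.0f} };
        printf("SM 2.0-style, fixed 2 lights: %f\n", shade_sm20_style(n, lights));
        printf("SM 3.0-style, 3 lights:      %f\n", shade_sm30_style(n, lights, 3));
        return 0;
    }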
 
digitalwanderer said:
AAAAAAAARRRRGGGHH!!!!!!!! :oops:

That thread started off great, but by the middle of page two it turned into more of a justification/rationalization than an exploration!

Damn it, I just HATE watching good sites die like that. :(

Poor Dig :cry: :cry: :cry:

RainZ
 
Stop the thread derailment plz, guys...

-----------------

Otherwise, I just like to buy cards that last. I don't buy them often, and I like to make a decision based on how future games that I INTEND TO PURCHASE will run on the card.

I'm mainly an FPS/RTS guy -- anything with lots of action -- so I want the card which runs those best.

To me, Far Cry represents many of the future trends in FPS graphics -- DirectX 9, heavy shaders, etc. -- and not one of you can deny that Far Cry is far and away the best-looking FPS on shelves right now.

Also, I keep wondering if Nvidia's lead in OpenGL is really important. What's the difference between 200 fps and 300 fps on a three-year-old engine? When OpenGL produces a game with graphics as complex as Far Cry's, and that game actually hits shelves (read: Doom 3), maybe then I will start caring about OpenGL benchmarks again.

Hell, maybe the only reason Nvidia presently does better at OpenGL is that all of the current OpenGL benchmark games are 3+ years old -- perhaps Nvidia wins those benchmarks because those games require simpler computations, which Nvidia is good at, while ATI focuses on the harder computations, making it do better in most CURRENT game engines.

Just food for thought...
 
Ostsol said:
Well, texture sizes are completely irrelevant when it comes to shader versions

That statement is nearly correct, but not quite. With PS 2.0 you have the necessary math to easily generate the third component of a tangent-space normal from a two-component texture map (this is the whole idea behind ATI's new two-component normal map texture compression). So with this in mind, you could say that having more powerful pixel shaders enables you to have smaller textures with the same or better quality.
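
(To make that math concrete: a unit-length normal satisfies x^2 + y^2 + z^2 = 1, and a tangent-space normal points away from the surface, so z >= 0 and the shader can recover it as z = sqrt(1 - x^2 - y^2). Here's a minimal C sketch of the decode -- illustrative only, not ATI's actual code; on hardware this runs in the pixel shader after the texture fetch.)

    #include <math.h>
    #include <stdio.h>

    /* Decode a two-channel texel (0..255 per channel) back into a unit
       tangent-space normal. Illustrative; real encodings vary in detail. */
    static void decode_normal(unsigned char tx, unsigned char ty,
                              float *nx, float *ny, float *nz)
    {
        *nx = tx / 127.5f - 1.0f;  /* map [0, 255] to [-1, 1] */
        *ny = ty / 127.5f - 1.0f;
        /* z = sqrt(1 - x^2 - y^2); the clamp guards against quantization
           error pushing the radicand slightly below zero */
        *nz = sqrtf(fmaxf(0.0f, 1.0f - *nx * *nx - *ny * *ny));
    }

    int main(void)
    {
        float nx, ny, nz;
        decode_normal(191, 128, &nx, &ny, &nz);  /* a texel leaning toward +x */
        printf("normal = (%f, %f, %f)\n", nx, ny, nz);
        return 0;
    }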
 
What a joke, this thread. If you read what jAkUp said...

Didn't notice the first two days, but today I head over to the nVidia booth... check out Far Cry... and I notice it is running the new PS 3.0 patch on a 6800 Ultra. It is INCREDIBLE. The rocks and shadows are absolutely stunning; it looks as if every single rock was rendered.

And... :oops:

It was kinda laggy occasionally... but it was running 4xAA... so I think that might have been it.

Ummm... Nvidia and Crytek are still working out the driver bugs and game issues; even so, it was running well at 4xAA.

Obviously there was a big jump in visual quality compared to what he typically games with; I mean, he does own a 9800 and Far Cry. :oops:

Oh yeah, almost forgot: he was also a beta tester for the freaking game. :oops:

jAkUp is straight up in my book.

Man, this "ATi is teh shiznit" stuff is getting mighty old... :rolleyes:

Both cards are great, and apparently SM 3.0 is just as great as 3Dc is -- maybe even arguably better? :?
 
Actually, you guys will see in an upcoming image quality article that Nvidia has lots of cheats in their OpenGL code. I think it was on Rage3D that someone from a 3D review site who was in the process of writing the article said that when he looked at Quake 3, Nvidia was a blurry mess compared to ATI. The LOD settings were just awful.
 
Without having read that NVN thread, the IQ he describes sounds exactly like what was demoed at the 6800U launch -- which was repeatedly described as an "SM2.0/3.0" mod by the Crytek dev presenting it -- so I'd imagine an X800 would look similarly good. So the question becomes: does the X800 maintain the prodigious lead it displayed in "regular" FC with this new SM2.0/3.0 "mod", or will the new "mod", new drivers, or SM3.0 efficiencies help nV close the gap?
 