Crysis runs on DX9 (and I have yet to see anything in the videos they've released so far that requires DX10, but maybe with DX10 hardware/drivers they will be able to achieve a decent frame rate).
Soft particles were discussed at a game dev conference recently; it's a very simple technique to implement, available on all hardware that supports shaders.
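For anyone curious, the core of the technique is just fading a particle fragment's alpha by how close it is to the opaque geometry behind it, so billboards don't cut hard lines through walls and floors. Here's a rough C sketch of the fade math only (the function name, parameters and the fade distance are made up for illustration; a real implementation does this in the pixel shader with a depth-buffer read):

```c
#include <stdio.h>

/* Soft particles: scale a particle fragment's alpha by its distance to the
 * opaque scene depth at the same pixel, so the particle fades out instead of
 * producing a hard intersection edge. */
static float soft_particle_fade(float scene_depth,    /* linear depth of the opaque scene at this pixel */
                                float particle_depth, /* linear depth of the particle fragment */
                                float fade_distance)  /* tuning constant: range over which to fade */
{
    float diff = scene_depth - particle_depth; /* how far the particle is in front of the scene */
    float fade = diff / fade_distance;
    if (fade < 0.0f) fade = 0.0f;              /* particle is behind the scene: fully faded */
    if (fade > 1.0f) fade = 1.0f;              /* far enough in front: fully opaque */
    return fade;
}

int main(void)
{
    /* Example: particle fragment 0.2 units in front of a wall, 1.0 unit fade range. */
    float alpha_scale = soft_particle_fade(10.2f, 10.0f, 1.0f);
    printf("alpha scale = %.2f\n", alpha_scale); /* prints 0.20 */
    return 0;
}
```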
Well, the 360 can hardly handle Prey and Quake 4. The PC versions look a lot better and run faster on a PC equipped with an X800-class card and a two-year-old CPU. So Crysis, a game that is supposedly far ahead of Doom3-class games, should really be outside the capabilities of the consoles, especially if the engine hasn't had the consoles in mind from the start.
I suppose you never considered that with Prey and Q4 the engine port is to blame,
yet you talk as if the 360 is the problem.
Take a parallel development like Oblivion:
in outdoor scenes the 360 gets 30 fps with AA+HDR @ 1280,
the same as an X1900XTX.
Take FEAR: the developer says they ported the game and still had headroom left over, so they added HDR and other things.
Which PC card can do 30 fps @ 1280 with AA+HDR?
What does all this mean?
Please stop making senseless comparisons.
Actually, your comparisons aren't bad -- but they aren't accurate either. As you can see here, modern PCs beat the 360 by a good bit. A cursory glance may reveal similar frame rates, but look at all the eye candy turned on in the case of the X1950XTX, or even the X1900XTX. The 360 version does NOT run with these settings. And that's overlooking some important points -- the rest of the PC Oblivion experience is vastly better than the 360's: no loading times from grid to grid, MUCH lower loading times from indoors to outdoors, and a smoother experience in general. We'll ignore modding and patching for now, since that seems a little off-topic.
Then there's FEAR. As evidenced here, both ATi cards give you 70 fps at likely MUCH higher IQ than the 360's FEAR. B3D tested with "Max Graphics Settings" -- not exactly fair when you consider the lighter CPU load from the 360's reduced physics and other settings, but (and without ever having seen 360 FEAR) I'd bet just about anything that the 360 isn't pushing these kinds of textures. Could be wrong! But I doubt it. Also, I see very little to make me think the HDR penalty for FEAR on ATi flagships would be more than Oblivion's. Oblivion's is about 20%, from what I can tell from FiringSquad's numbers, and a 20% hit only takes 70 fps down to around 56 fps -- it would take a lot more than that to drop FEAR @ 1280 to 30 fps.
Anyway, sorry I got so wrapped up in it all -- got kinda curious myself -- but the fact is, high-end PCs are graphics showcases. The only way they're going to fall behind consoles and stay there is if nV and ATi stop releasing new cards. It's not a fair comparison -- PCs are more expensive and take more effort to use properly -- and I think the x360 looks fantastic and is getting close to a reasonable price. The argument you're trying to make, however, is doomed.
You shouldn't judge a console's capability based on ports.
Even ATi's spokesman says the X1900 and the 360 will give very similar real-world experiences, performance-wise.
Don't forget the eDRAM is aiding Xenos with HDR+AA effects.
Core for core there isn't a comparison.
Honestly, it wouldn't surprise me if it fell somewhere between an X1600 and an X1800 in terms of power, without the eDRAM.
When the Xbox 360 launched it was "just under the power of an X1800XT".
Say a bad thing about internal HDDs, BRD, Cell, Xenon, or Xenos and people treat you like you just called their mother a bad name and you're the worst human being on earth. It's quite pathetic, really.
I'm not convinced the 360 is equal to an X1900XTX, though. Not in the least.
In terms of power, Xenos ran the Ruby demo quite fine (an X1800 demo) and from a raw shader performance standpoint Xenos is like 15% faster than R520. There is some tit for tat (e.g. X1800XT has 25% more texel fillrate).
You're betting on the low HDR hit for FEAR on PC based on a totally different game, betting on textures you admit you haven't even seen, betting on reduced physics (!!).
It's probably worth considering that the 360 version of FEAR lacks soft shadows, which were by far the biggest performance killer in the PC version. I think without them the top-end cards are breaking the 100 fps barrier at max details.
Also remember that 1280 on the 360 means 1280x720, whereas on the PC it's 1280x1024. That's over 40% more pixels (1,310,720 vs 921,600), so not the best basis for comparison.
I played the old FEAR demo at 1280x800 with 4X adaptive AA / 8X HQ AF (without soft shadows, of course) and if I recall correctly the framerate was between 37 and 75 fps.
My rig: X1900 XTX / AMD64 X2 4400+ / 2GB dual-channel DDR.
Generally, from what I have seen up to today, I don't have any indication that a PC like this can deliver technically better games than the Xbox 360. Maybe on paper/specs, but not IRL. The only win I give my PC is more AF. But that's all.
So it depends on what you mean. Asking if Xenos is faster than the X1900XTX is basically only asking if it's a little faster than the X1800XT, which I think it is. Based on the theoretical specs, Xenos should have more shader power than the X1800XT, for example, which only has 16 shaders.
If I am wrong, correct me, but from what I can see there aren't soft shadows. It's just 1280x1024, 4x FSAA (adaptive not enabled) and 16X AF.

pjbliverpool said: Here the XTX, even with soft shadows, is achieving 64fps, and that's at the higher resolution again of 1280x1024 with 16xAF. Although the lowest framerate is 30fps, which is lower than yours.
http://xbitlabs.com/articles/video/d...950xtx_13.html
Still, at these very high quality settings it's good enough to cap the framerate at 60fps and have it stay there a good amount of the time while never falling below 30fps.
Sorry, but to me that is not impressive at all. We are talking about a previous-gen title, without soft shadows and HDR, that plays at 37-75 fps on hardware that cost me 1600 euros. It's OK but definitely not impressive. And even less impressive is the fact that when we look at titles with next-gen material, things get worse. Look at your link for the GRAW benchmark: 1280x1024 with HDR but zero AA, and the game runs at 21-47 frames. The same happens with the Call of Juarez demo when I test it.

That seems pretty impressive to me for a GPU running a title of this calibre, which probably has very little, if any, specific optimisation for its architecture.