Are next generation consoles not good enough for Crysis?

The main problem is maybe the RAM quantity? Or the raw code of the 3D engine?

Crytek would probably not bet on a console version of CryEngine 2 for the moment, but they will probably sell their engine licence to EA.

In any case, I hope EA will not buy the entire studio.
 
Soft particles were discussed at a recent game dev conference; a very simple technique to do, available on all hardware that supports shaders.
 
Soft particles were discussed at a recent game dev conference; a very simple technique to do, available on all hardware that supports shaders.

If that's the case, why haven't we seen this since Quake 3/Unreal 2-based games? Are there any released games featuring this?

(not being rude... genuinely curious)
 
Well, they do come at a performance cost, and particles are something that you want to be as cheap as possible to draw because they can kill performance easily.
It's the old performance/visual tradeoff.

BTW, both those games don't really support shaders, as in programmable shaders.
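
If anyone wants the gist of the technique, here's a rough sketch in Python-style pseudocode (the names and the fade distance are made up for illustration; in a real engine this runs in the particle's pixel shader against a copy of the scene depth buffer):

    def soft_particle_alpha(scene_depth, particle_depth, base_alpha, fade_distance=0.5):
        # How far this particle fragment sits in front of the opaque scene geometry.
        depth_gap = scene_depth - particle_depth
        # 0 where the particle cuts into geometry, ramping up to 1 over fade_distance.
        fade = min(max(depth_gap / fade_distance, 0.0), 1.0)
        # Fading the alpha hides the hard intersection line of the billboard.
        return base_alpha * fade

The cost mentioned above mostly comes from sampling the scene depth for every particle pixel, which adds up fast with heavy overdraw.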
 
Well 360 can hardly handle Prey and Quake4. The PC versions look a lot better and run faster on a PC equipped with a X800-class card and a 2 year old CPU. So, Crysis, a game that is supposedly far ahead of Doom3-class games should really be outside of the capabilities of the consoles, especially if the engine hasn't had the consoles in mind from the start.

I have no doubt that game engines designed to truly exploit these consoles could be amazing. Problem is that no dev to my knowledge has committed to exclusively making a game engine for the 360 or PS3. Why do that when cross-platform titles will more likely make lots of money? Risk is not cool when the games cost 10s of millions of $$. Unfortunately this generation of consoles is not suited to all this cross-platform nonsense, with their architectures that rely heavily on "exclusive" optimization.

Just another reason Wii is the way to go. PS3 and 360 seem like consoles for people who don't want to spend on a PC but do want to play the same games. It's not really working out technologically.
 
Well 360 can hardly handle Prey and Quake4. The PC versions look a lot better and run faster on a PC equipped with a X800-class card and a 2 year old CPU. So, Crysis, a game that is supposedly far ahead of Doom3-class games should really be outside of the capabilities of the consoles, especially if the engine hasn't had the consoles in mind from the start.

I suppose you never considered that with Prey and Q4 it's the port of the engine that is to blame.

You talk as if the 360 is the problem.
Take parallel development, like Oblivion:
in outdoor scenes the 360 gets 30 fps with AA+HDR @ 1280;
the same is obtained with an X1900XTX.

Take FEAR: the developer says they ported the game and there was still headroom left, so they added HDR and other things.

What card can do 30 fps @ 1280 with AA+HDR?

What does all this mean?

Stop making senseless comparisons, please. ;)
 
I do blame the Doom3 engine. It shows that ports to console from PCs are bad. I think that's basically what I said.

I'm not convinced that 360 is equal to a X1900XTX though. Not in the least. Maybe if the game was specifically tailored to it. Oblivion doesn't run well enough on any system to say that though, lol. Oblivion 360 has a host of issues aside from graphics anyway.
 
I suppose you never considered that with Prey and Q4 it's the port of the engine that is to blame.

You talk as if the 360 is the problem.
Take parallel development, like Oblivion:
in outdoor scenes the 360 gets 30 fps with AA+HDR @ 1280;
the same is obtained with an X1900XTX.

Take FEAR: the developer says they ported the game and there was still headroom left, so they added HDR and other things.

What card can do 30 fps @ 1280 with AA+HDR?

What does all this mean?

Stop making senseless comparisons, please. ;)

Actually, your comparisons aren't bad -- but they're not accurate either. As you can see here, modern PCs beat up on the 360 a good bit. A cursory glance may reveal similar frame rates, but look at all the eye candy turned on in the case of the X1950XTX, or even the X1900XTX. The 360 version does NOT run with these settings. And that's overlooking some important points -- the rest of the PC Oblivion experience is vastly better than that of the 360: no loading times from grid to grid, MUCH lower loading times from indoors to outdoors, and really a smoother experience in general. We'll ignore modding and patching for now, since that seems a little off-topic.

Then, FEAR. As evidenced here, both ATi cards give you 70 fps at likely MUCH higher IQ than the 360's FEAR. B3D tested with "Max Graphics Settings" -- not exactly fair when you consider the lighter CPU load of the reduced physics and other settings, but (and without ever having seen 360 FEAR) I'd bet just about anything that the 360 isn't pushing these kinds of textures. Could be wrong! But I doubt it. Also, I see very little to make me think the HDR penalty for FEAR on ATi flagships would be more than Oblivion's. Oblivion's is about 20%, from what I can tell from Firing Squad's numbers. And it would take a lot more than a 20% drop in frames to drop FEAR @ 1280 to 30 fps.

Anyway, sorry I got so wrapped up in it all -- got kinda curious myself -- but the fact is, high-end PCs are graphics showcases. The only way they're going to fall behind consoles and stay there is if nV and ATi stop releasing new cards. It's not a fair comparison -- PCs are more expensive and take more effort to use properly -- and I think the X360 looks fantastic and is getting close to a reasonable price. The argument you're trying to make, however, is doomed. ;)
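
A quick back-of-the-envelope check on that, just reusing the ~70 fps and ~20% figures above:

    baseline_fps = 70                        # ATi flagships in FEAR at 1280, per the benchmark above
    hdr_penalty = 0.20                       # assumed Oblivion-style HDR hit, per Firing Squad
    print(baseline_fps * (1 - hdr_penalty))  # 56.0 fps -- still nowhere near 30
    print(1 - 30 / baseline_fps)             # ~0.57, i.e. it would take a ~57% drop to reach 30 fps

So even being generous with the HDR penalty, the PC numbers hold up.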
 
Actually, your comparisons aren't bad -- but they're not accurate either. As you can see here, modern PCs beat up on the 360 a good bit. A cursory glance may reveal similar frame rates, but look at all the eye candy turned on in the case of the X1950XTX, or even the X1900XTX. The 360 version does NOT run with these settings. And that's overlooking some important points -- the rest of the PC Oblivion experience is vastly better than that of the 360: no loading times from grid to grid, MUCH lower loading times from indoors to outdoors, and really a smoother experience in general. We'll ignore modding and patching for now, since that seems a little off-topic.

Then, FEAR. As evidenced here, both ATi cards give you 70 fps at likely MUCH higher IQ than the 360's FEAR. B3D tested with "Max Graphics Settings" -- not exactly fair when you consider the lighter CPU load of the reduced physics and other settings, but (and without ever having seen 360 FEAR) I'd bet just about anything that the 360 isn't pushing these kinds of textures. Could be wrong! But I doubt it. Also, I see very little to make me think the HDR penalty for FEAR on ATi flagships would be more than Oblivion's. Oblivion's is about 20%, from what I can tell from Firing Squad's numbers. And it would take a lot more than a 20% drop in frames to drop FEAR @ 1280 to 30 fps.

Anyway, sorry I got so wrapped up in it all -- got kinda curious myself -- but the fact is, high-end PCs are graphics showcases. The only way they're going to fall behind consoles and stay there is if nV and ATi stop releasing new cards. It's not a fair comparison -- PCs are more expensive and take more effort to use properly -- and I think the X360 looks fantastic and is getting close to a reasonable price. The argument you're trying to make, however, is doomed. ;)

I don't want to start a stupid system vs. system war and go off topic (read the topic, friend),
but you're actually betting on a lot of things: you bet that Oblivion is only using 2xAA (look at the bench you linked), you bet on a low HDR hit for FEAR on PC based on another, totally different game, you bet on textures that you admit you haven't even seen, you bet on reduced physics (!!)

This is not how I usually think. Do you know, friend, that a console game capped at 30 fps can have a frame rate between 30 and 60 with vsync off (the way you benchmark PC games)?

Even an ATI spokesman says that the X1900 and the 360 will give very similar real-world experiences, performance-wise.
Do you think that first-gen titles are indicative of what a closed box like a console can do?
Do you really think all of this and bet on the rest? :rolleyes:
If the answer is 'yes', fine. But let's not derail the thread; as I wrote, I'm not interested in a system-to-system comparison, and we all know how that ends.
And the argument I'm trying to make ('it's stupid to compare Q4/Prey performance between console and PC') is not doomed at all; think it over, friend.

Now we can get back to talking about Crysis, I hope.
 
You shouldn't judge a console's capability by ports.

TBH, it looks (since no actual benchmarks are available) like the console games ported to PC look/run better on an X800 XT too. Of course, those games are typically upgraded ports of PS2 games, but even Most Wanted, Oblivion, and Call of Duty 2 looked better on the PC imo... actually, which way were those ported? All three have console-style interfaces, but I'm not sure of the base development platform. Oh well, console hardware has always done best with exclusives, even on the original Xbox, which was basically a low-end PC. What Bungie accomplished with Halo 2 was pretty incredible for that level of hardware, and I don't think I saw any other games on the system that came close. (I guess you could say it's down to art direction as well?)

Still, I'm excited to see what games can do on the 360 (and PS3 as well). Once games start coming out that were developed from the ground up for the consoles, we'll see things that blow the current portware away. Sure, the 360 has a few exclusives, but they haven't been truly built for the hardware yet. The top 3 best-looking games on Xbox 360 will likely be exclusives. The next 7 will probably be UE3-engine based. ;)
 
Even an ATI spokesman says that the X1900 and the 360 will give very similar real-world experiences, performance-wise.

Have a link? Don't forget eDRAM is aiding Xenos for HDR+AA effects. Core for core there isn't a comparison. Honestly it wouldn't surprise me if it was between an X1600 and an X1800 in terms of power, without the eDRAM. But I've seen all kinds of "ATI compared it to this" stuff thrown around, most of it false. When the Xbox 360 launched it was "just under the power of an X1800XT", so now it's the X1900XTX, is it? Perhaps when the X2000XT or what have you launches it will be 75% as powerful as that. No offense, it just seems like people treat talk of console hardware as if it contained the key to the second coming of Christ. Say a bad thing about internal HDDs, BRD, Cell, Xenon, or Xenos and people treat you like you just called their mother a bad name and you're the worst human being on earth. It's quite pathetic really. God forbid a company say that a game won't be at its best except on a computer :). I mean, we've already seen that before with multi-platform launches. Games launched on multiple systems pretty much always look and perform better on computers.

By the way, not all of that was directed specifically at you.
 
Don't forget eDRAM is aiding Xenos for HDR+AA effects.

No, the ROPs (which can do MSAA with FP10 blending and filtering) are the reason the 360 can do HDR effects without a ROP penalty. Even if Xenos were not bandwidth-limited with FP16 blending and filtering, the ROPs would run at half speed.

Core for core there isn't a comparison.

Correct for the most part. But then again you say...

Honestly it wouldn't surprise me if it was between an X1600 and an X1800 in terms of power, without the eDRAM.

But the eDRAM is a central part of the design -- it alleviates many bandwidth-related bottlenecks and allows the ROPs to hit their peak pixel fillrate, which pushes a lot of the bottlenecks back to the shader arrays.

In terms of power, Xenos ran the Ruby demo quite fine (an X1800 demo), and from a raw shader performance standpoint Xenos is like 15% faster than R520. There is some tit for tat (e.g. the X1800XT has 25% more texel fillrate). Anyhow, performance-wise Xenos is much closer to the X1800 than the X1600, and it exceeds the X1800 in features and, in a number of situations, in raw performance. I have no clue why you would suggest it is in the middle ground between the X1600 and X1800.

I think it is worthwhile to note that Xenos also avoids one of the issues that stalls the X1800 in some situations. A lot of games show that the X1800's TMUs are under-utilized, while the extra shader performance in the X1900 gets much better TMU utilization. Xenos may have a lower peak texel fillrate, but it has more raw shader power (like the X1900, though nowhere near as much). So even with the additional features aside, and efficiency gains like FP10 (which does the same basic job as FP16 but with no penalty), eDRAM, unified shaders, etc. ignored, one need only look at the X1900 compared to the X1800 to see there are areas where Xenos could gain some ground beyond what is on paper.

But ultimately Xenos is a console GPU and the X1800 is a desktop GPU. Different worlds. As it is, Xenos has posed problems to a degree because it isn't just a PC GPU slapped into a console. It needs special design attention to be used efficiently. Until then we will see a lot of game engines ignore tiling and leave a lot of the neat features in Xenos (tessellation, stream out, 32-bit texture filtering, vertex texturing, etc.) unused.
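
To give a rough sense of why tiling comes into it at all, here is a back-of-the-envelope framebuffer-size calculation (assuming the usual simplification of 8 bytes per sample for colour plus depth/stencil; exact formats and padding vary):

    EDRAM_BYTES = 10 * 1024 * 1024          # Xenos daughter die holds 10 MB of eDRAM

    def framebuffer_bytes(width, height, msaa_samples, bytes_per_sample=8):
        # 8 bytes per sample = 32-bit colour + 32-bit depth/stencil (assumed formats)
        return width * height * msaa_samples * bytes_per_sample

    for samples in (1, 2, 4):
        size = framebuffer_bytes(1280, 720, samples)
        tiles = -(-size // EDRAM_BYTES)     # ceiling division: tiles needed to fit in eDRAM
        print(f"720p {samples}xAA: {size / 2**20:.1f} MB -> {tiles} tile(s)")

Which is why a 720p target with 4xAA has to be rendered in three tiles, and why engines that skip tiling tend to drop the AA instead.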

When the Xbox 360 launched it was "just under the power of an X1800XT"

And who said that? I have seen ATI refer to its peak raw shader power being slightly above the X1800, and to the bandwidth advantage Xenos has, all stuff derivable from the specs. There are cases where Xenos is slower than an X1800XT, but also vice versa. Nothing has really changed.

Say a bad thing about internal HDDs, BRD, Cell, Xenon, or Xenos and people treat you like you just called their mother a bad name and you're the worst human being on earth. It's quite pathetic really.

People will treat you that way when you try comparing a flagship-level GPU, like Xenos, with a low-end midrange GPU like the X1600 ;)
 
I'm not convinced that 360 is equal to a X1900XTX though. Not in the least.

It is because current games don't take advantage of the X1900XTX very much at all. The X1900XTX only ran 5-10% faster than an X1800XT in most cases when it was released (check the benches). It has 3x the shaders, but those are not the bottleneck in most cases, so they sit idle.

So it depends on what you mean. Asking if Xenos is faster than the X1900XTX is basically only asking if it's a little faster than the X1800XT, which I think it is. Based on the theoretical specs, Xenos should have more shader power than the X1800XT, for example, which only has 16 pixel shaders.

Now, if you built a game to take ADVANTAGE of the X1900XTX's shader strength, yes, it would/should easily top Xenos. However no games do, so it's a moot point. In practice it barely tops an X1800XT, which means it is roughly equal to Xenos.

BTW, if you check IGN's "Head to Head" feature that their Insider service offers, doing direct comparisons of games on different platforms with comparison video, screens, and detailed opinions, you really can't tell the difference between 360 and PC shots in most cases. I think a lot of the differences are overrated.



In terms of power, Xenos ran the Ruby demo quite fine (an X1800 demo), and from a raw shader performance standpoint Xenos is like 15% faster than R520. There is some tit for tat (e.g. the X1800XT has 25% more texel fillrate).

What is texel fillrate?

And what makes you say Xenos has 15% more shader performance than R520? It's such a specific number; do you have an inside source that ran benchmarks for that? Or did you just derive it from theoreticals (far less accurate imo)?
 
You bet on a low HDR hit for FEAR on PC based on another, totally different game, you bet on textures that you admit you haven't even seen, you bet on reduced physics (!!)

It's probably worth considering that the 360 version of FEAR lacks soft shadows, which were by far the biggest performance killer in the PC version. I think without them the top-end cards are breaking the 100 fps barrier at max details.

Also remember that 1280 on the 360 is 1280x720, whereas on the PC it's 1280x1024. That's over 40% more pixels, so not the best basis for comparison.
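
Quick check of the pixel counts (nothing assumed beyond the two resolutions):

    console_pixels = 1280 * 720        # 921,600
    pc_pixels = 1280 * 1024            # 1,310,720
    print(pc_pixels / console_pixels)  # ~1.42, i.e. just over 40% more pixels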
 
It's probably worth considering that the 360 version of FEAR lacks soft shadows, which were by far the biggest performance killer in the PC version. I think without them the top-end cards are breaking the 100 fps barrier at max details.

Also remember that 1280 on the 360 is 1280x720, whereas on the PC it's 1280x1024. That's over 40% more pixels, so not the best basis for comparison.

I play the old demo of FEAR at 1280x800 with 4x adaptive AA / 8x HQ AF (without soft shadows, of course), and if I recall correctly the framerate was between 37 and 75 fps.
My rig is: X1900 XTX / AMD Athlon 64 X2 4400+ / 2 GB dual-channel DDR.

Generally, from what I have seen up to today, I don't have any indication that a PC like this can have technically better games than the Xbox 360. Maybe on paper/specs, but not IRL. The only win that I give to my PC is more AF. But that's all.
 
I play the old demo of FEAR at 1280x800 with 4x adaptive AA / 8x HQ AF (without soft shadows, of course), and if I recall correctly the framerate was between 37 and 75 fps.
My rig is: X1900 XTX / AMD Athlon 64 X2 4400+ / 2 GB dual-channel DDR.

Generally, from what I have seen up to today, I don't have any indication that a PC like this can have technically better games than the Xbox 360. Maybe on paper/specs, but not IRL. The only win that I give to my PC is more AF. But that's all.

According to this, the X1900XTX gets a 68 fps average at 1280x960, which is a higher resolution than the one you're running at. That's with max details aside from soft shadows, and 4x MSAA / 8x AF.

http://tomshardware.co.uk/2006/08/23/ati_radeon_x1950xtx_uk/page10.html

Perhaps the fact that you're using adaptive AA and running the demo, which may not be as optimised as the final game, is causing you to get lower performance.

Here the XTX, even with soft shadows, is achieving 64 fps, and that's at the higher resolution again of 1280x1024 with 16x AF. Although the lowest framerate is 30 fps, which is lower than yours.

http://xbitlabs.com/articles/video/display/ati-x1950xtx_13.html

Still, at these very high quality settings it's good enough to cap the framerate at 60 fps and have it stay there a good amount of the time while never falling below 30 fps.

That seems pretty impressive to me for a GPU running a title of this calibre, which probably has very little, if any, specific optimisation for its architecture.
 
So it depends on what you mean. Asking if Xenos is faster than the X1900XTX is basically only asking if it's a little faster than the X1800XT, which I think it is. Based on the theoretical specs, Xenos should have more shader power than the X1800XT, for example, which only has 16 pixel shaders.

Well, I meant that the X1900XTX is well ahead of the 360 because of shader resources and total memory bandwidth. The 360 only has a 128-bit memory bus when you exclude the eDRAM. The eDRAM is awesome, but it hasn't been used much to this point, I believe, and that cripples the 360.
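
For rough context, the raw main-memory numbers that comparison rests on (a quick sketch using the published clocks; the 360's eDRAM bandwidth is left out since it only serves the ROPs):

    def bandwidth_gb_s(bus_bits, clock_mhz, ddr_multiplier=2):
        # bytes per second = bus width in bytes * effective transfer rate
        return bus_bits / 8 * clock_mhz * 1e6 * ddr_multiplier / 1e9

    print(bandwidth_gb_s(128, 700))   # Xbox 360 GDDR3 (128-bit @ 700 MHz): ~22.4 GB/s
    print(bandwidth_gb_s(256, 775))   # X1900XTX GDDR3 (256-bit @ 775 MHz): ~49.6 GB/s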

Don't get me wrong though, I think 360 has a stunningly well designed GPU. I think it's far smarter than sticking a crippled 7900 into PS3. Won't really matter, however. They're both basically the same generation and capabilities and so will have similar gfx, unlike PS2 vs. Xbox say. Especially if most games are ports. Ports are just out of control.

However, on the PC front we'll probably have R600/G80 for Crysis, and that will really be a lot better than the consoles. The consoles need to switch over to exclusives before then cuz they will not compare well with ported games against the upcoming PC video cards.

Personally I'm disappointed in all the ports. I won't buy a console cuz it has Half Life 2, Oblivion, Crysis, or Far Cry. I can play those on my expensive PC that isn't going anywhere. I don't use my PC exclusively for games, and I enjoy building powerful PCs too much. I will buy a 360/PS3 if they get some awesome exclusives that really leverage the platform's qualities (i.e. gamepad, couch, big TV, visibly push the hardware, etc). I'm probably going to pick up a Wii though cuz Nintendo seems to be on track with console games.
 
pjbliverpool said:
Here the XTX, even with soft shadows, is achieving 64 fps, and that's at the higher resolution again of 1280x1024 with 16x AF. Although the lowest framerate is 30 fps, which is lower than yours.

http://xbitlabs.com/articles/video/d...950xtx_13.html

Still, at these very high quality settings it's good enough to cap the framerate at 60 fps and have it stay there a good amount of the time while never falling below 30 fps.
If I am wrong, correct me, but from what I can see there aren't soft shadows. It's just 1280x1024 with 4x FSAA (not adaptive) and 16x AF.
min: 30 - max: 64
That seems pretty impressive to me for a GPU running a title of this calibre, which probably has very little, if any, specific optimisation for its architecture.
Sorry, but for me it is not impressive at all. We are talking about a previous-gen title, without soft shadows and HDR, that runs between 37 and 75 fps on hardware that cost me 1600 euros. It's OK, but definitely not impressive. And even less impressive is the fact that when we look at titles with next-gen material, things get worse. Look at your link for the GRAW benchmark: 1280x1024 with HDR but zero AA, and the game runs at 21-47 fps. The same happens with the demo of Call of Juarez when I test it.
I mean, it seems to me that the 7900 GTX and X1900 XTX are not as capable as the Xbox 360 at handling next-gen gaming IRL, only on paper.
 