Are next generation consoles not good enough for Crysis?

In 3~4 yrs time (maybe less) an X1950XTX won't be powerful enough to run 4th- or 5th-generation 360 games with the same quality as the 360.
 
And the X1950XTX will still run PC games noticeably better than their respective ports to the 360. I wouldn't say a GeForce 3 could run Halo 2, but it ran Halo 1 and Doom 3 better than their Xbox ports.
 
When the Xbox arrived we had the GeForce 3 and Radeon 8500, two boards that were obviously outclassed by NV2A. Well, the Radeon 8500 was maybe its equal.

That is not the case with X1800/G70. Not only do both the X1800 and G70 have huge local RAM sizes and 256-bit RAM data buses, but they also have equal or better internal capabilities than the 360. They are theoretically somewhat less flexible, but the performance is ahead on the PC.
 
When the Xbox arrived we had the GeForce 3 and Radeon 8500, two boards that were obviously outclassed by NV2A. Well, the Radeon 8500 was maybe its equal.

That is not the case with X1800/G70. Not only do both the X1800 and G70 have huge local RAM sizes and 256-bit RAM data buses, but they also have equal or better internal capabilities than the 360. They are theoretically somewhat less flexible, but the performance is ahead on the PC.

Xbox 1 also had a rather slow processor (weren't PC processors pushing 2 GHz when it came out?), a paltry amount of RAM, and somewhat slow graphics memory. In my experience, a 2 GHz computer with a GeForce 3 and a good amount of RAM performed better in cross-platform games than the Xbox (let's discount Halo, though, since it received a graphical downgrade on DX8 cards on the PC), though just barely. Actually, it was probably more like identical performance, but the Xbox was locked to 30 fps at 640x480 whereas the PC might be able to handle 800x600 at 32 fps or 640x480 at 40 fps.
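As a rough sanity check on those numbers, you can compare raw pixel throughput (resolution times framerate, ignoring everything else a frame costs, so a crude metric). A minimal sketch using the figures from this post:

```python
# Raw pixel throughput: resolution x framerate. Crude on purpose --
# it ignores CPU cost, overdraw and shader load, fill only.
def pixels_per_second(width, height, fps):
    return width * height * fps

xbox  = pixels_per_second(640, 480, 30)  # Xbox locked at 30 fps, 640x480
pc_hi = pixels_per_second(800, 600, 32)  # PC at 800x600, 32 fps
pc_lo = pixels_per_second(640, 480, 40)  # PC at 640x480, 40 fps

print(xbox, pc_hi, pc_lo)  # 9216000 15360000 12288000
```

Both PC cases push more raw pixels per second than the locked Xbox output, which is consistent with "roughly identical performance" at a higher setting.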
 
Correct me if I'm wrong, but from what I can see there aren't soft shadows. It's just 1280x1024 with 4x FSAA (adaptive not enabled) and 16x AF.
min: 30 - max: 64

http://xbitlabs.com/articles/video/display/ati-x1950xtx_13.html

"We select the highest graphics quality settings in each game, "

And 64 fps isn't the maximum, it's the average. The maximum would be much higher.

Sorry, but to me it's not impressive at all. We are talking about a previous-gen title, without soft shadows and HDR, that runs between 37 and 75 fps on hardware that cost me 1600 euros. It's OK, but definitely not impressive.

Compared to a next-gen console pushing the same game at a lower resolution, with no texture filtering, inferior anti-aliasing and no soft shadows, but added HDR, all at a likely lower framerate, I wouldn't say that's bad.

The cost of your entire PC isn't really an issue, since it has more functions, and you're paying a premium on the hardware to pay less on the games.

And even less impressive is the fact that when we look at titles with next-gen material, things get worse. Look at your link for the GRAW benchmark: 1280x1024 with HDR but zero AA, and the game runs at 21-47 frames.

Again, you're confusing average framerate with maximum framerate. Everyone knows GRAW is a horrible system hog for its graphics, but a 47 fps average at 1280x1024 with HDR and 16x AF isn't too bad. Plus the game has an "edge smoothing" option in the menu which has a similar effect to MSAA, which, based on the previous comments about all settings at max, I can only assume is turned on.
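The distinction matters because a benchmark's average sits well below its peak. A minimal sketch with invented frame times (the numbers are made up purely to illustrate average vs maximum):

```python
# Min/avg/max fps derived from hypothetical per-frame times in ms.
frame_times_ms = [12, 15, 21, 25, 33, 40, 48]  # invented sample

fps = [1000.0 / t for t in frame_times_ms]
avg = sum(fps) / len(fps)
print(f"min {min(fps):.0f}, avg {avg:.0f}, max {max(fps):.0f}")
# min 21, avg 45, max 83 -- quoting the max overstates things badly
```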

I mean, it seems to me that the 7900GTX and 1900XTX are not as capable as the Xbox 360 of handling next-gen gaming IRL, only on paper.

What evidence do you have showing the X360 performing superior feats? All I see is lower resolution, generally lower image quality and framerates which are optimistically equal.
 
it's a freakin' port.. what do you expect?
Only games specifically written for a single console will (maybe) really push a system.. but not now, in 2 or 3 years, lol :)
 
In 3~4 yrs time (maybe less) an X1950XTX won't be powerful enough to run 4th- or 5th-generation 360 games with the same quality as the 360.

On what do you base that? On paper the X1950XTX is arguably a bigger jump over Xenos than the Ti4600 was over NV2a.

Can you provide a benchmark in any game showing NV2a outperforming a Ti4600 at the same settings today?

Certainly there are features of Xenos that the 1900 can't replicate in the same way, but in terms of producing the same level of visuals I don't see why it would be any less capable.
 
Compared to a next-gen console pushing the same game at a lower resolution, with no texture filtering, inferior anti-aliasing and no soft shadows, but added HDR, all at a likely lower framerate, I wouldn't say that's bad.

I don't know how games are running on PC right now, since I haven't played on a high-end PC for ages, but I saw a GC FEAR demo running on X360 and it certainly did have AF turned on.
only games specifically written for a single console will (maybe) really push a system.. but not now, in 2 or 3 years
Yeah, but I expect Gears of War and FM2 to show off the system quite nicely;)
 
Can Crysis be done on next gen?

Every time I see new footage of Crysis on PC, I ask myself why the console companies bother. Some of the features, effects and gameplay innovations in that game are just so far ahead of what console developers are doing. Can a game of this caliber be achieved on the next gen of consoles that are currently heading into the market?
 
Ninja Theory

Quote:
Originally Posted by Ninja Of Chaos
How could they look as good as/better than Crysis, if the PS3 has no DX10 shaders whatsoever?

Nao-
CELL is, in many ways, way more flexible than a DX10 geometry (vertex shader + geometry shader) pipeline.
Devs will exploit it in ways that even the original CELL designers had never imagined; it's only a matter of time, no doubt about it.


Quote:
I believe that Crysis was running at E3 on DX9 GPUs (although in SLI/Crossfire configuration). nAo, do you think that PS3 will eventually reach Crysis' graphical level?

Nao-
Yes, I do.

And how are you going to fit it in 512 MB?

How are you going to get the bandwidth to run 4-8x AA?

Will the RSX bandwidth allow Cell to be used like DX10?

etc.

And Crysis is only one.. what about the better-looking PC games down the pike after that?

PC game devs hold all the cards here. They're going to be working with 1-2 GB of RAM, if not 2-4 GB, for starters. They're going to be working with Core Duos that probably smoke a Cell/Xbox 360 CPU in real terms (Carmack: "PC CPUs might be twice as fast"). They're going to be working with video cards with much more raw power, and 48-60 GB/s of bandwidth. Of course they won't be working in a closed box, we know that, but the former advantages will trump the latter disadvantages. And even the "yeah, but you can't target high end" isn't entirely true, as PC gamers can up settings (textures, res, LOD, AA) in many areas if they have high-end hardware, and by the time Crysis ships 7900GT/X-level cards will be mainstream (if they aren't already).
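On the AA bandwidth question above, a back-of-the-envelope estimate is possible. The model below is an assumption (8 bytes per sample touched three times per frame, no framebuffer compression), not how any particular GPU actually behaves:

```python
# Rough framebuffer traffic for MSAA. Assumes 8 bytes per sample
# (4 color + 4 depth) touched 3 times per frame (color write, depth
# read, depth write) and no compression -- all simplifying assumptions.
def msaa_gbytes_per_sec(w, h, samples, fps, bytes_per_sample=8, touches=3):
    return w * h * samples * bytes_per_sample * touches * fps / 1e9

print(msaa_gbytes_per_sec(1280, 720, 4, 60))  # ~5.3 GB/s for 720p 4x
print(msaa_gbytes_per_sec(1280, 720, 8, 60))  # ~10.6 GB/s for 720p 8x
```

Even under these crude assumptions, the framebuffer alone eats a noticeable slice of a console's memory bandwidth before textures and geometry are counted.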

You might get a 80% port of Crysis on the consoles someday. It will look great but be lacking in certain areas compared to the PC.

But hey I'm not a dev and NAO is, it's just my opinion..

And to be fair, Heavenly Sword in its own way is nearly if not as graphically impressive as Crysis. And just to spread the love, Gears of War probably looks better than any PC game to date (although I think high-end PCs could likely run Gears, the fact is they don't get a chance to).
 
"We select the highest graphics quality settings in each game, "
Again, he doesn't say anywhere that they enable soft shadows with AA. I also chose the highest graphics quality for the demo, and that's the reason I don't have soft shadows.
Here is the B3D benchmark, which is closer to my settings (1280x960 / 4x FSAA / 8x AF): 66 frames. Maybe Dave could enlighten us about soft shadows.
http://www.beyond3d.com/reviews/ati/r580/index.php?p=16

pjbliverpool said:
Compared to a next-gen console pushing the same game at a lower resolution, with no texture filtering, inferior anti-aliasing and no soft shadows, but added HDR, all at a likely lower framerate, I wouldn't say that's bad.
We don't really know how the FINAL console version of FEAR is going to look. Everything you say is hypothetical, based on "work in progress" footage.

The cost of your entire PC isn't really an issue, since it has more functions, and you're paying a premium on the hardware to pay less on the games.
Most people don't pay a bunch of money just to be able to play previous-gen games with a better framerate at maxed settings.
When I'm paying a premium on the hardware, I expect that hardware to live up to its next-gen burden. Unfortunately, I can't see that happening at the moment.

Again, you're confusing average framerate with maximum framerate. Everyone knows GRAW is a horrible system hog for its graphics, but a 47 fps average at 1280x1024 with HDR and 16x AF isn't too bad.
OK, I was wrong about the max framerate, but I don't buy the 47 fps average. The game suffers from horrible tearing (at least the demo dips very often under 25 frames). It doesn't have AA vs the 4x AA of the 360 version, and it has bland HDR, worse graphics and a lower polycount compared with its console counterpart.
As I told you, the same happens with the CoJ demo in the second level (open environment). The tearing is noticeable and the jaggies horrible. I am forced to play the 2nd level at 1024x768, which is lower than the native console resolution, in order to get a better framerate, and still I can't enable AA.
Not to mention Just Cause, where the PC version is missing most of the next-gen stuff of the console version.
What evidence do you have showing the X360 performing superior feats? All I see is lower resolution, generally lower image quality and framerates which are optimistically equal.
Don't forget IRL. A high-end PC with a 1900XTX carries all the proper specs to run a DX9-level NG game. But this can happen only with a reasonable level of optimisation, and I just don't see it happening at the moment. When we pass to NG gaming (real-time DoF, motion blur, HDR+AA, etc.) all I see for the time being is inferior PC games.
I really wish this changes with the second wave of NG games (BIA, STRANGLEHOLD, COD...), or else when we mention cards like the 7900/1900 we will speak of the wasted generation of PC GPUs.
 
This topic has been done to death before, plus it's a 'PC versus console' debate in disguise which is to be frowned upon.

I believe recently a job posting was up at CryTek for an XB360 developer. Looking there now, we see a PS3 position too. Thus we'll get to see how well the consoles can manage the engine, and also how other developers compare.
 
Blah, technical specs and benchmarks are great, but the fact is nothing currently on the PC looks anywhere near as good as games like Ghost Recon (the PC version looks so lame), Moto GP, PGR etc.

I have high hopes for Crysis, but it seems to run extremely badly in every video released, especially in the forest scenes.

UT2007 was looking impressive, but the new gameplay video looks much worse than all the tech demos.

I'm certainly not a console fan, as I have no plans to buy any console; it's just disappointing how much worse PC gaming continues to get. Hopefully Vista will bring a change.
 
...and by the time Crysis ships 7900GT/X level cards will be mainstream (if they aren't already).

"Mainstream" doesn't mean "me and all my friends have one".

Current market penetration of 7900GT-level cards is single-digit percentage; by the time Crysis ships it will be lucky to have reached 20%.

See, for example, the oft-cited Steam hardware survey:

http://steampowered.com/status/survey.html

which probably trends more high-end than the entire gaming population. "Mainstream" is what the 5200, 6600 and 9600 are.

By the time Crysis ships, there will probably be 10-20x more Xbox360s and PS3s combined, than there are PCs capable of running it.
 
Can you provide a benchmark in any game showing NV2a outperforming a Ti4600 at the same settings today?

Benchmarking a console?

You can try with games.

The NV2A totally lacks local memory, but you can try Doom 3, Riddick, Halo, Splinter Cell 4, Half-Life 2
on a PC with a Celeron 700, 64 MB of RAM and a Ti4600.

At 640x480, do you think you'd get 30 fps with vsync on?

Again, do you think a Ti4600 could run a game with the visuals of Black at that performance?

And the NV2A is only a GF3 + 1 vertex unit without local memory (using system memory for the framebuffer and textures); on paper it's light-years worse than a GF3 with dedicated local memory.

Certainly there are features of Xenos that the 1900 can't replicate in the same way, but in terms of producing the same level of visuals I don't see why it would be any less capable.

I don't know if an X1900 can run games like Gears of War with the same settings. What I remember is the demo of GoW shown last year at E3: 10-15 fps with an SLI rig on the PC version of the UE3. OK, the code was immature because of the date, and in a year of development the console can drastically improve performance thanks to special dedicated optimizations.
This answers your question of why the R500 can do visuals that aren't possible on the X1900 (which is a great, great GPU): because the R500 is in a closed box and can be optimized in so many ways.

There are so many things that were not yet available to developers.. an efficient tiling system, for example, and better extensions; a big Direct360 update will be there in time for 3rd-generation titles.
We'll see a lot of procedural textures and memexport use in the future; the whole system is almost underused at the moment, and most major engines, UE3 for example, are just ports of PC engines and don't use any of the special architecture of Xenos/Xenon.

All we have to do is wait and see. At this moment we are in the very early life of the console; in 2 years we'll see visuals not currently possible in the PC and console worlds.

When this happens we will have powerful PCs, but the consoles will strongly improve their visual quality.. it's the same old story, isn't it?

The major difference is that the Xbox was born old, with mid-to-low PC specs; the 360 was born with a lot of things not yet available in the PC world (advanced shaders vs 3.0, unified architecture vs the old static architecture, DX10-class functions, a hardware tessellator, cache sharing, thread locking, a CPU that can access fragment and vertex data while the GPU is processing it... etc.).

I'm not directly comparing performance, just because there are a lot of factors (PC GPUs have a local 256-bit bus, and if you don't tile to use the eDRAM right you'll take a hit of 20-40% on the console..... for example); I'm only saying that the 360 is far better relative to its time than the Xbox was. Wait and see ;)
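The eDRAM/tiling point can be made concrete with a little arithmetic. Assuming 4 bytes of color and 4 bytes of depth per sample (an assumed, simplified format), a full 720p 4x MSAA render target doesn't fit in Xenos's 10 MB of eDRAM, hence the tiling:

```python
import math

# Why tiling is needed on Xenos: a full 720p 4x MSAA render target
# (assuming 4 B color + 4 B depth per sample) exceeds the 10 MB eDRAM.
EDRAM_BYTES = 10 * 1024 * 1024

def target_bytes(w, h, samples, bytes_per_sample=8):
    return w * h * samples * bytes_per_sample

size = target_bytes(1280, 720, 4)
tiles = math.ceil(size / EDRAM_BYTES)
print(f"{size / 2**20:.1f} MiB -> {tiles} tiles")  # 28.1 MiB -> 3 tiles
```

Each tile is rendered into eDRAM and resolved out to main memory in turn, which is where the quoted 20-40% hit comes from if the geometry isn't binned per tile.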
 
Couldn't have put it better myself, MasterDisaster. The Xbox was born a low-end PC and the 360 a very high-end PC. One of the differences, though, is that the Xbox was so PC-like that it made it easier to get decent performance from the get-go. The 360 is a far more unusual machine and will take much longer to get anywhere near its full potential. I can really see that with optimization the 360 may well keep parity in performance with a high-end PC for several years as coders really start to learn how to make it tick. Of course a lot of this is also down to Microsoft producing ever more capable compilers, tools and, in effect, drivers for the chipset. People may laugh at Microsoft producing high-performance code, but let's be honest, they have some very clever people and a huge part of their operation is tool building. People trying to develop on the PS3 are, by general opinion, having a much worse time, simply due to Sony's inability to get decent tools built. I personally think in the mid term this will negate any possible performance lead the PS3 might have, though obviously in the end developers will really get their heads around the PS3 and then things might change slightly.
 
There's nothing in Crysis you can't do on a next-gen console at a proper frame rate, from what we have seen so far.
Maybe they're having some problems with memory occupation..

I definitely agree on this. From what we could see, I can't see any technique that can't be implemented at a reasonable framerate on PS3 or X360. The memory footprint will depend on the actual game probably. Maybe some textures have to be scaled down if no streaming architecture is in place. Again this is game-specific, but in principle it's absolutely feasible. I can see both consoles delivering a comparable image quality in the near future.
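The texture-scaling option is cheaper than it might sound: because each mip level holds a quarter of the texels of the level above, dropping just the top level of a chain recovers roughly 75% of its memory. A sketch assuming an uncompressed 4-bytes-per-texel format:

```python
# Total bytes in a full mip chain; each level is 1/4 the previous one,
# so the whole chain is ~4/3 of the base level. 4 B/texel is assumed.
def mip_chain_bytes(w, h, bytes_per_texel=4):
    total = 0
    while w >= 1 and h >= 1:
        total += w * h * bytes_per_texel
        w //= 2
        h //= 2
    return total

full    = mip_chain_bytes(2048, 2048)  # full-resolution texture
dropped = mip_chain_bytes(1024, 1024)  # top mip level dropped
print(full, dropped, round(dropped / full, 2))  # ratio is ~0.25
```

A streaming system achieves the same effect dynamically by only keeping the lower mips resident until the high-detail level is actually needed.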

Fran/Fable2
 