Could PS3 and X360 manage the CryEngine 2?

All that extra power "we like to boast about" allows games like Crysis to run at 720p on PCs while the level of visual fidelity would have to be greatly reduced to do so on the consoles. My GTS struggles with Crysis at 720p and high details. Why on Earth would I believe the RSX, with about half the power, could handle it at higher details at the same resolution?

Don't forget that 600p seems to be a more than acceptable resolution this gen, which would give a 30+% boost in shader perf over 720p... let alone other optimizations that can be done.
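
A quick sanity check on that pixel math, assuming "600p" here means a roughly 16:9 frame of about 1066x600 (the exact width varies per game, so treat this as an illustration):

\[
\frac{1280 \times 720 \;-\; 1066 \times 600}{1280 \times 720} \;=\; \frac{921600 - 639600}{921600} \;\approx\; 0.31
\]

So around 31% fewer pixels to shade, which is where a "30+%" figure comes from; strictly speaking, it buys roughly a 1.44x per-pixel budget.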
 
So this compression gives something like 2:1 (same as S3TC) instead of the usual 4:1 for 3Dc?

(sorry if I'm mistaken)

The 3Dc compression method gives 4:1 lossy compression if you're using 32-bit textures, 2:1 if you're using 16-bit. To put it simply, you go from 16*32 (or 16*16) stored bits down to 16*6 + 4*8 bits.
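
Spelling out that arithmetic for a 4x4 block (16 texels, two stored channels, two 8-bit reference values per channel plus a 3-bit index per texel per channel):

\[
16 \times 32 = 512 \text{ bits} \;\rightarrow\; 16 \times 6 + 4 \times 8 = 128 \text{ bits} \;\Rightarrow\; 4\!:\!1,
\qquad
16 \times 16 = 256 \text{ bits} \;\rightarrow\; 128 \text{ bits} \;\Rightarrow\; 2\!:\!1.
\]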

Technically, A8L8 isn't a compression method, it's a two-component texture format, so it nets 2 bytes per texel as opposed to 3 bytes. They store the X and Y values and calculate the normalized Z component in a shader, i.e. they don't store the Z value. So in practical terms it should be a lossless format with 1/3 less space.
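
As a rough sketch of what that shader-side reconstruction boils down to (written as plain C rather than shader code just to illustrate; it assumes the stored values have already been remapped from [0,255] to [-1,1]):

Code:
#include <math.h>

/* Rebuild the Z component of a two-channel tangent-space normal from the
 * unit-length constraint x^2 + y^2 + z^2 = 1. In a game this would run in
 * the pixel shader after sampling the A8L8/V8U8 texture. */
void reconstruct_normal(float x, float y, float out[3])
{
    float zsq = 1.0f - x * x - y * y;
    out[0] = x;
    out[1] = y;
    /* clamp against quantization/rounding error; tangent-space normals
     * always point "outwards", so z is taken as non-negative */
    out[2] = zsq > 0.0f ? sqrtf(zsq) : 0.0f;
}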
 
All that extra power "we like to boast about" allows games like Crysis to run at 720p on PCs while the level of visual fidelity would have to be greatly reduced to do so on the consoles. My GTS struggles with Crysis at 720p and high details. Why on Earth would I believe the RSX, with about half the power, could handle it at higher details at the same resolution?

Well, first, Carmack has said the advantage of "closed box" consoles equates to about 2x, so RSX, according to that, would perhaps be able to nearly equal an 8800GTX...

Second, a GTS, I believe, is only about 50% faster than a 7900; we're not even talking GTX 2x here... so you're even closer...

I think the big problem remains RAM, though.
 
The 3Dc compression method gives 4:1 lossy compression if you're using 32-bit textures, 2:1 if you're using 16-bit. To put it simply, you go from 16*32 (or 16*16) stored bits down to 16*6 + 4*8 bits.

Technically, A8L8 isn't a compression method, it's a two-component texture format, so it nets 2 bytes per texel as opposed to 3 bytes. They store the X and Y values and calculate the normalized Z component in a shader, i.e. they don't store the Z value. So in practical terms it should be a lossless format with 1/3 less space.


Thanks a lot for the information, but does this mean the RSX/G70 can somehow support 3Dc? (3Dc, if I'm not mistaken, was "born" in ATI's labs and is "free" today like S3TC/DXTC?) How can the SPUs help in this process?

(3:1 for normal maps, that's good)
 
Thanks a lot for the information, but does this mean the RSX/G70 can somehow support 3Dc? (3Dc, if I'm not mistaken, was "born" in ATI's labs and is "free" today like S3TC/DXTC?) How can the SPUs help in this process?

The G7x/NV4x line does not support 3Dc decompression in hardware as Xenos would. On PC, when an application uses 3Dc, the G7x/NV4x driver converts the data to the A8L8 or V8U8 texture format.

(Devs could use CxV8U8 instead of V8U8, which does the Z calculation automatically btw).

I'm not sure what the cost would be to implement 3Dc in software.

(3:1 for normal maps, that's good)
It's less than that (you misunderstood me). You're getting rid of the third component (because it'll be calculated), so you're going from a set of three data points to two.

*It should be noted that regenerating the normal/Z component like this works for tangent-space normal maps.
 
Well, first, Carmack has said the advantage of "closed box" consoles equates to about 2x, so RSX, according to that, would perhaps be able to nearly equal an 8800GTX...

Second, a GTS, I believe, is only about 50% faster than a 7900; we're not even talking GTX 2x here... so you're even closer...

I think the big problem remains RAM, though.

Unless you provide a link to that quote I doubt it, and even then he is way off line with that remark. The RSX can't even match a 7900GT, so how would it then match an 8800GTX? ;)

And anyone claiming the RSX is closer to a 7900GTX than a GT must be smoking strong stuff. :LOL:

7900GTX: 650+MHz, 50+GB/sec, 24/8 pixel/vertex shaders, 16 ROPs
7900GT: ~500+MHz, ~45+GB/sec, 24/8 pixel/vertex shaders, 16 ROPs
RSX: 500MHz, ~2x22.5GB/sec for the whole system (~22.5GB/sec of it VRAM), 8 ROPs
 
Well, first, Carmack has said the advantage of "closed box" consoles equates to about 2x, so RSX, according to that, would perhaps be able to nearly equal an 8800GTX...

And did he say that specifically in relation to GPUs, or was he talking about the whole system? Of course we know memory gets a huge advantage from being closed box. And to a lesser extent so does the CPU. But the GPU? No, 2x is a massive exaggeration. You don't get twice the fill rate from going to the metal, nor do you get twice the texturing throughput, memory bandwidth or shader power.

The Ti4600 was FAR less than 2x more powerful than NV2A, and yet were there any cross-platform games from that generation that the Ti4600 couldn't run faster at the same detail and resolution? Did the Xbox have games that looked twice as good as Doom 3, HL2 and Far Cry running on a Ti4600?

Second, a GTS, I believe, is only about 50% faster than a 7900; we're not even talking GTX 2x here... so you're even closer...

In older games it may only be about 50% faster but in newer, more shader intensive games (i.e. the type of games that PS3 would be running), it can be 2x or more.

Oblivion:

http://www23.tomshardware.com/graphics_2007.html?modelx=33&model1=707&model2=709&chart=296

Crysis (vs 7900GS):

http://techreport.com/articles.x/13603/6

Call of Duty 4 (vs 7900GT):

http://www.firingsquad.com/hardware/nvidia_geforce_8800_gt_performance/page14.asp

Lost Planet (vs 7900GT):

http://www.firingsquad.com/hardware/nvidia_geforce_8800_gt_performance/page13.asp

And RSX isn't a 7900GTX. It's closer to a 7900GT, but even that is debatable given its fewer ROPs and narrower bus.

Bottom line is, how many games on both PS3 and PC do you see where the PS3 version is able to perform at the same level (resolution, framerate, detail, image quality) as a GTS-powered PC? Because I see a big fat zero.

EDIT: Just found another benchmark comparing the 7900GTX to the 8800GTS 640MB in Crysis:

http://www.amdzone.com/index.php/co...demo-dx9-gpu-and-cpu-core-scaling-performance

Here we see the GTS doubling the 7900GTX's performance. And given that the 7900GTX is quite a bit faster than RSX, it appears I was wrong about my GPU being twice as fast after all. It is in fact more than twice as fast (in Crysis). :D
 
The G7x/NV4x line does not support 3Dc decompression in hardware as Xenos would. On PC, when an application uses 3Dc, the G7x/NV4x driver converts the data to the A8L8 or V8U8 texture format.

(Devs could use CxV8U8 instead of V8U8, which does the Z calculation automatically btw).

I'm not sure what the cost would be to implement 3Dc in software.

It's less than that (you misunderstood me). You're getting rid of the third component (because it'll be calculated), so you're going from a set of three data points to two.

*It should be noted that regenerating the normal/Z component like this works for tangent-space normal maps.

I understand now, and thanks again for the info.
 
You seem to be under the mistaken impression that tessellation is some kind of magic bullet. It's not. In fact I don't see how it's even particularly applicable to Crysis. It's shaders that sap up all the power in Crysis. How would tessellation help with that? Sure it has a lot of geometry as well, but it's doing a ton of other things.

Besides, the R6xx series all have tessellators. If it could magically make the HD2600 suddenly capable of running Crysis at Very High, don't you think it would have happened already?

Tessellation is a technique that saves a lot of memory, a much-needed feature for consoles to keep up with the PC and to solve some memory issues.

As pointed out, Viva Pinata uses tessellation and, I tell you what, the game looks stunning; even top-end PC users would say it's acceptably good.

Look at the colours, the AA and the level of detail. It's a true "next gen" game.

[Image: viva-pinata-e3-2006-pic10.jpg]


As for the R6xx series, the PC world is a total mess because there are too many graphics cards available, and the older ones hinder the newer ones which feature awesome and unique techniques such as tessellation.

Without some kind of detailed explanation of what "smart techniques" could be used and how they would improve performance over the existing techniques, I would have to put that down to wishful thinking. There are plenty of games on PC and consoles which the GTX creams through at much higher settings than the console. Why should we believe that all of a sudden the consoles are now capable of running a game much better than a GTX? Did it happen with Far Cry vs the R3xx series?

Console developers work intimately with the hardware; they have secret information we don't know about that could perhaps fill volumes of textbooks.

For instance, most console games tend to have more polygons than their PC counterparts; the PC origin of games like HL2 and Doom 3 shows because those developers don't know the hardware as intimately and there's always a lowest-common-denominator approach.

All that extra power "we like to boast about" allows games like Crysis to run at 720p on PCs while the level of visual fidelity would have to be greatly reduced to do so on the consoles. My GTS struggles with Crysis at 720p and high details. Why on Earth would I believe the RSX, with about half the power, could handle it at higher details at the same resolution?

The X360 can't compete with those monsters. Period.

I've never said that about the console and I never will, but it can compete for graphics quality at 720p, perhaps not quantity (8xAA, 16xAF, and those amazingly crazy numbers), though. Regarding the RSX, it can be helped by Cell while the GTX is on its own, but yeah, the GTX is also in a different league.

If the "raw power" of my GTS is going unused when its struggling to run Crysis as 720p would you mind telling me were it is? Because I wouldn't mind going to find it so I can get all those extra frames that my GPU is apparently leaving on the table.
Thank goodness you are technically minded and can tweak the settings yourself, I have no idea how the average user could have figured that out and tweaked it.

Even so, you are experiencing performance issues... and you shouldn't.
The real question is, how many PC games will take FULL advantage of the massively powerful and expensive graphics cards? I for one prefer developers tweak the game for me because it's their job after all, and they know better.

Sometimes the engine they use sets the limit, though; that's my main complaint so far with UT3, for instance: not very friendly with AA, perhaps (I don't know) tessellation, etc.

Cheers pbjliverpool
 
Tessellation is a technique that saves a lot of memory, a much-needed feature for consoles to keep up with the PC and to solve some memory issues.

As pointed out, Viva Pinata uses tessellation and, I tell you what, the game looks stunning; even top-end PC users would say it's acceptably good.

Look at the colours, the AA and the level of detail. It's a true "next gen" game.

Viva Pinata looks great; I have played the demo on 360, but it's mostly art IMO, I'm not seeing anything technically marvelous there. Certainly nothing even remotely in Crysis's league. Oh, and VP works fine on the PC as well without resorting to tessellation, so obviously its look isn't reliant on it.

As for the R6xx series, the PC world is a total mess because there are too many graphics cards available, and the older ones hinder the newer ones which feature awesome and unique techniques such as tessellation.

There's absolutely nothing stopping a PC dev from making use of a particular feature of a specific GPU architecture. id did it with the Doom 3 engine, and several other games do it as well. Hell, you mentioned one yourself above: Viva Pinata.

Console developers work intimately with the hardware; they have secret information we don't know about that could perhaps fill volumes of textbooks.

That's basically saying nothing. You can't just say that developers will use "special techniques" to get almost magical capabilities out of the consoles and then back that up with "I don't know what they are because I'm not a console dev". If you can show some evidence of these techniques then that's a good starting point, but as I said in my previous post, did the Xbox ever match 9700 performance? Has either current-gen console even hinted at showing capabilities on par with a GTX in any cross-platform game?

For instance, most console games tend to have more polygons than their PC counterparts; the PC origin of games like HL2 and Doom 3 shows because those developers don't know the hardware as intimately and there's always a lowest-common-denominator approach.

HL2 and Doom 3 were from the last console generation. They both go far beyond anything the consoles of that generation did, so comparing their polygon counts to modern games is pointless.

Modern PC games absolutely do not have lower polygon counts than their console counterparts. If anything they have more. Look at Crysis, look at Oblivion, look at Lost Planet. All have more geometry than their console versions.

The X360 can't compete with those monsters. Period.

I've never said that about the console and I never will, but it can compete for graphics quality at 720p, perhaps not quantity (8xAA, 16xAF, and those amazingly crazy numbers)

You seem to be under the impression that the additional power is only of use for resolutions higher than 720p with insane image quality. That's simply not true. More power is more power at any resolution. If a GTX can do the same at 1920x1200 with 8xAA as the 360 can do at 720p with 2xAA, then the GTX can do a LOT MORE at 720p with 2xAA. Crysis is the perfect example of what a GTX can do when you limit resolution and AA to console levels.
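
For what it's worth, the raw pixel counts behind that comparison:

\[
\frac{1920 \times 1200}{1280 \times 720} \;=\; \frac{2304000}{921600} \;=\; 2.5
\]

So matching the 360's output at 1920x1200 means pushing 2.5x the pixels before the heavier AA is even counted; dropping back to 720p frees all of that headroom for extra detail instead.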

though. Regarding the RSX, it can be helped by Cell while the GTX is on its own, but yeah, the GTX is also in a different league.

Yes, the GTX is in a different league, and even it can't handle Crysis at max details and 720p, so what makes you think the much weaker GPUs of the consoles could?

Thank goodness you are technically minded and can tweak the settings yourself; I have no idea how the average user could have figured that out and tweaked it.

I'm afraid I have no idea what you're talking about there. What did I tweak? I'm simply playing Crysis on my GPU and watching it slow to a crawl because all its "raw power" is being used up, and it's still not enough. Thus I reduce the load on my GPU by lowering the detail levels. No technical know-how required for that.

Even so, you are experiencing performance issues... and you shouldn't.

Of course I should; Crysis at max detail is too much for my GPU to handle, so of course I'm going to experience performance issues if I don't dial down the details. This is an example of how my GPU's power isn't being wasted, as I'm clearly using it all.

The real question is, how many PC games will take FULL advantage of the massively powerful and expensive graphics cards?

Most of them. I want to be playing all my games at 1920x1200 with 16xAA/16xAF and a solid 60fps. I can do that with very few modern releases, so far from being underutilised, I would say I don't have enough power in my GPU. In fact there are quite a few games in which I can't even achieve 30fps at those settings.

I for one prefer developers tweak the game for me because it's their job after all, and they know better.

Well then, consoles are perfect for you. Many PC games though, Crysis included, will do this for you anyway by detecting the best settings for your system.

Sometimes the engine they use sets the limit, though; that's my main complaint so far with UT3, for instance: not very friendly with AA, perhaps (I don't know) tessellation, etc.

I think UT3 works with AA in the latest driver. It's supposed to have an AA mode in DX10 as well, like Gears; hopefully we will see this in the final release, as it wasn't in the demo.

Regarding PC games not using all the GPU's features, like tessellation, well, that's not just a PC problem. Very few 360 games use that feature either. And how many use 4xAA with tiling? Or memexport? Just because it's in a console doesn't mean it's going to be heavily utilised.
 
But understand that Viva Pinata doesn't do the draw distance, detail at distance and texture detail that Crysis does. There are several things that differ: object detail, texture detail, etc. Viva Pinata is no proof at all that Crysis would be possible on consoles thanks to tessellation.

And most console games don't have more polygons than PC games, and those that do are also available for PC. Doom 3 and HL2, two great games, had quite a lot of polygons, more than most console games last gen.

Even so, you are experiencing performance issues... and you shouldn't.
The real question is, how many PC games will take FULL advantage of the massively powerful and expensive graphics cards? I for one prefer developers tweak the game for me because it's their job after all, and they know better.

You see, just because one experiences framerate problems doesn't mean that the engine is unoptimized. How about all those console games with tearing and framerate problems: Mass Effect with lots of slowdowns, Lair, etc.? It may also be that the engine is doing a lot of stuff instead of prebaking, having small objects cast shadows, etc. And games don't have to take full advantage of the PC hardware for it to still be far ahead. And I tell you that PC devs also tweak their engines, I hope you understand that! :smile:
 
Tessellation isn't that much of a wonder; it's basically converting a parametric higher-order surface into a polygonal mesh, and if the algorithm is adaptive, then you can also change the number of polygons created. More advanced methods may also add view dependency.
But it can only do the following things:
- reduce dataset size for geometry before it's processed in CPU/GPU
- scale on-screen triangle size to increase rasterization efficiency of quads
- can provide a sort of automatic LOD as well

It won't allow for any big wonders, and it will require specialized artwork. Properly detailed models will be so complex that they'll have far too many polygons to make it a win over today's highly optimized models in most cases. The only real use on characters and other objects is for extreme close-ups, which are quite rare in games.
Terrain and water in most games are already implemented with some sort of tessellation, but they're really not the most complex thing in a game...
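
To make the "adaptive" part above concrete, here is a hypothetical sketch (the function, names and the 8-pixel target are invented for illustration, not taken from any real engine): pick a subdivision level for a patch edge from its projected size on screen, so triangles end up roughly a constant size in pixels.

Code:
#include <math.h>

/* Choose how many times to subdivide a patch edge so that the resulting
 * triangle edges are roughly target_px pixels on screen. Illustrative only. */
int tessellation_level(float edge_world_len, float distance_to_camera,
                       float screen_height_px, float fov_y_radians,
                       int max_level)
{
    /* approximate scale from world units to pixels at this distance */
    float pixels_per_unit =
        screen_height_px / (2.0f * distance_to_camera * tanf(0.5f * fov_y_radians));
    float edge_px = edge_world_len * pixels_per_unit;

    const float target_px = 8.0f;   /* desired on-screen edge length */
    int level = 0;
    while (edge_px > target_px && level < max_level) {
        edge_px *= 0.5f;            /* each subdivision halves the edge */
        level++;
    }
    return level;                    /* 0 means leave the patch untessellated */
}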
 
Yes there is: RalliSport Challenge for PC, even with a 9700, wasn't able to run as smoothly as the Xbox version, which was locked at 60fps...

Rubbish. I was running RSC on my Ti4200 at 800x600 with a butter-smooth framerate, and the Ti4600 was at least 25% more powerful.

I don't recall RSC running at a locked 60fps anyway.
 
Rubbish. I was running RSC on my Ti4200 at 800x600 with a butter-smooth framerate, and the Ti4600 was at least 25% more powerful.

I don't recall RSC running at a locked 60fps anyway.

RSC didn't run at 60fps on Xbox. On the other hand, to run Halo on PC as well as the Xbox version, you need at least double the processing power, a graphics card like a Radeon 9600XT, and at least eight times the RAM of the original Xbox, which was 64MB if I'm not mistaken.

With this I'm not saying that I think Crysis would run as well on the 360 and PS3 with max PC settings, I think that's very unrealistic thinking, but I do believe that you will see more impressive games in this console generation than Crysis.
 
RSC didn't run at 60fps on Xbox. On the other hand, to run Halo on PC as well as the Xbox version, you need at least double the processing power, a graphics card like a Radeon 9600XT, and at least eight times the RAM of the original Xbox, which was 64MB if I'm not mistaken.

With this I'm not saying that I think Crysis would run as well on the 360 and PS3 with max PC settings, I think that's very unrealistic thinking, but I do believe that you will see more impressive games in this console generation than Crysis.

Memory and CPU perhaps. But GPU? Absolutely not. As with RSC, I was playing Halo on my Ti4200. It ran at 800x600 at least as well as the Xbox version ran at 640x480, and at 640x480 it ran like butter.

Bottom line is that the 8800GTX is to the Xbox 360 as the 9700 Pro was to the Xbox. Now take a game that the 9700 Pro really struggled to run...

Let's say Far Cry.

Now remind me, which game on the last generation of consoles equaled Far Cry?

That would be my argument.
 
Well, the thread title is kind of vague, as running an engine and running Crysis at 1080p with max details, 4-8x AA and 16x AF leaves quite a margin in between. Even the highest-end PC will struggle to run Crysis with those settings, and thinking that X360 or PS3 will someday display similar "stuff" on screen is borderline lunacy.

I know many console owners like to think that the games will improve by a huge margin from what we have now, but I'm thinking that they will not improve that much. In my opinion X360 and PS3 are fairly close in total system power, and X360 has been out two years. Its games are not going to improve that much over the best we have now. I think games like Gears of War, BioShock, GRAW 2 and some others will still look decent compared to the 2010 titles, if you look at everything that is on the screen instead of some smaller details.

People often bring in the "PS2 improvements over time" card, but two years after the launch of the PS2 we already had GT3, FF10 and MGS2 among some other pretty good-looking games, and despite the great awesomeness of God of War 2, MGS3 and the other best-looking PS2 games, the difference between those early and late titles really wasn't THAT large. Granted, the PS2 launch games improved a lot, but we are way past the launch window now.

Despite the PS3 being out only a year, its tech is older compared to what the X360 had a year before. Sony's in-house studios have been arranged to extract results out of that box more efficiently than what happened with the PS2. The PS3 launched late, but its tech was relatively ready for quite some time, so we are not as early in the life of the PS3 as it might seem at first glance.

The PS3 might have slightly longer legs than the X360 has, and it benefits from being a closed box against a PC, but a closed-box PS3 has nothing against a quad-core PC with 4GB of RAM and an 8800GTX, and even that setup can't run Crysis maxed! A console game might have the same scale with much less detail, or the same detail in a very confined environment, but the whole Crysis is not happening on these consoles. It depends on the person what is close enough, and I'm sure that we will see "Crysis creamed!!11!" etc. shouted by console gamers in heat when some impressive-looking game appears, despite it being nowhere near Crysis.

Luckily it's more fun to play on a console :devilish:
 
All things done by the id Tech 5 engine at 60fps... :p

...Haha, it isn't even close to what the Crysis engine does. Why don't you read up on what both engines do and how they differ? How about starting by checking the graphical scope of each game: which one has far more draw distance, high-detail vegetation, soft, physics-based vegetation, etc., etc. In id's game there are just rocks and buildings and very limited draw distance. Textures have less detail up close and so do the shadows, etc. ;)
 
RSC didn't run at 60fps on Xbox. On the other hand, to run Halo on PC as well as the Xbox version, you need at least double the processing power, a graphics card like a Radeon 9600XT, and at least eight times the RAM of the original Xbox, which was 64MB if I'm not mistaken.

Halo for PC ran perfectly on a GeForce4 Ti4200 64MB at 800x600, and I think I had Quincunx 2xAA (or maybe it didn't work in this game). A 1GHz CPU and 256MB RAM. The game did a smooth 30fps with very few dips. :smile:

With this I'm not saying that I think Crysis would run as well on the 360 and PS3 with max PC settings, I think that's very unrealistic thinking, but I do believe that you will see more impressive games in this console generation than Crysis.

Nah, so far what has been shown of games to come like KZ2, FFS, MGS4, etc. is nowhere near at all, and games already released aren't either.
 
Halo for PC ran perfectly on a GeForce4 Ti4200 64MB at 800x600, and I think I had Quincunx 2xAA (or maybe it didn't work in this game). A 1GHz CPU and 256MB RAM. The game did a smooth 30fps with very few dips. :smile:

Well, at the time it took my Athlon XP 2600+, a Radeon 9700 Pro and 512MB of RAM to run Halo, but oh well, I think it was running at 1280x1024, I often forget that.

Nah, so far what has been shown of games to come like KZ2, FFS, MGS4, etc. is nowhere near at all, and games already released aren't either.

Don't forget the improvements Bungie made from Halo to Halo 2 in graphics. Killzone 2 is not looking like anything special from where I stand; there are already a lot of FPS with better graphics than Killzone 2 (that "PS3 hidden power" does not convince me).
 