Crysis 4 [Xbox, PS, Switch?, PC]

Don't miss The Land of Pain and The Alien Cube. ;) Developed by one person, I believe. Amnesia-likes. Kingdom Come and Prey are also CryEngine.

Where can they take Crysis next? Let us proceed to the Ceph homeworld. The aliens have tentacles, which means potential ancient horrors, so our protagonist may well suffer the depths of madness as the truth of their existence is revealed. We will need stealth mode to have any hope of eluding the impossible nightmares in the lowest levels of their home. And juice up on Nano Catalyst to get 'er done.
 
Because time makes fools of us all, Nesh. Sure, a game that needs 256 GB of RAM and dual 3080s for max settings might not play well like that today, but five years from now? A decade? There are plenty of games I wish had more graphical settings, so that when I went back to them years or decades later I could get a better experience.
What you will get then is a game designed to sell itself as a state-of-the-art experience that you can't enjoy at present: a game that offers an OK-ish experience now at lower settings, and by the time you can enjoy it as intended, it will be obsolete and you won't enjoy it as much, because you will have already finished it and other games will have surpassed it.
 

So you mean PC gaming from the 1980s through 2012-ish?

Growing up, there were always games that required upgrades. I remember shoveling snow all winter and cutting grass in the spring and summer so I could afford a 4 MB RAM upgrade for my 386 to play Doom better. I remember spending my birthday money on a Riva 128 to play Quake. I remember buying a new card for Doom 3. I remember buying a CD-ROM drive for one of the Wing Commanders.

Back then it wasn't just the CPU or RAM or graphics chip, either. I remember buying audio cards too. I remember playing Half-Life 2 on my Radeon 9700 and having a great experience, then playing it again on my X1800 and enjoying the enhanced visuals and higher resolution.

Even now there are a bunch of games that run poorly on my 3080 that I'd love to go back and play again in a few years. I can only imagine what Cyberpunk will look like on two or three new generations of hardware.
 
Aside from normal upgrading at a healthy interval, and the crazy mess PCs were back then, I guess for the sake of tradition you want to go back to that mess of having to upgrade multiple configurations every month to play the next game? Or having to wait a generation or two to play it properly?
Well yeah, Cyberpunk is another example of a game you can't enjoy properly at present, despite your high-end PC, unless you start overspending again. Which you aren't, like in "the good old times," and thus you can't enjoy it as intended. You yourself said it runs poorly and that you want to revisit it two or three generations later to play it properly. Which is exactly my point.
So on one hand you claim you're OK with working your ass off to upgrade, but in reality you aren't doing it.

Edit: you are also ignoring one more thing: games that over-target specs just don't play well on 99% of people's configurations, a lot of which are already expensive. A game that doesn't play well doesn't sell well. Developers aren't going to waste talent, time, and millions at a loss just so you can enjoy the game 10 years later.
 

Nah, I want to go back to a time where games grow with my hardware. I don't want to be limited by what a 10-year-old console can do when it comes to my games. Heck, I don't even want to be limited by what two-year-old consoles can do.

I enjoyed Cyberpunk a lot; I put 80 hours into the game. Played it with DLSS at 4K. Can't wait for more content to come out, and I will most likely play it again when I get a 5080 to replace my 3080. The game did run poorly on my Radeon Vega; the 3080 made it a lot more playable. I can't wait to play with ray tracing maxed out, true 4K, and all settings at high.


Not to mention I have just been talking about single-player experiences. What about multiplayer experiences that people play for years or decades? There are plenty of MMOs that I played for years or decades, and they grew with my hardware.

Also, I'd rather buy a game that has higher-fidelity options available, even if I can't run them, than have to wait five or six years and rebuy the game as a remastered edition.
 
Why do I feel that this is a stab at consoles rather than an objective discussion? This isn't about consoles. I feel you are coming from the "Glorious PC Master Race" myth that consoles are holding back PCs.
You do realise that Cyberpunk and Crysis targeted beyond the specs of consoles and still had problems running on high-end PCs with graphics cards, RAM, and CPUs that eat consoles for supper?
 

It's not a stab at consoles; it's a different business model. However, there was a marked change in the Xbox 360 generation where a lot of PC developers started targeting consoles as their main platform. That killed innovation with regards to PC hardware. Then there was a big jump when the PS4/Xbox One came, and of course that stagnated innovation again.

Cyberpunk played just fine on my PC, like I said, so I'm not sure where you're going with this. My four-year-old Vega ran it better than the PS5 and Xbox Series X.
 
What do you mean it killed innovation with regards to PC hardware? Can you be more specific about what kind of innovation it killed?
Nobody denied that there are PCs out there that play better than consoles. But that's beside the point.
 
It killed innovation in taking advantage of newer hardware on the PC side. Games were all targeting the older systems: in the Xbox 360 era there was 512 MB of total RAM, while lots of video cards grew to 4 and 8 GB. Few games took advantage by redoing higher-res textures; many just let you run at a higher resolution instead of creating a visually denser world.

It will be the same with this gen: the PS5 and Xbox Series are pretty terrible at ray tracing, so we will see poor implementations that hold things back versus what PC hardware can do. It will only become more obvious when RDNA 3 and the next Nvidia cards come.
 

And what reason would exist for thinking that a game designed as a state-of-the-art experience can't be enjoyed at present?

Any game created in that fashion would have a perfectly playable setting that exactly matches what you get with current games. The only difference is that the ultra-level settings would be beyond the reach of almost all current hardware, while lower setting levels would provide the exact same gaming experience as any game created for this generation of gaming.

So, how in the world would it not be enjoyable? Unless, of course, you are saying that no game created nowadays is an enjoyable experience?

If Horizon Forbidden West had a graphical setting level on a hypothetical PC release that was a significant step above what it currently has on PS5, and a step beyond all but the top 1% of PC hardware, how would that suddenly make Horizon Forbidden West a mediocre gaming experience?

Alternatively, if Horizon Forbidden West had an extra graphical setting level that ran poorly on PS5 because the developers were looking forward to what they think PS6 hardware will be, how would that suddenly make HFW a mediocre experience when the game and graphics on PS5 would still be exactly the same?

Regards,
SB
 
What do you mean? What you describe isn't innovation. What you are asking for is dialing up what already exists. And what happened is that games did take advantage of the extra memory; the console versions just corresponded to lower PC settings.
Let's not pretend PC exclusives didn't exist, because they did, and yet we didn't see much there. Also, this is still irrelevant to the discussion we had about over-targeting PC specs.
 
I am talking about specific titles like Crysis, where the whole experience was designed around extremely high specs.
Fortunately, what we get as the norm is what you describe: games optimized to run well on normal configurations, with the quality turned up to the highest level if you have the hardware.
 

All of that is innovation. If you're able to display more on screen at once, you can create more realistic worlds full of life and introduce more NPCs to fill them out. PC has been way ahead of consoles for the last two decades.
 

But that's not what we have now. There are no longer any developers on any platform who really push what is possible on the bleeding edge of hardware, or who attempt to innovate and push the boundaries of graphical rendering to the point where it may not run at an acceptable speed on generally available hardware.

All we get now are developers playing it safe and never really pushing what the hardware can do.

I get that people who have gamed almost exclusively on consoles their whole lives wouldn't be familiar with how game developers used to push hard against the limitations of the hardware available while their games were being developed. But coming from the PC side, where this used to be relatively common throughout the '90s (Doom, System Shock, Ultima Underworld, Ultima IX, Quake, Quake 2, Unreal ... so many other games that were unplayable at maximum settings on the vast majority of PCs at release) and the early 2000s (Far Cry and Crysis among many others), it saddens me to look at the state of graphics in games nowadays WRT pushing the boundaries and innovating. BTW, those same examples for the most part ran perfectly fine if you just played them at medium settings on the vast majority of PCs. Crysis is the lone example where even medium settings were punishing, due to CPUs hitting a wall WRT single-threaded performance.

Regards,
SB
 
I think what you describe is the phenomenon of developers getting closer to their saturation points in how much they can handle and push.
Back in the early days of 3D hardware, we had a lot of hardware manufacturers and software developers trying to find ways to display better visuals, when nothing was standardized and 3D was still young.
So every now and then we would get another breakthrough in visuals.
Again, this is still something different from what is being discussed.
For example, it is one thing to target a current high-end PC to its fullest potential and bring something that hasn't been done before, and another to make a game that requires 256 GB of RAM and dual 3080 Tis, with physics and a million NPCs that no CPU is able to handle at steady framerates.
 
For me, displaying more on screen isn't innovation. Innovation is what Unreal Engine 5 does, pushing more detail than was thought possible on given hardware. Cranking up GPUs, RAM, and CPUs because you want to brute-force more objects and textures is not innovation in my book.
 
Consoles don't hold back PC. There are plenty of PC-exclusive games out there that are free to push their visuals beyond console capabilities.

The "hold back" argument works both ways.

Consoles hold back PC because devs simply cater to the lowest common denominator, usually in graphics settings like ludicrously high texture resolutions and even basic things like proper AF.


PC holds back console because devs simply cater to the lowest common denominator, usually in controller features like gyro aiming.
 
There's a large difference between dialling up existing settings and developing an entire game around specific hardware.

There's nothing stopping the former from happening. But as for the latter, aside from maybe Star Citizen, I haven't seen a PC developer target base specs around the top 1% of hardware configs.

The current console specs are more than sufficient as a baseline, imo.

It is important to note that budgeting for a fixed spec is extremely efficient in terms of hardware usage, whereas scaling up is extremely inefficient.

We see the same issue in the real world: a factory and a labour force are very efficient when paired to produce a specific output. If you want more output from that fixed factory and labour force, you need to run more shifts and pay overtime and extra incentives; that is less efficient than having a larger labour force matched to a larger factory.
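To put rough, made-up numbers on it: if a plant sized for 100 workers turns out 100 units a day at $10 of labour per unit, squeezing out 50 extra units on overtime at 1.5x pay prices those units at $15 each, so the average cost climbs to about $11.70 per unit, while a plant sized for 150 workers from the start keeps every unit at $10. Budgeting for the spec up front is the cheaper path per unit of output, which is the same reason a fixed console spec gets used more efficiently than headroom bolted on after the fact.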

So, when the comment is "I hope Crysis makes the meme 'but can it run Crysis?' return," you need to ask yourself whether you mean the former or the latter. Dialling up settings until your computer breaks is easy. Making full and efficient use of all that power is hard.
 