You don't even necessarily need AI, just a scalable engine built with an eye toward the future.
Imagine for a second that the original Crysis had released on a fixed platform instead of being able to run on a PC of any spec. Imagine also that the code for it was exactly the same (i.e., forward-looking code that expected faster and more capable hardware than was available at the time). Since they'd be releasing on a fixed platform, they'd likely lock down the settings so that it could run at 30 FPS (given how "heavy" and demanding the engine was), with a small chance of a second option targeting 60 FPS (at greatly reduced IQ).
For the hardware available at the time, that would likely mean a mix of low and medium settings, with maybe a high setting or two enabled, on a more "mainstream"-oriented hardware setup (say, around 500-750 USD in PC specs when the game launched).
Fast forward 4-5 years, and on similarly "mainstream"-oriented hardware (under 1k USD in PC specs), the settings could be bumped up quite significantly.
We've seen how dramatically the graphics in Crysis can change depending on the level of hardware it runs on (i.e., the settings that are used).
So unlike the actual PC release, where someone could immediately run max settings (as a virtual slideshow) and see how the game would look, a fixed platform wouldn't have exposed those higher setting levels (texture quality, filtering, POM, higher geometry detail, view distance, lighting quality, etc.) at launch. Then, when new hardware came out 4-5 years later, the jump in visuals would have been rather dramatic, as opposed to how the PC release played out, where you already knew what the game could look like and the new hardware just made those settings playable.
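To make the idea concrete, here's a minimal sketch of that kind of forward-looking settings system (in C++; the preset names, tiers, and thresholds are all hypothetical, not CryEngine's actual configuration). The engine implements quality tiers beyond what the launch hardware can sustain, and a fixed platform simply clamps everything to a cap:

#include <algorithm>
#include <cstdio>

// Hypothetical quality tiers; names are illustrative only.
enum class Quality { Low = 0, Medium = 1, High = 2, VeryHigh = 3 };

struct GraphicsSettings {
    Quality textures;
    Quality geometry;
    Quality lighting;
    float   viewDistance;              // meters
    bool    parallaxOcclusionMapping;  // POM
};

// On an open platform, expose the full range; on a fixed platform,
// clamp every setting to what the shipped hardware sustains at 30 FPS.
GraphicsSettings clampForPlatform(GraphicsSettings s, Quality platformCap) {
    auto clampQ = [&](Quality q) {
        return static_cast<Quality>(
            std::min(static_cast<int>(q), static_cast<int>(platformCap)));
    };
    s.textures = clampQ(s.textures);
    s.geometry = clampQ(s.geometry);
    s.lighting = clampQ(s.lighting);
    if (platformCap < Quality::High) {
        s.parallaxOcclusionMapping = false;  // too costly under the cap
        s.viewDistance = std::min(s.viewDistance, 1000.0f);
    }
    return s;
}

int main() {
    // The engine ships with tiers it can't yet run at full speed.
    GraphicsSettings maxed{Quality::VeryHigh, Quality::VeryHigh,
                           Quality::VeryHigh, 4000.0f, true};
    // A 2007-era fixed platform locks to roughly Medium.
    GraphicsSettings console = clampForPlatform(maxed, Quality::Medium);
    std::printf("textures tier: %d, POM: %d, view distance: %.0f m\n",
                static_cast<int>(console.textures),
                console.parallaxOcclusionMapping,
                console.viewDistance);
    return 0;
}

The point is that the higher tiers live in the codebase from day one; the fixed platform just never surfaces them, so a mid-generation hardware refresh could raise the cap and unlock a dramatic visual upgrade without touching the renderer.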
Keep in mind that Crysis was still considered one of the best-looking games available even 4-5 years after its release.
Of course, the argument against something like that is that hardly any developers still push the envelope with graphics (meaning, coding for some future combination of unannounced hardware products), due to constant complaining from a vocal segment of the PC community that they can't run maxed Ultra settings on their 2-3 year old hardware.
So then the question would be: could MS or Sony convince developers to start programming like that again? I.e., the way PC game developers coded back in the late '90s to early 2000s, before all the big PC devs went multiplatform and made consoles the primary development target.
Regards,
SB