The game is controlled in real time, so it isn't a cutscene.
It is still Computer-Generated Imagery if you categorize by a 100% strict reading of the term.
But realtime tools are starting to encroach on the offline renderer space. E.g., if someone made an animated short in Source Filmmaker and screened it at the Oscars, it would be listed as a computer animation and, to the uninitiated, appear to have been developed the same way as a Pixar or DreamWorks film.
Yes, and if we were to see something like that, it'd blur the lines a bit.
But in the end, the animation industry feels that current game engines require far too many preprocessing steps, all of which consume artist time, so the increased rendering speed is far outweighed by the extra man-hour cost. It also lengthens iteration time for many tasks: you have to re-bake a lot of assets instead of just hitting render and going home at the end of the day.
So even more capable game engines are unlikely to gain adoption in the animation industry, at least until they become less dependent on precalculation.
And on the other hand, pre-rendered videos used in games to mask load times are still rendered in the game engine, usually on console hardware, by the developers. Calling them CGI leads people to conflate material created this way with material created as traditional CG animation.
I can understand wanting to draw a distinction between game-engine renderers and traditional CGI renderers, but I'm not sure the CGI label is the way to do it, because CGI won't remain the domain of non-realtime graphics forever.
Again, if you stick to a strict interpretation, games are CGI as well: when the term was coined, rendering anything in real time was unimaginable.
Or, if you follow industry usage of the term, then game-engine output is not CGI, whether it's rendered out to video or presented in real time.