Last of Us [PS4]

What? He's bragging about the fact that the PS4 version will have pre-rendered cutscenes, CGI at 1080p/60fps?

Bragging? Don't you hate it when you play a game on PC in pristine 1080p at 60fps, only to get to an in-engine but pre-rendered cutscene from the 360 game that is a super-compressed 720p Bink file at 30fps? He's just saying they have to re-render all the cutscenes to avoid that, but it presents disc capacity problems.
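To put very rough numbers on that disc-capacity point - a back-of-envelope sketch with made-up bitrates and a guessed total cutscene length, not anything Naughty Dog has published:

# all figures below are assumptions for illustration only
def video_size_gb(minutes, mbit_per_s):
    return minutes * 60 * mbit_per_s / 8 / 1000   # Mbit -> MB -> GB (decimal)

old = video_size_gb(90, 4)    # ~2.7 GB for heavily compressed 720p30 files
new = video_size_gb(90, 15)   # ~10 GB for 1080p60 at a much higher bitrate
print(round(old, 1), round(new, 1))

At a sane bitrate the 1080p60 versions alone could eat a fifth of a 50 GB Blu-ray, which is presumably the capacity problem he's alluding to.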
 
"We brought in all the hi-res models, and then it's on par with what you saw in the cutscenes," Druckmann said, speaking about the upcoming PS4 release with EDGE. "There's an improved lighting model. After that, we started looking across the board; enemies look a little blurry up close, so that was pretty easy. We ramped those up and saw a pretty significant difference.

That's all I needed to hear... :yep2:
 
Can you call this CGI? It's rendered in the engine, with higher settings than realtime - but it's not a full scale CG production...
 
Can you call this CGI? It's rendered in the engine, with higher settings than realtime - but it's not a full scale CG production...
Personally, I think it's obnoxious that CGI isn't just a way to shorten "computer-generated imagery."

You're asking about where exactly we can draw a completely arbitrary dividing line between a pair of roughly-defined moving targets. It's probably better to just let language context handle it; it's not like the use of "CGI" here is at all confusing.
 
Pre-rendered in-engine cutscenes, rendered on a higher-spec machine than the target hardware, are nothing new for PS3 and 360. A lot of PS3 games in particular did it, because the size of the Blu-ray let the devs store them at high enough quality to mask the fact that they were pre-rendered. The Last of Us did it, L.A. Noire did it, DmC did it in certain cases. Final Fantasy XIII was a very infamous case, as that's why it shipped on so many discs on 360 and not on PS3.

They switched to real-time rendering in FF13-2... but the framerate suffered for it. The engine was clearly not optimized all that well to handle the load.

I actually admired Bungie for all of their 360 Halo games (especially Reach), as everything they did was real-time and fluid. The framerate did suffer from time to time, but they pulled off a lot of impressive things on the 360 hardware directly.
 
Often games on both platforms used a pre-rendered cutscene simply to mask load times. You could start playing a relatively low-bitrate video file immediately while the level loaded in the background, instead of having to load all the level data ahead of time before playing a cutscene rendered in real-time. It was exceedingly common last gen.
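Roughly, the trick looks like this - a toy sketch with stand-in functions for the video player and the disc streaming, not any real engine's API:

import threading, time

def play_video(path):
    time.sleep(2.0)                 # stand-in for decoding/presenting the movie file
    print(f"{path} finished")

def load_level(name, done):
    time.sleep(1.5)                 # stand-in for streaming level data in the background
    print(f"{name} loaded")
    done.set()

def cutscene_masked_load(video, level):
    done = threading.Event()
    threading.Thread(target=load_level, args=(level, done), daemon=True).start()
    play_video(video)               # the low-bitrate movie starts almost immediately
    done.wait()                     # usually already finished by the time the video ends
    print("drop straight into gameplay")

cutscene_masked_load("intro.bik", "level_01")

The player sees a cutscene right away instead of a loading screen, and the real-time level is ready underneath it by the time the video ends.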
 
I just wanna know how closely the gameplay graphics can match the cutscenes'. We know the models are on par now, there's an improved lighting system too and high-res textures to boot; I hope they get to at least 95% parity. Just what kind of lighting model do the cutscenes use anyway, and how taxing would it be on modern GPUs?
 
So I guessed right with the teaser trailer. It's cutscene models indeed! And better lighting. Whoooo... Make it easy on my wallet, ND, I really want it!
 
Personally, I think it's obnoxious that CGI isn't just a way to shorten "computer-generated imagery."

You're asking about where exactly we can draw a completely arbitrary dividing line between a pair of roughly-defined moving targets. It's probably better to just let language context handle it; it's not like the use of "CGI" here is at all confusing.

Using that logic, even the game itself should be considered CGI...

I draw the line between using the engine itself, and using an animation package and its renderer. It's not arbitrary at all; there are clear differences in the workflow and in the software and hardware infrastructure.

Thus, cinematics rendered using the game engine but at higher settings are not CGI in my opinion.
 
Using that logic, even the game itself should be considered CGI...
The game is real-time and player-controlled, so it isn't a cutscene, although if you were to record the gameplay and sell it as a DVD, it could pass for a CGI animation.
I draw the line between using the engine itself, and using an animation package and its renderer.
But real-time tools are starting to encroach on the offline-renderer space. E.g. if someone were to make an animated short in Source Filmmaker and feature it at the Oscars, it'd be listed as a computer animation and, to the uninitiated, appear to be developed in the same way as a Pixar or DreamWorks animation. As far as film classification goes, it'd be computer-generated imagery. Or if ND were to produce a LoU or Uncharted movie with the quality dialled up to 11, that'd also be classified by the movie industry as a CGI movie.

I can understand wanting to draw a distinction between game-engine renderers and traditional CGI renderers, but I'm not sure the CGI moniker is the way to do that, because CGI won't remain the domain of non-real-time graphics forever.
 
The game is real-time and player-controlled, so it isn't a cutscene.

It is still Computer Generated Imagery, if you want to categorize based on a 100% strict interpretation.

But real-time tools are starting to encroach on the offline-renderer space. E.g. if someone were to make an animated short in Source Filmmaker and feature it at the Oscars, it'd be listed as a computer animation and, to the uninitiated, appear to be developed in the same way as a Pixar or DreamWorks animation.

Yes, and if we were to see something like that, it'd blur the lines a bit.

But in the end, the animation industry feels that current game engines require far too many preprocessing steps - which involve artist time - so the increased rendering speed is far outweighed by the extra man-hour cost. It also increases iteration time for a lot of things, having to re-bake a lot of stuff instead of just hitting render and going home at the end of the day.
So it is unlikely for even more complex game engines to gain adoption in the animation industry, at least until they become less dependent on precalculating stuff.

And on the other hand, movies "rendered out" for games to mask load times are still rendered in the game engine, usually on consoles, by the developers. Calling it CGI will lead to people mixing up material created this way with material created as traditional CG animation.

I can understand wanting to draw a distinction between game-engine renderers and traditional CGI renderers, but I'm not sure the CGI moniker is the way to do that, because CGI won't remain the domain of non-real-time graphics forever.

Again, if you stick to a strict interpretation, games are CGI as well - at the time this expression was coined, rendering stuff in real time was unimaginable.

Or, if you follow the industry usage of this expression, then game engine stuff is not CGI, no matter if it's rendered into video or presented in realtime.
 
And on the other hand, movies "rendered out" for games to mask load times are still rendered in the game engine, usually on consoles, by the developers. Calling it CGI will lead to people mixing up material created this way with material created as traditional CG animation.

I understand that this is a sensitive topic for you, LaaYosh, and I respect and admire the work of your studio, but with that said, I fail to see how this is a problem.
 
For me it's real-time or non-real-time that counts. And I consider both CGI in the strictest terms.

Yeah...real-time or not...does it suddenly 'change' your weapons when the cutscene kicks in? Or change your outfit to the standard one? That kind of thing breaks the immersion the most, imo...
 
What's next, Microsoft releasing Kinect hardware that runs on a PlayStation?

Some games are worth buying hardware for; that is what exclusives bring to the table.

And I think the elephant posting style is what annoys some of the be3d users. This is a forum highly moderated by some of the best mods to be found (getting brown-tongued); it's everything the Internet isn't. You either adapt or get out, whether you want to or not. Good luck :)

Yeah, agreed on that! I think exclusivity of games on consoles is really important, because it makes consumers decide which console suits them best. If all console game devs released their games on all platforms, then consumers could just pick whatever console they want, because all games would be available on all consoles.
 
Yeah...real-time or not...does it suddenly 'change' your weapons when the cutscene kicks in?

There still are some huge differences in the content and the quality that's achievable. We in offline CG can still deliver things that are impossible to do in any game engine and that can significantly affect the impression that the cutscene can create. I can't get into details at the moment about our stuff, but later this year we can get back to this.
 
There still are some huge differences in the content and the quality that's achievable. We in offline CG can still deliver things that are impossible to do in any game engine and that can significantly affect the impression that the cutscene can create. I can't get into details at the moment about our stuff, but later this year we can get back to this.

Oh, now I am curious!! Looking forward to it!!
 
http://www.polygon.com/2014/5/16/5723830/last-of-us-ps4-port-hell

"I wish we had a button that was like ‘Turn On PS4 Mode’, but no," Druckmann said. "We expected it to be hell, and it was hell. Just getting an image onscreen, even an inferior one with the shadows broken, lighting broken and with it crashing every 30 seconds … that took a long time. These engineers are some of the best in the industry and they optimized the game so much for the PS3’s SPUs specifically. It was optimized on a binary level, but after shifting those things over, you have to go back to the high level, make sure the systems are intact, and optimize it again.

Druckmann also said that the game and its cutscenes are running at 1080p and 60 frames per second, a process he said "involved rendering them all from scratch." Though the creative director characterized The Last of Us Remastered as "a straightforward port," he said that PlayStation 4 has him open to giving players new options.
 