Digital Foundry Article Technical Discussion Archive [2012]

What player decisions in the games couldn't be carried over to a different engine?

The characters themselves for one?

I mean, apparently there is some bug in ME3 where you cannot import your character's face from ME2 unless you edited it in ME2. I heard a lot of griping about it from Jeff Cannata on Weekend Confirmed.
 
I didn't understand the "we didn't have enough memory for holster animation" comment made by devs.
There are so many sequences in the game where your character isn't holding any weapon at all, and in some of them s/he is even wearing the armour, so why not just use this same animation set?
 
The characters themselves for one?

I mean, apparently there is some bug in ME3 where you cannot import your character's face from ME2 unless you edited it in ME2. I heard a lot of griping about it from Jeff Cannata on Weekend Confirmed.
That's just a matter of ME3 not reading the format for the face variables saved by the original ME though, a bug as you said. Those player decisions, along with all the others, are just a list of values within set variable ranges stored in the save file: what color your eyes are, what skills you specialize in, whether you let Wrex die or saved him, and so on. What could prevent any of those choices from being read from save files and represented in a different engine, say CryEngine 3, or the Adventure Game Interpreter for that matter?
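To put that in concrete terms, here's a minimal sketch of what I mean: the saved career is just key/value data that any engine could read and map onto its own systems. The JSON layout, field names and helpers below are hypothetical, purely for illustration; the real ME save format is its own binary schema.

import json

# Minimal sketch: player decisions as engine-agnostic key/value data.
# The layout and field names are invented for illustration, not the real ME format.
def load_career(path):
    """Read a saved career as plain key/value data."""
    with open(path) as f:
        return json.load(f)

def apply_to_new_engine(career):
    """Hand the stored values to whatever systems the new engine uses."""
    face = career.get("face_morph", {})      # e.g. {"eye_color": 3, "jaw_width": 0.42}
    plot = career.get("plot_flags", {})      # e.g. {"wrex_survived": True}
    build = career.get("class", "soldier")   # skill specialization
    # ...feed face into the character creator, plot into the dialogue/plot system...
    return face, plot, build

# career = load_career("shepard_me2.json")
# face, plot, build = apply_to_new_engine(career)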
 
Cloud Gaming Face Off: Gaikai vs OnLive

For those interested, Digital Foundry did a comparison article on two cloud services: OnLive and Gaikai. They have different business models: OnLive is more like a retailer, while Gaikai is utilized by publishers but will be allowing full game streaming soon.

http://www.eurogamer.net/articles/digitalfoundry-face-off-gaikai-vs-onlive

Graphics settings/Image quality: Gaikai is the clear winner due to maxed out graphics settings
One of the most compelling arguments in favour of cloud gaming is that hardware is upgraded server-side, potentially allowing for better-than-consoles visuals. Right now, only Gaikai is really delivering in terms of high-end visuals.

OnLive is running games on low settings. Only PlayPack titles can be maxed out.
We came away disappointed with OnLive in this regard when we first looked at the service, and in the here and now, nothing much has really changed. Far from delivering high-end, console-beating experiences, we are treated to a mixture of downgraded visuals with higher than console frame-rates.
In order to maintain the target 60FPS as closely as possible, graphics settings are dialled back to a standard that is in some cases visibly worse than what we are seeing on the current consoles. In previous analyses, we deduced that OnLive servers most likely use dual core CPUs combined with something along the lines of an NVIDIA 9800GT or 9800GTX. These conclusions were based on matching graphics between the cloud service and the original PC game, then measuring performance. As OnLive blocks off graphics settings menus, it was the only way we could compare.

Gaikai is running games at max settings, which are adjustable.
Interestingly, Gaikai takes a different approach in allowing the user to tweak the video settings in some titles, while also providing an upper-end baseline for games where the user is unable to make any changes in this area. A look at the display settings menu in From Dust hosted on Gaikai reveals that an NVIDIA GeForce GTX 560 Ti is used in the terminals to run the game, while the refresh rate is set up to allow for a 60FPS update - a Core i5 and GTX 560 Ti combo would be enough to run all of the titles featured on Gaikai at 60Hz, most of them at 1080p...
In all of the games we've tried the graphics settings are generally set to maximum (or thereabouts), with high levels of texture filtering present and with 8x multi-sampling anti-aliasing (MSAA) enabled. The batch of screenshots below give you a good idea of just how much of a visual upgrade you are getting over what is routinely available in OnLive, and on current console hardware

Framerate and Performance: OnLive is the winner due to encoding video at 60FPS.
OnLive's most obvious advantage is temporal resolution - it aims for 60FPS on our comparison games and does a decent enough job of maintaining it, while Gaikai offers an inconsistent update closer to 30FPS.

OnLive is smoother since the video is encoded at 60FPS.
Running at 60FPS, the action is smoother with OnLive and the visible judder that is present with the game on Gaikai is completely absent. Even when OnLive does drop frames, it's far less noticeable compared to its rival. What we describe as a 'perceptual' 60FPS is in play here, where the dips in performance are so small that they can go unnoticed by the player when engrossed in gameplay.

Gaikai is less smooth and framerate dips are much more noticeable due to the lower video encoding rate. However, Gaikai mentions running their games at 60FPS in the datacenter in many articles. http://www.edge-online.com/news/gaikais-perry-cloud-latency-not-concern

Compression Quality: Gaikai is the clear winner. Gaikai is using x264 software compression and OnLive is using a custom hardware encoder, although both are delivering H.264 video.

In terms of video quality, Gaikai's advantage is clear. While OnLive often looks muddy, blurry and is filled with heavy compression artifacts during the general run of play, video quality is much more solid with Gaikai - in fast-paced scenes with lots of complex scenery, detail is maintained, and despite there being some visible artifacts on show at times, we really get the impression that we are looking at something closer to a native high definition presentation.

Gaikai seems to encode the game at a lower framerate in order to allow more video bandwidth for a boost in video quality. However, Gaikai did state that they were running games at 60FPS in the datacenter (http://www.edge-online.com/news/gaikais-perry-cloud-latency-not-concern). Also, Gaikai can use more bandwidth than OnLive; however, it hovers around 5Mbps on average in my personal tests.

On the other hand, Gaikai appears to aim for a more manageable 30FPS, seemingly relying on the use of more local datacentres to keep the level of input lag in check. Our theory is that the games themselves may actually be running at 60FPS server-side with Gaikai, but with the video encoder not encoding every frame generated, giving a lower frame-rate client-side.

Basically, fewer unique frames means less work for the compressor to deal with when encoding the video stream. From another perspective, dropping down to 30FPS also provides double the amount of bandwidth for image quality and thus delivers overall clarity closer to the experience of gaming on local hardware.
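To put rough numbers on the bandwidth trade-off the article describes (using the ~5Mbps average I mentioned above; everything else is just division):

# Back-of-the-envelope: bits available per encoded frame at a fixed bitrate.
# 5 Mbps is the rough average bandwidth mentioned above; nothing here is measured.
bitrate_bps = 5_000_000

for fps in (60, 30):
    kbit_per_frame = bitrate_bps / fps / 1000
    print(f"{fps} FPS -> ~{kbit_per_frame:.0f} kbit per encoded frame")

# 60 FPS -> ~83 kbit per encoded frame
# 30 FPS -> ~167 kbit per encoded frame
# Halving the encoded frame-rate doubles the per-frame budget, which is where
# Gaikai's extra image quality would come from if the game really does render
# at 60FPS server-side with the encoder skipping every other frame.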

Latency: Gaikai is the overall winner due to the varied locations of their datacenters. However, there are some cases where OnLive can match the latency of Gaikai. Gaikai was actually able to match console latency in some tests with Bulletstorm (133ms).

Higher frame-rates should give OnLive a clear latency advantage over Gaikai, but testing demonstrates that this is not the case. Gaikai's more local datacentres offer the service an important advantage over its rival.

AC Brotherhood
OnLive: 183ms, 183ms, 183ms Gaikai: 167ms, 183ms, 167ms

AC Brotherhood (under load)
OnLive: 200ms, 183ms, 200ms Gaikai: 216ms, 167ms, 216ms

Orcs Must Die
OnLive: 283ms, 250ms, 283ms Gaikai: 183ms, 183ms, 183ms

Orcs Must Die (under load)
OnLive: 283ms, 300ms, 283ms Gaikai: 183ms, 183ms, 183ms
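
For anyone who wants averages rather than raw samples, here's a quick sketch over the numbers listed above:

# Averages of the latency samples listed above (all values in ms).
samples = {
    "AC Brotherhood":              {"OnLive": [183, 183, 183], "Gaikai": [167, 183, 167]},
    "AC Brotherhood (under load)": {"OnLive": [200, 183, 200], "Gaikai": [216, 167, 216]},
    "Orcs Must Die":               {"OnLive": [283, 250, 283], "Gaikai": [183, 183, 183]},
    "Orcs Must Die (under load)":  {"OnLive": [283, 300, 283], "Gaikai": [183, 183, 183]},
}

for game, services in samples.items():
    avg = {name: sum(vals) / len(vals) for name, vals in services.items()}
    print(f"{game}: OnLive ~{avg['OnLive']:.0f}ms, Gaikai ~{avg['Gaikai']:.0f}ms")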

The results are rather interesting. Not only does the amount of latency present in some of the titles running on Gaikai equal that of OnLive, but there are also times where controller response is significantly faster despite the frame-rate being lower due to the 30FPS video encode. Taking a closer look at Assassin's Creed on Gaikai reveals that when the frame-rate is consistently hitting the target, the level of latency hovers around the 167ms to 183ms mark - it's playable, but inconsistent. With the game running flat-out at 60FPS, OnLive is more consistent, if slightly slower than Gaikai. When performance drops, both platforms are impacted, but it's Gaikai that seems to suffer the most.

On the other hand, when looking at Orcs Must Die we get far better latency results and as such a completely different feeling of controller response. With Gaikai, baseline latency isn't exactly wonderful at 183ms, but we found it to be consistent and it was possible to adjust to the gameplay experience. Gaikai annihilated OnLive in terms of response here with a consistent 83ms-100ms advantage - remarkable.

Other tests were equally polarising when it came to seeing how well Gaikai fared when put up against games running on the Xbox 360 and natively on PC. Sometimes we got some incredibly good results: playing Bulletstorm, we found controller response to be slightly slower than the 360 game, but sometimes it hit parity - a truly remarkable achievement. The quickest response time we measured was 133ms (identical to Bulletstorm 360) but it could drift to 150ms, in-line with The Darkness 2 on the 360 or Killzone 2 on PS3 - and we also recorded the odd 166ms measurement too. A matter of 17 or 33ms in additional lag might seem miniscule but you can definitely feel it in the inconsistency of the response.

Overall, I would have to go with Gaikai for this Face-Off. However, neither service can beat the local PC experience.
Steve Perlman's group takes honours for frame-rate, while David Perry's team command clear advantages in game spec, video encoding and overall latency. The signs are there that this can work: Bulletstorm on Gaikai running with the same latency as the Xbox 360 game - however inconsistent it may be - is as clear an indicator as any that cloud could evolve into a viable contender. The technology here is achieving miracles as is, and with advances in infrastructure and improved engineering, it's only going to get better.

TL;DR: Gaikai is running games at max settings (which are adjustable) and has better latency than OnLive. However, OnLive has a much more stable framerate since they are encoding the video at 60 fps. OnLive has been running certain PlayPass games, like Assassin's Creed, on low settings. Also, OnLive has recently been in the habit of launching games at lower graphics settings and then increasing them months after launch; I believe this has to do with server capacity issues. This happened with Batman: Arkham City and Saints Row: The Third. However, users can adjust graphics settings in PlayPack subscription games.
 
There are so many sequences in the game where your character isn't holding any weapon at all, and in some of them s/he is even wearing the armour, so why not just use this same animation set?

I guess it's like this... There's a limited amount of RAM reserved for animations at any time. The game never loads all animations. You don't need to go from conversation to fighting on the Citadel, for example.

ME3 added more animations for changing cover, rolling and climbing ladders. These could all be used in combat at any time, so it's not an option to fetch them from the DVD on the fly. Running around without any weapon requires a different animation set. To make room for the above-mentioned new anims, these had to be thrown out.

So it's not just the clip of holstering a weapon but also all the walking/running animations without a weapon that had to go to make room for the new ones.

BTW, armor or not doesn't really matter for the animations. Also, a lot of the animation is re-used for all aliens, even krogan have the same ones as salarians and Shepard. That's also why you don't see hanar, volus, elcor or keepers whenever there's combat, because they all require new clips for their custom skeletons as well.
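
If it helps, here's a toy way to picture the budget trade-off; the set names and sizes below are completely made up, only the principle matters:

# Toy illustration of a fixed animation memory budget (numbers are invented).
BUDGET_MB = 20

anim_sets = {
    "combat_core":         8,  # aiming, shooting, cover
    "cover_swap_roll":     4,  # the new ME3 combat moves
    "ladders":             2,
    "unarmed_locomotion":  5,  # walking/running with no weapon drawn
    "holster_transitions": 3,
}

def fits(selection):
    """True if the chosen animation sets fit in the reserved budget."""
    return sum(anim_sets[name] for name in selection) <= BUDGET_MB

# The old loadout fits...
print(fits(["combat_core", "unarmed_locomotion", "holster_transitions"]))  # True
# ...but adding the new combat sets pushes it over, so something has to go.
print(fits(["combat_core", "cover_swap_roll", "ladders",
            "unarmed_locomotion", "holster_transitions"]))                 # False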
 
What could prevent any of those choices from being read from save files and represented in a different engine, say CryEngine 3, or the Adventure Game Interpreter for that matter?

I'm sure any game dev could list a dozen reasons immediately. Different animation systems, script languages, save file formats, etc.
Bioware would basically have to rebuild significant amounts of UE3 systems in the new engine, and spend all the money and time on customizing it for their needs, too. ME1 took several years of R&D to fine-tune UE3 for their needs...
They can't restart from scratch and expect to finish on time before the generation is over. They're actually pretty lucky, as no one could have foreseen in 2005-2006 that we'd still have X360s in 2012.
 
Sure, things like different animation systems and script languages make moving the series to a different engine harder, but they're not in any way relevant to carrying over the player decisions I've seen throughout the first two games. If you can provide actual examples to the contrary, please share. As for save file formats, obviously the new game would have to be made to read the old format, but that's hardly a major undertaking.
 
Perhaps now, with the trilogy over, the next ME game (if there is one) might run on Frostbite 2.0. Especially since Frostbite 2.0 is next-gen ready (not that UE3 isn't too; rather, moving to their own internal tech would save them the royalty fees paid to Epic for using their engine). It would be in EA's interest to fund Bioware making such a transition, imho.

A comparison of MENext in FB 2.0 with ME1/2/3 would make for an interesting article indeed.
 
I guess it's like this... There's a limited amount of RAM reserved for animations at any time. The game never loads all animations. You don't need to go from conversation to fighting on the Citadel, for example.
Well yes, but then it also limits the kind of scenarios you can have in-game, doesn't it?
For example, you can be talking and then immediately go into a fight, like the fight in Chora's Den in Mass Effect 1, where you investigate something, immediately find out that there are hostiles in the area, and it seamlessly turns into a gunfight.

But I guess Bioware made different design choices for this game.
 
For those interested, Digital Foundry did a comparison article on two cloud services: OnLive and Gaikai. They have different business models: OnLive is more like a retailer, while Gaikai is utilized by publishers but will be allowing full game streaming soon.

That is an awesome idea for an article.

One thing I'd like DF to do sometime is check out latency on a console on an "average" HDTV vs OnLive or Gaikai on an average PC monitor.

HDTVs typically have a lot more latency than PC monitors, so I've always thought it's possible OnLive/Gaikai could actually have less latency than console games once you factor in the display. It would need testing, though. Obviously, local PC games should beat OnLive every time.
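
Purely to illustrate what I mean (every number below is a guess, not a measurement):

# Hypothetical end-to-end latency comparison; none of these numbers are measured.
hdtv_lag_ms = 50        # assumed input lag for an "average" HDTV
monitor_lag_ms = 10     # assumed input lag for an average PC monitor

console_chain_ms = 100  # assumed game-side latency on console
cloud_chain_ms = 150    # assumed cloud game + encode + network latency

print("Console + HDTV :", console_chain_ms + hdtv_lag_ms, "ms")
print("Cloud + monitor:", cloud_chain_ms + monitor_lag_ms, "ms")
# Depending on the display, the totals could end up much closer than the raw
# service numbers suggest - which is exactly why it would need testing.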

I also really wish they had thrown some comparison screens/videos with the X360 versions of games in there. They state the Gaikai version of AC looks better than console, but I'd like a little conclusive proof of that.

Also, as of when I just visited the article, none of the videos are working; they all throw an error.
 
Yeah, there were definitely a few, and also on the Nintendo handhelds. In fact, the fighters on Vita are all 60fps, I think.
 
Yeah, there were definitely a few, and also on the Nintendo handhelds. In fact, the fighters on Vita are all 60fps, I think.

They are. And they are drop-dead gorgeous to boot, especially BlazBlue (which is far and away the game that comes closest to its console counterpart; the bloom effect was toned down/axed from the polygonal backgrounds, but that's the only difference I could make out).

Considering how stunning 2D art looks on the Vita, I'm really hoping for a Castlevania game by Igarashi. Unfortunately, the Derp is far too strong with Konami for something awesome like that to happen.
 