You can always trust the developer (or how Ubisoft screws you over and over again)

Kaotik

Everyone knows that games shown at E3 and other shows don't really match what you actually get on your screen once the game is out. Many devs do this, but probably the most notorious one is Ubisoft.

YouTube user CrowbCat has put together a little video showing off Watch Dogs, The Division, Rainbow Six: Siege, Far Cry 4 and Far Cry 3 - it compares footage shown at E3 and other events with actual gameplay on PC at 1080p, max settings.


Off the top of my head, from the most recent games, The Witcher 3 received several notable downgrades between the supposed "gameplay videos" shown over the years and the final game. Any other games that come to mind?
 
A lot, but I forgot most of them.

The one I remember right now is Destiny.

It was downgraded in a bunch of things, then got upgraded in The Taken King - actually in the patch right before The Taken King.

But it's still missing the bumpy ground and little stones that were shown at E3. Sorry, I forgot the technical term - it's the one that makes real 3D bumps from flat geometry, not just fake texture bump mapping.
 
But it's still missing the bumpy ground and little stones that were shown at E3. Sorry, I forgot the technical term - it's the one that makes real 3D bumps from flat geometry, not just fake texture bump mapping.
Parallax Occlusion Mapping, I think.
 
Ah, this old chestnut. Gamers do seem to be a bit dense about how games are created and the changes that are made throughout the process, despite the fact that developers have never been more open about it and explain the problems that can arise which simply weren't predictable earlier on. We've even had darlings of the industry like Naughty Dog and CD Projekt Red explain why final software doesn't look like the original vision or the early tech targets. But all you get from gamers is "duh duh downgrade duh duh".

When a gamer buys a console or a gaming graphics card, these products should come with a dunce hat to wear because most gamers seem incapable of learning.

dunce21.jpg
 
Btw, in that video of The Division... damn, I totally forgot to glitch into the locked area using the portable armor :(

Now it's already patched.
 
A lot, but I forgot most of them.

The one I remember right now is Destiny.

It was downgraded in a bunch of things, then got upgraded in The Taken King - actually in the patch right before The Taken King.

But it's still missing the bumpy ground and little stones that were shown at E3. Sorry, I forgot the technical term - it's the one that makes real 3D bumps from flat geometry, not just fake texture bump mapping.

Parallax mapping (or cone step mapping) is the one that does the ray tracing inside the surface (i.e. the surface stays planar if you look at it from a tangent); displacement mapping is the one that uses tessellation and actually displaces the geometry (so it has a correct silhouette).
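For anyone curious about the distinction, the core of the parallax/POM trick is just a short ray march through a heightfield in tangent space. Here's a rough CPU-side sketch in C to show the idea - the function names and the procedural heightfield are made up for illustration, not taken from any actual engine:

Code:
#include <math.h>
#include <stdio.h>

#define STEPS 32
#define HEIGHT_SCALE 0.05f

/* Hypothetical heightfield lookup: depth in [0,1] at texcoord (u,v).
   A real implementation would sample a height texture here. */
static float sample_depth(float u, float v)
{
    return 0.5f + 0.5f * sinf(u * 40.0f) * cosf(v * 40.0f);
}

/* March the tangent-space view ray through the heightfield and return
   the shifted texcoord where the ray first dips below the surface. */
static void parallax_offset(float u, float v,
                            float vx, float vy, float vz, /* tangent-space view dir, vz > 0 */
                            float *out_u, float *out_v)
{
    /* Total texcoord shift if the ray travelled the full height range. */
    float shift_u = -(vx / vz) * HEIGHT_SCALE;
    float shift_v = -(vy / vz) * HEIGHT_SCALE;

    float layer = 1.0f / STEPS;
    float ray_depth = 0.0f;
    float cur_u = u, cur_v = v;
    float surf_depth = sample_depth(cur_u, cur_v);

    /* Step until the ray goes below the stored surface depth. */
    while (ray_depth < surf_depth && ray_depth < 1.0f) {
        cur_u += shift_u * layer;
        cur_v += shift_v * layer;
        ray_depth += layer;
        surf_depth = sample_depth(cur_u, cur_v);
    }
    *out_u = cur_u;
    *out_v = cur_v;
}

int main(void)
{
    float u, v;
    parallax_offset(0.5f, 0.5f, 0.3f, 0.2f, 0.9f, &u, &v);
    printf("shifted texcoord: %f, %f\n", u, v);
    return 0;
}

A real implementation does that loop per pixel in the shader against a height texture (usually with a final interpolation step to smooth the hit point), whereas displacement mapping moves the actual vertices via tessellation, which is why it gets the silhouette right.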

I'm amazed by the differences shown in the video. That said, event videos are often made with a product that's still changing a lot and is rarely representative of the final one. (Plus a desire to wow, I assume.)
 
But I can't stop wondering why Ubisoft is the most brazen at showing unattainable targets.

They've done it again and again, across the various developer studios under the Ubisoft umbrella.

Sure, other games got downgrades or changes too, but Ubi is remarkably consistent about the big differences.
 
Ah, this old chestnut. Gamers do seem to be a bit dense about how games are created and the changes that are made throughout the process, despite the fact that developers have never been more open about it and explain the problems that can arise which simply weren't predictable earlier on. We've even had darlings of the industry like Naughty Dog and CD Projekt Red explain why final software doesn't look like the original vision or the early tech targets. But all you get from gamers is "duh duh downgrade duh duh".

When a gamer buys a console or a gaming graphics card, these products should come with a dunce hat to wear because most gamers seem incapable of learning.

dunce21.jpg
Eh, a lot of the changes in that video were made solely to accommodate consoles. The cool texture on the rock wall in Far Cry 4 comes to mind. If it didn't happen in every single big Ubi title then I'd give them the benefit of the doubt, but they are clearly trying to mislead here. Some of the changes are huge, like the Rainbow Six Siege rooftop scene. There's no way they thought that would run in the final product given the magnitude of the downgrade there. It's not like going from 1080p to 900p; it's like the difference between a PS3 and a PS4 game, and they clearly knew the specs of the target hardware going in, unlike Naughty Dog, who started work on Uncharted 4 before the PS4 was even announced.
 
That doesn't change what I said. Please pick up your hat on the way out! ;-)
I read your post and simply think you're wrong in passing it off as something we should learn to deal with. BTW I don't ever wear hats so if I left one you can keep it :D
 
Well, things change in 2+ years - something that's common to TW3, The Division, and even Halo 2's reveal, etc.

An isolated demo may actually have run fine on the hardware, but scaling it up just turned out to be impractical as development went on.
 
Well, things change in 2+ years - something that's common to TW3, The Division, and even Halo 2's reveal, etc.

An isolated demo may actually have run fine on the hardware, but scaling it up just turned out to be impractical as development went on.
A lesson Ubi should probably have learned by now IMO.
 
Probably. At least Activision and EA seem to stick to proper reveals in the same year of release.

TW3 - well... we all know how many times they pushed back the game. :p I suppose if CDPR really does learn, we won't hear from them about Cyberpunk until 2077. >_>
 
I read your post and simply think you're wrong in passing it off as something we should learn to deal with. BTW I don't ever wear hats so if I left one you can keep it :D

The creative process is an iterative one. Game technologies and mechanics that ship are rarely the same as those sketched out when the project was green lit because so many things change during production. This isn't limited to games but music, TV, movies - anything built around a creative process. Things that are great in theory are sometimes not in practice (like Pizza Hut's hotdog crust pizza). Things that sound awesome are too complicated to engineer. New ideas get proposed and adopted. When it comes to graphics, things that work in small scripted demos can fall apart when applied to a full dynamic open world with physics and unpredictable AI.

This process is never going to change because it's how creative processes work: they aren't fixed, they ebb and flow. That's how creativity works.

The only way I can see people not having to deal with that is if publishers, PR and devs don't demo games until they're 99% complete, i.e. after they've gone gold, because it's not until then that you know whether everything has been optimised and runs acceptably, or whether you have to start trimming things to get the game running acceptably.
 
I totally understand that, and that's why some games get downgraded from full HD to something less, or have textures, geometry, particle effects, etc. scaled back (these things are perfectly understandable). What we're seeing from Ubisoft is IMO something else entirely. We can agree to disagree.

It's like what fast food companies do to their food in commercials, but at least it's largely understood that the food won't look anything like that when the minimum-wage employee hands you your taco. The gaming press and gamers alike eat these demos up, rarely if ever pointing out (or even realizing) that the final product will scarcely resemble what was shown on stage.
 
There are cases with rather subtle differences between demos and final releases that end up looking spectacular regardless, like what happened with The Witcher 3, and then there are complete shit shows where the demo is used exclusively to deceive gamers into pre-orders and almost none of its assets are even used in the final game, like Colonial Marines.


I get that some people are more okay with it than others. However, pretty much everyone on the consumer side would agree that demos and screenshots should represent the final product as closely as possible, since we usually pay for fully interactive games and not pre-rendered videos.

What I don't get is people like DSoup who feel the need to insult the ones who simply point out and criticize said differences, furthering this weird insult obsession by posting weird pictures of men with a "Dunce" hat.

I know there are tastes for everything: people who simply don't like videogames, people who like wearing latex and being whipped, people who enjoy being peed on, etc. So it's no wonder that some people would enjoy being deceived again and again by pre-rendered game demos. I'm fine with that, really.

But enjoying the deception and insulting the ones who don't, then trying to excuse it through something-something-creative-iterations, when hundreds of new games don't deceive people through their demos every year? That's a whole new level of weird, IMHO.
 
I totally understand that, and that's why some games get downgraded from full HD to something less, or have textures, geometry, particle effects, etc. scaled back (these things are perfectly understandable). What we're seeing from Ubisoft is IMO something else entirely. We can agree to disagree.

It's like what fast food companies do to their food in commercials, but at least it's largely understood that the food won't look anything like that when the minimum-wage employee hands you your taco. The gaming press and gamers alike eat these demos up, rarely if ever pointing out (or even realizing) that the final product will scarcely resemble what was shown on stage.

You also have to consider that Ubisoft is the largest Gameworks partner for Nvidia. It's likely that when creating those early demos they heavily used the stock Gameworks libraries Nvidia provided, with management hoping it would reduce the development time (and thus cost) of each title as well as increase its graphics quality. The effects look great and may even perform decently individually, but throw everything in to max out the "bling" factor and suddenly you find your game performs like ass on most modern hardware once you start building the actual game around all those Gameworks routines.

Gearing up for release involves getting the game into something that can perform well across a variety of hardware, not just 2x Titans in SLI. :p You can see they're getting better at it: Watch Dogs was horrible, AC: Unity was horrible, Far Cry 4 wasn't so bad, and Far Cry Primal was mostly not so bad. Well, at least they're getting better at releasing games in a runnable state.

The Witcher 3 also suffered from some of that. Hairworks was still a significant performance hog when the game was released, even on Nvidia's own hardware, rendering the game unplayable on the majority of hardware unless the user turned it off. I wonder how many Gameworks effects libraries were completely cut from the game in order to release it in a state that would run on most people's hardware?

But hey, at least it's a step forward from using CGI (promoted as representative of gameplay) like in Killzone 2 or a lot of games from the '90s and early 2000s. Although it could be argued it's potentially even more misleading, especially if the press got to play a "playable" E3 (or other gaming-centered event) build of the game.

Regards,
SB
 
What I don't get is people like DSoup who feel the need to insult the ones who simply point out and criticize said differences, furthering this weird insult obsession by posting weird pictures of men with a "Dunce" hat.
That was humour. Perhaps a poor attempt, but humour nonetheless.
 
You also have to consider that Ubisoft is the largest Gameworks partner for Nvidia. It's likely that when creating those early demos they heavily used the stock Gameworks libraries Nvidia provided, with management hoping it would reduce the development time (and thus cost) of each title as well as increase its graphics quality. The effects look great and may even perform decently individually, but throw everything in to max out the "bling" factor and suddenly you find your game performs like ass on most modern hardware once you start building the actual game around all those Gameworks routines.

Gearing up for release involves getting the game into something that can perform well across a variety of hardware, not just 2x Titans in SLI. :p You can see they're getting better at it: Watch Dogs was horrible, AC: Unity was horrible, Far Cry 4 wasn't so bad, and Far Cry Primal was mostly not so bad. Well, at least they're getting better at releasing games in a runnable state.

The Witcher 3 also suffered from some of that. Hairworks was still a significant performance hog when the game was released, even on Nvidia's own hardware, rendering the game unplayable on the majority of hardware unless the user turned it off. I wonder how many Gameworks effects libraries were completely cut from the game in order to release it in a state that would run on most people's hardware?

But hey, at least it's a step forward from using CGI (promoted as representative of gameplay) like in Killzone 2 or a lot of games from the '90s and early 2000s. Although it could be argued it's potentially even more misleading, especially if the press got to play a "playable" E3 (or other gaming-centered event) build of the game.

Regards,
SB
The most glaring example (Rainbow Six Siege) doesn't seem to have anything to do with Gameworks effects. It's literally like comparing two different games, even more extreme than the differences between cross generation titles that come out on old and new consoles. I find this difficult to blame on NVIDIA.
 