Microtransactions: the Future of Games? (LootBoxes and Gambling)

From Ubisoft financials:

Player Recurring Investment (DLC, In-game Purchases, Season Passes, Subscriptions, Advertising) - 175 million euros.

More than the revenue from either physical or download sales (though not both combined). :yep2:
 
As long as this increases the production values and the supported time of AAA games even further I have no problem with that.

And that the game is balanced to not incentivise microtransactions. Shadow of War took a lot of shit, unfairly I thought, but I never considered (or felt like I needed to) spending actual money.
 
Or just charge more. AAA games have launched at $60 for a long, long time and have not kept pace with inflation, which impacts development costs.
But then the core game, in terms of gameplay duration and content, has generally shrunk across the board in AAA games.
I appreciate there are examples outside of this, but from what I understand this is the trend, and it helps balance the fixed price.

One that stands outside that trend may be Assassin's Creed Origins for the core game; it would be interesting to compare its core content to Witcher 3's, beyond the post-launch support that Origins' recurring-revenue income funds.
 
But then the core game, in terms of gameplay duration and content, has generally shrunk across the board in AAA games.
I appreciate there are examples outside of this, but from what I understand this is the trend, and it helps balance the fixed price.

One that stands outside that trend may be Assassin's Creed Origins for the core game; it would be interesting to compare its core content to Witcher 3's, beyond the post-launch support that Origins' recurring-revenue income funds.
I think development costs are a major win for CDPR. Due to currency exchange and likely some cost-of-living differences between, say, California and anywhere else in the world lol, the ROI is much higher than at other studios.
 
I think development costs are a major win for CDPR. Due to currency exchange and likely some cost-of-living differences between, say, California and anywhere else in the world lol, the ROI is much higher than at other studios.
Yeah that is a very good point.
One of the articles about a studio covered the per-developer employee cost between, say, California and outside of the state - they had to move out of state.
I was using Witcher 3 more as the gold standard that fits the gameplay/content design of Assassin's Creed Origins; Witcher 3 had its own financial risks, and the studio CDPR still does.

Maybe a more relevant comparison would be Assassin's Creed 3 to Origins, both from a core content perspective, and the same with other game series/franchises with over a 5-year spread.
But then again a franchise/series can be bounced around between studios geographically, which does happen.
 
Yea, I mean I imagine there are a lot of cost-of-living type things that matter. I mean, when Shifty is upset that people are leaving the UK to come to Canada for dev work lol, I think I can understand some of it, even though I think the cost of living is high here. The governments here have been pressing to set up more studios, which is why you see them growing in MTL, TOR, Van. I think only BioWare is on its own in Edmonton.
 
But then the core game in terms of gameplay duration-content has reduced generally across the board in AAA games. I appreciate there are examples outside of this, but from what I understand this is the trend and helps towards balancing the fixed price.
My AAA experiences this year - Horizon Zero Dawn, Shadow of War, Mass Effect Andromeda - have all been long gaming experiences. Rolling back a year there were COD Infinite Warfare (loooong), Uncharted 4 (too long), Rise of the Tomb Raider (too long), and Watch Dogs 2 (longer than WD1). I'm sure there are shorter AAA games but I'm not playing them.
 
That has nothing to do with being lazy and everything to do with trying to make a living/profit (if you had a working game made super efficiently and the chance to double your profits just by adding loot boxes, would you refuse?). The same attitudes and mindset regarding development exist now as did in your imaginary ye olde days (very varied between different dev companies); only the workloads have gotten bigger and bigger as the games and demands have. Your attitude is insulting, disrespectful, and rather typical of the modern entitled gamer. It's really no wonder we're seeing more of a schism between them-and-us IMO. Doesn't matter what the devs do, they're lazy because games aren't perfect, and with so many whinging gamers out there, I can see why devs and pubs who perhaps cared more for the art once upon a time would decide to just sod it and milk them for all they can.

Edit: In fact more than that, you're not just calling them lazy but stupid too. You're suggesting all these devs could reduce costs and so require fewer sales to remain profitable, but instead they rack up the bills during development in the hope of making more back on loot boxes. "Okay guys, let's not work efficiently for a total cost of $50 million so we can be safer about recovering costs and increase our odds of staying open another year. Let's instead burn money and rack up $100 million in bills, and then add loot boxes to try and recover that. What's that? Work efficiently, save costs, and add loot boxes to make more money? That sort of thinking is only for the smart, business-savvy entrepreneur. Our options are only 'efficient and no microtransactions' or 'wasteful and microtransaction gambling', as per the Unwritten Dev Code by which we must abide or be snubbed by our GameDev peers."

Bonkers, insulting argument based on nothing but a dislike of loot boxes projected into a shotgun character assassination of the guys and gals busting their guts to make your games.

Missed this mean post of yours! So here is my take on current AAA development:

I really do believe that when game studios run into trouble during development, they try to solve it by throwing more people at the problem (instead of going the indie route and using creativity and ingenuity; as an example, I brought up NT with Hellblade, whose dev YT videos gave us really good insight into their development).

And yes, in this sense they go the 100-mill way and not the 50-mill way and take the lazy (throw money at the problem) route. And then everyone complains that the costs of development are so high... throw more people at the problem, who then introduce DLC, loot boxes, MT...

Here comes a real question I am curious about: is it really necessary to have hundreds of devs for a single game? No one seems to question these extreme dev numbers nowadays; they're taken as a given. But why?

PS: After this mean post of yours, I don't like you anymore Shifty!!!!!!! (Well, I kinda still like you, but I will never admit it from now on!!!!!)
 
Interesting video arguing that games are not more expensive to make


This brings me to the question: are we 100% sure that dev costs increase?
 
Interesting video arguing that games are not more expensive to make


This brings me to the question: are we 100% sure that dev costs increase?
Yes.
The real issue is content creation.
In any large AAA studio, you're looking at roughly a 30/70 split between actual programmers and asset/content/world/lighting creators.
As graphical fidelity scales up, the amount of work for both teams starts to increase dramatically, and you get into new types of roles that wouldn't exist in other games, like:
motion capture
sound capture
lighting and art direction
camera direction

As you increase fidelity and scale the game from low- to high-end systems, someone has to make the art for the different areas. It's not like the game knows how to scale a 4K texture and reduce its quality so that it runs on a weaker box. Someone has to be responsible for reducing geometry (while still keeping it good-looking), and someone for reducing texture sizes, etc.

When scenes get super complex and the CPU and GPU budgets get tight, more and more batching and bundling has to occur, and more meshes start to change. So originally maybe there were 3 meshes in 1 room, but they ran out of budget on the low-end system and needed to remake them as 1 mesh. Then perhaps later on the lighting and shadows don't work, so they need to move things around. And then of course there's performance testing and optimization.

All of this costs money. It's like constructing a building with few blueprints: a bunch of people constantly changing things, not because the developers or the PM are incompetent, but because as we demand higher-fidelity graphics, the boundaries within which that work happens differ from those of lower-end machines. So the more platforms you support, the more $$$ it costs - and all of it is overhead from supporting different platforms. Exclusives, on the other hand, have all that $$$ poured into 1 version of the game, with significantly less overhead (at least until mid-gen refreshes were introduced).
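To make the texture-scaling point concrete: a minimal sketch (hypothetical function, not any particular engine's pipeline) of the mip chain someone has to author or validate when a 4K texture must also ship on weaker hardware:

```python
def mip_chain(width, height):
    """Return the list of (width, height) mip levels for a texture,
    halving each dimension (minimum 1 px) down to 1x1."""
    levels = [(width, height)]
    while width > 1 or height > 1:
        width = max(1, width // 2)
        height = max(1, height // 2)
        levels.append((width, height))
    return levels

# A 4K (4096x4096) texture has 13 mip levels; a low-end SKU might
# drop the top levels and ship from 1024x1024 down instead.
chain = mip_chain(4096, 4096)
low_end = [lvl for lvl in chain if lvl[0] <= 1024]
```

Each of those levels is automatic to generate but not automatic to look good: the artist still has to check them, which is part of the per-platform overhead described above.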
 
Kotaku did a nice article on this subject.
I asked a ton of developers how they calculate their budgets.
A few of the bigger companies wouldn't get into specific numbers -- like I said, notoriously secretive -- but all of the studios that did answer offered the same magical number: $US10,000 ($12,560). Specifically, $US10,000 ($12,560) per person per month.
"That's a good ballpark number," said Obsidian's Adam Brennecke, executive producer of Pillars of Eternity and its upcoming sequel.
"Based on the average salary for a developer plus overhead, it costs about $US10,000 ($12,560) per person at the studio. Some are more expensive. And that's how you usually do budgets with publishers too."
That number -- which might go even higher if you're in an expensive city like San Francisco -- accounts for salary, office rent, insurance, sick days, equipment, and any other costs that come up over the course of development.
https://www.kotaku.com.au/2017/09/why-video-games-cost-so-much-to-make/
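The $10,000-per-person-per-month rule of thumb quoted above makes budget arithmetic easy to sketch (the team size and schedule below are hypothetical examples, not figures from the article):

```python
COST_PER_PERSON_PER_MONTH = 10_000  # USD, the ballpark quoted in the Kotaku article

def project_budget(team_size, months, monthly_rate=COST_PER_PERSON_PER_MONTH):
    """Rough AAA budget: headcount x schedule x fully loaded monthly cost
    (salary, rent, insurance, equipment, etc.)."""
    return team_size * months * monthly_rate

# e.g. a hypothetical 200-person team working for 3 years:
budget = project_budget(team_size=200, months=36)  # 72,000,000 USD
```

Which is why headcount, not hardware, dominates these budgets: doubling the team doubles the bill no matter what the game is.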
Then consider how many people are involved in development even at big independent studios such as CDPR.

But personally I do think there is a trend of game content being pared back compared to years ago, whether in duration or in game detail (beyond visuals).

You can make games much cheaper, but the effort seems crazy and not something a AAA studio under a large publisher would necessarily do. Look at Ninja Theory, which saved money by buying cheap lighting and modding cameras and techniques for the visual rendering of Hellblade: Senua's Sacrifice; it cost under $10m to develop, and it's a nice-looking game as well.
And no "monthly recurring revenue" :)
Some reports, though, suggest the gaming industry is becoming high stakes for independent studios wanting to develop games with the feel of AAA.
https://www.engadget.com/2015/09/16/ninja-theory-hellblade-indie-game/
 
Seems a bit relevant to the topic at hand. The recent patch for Star Wars: Battlefront II has drastically reduced prices on heroes. Some take this as a sign that the publishers are taking notice. I'm not convinced. I think they only lowered the price because their metrics indicated not many gamers were opting to buy. In order to entice gamers to purchase, they were forced to lower the price to bring more through the door.

https://www.trueachievements.com/n3...ii-patch-drastically-reduces-prices-on-heroes
As microtransactions and loot boxes continue to dishearten the gaming community more than ever before, at least some complaints are being heard loudly and clearly. That's evidenced by today's patch for Star Wars Battlefront II that drastically cuts the cost of unlocking heroes by a massive 75%.
 
Yes.
The real issue is content creation.
In any large AAA studio, you're looking at roughly a 30/70 split between actual programmers and asset/content/world/lighting creators.
As graphical fidelity scales up, the amount of work for both teams starts to increase dramatically, and you get into new types of roles that wouldn't exist in other games, like:
motion capture
sound capture
lighting and art direction
camera direction

As you increase fidelity and scale the game from low- to high-end systems, someone has to make the art for the different areas. It's not like the game knows how to scale a 4K texture and reduce its quality so that it runs on a weaker box. Someone has to be responsible for reducing geometry (while still keeping it good-looking), and someone for reducing texture sizes, etc.

When scenes get super complex and the CPU and GPU budgets get tight, more and more batching and bundling has to occur, and more meshes start to change. So originally maybe there were 3 meshes in 1 room, but they ran out of budget on the low-end system and needed to remake them as 1 mesh. Then perhaps later on the lighting and shadows don't work, so they need to move things around. And then of course there's performance testing and optimization.

All of this costs money. It's like constructing a building with few blueprints: a bunch of people constantly changing things, not because the developers or the PM are incompetent, but because as we demand higher-fidelity graphics, the boundaries within which that work happens differ from those of lower-end machines. So the more platforms you support, the more $$$ it costs - and all of it is overhead from supporting different platforms. Exclusives, on the other hand, have all that $$$ poured into 1 version of the game, with significantly less overhead (at least until mid-gen refreshes were introduced).
Costs are increasing with "diminishing returns" because many of these technologies were already implemented at a time when microtransactions almost did not exist. In addition, porting is not as complex as it used to be.
Consoles are a lot more similar than in any of the previous generations, and their PC-based architecture makes scaling a lot easier, since devs already take into account that there are various configurations in the PC space.
Microtransactions are not a necessity. They are a "newly" discovered method of profit maximization. Their existence is unrelated to costs.
Microtransactions grew out of mobile gaming, and since it is a formula that generates money, and businesses want money, they decided to establish it as a norm in the console/PC space as well.
 
Seems a bit relevant to the topic at hand. The recent patch for Star Wars: Battlefront II has drastically reduced prices on heroes. Some take this as a sign that the publishers are taking notice. I'm not convinced. I think they only lowered the price because their metrics indicated not many gamers were opting to buy. In order to entice gamers to purchase, they were forced to lower the price to bring more through the door.

https://www.trueachievements.com/n3...ii-patch-drastically-reduces-prices-on-heroes
I think they screwed themselves over. They wanted the high-end characters to be a real achievement to unlock but made them easy to simply buy. Now, due to either pressure or the community not liking their pay-to-win garbage and not partaking, they've had to lower the cost, and everyone will have access to them in very little time.
 
For the past 15 years a larger and larger segment of the consumer base has been shifting its focus to competitive and cooperative multiplayer. Instead of getting a few weeks out of a game, it's becoming half a year or longer. That reduces the number of titles people purchase, while at the same time a large segment of casuals has shifted away to mobile and its free-to-play model.
Meanwhile, gamers' expectations for visual fidelity and realism are increasing. Fallout 4 got a lot of criticism for its clunky animations and visuals, which were not on par with Witcher 3.
I remember at the beginning of the generation people wanting open-world games. They praised games that were open world, which pushed a lot of devs to pursue that as well. This also raises costs.
People want voice acting for all the NPCs in games nowadays. Text is no longer acceptable except in indie games.
 
Seems a bit relevant to the topic at hand. The recent patch for Star Wars: Battlefront II has drastically reduced prices on heroes. Some take this as a sign that the publishers are taking notice. I'm not convinced. I think they only lowered the price because their metrics indicated not many gamers were opting to buy. In order to entice gamers to purchase, they were forced to lower the price to bring more through the door.

https://www.trueachievements.com/n3...ii-patch-drastically-reduces-prices-on-heroes


Unfortunately, that's not the only change they made.
They lowered the credit "prices" for the heroes, but what they didn't say is that they drastically lowered the credits earned from each activity.
Game Informer is saying they slashed the campaign completion award from 20,000 to 5,000 credits.
http://www.gameinformer.com/b/news/...efront-ii-review.aspx?utm_content=buffer3929d

That's a 75% cut. The same 75% they cut from each hero's "price".
Which they obviously failed to mention in the blog post.
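The arithmetic here is worth spelling out. Using the 20,000-to-5,000 campaign award figure above plus a hypothetical hero price cut by the same reported 75%, the grind per hero comes out unchanged:

```python
def campaigns_per_hero(hero_price, campaign_award):
    """How many campaign completions' worth of credits one hero costs."""
    return hero_price / campaign_award

# Before the patch: a hypothetical 60,000-credit hero, 20,000-credit campaign award.
before = campaigns_per_hero(60_000, 20_000)  # 3.0 completions per hero
# After: both numbers cut by the same 75%.
after = campaigns_per_hero(60_000 * 0.25, 20_000 * 0.25)  # still 3.0
```

When price and earn rate drop by the same percentage, the effective cost in playtime is identical; only the headline number shrank.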

Edit: to sum it up in digital PR terms, they're trying to make the rage outdated:

 
Yes.
The real issue is content creation.
In any large AAA studio, you're looking at roughly a 30/70 split between actual programmers and asset/content/world/lighting creators.
As graphical fidelity scales up, the amount of work for both teams starts to increase dramatically, and you get into new types of roles that wouldn't exist in other games, like:
motion capture
sound capture
lighting and art direction
camera direction

As you increase fidelity and scale the game from low- to high-end systems, someone has to make the art for the different areas. It's not like the game knows how to scale a 4K texture and reduce its quality so that it runs on a weaker box. Someone has to be responsible for reducing geometry (while still keeping it good-looking), and someone for reducing texture sizes, etc.

When scenes get super complex and the CPU and GPU budgets get tight, more and more batching and bundling has to occur, and more meshes start to change. So originally maybe there were 3 meshes in 1 room, but they ran out of budget on the low-end system and needed to remake them as 1 mesh. Then perhaps later on the lighting and shadows don't work, so they need to move things around. And then of course there's performance testing and optimization.

All of this costs money. It's like constructing a building with few blueprints: a bunch of people constantly changing things, not because the developers or the PM are incompetent, but because as we demand higher-fidelity graphics, the boundaries within which that work happens differ from those of lower-end machines. So the more platforms you support, the more $$$ it costs - and all of it is overhead from supporting different platforms. Exclusives, on the other hand, have all that $$$ poured into 1 version of the game, with significantly less overhead (at least until mid-gen refreshes were introduced).

But the video I posted takes this into account and argues that a single game costs more (only because the publisher chooses so, by having fewer but bigger titles), while the overall (!) dev costs stay constant or even go down.

And that is a real difference imo and gives another perspective on the 'games are more expensive to make' debate.

If you take the overall cost spent on development, the guy in the video argues that the costs actually go down a bit over the years (!). So publishers spend less and less total money (across all projects) on game development over the years. In this sense, game development gets cheaper.

They have fewer games per year and use DLC, loot boxes, MT, remasters etc. to extend the life of each game.

An extreme example, I admit not super representative but still interesting: GTAV.

The art and assets created are being used in what feels like 100 versions of the game over the last five years. How many other games did they develop in this time? How do you calculate the dev costs of GTAV... per version released (remaster, PC, etc.)?

The video also pointed out that even before changing to 'fewer but bigger games per year'... all publishers were very profitable already, so there was no immediate need to change and introduce MT etc.
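The GTAV cost-allocation question can be sketched as simple amortization: spread the one-time development cost over every released version and add each version's own porting cost. All numbers below are hypothetical placeholders, not Rockstar's actual figures:

```python
def cost_per_release(dev_cost, port_costs):
    """Amortize a one-time dev cost evenly over every released version,
    adding each version's own (smaller) porting cost on top."""
    releases = len(port_costs)
    return [dev_cost / releases + port for port in port_costs]

# Hypothetical: $200M original development, then two re-releases
# (original consoles, next-gen remaster, PC port) with their own port costs.
per_version = cost_per_release(200_000_000,
                               [0, 20_000_000, 10_000_000])
```

Under this view each re-release looks cheap, which is exactly the argument: reusing assets across versions drives the per-release cost down even if the headline budget was huge.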
 
Assuming that devs know what they're doing and that they are evil down to their hearts (see certain EA tweets in the last few days):

Very smart move!

1.) Devs know that MT will give bad press and risk their investment.

2.) Let's play a trick: make everything super expensive first, downright obviously unfair, to fuel the uproar.

3.) Uproar happens; let it burn and cook a bit.

4.) Jump in as the big savior and reduce the stupidly high costs down to only 'very very high costs that are still unfair, but drastically better than before'.

5.) Watch the positive press, and players and reddit feeling that they have accomplished a change in a big company.

6.) Get ready for money.

That feeling of accomplishment... all gamers are addicted to it.

Nicely played, 'Evil Ampire' :)
 
They have fewer games per year.
Why is that?

I think the video makes a fair argument but is wrong. Back in the old days, so many games failed to make money that it was high risk, with publishers enduring the chance of small-game sales in the hopes one would hit big. They learnt from experience that big games are more stable and less risky, because that's what gamers want. Instead of gamers buying twenty 10-hour games a year at $60 a pop, they're buying three 100+ hour games a year, each costing more to make than one of those 10-hour games, so publishers have reduced output, focussing on just a few landmark, dependable titles, which typically offer crazy numbers of hours of gameplay online.

If you look at it in terms of how much money gamers spend per hour of gaming, without MT that would be dramatically reduced, I think. Total income into the gaming industry would be down, as far fewer games are played.
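The spend-per-hour point can be sketched with the numbers in the post (twenty 10-hour games versus three 100+ hour games, both at a $60 sticker price; purely illustrative):

```python
def dollars_per_hour(games_bought, hours_each, price=60):
    """Average spend per hour of play across a year's purchases."""
    return (games_bought * price) / (games_bought * hours_each)

old_model = dollars_per_hour(20, 10)   # $6.00 per hour of play
new_model = dollars_per_hour(3, 100)   # $0.60 per hour of play
```

A tenfold drop in revenue per hour played is the gap that MT and other recurring revenue would be filling under this argument.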
 