Yes, the title of this post makes it clear: if some video game monetization strategies are deemed “acceptable,” it certainly means that some others… aren’t! Following various controversies, and in these times of exploration, the video game industry seems in dire need of guidance and of clear lines not to cross when developing and implementing its monetization strategies. It is therefore worth trying to define what constitutes “acceptable monetization” in video games. This post delves deeper into the dark side of monetization and tries to underline a few limits that should not be crossed by game publishers and developers who aim to build long-lasting, trust-based relationships with players.
In a previous post, I discussed at length the most recent innovations in monetization models, such as the strongly disputed loot boxes and, at the other end of the spectrum, the much more promising Games as a Service approach. In the course of my research, I came across many examples of the fine line between marketing efficiency and player-enraging propositions.
After paid loot boxes were declared an illegal form of gambling in Belgium, video game companies were begrudgingly forced to remove them from their games, and the big publishers reacted in various ways. Most companies, like Blizzard or Valve, complied while simply expressing disagreement with the ruling, whereas 2K Games complied but called on its fans to contact local regulators and government representatives to vouch for the practice. The only company that has refused to comply so far is EA, whose CEO, Andrew Wilson, believes that paid loot boxes do not constitute a form of gambling and therefore plans to “push forward” with the model.
The situation escalated because of this refusal. EA has been placed under criminal investigation following a request from the Belgian Gaming Commission and seems to be heading toward a legal battle with the country’s government. Furthermore, this led a group of 15 European gambling regulators and the Washington State Gambling Commission to join forces in order to “address the risks created by the blurring of lines between gaming and gambling”, especially for children. Indie developer Rami Ismail praised the move, stating that “as an industry, we’ve failed so hard at any common-sense, containment and regulation of loot boxes, that I’m very OK with international governments taking that into their own hands.”
This tremendous negative exposure doesn’t help the already disgraced loot box practice but is especially damaging to EA’s image (and, from an outsider’s perspective, to the whole industry). Its relentless pursuit of short-term profit may prove very detrimental for the company’s long-term image and profit.
I've reached the point where I think as an industry, we've failed so hard at any common-sense containment and regulation of loot boxes, that I'm very OK with international governments taking that into their own hands. https://t.co/klMkruAeqx
— Rami Ismail (@tha_rami) September 18, 2018
Acceptable monetization in video games
Following this controversy, and in these times of exploration, the industry needs guidance and clear lines not to cross when developing and implementing its monetization strategies. The goal of this article is to help game companies learn lessons from the past and develop a critical mindset when considering certain practices. The tremendous potential gains offered by some of them can be blinding, and it is crucial to develop a moral compass or, at the very least, to be aware of players’ likely reactions.
“Acceptable monetization” in video games can be defined as monetization mechanics and strategies that feel acceptable to the public and can therefore safely be implemented and relied on, without the risk of driving players away, triggering backlash and controversy, or hurting a company’s image and long-term profitability.
Here are a few rules developers should follow:
Only sell complete, full-featured games
When Mass Effect 3 launched in 2012, it was met with heavy criticism because of a disappointing ending to an otherwise compelling franchise, combined with somewhat abusive DLC (Downloadable Content) practices. Mass Effect immersed players in a branching adventure where all the decisions they made – including important and difficult ones, like who lives and who dies – were taken into account and led to different outcomes. This created profound engagement and investment from fans in the game’s universe. Unfortunately, the ending offered no real choice at all, and all the decisions players had made previously were completely disregarded. The most hardcore fans felt disrespected and voiced their disappointment.
To mitigate the player outcry, EA and BioWare released Mass Effect 3: Extended Cut, a free DLC that expands on the original ending and provides more closure and more outcomes to Commander Shepard’s story. But even before that, a month after the game launched, they released another DLC called the Resurgence Multiplayer Pack. Although it was supposed to be free, the extension accidentally appeared on sale first, for $4. The error was promptly corrected, but it led players to suspect it wasn’t a simple mistake: many believed the DLC wasn’t initially intended to be free and was ultimately made so only as a way for EA to appease players.
Many even suspected that Mass Effect 3’s ending was purposefully left vague so BioWare and EA could sell the true ending separately later down the road and generate more profit. This theory gained traction and was further reinforced by the controversy around Mass Effect 3’s day-one DLC, “From Ashes”. From the day it came out, the game already offered paid additional content, which proved lucrative for EA in the short term: 40% of consumers who purchased the game in-store bought the DLC as well. However, by digging into the game’s source files, some hackers found a way to unlock the DLC and activate the “From Ashes” additional character without paying. Players naturally accused EA of purposefully stripping this content from the main game in order to sell it separately and generate more revenue. EA refuted these claims, but the harm to its image had already been done.
The first, and probably most important, element in any game and with any type of monetization is that the base game needs to be complete (or at least feel complete). This is especially true with the premium model. If, on top of paying full price for a game, players need to spend even more to access the full experience, it can never feel right to them. We saw similar player feedback when EA introduced the Online Pass system in 2010.
In an attempt to take a bite out of the second-hand market, EA launched “Project Ten Dollar”, whereby some parts of a game bought from a reseller were locked behind a paywall, so players had to pay an extra $10 to access the complete game. Many other companies followed, but the practice has since been dropped because of player outrage. As a rule of thumb, companies should never deliberately block access to or remove parts of their game and then charge more to unlock them (unless it’s a freemium or episodic game). Releasing a game that is (or feels) final and complete is essential. In this context, paid DLCs are acceptable, but they should bring extra content, not merely allow players to finalize the experience they were supposed to have already paid for.
Stay away from pay-to-win
The second point, which is already commonly accepted in the industry, is that games shouldn’t implement or rely on “pay-to-win”. A game needs to be accessible and enjoyable in the same way for everyone. In an online competitive game, if some players can pay their way to victory or to a significant advantage over others, those who cannot afford it will grow frustrated, rebel, or simply abandon the game. The loot box controversy initially exploded because EA took the practice one step too far by implementing pay-to-win mechanics. This was enough to spark an industry-shaping backlash and catapult the already disliked loot box model onto the mainstream public stage.
Most notably, World of Tanks developer, Wargaming, took a strong stance against pay-to-win and assured none of their future games would implement the practice. In an interview with Gamasutra, studio VP of publishing, Andrei Yarantsau explained it is impossible to “provide a truly triple-A free-to-play experience without absolutely making sure all combat options are free of charge to all players. We don’t want to nickel and dime our players – we want to deliver gaming experiences and services that are based on the fair treatment of our players, whether they spend money in-game or not.”
It is worth noting, however, that the acceptability of a monetization mechanic can differ according to culture. As CNBC reports, pay-to-win mechanics are more acceptable in some Asian countries, like China, than in Western countries. This likely stems from different cultural norms and early habits that emerged with the local gaming market. In the Western markets, pay-to-win isn’t acceptable or even viable in the long run and companies have found alternatives in the form of purely cosmetic upgrades.
Don’t take advantage of the most vulnerable
The last important point, which has become clearer in recent years with the democratization of microtransactions and free-to-play, is that games and monetization mechanics shouldn’t take advantage of the most “vulnerable” players.
In 2013, a five-year-old child managed to spend £1,700 on Zombies vs. Ninjas, a free-to-play iPad game, in just a few hours of play. The company behind the game didn’t actively prey on the child to make him spend this much, and Apple ended up reimbursing the family soon after, so this case didn’t lead to a controversy. However, the extreme ease of in-game access to the purchase system, and the absence of safeguards, raised concerns about the way video games take children into consideration. It also served as a reminder that taking advantage of younger players can never be acceptable.
But children aren’t the only category of vulnerable people. Today, this extends to taking advantage of human psychology in order to trick consumers into spending more. People prone to gambling addiction, for example, are starting to be considered vulnerable, and playing on their psychological impulses is increasingly frowned upon. In slot machines and other games of chance, coming close to winning – but still losing – has been shown to trigger the same reward systems in the brain as actually winning. This is called the “Near Miss Effect”, and it keeps gamblers playing and spending longer, as they experience the high of winning and subconsciously feel as if they have. This phenomenon is, to some degree, found in certain monetization practices and explains why loot boxes are facing heavy regulation in countries like Belgium.
Another mechanic that somewhat abuses human psychology is item rotation: setting a time limit on how long players have to purchase certain items, thus creating a false sense of urgency. This has been used in games like FIFA 19, but Fortnite is probably the most famous example, with its whole monetization system revolving around it, as I detailed here. The game’s seasonal approach, with the Battle Pass, aims at keeping players engaged as long and as frequently as possible, so they get exposed to the rotating items in the shop. The items disappear after 24 hours, leaving players little time to decide whether they want to purchase them. This practice isn’t (yet) widely discussed, but I can very well see it becoming a hot topic of discussion and controversy in the coming months. It is all the more likely to become problematic given that many children play Fortnite, thanks to its cartoony look and high accessibility (free-to-play).
Acceptable monetization: Finding balance
There are probably more elements that define acceptable video game monetization – and I welcome your input in the comments below – but following this basic set of rules and guidelines should already help companies steer clear of controversy and maintain a good image over time. The first two principles have already been made quite clear by the gaming community, and the lessons seem to have been learned.
The real point of concern today revolves around the abuse of psychological impulses to trick people into spending more. Because the industry’s consumption habits and monetization models are shifting, and because the potential gains are so big, the risk of slip-ups and abuse is greater than ever. We are now hearing about Activision Blizzard’s patent describing “how machine learning could be used to entice players to spend more” by matching those who bought a certain item with others who haven’t, so that the latter get exposed to it and want to purchase it as well. This kind of behavior might soon raise ethical issues. What’s at stake in the long term is the relationship that video game developers and publishers intend to build with their players, as the strongest structuring force lies more and more with data. Data will shape the future, and, in that sense, Activision Blizzard’s experiment and its outcomes will be interesting to follow.
Overall, microtransactions and F2P monetization techniques generate a lot of discussion, specifically because of the way they are used. Some, like Nintendo and Shigeru Miyamoto, outright refuse to implement these methods and advise against relying exclusively on them. Others, like the YouTube channel Extra Credits, believe in the model and even welcome it, offering new perspectives on current practices so that players “enjoy spending money on the game”. This would benefit consumers and developers alike, and strike a balance between players’ desire for acceptable and fair monetization and companies’ pursuit of increasing profits.
In this video, YouTube channel Extra Credits explains why most of the industry is doing microtransactions and F2P monetization wrong and proposes ways to implement them in a healthy and beneficial way for consumers and developers.