As with everything in the cyclical nature of gaming, the industry is back to discussing a concept that was a foregone conclusion for games just a handful of years ago. In the same way that we’re contemplating what ownership in gaming means (when two decades ago players loved owning—and possibly reselling at GameStop—their game discs), we are again considering the lifetime of a game and what it means to play a game forever.
In an age of planned obsolescence and digital games-as-a-service (GaaS), it’s hard to believe that you can still fire up Ocarina of Time on an N64 connected to a CRT and beat the Water Temple again. That Zelda came out at a time when games shipped as locked and final entities, with no opportunity for updates or additional content, and of course no multiplayer. The benefit of having games that lived offline and generally free of game-breaking bugs meant that one could play and replay the game forever, or at least as long as the hardware lasted.
The landscape has shifted hugely since then, with most games distributed digitally along with the player expectation of constant new content, iterative improvements, and in many cases, multiplayer. In this new world, the hardware can come and go (you get a new phone every year, upgrade your gaming PC every three years, etc.), but the games persist with you. At least, that is, until the developer shuts them down.
GaaS titles are notoriously resource intensive. The cost of hosting servers as well as the large development teams for new modes and content—in addition to regular updates and fixes—means that devs shell out a lot of money with typically diminishing returns over time as players spend less or leave for other games. Some titles these days manage to stay fresh, filled with gamers, and financially rewarding for a decade or more (look at GTA Online), but most will see a steady decline to the point that the developer is losing money keeping the game going. That’s when games die.
Sony’s shutdown of Concord was a perfect example of inverted GaaS math: if that game had been an old-school, single-player, one-and-done premium purchase, even modest day one sales wouldn’t mean that the discs were pulled from shelves and the game wiped into oblivion. Instead, they would have been heavily discounted to try to shift as many remaining units as possible. The game would potentially even find a small but cultish group of fans who’d probably speak of it for years. But since this was a live ops game that could only thrive with massive audiences, it was immediately shut down. Any extra day the game was live was rubbing salt into an already $400M wound.
Not every game will have that kind of day-one defeat, but basically all games will reach a point where developers don’t see the benefits in keeping that burden on their plates. As gamers we’ve become almost accustomed to the idea that games will die and everything that we’ve bought and earned—all the street cred, all the achievements, all the status, all the benefits—will also die. Like clockwork, EA takes the oldest FIFA (now EA Sports FC) title and winds it down, users buy the new copy, buy athletes, make teams, and start earning their spot at the top all over again. This practice is now getting examined with increased skepticism from the mainstream, and lawmakers are starting to balk at this concept of fake ownership. If you’re “buying” digital goods but you can lose access to them at any time, should it really be called buying rather than licensing?
What if it didn’t have to be this way?
There’s a real cost to developers in closing a game, not least of which is player trust and sentiment, with financial repercussions. How heavily do you want to invest in a game as a player knowing that it could be over at any point? Would you be likely to buy into a new version of a game given that the developer pulled the rug out from under the old, perfectly-fine version? Does the feeling that a game’s community is waning mean you’re more likely to stop purchasing and leave the game earlier than you otherwise would because you know what’s coming?
All of these feelings hurt developers’ bottom lines, and late-in-life games can still earn developers millions of dollars a month, sums that no dev would brush away as long as there was a return and it didn’t cannibalize future game launches (more on that later).
There is a way to do this, but solutions need to solve for:
The financial upside
The tech burden
Handling improvements/fixes/new content
Fortunately, a cadre of technologies and new modalities of game maintenance and ongoing support are offering developers and players the opportunities to enjoy and earn from their games forever.
Here’s a subset of what is going to help unlock this:
Distributed or decentralized servers
Item ownership
Community ownership
User generated content/modding
Shared upside between devs and community
Let’s briefly dive into each one, and at the end we’ll tie the whole strategy together.
Distributed or decentralized servers
We’ll start with solving issue #2 above: tech costs. Running multiplayer servers is extremely costly. Of course, cloud hosting helps to offset those costs scalably compared to dedicated, owned servers. This is especially true as, perhaps, the game’s MAU wanes and you don’t need as many servers. Newer entrants like Aethir aim to reduce the costs further, which is great, but all of this comes down to either a) making the server costs low enough that the remaining players’ spend more than covers them, or b) shifting the costs onto gamers.
In some instances, players are already used to this. Want to spin up a Minecraft server for just you and your friends? You have to pay for that. But it is a new concept to turn to players and say, “If you want to keep playing this game at all, you have to pony up.” While that kind of “threat” is tonally unusual, we know that hardcore and dedicated players (who are typically the ones left in late-stage game audiences) are more willing to pay to keep access to their games, which is how World of Warcraft maintains a subscription model after two decades in a market driven by free-to-play. Once the costs of servers are moved onto the remaining audience, decentralized hosting has the added benefit of ensuring that no single party has the power to shut down the game, increasing player trust via a distributed network.
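To make the cost-shifting concrete, here’s a back-of-the-envelope sketch of the break-even math for a late-stage game whose remaining players cover hosting themselves. Every number here is hypothetical, and the flat platform fee is a simplifying assumption:

```python
# Illustrative break-even math for shifting server costs onto a
# shrinking, late-stage player base. All figures are hypothetical.

def breakeven_price(monthly_server_cost: float, paying_players: int,
                    platform_fee: float = 0.30) -> float:
    """Monthly price per player needed to cover hosting,
    after a store/platform fee is taken off the top."""
    gross_needed = monthly_server_cost / (1 - platform_fee)
    return gross_needed / paying_players

# A waning game: $20k/month in hosting, 8,000 players still willing to pay.
price = breakeven_price(20_000, 8_000)
print(f"${price:.2f}/month per player")  # roughly $3.57
```

The point of the sketch is that the per-player ask can be surprisingly small, well within what a dedicated remnant audience already pays for subscriptions elsewhere.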
Item ownership
The ability for players to own their items does a number of things, not least of which is increasing retention and a player’s commitment to a game, which is a good thing to have if you need that player to come back to the game indefinitely. (Obviously, an item owned in a game that shuts down isn’t likely to be financially valuable, but the affinity or status attached to it could persist, and that can certainly help players stay in a game longer than they otherwise would, knowing there’s at least some kind of escape valve for their accrued cachet.)
For a game that intends to live on forever, however, the security of owning the in-game assets becomes a near-requirement in symmetry with the persistence of the game, i.e. it would be weird to feel like you could own and play the game indefinitely but not own and leverage your purchased items indefinitely. There’s also added safety in leveraging something like the blockchain for this item ownership, as that interoperable ledger would protect against potential data losses (and therefore losses of purchased items) when moving the game from centralized servers to a distributed network. These kinds of migrations have led to account and data losses for games in the past; any mistake along those lines would be catastrophic to the effort of sustaining a devoted, ongoing player base.
Ownership also plays a role in aligning incentives between community and developer, long after the developer has ceded control. More on this later.
Community ownership
Community ownership of a game, while also not strictly a requirement, is important for the game’s logistical upkeep: handling player-funded server payments (what body will organize payments, even in the case of decentralized servers?) and rallying players to make strategic and design decisions once the developer relinquishes that control. Player communities across Discord or Reddit help maintain chatter for games over the long term, but it’s only with the agency for change that a community can influence the game’s future in a positive way. No one wants design by committee, but ignoring the voices of players is something that developers do at their peril—often with terrible results—so it follows reasonably that players making guiding choices for a game will probably net results that they favor.
Community ownership is also important for moderation, both for in-game chat which will no longer have developer employees monitoring things, and for modded or user-generated content, which we’ll discuss in a bit. Some kind of organized structure is important for these sorts of activities, and a conscientious and deliberate shift of ownership to a community is a step in this direction.
Many gamers see agency over their games as their ultimate right, a notion that they’ve paid into the game and dedicated enough time—sometimes tens of thousands of hours and dollars—that they should be rewarded with greater influence over where the game goes. That community guidance might be more noise than benefit while the developer is still actively evolving the game, but when they’ve decided to move on, it’s only through the community that the title can continue to have advocates for its best interest.
There are different models for community ownership, and none is proven at scale. In the blockchain world, DAOs (decentralized autonomous organizations) are the governing structures that aim to coordinate resources and make decisions for some entities, and aspects of their setup are valuable for community ownership of games, even if the pay-to-win voting mechanics have the potential to allow outsize influence from some over others. Perhaps a more stable structure would be voting weight based on hours played, which could also be verified on-chain.
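As a thought experiment, playtime-weighted voting could look something like the sketch below. The square-root taper is my own illustrative choice (echoing quadratic-voting ideas) to keep extreme grinders from fully dominating; all names and hour counts are hypothetical:

```python
# Sketch of playtime-weighted community voting, an alternative to
# token-weighted (pay-to-win) DAO governance. The sqrt taper dampens
# the influence of outliers. Hypothetical players and hours.
import math

def vote_weight(hours_played: float) -> float:
    return math.sqrt(hours_played)

def tally(ballots: dict[str, tuple[str, float]]) -> dict[str, float]:
    """ballots maps player -> (choice, verified hours played)."""
    totals: dict[str, float] = {}
    for choice, hours in ballots.values():
        totals[choice] = totals.get(choice, 0.0) + vote_weight(hours)
    return totals

ballots = {
    "alice": ("new_map", 10_000),       # 10k hours -> weight 100
    "bob":   ("balance_patch", 400),    # weight 20
    "cara":  ("balance_patch", 2_500),  # weight 50
    "dan":   ("balance_patch", 900),    # weight 30
}
print(tally(ballots))  # new_map: 100.0 vs. balance_patch: 100.0 -- a tie
```

Note how one 10,000-hour whale exactly balances three moderate players here; under raw one-hour-one-vote weighting, alice alone would have won outright.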
User generated content and modding
Even the best game will struggle to keep players around if there’s nothing new. I’ve written quite a bit about modding and user generated content, concepts that help developers offload to the community the burden of providing new modes and content. UGC is powerful even in the live ops stage of a game, when creators maintain rolling updates and their evangelism in touting their own creations brings in more players and revenue. When a developer is done with its hands-on role in a game, however, a thriving modding and creator community becomes the only mechanism for generating new things to buy and new modes to play.
This is an area where blockchain technology is the scalable solution. Some players might start contributing to the game’s content and modes out of sheer passion—and there’s a long history of this—while others will rightfully expect a return on their efforts. No developer wants to be accountable for creator payouts when they don’t even want to have a hand in the maintenance of the game. Imagine Roblox dedicating legal, accounting, and technical resources to a small but dedicated cohort of creators years after the dev moved on to a new game—it won’t happen. Web3 lets UGC creators build and sell their assets without the developer ever needing to touch the transaction by bypassing the centralized payments structure that was required for that old system.
Aligning incentives
And on that last point: creators aren’t the only ones who need paying; the developers should get their cut, too. The original IP holders, even when they walk away from the game, will continue to legally own the rights to the IP and therefore should have a stake in its ongoing success. They will continue to sell the assets that they created in-house, but should also get a share of those UGC items built on the backbone of their original creation. Again, web3 comes in here, with the ability to enforce and automatically handle the split of creator royalties between modder and developer, with maybe even a third portion going to a community treasury to fund things like marketing, tournament prize pools, and more.
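The split itself is simple arithmetic; the hard part, which on-chain royalty standards aim to automate, is enforcing it on every sale without a middleman. A minimal sketch, with percentages that are purely illustrative:

```python
# Hedged sketch of an automatic royalty split on a UGC item sale:
# modder, original developer, and a community treasury each take a
# fixed share. The 70/20/10 split is illustrative, not a standard.

SPLITS = {"creator": 0.70, "developer": 0.20, "treasury": 0.10}

def split_sale(sale_price: float) -> dict[str, float]:
    """Divide one sale among the parties, rounded to cents."""
    assert abs(sum(SPLITS.values()) - 1.0) < 1e-9  # shares must sum to 100%
    return {party: round(sale_price * share, 2)
            for party, share in SPLITS.items()}

print(split_sale(4.99))
# {'creator': 3.49, 'developer': 1.0, 'treasury': 0.5}
```

In a web3 deployment, the equivalent of `SPLITS` would live in a smart contract so that the division executes on every resale with no one needing to invoice anyone.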
Bringing it all together
If we assume all the above criteria are met—the game is hosted in a cost-efficient and decentralized way, the players own their items as long as (and beyond) the lifetime of the game, the community is in a seat of authority over the direction of the game, there are regular refreshes of content, and the incentives make creators and developers happy—then we’ve satisfied the three upfront challenges that a game needs to address in order to maintain itself for eternity. If done correctly, a no-risk, win-win scenario arises: players are happy with their favorite game and developers get a steady stream of incremental revenue.
Yes but…
Isn’t killing games part of the rinse-and-repeat that drives much of the gaming industry? Won’t EA suffer horribly if it doesn’t force players to buy the latest EA Sports FC and all the players and battlepasses all over again because players are still happy with the identical-looking predecessor?
In some cases, yes, which is why forever games are not a one-size-fits-all strategy. Some games will become franchises that find their success in a scorched-earth past and annual releases, but those games are few and far between. Aside from linear premium purchases (and even those could benefit from new story modes and DLCs created and sold by the community), many games hope to last forever. It’s just that none do. Apex Legends wishes to stay relevant for decades to come, but when it shuts down it won’t necessarily spin out Apex 2, and it also likely won’t shut down with zero players or revenue. In that scenario, EA can benefit from a persistence of gamers while it works on net-new IP. And even if it does intend to create a sequel, players will have additional comfort knowing that their investments in the new release will be protected by similar longevity and care as the previous title.
So when?
There are titles that have already contributed to early experiments with forever games, but in most cases there hasn’t been enough time to see what happens since the developers are still actively involved in the game. Then there are the examples of games that have managed to maintain their relevance decades after the developers moved on. MUD1, the world’s first multi-user dungeon (launched in 1978) and the world’s first online RPG (in 1980), still has versions maintained by a dedicated community using its open source repo. But this setup is just a slice of what a forever game needs for the ideal scenario.
The reality is that no game has managed to build for all of the necessary requirements above to truly show the world what’s possible. Some games have distributed hosting, some have item ownership, others have UGC communities, but none have brought together these components in a way that would let a game go on and on. I think this future is nearing. There are games being built right now by experienced AAA developers with millions of existing users, knowledgeable in community-driven economies, who know how to build the kinds of experiences that players would want to stay involved with for many years. And they are building with the intention of eventually walking away from the game, letting the dedicated playerbase keep things going and financially beneficial for the developer.
It’s also clear that players and the regulators who protect them are increasingly at odds with the notion that games or game items, once purchased, may not be enjoyed forever. The narrative is shifting, and with it I think we’ll see more of our favorite titles available for a long time.
👾❤️🔥