Turn Off the Lights

Are We Really Ready for an Always-Online Future?

"Botched launches and server issues raise the question: is this really in the players' best interests?"
About four years ago, Blizzard unleashed Diablo III on the world: an always-online game in which even the single-player campaign requires a connection to Blizzard's servers. As a result, most early adopters remember seeing more than a few screens like the one below.

[caption id="" align="alignnone" width="580"]The "benefits" of always-online[/caption]

Blizzard took a lot of heat for the botched launch of Diablo III, then Maxis and EA repeated the mistake with SimCity, and then Microsoft basically lost the console war before it started by attempting to make the Xbox One an always-online device. I was in grad school at the time, and decided that the issue of always-online DRM, and the publishers pushing it, was worth presenting an academic paper on. But that was years ago, right? Online infrastructure has changed drastically since then, and surely game developers have learned from the mistakes of the past.

Oh. Ohhhhh. Oh boy.

First, let's be fair about this. Most online games buckle in their first week because server strain is never higher than during that opening rush to play the game for the first time. So it's understandable, to a degree. But as gamers, we have to ask ourselves if this is okay. When you drop $60 or more on a new game and most or all of its features are always-online, you are basically giving up control over when and how you play, and allowing the developers to dictate that to you. Most of the time when you buy a new game, you can fire it up whenever and wherever you feel like; under this paradigm, you can be booted from the game entirely, for hours at a time, for reasons that have nothing to do with you.

Now, in some cases, this is unavoidable. Games like Tom Clancy's The Division and Destiny are specifically built to be MMO experiences. As with World of Warcraft, the whole game is about cooperating with others. Here, some server issues are unavoidable, and that's probably a sacrifice avid players gladly make for the experience they're given. But seeing always-online elements creep into a game like Street Fighter V is cause for alarm. Part of the issue with Street Fighter V is simply that it released with a total dearth of features, and the discussion of developers' growing tendency to release incomplete games and then complete them with DLC (looking at you, Destiny) is a different discussion entirely.

The reality is, neither players nor developers are ready for this. Not yet. Developers have not consistently shown they have the infrastructure to handle server load, especially at launch. Perhaps more importantly, putting games exclusively online simply excludes a huge number of potential players. According to a 2015 Akamai report (as cited on Wikipedia), only 46 percent of Americans have internet speeds faster than 10 Mb/s, and 20 percent have speeds slower than 5 Mb/s. At 10 Mb/s, gaming runs pretty smoothly. At 5 Mb/s, it's almost unplayable if latency is any kind of issue. What this means is that an always-online game is going to be rather difficult to play for the roughly 34 percent of Americans stuck between those two speeds, and basically impossible for the bottom 20 percent. For those counting, that is over 172 million people in the United States alone for whom playing The Division or Street Fighter V is an impossible or sub-par experience, and that's if the game servers are totally fine.
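For anyone who wants to check those numbers, here's a quick back-of-the-envelope sketch in Python. The percentages are the Akamai figures cited above; the US population of roughly 320 million is my own assumption for 2015, not a number from the report.

```python
# Back-of-the-envelope math behind the figures above.
US_POPULATION = 320_000_000  # assumed 2015 US population; not from the Akamai report

faster_than_10 = 0.46  # share of Americans above 10 Mb/s (Akamai, 2015)
slower_than_5 = 0.20   # share below 5 Mb/s

# Everyone not above 10 Mb/s will struggle with an always-online game.
at_or_below_10 = 1 - faster_than_10                # 0.54
between_5_and_10 = at_or_below_10 - slower_than_5  # 0.34

print(f"Rather difficult (5-10 Mb/s): {between_5_and_10:.0%}")
print(f"Basically impossible (<5 Mb/s): {slower_than_5:.0%}")
print(f"Affected Americans: {at_or_below_10 * US_POPULATION / 1e6:.0f} million")
# -> 34%, 20%, and roughly 173 million people, matching the figures in the text
```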
Again, this sort of thing is accepted when developers are making a game in which online multiplayer is the whole point. There simply isn't any way around that. But what happens when a required online connection creeps into games that have historically had solid single-player or local multiplayer options? Diablo III, SimCity, and Street Fighter V could all be enjoyed in a cave in the middle of Montana if the developers allowed it. They don't.

[caption id="" align="alignnone" width="900"]Ideally, if you pay $60 for your single-player game, you ought to be able to take a generator out here and play it. Period.[/caption]

Does this lead us to a future in which the next iterations of Tomb Raider, Persona, and Uncharted require a persistent internet connection in order to play? How exactly does that benefit the player?

Naturally, the benefits to the developer of keeping the game online are obvious. By requiring constant license checks against the server, they can cut down on piracy. By keeping players' profiles constantly in the public eye, they can sell cosmetic changes via microtransactions. Most importantly, they can practically force loyal players into purchasing new expansions, since it becomes necessary to "keep up" with fellow players. But are the benefits of always-online to players as striking as the obvious benefits to developers and publishers? Certainly not, especially when it comes to games that don't absolutely have to be MMO experiences. In many cases, the server problems alone are enough to make the game literally unplayable. That isn't what you would call an enhancement of the gaming experience.

This is an issue that will assuredly keep coming up as online infrastructure grows and as developers continue to subscribe to the model of long-term DLC. It may not be a big issue now, but the danger lies in the slippery slope. Microsoft already attempted to make its entire console always-online, and backed down only after intense public backlash. But the issue remains in play. The more always-online services, games, and features creep into our lives, the more normalized they become, and the closer we move back toward that future.
