In 2001, a young graduate named John Hopson wrote a detailed article for the industry website Gamasutra on the topic of behavioural game design. Hopson, an amateur game designer, decided to pair his hobby with his research, detailing in coolly clinical terms concepts that the industry had previously handled primarily through intuition and artistry.
“Under what circumstances do players stop playing, and how can you avoid them?” Hopson asked. “Motivation is relative: the desire to play your game is always being measured against other activities.”
Hopson offered a number of suggestions to developers who wanted to encourage gamers to play harder, play for longer and play more frequently. Offer them a variety of tasks to carry out, for instance, so that if the main challenge becomes unappealing, they have other goals to achieve within the game rather than simply clocking off; avoid punishing difficulty spikes, which can cause a player to quit entirely; and offer rewards on a variable schedule, so that there is always the chance of a reward very soon, keeping them playing indefinitely.
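The “variable schedule” Hopson describes is the variable-ratio reinforcement familiar from behavioural psychology: every attempt carries the same independent chance of a payout, so a reward always feels as if it might arrive on the very next try. A minimal sketch, with an invented 20% drop rate (no real game’s numbers are used):

```python
import random

def reward_dropped(drop_chance: float = 0.2) -> bool:
    """Variable-ratio schedule: each attempt has the same
    independent chance of paying out, regardless of history."""
    return random.random() < drop_chance

# Over many matches the payouts average one in five, but any
# individual match might be the lucky one, which is the hook.
rewards = sum(reward_dropped() for _ in range(10_000))
print(f"{rewards} rewards in 10,000 matches")
```

Because each attempt is independent, a dry streak never lowers the odds of the next payout, and the player can always tell themselves the next match might be the one.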
In the years since Hopson’s article was published, his way of thinking about games has become tremendously influential. His essay has been cited almost 80 times in the academic literature and appears on the reading lists of game-design courses. Hopson’s own star rose just as far: he spent more than a decade at Halo and Destiny developer Bungie, where he used player data to find real-world answers to the questions he had posed in that first essay. After a stint founding and leading another such team at World of Warcraft developer Blizzard, he is now head of analytics at multiplayer online game developer ArenaNet.
But that influence has brought controversy. Games made with this sort of approach in mind, critics contend, are engineered for addiction, encouraging compulsive patterns of behaviour and abusing the mind’s weaknesses to keep players coming back for more.
The problem is that little of what Hopson suggested in 2001 came as much of a surprise to the developers of the day. The key difference was that they described the same techniques as serving a different goal, one summed up by a word noticeably absent from Hopson’s essay: “fun”.
When we hear stories like that of the mobile gamer who ran up $16,000 (£12,300) of credit-card debt, or the schoolkids spending €500 (£435) of their parents’ money on players in Fifa, it is easy to identify the problem. Some games have become ruthlessly engineered machines for extracting money, using the worst elements identified by Hopson all those years ago, combining them with insights from advertising, gambling and behavioural economics and a sprinkling of big data analytics.
But adopt a wider view and the narrative becomes muddled. Take Fortnite, the teen world’s current gaming obsession. As a multiplayer game, it is necessarily limited in how many of Hopson’s tricks can be employed: the main drivers of pacing and difficulty during a match are other human beings, after all. And it eschews “loot boxes”, the quasi-gambling approach to in-game rewards popularised by titles such as Overwatch, Fifa and Battlefield, in favour of a more conventional mixture of free and paid-for cosmetic unlockables.
The game does still have some modern contrivances. A series of weekly challenges ensures that players come back periodically, rather than lapsing for long periods – which, in a competitive multiplayer game, tends to mean they never come back at all. And a system of experience points, rewarding pure endurance as well as skill, helps push players into the feeling that they ought to play “one more match”, particularly during “double XP weekends” – another manipulative quirk. These are promotional periods in which experience points, the tokens normally awarded for completing missions and overcoming obstacles and opponents, accumulate at twice the usual rate; they can be deployed to bring back lapsed players and to generate buzz in the middle of a season.
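Mechanically, a double-XP weekend is about as simple as incentives get: the same actions, with the payout multiplied for a limited period. A minimal sketch, with invented point values rather than any real game’s numbers:

```python
def match_xp(base_xp: int, double_xp_weekend: bool = False) -> int:
    """Experience awarded for one match; a promotional weekend
    simply doubles the payout for the same play."""
    return base_xp * 2 if double_xp_weekend else base_xp

# The identical match is suddenly worth twice as much -- a cheap
# lever for pulling lapsed players back for a weekend.
print(match_xp(100))                          # 100
print(match_xp(100, double_xp_weekend=True))  # 200
```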
By and large, however, the narrative of Fortnite is a narrative as old as gaming: it is fun, and, as a result, some people play it too much.
This much is true: compulsive gaming was not caused by a psychological approach to development. It is as old as the medium itself. In 1978, Space Invaders was so popular in Japanese arcades that, urban legend has it, the game led to a nationwide shortage of 100-yen coins. Whether or not that is true (more coins were minted in 1979 than the year before, but more were minted in 1977 than either of them), it is certainly the case that the success of the game was enough to cement video games as a cultural phenomenon, and not simply a fad that would blow over. The home conversion, for the Atari 2600, quadrupled sales of that console and made Atari the undisputed giant of the living room.
In the US, Space Invaders was big, but Pac-Man was huge. Pac-Mania makes the media storm around Fortnite look like nothing more than scattered showers. Picking up where Space Invaders had left the industry, with more than four in five teenagers already having visited an arcade at least once in 1980, Pac-Man exploded beyond the boundaries of the nascent industry, spawning a Saturday-morning cartoon, a top-10 single and even a presidential mention for eight-year-old Jeffrey Lee, who (supposedly) set a record score of 6.1m and was praised by Ronald Reagan for the achievement.
Alongside that craze came the first boom in people with a difficult relationship with games. There were physical ailments, for sure – a light-hearted letter to the New England Journal of Medicine from 1981 details the author’s “Space-Invaders Wrist”, and may be the first documented instance of video-game-induced RSI – but also psychological ones. The games, one parent wrote to the New York Times in 1982, “are cultivating a generation of mindless, ill-tempered adolescents”. Some of the complaints could come straight from Mumsnet posts about the evils of Fortnite today: one lamented “the anger and frustration (is rage too strong a term?) in a young person’s eyes when he is suddenly ‘wiped out’ by a hostile projectile and his quarter spent to no purpose”.
One of the weirder products of the “golden age” of video games was a Martin Amis book, Invasion of the Space Invaders: An Addict’s Guide to Battle Tactics, Big Scores and the Best Machines – half recognisably Amis non-fiction about the 1982 gaming scene, half bizarre how-to guide forced through his style. He described “a young actress with a case of Pac-Man Hand so severe that her index finger looked like a section of blood pudding – yet still she played, and played through her tears of pain”.
But those early games also had built-in limiters that kept compulsion from getting out of control. For one, they lived in arcades: physical establishments you had to travel to, and which tended to take unkindly to teens sleeping on the floor in an effort to maximise gaming time. They were also funded fairly directly by the players, so when the spare change ran out, the gaming session was over.
Today, those realities have changed. Not only can you play games such as Fortnite at home – thanks to the popularity of the smartphone and Nintendo Switch versions of the game, you can play them anywhere. And many of the most popular games are now free-to-play, funded by advertising or optional paid-for upgrades, meaning that when the cash runs out, the game doesn’t have to stop.
Even for full-price blockbuster games, the financial barrier is minuscule by the standards of entertainment history. A game such as Bethesda Softworks’ Skyrim, or CD Projekt’s The Witcher 3, can easily contain hundreds of hours of enjoyable play, and costs £60 brand new or as little as £10 a few years later. Even bought new, that’s something like 30p an hour – an absurdly low expense.
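The 30p figure is simple arithmetic; the 200-hour playtime below is an assumption (the text says only “hundreds of hours”), chosen because it reproduces the quoted rate:

```python
# Cost per hour of a full-price blockbuster, using the figures above.
price_pounds = 60      # brand new
hours_played = 200     # assumed: "hundreds of hours" of play
pence_per_hour = price_pounds * 100 / hours_played
print(f"{pence_per_hour:.0f}p an hour")
```

Play for longer, or buy the game a few years late at £10, and the rate only falls further.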
So efficient are games at providing cheap entertainment to fill leisure time that some economists have queried whether the creation of the medium may be having noticeable macroeconomic effects. Prof Erik Hurst, for instance, speculated that games are, in effect, raising the value of leisure time, reducing – at the margin – the incentive for young people to seek employment. When all you had was daytime TV for company, the argument goes, living in your parents’ basement at 25 with no job was unappealing; now, a couple of games as gifts for birthdays and Christmases, and the ability to piggyback on the household wifi, can enclose you in a cocoon of entertainment that you may not want to leave.
Hurst writes: “If we go to surveys that track subjective wellbeing – surveys that ask people to assess their overall level of happiness – lower-skilled young men in 2014 reported being much happier on average than lower-skilled men in the early 2000s did. This increase in happiness is despite their employment rate falling by 10 percentage points and the increased propensity to be living in their parents’ basement.
“These video games and technology innovations – iPhones, Facebook and Instagram – are both cheap in relative terms, and fun.”
Maybe that is all it is: games are cheap and fun. There have been worse crises to hit the youth of Britain.
In 2012, more than 10 years after his initial essay, Hopson, by then head of user research at Bungie, revisited the topic at Gamasutra. This time, he did use the f-word – arguing that it should be the guiding light for the work of any psychologist in gaming. Behavioural approaches, he wrote, “are ethical if the designer believes the player will have more fun … than they would otherwise. You have to believe in the fundamental entertainment value of the experience before you can ethically reward players for engaging in that experience.”
In other words, it all comes down to trust. You may make games to make money – and if you are Epic, creators of Fortnite, you make a lot of money – but you make that money by entertaining, not by hijacking the brain’s biases and turning teens into money spigots with their parents’ credit cards.
As for Hopson, I trust him to be on the right side of that equation, too. At least, I hope I do: by the time they turned off the feature that let you check, I had invested something like two-and-a-half weeks of playtime into Destiny, the last game Hopson saw from inception to release. Cheap, fun and a purveyor of wonderful sci-fi nonsense about space wizards who come from the moon. What’s not to like?