Let the Gameplay Tell the Story

Words aren’t the only way to tell a story! Today, I’m talking about video games.

I’ll just put it out there: I love video games. I’ve played them since I was little, from the Atari 2600 to the Commodore 64 to the original NES, up through the current console generation. PC gaming has been a constant part of the experience, too. I’ve witnessed the evolution of video games from chunky, abstract blocks and equally primitive beeps and buzzes to today’s 4K, megatextured, photorealistic, $100 million-plus blockbusters. So, I think it’s fair to say I have a pretty broad perspective on video gaming and its history, and here I want to focus on the evolution–and the hits and misses–of storytelling in video games.

Early graphical video games had almost no conventional storytelling abilities at all. In the Atari 2600 era, if there was any story to speak of in a given game, it was told by the cover art (most of which was quite imaginative and beautiful), and the included instruction manual. _Yars’ Revenge_ is a personal favorite of mine, as it included a comic book to explain the game’s premise. (I’ll note it was one of those games where the story is absolutely baffling if you only played the game and didn’t read the manual/comic.) This practice spoke to the technical limitations of the time–a system like the 2600 didn’t have the memory space available to tell a complex story, which is why most games consisted of a handful of screens and sprites.

Adventure was the most ambitious Atari title, in terms of storytelling. Here, you explored a fantasy world with many different locations, and dealt with roaming enemies while on a quest to retrieve a golden chalice. Again, the story was told almost completely without text, but it provided a template that future adventure games would expand upon.

Notice how I mentioned graphical video games above? That’s because text-based games came first, even if they aren’t what brought video gaming to the masses. In a text-based adventure game–a genre usually called “interactive fiction”–players explore the game world by reading textual descriptions of locations and items, and by typing in commands like “go north,” “take apple,” and so on. Colossal Cave Adventure is widely recognized as the first such game, and in fact Adventure was designed to be a graphical adaptation of it!
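To make that format concrete, here’s a toy sketch of the kind of command loop interactive fiction is built on. The rooms, items, and verbs are invented for illustration–this is the general idea, not anything from Colossal Cave Adventure’s actual code:

```python
# A toy interactive-fiction command loop. Rooms and verbs are invented
# for illustration, not taken from any real game.

ROOMS = {
    "clearing": {"desc": "You stand in a forest clearing.",
                 "exits": {"north": "cave"}, "items": ["apple"]},
    "cave":     {"desc": "A dark cave mouth yawns before you.",
                 "exits": {"south": "clearing"}, "items": []},
}

def handle(command, state):
    """Apply a two-word command like 'go north' or 'take apple'."""
    verb, _, noun = command.strip().lower().partition(" ")
    room = ROOMS[state["room"]]
    if verb == "go" and noun in room["exits"]:
        state["room"] = room["exits"][noun]
        return ROOMS[state["room"]]["desc"]
    if verb == "take" and noun in room["items"]:
        room["items"].remove(noun)
        state["inventory"].append(noun)
        return f"You take the {noun}."
    if verb == "look":
        return room["desc"]
    return "I don't understand that."

state = {"room": "clearing", "inventory": []}
print(handle("take apple", state))  # You take the apple.
print(handle("go north", state))    # A dark cave mouth yawns before you.
```

Real parsers of the era understood far richer vocabularies, but the essence–read a command, mutate the world state, print a description–is all here.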

These separate threads–of graphical and text-based games–eventually converged in the adventure game boom of the ‘80s. Developers Sierra On-Line and LucasArts popularized this format and made some of its most famous series, such as King’s Quest and Monkey Island. In Sierra’s early games, the player moved a character around graphically rendered rooms while typing in commands in the interactive fiction style; LucasArts later streamlined this into point-and-click verb menus. The graphics were still crude and the commands still simple, but video games had now developed a unique means of communicating stories. Players could move around a game world they could actually see, interact with it, leave lasting effects on that world, and experience a story throughout. It’s easy to take for granted now, but without the groundwork laid by these early forms of video games, it’s impossible to say what gaming would look like today.

Another technique developed early on, especially for arcade games, was the cutscene. Everyone knows what cutscenes are now, of course, and almost every game seems to have them–but early in the video game era, they were rare! Usually, these were a few static screens of hand-drawn art that served to communicate the premise or backstory of a game, often accompanied by explanatory text. It was also conventional for games to end with a cutscene or a series of them, as the player’s reward for completion.

Over time, it became common for cutscenes to bridge the gap between levels of a game. A player would work through a level or stage, then have a cutscene to explain the transition to the next one. This often involved plot twists or other events that reset the player’s expectations for what would come next. Used well, these could be very memorable. Cutscenes eventually became a standard feature of games that had a story to tell, so naturally they evolved into more and more complex forms.

This took off in the ‘90s as CD-ROM technology became commonplace on home computers, and as home video game consoles adopted it as well. First, it’s useful to understand what storage limits were like before the CD. Until CD-ROMs arrived, video games could only be a few megabytes at most. Hard drives simply weren’t big enough to accommodate more, and with floppy disks maxing out at a couple of megabytes, spreading a game across 10 or more floppies wasn’t something anybody enjoyed dealing with (though a decent share of games and software were distributed that way!). A CD, on the other hand, could hold a whopping 650 megabytes. Obviously, this left room for music tracks using conventional CD audio technology, but it also allowed plenty of space for video. As in movies.

It probably seems goofy now, but in the ‘90s there was a wild explosion of games with video cutscenes. The quality of these varied a lot. Some were made on obvious shoestring budgets, while others were produced to Hollywood blockbuster standards. People were wowed by this technology–having actual movie-quality video in a video game was brand-new, and astonishing. For many games, full-motion video replaced traditional cutscenes altogether. Series like Wing Commander went all-out with this approach, telling their stories between stages/missions with elaborate, scripted, CGI-laden video featuring professional actors. Wing Commander even went as far as having branching scenes: you could choose how to react to certain events in cutscenes, and a totally different video would play depending on your choices. This was handled with remarkable seamlessness given the technology of the era.

This discussion would not be complete without mentioning the prototype for full-motion video games, Dragon’s Lair. Built on Laserdisc technology in the early ‘80s, its hand-drawn animation was created by Don Bluth, and the game consisted almost wholly of traditionally animated sequences. Unfortunately, as a result it wasn’t very interactive. It was a technical marvel with beautiful artwork, but not a very good _game_, as it didn’t offer much gameplay for the player.

But games laden with full-motion video cutscenes shared this problem, too. It was sometimes jarring to see a Hollywood-quality scene play out before your eyes, then be dropped right back into chunky, low-framerate graphics to actually play your game. Video games still had a ways to go to bridge this gulf.

There were a few different approaches taken to attempt this. One was to keep video cutscenes, but make them more interactive. The PlayStation 1-era Final Fantasy games did this, albeit on a fairly limited basis. While a video cutscene played, the player could move characters around in it, though certainly not with absolute freedom. It had no impact on gameplay, either–it was just to give you something to do until the cutscene ended.

A similar approach involved rendering cutscenes entirely within the game’s own engine. This kept the graphical quality consistent, but of course looked a lot worse than full-motion video.

Another option was quick-time events, which are still commonly used today. Rather than make a cutscene wholly interactive, the player is given prompts to push a particular button or enter a certain sequence within a limited amount of time. Doing this successfully provides some positive result–usually a victorious cutscene–while failing can mean anything from losing whatever bonus was on offer to seeing a “Game Over” screen. Dragon’s Lair was one of the first games to use this approach, and in fact most of the game consists of such quick-time events.
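The underlying logic of a quick-time event is simple enough to sketch in a few lines. Everything here–the button names, the timing window, the “lives” mechanic–is a made-up illustration rather than any particular game’s implementation:

```python
# A minimal sketch of quick-time-event (QTE) logic. Button names, timing
# window, and the "lives" mechanic are invented for illustration. A real
# game would read a controller and a frame clock instead of taking the
# player's input and reaction time as arguments.

def resolve_qte(expected, pressed, reaction_s, limit_s, lives):
    """Resolve one timed button prompt; return (outcome, remaining lives)."""
    if pressed == expected and reaction_s <= limit_s:
        return "success: the victorious cutscene plays", lives
    lives -= 1  # a miss always costs something
    if lives <= 0:
        return "failure: Game Over", lives
    return "failure: you lose the bonus", lives

# Right button, pressed in time: success.
print(resolve_qte("X", "X", 0.4, 1.0, lives=3))
# Right button, but too slow, on the last life: Game Over.
print(resolve_qte("X", "X", 1.6, 1.0, lives=1))
```

The entire design space of QTEs is really just variations on this check: what counts as the right input, how long you get, and how harshly failure is punished.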

Given the iterative and evolutionary nature of video game technology, it’s hard to say which game first made cutscenes fully immersive, but a good candidate is the original Half-Life. Though it occasionally took control from the player, it usually communicated the story simply by having your character walk around, listen to people, and interact with objects in the environment. Despite a couple of occasions where the player’s controls are frozen, almost all storytelling in Half-Life comes from simply playing the game. Scientists and security guards talk to you (though you can walk away or, yes, kill them); scripted scenes unfold that you can intrude upon, walk away from, or just watch; and major events transpire as a result of your actions, leaving you to figure out how to deal with them in real time.

These types of cutscenes are still in common use today, in varying combinations. The Half-Life style is probably the rarest in heavily story-driven games, for a handful of reasons. One is that players today tend to expect a lot of guidance, and being able to walk away from important information (and then never having a way to find it again) makes for a horribly frustrating experience in most cases. And as the technical sophistication and complexity of games improve, it becomes harder to give players many options to impact the story, because every option means scripting more events and contingencies.

Instead, it’s more common to see cutscenes built from quick-time events (make your choice before time runs out!), choices that pause the action while you decide, or scripted sequences in which triggering a particular event temporarily takes control from the player to play out a cinematic.

My problem with these approaches is mainly that they do take control from the player. In games where you can freely roam the world or otherwise explore the environment around you, it’s jarring to suddenly have that control taken away to fill in some of the story. It makes the storytelling feel artificial. When writing fiction, a common bit of advice is, “show, don’t tell!” But video games have a bad habit of telling you the story by taking control away and hitting you in the face with narrative, rather than showing you the story by giving you a world to explore, characters to interact with, and playing out the consequences of your actions.

As a result, there are a few kinds of games I especially gravitate toward. One is strategy games, where there is either no set story, or that story is mostly window dressing compared to the drama of the unfolding gameplay. Civilization, for instance, eschews conventional storytelling altogether. You pick a civilization to play as, set the parameters of your world, and from there, nothing is scripted! The story that emerges is entirely unplanned, resulting from your actions and those of the other civilizations (be they human or AI). Will you be a world-conquering military power, a broker of peace, or a culture-loving pacifist intent on dominating the planet with the excellence of your music and movies? It’s up to you–the game doesn’t tell you which you have to be, and you can play in any combination you’d like. Not knowing how other civilizations/players will react to you makes for a unique story in each playthrough, as well.

In the more scripted realm, strategy games like the XCOM series do have a somewhat linear story to tell, but the real drama is on the battlefield. Every battle plays out a story in miniature: hunting for aliens, taking cover, pressing the attack, tending the wounded, and so on. Maybe you’ll hit hard and sweep quickly to victory–or maybe reinforcements you didn’t anticipate will wipe out half your squad and you’ll beat a hasty, demoralized retreat. These battlefield events feed back into the scripted story, too: items recovered during battles unlock new options, for instance, with cutscenes to explain them.

The other type of game I gravitate toward is the open-world role-playing game or sandbox game. These may have scripted storylines to a greater or lesser extent, but in many cases you can delay the story, or ignore it altogether, and strike out on your own to simply explore the world, developing your own little stories through interaction and discovery.

Some such games also relate the story by doling out pieces of lore as you play. Elder Scrolls games are famous for this, along with their open worlds, and Dark Souls games hardly explain anything through straight-up narrative. Instead, the player is expected to feel their way through a dangerous world and discover the story by way of artifacts and combat encounters.

Lately, I’ve been playing a game called Kenshi, which was long in development but which only came to my attention recently. Kenshi provides no scripted narrative at all. Instead, the player begins by choosing a particular starting scenario–a lone wanderer, an escaped slave, a band of would-be merchants–and is dropped into a hostile world with little guidance and no objective other than to survive.

You can control up to 30 characters (more with mods) in squads of up to 10 characters each. None of your characters are special or uniquely powerful–they run the risk of injury and death in every single encounter, and encounters are common. Your hovel might be raided by starving bandits looking for scraps of food. Or, once you build a formidable outpost in the sandy wasteland, you’ll draw the attention of a local gang lord, who sends a 20-strong force to either demand “taxes” or punish your defiance. These events are random but, in concert, produce an experience that is both organic and unpredictable. Your thriving base might fall victim to overpowering thugs who want the precious resources you’ve labored to harvest. Your best bet might be to abandon it, to have your people fall back to the safety of a town’s walls, where at least the guards will protect you.

One of your characters, minding her own business, might be abducted by slavers, chained and branded. Will you put together a team to break her out, knowing she’ll carry a bounty as an escaped slave for quite some time, and be unable to visit towns without being beaten and captured again? Or will you leave her to her fate? The game does not pass judgment either way–it’s a harsh world, and it’s survival by any means necessary.

There’s no end goal in Kenshi, either. There are quests offered by other characters, and bounties you can pursue, but there’s no epic, scripted storyline, and no defined ending. The story is the one you tell, by deciding how to survive in the game’s unyielding world, and how to respond to the random events it throws at you. Losing most of your squad and retreating back to a town with a single character, as painful as it is, at least lets you continue on. You can always recruit more, rebuild, and try your luck again. In that sense, it is a game of persistence and caution, of calculated risk, and then occasional daring–sometimes with a heavy price.

And no cutscenes! The closest thing Kenshi has to cutscenes are brief shifts of camera focus to arriving traders and enemies. This is just to draw your attention so you can decide how to deal with them. Otherwise, there are no distractions and the game never takes the focus or control away from you, the player. You live and die by your choices, and the only story is the one you create by your struggle to survive.

I don’t want to end this by making it seem like an advertisement for Kenshi, though I do think _Kenshi_ is a great (if rough-edged) game. I also love grand strategy games, like Europa Universalis and Crusader Kings, not to mention Stellaris. I’m fascinated by games that don’t focus on giving you a scripted experience, but instead let the player decide how to face a dynamic, unpredictable setting. I love games that tell more directed stories, too, like the Final Fantasy series, but it’s easy for those to feel more like movies that you are occasionally asked to interact with. The games where I tell the story, where I decide what I want to do and then grapple with the consequences of those choices–those are the ones that really pull me in and challenge me, and make me examine what kind of person I want to be in these virtual worlds. To put it another way: I want to play the game; I don’t want the game to play me.