Part Four: Corporate Warfare
If you want a crystal ball into the future of console video games, current computer games are a good indication. It’s been that way since Spacewar! was programmed into a minicomputer during the sixties and commercial video games emerged in the seventies. That’s just the way it works: A genre finds its origins on the computer (usually created or popularized by an independent or individual developer), genre faces peer review from a tough audience, genre evolves into a smash hit, mega-publisher “dumbs the genre down”, console juggernaut activated. (And no, don’t think I’m dismissing the genre revolutions that occurred in the arcades and on consoles. Those topics will come another day.)
The current universe of console first-person shooters was built by a previous decade of Doom, Quake, Half-Life, and so forth. And even as shooter developers discovered that there was a much larger audience on the consoles, even as they piggybacked onto Xbox Live, those developers were not in the clear. They couldn’t begin filling their solid gold bathtubs with diamonds quite yet. At the end of the twentieth century, publishers and developers working on numerous games in numerous genres on numerous platforms had fallen into a trap. They had embraced what would become the biggest issue facing the entire pay-to-own video game industry. And it only seems fitting that the pioneering efforts of the personal computer would gut nearly all significant commercial game development for the personal computer.
Beginning in the mid-seventies, American college students began tinkering with university mainframes in order to build their own video games. The most popular formula became the computer role-playing game, inspired by the recent phenomenon of Dungeons and Dragons. Thanks to the limited processing power of these computers, “role-playing games” became a misnomer. The only thing that dnd, dungeon, pedit5, moria, and avatar did particularly well was crunch numbers. Deal damage, kill monsters, gain levels, get loot. It wasn’t role-playing. Role-playing implies choice. Role-playing implies meaningful decision-making. It was dungeon crawling to the core. By the mid-eighties, computers had become more powerful and were capable of doing more. American game creators agreed. They headed back to the drawing board. They began introducing branching paths, building free-roaming worlds, and introducing elements of choice. On the other side of the Pacific, it was a completely different story. During the mid-eighties, Japanese gamers were playing the American role-playing games of the early eighties. They were still getting the number-crunching fix. Japanese developers derived inspiration from those early role-playing games and created a pair of monsters. More precisely, 1986’s Dragon Quest and 1987’s Final Fantasy.
Once those two games sold millions, the Japanese became content with fucking that chicken. Japanese Role-Playing was born. Tossing aside some exceptions to the rule (Secret of Mana, Earthbound, Chrono Trigger, Star Ocean), the JRPG became so derivative that the genre was closer to reading a book, only instead of turning the page, you’d press the A button. Obviously, Japanese developers couldn’t keep making the same role-playing games forever. These conservative Japanese developers decided on two means for “improving” their brand of role-playing. The first? “Bigger, more complex stories!” While nobody would dare mistake Final Fantasy IV for the best of storytelling, it exposed gamers to a complexity of narrative that hadn’t been seen outside of the adventure game genre. The second? “Bigger, more complex graphics!” And come mid-nineties, the Japanese developers finally had access to technology that could win them unprecedented audiences. Namely, video game consoles that used compact discs. These developers now had free rein to create the “photo-realistic” worlds seen in recent Western computer hits such as Myst and The 7th Guest. It allowed developers to load their games with full-quality symphony music and full-motion video with little regard for whether they had enough storage space to fulfill their orgasm. These developers could now be what they secretly desired to be: Shitty movie makers.
More than 200 animators and programmers. A multi-million-dollar production. Over two years in the making! And a cast of thousands! They said it couldn’t be done in a major motion picture. They were right.
– Narration, Final Fantasy VII Commercial, aired in the United States, 1996*
These developers had an ultimate goal: Create the video game equivalent of 1977’s movie epic Star Wars, the movie that used a massive budget and revolutionary special effects to redefine the way that its medium was developed, marketed, and sold to audiences. SquareSoft was the biggest of the established Japanese role-playing publishers, so it was predictable that they would get there first. The development of 1997’s Final Fantasy VII was built around a budget of approximately forty-five million dollars. At the time, it was rather obscene. When the game began a march towards the eight-figure sales barrier, it validated every penny spent. Console developers and publishers around the world took notice, and they gave the success of that game some serious thought. The question they had to ask was whether they were in the business of developing video games or motion pictures.
Consider the positives and the negatives. On the down side, it would be extremely expensive for these companies to one-up each other, all trying to create the prettiest game engine or the newest video game graphics breakthrough. On the other hand, Final Fantasy VII was the talk of both consumers and the industry, a cornerstone title of the Sony PlayStation that doubled as a colossal “go fuck yourself” to Nintendo. And think about it: You’re a gigantic publisher with millions upon millions of dollars to spend on video games. Smaller competitors don’t have this luxury. What better way to shut those small guys out than by convincing consumers a video game is no good unless it’s peppered with “production values”? You can design the industry in a way that small companies need your money to make a game people will buy. The major publishers all concurred. They were taking their war chests and going all-in. They wanted to be Silicon Hollywood.
Pictured: The 2009 version of a franchise that plays no better than it did in 1994.
When this call to arms began, it sent the price of developing a video game soaring into skies inhabited only by the most absurd logic. These companies had little issue with their decision. That is, until they realized the mistake they had made. Developers have often lived and died by the sales of each game. Gigantic publishers were now assuming that risk. In 2005, the disastrous release of platforming standout Psychonauts nearly ruined publisher Majesco and sent the company’s shareholders into a mutiny. 2007’s Rock Band has been reported as having a development budget as high as 200 million dollars. Almost fifty million dollars was poured into seventh-generation sandbox game This is Vegas…before it was cancelled.* In 2009, it was estimated that the price to develop a game for the Xbox 360 or Sony PlayStation 3 was roughly thirty million dollars.* The business model they embraced was beginning to price out the middle-class video game, the cheaper cousin of Grand Theft Auto or Halo or whatever. “Production values” were designed to get consumers responding positively to games like Halo. Consumers were now responding negatively to games that didn’t feature “teh awesome graffix”.
When first-person shooter developers scrambled to tap the lucrative world of Xbox Live, it re-oriented where the genre could make money. If consoles hadn’t proven a more profitable platform for shooter development in the wake of Halo: Combat Evolved, Xbox Live confirmed they could be. When the frightening nature of spiraling game development costs was realized, it was a guillotine for computer game development. It was here that exclusive development for the big ol’ box pretty much fell on its ass. Yeah, a couple of companies carried on. But at this point, computer gamers were spending more time organizing their crappy boycotts than playing the latest games.
You have to remember: From the moment that GoldenEye 007 was released for the Nintendo 64, it attracted a different type of gamer to a different type of shooter. That hadn’t changed then, and it hasn’t changed since. It prohibited companies from creating console shooters that would appeal to a less profitable computer gaming community. Computer gamers would not tolerate second-rate ports of console games to their machines. When this audience received computer ports of Gears of War, Halo 2, and Unreal Tournament 3, they perceived that they were getting inferior versions of the games being developed for the consoles; that publishers were screwing over the long-time audience whose money helped to build the genre. Neither publishers nor developers had any interest in amending the situation. Rather than simply explaining that “It’s economics, stupid!”, they burned their bridges. They used the disappointing sales of these crappy ports to play the intellectual property card. That is, “We are no longer developing these games for the computer because of software piracy.” Only a handful of companies would continue to orient first-person shooter development towards the personal computer. The rest left for the consoles and didn’t look back.
These rising development costs pushed both the console and computer gaming industries into a vicious cycle: In order to make a profit, these games had to become a substantial investment. It was this particular concern that prompted Activision CEO Bobby Kotick to spew his now-infamous rejection of games that could not be “exploited every year across every platform, with clear sequel potential that can meet our objectives of, over time, becoming 100-million-plus franchises”.* They had to be built for sequels and they had to be built for mainstream appeal. And rather than price their games downward (which hundreds of companies and thousands of individuals did in order to snag a piece of the burgeoning smartphone gaming market), the major console and computer game publishers decided to go more expensive; to further leverage their advantage over smaller publishers. Giant publishers had zero interest in going cheaper. They simply wanted to make more expensive games for cheaper, if you understand what I’m saying.
No? It’s simple: You can assume less risk on that large budget by making a game that appeals to a wider audience. That’s how big business typically responds to “our profit margins are dwindling”. Movie producers have almost exclusively built their summer lineups around established properties. That is, “sequels, toy lines, comic book heroes”. It only becomes a summer movie if there’s built-in brand recognition to go with it. In television, a number of channels have begun dabbling in programming that belies the stated goal of their network. That is, their name. Cartoon Network begins airing live action shows. Science-fiction network SyFy broadcasts professional wrestling. The highest-rated shows on The History Channel are reality television. Those networks assumed less risk by appealing to more people with cheaper programming. Sure, by doing so, they gutted their identity. But money is money, am I right? Video game publishers adopted the same strategy used by the corporate suits behind modern television and modern movies. With the question of which platform to build games for firmly out of the way, these companies still needed a theme or setting that could be counted on to make money; a go-to setting. It took a couple of games to legitimize it, but they legitimized it.
During the early part of the aughts, the “corporate needs a low-risk, high-reward” pitch (the reality television or comic-book movie of video games, so to speak) became the World War II video game. Using the ridiculous success of the Counter-Strike model as a point of leverage (“Games with ‘realistic combat’ are fun!”), Electronic Arts was the first publisher to make headway, scoring big with both the Medal of Honor and Battlefield franchises. Much like Halo, the Battlefield franchise won audiences with its exceptional vehicle play. Once Electronic Arts validated World War II, the rest of the imitators followed suit. In 2003, Activision snared the cast-offs from 2015 Inc. (the developers of Medal of Honor: Allied Assault) and that developer began work on the Call of Duty franchise. In 2005, Ubisoft headed into the market with the Brothers in Arms series. In 2006, THQ went the real-time strategy route, publishing real-time strategy stalwart Company of Heroes.
North American publishers unanimously agreed: This World War II thing was good stuff. It’s not hard to understand why the games were successful. Military combat is the easiest way to take the most popular genre and give it the widest appeal possible. Everybody loves the military. And if you don’t, you’re probably a fucking communist. And of course, this isn’t “demons are emerging from the other side of the portal” or “shooting each other with guns has become the sport of the twenty-second century”. Everybody understands real wars between real human beings. Everybody “gets” the idea of “shoot the other dude in the head”. Well, the greatest military conflict mankind has ever waged against itself is World War II. That makes it the easiest sell in war video games. What better way to set the scene than human history’s one global conflict, truly a conflict of “land, sea, and air”? The one time where it felt like the embodiment of evil had a pretty good shot at “taking over the entire world”? Damn, I can hear the Battlefield 1942 intro music just thinking about it.*
But you know what the best part about this is? It’s all public domain! You don’t have to worry about backstory because millions of men, women, and children already spilled blood to create it. You don’t have to worry about who the good guys are and who the bad guys are. You don’t have to worry about having a crack team of weapon and item designers who can create interesting fictional weapons. You don’t have to worry about paying licensing fees and royalties to create a game based on the next big media franchise. All you have to do is give the Nazis their Lugers, give the Russians their AK47s, set up a couple of well-designed battlefields and call it the Battle of Paris. (And yes, I am aware of how many factual inaccuracies there are in that statement.) If a publisher somehow loses the rights to a franchise, they can simply create a new one featuring the same locations, places, guns, and historical figures. 2015 Inc. employees get sick of making Medal of Honor? They leave the company and form Infinity Ward, create Call of Duty. Then Infinity Ward gets reduced to shambles, the company heads form Respawn Entertainment. I have full faith that you will enjoy their new military-themed shooter.
It’s so easy! You don’t even have to worry about people calling your game a rip-off! How can one military combat game rip off another? That’s also why you haven’t seen many games designed to capitalize on the success of the hugely popular Halo, and why fewer still have lived to be called anything but a rip-off, the Killzone franchise (originally touted as a “Halo Killer”) being an exception. The entire racket is lazy as fuck and profitable as hell. A North American corporate empire fascinated with the idea of savagely protecting their intellectual property creamed themselves when they realized the death and horrors of real human wars could be profited upon. They ran with it.
Of course, there’s some risk in butchering a public domain concept. One company hits it big, then another, and suddenly, sixty similar games are all released in a span of three months. At least when people raced to copy Mario and Street Fighter, they had to create a new lineup of repulsive characters. Some creative thought was required. With every developer and publisher drawing from the same source material, all building the same invasion of Normandy, the World War II shooter spree made the Guitar Hero-led rhythm game boom look profound and original. Not simply because Steven Spielberg’s 1998 motion picture classic Saving Private Ryan had recently made an emphatic statement that “You will never ever capture the drama of World War II better than we did, so please don’t try.” (Little surprise that 1999’s Medal of Honor featured creative direction from Spielberg.) Not just because the video game industry has about as much movie-making talent as Akira Kurosawa had in half a testicle. (Your video game movie-making “geniuses” are David Cage and Hideo Kojima. You are being treated very poorly.) But see, good war drama walks a fine line between the fragile nature of the human body (as the species is mowed down by the weapons it created); the idea that each of those disposable human beings has a family; the idea that those disposable human beings are chess pieces being mauled by the misaligned intentions of kings and queens and dictators and dudes who simply want to rule the world. I would like to propose that placing teams of eight in an abandoned city, having them attempt to “capture each other’s flag”, earning points and unlockables for killing enough soldiers, and having players “respawn” after suffering from “death” undermines that drama.
In 2002, the World War II shooter blossomed. By 2007, interest in the genre was cratering. Fortunately for the companies that bought into the World War II craze, they had left themselves an exit. Unfortunately for the consumers who bought the games, “a change of strategy” involved “change the year the game is set in”. Electronic Arts began doing this the moment that Battlefield 1942 became a smash hit, and did it with mixed success. 2004’s Battlefield Vietnam reminded everybody that it wasn’t the best idea to create a video game based on a conflict defined by guerrilla warfare, carpet bombing, and utter stalemate. (It’s the same reason people don’t create first-person shooters based on World War I.) 2005’s Battlefield 2 placed the model in the modern era, never quite matching the popularity of the World War II-themed Call of Duty 2. 2006’s Battlefield 2142 accomplished what you might expect, delivering a generic “war in the future video game” and attaching it to the name of a franchise better used elsewhere. When those games were placed against the popularity of World War II, World War II was still winning out. World War II brand awareness was still king. But more importantly, those Battlefield games were built for and played their best on the personal computer. You know, the platform that was getting the shit kicked out of it. Electronic Arts and developer Digital Illusions CE were one of the few remaining tandems that made development for the personal computer a primary goal of the Battlefield franchise. But remember: “Plays well on the computer” forfeits the “crazy party actions!!1” that had become a must for console video games. The first game to combine “modern urban warfare” with “split-screen multiplayer” would get the women.
From the moment that the 2015 Inc. cast-offs formed Infinity Ward and created Call of Duty, they were video game rockstars. Call of Duty 4: Modern Warfare would not change much, both from a financial and design perspective. It’s likely that Infinity Ward, not Activision, made the “radical” decision to move Call of Duty into the modern era. (“Wait, what? Radical?” Remember who we’re dealing with here. Activision declined Harmonix’s request to create a band-based follow-up to smash hit Guitar Hero. Activision then scrambled to create a band-based rhythm game when Rock Band sold millions.) I guess there’s some irony in knowing that 2007’s Call of Duty 4: Modern Warfare would kill off the World War II game. After all, the World War II-themed games were the blueprint for Modern Warfare. But credit to Infinity Ward: They convincingly stated that a franchise doesn’t have to change much in order to deliver a lot of fun.
Much like the World War II shooters that preceded Modern Warfare, it’s conceptually no different from Counter-Strike. Instead of Terrorists versus Counter-Terrorists, it’s “Russians With Bad Accents” versus the “Stay Frosties”. Only two significant additions survived the series’ sixty-year trek into the present. The first was killstreaks, which embraced one of the most important rules of video game design: Reward the player for doing well. Three kills in a row gave you access to the radar. Five kills in a row allowed the player to deploy an airstrike. Seven kills in a row allowed the player to call an attack helicopter into play. It was simple stuff, packed with the bare minimum of strategy and decision-making to keep it interesting. The second was a “ranking” system, a role-playing-style experience system where players unlocked new weapons and special attributes (known as “perks”) for use in the multiplayer mode. And while the idea of playing dozens of hours in order to place yourself on a level playing field with other players is rather preposterous, it didn’t break the multiplayer mode and provided newcomers with a sense of progress. Call of Duty 4 can be knocked for being another game of simple tactics and simple execution, but it’s probably the best of the modern multiplayer shooters and the best “urban warfare” shooter to date, a wonderful compromise between the hardcore and casual demographics; it was run-and-gun enough for the casual gamer and patient enough for genre veterans.
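The killstreak design described above really is that simple: a lookup table keyed on consecutive kills, reset on death. Here’s a minimal sketch of the idea; the 3/5/7 thresholds come from the game itself, but the class and reward names are illustrative, not anything from an actual Call of Duty codebase.

```python
# Killstreak thresholds as described in the text; names are illustrative.
KILLSTREAK_REWARDS = {
    3: "radar sweep",
    5: "airstrike",
    7: "attack helicopter",
}

class Player:
    """Tracks one player's current streak and the rewards they've earned."""

    def __init__(self):
        self.streak = 0
        self.earned = []

    def on_kill(self):
        """Increment the streak; hand out a reward if a threshold is hit."""
        self.streak += 1
        reward = KILLSTREAK_REWARDS.get(self.streak)
        if reward is not None:
            self.earned.append(reward)
        return reward

    def on_death(self):
        """Dying resets the streak, but not rewards already earned."""
        self.streak = 0

p = Player()
for _ in range(5):
    p.on_kill()
print(p.earned)  # ['radar sweep', 'airstrike']
```

The dictionary-lookup structure is why the system is so easy to tune: designers can rebalance the whole reward ladder by editing one table, which is part of what made the formula so repeatable for sequels.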
The Halo franchise had similar appeal, but no multiplayer-oriented shooter had ever delivered a publisher the kind of sales and profits generated by Call of Duty 4. It dominated the sales charts in 2007 and showed significant legs through 2008, pushing into eight-figure sales territory. Even though 2008’s Call of Duty: World At War moved rather quietly into the same sales status as its predecessor, World War II was officially on its way out. (Nobody cared for the motif; the most beloved feature in World At War turned out to be its “Zombie Mode”, where players did battle with a watered-down version of what Doom did better over fifteen years ago.) The real anticipation was reserved for the “sequel to Modern Warfare“. It turned out that fighting assholes in the modern day was an even better sell than World War II. World War II had a significant flaw that doomed it to becoming “same old shit”: The Americans always win the war, Neville Chamberlain is always an appeaser, Stalin always gets pissed off at Hitler. That war never changes. The present does. And with most people too stupid to fathom even the slightest curiosity in global politics or current events, it’s enough to call your storyline “Pakistani terrorists steal a nuclear weapon” because Fox News once caused your grandmother to shit her pants. And you can keep making Iran or Russia the bad guys until they become ancient history. At which point, you can start stereotyping the new assholes on the block, whether they be Chinese or North Korean or whomever.
After getting throttled by the Casual Gaming Revolution of 2006™, the third-party giants found a reprieve. Call of Duty raced up the charts at a time when casual gaming juggernauts such as Guitar Hero, Wii Play, and Wii Fit were dominating the sales charts. The success of Call of Duty was an aberration for Activision and an opening for everybody else. Then something unfortunate happened, very unfortunate for anyone who wanted to take the Call of Duty 4 blueprint and place a creative spin on it: Real-world things happened. Namely, “worst economic calamity of the last seventy years”, an event fittingly created by corporate excess. The economic crisis of 2008 forced the hands of game publishers. Those companies decided that they would become as conservative as possible. They would make games that “worked”. Call of Duty is what worked. Therefore, Call of Duty is what they decided to make. In the early aughts, Battlefield and Medal of Honor were merely a supplement to a still-diverse world of shooters. The success of Call of Duty and the Great Recession ensured that the military shooter would be the one to rule them all. While “modern warfare” will ultimately change and evolve, the industry of video game megapublishers decided that their games would not.
Call of Duty continued its success and became the figurehead for the entire genre. The sales of a sequel typically reflect the amount of interest in the previous game: if a game is “good enough” to merit a sequel, then that must mean this is a franchise worth playing! The success of 2009’s Call of Duty: Modern Warfare 2 became a vote of confidence in the 2007 predecessor. Modern Warfare 2 became the most anticipated video game in recent memory, was lauded by critics, and sold four-and-a-half million units on its opening day. It would go on to set sales records on the Xbox 360 and PlayStation 3. Even the particularly awful version for the personal computer set a franchise record for sales on the platform. Debate over the quality of the game featured numerous complaints. Modern Warfare 2 featured a greater level of customization than Call of Duty 4, but thanks to the lack of a beta test, the introduction of new killstreaks, perks, and weapons in Modern Warfare 2 proved impossible to balance. The game’s open-ended map design was also criticized. Designed to eliminate camping, it presented players with far fewer safe zones and turned the game into a bit of a free-wheeling clusterfuck. Regardless, Activision cackled with approval. Modern Warfare 2 went on to sell over 20 million units, making it one of the best-selling non-pack-in titles of all-time.
When 2004’s Killzone was touted as a “Halo killer”, it disappointed the five or six console shooter fans who weren’t busy playing Halo. While Killzone 2 had been revealed at E3 2005 (a rendered video featuring no actual play footage), the final product looked and played awfully similar to Call of Duty when it was finally released in 2009. The only difference was about sixty-four shades of brown. Critics touted Killzone 2 as a worthy console-exclusive shooter for the Sony PlayStation 3 library. The game performed reasonably well, selling approximately three million units. The success of the game yielded a second sequel, which also looked and played similar to Call of Duty. That game undersold Sony’s expectations, but the two sequels have combined to sell roughly four-to-five million units.
2010’s reboot of the Medal of Honor franchise was set in modern times, taking players into the heart of Afghanistan. It was one of the most anticipated games of the year. That is, until people finally played it. In the world of video game journalism, any anticipated video game that receives lower than an eighty-five on review aggregator Metacritic is probably one of the worst video games of the year. (For current examples, please see 2011’s Fable III.) Medal of Honor currently holds a Metacritic review average in the seventies.* In design and direction, the game was nearly identical to Call of Duty. Despite being a derivative knockoff of a superior product, it was announced in February of 2011 that the game had sold over five million copies.
After a well-publicized fallout with Infinity Ward, Activision released Treyarch-developed Call of Duty: Black Ops in November of 2010. Developer Treyarch played it conservatively, opting to rein in the chaotic game flow unleashed in Modern Warfare 2. The laws of sequels and sales did not apply to Black Ops. The questionable quality of the previous title resulted in the sales of seven million copies on the first day of release, becoming the fastest-selling video game in the history of the entire industry. As of this writing, Black Ops has surpassed over one billion dollars in sales and has become the best-selling video game in the series.
In 2011, Gears of War and Unreal Tournament creator Epic Games published the People Can Fly-developed Bulletstorm, a first-person shooter advertised as a rebuttal to modern urban warfare, mocking both the characters and clichés of the motif. The company was so serious about selling parody to the people that Bulletstorm was promoted with the game demo Duty Calls, an outward assault on Call of Duty and its ilk. After selling roughly 300,000 copies in its first week, industry analysts declared the sales of Bulletstorm “disappointing”* and Epic Games has not published any sales figures for the product. Video game sales site VGChartz estimates the sales of the title at less than a million units.* Bulletstorm was criticized for the length of its single-player campaign and its linear level design, complaints often associated with the Call of Duty franchise.
Later this year in 2011, Activision will publish Modern Warfare 3. It will sell a predictable number of copies and probably set some sales records. It will be the biggest video game of the year. The second biggest video game of the year will be Battlefield 3. At an April 2011 New York advertising conference, Electronic Arts CEO John Riccitiello stated that a one-hundred-million-dollar advertising campaign will usher in the release of Battlefield 3, and Riccitiello stated that the game would be “designed to take down” Call of Duty.* Much like the commercials for Final Fantasy VII advertised full-motion video cutscenes as “in-game footage”, Battlefield 3 advertisements are using the far-superior graphics in the personal computer version (labeled only as “in-game footage”) to sell the game to console audiences. Electronic Arts has decided the means for competing with Call of Duty is to smother the internet and television with ad placement, spending tens of millions on advertising and misleading customers. Those tens of millions of dollars will not be spent on video game development.
So that’s how it all came to pass: The first-person shooter flourished on the personal computer and computer game developers took the genre in a number of directions. One of these directions ushered in the modern shooter. After several attempts to create shooters for consoles, developers discovered that the tactical shooter was the most lucrative of the few subgenres that could play reasonably on a controller. Xbox Live made this market even more lucrative and gave developers control of their “intellectual property” in a way the personal computer never could. And then in order to combat rising development costs, those developers settled on the most generic motif possible in the genre, backing it with gigantic advertising campaigns whose funding could have been spent on original first-person shooters.
“So let me get this straight: You just spent thousands of words to announce the conclusion that we knew all along? That companies are going to continue selling Call of Duty until it no longer sells? You bitch! You cheated me!” To answer that, I have three words for the genre: Evolve or die. There is a difference between “selling what sells” and “selling out the future”. Are we forgetting the history of this industry? Pong clones dominated the seventies. They gave way to maze games, which dominated the early eighties. They in turn gave way to platformers, which dominated the mid-eighties into the late-nineties. They shared an overlap with Japanese Role-Playing Games, which were hugely popular in the late nineties. Tactical shooters are the latest video game fad. The genres that survived their fad phase evolved. Something will take their place, if cheap, disposable mobile phone video games have not already begun to do so. Evolution does not mean “Take the development model of John Madden Football and transform your franchise into Call of Duty: Roster Update.” Evolution does not mean “Turn Tomb Raider into a tactical shooter and call it Uncharted.” Evolution does not mean “Turn computer role-playing into a tactical shooter and call it Mass Effect.” That’s not how this industry works. I promise you: This industry will leave tactical shooters in the boneyard if somebody doesn’t stand up and do something interesting with the genre, the kinds of interesting things that people were doing with the first-person shooter nearly two decades ago.
Class dismissed. If you have any lengthy, in-depth questions, feel free to ask them. Don’t worry. Call of Duty isn’t going to change while you’re doing that.
Continue to Epilogue: Controller of the Future