I originally drafted this back in 2014, when Gamergate was at the height of its controversy and cultural presence. I never got around to publishing it, though. Some of the references and examples may be rather dated now in 2020, but the issues discussed herein are still relevant, especially as the cultural dominance of video games has only increased and grown more diverse since 2014.
In 2005, Roger Ebert awarded the movie adaptation of the video game Doom a single star, writing that it was “ ‘inspired by’ the famous video game. No, I haven’t played it, and I never will, but I know how it feels not to play it, because I’ve seen the movie.”
That comment caused a bit of controversy, as gamers argued that Ebert’s lack of familiarity with the source material meant he was ill-prepared to review the movie. As a result of the controversy, Ebert declared that video games couldn’t be art:
I am prepared to believe that video games can be elegant, subtle, sophisticated, challenging and visually wonderful. But I believe the nature of the medium prevents it from moving beyond craftsmanship to the stature of art. To my knowledge, no one in or out of the field has ever been able to cite a game worthy of comparison with the great dramatists, poets, filmmakers, novelists and composers. That a game can aspire to artistic importance as a visual experience, I accept. But for most gamers, video games represent a loss of those precious hours we have available to make ourselves more cultured, civilized and empathetic.
Not surprisingly, that comment caused even more controversy, and it was a controversy that Ebert revisited several times before his death in 2013. In 2010, for example, he wrote that “no video gamer now living will survive long enough to experience the medium as an art form,” and in 2012, he tweeted a link to a scathing review of The Last of Us, one of that year’s most acclaimed games.
Time and again, various video games were presented to Ebert as proof that the medium could produce “art,” but as far as I know, he was never dissuaded from his initial stance. Those gamers’ persistence was telling, though. There was clearly something about games being considered “art” that was important to them. A desire for legitimacy, perhaps, considering the medium’s growing popularity and cultural cachet. And once something acquires a certain level of cultural influence, it’s only natural for people to analyze, dissect, and discuss it: where it came from, what it means, what it’s trying to say, and so on.
Two of the most rewarding classes I took in college were a pair of art history classes taught by Christin Mamiya that surveyed art from the Middle Ages through the late 20th century. We spent hours poring over countless paintings and sculptures, and in the process, learned about their creators and the social, cultural, and political forces that shaped their creation.
We deconstructed numerous works and learned about the messages hidden within even the most (seemingly) insignificant detail: the way a character was positioned, the presence of some apparently random item in the background, or even the color palette and lighting style that was used.
It might seem that analyzing art so thoroughly would kill any possible enjoyment of, say, Michelangelo’s David, but I experienced the exact opposite. Dusty old art made centuries ago gained new life in my eyes as I began to understand that it wasn’t created in a vacuum, but rather, by flesh-and-blood human beings who had desires to convey, stories to tell, agendas to promote, and messages to pass on to future generations.
I learned that art, even timeless art, is always born out of a specific context, and were it not for outside forces (social, political, etc.) shaping them in countless ways, such great works would probably be less than they are.
In recent years, video games have come under a similar type of intense critical scrutiny, and for two reasons that are closely related:
- Video games are more popular than ever and have become a mass medium on par with movies and television in terms of budget, production scale, and aesthetic and technical sophistication.
- The medium’s audience — which was fairly niche until recently (insert any number of gamer-related clichés here if you’d like) — has grown increasingly diverse. A 2013 study found that women 18 and older were one of the fastest-growing video game audiences, and that Nintendo’s audience was split evenly along gender lines in 2012.
With this explosion in both cultural cachet and audience diversity, it should surprise exactly no one that games are starting to be analyzed, not just from a “traditional” gamer perspective with “traditional” gamer criteria, but also from new perspectives and with new criteria for what might constitute a good, compelling, or worthwhile game. (Consider Polygon’s review of Bayonetta 2, which gave the game good marks for its action, gameplay, and storytelling while criticizing the over-sexualization of its main character.)
But for some in the “Gamergate” movement, the shift away from such “tradition” was cause for alarm. The shift was interpreted as an attempt by folks like Anita Sarkeesian and other “social justice warriors” to purposefully impose feminist and progressive readings onto games. Or, as Gamergate advocate Milo Yiannopoulos wrote, “an army of sociopathic feminist programmers and campaigners, abetted by achingly politically correct American tech bloggers, are terrorising the entire community — lying, bullying and manipulating their way around the internet for profit and attention.”
Essentially, it was a call for games to be left alone — for games to be treated “as games” free from “social justice warrior” bias and criticism that threatened to upset the status quo that gamers had enjoyed for so many years. One of the best examples I found of this was a post titled “Dear Gaming Journalists — Why I’m Still Here” in which Kelly Maxwell proposed a “gamer bill of rights” that included the following suggestions and statements:
- “If the act of reviewing a game triggers a need to write a page and a half decrying its content, maybe that game isn’t for you. Please consider passing that review off to a journalist on your team who might enjoy the content.”
- “We desire accountability, reform and an acknowledgement that the customer should decide what they play — not journalists pushing dogmatic fringe ethics about gamer susceptibility to content, i.e. Jack Thompson in a new coat.”
- “We will continue to stand up for Freedom of Expression, Artistic Vision and a free market to decide what it wants. We suggest fans of Gone Home review Gone Home — and fans of God of War Review God of War. It is in this moment that professionalism engages, and a journalist demonstrates whether they are worthy of our trust.”
Maxwell’s suggestions are interesting precisely because of what they assume about journalism and criticism, e.g., that something should only be reviewed by people who will probably like and enjoy it (and won’t write “a page and a half decrying its content”), and that “dogmatic fringe ethics” should be avoided. (The Jack Thompson reference is rather telling, too.)
Imagine if Maxwell applied such criteria to other forms of media criticism. What if the only people allowed to write about, say, Star Wars were “true” sci-fi fans? Or if the only people allowed to write about punk rock were people who would “enjoy the content”? Or if the only people encouraged or permitted to write reviews about Game of Thrones were those who had read — and liked — George R.R. Martin’s novels?
What we’re really talking about here is the squelching of conversation and opinion, a squelching that leaves our cultural conversations impoverished. And it all presupposes three things:
- The stuff we love should not be subjected to “unsafe” critical opinions (i.e., it should only be analyzed by fans, for fans).
- If something we love is criticized, we are necessarily implicated by the criticism.
- There is little to no value in reading and hearing opinions and criticism that push back against our own interests, biases, and perspectives.
With regard to #1, you frankly can’t have it both ways. If there’s something you love, you can either do your best to keep it hidden among just you and your friends, or you can promote, encourage, and defend it. Gamers have taken the latter course in recent years, and with tremendous success.
Video games have never had so much cultural presence, but with that prominence comes a certain responsibility: a willingness to let games take their “hard knocks” in the marketplace of ideas. And if you can’t accept that, then maybe it says more about your faith in the medium (i.e., that it’s not strong enough to withstand such criticism) than it does about anything else, including any critics.
With regard to #2, this ventures into particularly thorny philosophical areas related to art and morality. If you enjoy playing violent video games, does that make you a sociopath? If you enjoy playing games in which female characters are denigrated, does that make you a sexist pig? If you don’t like playing a game as a homosexual character, does that make you a hateful bigot?
Most people would probably answer “No” to questions like these. At least, I hope that most people are perfectly capable of drawing a line between what happens inside and outside a game. But that doesn’t mean we shouldn’t listen to the criticism and consider that maybe, just maybe, any such status quo needs some challenging.
And finally, with regard to #3, there’s something incredibly myopic about insisting on hearing only voices that are similar to your own, or that espouse views similar to your own. For starters, it’s always possible to ignore opinions: if you don’t like what folks like Sarkeesian or Yiannopoulos have to say, then simply ignore it. But why should we fear criticism that raises questions in our own hearts and minds about the media we consume and its potential effects on us? Even if we ultimately reject the answers such criticism leads us to, the process itself can help us better define and delineate our own stances.
In the aforementioned “gamer bill of rights,” Maxwell writes “that diversity of thought is a Right” — yet another irony in a piece full of them. Diversity of thought is simply not possible in an environment that discourages, decries, and seeks to prevent beloved media artifacts from being assessed, critiqued, and challenged by those who might be considered “other” (i.e., those who exist outside the status quo or the assumed proper audience for such media artifacts). Furthermore, this is not true for video games alone, recent controversies and internet arguments notwithstanding. It’s equally true for books, movies, music, paintings — any form of media.
That being said, there is one kind of criticism that deserves to be roundly criticized: the kind that originates from an uninformed and ignorant — and, some might say, knee-jerk — position. This is criticism offered by someone who has never seen, read, or otherwise meaningfully experienced the artifact they’re criticizing. Who decries a movie despite having never seen it, or at the very least, researched it and the director’s intentions. Who criticizes music for controversial lyrics without considering for a moment the context that may have given rise to them. Who criticizes a video game for being pornographic despite having never played it, much less seen its explicit content for themselves.
Gamers were right to consistently push back against Ebert, who, for all of his legitimate talent and knowledge as a movie critic, seemed largely uninterested in actually taking the time to learn about and experience the medium he was talking about. Even if video games aren’t “art” proper (assuming you could even construct such a definition), there’s no denying the amount of ingenuity and imagination on display in the medium. If several recent games are any indication, it’s only a matter of time before the medium transcends its early roots and comes so close to being “art” as to do away with any distinctions altogether.