Curry Chandler

Curry Chandler is a writer, researcher, and independent scholar working in the field of communication and media studies. His writing on media theory and policy has been published in the popular press as well as academic journals. Curry approaches the study of communication from a distinctly critical perspective, and with a commitment to addressing inequality in power relations. The scope of his research activity includes media ecology, political economy, and the critique of ideology.

Curry is a graduate student in the Communication Department at the University of Pittsburgh, having previously earned degrees from Pepperdine University and the University of Central Florida.

Filtering by Tag: criticaltheory

City space and emotion: Affect as urban infrastructure

For a change of pace this week, I thought I’d write about affect in relation to the urban condition. Specifically I am going to focus on Nigel Thrift’s chapters on spatialities of feeling from his book Non-representational Theory: Space, Politics, Affect. Thrift begins the first chapter by characterizing cities as “maelstroms of affect,” and asserting the “utter ubiquity of affect as a vital element of cities” (p. 171). Thrift questions why “the affective register” has not formed “a large part of the study of cities,” and states “to read about affect in cities it is necessary to resort to the pages of novels, and the tracklines of poems” (p. 171).

I have to question what Thrift means by “the study of cities,” particularly in relation to the history of urban sociology. There is a lengthy history in this tradition of studying the affective register of cities, from Durkheim’s anomie and Simmel’s blasé attitude, through the emergence of modern criminology and social scientific studies of urban anxiety and the fear of crime.

There are, of course, myriad approaches available for studying cities. In addition to approaches from fiction and prose, and the aforementioned social scientific methods, there abound philosophical, psychogeographic, and theological engagements with urban life. One approach to the study of cities that has been especially amenable to the affective register is the domain of urban design and planning. Practitioners and commentators from this realm (who often, erroneously and unfortunately, mistake their practice for urbanism entire) have long used affective language to describe and design urban spaces: happy streets, friendly spaces, menacing buildings, etc.

Thrift is not explicitly discussing “smart” urbanization projects, but of course much of the analysis across these two chapters is directly applicable to such initiatives. Shockingly, Ernst Bloch also says much of relevance to smart cities in his 1929 essay “The Anxiety of the Engineer”. Thrift’s summation of Bloch’s “apocalyptic” vision of cities from that essay reads like a ripped-from-the-headlines encapsulation of contemporary urbanization trends: “Transfixed by the idea of a totally safe and calculable environment, the capitalist city is fixed and unbending in the face of unexpected events: ‘it has rooted itself in midair’” (p. 198). It’s a fantastic connection to make, though I despair at my ever-growing reading list.

Lastly, I want to touch upon Thrift’s discussion of the misanthropic city. My first reaction was to respond that cities aren’t misanthropic, people are; but then I recalled my recent trip to Las Vegas. Returning to the affective register of urban design, I must say that Vegas is certainly a misanthropic city. It is a city built for money, not for people. To the extent that it is built for people, it is designed not to affirm or edify humanity’s highest qualities, but is rather constructed to amplify our basest and most animalistic aspects. Compulsion, lechery, and stupefaction are the human attributes “celebrated” in that space. From an urban design perspective, Las Vegas is among the most misanthropic of cities.

Of course, Thrift is not referring to misanthropic urban design (although the invocation of infrastructure is an interesting, and perhaps fecund, reference point for urban affect), but to misanthropic attitudes and behaviors among urban denizens. I do not subscribe to calls for kindness and an idealized sense of community in the city, as I find they are often simplistic and embarrassingly maudlin. Indeed, the disconnectedness and universal strangeness that have long been decried as manifestations of the inherent disharmony of urban life are in fact principal among the reasons that I love life in the metropolis. Nevertheless, I do appreciate that amidst the anxiety and imminent catastrophe of urban life, Thrift finds spaces for kindness and hope.

Thoughts on polemics, Audre Lorde, and Do the Right Thing

Radical black feminist writer and activist Audre Lorde found productive potential in anger. According to Lester Olson, in his article "Anger among allies": “Lorde distinguished between anger and hatred, and she salvaged the former as potentially useful and generative” (p. 287). Lorde’s distinction between anger and hatred is developed in a quote from her remarks: “Hatred is the fury of those who do not share our goals, and its object is death and destruction. Anger is a grief of distortions between peers, and its object is change” (p. 298).

In a quote from her address titled “The Uses of Anger,” Lorde uses the metaphor of the virus to describe hatred:

“We are working in a context of oppression and threat, the cause of which is certainly not the angers which lie between us, but rather that virulent hatred leveled against all women, people of Color, lesbians and gay men, poor people - against all of us who are seeking to examine the particulars of our lives as we resist our oppressions, moving toward coalition and effective action.” (emphasis added)

This thematic link between hatred and disease is also present in Spike Lee’s film Do the Right Thing. While the film’s characters never state the distinction between anger and hatred as explicitly as Lorde does, the film makes many associations that establish a difference between the two. The action of the film takes place in a roughly 24-hour period, during the hottest day of the summer in Brooklyn, New York. The temperature is referenced throughout the film, and the link between the heat and characters’ emotions is made early on. Anger is associated with heat: characters talk about “getting hot” as a euphemism for getting angry. By extension then, the hottest day of the summer could also be understood as the angriest.

Hatred, on the other hand, is continually linked with sickness and disease. Early in the film, when pizzeria owner Sal arrives with his two sons to start business for the day, his son Pino says of the pizza shop:

“I detest this place like a sickness.”

Sal admonishes his son, saying: “That sounds like hatred.”

This connection returns at the end of the film, again in front of Sal’s Famous Pizzeria, which at this point has been reduced to a smoldering shell. Mookie seeks Sal out to ask for the wages he is due from the previous week’s labor. Angrily, Sal throws $500 in $100 bills at Mookie, twice as much as he is owed. Mookie leaves $200 on the ground, telling Sal that he only wants what he has earned. There is a stalemate as the two men stare each other down, the $200 between them, each waiting for the other to pick it up. Apparently not understanding why Mookie would leave the money lying on the ground, Sal asks him:

“Are you sick?”

Mookie replies: “I’m hot as a motherfucker; I’m alright, though.”

Mookie’s response here should not be understood merely as a comment about the weather. Yes, he is hot because of the summer heat, but the associations presented by the film make clear the deeper meaning of this exchange. Mookie is angry, angry as a motherfucker: having endured the ordeal of the hottest day of the summer, culminating in his throwing a trashcan through a shop window, he now finds himself the following day with his various responsibilities still in place but without a source of income. But he does not hate Sal. He is not infected by hatred. He is not sick.

If the film associates hatred with sickness and disease, how does it relate or portray love? The radio DJ character, Mister Señor Love Daddy, seems like an obvious connection. Another important component is the name of Señor Love Daddy’s radio station: We Love Radio 108 (“Last on your dial, first in your heart.”). The name of the radio station not only presages Clear Channel Communications’ eventual rebranding to I Heart Radio (kidding, of course), it also establishes a connection between love and another of the film’s characters: Radio Raheem.

Radio Raheem is arguably the character most closely associated with the concepts of love and hate. Raheem has custom brass knuckles on each hand: the word “LOVE” on his right hand, and the word “HATE” on his left. Through the presence of these words on his knuckles, and his performance of the accompanying story about the struggle between love and hate, “the story of life,” Radio Raheem recalls Reverend Harry Powell from the 1955 film Night of the Hunter. Reverend Powell has the words “love” and “hate” tattooed on his knuckles: love on the right hand, and hate on the left. He also tells “the story of life,” which, although using different language than Raheem, tells essentially the same account of a struggle between hate and love, where hate has the upper hand for a while but is eventually beaten out by love.

In Night of the Hunter, Reverend Powell’s performance of pious geniality conceals a dark secret: he is a serial killer, traveling the country seducing widows whom he soon murders before absconding with what wealth he can steal. In Do the Right Thing, Radio Raheem is not revealed to be a serial killer, but he is done in by a sort of serial killing: the recurring killing of men of color perpetrated by police officers. The characters of the film react to Raheem’s death in a personal way (“They killed Radio Raheem!”), but the crowd’s response is clearly also fueled by this serial killing of black men (someone is heard exclaiming, “They did it again!”).

A final question: Is Do the Right Thing a polemic? I find it interesting to consider the question in light of the definitions offered by various authors. In her article on Larry Kramer's polemical form, Erin Rand writes of polemics: 

“Hence, polemics refute dominant ideologies and modes of thinking by rejecting the primacy of reason and invoking explicitly moral claims. In polemics a moral position is not simply advanced through rhetoric, but morality actually does rhetorical work.” (p. 305)

Rand traces the meaning of “polemic” to the Greek polemikos, meaning “warlike,” and when Lee’s film was released many reviewers and commentators were concerned that it amounted to a call for violence. I am not sure the film satisfies Rand’s four elements of rhetorical form, but I do believe it satisfies the rhetorical move that Olson calls shifting subjectivities:

“An advocate articulates a shift in the second persona of an address, wherein the auditors or readers occupy one kind of role initially and then, drawing on what is remembered or learned from that position, are repositioned subsequently into a different role that is harder for them to recognize or occupy, but that might possess some transforming power.” (p. 284)

As film critic Roger Ebert recounted in an essay about the film:

“Many audiences are shocked that the destruction of Sal's begins with a trash can thrown through the window by Mookie (Lee), the employee Sal refers to as “like a son to me.” Mookie is a character we're meant to like. Lee says he has been asked many times over the years if Mookie did the right thing. Then he observes: “Not one person of color has ever asked me that question.” But the movie in any event is not just about how the cops kill a black man and a mob burns down a pizzeria. That would be too simple, and this is not a simplistic film. It covers a day in the life of a Brooklyn street, so that we get to know the neighbors, and see by what small steps the tragedy is approached.”

Some critics and audience members objected to what they interpreted as Lee’s call for violence, and at least an implicit approval of property destruction. We heard similar rhetoric last year, when protests in response to the deaths of Michael Brown and Eric Garner became characterized by media emphasis on incidents of property damage and looting. The state response to protests is always characterized by a tolerance so long as demonstrations are peaceful and “civil,” and when this line is broached it functions to demonize and dismiss the “protestors” at large. Is this not evocative of the white woman who purportedly said to Audre Lorde, “Tell me how you feel, but don’t say it too harshly or I cannot hear you”?

Critical perspectives on the Isla Vista spree killer, media coverage


Reuters/Lucy Nicholson

  • Immediately following Elliot Rodger's spree killing in Isla Vista, CA, last month, Internet users discovered his YouTube channel and a 140-page autobiographical screed, dubbed a "manifesto" by the media. The written document and the videos documented Rodger's sexual frustration and his chronic inability to connect with other people. He specifically lashed out at women for forcing him "to endure an existence of loneliness, rejection and unfulfilled desires" and causing his violent "retribution". Commentators and the popular press framed the killings as an outcome of misogynistic ideology, with headlines such as: How misogyny kills men, further proof that misogyny kills, and Elliot Rodger proves the danger of everyday sexism. Slate contributor Amanda Hess wrote:

Elliot Rodger targeted women out of entitlement, their male partners out of jealousy, and unrelated male bystanders out of expedience. This is not ammunition for an argument that he was a misandrist at heart—it’s evidence of the horrific extent of misogyny’s cultural reach.

His parents saw the digitally mediated rants and contacted his therapist and a social worker, who contacted a mental health hotline. These were the proper steps. But those who interviewed Rodger found him to be a “perfectly polite, kind and wonderful human.” They deemed his involuntary holding unnecessary and a search of his apartment unwarranted. That is, authorities defined Rodger and assessed his intentions based upon face-to-face interaction, privileging this interaction over and above a “vast digital trail.” This is digital dualism taken to its worst imaginable conclusion.

In fact, the entire 140-odd-page memoir he left behind, “My Twisted World,” documents with agonizing repetition the daily tortured minutiae of his life, and barely has any interactions with women. What it has is interactions with the symbols of women, a non-stop shuffling of imaginary worlds that women represented access to. Women weren’t objects of desire per se, they were currency.

[...]

What exists in painstaking detail are the male figures in his life. The ones he meets who then reveal that they have kissed a girl, or slept with a girl, or slept with a few girls. These are the men who have what Elliot can’t have, and these are the men that he obsesses over.

[...]

Women don’t merely serve as objects for Elliot. Women are the currency used to buy whatever he’s missing. Just as a dollar bill used to get you a dollar’s worth of silver, a woman is an indicator of spending power. He wants to throw this money around for other people. Bring them home to prove something to his roommates. Show the bullies who picked on him that he deserves the same things they do.

[...]

There’s another, slightly more obscure recurring theme in Elliot’s manifesto: The frequency with which he discusses either his desire or attempt to throw a glass of some liquid at happy couples, particularly if the girl is a ‘beautiful tall blonde.’ [...] These are the only interactions Elliot has with women: marking his territory.

[...]

When we don’t know how else to say what we need, like entitled children, we scream, and the loudest scream we have is violence. Violence is not an act of expressing the inexpressible, it’s an act of expressing our frustration with the inexpressible. When we surround ourselves by closed ideology, anger and frustration and rage come to us when words can’t. Some ideologies prey on fear and hatred and shift them into symbols that all other symbols are defined by. It limits your vocabulary.

While the motivations for the shootings may vary, they have in common crises in masculinity in which young men use guns and violence to create ultra-masculine identities as part of a media spectacle that produces fame and celebrity for the shooters.

[...]

Crises in masculinity are grounded in the deterioration of socio-economic possibilities for young men and are inflamed by economic troubles. Gun carnage is also encouraged in part by media that repeatedly illustrates violence as a way of responding to problems. Explosions of male rage and rampage are also embedded in the escalation of war and militarism in the United States from the long nightmare of Vietnam through the military interventions in Afghanistan and Iraq.

For Debord, “spectacle” constituted the overarching concept to describe the media and consumer society, including the packaging, promotion, and display of commodities and the production and effects of all media. Using the term “media spectacle,” I am largely focusing on various forms of technologically-constructed media productions that are produced and disseminated through the so-called mass media, ranging from radio and television to the Internet and the latest wireless gadgets.

  • Kellner's comments from a 2008 interview about the Virginia Tech shooter's videos broadcast after the massacre, along with his remarks on critical media literacy, remain relevant to the current situation:

Cho’s multimedia video dossier, released after the Virginia Tech shootings, showed that he was consciously creating a spectacle of terror to create a hypermasculine identity for himself and avenge himself to solve his personal crises and problems. The NIU shooter, dressed in black, emerged from a curtain onto a stage and started shooting, obviously creating a spectacle of terror, although as of this moment we still do not know much about his motivations. As for the television networks, since they are profit centers in a highly competitive business, they will continue to circulate school shootings and other acts of domestic terrorism as “breaking events” and will constitute the murderers as celebrities. Some media have begun to not publicize the names of teen suicides, to attempt to deter copy-cat effects, and the media should definitely be concerned about creating celebrities out of school shooters and not sensationalize them.

[...]

People have to become critical of the media scripts of hyperviolence and hypermasculinity that are projected as role models for men in the media, or that help to legitimate violence as a means to resolve personal crises or solve problems. We need critical media literacy to analyze how the media construct models of masculinities and femininities, good and evil, and become critical readers of the media who ourselves seek alternative models of identity and behavior.

  • Almost immediately after news of the violence broke, and word of the killer's YouTube videos spread, there was a spike of online backlash against the media saturation and warnings against promoting the perpetrator to celebrity status through omnipresent news coverage. Just two days after the killings Isla Vista residents and UCSB students let the news crews at the scene know that they were not welcome to intrude upon the community's mourning. As they are wont to do, journalists reported on their role in the story while ignoring the wishes of the residents, as in this LA Times brief:

More than a dozen reporters were camped out on Pardall Road in front of the deli -- and had been for days, their cameras and lights and gear taking up an entire lane of the street. At one point, police officers showed up to ensure that tensions did not boil over.

The students stared straight-faced at reporters. Some held signs expressing their frustration with the news media:

"OUR TRAGEDY IS NOT YOUR COMMODITY."

"Remembrance NOT ratings."

"Stop filming our tears."

"Let us heal."

"NEWS CREWS GO HOME!"

Critical Pedagogy and Imperialism; social media and commodity fetishism

Gramsci has had a huge impact on critical pedagogy, especially because of the importance he attached to the role of culture, in both its highbrow and popular forms, in the process of hegemony, which combines rule by force with rule by consent. His discussion of the role of intellectuals in this process also influenced discussions centering on educators as cultural workers in the critical pedagogy field. Henry Giroux has been particularly influential here. One issue which deserves greater treatment in critical pedagogy, in my view, is that of ‘powerful knowledge’ which, though not necessarily popular knowledge, and which also needs to be problematised, should still be mastered for one not to remain at the margins of political life.

[...]

Following Freire, I would say: the commitment to teaching is a political commitment because education is a political act. There is no such thing as a neutral education. We must always ask on whose side we are when we teach. More importantly, we should ask: with whom are we educating and learning? I ask this question in the spirit of Freire’s emphasis on working with rather than for the oppressed.

In tying Marxist ideology to social media, there are a number of things to clarify, as the comparison is not a perfect one. Perhaps the most questionable caveat is the ownership of the modes of production. In the social media model, it can be said that the proletariat themselves own the modes of production, since they typically own the computers or devices through which they channel their intellectual labor. Additionally, almost all popular social media networks today allow users to retain the copyright of the content that they post (Facebook, n.d.; MySpace, n.d.; Twitter, n.d.). Thus, it would seem that the argument that users are alienated from the results of their intellectual labor power is a moot point.

[...]

I humbly suggest that in the social media model, owning the output or product of intellectual labor power has little if anything to do with Marx’s species being. Instead, I feel that it is the social connections created, broken, strengthened, or weakened that feed directly to the worker’s species being. Since the output of the intellectual labor power in this case is not a tangible good, the only “finished product” that the worker can place value in and not be alienated from is the actual social connection that their output generates; not the actual output itself. This allows for a supra or meta level of social connection above that of the social connections embodied in physical outputs outlined by Marx.

TV still sucks, we should still complain about hipsters, your job shouldn't exist

None of this could be happening at a worse time. According to the latest S.O.S. from climate science, we have maybe 15 years to enact a radical civilizational shift before game over. This may be generous, it may be alarmist; no one knows. What is certain is that pulling off a civilizational Houdini trick will require not just switching energy tracks, but somehow confronting the “endless growth” paradigm of the Industrial Revolution that continues to be shared by everyone from Charles Koch to Paul Krugman. We face very long odds in just getting our heads fully around our situation, let alone organizing around it. But it will be impossible if we no longer even understand the dangers of chuckling along to Kia commercials while flipping between Maher, “Merlin” and “Girls.”

  • Zaitchik's article name-checks pertinent critics and theorists, including Adorno's "culture industry," Postman’s “Amusing Ourselves to Death,” and even Jerry Mander's "Four Arguments for the Elimination of Television." Where this article was discussed on sites like Reddit or Metafilter, commenters seemed angry at Zaitchik, overly defensive, as if they felt under attack for watching "Hannibal" and "Game of Thrones". I thoroughly enjoyed Zaitchik's piece, even if it doesn't present a fully developed argument, because the perspective he presents strongly resonates with many of the philosophical foundations that have shaped my own views on media, particularly the media ecology tradition. A large part of Zaitchik's argument is that even if television content is the highest quality it has ever been, the form of television and its effects are the same as ever:

Staring at images on a little screen — that are edited in ways that weaken the brain’s capacity for sustained and critical thought, that encourage passivity and continued viewing, that are controlled by a handful of publicly traded corporations, that have baked into them lots of extremely slick and manipulating advertising — is not the most productive or pleasurable way to spend your time, whether you’re interested in serious social change, or just want to have a calm, clear and rewarding relationship with the real world around you.

But wait, you say, you’re not just being a killjoy and a bore, you’re living in the past. Television in 2014 is not the same as television in 1984, or 1994. That’s true. Chomsky’s “propaganda model,” set out during cable’s late dawn in “Manufacturing Consent,” is due for an update. The rise of on-demand viewing and token progressive programming has complicated the picture. But only by a little. The old arguments were about structure, advertising, structure, ownership, and structure, more than they were about programming content, or what time of the day you watched it. Less has changed than remains the same. By all means, let’s revisit the old arguments. That is, if everyone isn’t busy binge-watching “House of Cards.”

It’s been something to watch, this televisionification of the left. Open a window on social media during prime time, and you’ll find young journalists talking about TV under Twitter avatars of themselves in MSNBC makeup. Fifteen years ago, these people might have attended media reform congresses discussing how corporate TV pacifies and controls people, and how those facts flow from the nature of the medium. Today, they’re more likely to status-update themselves on their favorite corporate cable channel, as if this were something to brag about.

The entertainment demands of the 21st Century seem (apparently) bottomless. We’ve outsourced much of our serotonin production to the corporations which control music, sports, television, games, movies, and books. And they’ve grown increasingly desperate to produce the most universally acceptable, exportable, franchisable, exciting, boring, money-making pablum possible. Of course that is not new either… yet it continues to worsen.

Various alternative cultures have been attempting to fight it for decades. The beats, hippies, punks, and grunge kids all tried… and eventually lost. But the hipsters have avoided it altogether by never producing anything of substance except a lifestyle based upon fetishizing obscurity and cultivating tasteful disdain. A noncommital and safe appreciation of ironic art and dead artists. No ideals, no demands, no struggle.

Rarely has the modern alternative to pop culture been so self-conscious and crippled. The mainstream has repeatedly beaten down and destroyed a half-century’s worth of attempts to keep art on a worthwhile and genuine path, but now it seems the final scion of those indie movements has adopted the ‘if you can’t beat ‘em, join ‘em’ compromise of creative death.

  • In an interview for PBS, London School of Economics professor David Graeber poses the question: should your job exist?

How could you have dignity in labor if you secretly believe your job shouldn’t exist? But, of course, you’re not going to tell your boss that. So I thought, you know, there must be enormous moral and spiritual damage done to our society. And then I thought, well, maybe that explains some other things, like why is it there’s this deep, popular resentment against people who have real jobs? They can get people so angry at auto-workers, just because they make 30 bucks an hour, which is like nowhere near what corporate lawyers make, but nobody seems to resent them. They get angry at the auto-workers; they get angry at teachers. They don’t get angry at school administrators, who actually make more money. Most of the problems people blame on teachers, and I think on some level, that’s resentment: all these people with meaningless jobs are saying, but, you guys get to teach kids, you get to make cars; that’s real work. We don’t get to do real work; you want benefits, too? That’s not reasonable.

If someone had designed a work regime perfectly suited to maintaining the power of finance capital, it’s hard to see how they could have done a better job. Real, productive workers are relentlessly squeezed and exploited. The remainder are divided between a terrorised stratum of the, universally reviled, unemployed and a larger stratum who are basically paid to do nothing, in positions designed to make them identify with the perspectives and sensibilities of the ruling class (managers, administrators, etc) – and particularly its financial avatars – but, at the same time, foster a simmering resentment against anyone whose work has clear and undeniable social value. Clearly, the system was never consciously designed. It emerged from almost a century of trial and error. But it is the only explanation for why, despite our technological capacities, we are not all working 3-4 hour days.

Ernesto Laclau dies

  • Ernesto Laclau, post-Marxist critical theorist and significant figure in discourse analysis (along with his wife and collaborator Chantal Mouffe), died on April 13. An obituary by British historian and academic Robin Blackburn was posted on the Verso web site:

Ernesto and Chantal used the work of Antonio Gramsci to reject what they saw as the reductionism and teleology of much Marxist theory. Though sometimes calling himself a ‘post-Marxist’ and an advocate of ‘radical democracy’, Ernesto insisted that he remained a radical anti-imperialist and anti-capitalist. His criticisms of Marx and Marxism were made in a constructive spirit, and without a hint of rancour.

Ernesto was recognised as a leading thinker in Latin America but also as an intellectual star in the academic world, co-authoring Contingency, Hegemony, Universality with Slavoj Žižek and Judith Butler in 2000. He gave courses at a string of leading universities in Europe and the Americas, including Northwestern and the New School for Social Research. Ernesto became Emeritus Professor at Essex in 2003, but the Centre he established continues its work.

With collaborators including his wife, Chantal Mouffe, and the cultural theorist Stuart Hall, Laclau played a key role in reformulating Marxist theory in the light of the collapse of communism and failure of social democracy. His "post-Marxist" manifesto Hegemony and Socialist Strategy (1985), written with Mouffe, was translated into 30 languages, and sales ran into six figures. The book argued that the class conflict identified by Marx was being superseded by new forms of identity and social awareness. This worried some on the left, including Laclau's friend Ralph Miliband, who feared that he had lost touch with the mundane reality of class division and conflict, but his criticisms of Marx and Marxism were always made in a constructive spirit.

Political populism was an enduring fascination for Laclau. His first book, Politics and Ideology in Marxist Theory (1977), offered a polite but devastating critique of the conventional discourse on Latin America at the time. This "dependency" approach tended to see the large landowners – latifundistas – as semi-feudal and pre-capitalist, while Laclau showed them to be part and parcel of Latin American capitalism which fostered enormous wealth and desperate poverty.

Witnessing the impact of the Perónist movement in Argentina led Professor Laclau to a fascination with populism. He wrote a celebrated essay on the subject in the 1970s and then a full-length book, On Populist Reason (2005), looking at the rise of leftist politicians such as Hugo Chávez across much of Latin America. Both the current president of Argentina, Cristina Fernández de Kirchner, and her late husband and predecessor Néstor Kirchner, are said to have been great admirers of his work.

Laclau’s theory of populism has played a critical role in my research. Without his theoretical insights and captivating character, I could not have expanded my initial observations of populist practices to this level. Besides his theoretical legacy and rich intellectual input outside academia, Prof. Laclau also contributed to the training and development of students and researchers from different parts of the world – thanks to the IDA programme he founded. His death is a great loss.

Video mélange: David Harvey, Antonio Negri, and Saints Row IV

Ender's Game analyzed, the Stanley Parable explored, Political Economy of zombies, semiotics of Twitter, much more

It's been a long time since the last update (what happened to October?), so this post is extra long in an attempt to catch up.

In a world in which interplanetary conflicts play out on screens, the government needs commanders who will never shrug off their campaigns as merely “virtual.” These same commanders must feel the stakes of their simulated battles to be as high as actual warfare (because, of course, they are). Card’s book makes the nostalgic claim that children are useful because they are innocent. Hood’s movie leaves nostalgia by the roadside, making the more complex assertion that they are useful because of their unique socialization to be intimately involved with, rather than detached from, simulations.

  • In the ongoing discourse about games criticism and its relation to film reviews, Bob Chipman's latest Big Picture post uses his own review of the Ender's Game film as an entry point for a breathless treatise on criticism. The video presents a concise and nuanced overview of arts criticism, from the classical era through film reviews as consumer reports up to the very much in-flux conceptions of games criticism. Personally I find this video sub-genre (where spoken content is crammed into a Tommy gun barrage of word bullets so that the narrator can convey a lot of information in a short running time) irritating and mostly worthless, since the verbal information is presented faster than the listener can really process it. It reminds me of Film Crit Hulk, someone who writes excellent essays with obvious insight into filmmaking, but whose aesthetic choice (or "gimmick") of writing in all caps is often a distraction from the content and a deterrent to readers. Film Crit Hulk has of course addressed this issue and explained the rationale for the choice, but considering that his more recent articles have dropped the third-person "Hulk speak" writing style, the all-caps approach seems played out. Nevertheless, I'm sharing the video because Mr. Chipman makes a lot of interesting points, particularly regarding the cultural contexts for the various forms of criticism. Just remember to breathe deeply and monitor your heart rate while watching.

  • This video from Satchbag's Goods is ostensibly a review of Hotline Miami, but develops into a discussion of art movements and Kanye West:

  • This short interview with Slavoj Žižek in New York magazine continues a trend I've noticed since The Pervert's Guide to Ideology was released, wherein writers interviewing Žižek feel compelled to include themselves and their reactions to and interactions with Žižek in their articles. Something about a Žižek encounter brings out the gonzo in journalists. The NY mag piece is also notable for this succinct positioning of Žižek's contribution to critical theory:

Žižek, after all, the Yugoslav-born, Ljubljana-based academic and Hegelian; mascot of the Occupy movement, critic of the Occupy movement; and former Slovenian presidential candidate, whose most infamous contribution to intellectual history remains his redefinition of ideology from a Marxist false consciousness to a Freudian-Lacanian projection of the unconscious. Translation: To Žižek, all politics—from communist to social-democratic—are formed not by deliberate principles of freedom, or equality, but by expressions of repressed desires—shame, guilt, sexual insecurity. We’re convinced we’re drawing conclusions from an interpretable world when we’re actually just suffering involuntary psychic fantasies.

Following the development of the environment on the team's blog you can see some of the gaps between what data was deemed noteworthy or worth recording in the seventeenth century and the level of detail we now expect in maps and other infographics. For example, the team struggled to pinpoint the exact location on Pudding Lane of the bakery where the Great Fire of London is thought to have originated and so just ended up placing it halfway along.

  • Stephen Totilo reviewed the new pirate-themed Assassin's Creed game for the New York Times. I haven't played the game, but I love that the sections of the game set in the present day have shifted from the standard global conspiracy tropes seen in the earlier installments to postmodern self-referential and meta-fictional framing:

Curiously, a new character is emerging in the series: Ubisoft itself, presented mostly in the form of self-parody in the guise of a fictional video game company, Abstergo Entertainment. We can play small sections as a developer in Abstergo’s Montreal headquarters. Our job is to help turn Kenway’s life — mined through DNA-sniffing gadgetry — into a mass-market video game adventure. We can also read management’s emails. The team debates whether games of this type could sell well if they focused more on peaceful, uplifting moments of humanity. Conflict is needed, someone argues. Violence sells.

It turns out that Abstergo is also a front for the villainous Templars, who search for history’s secrets when not creating entertainment to numb the population. In these sections, Ubisoft almost too cheekily aligns itself with the bad guys and justifies its inevitable 2015 Assassin’s Creed, set during yet another violent moment in world history.

  • Speaking of postmodern, self-referential, meta-fictional video games: The Stanley Parable was released late last month. There has already been a bevy of analysis written about the game, but I am waiting for the Mac release to play it and am doing my best to avoid spoilers in the meantime. Brenna Hillier's post at VG247 is spoiler-free (assuming you are at least familiar with the game's premise, or its original incarnation as a Half-Life mod), and calls The Stanley Parable "a reaction against, commentary upon, critique and celebration of narrative-driven game design":

The Stanley Parable wants you to think about it. The Stanley Parable, despite its very limited inputs (you can’t even jump, and very few objects are interactive) looks at those parts of first-person gaming that are least easy to design for – exploration and messing with the game’s engine – and foregrounds them. It takes the very limitations of traditional gaming narratives and uses them to ruthlessly expose their own flaws.

Roy’s research focus prior to founding Bluefin, and continued interest while running the company, has to do with how both artificial and human intelligences learn language. In studying this process, he determined that the most important factor in meaning making was the interaction between human beings: no one learns language in a vacuum, after all. That lesson helped inform his work at Twitter, which started with mapping the connection between social network activity and live broadcast television.

Aspiring to cinematic qualities is not bad in and of itself, nor do I mean to shame fellow game writers, but developers and their attendant press tend to be myopic in their point of view, both figuratively and literally. If we continually view videogames through a monocular lens, we miss much of their potential. Moreover, we begin to use ‘cinematic’ reflexively without taking the time to explain what the hell that word means.

Metaphor is a powerful tool. Thinking videogames through other media can reframe our expectations of what games can do, challenge our design habits, and reconfigure our critical vocabularies. To crib a quote from Andy Warhol, we get ‘a new idea, a new look, a new sex, a new pair of underwear.’ And as I hinted before, it turns out that fashion and videogames have some uncanny similarities.

Zombies started their life in the Hollywood of the 1930s and ‘40s as simplistic stand-ins for racist xenophobia. Post-millennial zombies have been hot-rodded by Danny Boyle and made into a subversive form of utopia. That grim utopianism was globalized by Max Brooks, and now Brad Pitt and his partners are working to transform it into a global franchise. But if zombies are to stay relevant, it will rely on the shambling monsters' ability to stay subversive – and real subversive shocks and terror are not dystopian. They are utopian.

Ironically, our bodies now must make physical contact with devices dictating access to the real; Apple’s Touch ID sensor can discern for the most part if we are actually alive. This way, we don’t end up trying to find our stolen fingers on the black market, or prevent others from 3D scanning them to gain access to our lives.

This is a monumental shift from when Apple released its first iPhone just six years ago. It’s a touchy subject: fingerprint authentication means we place our trust in an inanimate object to manage our animate selves - our biology is verified, digitised, and encrypted as it is handed over to our devices.

Can you really buy heroin on the Web as easily as you might purchase the latest best-seller from Amazon? Not exactly, but as the FBI explained in its complaint, it wasn't exactly rocket science, thanks to Tor and some bitcoins. Here's a rundown of how Silk Road worked before the feds swooped in.

  • Henry Jenkins posted the transcript of an interview with Mark J.P. Wolf. The theme of the discussion is "imaginary worlds," and they touch upon the narratology vs. ludology conflict in gaming:

The interactivity vs. storytelling debate is really a question of the author saying either “You choose” (interaction) or “I choose” (storytelling) regarding the events experienced; it can be all of one or all of the other, or some of each to varying degrees; and even when the author says “You choose”, you are still choosing from a set of options chosen by the author.  So it’s not just a question of how many choices you make, but how many options there are per choice.  Immersion, however, is a different issue, I think, which does not always rely on choice (such as immersive novels), unless you want to count “Continue reading” and “Stop reading” as two options you are constantly asked to choose between.

Epic EVE battle, Critical games criticism, indie developer self-publishing

Update, 9:18PM ET: The battle is over. After more than five hours of combat, the CFC has defeated TEST Alliance. Over 2,900 ships were destroyed today in the largest fleet battle in Eve Online's history. TEST Alliance intended to make a definitive statement in 6VDT, but their defeat at the hands of the CFC was decisive and will likely result in TEST's withdrawal from the Fountain region.

In a conversation with Whitten, he told us that the commitment to independent developers is full. There won't be restrictions on the type of titles that can be created, nor will there be limits in scope. In response to a question on whether retail-scale games could be published independently, Whitten told us, "Our goal is to give them access to the power of Xbox One, the power of Xbox Live, the cloud, Kinect, Smartglass. That's what we think will actually generate a bunch of creativity on the system." With regard to revenue splitting with developers, we were told that more information will be coming at Gamescom, but that we could think about it "generally like we think about Marketplace today." According to developers we've spoken with, that split can be approximately 50-50.

Another difference between the Xbox One and Xbox 360 is how the games will be published and bought by other gamers. Indie games will not be relegated to the Xbox Live Indie Marketplace like on the Xbox 360 or required to have a Microsoft-certified publisher to distribute physically or digitally outside the Indie Marketplace. All games will be featured in one big area with access to all kinds of games.

If anything has hurt modern video game design over the past several years, it has been the rise of 'freemium'. It is rare to see a top app or game in the app stores with a business model other than 'free-to-play with in-app purchases'. The model has been used as an excuse to make lazy, poorly designed games that are predicated on taking advantage of psychological triggers in their players, and it will have negative long-term consequences for the video game industry if left unchecked.

Many freemium games are designed around the idea of conditioning players to become addicted to playing. Many game designers want their games to be heavily played, but freemium games in particular are designed to trigger a 'reward' state in the player's brain in order to keep the player playing (and ultimately entice the user to make in-app purchases to continue). This type of conditioning is often referred to as a 'Skinner box', named after B. F. Skinner, the psychologist who created laboratory boxes used to perform behavioral experiments on animals.

It obviously isn’t beyond the realm of possibility that not only do financial considerations influence a game’s structure and content, but financial outcomes also affect a studio’s likelihood of survival in the industry, based upon the machinations of its publishing overlords. Activision killed Bizarre Creations, Eidos ruined Looking Glass Studios, EA crushed Westwood, Pandemic, Bullfrog, Origin Systems… well, the list could go on until I turn a strange, purple color, but you get my point. And when 3.4 million copies sold for a Tomb Raider reboot isn’t enough by a publisher’s standards, you can’t help but feel concern for a developer’s future.

This relationship between environment-learner-content interaction and transfer puts teachers in the unique position to capitalize on game engagement to promote reflection that positively shapes how students tackle real-world challenges. To some, this may seem like a shocking concept, but it’s definitely not a new one—roleplay as instruction, for example, was very popular among the ancient Greeks and, in many ways, served as the backbone for Plato’s renowned Allegory of the Cave. The same is true of Shakespeare’s works, 18th and 19th century opera, and many of the novels, movies, and other media that define our culture. More recently, NASA has applied game-like simulations to teach astronauts how to maneuver through space, medical schools have used them to teach robotic surgery, and the Federal Aviation Administration has employed them to test pilots.

The relationships between the creator, the product, and the audience are all important contexts to consider during media analysis, especially with games, because the audience is an active participant in the medium. So if you are creating a game, you always have to keep the audience in mind. Even if you say the audience doesn’t matter to you, it won’t cease to exist, and that does not erase the impact your game will have.

Similarly, if you are critiquing or analyzing any media, you can’t ignore the creator and the creator’s intentions. Despite those who claim the “death of the author,” if the audience is aware of the creator’s intentions, it can affect how they perceive the game. Particularly, if you consider the ease with which creators can release statements about their work, you’ll have an audience with varying levels of awareness of the creator’s intentions. These factors all play off of each other–they do not exist in a vacuum.

When we talk about any medium’s legitimacy, be it film or videogames or painting, we are talking about a historical phenomenon, inextricably tied to a medium’s “artness,” that allows it to get in on the ground floor of “legitimate” and “important.” If we contextualize the qualities that allowed film or photography to find themselves supported through a panoply of cultural institutions, it was a cultural and political-economic process that led them there.


[...]

Videogames, the kind that would be written about in 20-dollar glossy art magazines, would be exactly this. When creators of videogames want to point to their medium’s legitimacy, it would help to have a lot of smart people legitimate your work in a medium (glossy magazines, international newspapers) that you consider to be likewise legitimate. Spector concedes that “yes all the critics right now are online”, but the real battle is in getting these critics offline and into more “legitimate” spaces of representation. It’s a kind of unspoken hierarchy of mediums that is dancing before us here: at each step a new gatekeeper steps into play, both legitimating and separating the reader from the critic and the object of criticism.

All three games define fatherhood around the act of protection, primarily physical protection. And in each of these games, the protagonist fails—at least temporarily—to protect their ward. In Ethan’s case, his cheery family reflected in his pristine home collapses when he loses a son in a car accident. Later, when his other son goes missing, the game essentially tests Ethan’s ability to reclaim his protective-father status.

No video game grants absolute freedom; they all have rules or guidelines that govern what you can and can’t do. The sci-fi epic Mass Effect is a series that prides itself on choice, but even that trilogy ends on a variation of choosing between the “good” and “bad” ending. Minecraft, the open-world creation game, is extremely open-ended, but you can’t build a gun or construct a tower into space because it doesn’t let you. BioShock’s ending argues that the choices you think you’re making in these games don’t actually represent freedom. You’re just operating within the parameters set by the people in control, be they the developers or the guy in the game telling you to bash his skull with a golf club.

BioShock’s disappointing conclusion ends up illustrating Ryan’s point. A man chooses, a player obeys. It’s a grim and cynical message that emphasizes the constraints of its own art form. And given that the idea of choice is so important to BioShock’s story, I don’t think it could’ve ended any other way.

 

The Ideology of Scarface, Community as PoMo masterpiece, Present Shock reviewed, etc.

[youtube=http://www.youtube.com/watch?v=YanhEVEgkYI&w=560&h=315]

[youtube=http://www.youtube.com/watch?v=7MCmBHPqz6I&w=560&h=315]

In The Godfather, the blurring of the line between crime and the “legitimate” economy can still seem shocking. In Scarface, the distinction seems quaintly naïve. In The Godfather, Don Vito almost loses everything over his refusal to deal in heroin. In Scarface, Tony Montana knows that coke is just another commodity in a boom economy. Michael Corleone marries the wispy, drooping Kay Adams to give his enterprise some old-fashioned, WASP class. When Tony Montana takes possession of the coked-up bombshell called Elvira Hancock, he is filling his waterbed with cash, not class. Even more excruciatingly, Scarface tells us these truths without any self-righteousness, without the consoling promise that manly discipline can save America from its fate. In the moral economy of this movie, the terms of critique have become indistinguishable from the terms of affirmation. “You know what capitalism is?” Tony answers his own question: “Getting fucked.”

Donovan put Neumann in charge of the Research and Analysis Branch of the OSS, studying Nazi-ruled central Europe. Neumann was soon joined by the philosopher Herbert Marcuse and the legal scholar Otto Kirchheimer, his colleagues at the left-wing Institute for Social Research, which had been founded in Frankfurt in 1923 but had moved to Columbia University after the Nazis came to power.

An update of the promise that the media could create a different, even a better world seems laughable from our perspective of experience with the technologically based democracies of markets. As a utopia-ersatz, this promise appears to be obsolete in the former hegemonic regions of North America and western and northern Europe. Now that it is possible to create a state with media, they are no longer any good for a revolution. The media are an indispensable component of functioning social hierarchies, both from the top down and the bottom up, of power and countervailing power. They have taken on systemic character. Without them, nothing works anymore in what the still surviving color supplements, in a careless generalization, continue to call a society. Media are an integral part of the everyday coercive context, which is termed “practical constraints.” As cultural techniques, which need to be learned for social fitness, they are at the greatest possible remove from what whips us into a state of excitement, induces aesthetic exultation, or triggers irritated thoughts.

[...]

At the same time, many universities have established courses in media design, media studies, and media management. Something that operates as a dynamic and edgy complex between the discourses, that is, something which can only operate interdiscursively, has acquired a firm and fixed place in the academic landscape. This is reassuring and creates professorial chairs, upon which a once anarchic element can be sat out and developed into knowledge for domination and control. Colleges and academies founded specifically for the media proactively seek close relationships with the industries, manufacturers, and the professional trades associations of design, orientation, and communication.

There are five ways Rushkoff thinks present shock is being experienced and responded to. To begin, we are in an era in which he thinks narrative has collapsed. For as long as we have had the power of speech we have corralled time into linear stories with a beginning, middle and ending. More often than not these stories contained some lesson. They were not merely forms of entertainment or launching points for reflection but contained some guidance as to how we should act in a given circumstance, which, of course, differed by culture, but almost all stories were in effect small oversimplified models of real life.

[...]

The medium Rushkoff thinks is best adapted to the decline of narrative is video games. Yes, they are more often than not violent, but they also seem tailor-made for the kinds of autonomy and collaborative play that are the positive manifestations of our new presentism.

 

Multiple angles on gaming's Ebert, Kubrick, and Citizen Kane

One obvious difference between art and games is that you can win a game. It has rules, points, objectives, and an outcome. Santiago might cite an immersive game without points or rules, but I would say it then ceases to be a game and becomes a representation of a story, a novel, a play, dance, a film. Those are things you cannot win; you can only experience them.

  • Ebert later clarified that he believed "anything can be art," but video games cannot be "high art". Among those who disagreed with Ebert's assessment was film director Clive Barker. Ebert responded to some of Barker's points in an article. Part of Barker's comments dealt with the importance of critics to video games:

Barker: "It used to worry me that the New York Times never reviewed my books. But the point is that people like the books. Books aren't about reviewers. Games aren't about reviewers. They are about players."

Ebert: A reviewer is a reader, a viewer or a player with an opinion about what he or she has viewed, read or played. Whether that opinion is valid is up to his audience. Books, games and all forms of created experience are about themselves; the real question is, do we as their consumers become more or less complex, thoughtful, insightful, witty, empathetic, intelligent, philosophical (and so on) by experiencing them?

  • The idiosyncrasies of video game reviews themselves have become so well known that game reviews are practically considered a genre (see this satirical take from Something Awful: If films were reviewed like video games). Earlier this month video game designer Warren Spector wrote a blog post titled Where's gaming's Roger Ebert? In the post Spector argues that gaming journalism and criticism currently is geared toward specialized groups like developers, publishers, academics, and hard-core gamers, but not "normal people":

What we need, as I said in an earlier column, is our own Andrew Sarris, Leonard Maltin, Pauline Kael, Judith Crist, Manny Farber, David Thomson, or Roger Ebert. We need people in mainstream media who are willing to fight with each other (not literally, of course) about how games work, how they reflect and affect culture, how we judge them as art as well as entertainment. We need people who want to explain games, individually and generically, as much as they want to judge them. We need what might be called mainstream critical theorists.

And they need a home. Not only on the Internet (though we need them there, too), not just for sale at GDC, but on newsstands and bookstore shelves - our own Film Comment, Sight and Sound, Cahiers du Cinema. Magazines you could buy on the newsstand. Why? Because currently, criticism of this - what little we have of it - reaches only the already converted. To reach the parents, the teachers, the politicians, we need to be where they shop. Even if you never pick up a film magazine, the fact that there are obviously serious magazines devoted to the topic makes a difference in the minds of the uninitiated.

To wonder aloud when or where the Roger Ebert of games criticism will emerge is wrongheaded. First, we must ask where is our Scorsese, our Hitchcock, our Coppola, our Tarantino? Where is gaming’s Stanley Kubrick?

A precious few developers may already be taking those first, intrepid steps along that road. Once these new developers are ascendant, once “adult” is no longer just a byword for “graphic” on this medium, perhaps then we can start to discuss a new critical grammar for games, and begin the search for its greatest practitioner.

The game industry is not waiting for its formative masterpieces to materialize from the hazy future. They're here, right now, walking among us. The future was 2002, and in many ways we have yet to surpass it. Like Citizen Kane, Metroid Prime is a landmark in both technical innovation and pure creativity.

  • Writing in the Financial Post, Chad Sapieha says that video games will never have a Citizen Kane moment. Interestingly, his argument isn't based on the artistic merits of video games, but rather on the particularities of the medium: video games become obsolete with technological advancements. A film made in the 1940s may still be available to view on DVD or other format, but a video game released just twenty years ago likely exists as only a memory.

I'd go so far as to suggest that, over time, many games released today will end up sharing more in common with stage productions than books or movies or music. They will be appreciated in the moment, then eventually disappear. People will write about and record their experiences, and those words and videos will continue on to posterity, acting as the primary means by which they are remembered by gamers of the future.

[...]

What I'm saying is simply this: Video game "classics" should be viewed as a breed apart from those of other entertainment mediums. Any attempts at comparison are fundamentally flawed thanks to unavoidable expiration dates imposed by the unstoppable evolution of hardware and advancements in game design.

Our medium is a fantastic vessel that can go places and do things others cannot. Games don’t need to beckon reflection or emotion in order to be good, and I don’t require validation from other people for the hobby to seem like a worthwhile use of my time. Indeed, Citizen Kane is incredible. It’s beautiful, thought-provoking, and inspiring … and film can keep it. Video games don’t need any of it; they never have and never will.

The problem with gaming’s incessant desire to be just like big brother Hollywood is multifarious and exceedingly annoying – like a thousand-headed hydra puffing away on an equal number of vuvuzelas. Have games or games criticism earned a place in the rarefied pantheon of unanimously beloved “mainstream” art? No, not really. Would it be cool if we had a Citizen Kane or, as Warren Spector suggests, an Ebert? I guess so.

But everyone waiting for those shining beacons of cultural acceptance to descend from on-high utterly fails to understand two key points: 1) in this day and age, creating direct analogs to those landmarks is actually impossible, and 2) games and games criticism are in the midst of a renaissance. An unstoppable explosion of evolution and creativity. The formation of an identity that is, frankly, far more exciting than film. Why aren’t we championing that to everyone with (or without) ears? Why are we instead breathlessly awaiting the day our medium suddenly and inexplicably conforms to somebody else’s standard?

Powered by Squarespace. Background image of New Songdo by Curry Chandler.