Curry Chandler

Curry Chandler is a writer, researcher, and independent scholar working in the field of communication and media studies. His writing on media theory and policy has been published in the popular press as well as academic journals. Curry approaches the study of communication from a distinctly critical perspective, and with a commitment to addressing inequality in power relations. The scope of his research activity includes media ecology, political economy, and the critique of ideology.

Curry is a graduate student in the Communication Department at the University of Pittsburgh, having previously earned degrees from Pepperdine University and the University of Central Florida.


Memes, Enthymemes, and the Reproduction of Ideology

In his 1976 book The Selfish Gene, biologist Richard Dawkins introduced the word “meme” to refer to a hypothetical unit of cultural transmission. The discussion of the meme concept was contained in a single chapter of a book that was otherwise dedicated to genetic transmission, but the idea spread. Over decades, other authors further developed the meme concept, establishing “memetics” as a field of study. Today, the word “meme” has entered the popular lexicon, as well as popular culture, and is primarily associated with specific internet artifacts, or “viral” online content. Although this popular usage of the term is not always in keeping with Dawkins’ original conception, these examples from internet culture do illustrate some key features of how memes have been theorized.

This essay is principally concerned with two strands of memetic theory: the relation of memetic transmission to the reproduction of ideology; and the role of memes in rhetorical analysis, especially in relation to the enthymeme as a persuasive appeal. Drawing on these theories, I will advance two related arguments: ideology as manifested in discursive acts can be considered to spread memetically; and ideology functions enthymematically. Lastly, I will present a case study analysis to demonstrate how methods and terminology from rhetorical criticism, discourse analysis, and media studies can be employed to analyze artifacts based on these arguments.

Examples of memes presented by Dawkins include “tunes, ideas, catch-phrases, clothes fashions, ways of making pots or building arches” (p.192). The name “meme” was chosen due to its similarity to the word “gene”, as well as its relation to the Greek root “mimeme”, meaning “that which is imitated” (p.192). Imitation is key to Dawkins’ notion of the meme because imitation is the means by which memes propagate themselves amongst members of a culture. Dawkins identifies three qualities associated with high survival value in memes: longevity, fecundity, and copying-fidelity (p.194).

Distin (2005) further developed the meme hypothesis in The Selfish Meme. Furthering the gene/meme analogy, Distin defines memes as “units of cultural information” characterized by the representational content they carry (p.20), and the representational content is considered “the cultural equivalent of DNA” (p.37). This conceptualization of memes and their content forms the basis of Distin’s theory of cultural heredity. Distin then seeks to identify the representational system used by memes to carry their content (p.142). The first representational system considered is language, what Distin calls “the memes-as-words hypothesis” (p.145). Distin concludes that language itself is “too narrow to play the role of cultural DNA” (p.147).

Balkin (1998) took up the meme concept to develop a theory of ideology as “cultural software”. Balkin describes memes as “tools of understanding,” and states that there are “as many different kinds of memes as there are things that can be transmitted culturally” (p.48). Stating that the “standard view of memes as beliefs is remarkably similar to the standard view of ideology as a collection of beliefs” (p.49), Balkin links theories of memetic transmission to theories of ideology. Employing metaphors of virality similar to how other authors have written of memes as “mind viruses,” Balkin considers memetic transmission as the spread of “ideological viruses” through social networks of communication, stating that “this model of ideological effects is the model of memetic evolution through cultural communication” (p.109). Balkin also presents a more favorable view of language as a vehicle for memes than Distin, writing: “Language is the most effective carrier of memes and is itself one of the most widespread forms of cultural software. Hence it is not surprising that many ideological mechanisms either have their source in features of language or are propagated through language” (p.175).

Balkin approaches the subject from a background in law, and although he is not a rhetorician and is skeptical of the discursive turn in theories of ideology, he does employ rhetorical concepts in discussing the influence of memes and ideology: “Rhetoric has power because understanding through rhetorical figures already forms part of our cultural software” (p.19). Balkin also cites Aristotle, remarking that “the successful rhetorician builds upon what the rhetorician and the audience have in common,” and “what the two have in common are shared cultural meanings and symbols” (p.209). In another passage, Balkin expresses a similar notion of the role of shared understanding in communication: “Much human communication requires the parties to infer and supplement what is being conveyed rather than simply uncoding it” (p.51).

Although Balkin never uses the term, these ideas are evocative of the rhetorical concept of the enthymeme. Aristotle himself discussed the enthymeme, though the concept was not elucidated with much specificity. Rhetorical scholars have since debated the nature of the enthymeme as employed in persuasion, and Bitzer (1959) surveyed various accounts to produce a more substantial definition. Bitzer’s analysis comes to focus on the enthymeme in relation to syllogisms, and the notion of the enthymeme as a syllogism with a missing (or unstated) proposition. Bitzer states: “To say that the enthymeme is an ‘incomplete syllogism’ – that is, a syllogism having one or more suppressed premises – means that the speaker does not lay down his premises but lets his audience supply them out of its stock of opinion and knowledge” (p.407).

Bitzer’s formulation of the enthymeme emphasizes that “enthymemes occur only when the speaker and audience jointly produce them” (p.408). That they are “jointly produced” is key to the role of the enthymeme in successful persuasive rhetoric: “Owing to the skill of the speaker, the audience itself helps construct the proofs by which it is persuaded” (p.408). Bitzer defines the “essential character” of the enthymeme as the fact that its “premises are always drawn from the audience” and its “successful construction is accomplished through the joint efforts of speaker and audience.” This joint construction, and the supplying of the missing premise(s), resonates with Balkin’s view of the spread of cultural software, as well as various theories of subjects’ complicity in the functioning of ideology.

McGee (1980) supplied another link between rhetoric and ideology with the “ideograph”. McGee argued that “ideology is a political language composed of slogan-like terms signifying collective commitment” (p.15), and these terms he calls “ideographs”. Examples of ideographs, according to McGee, include “liberty,” “religion,” and “property” (p.16). Johnson (2007) applies the ideograph concept to memetics, to argue for the usefulness of the meme as a tool for materialist criticism. Johnson argues that although “the ideograph has been honed as a tool for political (“P”-politics) discourses, such as those that populate legislative arenas, the meme can better assess ‘superficial’ cultural discourses” (p.29). I also believe that the meme concept can be a productive tool for ideological critique. As an example, I will apply the concepts of ideology reproduction as memetic transmission, and ideological function as enthymematic, in an analysis of artifacts of online culture popularly referred to as “memes”.

As Internet culture evolved, users adapted and mutated the term “meme” to refer to specific online artifacts. Even though they may be considered a single type of online artifact, Internet memes come in a variety of forms. One of the oldest and most prominent series of image macro memes is “LOLcats”. The template established by LOLcats, superimposing humorous text over static images, became and remains the standard format for image macro memes. Two of the most prominent series in this format are the “First World Problems” (FWP) and “Third World Success” image macros. Through analysis of these memes, it is possible to examine how the features of these artifacts and discursive practices demonstrate many of the traits of memes developed by theorists, and how theories of memetic ideological transmission and enthymematic ideological function can be applied to examine the ideological characteristics of these artifacts.

 

References

Balkin, J. M. (1998). Cultural software: A theory of ideology. New Haven, CT: Yale University Press.

Bitzer, L. F. (1959). Aristotle’s enthymeme revisited. Quarterly Journal of Speech, 45(4), 399-408.

Dawkins, R. (2006). The selfish gene. New York, NY: Oxford University Press. (Original work published 1976)

Distin, K. (2005). The selfish meme: A critical reassessment. New York, NY: Cambridge University Press.

McGee, M. C. (1980). The “ideograph”: A link between rhetoric and ideology. Quarterly Journal of Speech, 66(1), 1-16.

Media Ecology Monday: Golumbia and the Political Economy of Computationalism

In The Cultural Logic of Computation, Golumbia raises questions and addresses issues that are promising, but then proceeds to make an argument that is ultimately unproductive. I am sympathetic to Golumbia’s aims; I share an attitude of skepticism toward the rhetoric surrounding the Internet and new media as inherently democratizing, liberating devices. Golumbia characterizes such narratives as “technological progressivism,” and writes that “technological progressivism […] conditions so much of computational discourse.” Watching the events of the “Arab Spring” unfold was exhilarating, but I was always uncomfortable with the narrative promoted in the mainstream news media characterizing these social movements as a “Twitter revolution,” and I remain skeptical toward hashtag activism and similar trends.

So while I was initially inclined toward the project Golumbia laid out in the book’s introductory pages, the chapters that followed only muddled rather than clarified my understanding of the argument being presented. The first section contains a sustained attack on Noam Chomsky’s contributions to linguistics, and their various influences and permutations, but also on Chomsky himself. I don’t know why Golumbia needed to question Chomsky’s “implausible rise to prominence,” or why Chomsky’s “magnetic charisma” needs to be mentioned in this discussion of linguistic theory.

Golumbia focuses on Chomsky’s contributions to linguistics, because that is where his interests and argument draw him; based on my own interests and background I would’ve preferred engagement with the other side of Chomsky’s contributions to communication studies, namely the propaganda model and political economy of the media. I suspect that a fruitful analysis would be possible from considering some of the issues Golumbia brings up in relation to the work of Chomsky and others in ideological analysis of news media content. The notion of computationalism as ideology is compelling to me; so is the institutionalized rhetoric of computationalism, which is a separate, promising argument, I think.

In reading I have a tendency to focus on what interests me, appeals to me, or may be useful to me. Some of Golumbia’s concepts, such as “technological-progressive neoliberalism” and its relation to centralized power, fall into this category. While I remain skeptical about computationalism as an operationalizable concept (there are already multiple theoretical models and critical perspectives that cover the same territory, and I’m not convinced that Golumbia makes the case for needing the new term), other concepts may prove more productive. Ultimately, I will close with a quote from Golumbia (addressing the Internet and emerging technologies) that reflects my feelings about this book: “We have to learn to critique even that which helps us.”

Critical perspectives on the Isla Vista spree killer, media coverage


Reuters/Lucy Nicholson


  • Immediately following Elliot Rodger's spree killing in Isla Vista, CA last month, Internet users discovered his YouTube channel and a 140-page autobiographical screed, dubbed a "manifesto" by the media. The written document and the videos documented Rodger's sexual frustration and his chronic inability to connect with other people. He specifically lashed out at women for forcing him "to endure an existence of loneliness, rejection and unfulfilled desires" and causing his violent "retribution". Commentators and the popular press framed the killings as an outcome of misogynistic ideology, with headlines such as "How misogyny kills men," "Further proof that misogyny kills," and "Elliot Rodger proves the danger of everyday sexism." Slate contributor Amanda Hess wrote:

Elliot Rodger targeted women out of entitlement, their male partners out of jealousy, and unrelated male bystanders out of expedience. This is not ammunition for an argument that he was a misandrist at heart—it’s evidence of the horrific extent of misogyny’s cultural reach.

His parents saw the digitally mediated rants and contacted his therapist and a social worker, who contacted a mental health hotline. These were the proper steps. But those who interviewed Rodger found him to be a “perfectly polite, kind and wonderful human.” They deemed his involuntary holding unnecessary and a search of his apartment unwarranted. That is, authorities defined Rodger and assessed his intentions based upon face-to-face interaction, privileging this interaction over and above a “vast digital trail.” This is digital dualism taken to its worst imaginable conclusion.

In fact, the entire 140-odd-page memoir he left behind, “My Twisted World,” documents with agonizing repetition the daily tortured minutiae of his life, and barely has any interactions with women. What it has is interactions with the symbols of women, a non-stop shuffling of imaginary worlds that women represented access to. Women weren’t objects of desire per se, they were currency.

[...]

What exists in painstaking detail are the male figures in his life. The ones he meets who then reveal that they have kissed a girl, or slept with a girl, or slept with a few girls. These are the men who have what Elliot can’t have, and these are the men that he obsesses over.

[...]

Women don’t merely serve as objects for Elliot. Women are the currency used to buy whatever he’s missing. Just as a dollar bill used to get you a dollar’s worth of silver, a woman is an indicator of spending power. He wants to throw this money around for other people. Bring them home to prove something to his roommates. Show the bullies who picked on him that he deserves the same things they do.

[...]

There’s another, slightly more obscure recurring theme in Elliot’s manifesto: The frequency with which he discusses either his desire or attempt to throw a glass of some liquid at happy couples, particularly if the girl is a ‘beautiful tall blonde.’ [...] These are the only interactions Elliot has with women: marking his territory.

[...]

When we don’t know how else to say what we need, like entitled children, we scream, and the loudest scream we have is violence. Violence is not an act of expressing the inexpressible, it’s an act of expressing our frustration with the inexpressible. When we surround ourselves by closed ideology, anger and frustration and rage come to us when words can’t. Some ideologies prey on fear and hatred and shift them into symbols that all other symbols are defined by. It limits your vocabulary.

While the motivations for the shootings may vary, they have in common crises in masculinity in which young men use guns and violence to create ultra-masculine identities as part of a media spectacle that produces fame and celebrity for the shooters.

[...]

Crises in masculinity are grounded in the deterioration of socio-economic possibilities for young men and are inflamed by economic troubles. Gun carnage is also encouraged in part by media that repeatedly illustrates violence as a way of responding to problems. Explosions of male rage and rampage are also embedded in the escalation of war and militarism in the United States from the long nightmare of Vietnam through the military interventions in Afghanistan and Iraq.

For Debord, “spectacle” constituted the overarching concept to describe the media and consumer society, including the packaging, promotion, and display of commodities and the production and effects of all media. Using the term “media spectacle,” I am largely focusing on various forms of technologically-constructed media productions that are produced and disseminated through the so-called mass media, ranging from radio and television to the Internet and the latest wireless gadgets.

  • Kellner's comments from a 2008 interview about the Virginia Tech shooter's videos broadcast after the massacre, and his remarks on critical media literacy, remain relevant to the current situation:

Cho’s multimedia video dossier, released after the Virginia Tech shootings, showed that he was consciously creating a spectacle of terror to create a hypermasculine identity for himself and avenge himself to solve his personal crises and problems. The NIU shooter, dressed in black emerged from a curtain onto a stage and started shooting, obviously creating a spectacle of terror, although as of this moment we still do not know much about his motivations. As for the television networks, since they are profit centers in a highly competitive business, they will continue to circulate school shootings and other acts of domestic terrorism as “breaking events” and will constitute the murderers as celebrities. Some media have begun to not publicize the name of teen suicides, to attempt to deter copy-cat effects, and the media should definitely be concerned about creating celebrities out of school shooters and not sensationalize them.

[...]

People have to become critical of the media scripts of hyperviolence and hypermasculinity that are projected as role models for men in the media, or that help to legitimate violence as a means to resolve personal crises or solve problems. We need critical media literacy to analyze how the media construct models of masculinities and femininities, good and evil, and become critical readers of the media who ourselves seek alternative models of identity and behavior.

  • Almost immediately after news of the violence broke, and word of the killer's YouTube videos spread, there was a spike of online backlash against the media saturation and warnings against promoting the perpetrator to celebrity status through omnipresent news coverage. Just two days after the killings, Isla Vista residents and UCSB students let the news crews at the scene know that they were not welcome to intrude upon the community's mourning. As they are wont to do, journalists reported on their role in the story while ignoring the wishes of the residents, as in this LA Times brief:

More than a dozen reporters were camped out on Pardall Road in front of the deli -- and had been for days, their cameras and lights and gear taking up an entire lane of the street. At one point, police officers showed up to ensure that tensions did not boil over.

The students stared straight-faced at reporters. Some held signs expressing their frustration with the news media:

"OUR TRAGEDY IS NOT YOUR COMMODITY."

"Remembrance NOT ratings."

"Stop filming our tears."

"Let us heal."

"NEWS CREWS GO HOME!"

Fukuyama: 25 years after the "End of History"

 

I argued that History (in the grand philosophical sense) was turning out very differently from what thinkers on the left had imagined. The process of economic and political modernization was leading not to communism, as the Marxists had asserted and the Soviet Union had avowed, but to some form of liberal democracy and a market economy. History, I wrote, appeared to culminate in liberty: elected governments, individual rights, an economic system in which capital and labor circulated with relatively modest state oversight.

[...]

So has my end-of-history hypothesis been proven wrong, or if not wrong, in need of serious revision? I believe that the underlying idea remains essentially correct, but I also now understand many things about the nature of political development that I saw less clearly during the heady days of 1989.

[...]

Twenty-five years later, the most serious threat to the end-of-history hypothesis isn't that there is a higher, better model out there that will someday supersede liberal democracy; neither Islamist theocracy nor Chinese capitalism cuts it. Once societies get on the up escalator of industrialization, their social structure begins to change in ways that increase demands for political participation. If political elites accommodate these demands, we arrive at some version of democracy.

When he wrote "The End of History?", Fukuyama was a neocon. He was taught by Leo Strauss's protege Allan Bloom, author of The Closing of the American Mind; he was a researcher for the Rand Corporation, the thinktank for the American military-industrial complex; and he followed his mentor Paul Wolfowitz into the Reagan administration. He showed his true political colours when he wrote that "the class issue has actually been successfully resolved in the west … the egalitarianism of modern America represents the essential achievement of the classless society envisioned by Marx." This was a highly tendentious claim even in 1989.

[...]

Fukuyama distinguished his own position from that of the sociologist Daniel Bell, who published a collection of essays in 1960 titled The End of Ideology. Bell had found himself, at the end of the 1950s, at a "disconcerting caesura". Political society had rejected "the old apocalyptic and chiliastic visions", he wrote, and "in the west, among the intellectuals, the old passions are spent." Bell also had ties to neocons but denied an affiliation to any ideology. Fukuyama claimed not that ideology per se was finished, but that the best possible ideology had evolved. Yet the "end of history" and the "end of ideology" arguments have the same effect: they conceal and naturalise the dominance of the right, and erase the rationale for debate.

While I recognise the ideological subterfuge (the markets as "natural"), there is a broader aspect to Fukuyama's essay that I admire, and cannot analyse away. It ends with a surprisingly poignant passage: "The end of history will be a very sad time. The struggle for recognition, the willingness to risk one's life for a purely abstract goal, the worldwide ideological struggle that called forth daring, courage, imagination, and idealism, will be replaced by economic calculation, the endless solving of technical problems, environmental concerns, and the satisfaction of sophisticated consumer demands."

 

In an article that went viral in 1989, Francis Fukuyama advanced the notion that with the death of communism history had come to an end in the sense that liberalism — democracy and market capitalism — had triumphed as an ideology. Fukuyama will be joined by other scholars to examine this proposition in the light of experience during the subsequent quarter century.

Featuring Francis Fukuyama, author of “The End of History?”; Michael Mandelbaum, School of Advanced International Studies, Johns Hopkins University; Marian Tupy, Cato Institute; Adam Garfinkle, editor, American Interest; Paul Pillar, Nonresident Senior Fellow, Foreign Policy, Center for 21st Century Security and Intelligence, Brookings Institution; and John Mueller, Ohio State University and Cato Institute.

Ender's Game analyzed, the Stanley Parable explored, Political Economy of zombies, semiotics of Twitter, much more

It's been a long time since the last update (what happened to October?), so this post is extra long in an attempt to catch up.

In a world in which interplanetary conflicts play out on screens, the government needs commanders who will never shrug off their campaigns as merely “virtual.” These same commanders must feel the stakes of their simulated battles to be as high as actual warfare (because, of course, they are). Card’s book makes the nostalgic claim that children are useful because they are innocent. Hood’s movie leaves nostalgia by the roadside, making the more complex assertion that they are useful because of their unique socialization to be intimately involved with, rather than detached from, simulations.

  • In the ongoing discourse about games criticism and its relation to film reviews, Bob Chipman's latest Big Picture post uses his own review of the Ender's Game film as an entry point for a breathless treatise on criticism. The video presents a concise and nuanced overview of arts criticism, from the classical era through film reviews as consumer reports up to the very much in-flux conceptions of games criticism. Personally I find this video sub-genre (where spoken content is crammed into a Tommy gun barrage of word bullets so that the narrator can convey a lot of information in a short running time) irritating and mostly worthless, since the verbal information is being presented faster than the listener can really process it. It reminds me of Film Crit Hulk, someone who writes excellent essays with obvious insight into filmmaking, but whose aesthetic choice (or "gimmick") to write in all caps is often a distraction from the content and a deterrent to readers. Film Crit Hulk has of course addressed this issue and explained the rationale for this choice, but considering that his more recent articles have dropped the third-person "Hulk speak" writing style, the all caps seems to be played out. Nevertheless, I'm sharing the video because Mr. Chipman makes a lot of interesting points, particularly regarding the cultural contexts for the various forms of criticism. Just remember to breathe deeply and monitor your heart rate while watching.

  • This video from Satchbag's Goods is ostensibly a review of Hotline Miami, but develops into a discussion of art movements and Kanye West:

  • This short interview with Slavoj Žižek in New York magazine continues a trend I've noticed since The Pervert's Guide to Ideology has been in release, wherein writers interviewing Žižek feel compelled to include themselves and their reactions to/interactions with Žižek in their articles. Something about a Žižek encounter brings out the gonzo in journalists. The NY mag piece is also notable for this succinct positioning of Žižek's contribution to critical theory:

Žižek, after all, the Yugoslav-born, Ljubljana-based academic and Hegelian; mascot of the Occupy movement, critic of the Occupy movement; and former Slovenian presidential candidate, whose most infamous contribution to intellectual history remains his redefinition of ideology from a Marxist false consciousness to a Freudian-Lacanian projection of the unconscious. Translation: To Žižek, all politics—from communist to social-democratic—are formed not by deliberate principles of freedom, or equality, but by expressions of repressed desires—shame, guilt, sexual insecurity. We’re convinced we’re drawing conclusions from an interpretable world when we’re actually just suffering involuntary psychic fantasies.

Following the development of the environment on the team's blog, you can see some of the gaps between what data was deemed noteworthy or worth recording in the seventeenth century and the level of detail we now expect in maps and other infographics. For example, the team struggled to pinpoint the exact location on Pudding Lane of the bakery where the Great Fire of London is thought to have originated and so just ended up placing it halfway along.

  • Stephen Totilo reviewed the new pirate-themed Assassin's Creed game for the New York Times. I haven't played the game, but I love that the sections of the game set in the present day have shifted from the standard global conspiracy tropes seen in the earlier installments to postmodern self-referential and meta-fictional framing:

Curiously, a new character is emerging in the series: Ubisoft itself, presented mostly in the form of self-parody in the guise of a fictional video game company, Abstergo Entertainment. We can play small sections as a developer in Abstergo’s Montreal headquarters. Our job is to help turn Kenway’s life — mined through DNA-sniffing gadgetry — into a mass-market video game adventure. We can also read management’s emails. The team debates whether games of this type could sell well if they focused more on peaceful, uplifting moments of humanity. Conflict is needed, someone argues. Violence sells.

It turns out that Abstergo is also a front for the villainous Templars, who search for history’s secrets when not creating entertainment to numb the population. In these sections, Ubisoft almost too cheekily aligns itself with the bad guys and justifies its inevitable 2015 Assassin’s Creed, set during yet another violent moment in world history.

  • Speaking of postmodern, self-referential, meta-fictional video games: The Stanley Parable was released late last month. There has already been a bevy of analysis written about the game, but I am waiting for the Mac release to play it and doing my best to avoid spoilers in the meantime. Brenna Hillier's post at VG24/7 is spoiler free (assuming you are at least familiar with the game's premise, or its original incarnation as a Half Life mod), and calls The Stanley Parable "a reaction against, commentary upon, critique and celebration of narrative-driven game design":

The Stanley Parable wants you to think about it. The Stanley Parable, despite its very limited inputs (you can’t even jump, and very few objects are interactive) looks at those parts of first-person gaming that are least easy to design for – exploration and messing with the game’s engine – and foregrounds them. It takes the very limitations of traditional gaming narratives and uses them to ruthlessly expose their own flaws.

Roy’s research focus prior to founding Bluefin, and continued interest while running the company, has to do with how both artificial and human intelligences learn language. In studying this process, he determined that the most important factor in meaning making was the interaction between human beings: no one learns language in a vacuum, after all. That lesson helped inform his work at Twitter, which started with mapping the connection between social network activity and live broadcast television.

Aspiring to cinematic qualities is not bad in and of itself, nor do I mean to shame fellow game writers, but developers and their attendant press tend to be myopic in their point of view, both figuratively and literally. If we continually view videogames through a monocular lens, we miss much of their potential. And moreover, we begin to use ‘cinematic’ reflexively without taking the time to explain what the hell that word means.

Metaphor is a powerful tool. Thinking videogames through other media can reframe our expectations of what games can do, challenge our design habits, and reconfigure our critical vocabularies. To crib a quote from Andy Warhol, we get ‘a new idea, a new look, a new sex, a new pair of underwear.’ And as I hinted before, it turns out that fashion and videogames have some uncanny similarities.

Zombies started their life in the Hollywood of the 1930s and ‘40s as simplistic stand-ins for racist xenophobia. Post-millennial zombies have been hot-rodded by Danny Boyle and made into a subversive form of utopia. That grim utopianism was globalized by Max Brooks, and now Brad Pitt and his partners are working to transform it into a global franchise. But if zombies are to stay relevant, it will rely on the shambling monsters' ability to stay subversive – and real subversive shocks and terror are not dystopian. They are utopian.

Ironically, our bodies now must make physical contact with devices dictating access to the real; Apple’s Touch ID sensor can discern for the most part if we are actually alive. This way, we don’t end up trying to find our stolen fingers on the black market, or prevent others from 3D scanning them to gain access to our lives.

This is a monumental shift from when Apple released its first iPhone just six years ago. It’s a touchy subject: fingerprinting authentication means we confer our trust in an inanimate object to manage our animate selves - our biology is verified, digitised, encrypted, as they are handed over to our devices.

Can you really buy heroin on the Web as easily as you might purchase the latest best-seller from Amazon? Not exactly, but as the FBI explained in its complaint, it wasn't exactly rocket science, thanks to Tor and some bitcoins. Here's a rundown of how Silk Road worked before the feds swooped in.

  • Henry Jenkins posted the transcript of an interview with Mark J.P. Wolf. The theme of the discussion is "imaginary worlds," and they touch upon the narratology vs. ludology conflict in gaming:

The interactivity vs. storytelling debate is really a question of the author saying either “You choose” (interaction) or “I choose” (storytelling) regarding the events experienced; it can be all of one or all of the other, or some of each to varying degrees; and even when the author says “You choose”, you are still choosing from a set of options chosen by the author.  So it’s not just a question of how many choices you make, but how many options there are per choice.  Immersion, however, is a different issue, I think, which does not always rely on choice (such as immersive novels), unless you want to count “Continue reading” and “Stop reading” as two options you are constantly asked to choose between.

Warren Ellis on violent fiction, death of the Western, Leatherface as model vegan

As we learn early on, the movie’s killers, the murderous Sawyer family (comprised of Leatherface, Grandpa, et al), used to run a slaughterhouse, and the means they use to slaughter their victims are the same as those used to slaughter cattle. They knock them over the head with sledgehammers, hang them on meat hooks, and stuff them into freezers. Often this takes place as the victims are surrounded by animal bones, a detail that could be explained away as the evidence of their former occupation—except that the cries of farm animals (there are none around) are played over the scenes.

Through the past century of Western movies, we can trace America's self-image as it evolved from a rough-and-tumble but morally confident outsider in world affairs to an all-powerful sheriff with a guilty conscience. After World War I and leading into World War II, Hollywood specialized in tales of heroes taking the good fight to savage enemies and saving defenseless settlements in the process. In the Great Depression especially, as capitalism and American exceptionalism came under question, the cowboy hero was often mistaken for a criminal and forced to prove his own worthiness--which he inevitably did. Over the '50s, '60s, and '70s however, as America enforced its dominion over half the planet with a long series of coups, assassinations, and increasingly dubious wars, the figure of the cowboy grew darker and more complicated. If you love Westerns, most of your favorites are probably from this era--Shane, The Searchers, Butch Cassidy and the Sundance Kid, McCabe & Mrs. Miller, the spaghetti westerns, etc. By the height of the Vietnam protest era, cowboys were antiheroes as often as they were heroes.

The dawn of the 1980s brought the inauguration of Ronald Reagan and the box-office debacle of the artsy, overblown Heaven's Gate. There's a sense of disappointment to the decade that followed, as if the era of revisionist Westerns had failed and a less nuanced patriotism would have to carry the day. Few memorable Westerns were made in the '80s, and Reagan himself proudly associated himself with an old-fashioned, pre-Vietnam cowboy image. But victory in the Cold War coincided with a revival of the genre, including the revisionist strain, exemplified in Clint Eastwood's career-topping Unforgiven. A new, gentler star emerged in Kevin Costner, who scored a post-colonial megahit with Dances With Wolves. Later, in the 2000s, George W. Bush reclaimed the image of the cowboy for a foreign policy far less successful than Reagan's, and the genre retreated to the art house again.

Westerns are fundamentally about political isolation. The government is far away and weak. Institutions are largely irrelevant in a somewhat isolated town of 100 people. The law is what the sheriff says it is, or what the marshall riding through town says, or the posse. At that scale, there may be no meaningful distinction between war and crime. A single individual's choices can tilt the balance of power. Samurai and Western stories cross-pollinated because when you strip away the surface detail the settings are surprisingly similar. The villagers in Seven Samurai and the women in Unforgiven are both buying justice/revenge because there is no one to appeal to from whom they could expect justice. Westerns are interesting in part because they are stories where individual moral judgment is almost totally unsupported by institutions.

Westerns clearly are not dying. We get a really great film in the genre once every few years. However, they've lost a lot of their place at the center of pop culture because the idea of an isolated community has grown increasingly implausible. In what has become a surveillance state, the idea of a place where the state has no authority does not resonate as relevant.

The function of fiction is being lost in the conversation on violence. My book editor, Sean McDonald, thinks of it as “radical empathy.” Fiction, like any other form of art, is there to consider aspects of the real world in the ways that simple objective views can’t — from the inside. We cannot Other characters when we are seeing the world from the inside of their skulls. This is the great success of Thomas Harris’s Hannibal Lecter, both in print and as so richly embodied by Mads Mikkelsen in the Hannibal television series: For every three scary, strange things we discover about him, there is one thing that we can relate to. The Other is revealed as a damaged or alienated human, and we learn something about the roots of violence and the traps of horror.

Rushkoff on Manning verdict, Chomsky/Žižek on NSA leaks, looking for McLuhan in Afghanistan

We are just beginning to learn what makes a free people secure in a digital age. It really is different. The Cold War was an era of paper records, locked vaults and state secrets, for which a cloak-and-dagger mindset may have been appropriate. In a digital environment, our security comes not from our ability to keep our secrets but rather our ability to live our truth.

In light of the recent NSA surveillance scandal, Chomsky and Žižek offer us very different approaches, both of which are helpful for leftist critique. For Chomsky, the path ahead is clear. Faced with new revelations about the surveillance state, Chomsky might engage in data mining, juxtaposing our politicians' lofty statements about freedom against their secretive actions, thereby revealing their utter hypocrisy. Indeed, Chomsky is a master at this form of argumentation, and he does it beautifully in Hegemony or Survival when he contrasts the democratic statements of Bush regime officials against their anti-democratic actions. He might also demonstrate how NSA surveillance is not a strange historical aberration but a continuation of past policies, including, most infamously, the FBI's counter intelligence programme in the 1950s, 60s, and early 70s.

Žižek, on the other hand, might proceed in a number of ways. He might look at the ideology of cynicism, as he did so famously in the opening chapter of The Sublime Object of Ideology, in order to demonstrate how expressions of outrage regarding NSA surveillance practices can actually serve as a form of inaction, as a substitute for meaningful political struggle. We know very well what we are doing, but still we are doing it; we know very well that our government is spying on us, but still we continue to support it (through voting, etc). Žižek might also look at how surveillance practices ultimately fail as a method of subjectivisation, how the very existence of whistleblowers like Thomas Drake, Bradley Manning, Edward Snowden, and the others who are sure to follow in their footsteps demonstrates that technologies of surveillance and their accompanying ideologies of security can never guarantee the full participation of the people they are meant to control. As Žižek emphasises again and again, subjectivisation fails.

In early 2011, award-winning photographer Rita Leistner was embedded with a U.S. marine battalion deployed to Helmand province as a member of Project Basetrack, an experiment in using new technologies in social media to extend traditional war reporting. This new LRC series draws on Leistner’s remarkable iPhone photos and her writings from her time in Afghanistan to use the ideas of Marshall McLuhan to make sense of what she saw there – “to examine the face of war through the extensions of man.”

Epic EVE battle, Critical games criticism, indie developer self-publishing

Update, 9:18PM ET: The battle is over. After more than five hours of combat, the CFC has defeated TEST Alliance. Over 2,900 ships were destroyed today in the largest fleet battle in Eve Online's history. TEST Alliance intended to make a definitive statement in 6VDT, but their defeat at the hands of the CFC was decisive and will likely result in TEST's withdrawal from the Fountain region.

In a conversation with Whitten, he told us that the commitment to independent developers is full. There won't be restrictions on the type of titles that can be created, nor will there be limits in scope. In response to a question on whether retail-scale games could be published independently, Whitten told us, "Our goal is to give them access to the power of Xbox One, the power of Xbox Live, the cloud, Kinect, Smartglass. That's what we think will actually generate a bunch of creativity on the system." With regard to revenue splitting with developers, we were told that more information will be coming at Gamescom, but that we could think about it "generally like we think about Marketplace today." According to developers we've spoken with, that split can be approximately 50-50.

Another difference between the Xbox One and Xbox 360 is how the games will be published and bought by other gamers. Indie games will not be relegated to the Xbox Live Indie Marketplace like on the Xbox 360 or required to have a Microsoft-certified publisher to distribute physically or digitally outside the Indie Marketplace. All games will be featured in one big area with access to all kinds of games.

If anything has hurt modern video game design over the past several years, it has been the rise of 'freemium'. It seems that it is rare to see a top app or game in the app stores that has a business model that is something other than the 'free-to-play with in-app purchases' model. It has been used as an excuse to make lazy, poorly designed games that are predicated on taking advantage of psychological triggers in its players, and will have negative long term consequences for the video game industry if kept unchecked.

Many freemium games are designed around the idea of conditioning players to become addicted to playing the game. Many game designers want their games to be heavily played, but in this case the freemium games are designed to trigger a 'reward' state in the player's brain in order to keep the player playing (and ultimately entice the user to make in-app purchases to continue playing). This type of conditioning is often referred to as a 'Skinner box', named after the psychologist that created laboratory boxes used to perform behavioral experiments on animals.

It obviously isn’t beyond the realm of possibility that, not only do financial considerations influence a game’s structure and content, financial outcomes affect a studio’s likelihood of survival in the industry, based upon the machinations of its publishing overlords. Activision killed Bizarre Creations, Eidos ruined Looking Glass Studios, EA crushed Westwood, Pandemic, Bullfrog, Origin Systems… well, the list could go on, until I turn a strange, purple color, but you get my point. And, when 3.4 million copies sold for a Tomb Raider reboot isn’t enough by a publisher’s standards, you can’t help but feel concern for a developer’s future.

This relationship between environment-learner-content interaction and transfer puts teachers in the unique position to capitalize on game engagement to promote reflection that positively shapes how students tackle real-world challenges. To some, this may seem like a shocking concept, but it’s definitely not a new one—roleplay as instruction, for example, was very popular among the ancient Greeks and, in many ways, served as the backbone for Plato’s renowned Allegory of the Cave. The same is true of Shakespeare’s works, 18th and 19th century opera, and many of the novels, movies, and other media that define our culture. More recently, NASA has applied game-like simulations to teach astronauts how to maneuver through space, medical schools have used them to teach robotic surgery, and the Federal Aviation Administration has employed them to test pilots.

The relationship between the creator, the product, and the audience, are all important contexts to consider during media analysis, especially with games. This is because the audience is an active participant in the media. So if you are creating a game you always have to keep in mind the audience. Even if you say the audience doesn’t matter to you, it won’t cease to exist, and it does not erase the impact your game will have.

Similarly, if you are critiquing or analyzing any media, you can’t ignore the creator and the creator’s intentions. Despite those who claim the “death of the author,” if the audience is aware of the creator’s intentions, it can affect how they perceive the game. Particularly, if you consider the ease in which creators can release statements talking about their work, you’ll have an audience with varying levels of awareness about the creator’s intentions. These factors all play off of each other–they do not exist in a vacuum.

When we talk about any medium’s legitimacy, be it film or videogames or painting, it’s a very historical phenomenon that is inextricably tied to its artness that allows for them to get in on the ground floor of “legitimate” and “important.” So if we contextualize the qualities that allowed for film or photography to find themselves supported through a panoply of cultural institutions it was a cultural and political economic process that led them there.


[...]

Videogames, the kind that would be written about in 20 dollar glossy art magazines, would be exactly this. When creators of videogames want to point to their medium’s legitimacy, it would help to have a lot of smart people legitimate your work in a medium (glossy magazines, international newspapers) that you consider to be likewise legitimate. Spector concedes that ‘yes all the critics right now are online’, but the real battle is in getting these critics offline and into more “legitimate” spaces of representation. It’s a kind of unspoken hierarchy of mediums that is dancing before us here: at each step a new gatekeeper steps into play, both legitimating and separating the reader from the critic and the object of criticism.

All three games define fatherhood around the act of protection, primarily physical protection. And in each of these games, the protagonist fails—at least temporarily—to protect their ward. In Ethan’s case, his cheery family reflected in his pristine home collapses when he loses a son in a car accident. Later, when his other son goes missing, the game essentially tests Ethan’s ability to reclaim his protective-father status.

No video game grants absolute freedom; they all have rules or guidelines that govern what you can and can’t do. The sci-fi epic Mass Effect is a series that prides itself on choice, but even that trilogy ends on a variation of choosing between the “good” and “bad” ending. Minecraft, the open-world creation game, is extremely open-ended, but you can’t build a gun or construct a tower into space because it doesn’t let you. BioShock’s ending argues that the choices you think you’re making in these games don’t actually represent freedom. You’re just operating within the parameters set by the people in control, be they the developers or the guy in the game telling you to bash his skull with a golf club.

BioShock’s disappointing conclusion ends up illustrating Ryan’s point. A man chooses, a player obeys. It’s a grim and cynical message that emphasizes the constraints of its own art form. And given that the idea of choice is so important to BioShock’s story, I don’t think it could’ve ended any other way.

 

The Ideology of Scarface, Community as PoMo masterpiece, Present Shock reviewed, etc.

http://www.youtube.com/watch?v=YanhEVEgkYI

http://www.youtube.com/watch?v=7MCmBHPqz6I

In The Godfather, the blurring of the line between crime and the “legitimate” economy can still seem shocking. In Scarface, the distinction seems quaintly naïve. In The Godfather, Don Vito almost loses everything over his refusal to deal in heroin. In Scarface, Tony Montana knows that coke is just another commodity in a boom economy. Michael Corleone marries the wispy, drooping Kate Adams to give his enterprise some old-fashioned, WASP class. When Tony Montana takes possession of the coked-up bombshell called Elvira Hancock, he is filling his waterbed with cash, not class. Even more excruciatingly, Scarface tells us these truths without any self-righteousness, without the consoling promise that manly discipline can save America from its fate. In the moral economy of this movie, the terms of critique have become indistinguishable from the terms of affirmation. “You know what capitalism is?” Tony answers his own question: “Getting fucked.”

Donovan put Neumann in charge of the Research and Analysis Branch of the OSS, studying Nazi-ruled central Europe. Neumann was soon joined by the philosopher Herbert Marcuse and the legal scholar Otto Kirchheimer, his colleagues at the left-wing Institute for Social Research, which had been founded in Frankfurt in 1923 but had moved to Columbia University after the Nazis came to power.

An update of the promise, that the media could create a different, even a better world, seems laughable from our perspective of experience with the technologically based democracies of markets. As a utopia-ersatz, this promise appears to be obsolete in the former hegemonial regions of North America and western and northern Europe. Now that it is possible to create a state with media, they are no longer any good for a revolution. The media are an indispensable component of functioning social hierarchies, both from the top down and the bottom up, of power and countervailing power. They have taken on systemic character. Without them, nothing works anymore in what the still surviving color supplements in a careless generalization continue to call a society. Media are an integral part of the everyday coercive context, which is termed “practical constraints.” As cultural techniques, which need to be learned for social fitness, they are at the greatest possible remove from what whips us into a state of excitement, induces aesthetic exultation, or triggers irritated thoughts.

[...]

At the same time, many universities have established courses in media design, media studies, and media management. Something that operates as a complex, dynamic, and edgy complex between the discourses, that is, something which can only operate interdiscursively, has acquired a firm and fixed place in the academic landscape. This is reassuring and creates professorial chairs, upon which a once anarchic element can be sat out and developed into knowledge for domination and control. Colleges and academies founded specifically for the media proactively seek close relationships with the industries, manufacturers, and the professional trades associations of design, orientation, and communication.

There are five ways Rushkoff thinks present shock is being experienced and responded to. To begin, we are in an era in which he thinks narrative has collapsed. For as long as we have had the power of speech we have corralled time into linear stories with a beginning, middle and ending. More often than not these stories contained some lesson. They were not merely forms of entertainment or launching points for reflection but contained some guidance as to how we should act in a given circumstance, which, of course, differed by culture, but almost all stories were in effect small oversimplified models of real life.

[...]

The medium Rushkoff thinks is best adapted to the decline of narrative is video games. Yes, they are more often than not violent, but they also seem tailor-made for the kinds of autonomy and collaborative play that are the positive manifestations of our new presentism.

 

2nd Update: Žižek responds to Chomsky's "Fantasies"

  • Žižek v. Chomsky continues: Žižek has responded to Chomsky's last comment in an article in the International Journal of Žižek Studies. You can read the entire article here; select excerpts follow. I am particularly interested in how Žižek focuses on conflicting definitions of ideology as a key factor in Chomsky's misunderstanding of Žižek's work:

For me, on the contrary, the problem is here a very rational one: everything hinges on how we define “ideology.”

[...]

This bias is ideology - a set of explicit and implicit, even unspoken, ethico-political and other positions, decision, choices, etc., which predetermine our perception of facts, what we tend to emphasize or to ignore, how we organize facts into a consistent whole of a narrative or a theory.

  • After a rational and diplomatic refutation of Chomsky's comments, Žižek ends the essay with a parting blow:

Chomsky obviously doesn’t agree with me here. So what if – just another fancy idea of mine – what if Chomsky can not find anything in my work that goes “beyond the level of something you can explain in five minutes to a twelve-year-old” because, when he deals with continental thought, it is his mind which functions as the mind of a twelve-years-old, the mind which is unable to distinguish serious philosophical reflection from empty posturing and playing with empty words?

 

Hollywood implosion: end of an era?

  • Last month Steven Spielberg and George Lucas caused a bit of a stir when they predicted an impending "implosion" of Hollywood that would forever alter the filmmaking industry. Speaking at a USC event, Spielberg posited a scenario in which a series of big budget flops would necessitate a change in the Hollywood business model:

"That's the big danger, and there's eventually going to be an implosion — or a big meltdown. There's going to be an implosion where three or four or maybe even a half-dozen megabudget movies are going to go crashing into the ground, and that's going to change the paradigm."

People complain that The Lone Ranger is boring, that it's almost totally devoid of fun except for the final 10 minutes, that it's ridiculously violent and yet inert. And all of these things are true — but you have to understand, it's all part of a calculated strategy, to sink far enough to burrow all the way to the infarcted heart of the terrible superhero origin story.

The goal is to show you who is to blame for the crappiness of so many superhero origin movies — you — and to punish you for allowing movies like The Lone Ranger to exist.

[...]

We tend to think of superhero movies as power fantasies, in which the use of America's status as a superpower is reflected by the hero struggling to use his or her power responsibly. But Lone Ranger seems to be making the case that the real seductive fantasy of these stories is absolution from blame — the Lone Ranger gets the Native American seal of approval from Tonto, as long as he's wearing the mask. He gets surcease from America's original sin.

Žižek's guide to ideology, Netflix tackles TV, digital dualist conservatism

  • A review of The Pervert's Guide to Ideology, Sophie Fiennes' documentary built around Žižek's readings of popular film:

Other ideological “masterpieces” that Žižek points to are much subtler, precisely because they occupy more prominent positions in the western cultural imaginary. He reads Jaws as a condensation of all the “foreign invaders” that privileged societies like upper-middle-class America worry will disrupt their peaceful communities. Part of what makes Fiennes’ film such a great showcase for Žižek’s approach to cultural studies is the persuasive effect of supplementing his explications with film clips. After listening to Žižek’s account of the ideological coordinates of the film, it’s difficult not to notice that all of the beach-goers scrambling to make it to the shore in one piece are affluent white Americans.

  • Writing for Memeburn, Michelle Atagana considers the strategies employed by Netflix in trying to “win television”. The strategies include producing original content, feeding binge habits, and using product placement.

If Netflix refines its model and signs on more shows, chances are it will make a formidable foe of big cable players such as HBO. The model that the company is currently working could also be exported to film, essentially making the next cinematic experience wherever, whenever and on whatever device the audience wants.

  • The Society Pages’ Cyborgology blog is one of my favorite resources for probing and provocative analysis of new media issues from a sociological perspective. One of the most interesting concepts considered by the blog's contributors is the notion of Digital Dualism. A recent post by Jesse Elias Spafford refines the digital dualism concept:

I posit that digital dualism, in fact, draws from both the ontological and the normative analyses. Specifically the digital dualist:

  1. Establishes an ontological distinction that carves up the world into two mutually exclusive (and collectively exhaustive) categories—at least one of which is somehow bound up with digital technology (e.g., that which is “virtual” vs. that which is “real”.)

  2. Posits some normative criteria that privileges one category over the other. (In most cases, it is the non-technological category that is deemed morally superior. However, charges of digital dualism would equally apply to views that favored the technological.)

After Earth's ideology, Assange on the new digital age, Voyager re-explored

  • The new M. Night Shyamalan film and Smith dynasty vehicle After Earth underperformed at the box office last weekend, opening in third place. Neither the film nor its box office numbers interest me, but elements of its inception and marketing are curious. Up until a few years ago Shyamalan's name featured prominently in promotional materials for his films (most recently in 2010 for the Shyamalan-directed Last Airbender and the Shyamalan-produced Devil). Yet during the months of promotion for After Earth the director's name wasn't mentioned. In a piece on the Mother Jones site Asawin Suebsaeng refers to Shyamalan as "he who must not be named", and asserts that the director's decline from "the next Hitchcock" to a "critical and pop-cultural punchline" made his association with the movie a liability for the studio.
Much in the same way that a marketing campaign will go out of its way not to use the word "gay" when promoting a film about two despondent gay cowboys, the marketing campaign for After Earth has gone out of its way not to mention the words "M. Night Shyamalan." That sort of tells you everything you need to know about how highly Sony thinks of the 42-year-old director and his current standing.
  • Other commentators have focused on what influence star Will Smith's affiliation with Scientology may have had on the film. (At the end of last year, when trailers for After Earth and Oblivion were both playing before new releases, I noted not only the similarity between the films' post-apocalyptic-Earth plots, but also the fact that both movies starred prominent celebrity Scientologists.) The Hollywood Reporter ran an analysis of the film written by a former member of the church.
Will Smith’s character is pretty much devoid of all emotions for the entire movie. While this may be part of his character or something that was directed in the script, in Scientology, one goes through great amounts of training and counseling to control one’s emotions and “mis-emotion,” as described by Hubbard. Anyone who has done even the smallest amount of Scientology training will recall sitting and staring at a person for hours on end without being allowed to blink, smile or turn one’s head. Will Smith pretty much masters that for the entirety of this movie.
Without being too obvious, Smith has delivered an incredibly mainstream platform for the Church's ideology. After Earth’s subtext makes every beat feel like a nod to the lessons of L. Ron Hubbard. Fleeing Earth to another planet only to return to home mirrors the idea of thetan resurrection. The ship Cypher and Kitai take on their mission isn't that far off from the Douglas DC-8–esque ship that took Xenu's kidnapped souls to earth. And the prominently advertised volcano that functions as a backdrop to a large After Earth set piece? Just look at the cover to Hubbard's book that started it all —Dianetics.
If After Earth were intentional propaganda, it would be an even bigger failure than it already is – the path to self-enlightenment is reduced to an overlong, tedious quest to find shit. Who wants to join that club? For the strong-willed, fear may be a choice, but for everyone else this weekend, avoiding boredom is an even clearer choice.
  • Perhaps The Onion's analysis has it right, and audiences found the gimmick of Smith-and-son starring in a movie "more annoying than appealing".
Let’s just say, for argument’s sake, that I was an average, everyday American consumer. Would I enjoy seeing an incredibly rich and famous man use his money and power to make his children incredibly rich and famous? Would I enjoy seeing the face of a young teenager plastered on movie posters across the entire nation, not because of who he is, but because of who his father is? To be totally honest, I’m not so sure I would. In fact, it’s conceivable that I might find it unbelievably infuriating and downright unbearable.
  • Julian Assange reviewed The New Digital Age, the book by Google's Eric Schmidt and Jared Cohen:

This book is a balefully seminal work in which neither author has the language to see, much less to express, the titanic centralizing evil they are constructing. “What Lockheed Martin was to the 20th century,” they tell us, “technology and cybersecurity companies will be to the 21st.” Without even understanding how, they have updated and seamlessly implemented George Orwell’s prophecy. If you want a vision of the future, imagine Washington-backed Google Glasses strapped onto vacant human faces — forever.
  • Finally, a retrospective review of “Caretaker,” the Star Trek: Voyager pilot:

“Caretaker” invokes ’90s environmentalism, a superpower’s role as world police, and two oppositional parties working together to run that superpower as best as they can, but it’s nothing so much as a reminder of Gene Roddenberry’s Prime Directive. Starfleet is expressly prohibited from interfering with the progress of pre-warp societies. The Caretaker’s species had no such guidelines and nearly wiped out a whole species. Now, Voyager has the task of upholding Alpha Quadrant standards in the absence of Alpha Quadrant hierarchy.
