Curry Chandler

Curry Chandler is a writer, researcher, and independent scholar working in the field of communication and media studies. His writing on media theory and policy has been published in the popular press as well as academic journals. Curry approaches the study of communication from a distinctly critical perspective, and with a commitment to addressing inequality in power relations. The scope of his research activity includes media ecology, political economy, and the critique of ideology.

Curry is a graduate student in the Communication Department at the University of Pittsburgh, having previously earned degrees from Pepperdine University and the University of Central Florida.

Mike Gane interview: Baudrillard, academia, more

  • The upcoming issue of the International Journal of Baudrillard Studies features an interview with Baudrillard scholar Mike Gane. The interview touches upon a variety of topics, including Gane's interactions with Baudrillard, media coverage of Margaret Thatcher's death, and hypothesizing what Baudrillard would be writing about were he alive today:

One could ‘see’ the specific things Baudrillard would have picked up – extreme phenomena like sovereign debt. Today he would be writing on fracking, drones, etc.

Gane also addresses the present state of academia:

The essential point is that the whole educational experience has changed, and the student has become oriented to enterprise, and to developing, accumulating, human capital. The student gets used to appraising the lecturer’s performance just as the lecturer grades the student, and the Sunday Times grades the university. So, all the discussion about declining standards focuses on the wrong issue. What has happened is a transformation of individualism, not towards a new freedom in the classical liberal sense, but towards a new individual who builds up capital and exploits this competitively. The university staff members are equally thrown into a competitive game network, where to outperform others is essential to survival. Almost everything is assessed and ranked with a degree of Kafkaesque bureaucratisation that is hardly believable. Whereas the system of 40 years ago was simple and relaxed, with liberal values, and within it there were known traditional hierarchies, today it is hyper-bureaucratised and hyper-legalised and the hierarchies have changed and keep changing. Thus to understand what has happened it is essential to see that neoliberalism does not diminish the action of the state; it avoids direct state intervention but only to insert new mechanisms and values insidiously where none existed before: for example, in Britain it is only now, forty years after the initial entry of neoliberalism, that an enterprise element is being required on each degree course, and that an enterprise element is to be counted within the work profile of academics. And these new mechanisms do not stand still; the system is in constant movement, as if in permanent crisis. This is why Baudrillard, and others like Žižek, have described this as a new totalitarianism which works not by imposing a system of commands but rather a game framework into which the individual is absorbed and has to adapt at a moment’s notice.

  • In a recent Atlantic article Ian Bogost considered the McRib sandwich through the lens of Lacanian psychoanalysis. The aphoristic ending of the essay recalls the Baudrillardian turn on the function of Disneyland and prisons:

Yet, the McRib’s perversity is not a defect, but a feature. The purpose of the McRib is to make the McNugget seem normal.

Ender's Game analyzed, the Stanley Parable explored, Political Economy of zombies, semiotics of Twitter, much more

It's been a long time since the last update (what happened to October?), so this post is extra long in an attempt to catch up.

In a world in which interplanetary conflicts play out on screens, the government needs commanders who will never shrug off their campaigns as merely “virtual.” These same commanders must feel the stakes of their simulated battles to be as high as actual warfare (because, of course, they are). Card’s book makes the nostalgic claim that children are useful because they are innocent. Hood’s movie leaves nostalgia by the roadside, making the more complex assertion that they are useful because of their unique socialization to be intimately involved with, rather than detached from, simulations.

  • In the ongoing discourse about games criticism and its relation to film reviews, Bob Chipman's latest Big Picture post uses his own review of the Ender's Game film as an entry point for a breathless treatise on criticism. The video presents a concise and nuanced overview of arts criticism, from the classical era through film reviews as consumer reports up to the very much in-flux conceptions of games criticism. Personally I find this video sub-genre (where spoken content is crammed into a Tommy gun barrage of word bullets so that the narrator can convey a lot of information in a short running time) irritating and mostly worthless, since the verbal information is being presented faster than the listener can really process it. It reminds me of Film Crit Hulk, someone who writes excellent essays with obvious insight into filmmaking, but whose aesthetic choice (or "gimmick") to write in all caps is often a distraction from the content and a deterrent to readers. Film Crit Hulk has of course addressed this issue and explained the rationale for this choice, but considering that his more recent articles have dropped the third-person "Hulk speak" writing style, the all caps seems to be played out. Nevertheless, I'm sharing the video because Mr. Chipman makes a lot of interesting points, particularly regarding the cultural contexts for the various forms of criticism. Just remember to breathe deeply and monitor your heart rate while watching.

  • This video from Satchbag's Goods is ostensibly a review of Hotline Miami, but develops into a discussion of art movements and Kanye West:

  • This short interview with Slavoj Žižek in New York magazine continues a trend I've noticed since Pervert's Guide to Ideology began its release, wherein writers interviewing Žižek feel compelled to include themselves and their reactions to/interactions with Žižek in their articles. Something about a Žižek encounter brings out the gonzo in journalists. The NY mag piece is also notable for this succinct positioning of Žižek's contribution to critical theory:

Žižek, after all, the Yugoslav-born, Ljubljana-based academic and Hegelian; mascot of the Occupy movement, critic of the Occupy movement; and former Slovenian presidential candidate, whose most infamous contribution to intellectual history remains his redefinition of ideology from a Marxist false consciousness to a Freudian-Lacanian projection of the unconscious. Translation: To Žižek, all politics—from communist to social-democratic—are formed not by deliberate principles of freedom, or equality, but by expressions of repressed desires—shame, guilt, sexual insecurity. We’re convinced we’re drawing conclusions from an interpretable world when we’re actually just suffering involuntary psychic fantasies.

Following the development of the environment on the team's blog you can see some of the gaps between what data was deemed noteworthy or worth recording in the seventeenth century and the level of detail we now expect in maps and other infographics. For example, the team struggled to pinpoint the exact location on Pudding Lane of the bakery where the Great Fire of London is thought to have originated and so just ended up placing it halfway along.

  • Stephen Totilo reviewed the new pirate-themed Assassin's Creed game for the New York Times. I haven't played the game, but I love that the sections of the game set in the present day have shifted from the standard global conspiracy tropes seen in the earlier installments to postmodern, self-referential, and meta-fictional framing:

Curiously, a new character is emerging in the series: Ubisoft itself, presented mostly in the form of self-parody in the guise of a fictional video game company, Abstergo Entertainment. We can play small sections as a developer in Abstergo’s Montreal headquarters. Our job is to help turn Kenway’s life — mined through DNA-sniffing gadgetry — into a mass-market video game adventure. We can also read management’s emails. The team debates whether games of this type could sell well if they focused more on peaceful, uplifting moments of humanity. Conflict is needed, someone argues. Violence sells.

It turns out that Abstergo is also a front for the villainous Templars, who search for history’s secrets when not creating entertainment to numb the population. In these sections, Ubisoft almost too cheekily aligns itself with the bad guys and justifies its inevitable 2015 Assassin’s Creed, set during yet another violent moment in world history.

  • Speaking of postmodern, self-referential, meta-fictional video games: The Stanley Parable was released late last month. There has already been a bevy of analysis written about the game, but I am waiting for the Mac release to play it and doing my best to avoid spoilers in the meantime. Brenna Hillier's post at VG24/7 is spoiler-free (assuming you are at least familiar with the game's premise, or its original incarnation as a Half-Life mod), and calls The Stanley Parable "a reaction against, commentary upon, critique and celebration of narrative-driven game design":

The Stanley Parable wants you to think about it. The Stanley Parable, despite its very limited inputs (you can’t even jump, and very few objects are interactive) looks at those parts of first-person gaming that are least easy to design for – exploration and messing with the game’s engine – and foregrounds them. It takes the very limitations of traditional gaming narratives and uses them to ruthlessly expose their own flaws.

Roy’s research focus prior to founding Bluefin, and continued interest while running the company, has to do with how both artificial and human intelligences learn language. In studying this process, he determined that the most important factor in meaning-making was the interaction between human beings: no one learns language in a vacuum, after all. That lesson helped inform his work at Twitter, which started with mapping the connection between social network activity and live broadcast television.

Aspiring to cinematic qualities is not bad in and of itself, nor do I mean to shame fellow game writers, but developers and their attendant press tend to be myopic in their point of view, both figuratively and literally. If we continually view videogames through a monocular lens, we miss much of their potential. And moreover, we begin to use ‘cinematic’ reflexively without taking the time to explain what the hell that word means.

Metaphor is a powerful tool. Thinking videogames through other media can reframe our expectations of what games can do, challenge our design habits, and reconfigure our critical vocabularies. To crib a quote from Andy Warhol, we get ‘a new idea, a new look, a new sex, a new pair of underwear.’ And as I hinted before, it turns out that fashion and videogames have some uncanny similarities.

Zombies started their life in the Hollywood of the 1930s and ‘40s as simplistic stand-ins for racist xenophobia. Post-millennial zombies have been hot-rodded by Danny Boyle and made into a subversive form of utopia. That grim utopianism was globalized by Max Brooks, and now Brad Pitt and his partners are working to transform it into a global franchise. But if zombies are to stay relevant, it will rely on the shambling monsters' ability to stay subversive – and real subversive shocks and terror are not dystopian. They are utopian.

Ironically, our bodies now must make physical contact with devices dictating access to the real; Apple’s Touch ID sensor can discern for the most part if we are actually alive. This way, we don’t end up trying to find our stolen fingers on the black market, or prevent others from 3D scanning them to gain access to our lives.

This is a monumental shift from when Apple released its first iPhone just six years ago. It’s a touchy subject: fingerprinting authentication means we confer our trust in an inanimate object to manage our animate selves - our biology is verified, digitised, encrypted, as they are handed over to our devices.

Can you really buy heroin on the Web as easily as you might purchase the latest best-seller from Amazon? Not exactly, but as the FBI explained in its complaint, it wasn't exactly rocket science, thanks to Tor and some bitcoins. Here's a rundown of how Silk Road worked before the feds swooped in.

  • Henry Jenkins posted the transcript of an interview with Mark J.P. Wolf. The theme of the discussion is "imaginary worlds," and they touch upon the narratology vs. ludology conflict in gaming:

The interactivity vs. storytelling debate is really a question of the author saying either “You choose” (interaction) or “I choose” (storytelling) regarding the events experienced; it can be all of one or all of the other, or some of each to varying degrees; and even when the author says “You choose”, you are still choosing from a set of options chosen by the author.  So it’s not just a question of how many choices you make, but how many options there are per choice.  Immersion, however, is a different issue, I think, which does not always rely on choice (such as immersive novels), unless you want to count “Continue reading” and “Stop reading” as two options you are constantly asked to choose between.

Manifesto for a Ludic Century, ludonarrative dissonance in GTA, games and mindf*cks, and more

Systems, play, design: these are not just aspects of the Ludic Century, they are also elements of gaming literacy. Literacy is about creating and understanding meaning, which allows people to write (create) and read (understand).

New literacies, such as visual and technological literacy, have also been identified in recent decades. However, to be truly literate in the Ludic Century also requires gaming literacy. The rise of games in our culture is both cause and effect of gaming literacy in the Ludic Century.

So, perhaps there is one fundamental challenge for the Manifesto for a Ludic Century: would a truly ludic century be a century of manifestos? Of declaring simple principles rather than embracing systems? Or, is the Ludic Manifesto meant to be the last manifesto, the manifesto to end manifestos, replacing simple answers with the complexity of "information at play?"

Might we conclude: videogames are the first creative medium to fully emerge after Marshall McLuhan. By the time they became popular, media ecology as a method was well-known. McLuhan was a popular icon. By the time the first generation of videogame players was becoming adults, McLuhan had become a trope. When the then-new publication Wired Magazine named him their "patron saint" in 1993, the editors didn't even bother to explain what that meant. They didn't need to.

By the time videogame studies became a going concern, McLuhan was gospel. So much so that we don't even talk about him. To use McLuhan's own language of the tetrad, game studies have enhanced or accelerated media ecology itself, to the point that the idea of studying the medium itself over its content has become a natural order.

Generally speaking, educators have warmed to the idea of the flipped classroom far more than that of the MOOC. That move might be injudicious, as the two are intimately connected. It's no accident that private, for-profit MOOC startups like Coursera have advocated for flipped classrooms, since those organizations have much to gain from their endorsement by universities. MOOCs rely on the short, video lecture as the backbone of a new educational beast, after all. Whether in the context of an all-online or a "hybrid" course, a flipped classroom takes the video lecture as a new standard for knowledge delivery and transfers that experience from the lecture hall to the laptop.

  • Also, with increased awareness of Animal Crossing following from the latest game's release for the Nintendo 3DS, Bogost recently posted an excerpt from his 2007 book Persuasive Games discussing consumption and naturalism in Animal Crossing:

Animal Crossing deploys a procedural rhetoric about the repetition of mundane work as a consequence of contemporary material property ideals. When my (then) five-year-old began playing the game seriously, he quickly recognized the dilemma he faced. On the one hand, he wanted to spend the money he had earned from collecting fruit and bugs on new furniture, carpets, and shirts. On the other hand, he wanted to pay off his house so he could get a bigger one like mine.

Ludonarrative dissonance is when the story the game is telling you and your gameplay experience somehow don’t match up. As an example, this was a particular issue in Rockstar’s most recent game, Max Payne 3. Max constantly makes remarks about how terrible he is at his job, even though he does more than is humanly possible to try to protect his employers – including making perfect one-handed head shots in mid-air while drunk and high on painkillers. The disparity and the dissonance between the narrative of the story and the gameplay leave things feeling off kilter and poorly interconnected. It doesn’t make sense or fit with your experience, so it feels wrong and damages the cohesiveness of the game world and story. It’s like when you go on an old-lady-only murdering spree as Niko, who is supposed to be a reluctant killer with a traumatic past, not a gerontophobic misogynist.

What I find strange, in light of our supposed anti-irony cultural moment, is a kind of old-fashioned ironic conceit behind a number of recent critical darlings in the commercial videogame space. 2007's Bioshock and this year’s Bioshock: Infinite are both about the irony of expecting ‘meaningful choice’ to live in an artificial dome of technological and commercial constraints. Last year’s Spec Ops: The Line offers a grim alchemy of self-deprecation and preemptive disdain for its audience. The Grand Theft Auto series has always maintained a cool, dismissive cynicism beneath its gleefully absurd mayhem. These games frame choice as illusory and experience as artificial. They are expensive, explosive parodies of free will.

To cut straight to the heart of it, Bioshock seems to suffer from a powerful dissonance between what it is about as a game, and what it is about as a story. By throwing the narrative and ludic elements of the work into opposition, the game seems to openly mock the player for having believed in the fiction of the game at all. The leveraging of the game’s narrative structure against its ludic structure all but destroys the player’s ability to feel connected to either, forcing the player to either abandon the game in protest (which I almost did) or simply accept that the game cannot be enjoyed as both a game and a story, and to then finish it for the mere sake of finishing it.

The post itself makes a very important point: games, for the most part, can’t pull the Mindfuck like movies can because of the nature of the kind of storytelling to which most games are confined, which is predicated on a particular kind of interaction. Watching a movie may not be an entirely passive experience, but it’s clearly more passive than a game. You may identify with the characters on the screen, but you’re not meant to implicitly think of yourself as them. You’re not engaging in the kind of subtle roleplaying that most (mainstream) games encourage. You are not adopting an avatar. In a game, you are your profile, you are the character you create, and you are also to a certain degree the character that the game sets in front of you. I may be watching everything Lara Croft does from behind her, but I also control her; to the extent that she has choices, I make them. I get her from point A to B, and if she fails it’s my fault. When I talk about something that happened in the game, I don’t say that Lara did it. I say that I did.

Anachrony is a common storytelling technique in which events are narrated out of chronological order. A familiar example is a flashback, where story time jumps to the past for a bit, before returning to the present. The term "nonlinear narrative" is also sometimes used for this kind of out-of-order storytelling (somewhat less precisely).

While it's a common technique in literature and film, anachrony is widely seen as more problematic to use in games, perhaps even to the point of being unusable. If the player's actions during a flashback scene imply a future that differs considerably from the one already presented in a present-day scene (say, the player kills someone who they had been talking to in a present-day scene, or commits suicide in a flashback), this produces an inconsistent narrative. The root of the problem is that players generally have a degree of freedom of action, so flashbacks are less like the case in literature and film—where already decided events are simply narrated out of order—and more like time travel, where the player travels back in time and can mess up the timeline.
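To make that constraint concrete, here is a minimal sketch of one way a design could guard a playable flashback against contradicting the established present; the facts, actions, and function names are my own hypothetical illustrations, not taken from any shipped game:

```python
# Hypothetical sketch: validating player actions inside a flashback against
# facts the game has already established in present-day scenes.

# Facts the player has already seen "today" (illustrative examples).
ESTABLISHED_FACTS = {
    "mentor_alive_in_present": True,
    "protagonist_alive_in_present": True,
}

# Flashback actions mapped to the present-day fact each would contradict.
CONTRADICTIONS = {
    "kill_mentor": "mentor_alive_in_present",
    "commit_suicide": "protagonist_alive_in_present",
}

def attempt_flashback_action(action: str) -> str:
    """Allow an action only if it cannot rewrite the established timeline."""
    violated = CONTRADICTIONS.get(action)
    if violated and ESTABLISHED_FACTS.get(violated):
        # Block (or redirect) the action rather than letting the player
        # "time travel" and break narrative continuity.
        return f"blocked: would contradict '{violated}'"
    return "allowed"

if __name__ == "__main__":
    print(attempt_flashback_action("kill_mentor"))     # blocked
    print(attempt_flashback_action("pick_up_letter"))  # allowed
```

The point of the sketch is simply that the designer, not the player, ends up owning the timeline: freedom inside a flashback has to be fenced off wherever it could overwrite what the present-day scenes have already shown.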

The first books are set to be published in early 2014. Some of the writers that will be published by Press Select in its first round have written for publications like Edge magazine, Kotaku, Kill Screen and personal blogs; among them are Chris Dahlen, Michael Abbott, Jenn Frank, Jason Killingsworth, Maddy Myers, Tim Rogers, Patricia Hernandez and Robert Yang.

Inside Korea's gaming culture, virtual worlds and economic modeling, Hollywood's Summer of Doom continued, and more

  • I've long been fascinated by the gaming culture in South Korea, and Tom Massey has written a great feature piece for Eurogamer titled Seoul Caliber: Inside Korea's Gaming Culture. From the perspective of this westerner, who has never visited Korea, the article reads almost more like cyberpunk fiction than games journalism:

Not quite as ubiquitous, but still extremely common, are PC Bangs: LAN gaming hangouts where 1000 Won nets you an hour of multiplayer catharsis. In Gangnam's Maxzone, overhead fans rotate at Apocalypse Now speed, slicing cigarette smoke as it snakes through the blades. Korea's own NCSoft, whose European base is but a stone's throw from the Eurogamer offices, is currently going strong with its latest MMO, Blade & Soul.

"It's relaxing," says Min-Su, sipping a Milkis purchased from the wall-mounted vending machine. "And dangerous," he adds. "It's easy to lose track of time playing these games, especially when you have so much invested in them. I'm always thinking about achieving the next level or taking on a quick quest to try to obtain a weapon, and the next thing I know I've been here for half the day."

[youtube=http://www.youtube.com/watch?v=Kue_gd8DneU&w=420&h=315]

Creation and simulation in virtual worlds appear to offer the best domain to test the new ideas required to tackle the very real problems of deprivation, inequality, unemployment, and poverty that exist in national economies. On that note, the need to see our socioeconomic institutions for the games that they really are seems even more poignant.

In the words of Vili Lehdonvirta, a leading scholar in virtual goods and currencies, the suffering we see today is “not some consequence of natural or physical law” it instead “is a result of the way we play these games.”

The global economy seems to be bifurcating into a rich/tech track and a poor/non-tech track, not least because new technology will increasingly destroy/replace old non-tech jobs. (Yes, global. Foxconn is already replacing Chinese employees with one million robots.) So far so fairly non-controversial.

The big thorny question is this: is technology destroying jobs faster than it creates them?

[...]

We live in an era of rapid exponential growth in technological capabilities. (Which may finally be slowing down, true, but that’s an issue for decades hence.) If you’re talking about the economic effects of technology in the 1980s, much less the 1930s or the nineteenth century, as if it has any relevance whatsoever to today’s situation, then you do not understand exponential growth. The present changes so much faster that the past is no guide at all; the difference is qualitative, not just quantitative. It’s like comparing a leisurely walk to relativistic speeds.

We begin with a love story--from a man who unwittingly fell in love with a chatbot on an online dating site. Then, we encounter a robot therapist whose inventor became so unnerved by its success that he pulled the plug. And we talk to the man who coded Cleverbot, a software program that learns from every new line of conversation it receives...and that's chatting with more than 3 million humans each month. Then, five intrepid kids help us test a hypothesis about a toy designed to push our buttons, and play on our human empathy. And we meet a robot built to be so sentient that its creators hope it will one day have a consciousness, and a life, all its own.

[youtube=http://www.youtube.com/watch?v=pHCwaaactyY&w=420&h=315]

"These outages are absolutely going to continue," said Neil MacDonald, a fellow at technology research firm Gartner. "There has been an explosion in data across all types of enterprises. The complexity of the systems created to support big data is beyond the understanding of a single person and they also fail in ways that are beyond the comprehension of a single person."

From high volume securities trading to the explosion in social media and the online consumption of entertainment, the amount of data being carried globally over the private networks, such as stock exchanges, and the public internet is placing unprecedented strain on websites and on the networks that connect them.

What I want is systems that have intrinsic rewards; that are disciplines similar to drawing or playing a musical instrument. I want systems which are their own reward.

What videogames almost always give me instead is labor that I must perform for an extrinsic reward. I want to convince you that not only is this not what I want, this isn’t really what anyone wants.

[youtube=http://www.youtube.com/watch?v=GpO76SkpaWQ&w=560&h=315]

This 'celebrification' is enlivening making games and giving players role models, drawing more people into development, especially indie and auteured games. This shift is proving more prosperous than any Skillset-accredited course or government pot could ever hope for. We are making men sitting in pants at their laptops for 12 hours a day as glamorous as it could be.

Creating luminaries will lead to all the benefits that more people in games can bring: a bigger and brighter community, plus new and fresh talent making exciting games. However, celebritydom demands storms, turmoil and gossip.

Spielberg's theory is essentially that a studio will eventually go under after it releases five or six bombs in a row. The reason: budgets have become so gigantic. And, indeed, this summer has been full of movies with giant budgets and modest grosses, all of which has elicited hand-wringing about financial losses, the lack of a quality product (another post-apocalyptic thriller? more superheroes?), and a possible connection between the two. There has been some hope that Hollywood's troubles will lead to a rethinking of how movies get made, and which movies get greenlit by studio executives. But a close look at this summer's grosses suggests a more worrisome possibility: that the studios will become more conservative and even less creative.

[youtube=http://www.youtube.com/watch?v=F4mDNMSntlA&w=420&h=315]

Videodrome turns 30

Videodrome’s depiction of techno-body synthesis is, to be sure, intense; Cronenberg has the unusual talent of making violent, disgusting, and erotic things seem even more so. The technology is veiny and lubed. It breathes and moans; after watching the film, I want to cut my phone open just to see if it will bleed. Fittingly, the film was originally titled “Network of Blood,” which is precisely how we should understand social media, as a technology not just of wires and circuits, but of bodies and politics. There’s nothing anti-human about technology: the smartphone that you rub and take to bed is a technology of flesh. Information penetrates the body in increasingly intimate ways.

  • I also came across this short piece by Joseph Matheny at Alterati on Videodrome and YouTube:

Videodrome is even more relevant now that YouTube is delivering what cable television promised to in the 80s: a world where everyone has their own television station. Although digital video tools began to democratize video creation, it’s taken the further proliferation of broadband Internet and the emergence of convenient platforms like YouTube and Google Video to democratize video distribution.

  • There's also my Videodrome-centric post from a couple of years ago. Coincidentally, I watched eXistenZ for the first time last week. I didn't know much about the film going in, and initially I was enthusiastic that it seemed to be a spiritual successor to Videodrome, updating the media metaphor for the New Flesh from television to video games. I remained engaged throughout the movie (although about two thirds into the film I turned to my fiancée and asked "Do you have any idea what's going on?"), and there were elements that I enjoyed, but ultimately I was disappointed. I had a similar reaction at the ending of Cronenberg's Spider, thinking "What was the point of all that?" when the closing credits started to roll, though it was much easier to stay awake during eXistenZ.

Bogost on Facebook feudalism, narrative possibilities in games, the gamification of sex

The short truth is this: Facebook doesn't care if developers can use the platform easily or at all. In fact, it doesn't seem to concern itself with any of the factors that might be at play in developers' professional or personal circumstances. The Facebook Platform is a selfish, self-made altar to Facebook, at which developers are expected to kneel and cower, rather than a generous contribution to the success of developers that also happens to benefit Facebook by its aggregate effects.

A lot of reactions to the narrative of [Bioshock] Infinite that I encountered were that it “didn’t make sense,” and that it was “being weird for the sake of being weird.”

Those reminded me of criticisms leveled at two of my favorite filmmakers: David Lynch and Stanley Kubrick. I think these comments arise because Infinite doesn’t go all the way, it hesitates. It tries to stick to conventional logic. It strews about Voxaphones to explain its abstractions.

  • Shujaat Syed at Player Effort writes about "making linear story telling interesting in video games by acknowledging the fourth wall":

At their core, video games are authoritarian. They have rules that need to be followed, and you are restricted to the game play systems and a story the programmers and designers have created. However, compared to other forms of media, they offer a breadth of freedom that is unmatched. I will not be speaking about the freedom of exploration. What I will be talking about is the freedom of creating a different type of narrative that is only possible through video games by breaking the 4th wall between the game and the player. This is one of our medium's greatest advantages; however, this power is very rarely explored. With video games, we can have truly powerful forms of narrative, but at most we get ideas that could theoretically work as movies. Open-world sandbox games can dodge this because the player is free to create their own narrative alongside the main plotline, and this is a concept that is entirely unique to video games. It’s the linear story-based games where the narrative is usually much harder to distinguish than what you would get from a book or movie.

In addition to registering your decibel levels (I’m hoping mine will get a boost from the garbage truck always idling outside my window), Spreadsheets will also monitor your overall duration, frequency, and somehow, thrusts per minute. Apparently this does not require supplementary electrodes.

What’s more, you can unlock “badges” and the like. For example, to meet the “Hello Sunshine” achievement, worth 10 points, you must take on the ultimate challenge of our time: “perform morning sex.”

Warren Ellis on violent fiction, death of the Western, Leatherface as model vegan

As we learn early on, the movie’s killers, the murderous Sawyer family (comprised of Leatherface, Grandpa, et al), used to run a slaughterhouse, and the means they use to slaughter their victims are the same as those used to slaughter cattle. They knock them over the head with sledgehammers, hang them on meat hooks, and stuff them into freezers. Often this takes place as the victims are surrounded by animal bones, a detail that could be explained away as the evidence of their former occupation—except that the cries of farm animals (there are none around) are played over the scenes.

Through the past century of Western movies, we can trace America's self-image as it evolved from a rough-and-tumble but morally confident outsider in world affairs to an all-powerful sheriff with a guilty conscience. After World War I and leading into World War II, Hollywood specialized in tales of heroes taking the good fight to savage enemies and saving defenseless settlements in the process. In the Great Depression especially, as capitalism and American exceptionalism came under question, the cowboy hero was often mistaken for a criminal and forced to prove his own worthiness--which he inevitably did. Over the '50s, '60s, and '70s however, as America enforced its dominion over half the planet with a long series of coups, assassinations, and increasingly dubious wars, the figure of the cowboy grew darker and more complicated. If you love Westerns, most of your favorites are probably from this era--Shane, The Searchers, Butch Cassidy and the Sundance Kid, McCabe & Mrs. Miller, the spaghetti westerns, etc. By the height of the Vietnam protest era, cowboys were antiheroes as often as they were heroes.

The dawn of the 1980s brought the inauguration of Ronald Reagan and the box-office debacle of the artsy, overblown Heaven's Gate. There's a sense of disappointment to the decade that followed, as if the era of revisionist Westerns had failed and a less nuanced patriotism would have to carry the day. Few memorable Westerns were made in the '80s, and Reagan himself proudly associated himself with an old-fashioned, pre-Vietnam cowboy image. But victory in the Cold War coincided with a revival of the genre, including the revisionist strain, exemplified in Clint Eastwood's career-topping Unforgiven. A new, gentler star emerged in Kevin Costner, who scored a post-colonial megahit with Dances With Wolves. Later, in the 2000s, George W. Bush reclaimed the image of the cowboy for a foreign policy far less successful than Reagan's, and the genre retreated to the art house again.

Westerns are fundamentally about political isolation. The government is far away and weak. Institutions are largely irrelevant in a somewhat isolated town of 100 people. The law is what the sheriff says it is, or what the marshal riding through town says, or the posse. At that scale, there may be no meaningful distinction between war and crime. A single individual's choices can tilt the balance of power. Samurai and Western stories cross-pollinated because when you strip away the surface detail the settings are surprisingly similar. The villagers in Seven Samurai and the women in Unforgiven are both buying justice/revenge because there is no one to appeal to from whom they could expect justice. Westerns are interesting in part because they are stories where individual moral judgment is almost totally unsupported by institutions.

Westerns clearly are not dying. We get a really great film in the genre once every few years. However, they've lost a lot of their place at the center of pop culture because the idea of an isolated community has grown increasingly implausible. In what has become a surveillance state, the idea of a place where the state has no authority does not resonate as relevant.

The function of fiction is being lost in the conversation on violence. My book editor, Sean McDonald, thinks of it as “radical empathy.” Fiction, like any other form of art, is there to consider aspects of the real world in the ways that simple objective views can’t — from the inside. We cannot Other characters when we are seeing the world from the inside of their skulls. This is the great success of Thomas Harris’s Hannibal Lecter, both in print and as so richly embodied by Mads Mikkelsen in the Hannibal television series: For every three scary, strange things we discover about him, there is one thing that we can relate to. The Other is revealed as a damaged or alienated human, and we learn something about the roots of violence and the traps of horror.

Rushkoff on Manning verdict, Chomsky/Žižek on NSA leaks, looking for McLuhan in Afghanistan

We are just beginning to learn what makes a free people secure in a digital age. It really is different. The Cold War was an era of paper records, locked vaults and state secrets, for which a cloak-and-dagger mindset may have been appropriate. In a digital environment, our security comes not from our ability to keep our secrets but rather our ability to live our truth.

In light of the recent NSA surveillance scandal, Chomsky and Žižek offer us very different approaches, both of which are helpful for leftist critique. For Chomsky, the path ahead is clear. Faced with new revelations about the surveillance state, Chomsky might engage in data mining, juxtaposing our politicians' lofty statements about freedom against their secretive actions, thereby revealing their utter hypocrisy. Indeed, Chomsky is a master at this form of argumentation, and he does it beautifully in Hegemony or Survival when he contrasts the democratic statements of Bush regime officials against their anti-democratic actions. He might also demonstrate how NSA surveillance is not a strange historical aberration but a continuation of past policies, including, most infamously, the FBI's counter intelligence programme in the 1950s, 60s, and early 70s.

Žižek, on the other hand, might proceed in a number of ways. He might look at the ideology of cynicism, as he did so famously in the opening chapter of The Sublime Object of Ideology, in order to demonstrate how expressions of outrage regarding NSA surveillance practices can actually serve as a form of inaction, as a substitute for meaningful political struggle. We know very well what we are doing, but still we are doing it; we know very well that our government is spying on us, but still we continue to support it (through voting, etc). Žižek might also look at how surveillance practices ultimately fail as a method of subjectivisation, how the very existence of whistleblowers like Thomas Drake, Bradley Manning, Edward Snowden, and the others who are sure to follow in their footsteps demonstrates that technologies of surveillance and their accompanying ideologies of security can never guarantee the full participation of the people they are meant to control. As Žižek emphasises again and again, subjectivisation fails.

In early 2011, award-winning photographer Rita Leistner was embedded with a U.S. marine battalion deployed to Helmand province as a member of Project Basetrack, an experiment in using new technologies in social media to extend traditional war reporting. This new LRC series draws on Leistner’s remarkable iPhone photos and her writings from her time in Afghanistan to use the ideas of Marshall McLuhan to make sense of what she saw there – “to examine the face of war through the extensions of man.”

Multiple angles on gamification

  • This week my fiancée told me about an app she had recently installed on her phone. As she excitedly described it, users of the app can "check in" at a retail store (it sounded like your location is verified through GPS) and receive points for doing so, presumably to redeem for store purchases, but I don't recall all the details. I should also mention that this app is not Foursquare, though I am not sure how the two apps differ specifically. Apps like this exemplify the gamification trend in marketing and advertising. There is an entire wiki dedicated to gamification, with detailed pages like this one describing the various game mechanics used in gamification; a rough sketch of the check-in-and-points loop follows the quoted passages below.

Gamification applies basic game thinking and game mechanics to a non-gaming context. Many gamification models reward users for participating, completing defined user tasks, or achieving goals. A great example is Foursquare, which awards points and perks for "checking-in" to places you go. Although some models introduce distinguishable game-related features, gamification of online shopping includes any type of game thinking applied to an online shopping model.

Gamification makes things fun because it taps into our basic human appetite for competition, stature, and social interaction. Rather than feeling tricked or manipulated, we feel a sense of control when participating in transparent game-oriented shopping. As a result, shopping becomes more exciting and rewarding, while increasing highly sought-after engagement and customer loyalty for retailers and brands.
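For illustration only, here is a minimal sketch of the check-in-and-points loop described above; the point values, badge thresholds, and location check are assumptions of mine, not how Foursquare or the app my fiancée described actually works:

```python
# Hypothetical sketch of a GPS-verified check-in mechanic:
# a visit near the store earns points, and point thresholds unlock badges.
# All names and numbers are illustrative, not any real app's API.

from math import radians, sin, cos, asin, sqrt

BADGE_THRESHOLDS = {"Regular": 50, "Superfan": 200}

def distance_m(lat1, lon1, lat2, lon2):
    """Rough haversine distance in meters between two coordinates."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6_371_000 * asin(sqrt(a))

def check_in(user, user_pos, store_pos, points_per_visit=10, radius_m=100):
    """Award points only if the phone's reported location is near the store."""
    if distance_m(*user_pos, *store_pos) > radius_m:
        return "check-in rejected: not at the store"
    user["points"] = user.get("points", 0) + points_per_visit
    user["badges"] = [b for b, need in BADGE_THRESHOLDS.items() if user["points"] >= need]
    return f"+{points_per_visit} points (total {user['points']})"

if __name__ == "__main__":
    shopper = {}
    print(check_in(shopper, (40.4406, -79.9959), (40.4407, -79.9960)))
```

Stripped down like this, the mechanic is little more than a counter gated by a distance check, which helps explain both its appeal to marketers and the skepticism it draws from critics.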

  • This LinkedIn post by Dan Sanker describes gamification as "the application of game elements and digital game design techniques to non-game problems" and considers potential applications:

Small tools, influenced by simple game mechanics can be used to modify people’s behavior. [...] There is a long way to go to make some mundane tasks more engaging. I think the paradigm that rang true the most this week, especially after talking with the kids about their experiences – is that we need to start thinking about customers, consumers, employees and/or students less as ‘Users’ and more as ‘Players.’ Are there ways to enjoy the experience of buying, procuring, working and learning? It might be a better way for us to consider interacting with Generation Z and those who come after them.

But gamification hasn't just grabbed the attention of the corporate world. Teachers are trying to make learning more fun by introducing games into the classroom in the hope of keeping children engaged for longer. This made me think about how many banks, building societies and other financial services providers are using gamification to encourage kids to start saving or educate them about money.

In the minds of Silicon Valley’s eternal optimists, and the journalists who so unconditionally love them, gamification is the possibility of rendering intricately complex processes, such as education or health care, more effective by transforming them into games. If kids aren’t reading, goes the gamified mantra, perhaps some friendly competitive system of badges and leaderboards might provide the missing incentive. And if adults are getting a tad too heavy, just slap a gizmo on their wrists that challenges them to burn more and more calories each day and they’ll play along.

As a professor of video games, I’ve strong doubts that the same principles that compel us to save Princess Zelda or defeat Donkey Kong apply in the classroom, the boardroom, or the emergency room. Like most game scholars, I view gamification as the creation of the TED-circle nabobs, largely empty, feel-good fodder for the intellectually limp. But the idea isn’t totally useless: There are some special categories of human events, rare and far-between, whose own innate absurdities are so profound that a touch of gamification might actually do them good.

I’m talking, of course, about the Israeli-Palestinian peace process.

We are naturally drawn to entertaining, visually appealing, easily digestible information sources and the power is in our hands to choose who, when, where and on what we will engage.  Witness the rise of video consumption on mobile as part of this trend.

Gamification may be the answer but the problem is that businesses can rush into it without necessarily lifting the bonnet to see what is making it work. There are a number of services putting their hands up to execute it for you but executing without a clear view of what motivates your audience can and will prove fatal.

The concept of gamifying products and services came into being when marketers realised that loyalty programmes are becoming too banal to retain consumers. A number of leading brand names, including Hungama, Zapak, Adobe and Microsoft, have used the concept successfully to create a habit of their product amongst users.

Microsoft created a unique gamified tool that allowed users to learn the new MS Office applications and earn rewards, thus making the whole process interactive.

Epic EVE battle, Critical games criticism, indie developer self-publishing

Update, 9:18PM ET: The battle is over. After more than five hours of combat, the CFC has defeated TEST Alliance. Over 2,900 ships were destroyed today in the largest fleet battle in Eve Online's history. TEST Alliance intended to make a definitive statement in 6VDT, but their defeat at the hands of the CFC was decisive and will likely result in TEST's withdrawal from the Fountain region.

In a conversation with Whitten, he told us that the commitment to independent developers is full. There won't be restrictions on the type of titles that can be created, nor will there be limits in scope. In response to a question on whether retail-scale games could be published independently, Whitten told us, "Our goal is to give them access to the power of Xbox One, the power of Xbox Live, the cloud, Kinect, Smartglass. That's what we think will actually generate a bunch of creativity on the system." With regard to revenue splitting with developers, we were told that more information will be coming at Gamescom, but that we could think about it "generally like we think about Marketplace today." According to developers we've spoken with, that split can be approximately 50-50.

Another difference between the Xbox One and Xbox 360 is how the games will be published and bought by other gamers. Indie games will not be relegated to the Xbox Live Indie Marketplace like on the Xbox 360 or required to have a Microsoft-certified publisher to distribute physically or digitally outside the Indie Marketplace. All games will be featured in one big area with access to all kinds of games.

If anything has hurt modern video game design over the past several years, it has been the rise of 'freemium'. It seems that it is rare to see a top app or game in the app stores that has a business model that is something other than the 'free-to-play with in-app purchases' model. It has been used as an excuse to make lazy, poorly designed games that are predicated on taking advantage of psychological triggers in their players, and will have negative long term consequences for the video game industry if kept unchecked.

Many freemium games are designed around the idea of conditioning players to become addicted to playing the game. Many game designers want their games to be heavily played, but in this case the freemium games are designed to trigger a 'reward' state in the player's brain in order to keep the player playing (and ultimately entice the user to make in-app purchases to continue playing). This type of conditioning is often referred to as a 'Skinner box', named after the psychologist that created laboratory boxes used to perform behavioral experiments on animals.
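As a concrete illustration of that conditioning loop, here is a minimal sketch of a variable-ratio reward schedule, the pattern usually invoked when freemium design is compared to a Skinner box; the probability and session length are arbitrary assumptions, not numbers from any real game:

```python
# Hypothetical sketch of a variable-ratio reward schedule: each action has a
# fixed chance of paying out, so rewards arrive unpredictably and encourage
# "one more try." Parameters are illustrative, not taken from any real game.

import random

def play_session(pulls: int, reward_chance: float = 0.15, seed: int = 42) -> int:
    """Simulate repeated actions under a variable-ratio schedule."""
    rng = random.Random(seed)
    rewards = 0
    for _ in range(pulls):
        if rng.random() < reward_chance:
            rewards += 1  # unpredictable payout keeps the player pulling
    return rewards

if __name__ == "__main__":
    print(play_session(100))  # roughly 15 rewards over 100 actions
```

Because the payout arrives unpredictably rather than on a fixed schedule, behavior reinforced this way is notoriously resistant to extinction, which is exactly the property freemium designs are accused of exploiting.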

It obviously isn’t beyond the realm of possibility that, not only do financial considerations influence a game’s structure and content, financial outcomes affect a studio’s likelihood of survival in the industry, based upon the machinations of its publishing overlords. Activision killed Bizarre Creations, Eidos ruined Looking Glass Studios, EA crushed Westwood, Pandemic, Bullfrog, Origin Systems… well, the list could go on, until I turn a strange, purple color, but you get my point. And, when 3.4 million copies sold for a Tomb Raider reboot isn’t enough by a publisher’s standards, you can’t help but feel concern for a developer’s future.

This relationship between environment-learner-content interaction and transfer puts teachers in the unique position to capitalize on game engagement to promote reflection that positively shapes how students tackle real-world challenges. To some, this may seem like a shocking concept, but it’s definitely not a new one—roleplay as instruction, for example, was very popular among the ancient Greeks and, in many ways, served as the backbone for Plato’s renowned Allegory of the Cave. The same is true of Shakespeare’s works, 18th and 19th century opera, and many of the novels, movies, and other media that define our culture. More recently, NASA has applied game-like simulations to teach astronauts how to maneuver through space, medical schools have used them to teach robotic surgery, and the Federal Aviation Administration has employed them to test pilots.

The relationships between the creator, the product, and the audience are all important contexts to consider during media analysis, especially with games. This is because the audience is an active participant in the media. So if you are creating a game, you always have to keep the audience in mind. Even if you say the audience doesn’t matter to you, it won’t cease to exist, and it does not erase the impact your game will have.

Similarly, if you are critiquing or analyzing any media, you can’t ignore the creator and the creator’s intentions. Despite those who claim the “death of the author,” if the audience is aware of the creator’s intentions, it can affect how they perceive the game. In particular, if you consider the ease with which creators can release statements talking about their work, you’ll have an audience with varying levels of awareness about the creator’s intentions. These factors all play off of each other–they do not exist in a vacuum.

When we talk about any medium’s legitimacy, be it film or videogames or painting, it’s a very historical phenomenon that is inextricably tied to its artness that allows for them to get in on the ground floor of “legitimate” and “important.” So if we contextualize the qualities that allowed for film or photography to find themselves supported through a panoply of cultural institutions, it was a cultural and political economic process that led them there.


[...]

Videogames, the kind that would be written about in 20 dollar glossy art magazines, would be exactly this. When creators of videogames want to point to their medium’s legitimacy, it would help to have a lot of smart people legitimate your work in a medium (glossy magazines, international newspapers) that you consider to be likewise legitimate. Spector concedes that ‘yes all the critics right now are online’, but the real battle is in getting these critics offline and into more “legitimate” spaces of representation. It’s a kind of unspoken hierarchy of mediums that is dancing before us here: at each step a new gatekeeper steps into play, both legitimating and separating the reader from the critic and the object of criticism.

All three games define fatherhood around the act of protection, primarily physical protection. And in each of these games, the protagonist fails—at least temporarily—to protect their ward. In Ethan’s case, his cheery family reflected in his pristine home collapses when he loses a son in a car accident. Later, when his other son goes missing, the game essentially tests Ethan’s ability to reclaim his protective-father status.

No video game grants absolute freedom; they all have rules or guidelines that govern what you can and can’t do. The sci-fi epic Mass Effect is a series that prides itself on choice, but even that trilogy ends on a variation of choosing between the “good” and “bad” ending. Minecraft, the open-world creation game, is extremely open-ended, but you can’t build a gun or construct a tower into space because it doesn’t let you. BioShock’s ending argues that the choices you think you’re making in these games don’t actually represent freedom. You’re just operating within the parameters set by the people in control, be they the developers or the guy in the game telling you to bash his skull with a golf club.

BioShock’s disappointing conclusion ends up illustrating Ryan’s point. A man chooses, a player obeys. It’s a grim and cynical message that emphasizes the constraints of its own art form. And given that the idea of choice is so important to BioShock’s story, I don’t think it could’ve ended any other way.


Zimmerman media coverage, Scorsese on reading cinema, remediation in Game of Thrones, and much more

The reports are based on an ABC News interview with Juror B29, the sole nonwhite juror. She has identified herself only by her first name, Maddy. She’s been framed as the woman who was bullied out of voting to convict Zimmerman. But that’s not true. She stands by the verdict. She yielded to the evidence and the law, not to bullying. She thinks Zimmerman was morally culpable but not legally guilty. And she wants us to distinguish between this trial and larger questions of race and justice.

ABC News hasn’t posted a full unedited video or transcript of the interview. The video that has been broadcast—on World News Tonight, Nightline, and Good Morning America—has been cut and spliced in different ways, often so artfully that the transitions appear continuous. So beware what you’re seeing. But the video that’s available already shows, on closer inspection, that Maddy has been manipulated and misrepresented. Here are the key points.

In the recording heard by NBC viewers, Zimmerman appeared to volunteer the information, “This guy looks like he’s up to no good. He looks black.”

Edited out was the 911 dispatcher asking Zimmerman if the person he was suspicious of was “black, white or Hispanic,” to which Zimmerman had responded, “He looks black.”

Though Zimmerman and his attorneys have filed a lawsuit against NBC News for the malicious editing of the 911 tape, what CNN did is far worse.

NBC News was attempting to make Zimmerman look like a racial profiler. CNN, on the other hand, was attempting to make Zimmerman look like an enraged outright racist (there was no racial angle in ABC's fraud). It also took CNN far longer to retract their story than either NBC or ABC.

Moreover, on its own airwaves, CNN would allow the complete fallacy that Zimmerman had said "fucking coon" to live on.

Pulling teeth doesn’t do justice to the painful viewing experience accompanying this sort of news manufacture - making news from no news. Even the daily palaver known as Changing the Guard was spun to look like an integral prelude to the long-awaited arrival. And the waiting went on, and on, and on, and the longer it went on, the more desperate and dull the coverage became. Sometimes people complain about the high salaries enjoyed by news presenters, especially the public service variety, but by golly they earnt their crust trying, albeit failing, to sustain the suspense.

Light is at the beginning of cinema, of course. It’s fundamental—because cinema is created with light, and it’s still best seen projected in dark rooms, where it’s the only source of light. But light is also at the beginning of everything. Most creation myths start with darkness, and then the real beginning comes with light—which means the creation of forms. Which leads to distinguishing one thing from another, and ourselves from the rest of the world. Recognizing patterns, similarities, differences, naming things—interpreting the world. Metaphors—seeing one thing “in light of” something else. Becoming “enlightened.” Light is at the core of who we are and how we understand ourselves.

[...]

Or consider the famous Stargate sequence from Stanley Kubrick’s monumental 2001: A Space Odyssey. Narrative, abstraction, speed, movement, stillness, life, death—they’re all up there. Again we find ourselves back at that mystical urge—to explore, to create movement, to go faster and faster, and maybe find some kind of peace at the heart of it, a state of pure being.

Despite stormy forecasts, Hollywood appears to be too unwieldy or too unwilling to shift direction towards smaller, cheaper pictures. Guests at Comic-Con learned about upcoming studio productions including Pirates of the Caribbean 5, Thor 2, Fantastic Four 3 and a reboot of Godzilla. The director Joss Whedon came to the event to lament that "pop culture is eating itself" and called for "new universes, new messages and new icons". He then revealed the title of his next film to be Avengers: Age of Ultron.

Repeat after me: Edward Snowden is not the story. The story is what he has revealed about the hidden wiring of our networked world. This insight seems to have escaped most of the world's mainstream media, for reasons that escape me but would not have surprised Evelyn Waugh, whose contempt for journalists was one of his few endearing characteristics. The obvious explanations are: incorrigible ignorance; the imperative to personalise stories; or gullibility in swallowing US government spin, which brands Snowden as a spy rather than a whistleblower.

The video site is aiming to showcase some geek culture by pronouncing 4-10 August its first ever ‘Geek Week’ and promoting some of the genre’s top channels which cover everything from sci-fi to comics, gaming and superheroes. To do this, its own channel will be featuring videos from users like Nerdist, the official Doctor Who channel, MinutePhysics and more than a hundred others, with every day of the week hosted by a different user. It’ll even include the first trailer for the new Thor movie, The Dark World.

That said, things kept nagging me. Blackfish does raise some valuable secondary issues - how SeaWorld markets itself, how labor issues are at stake in addition to environmental ones - but as a spectator I kept wanting the film to pursue lines of analysis that it would suggest but never develop.

[...]


In short, if there's an ur-ideology to the American progressive documentary, it's that demand-side drivers of political situations (Gramsci's hegemony, ideology, what have you) don't matter; it's merely the supply side of oligopoly, big money, and corporate control. Or to be less political, as a film scholar I can't help but notice that in a film about the business of spectacle, the spectator is both crucial (SeaWorld viewers provide the vital footage of the incidents) and completely effaced.

And what of the YouTube creator? How has AdSense helped or hindered their careers? In most cases, the advertising structure has been a blessing to creators as it’s allowed them to launch careers solely through YouTube. AdSense gave us a new type of celebrity for a new generation.

Creators have had their fair share of AdSense woes in the past, though. Last year, one of YouTube’s biggest names, Ray William Johnson, entered a very public dispute with Maker Studios. Johnson claimed that Maker Studios was holding his AdSense account “hostage” even after he had terminated his contract with them.

If you watch big budget entertainments, there's no escaping these sorts of moments. The trope familiar to the Scooby-Doo generation, in which a few nagging uncertainties are resolved with a "there's just one thing I don't understand" kickoff, has now become a motif. Characters must constantly address questions on behalf of a too-curious audience awash in complexly-plotted mega-stories. The movies are trying to plug leaks in a boat before the whole thing sinks—never quite repairing it, but doing just enough to get by.

What I’m talking about here is the unavoidable shift that occurs when content is remediated—that is, borrowed from one medium and reimagined in another. In this case, the content of the book series A Song of Ice and Fire (ASOIAF) is remediated to Game of Thrones, the HBO television series. Some of the differences in this instance of remediation seem pragmatic—remembrances are turned into scenes of their own, dialogue is shortened, characters omitted or altered for the sake of brevity and clarity. I am no purist, and I recognize that with remediation comes necessary alteration for the content to suit the new medium. But other differences speak volumes about our cultural biases and expectations surrounding those with socially-othered bodies—like Tyrion, Sam, and, of course, women. What can we say about these differences? And perhaps more importantly, what do they say about us?

Why does it matter what Kubrick liked? For years I’ve enjoyed unearthing as much information as I can about his favourite films and it slowly became a personal hobby. Partly because each time I came across such a film (usually from a newly disclosed anecdote – thanks internet! – or Taschen’s incredible The Stanley Kubrick Archives book) I could use it as a prism to reveal more about his sensibilities. My appreciation of both him and the films he liked grew. These discoveries led me on a fascinating trail, as I peppered them throughout the 11 existing Kubrick features (not counting the two he disowned) I try to watch every couple of years. I’m sure a decent film festival could be themed around the Master List at the end of this article…

  • Lastly, the Media Ecology Association has uploaded some videos from their latest annual convention which was held in June. These include Dominique Scheffel-Dunand on canonic texts in media ecology and Lance Strate's talk "If not A, then E".

 

Hacker's death, wearable tech, and some Cyberpunk

His genius was finding bugs in the tiny computers embedded in equipment, such as medical devices and cash machines. He often received standing ovations at conferences for his creativity and showmanship while his research forced equipment makers to fix bugs in their software.

Jack had planned to demonstrate his techniques to hack into pacemakers and implanted defibrillators at the Black Hat hackers convention in Las Vegas next Thursday. He told Reuters last week that he could kill a man from 30 feet away by attacking an implanted heart device.

Without the right approach, the continual distraction of multiple tasks exerts a toll that disrupts performance. It takes time to switch tasks, to get back what attention theorists call “situation awareness.” Interruptions disrupt performance, and even a voluntary switching of attention from one task to another is an interruption of the task being left behind.

Furthermore, it will be difficult to resist the temptation of using powerful technology that guides us with useful side information, suggestions, and even commands. Sure, other people will be able to see that we are being assisted, but they won’t know by whom, just as we will be able to tell that they are being minded, and we won’t know by whom.

9am to 1pm: Throughout the day you connect to your Dekko-powered augmented reality device, which overlays your vision with a broad range of information and entertainment. While many of the products the US software company is proposing are currently still fairly conceptual, Dekko hopes to find ways to integrate an extra layer of visual information into every part of daily life. Dekko is one of the companies supplying software to Google Glass, the wearable computer that gives users information through a spectacle-like visual display. Matt Miesnieks, CEO of Dekko, says that he believes "the power of wearables comes from connecting our senses to sensors."

Researchers at Belgian nanoelectronics research and development center Imec and Belgium’s Ghent University are in the very early stages of developing such a device, which would bring augmented reality–the insertion of digital imagery such as virtual signs and historical markers onto the real world–right to your eyeballs. It’s just one of several such projects (see “Contact Lens Computer: It’s Like Google Glass Without The Glasses”), and while the idea is nowhere near the point where you could ask your eye doctor for a pair, it could become more realistic as the cost and size of electronic components continue to fall and wearable gadgets gain popularity.

Speaking on the sidelines of the Wearable Technologies conference in San Francisco on Tuesday, Eric Dy, Imec’s North America business development manager, said researchers are investigating the feasibility of integrating an array of micro lenses with LEDs, using the lenses to help focus light and project it onto the wearer’s retinas.

The biggest barrier, beyond the translation itself, is speech recognition. In so many words, background noise interferes with the translation software, thus affecting results. But Barra said it works "close to 100 percent" when used in "controlled environments." Sounds perfect for diplomats, not so much for real-world conversations. Of course, Google's non-real-time, text-based translation software built into Chrome leaves quite a bit to be desired, making us all the more wary of putting our faith into Google's verbal solution. As the functionality is still "several years away," though, there's still plenty of time to convert us.

There will be limitations, however. It's easy to think that a life-sized human being, standing in your living room, would be capable of giving you a hug, for instance. But if that breakthrough is coming, it hasn't arrived yet. Holodeck creations these are not. And images projected through the magic of HoloVision won't be able to follow you into the kitchen for a snack either — not unless you've got a whole network of HoloVision cameras, anyway.

The implications of Euclid’s technology do not stop at surveillance or privacy. Remember, these systems are meant to feed data to store owners so that they can rearrange store shelves or entire showroom floors to increase sales. Malls, casinos, and grocery stores have always been carefully planned out spaces—scientifically arranged and calibrated for maximum profit at minimal cost. Euclid’s systems however, allow for massive and exceedingly precise quantification and analysis. More than anything, what worries me is the deliberateness of these augmented spaces. Euclid will make spaces designed to do exactly one thing almost perfectly: sell you shit you don’t need. I worry about spaces that are as expertly and diligently designed as Amazon’s home page or the latest Pepsi advertisement. A space built on data so rich and thorough that it’ll make focus groups look quaint in comparison.

Of course the US is not a totalitarian society, and no equivalent of Big Brother runs it, as the widespread reporting of Snowden’s information shows. We know little about what uses the NSA makes of most information available to it—it claims to have exposed a number of terrorist plots—and it has yet to be shown what effects its activities may have on the lives of most American citizens. Congressional committees and a special federal court are charged with overseeing its work, although they are committed to secrecy, and the court can hear appeals only from the government.

Still, the US intelligence agencies also seem to have adopted Orwell’s idea of doublethink—“to be conscious of complete truthfulness,” he wrote, “while telling carefully constructed lies.” For example, James Clapper, the director of national intelligence, was asked at a Senate hearing in March whether “the NSA collect[s] any type of data at all on millions or hundreds of millions of Americans.” Clapper’s answer: “No, sir…. Not wittingly.”

The drone is carrying a laptop so it can communicate with the headset, but right now the sticking point is range; since it's using wi-fi to communicate, it'll only get to around 50-100m.

"It's not a video game movie, it's a cyberpunk movie," Cargill said. "Eidos Montreal has given us a lot of freedom in terms of story; they want this movie to be Blade Runner. We want this movie to be Blade Runner."

INTERVIEWER

There’s a famous story about your being unable to sit through Blade Runner while writing Neuromancer.

GIBSON

I was afraid to watch Blade Runner in the theater because I was afraid the movie would be better than what I myself had been able to imagine. In a way, I was right to be afraid, because even the first few minutes were better. Later, I noticed that it was a total box-office flop, in first theatrical release. That worried me, too. I thought, Uh-oh. He got it right and nobody cares! Over a few years, though, I started to see that in some weird way it was the most influential film of my lifetime, up to that point. It affected the way people dressed, it affected the way people decorated nightclubs. Architects started building office buildings that you could tell they had seen in Blade Runner. It had had an astonishingly broad aesthetic impact on the world.

The concept was formally introduced in William Gibson's 1984 cyberpunk novel, NEUROMANCER.  Although this first novel swept the Triple Crown of science fiction--the Hugo, the Nebula, and the Philip K. Dick awards--it is not really science fiction.  It could be called "science faction" in that it occurs not in another galaxy in the far future, but 20 years from now, in a BLADE RUNNER world just a notch beyond our silicon present.

      In Gibson's Cyberworld there is no warp drive and no "beam me up, Scotty."  The high technology is the stuff that appears on today's screens or that processes data in today's laboratories: Super-computer boards.  Recombinant DNA chips.  AI systems and enormous data banks controlled by multinational combines based in Japan and Zurich.

Mice memory implants, augmented reality trends, predictive policing, more

Scientists have created a false memory in mice by manipulating neurons that bear the memory of a place. The work further demonstrates just how unreliable memory can be. It also lays new ground for understanding the cell behavior and circuitry that controls memory, and could one day help researchers discover new ways to treat mental illnesses influenced by memory.

Augmented reality blurs the line between the virtual and the real-world environment. This capability often leaves users unable to tell the difference between the real-world experience and the computer-generated one. It creates an interactive world in real time, and using this technology, businesses can give customers the opportunity to experience their products and services as if they were real, right from their current dwelling.

AR technology overlays computer-generated sensory input on the real-world view, changing what we see. It can use any kind of object to alter our senses. The enhancements usually include sound, video, graphics and GPS data. And its potential is tremendous, as developers have only just started exploring the world of augmented reality. However, you must not confuse virtual reality with augmented reality, as there is a stark difference between them. Virtual reality, as the name suggests, is not real; it is just a made-up world. Augmented reality, on the other hand, enhances the real world, providing an augmented view of reality. The enhancements can be minor or major, but AR technology only changes how the real world around the user looks.

Augmentedrealitytrends.com: Why augmented reality, and why is your prime focus on the retail industry?

SeeMore Interactive: We recognize the importance of merging brick-and-mortar retail with cloud-based technology to create the ultimate dynamic shopping experience. It’s simply a matter of tailoring a consumer’s shopping experience based on how he or she wants to shop; the ability to research reviews, compare prices, receive new merchandise recommendations, share photos and make purchases while shopping in-store or from the comfort of their home.

Deep learning is based on neural networks, simplified models of the way clusters of neurons act within the brain that were first proposed in the 1950s. The difference now is that new programming techniques combined with the incredible computing power we have today are allowing these neural networks to learn on their own, just as humans do. The computer is given a huge pile of data and asked to sort the information into categories on its own, with no specific instruction. This is in contrast to previous systems that had to be programmed by hand. By learning incrementally, the machine can grasp the low-level stuff before the high-level stuff. For example, sorting through 10,000 handwritten letters and grouping them into like categories, the machine can then move on to entire words, sentences, signage, etc. This is called “unsupervised learning,” and deep learning systems are very good at it.
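As a rough illustration of the unsupervised grouping described above (my own sketch, not the deep learning systems the article refers to), a few lines of Python can sort handwritten digits into like categories without ever being shown their labels, here using k-means clustering on scikit-learn's bundled digits dataset:

```python
# Minimal sketch of "unsupervised learning" in the sense described above:
# grouping handwritten digits into like categories with no labels provided.
# This uses k-means clustering, an illustration of unsupervised grouping
# rather than the deep neural networks the article describes.
from sklearn.datasets import load_digits
from sklearn.cluster import KMeans
from sklearn.metrics import adjusted_rand_score

digits = load_digits()          # 1,797 8x8 grayscale digit images
X = digits.data                 # each image flattened to 64 values

# Ask for 10 groups without telling the algorithm what a "3" or a "7" is.
kmeans = KMeans(n_clusters=10, n_init=10, random_state=0)
cluster_ids = kmeans.fit_predict(X)

# The true labels are used only after the fact, to check how well the
# unsupervised grouping lines up with the actual digit categories.
print("agreement with true labels (adjusted Rand index):",
      round(adjusted_rand_score(digits.target, cluster_ids), 3))
```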

Intelligent policing can convert these modest gains into significant reductions in crime. Cops working with predictive systems respond to call-outs as usual, but when they are free they return to the spots which the computer suggests. Officers may talk to locals or report problems, like broken lights or unsecured properties, that could encourage crime. Within six months of introducing predictive techniques in the Foothill area of Los Angeles, in late 2011, property crimes had fallen 12% compared with the previous year; in neighbouring districts they rose 0.5% (see chart). Police in Trafford, a suburb of Manchester in north-west England, say relatively simple and sometimes cost-free techniques, including routing police driving instructors through high-risk areas, helped them cut burglaries 26.6% in the year to May 2011, compared with a decline of 9.8% in the rest of the city.
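The patrol loop described in that passage (answer call-outs, then revisit computer-suggested spots when free) can be made concrete with a toy model. The sketch below simply bins recent incidents into map grid cells and ranks the cells, which is far cruder than the proprietary systems used in Los Angeles or Trafford, but it shows the basic shape of the suggestion step; the incident coordinates are hypothetical.

```python
# Toy illustration of the patrol-suggestion loop: bin recent incidents into
# map grid cells and suggest the highest-count cells for officers to revisit
# between call-outs. A deliberately crude stand-in for the predictive systems
# mentioned in the article.
from collections import Counter

# Hypothetical recent incidents as (latitude, longitude) pairs.
incidents = [
    (34.273, -118.422), (34.274, -118.421), (34.251, -118.400),
    (34.273, -118.423), (34.252, -118.401), (34.280, -118.430),
]

CELL = 0.01  # grid cell size in degrees (roughly 1 km at this latitude)

def cell_of(lat, lon):
    """Snap a coordinate to the corner of its grid cell."""
    return (round(lat // CELL * CELL, 3), round(lon // CELL * CELL, 3))

counts = Counter(cell_of(lat, lon) for lat, lon in incidents)

# Suggest the top cells to revisit when officers are free.
for cell, n in counts.most_common(3):
    print(f"patrol near {cell}: {n} recent incidents")
```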

Although they may all look very different, the cities of the future share a new way of doing things, from sustainable buildings to walkable streets to energy-efficient infrastructure. While some are not yet complete – or even built – these five locations showcase the cutting edge of urban planning, both in developing new parts of an existing metropolitan area and building entirely new towns. By 2050, it is forecast that 70% of the world’s population will live in cities. These endeavours may help determine the way we will live then, and in decades beyond.

Mention thorium—an alternative fuel for nuclear power—to the right crowd, and faces will alight with the same look of spirited devotion you might see in, say, Twin Peaks and Chicago Cubs fans. People love thorium against the odds. And now Bill Gates has given them a new reason to keep rooting for the underdog element.

TerraPower, the Gates-chaired nuclear power company, has garnered attention for pursuing traveling wave reactor tech, which runs entirely on spent uranium and would rarely need to be refueled. But the concern just quietly announced that it's going to start seriously exploring thorium power, too.

Google might have put the kibosh on allowing x-rated apps onto Glass (for now) but that hasn't stopped the porn industry from doing what they do best: using new technology to enhance the, um, adult experience. The not yet titled film stars James Deen and Andy San Dimas.

There has always been a basic split in machine vision work. The engineering approach tries to solve the problem by treating it as a signal detection task using standard engineering techniques. The more "soft" approach has been to try to build systems that are more like the way humans do things. Recently it has been this human approach that seems to have been on top, with DNNs managing to learn to recognize important features in sample videos. This is very impressive and very important, but as is often the case the engineering approach also has a trick or two up its sleeve.

  • From Google Research:

We demonstrate the advantages of our approach by scaling object detection from the current state of the art involving several hundred or at most a few thousand object categories to 100,000 categories, requiring what would amount to more than a million convolutions. Moreover, our demonstration was carried out on a single commodity computer requiring only a few seconds for each image. The basic technology is used in several pieces of Google infrastructure and can be applied to problems outside of computer vision such as auditory signal processing.
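The scale problem in that quote is that scoring 100,000 category detectors at every image position would mean on the order of a million convolutions per image. One general way around exhaustive scoring, sketched below with simple random-projection hashing (an illustrative stand-in of my own, not the specific method in Google's paper), is to hash each detector once and then, for each image patch, score only the detectors that land in the same hash bucket:

```python
# Sketch of why hashing helps at this scale: instead of scoring every image
# patch against all 100,000 category filters (one convolution per filter),
# hash each filter vector once, then hash each patch and only score the few
# filters that share its hash bucket. Random-projection hashing is used here
# purely as an illustration.
import numpy as np

rng = np.random.default_rng(0)
DIM, N_FILTERS, N_BITS = 64, 100_000, 16

filters = rng.standard_normal((N_FILTERS, DIM))    # one detector per category
projections = rng.standard_normal((N_BITS, DIM))   # shared random hyperplanes

def hash_code(vectors):
    """Sign of each random projection, packed into an integer bucket id."""
    bits = (vectors @ projections.T) > 0
    return bits @ (1 << np.arange(N_BITS))

# Index all filters by bucket once, up front.
buckets = {}
for idx, code in enumerate(hash_code(filters)):
    buckets.setdefault(int(code), []).append(idx)

# For a query patch, score only the filters in its bucket, not all 100,000.
patch = rng.standard_normal(DIM)
candidates = buckets.get(int(hash_code(patch[None, :])[0]), [])
if candidates:
    scores = filters[candidates] @ patch
    print("best candidate filter:", candidates[int(np.argmax(scores))])
print(f"scored {len(candidates)} candidate filters instead of {N_FILTERS}")
```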

Thoughts on Oculus Rift, modding, and assessing the state of games journalism and criticism

Gaming journalism is, by some accounts, a broken field. By others, its unjournalistic process is a symptom of reporting online, where advertising revenue is minimal, at least when compared to revenue from newspapers or magazines. And that isn’t just exclusive to gaming journalism — most outlets, both online and in print, face an uncertain future under the weight of a change in the way we absorb news and opinion. (The change is evident when you account for how many sites have recently undergone a redesign to accommodate tablets better: USgamer, Kotaku, and Polygon, among others.)

[...]

That’s why the gaming press seems like a corrupt industry, when it should be incorruptible. Corporate apologetics, publisher-granted exclusive reviews, mostly non-hard-hitting, superfluous bits to appease the companies. All of this is how modern journalism operates. (As an experiment, check notable outlets or magazines and look for the term “sponsored content”. More sites do it than you’d think.) But when the revenue stream is one-tenth of historical norms, journalists must find ways to continue writing, and that sometimes involves looking for sponsors. It’s not optimal, it’s not prestigious, it goes against everything I learned in journalism school, but hey, money rules the world.

[blip.tv http://blip.tv/play/AYOT7SsC.x?p=1 width="720" height="433"]

Initially, I’m excited about using it for actors: there’s no reason it can’t work directly with the MVN mocap suits we use, and having actors able to see the virtual environment they’re acting in is a pretty mind-blowing concept. I may need to invest in a supply of sick-bags, though…

I’m also working on a virtual camera for the Rift, some tests of aiming cameras WITH MY FACE, the previously-mentioned preview suite, and more. Look for a post specifically about the Rift and filmmaking later this week or early next.

But for now, if you’ll excuse me, there’s a demon-filled corridor in Doom 3 that I’ve got to go be scared witless by…

Issues like over-crowding start to fade away. Of course, physical education can't be replaced (yet?), but actual problems that plague education for students, both young and old, could be eradicated completely. Suddenly, post-secondary education becomes affordable once again. Taught by real teachers to real students, with those social interactions at the core.

Political events could be attended by anyone. Having the ability to view political discussions on the Hill is possible today through various news outlets or public broadcasting. With the integration of Oculus, you could physically be there, sitting there, watching anything and everything unfold as if you were actually there. Having something like this might increase public knowledge of the workings of government, and help youth become passionate about issues that really require their attention.

With both software and hardware modders growing in numbers at a staggering rate, and one that will presumably continue to increase, it’s safe to say that modding is the future of gaming. A single person or group of people going out of their way to improve the gaming experience for themselves and others for non-profit was almost unimaginable during the early stages of the industry. Today, it is the norm, albeit still a relatively underground one. Yet just as the amount of people who play games has risen dramatically over the years, I believe the same is destined to repeat itself for modders. In order for gaming companies to solidify their foothold in the industry, the implementation of cooperation with their target audience will soon be paramount.

The Ideology of Scarface, Community as PoMo masterpiece, Present Shock reviewed, etc.

[youtube=http://www.youtube.com/watch?v=YanhEVEgkYI&w=560&h=315]

[youtube=http://www.youtube.com/watch?v=7MCmBHPqz6I&w=560&h=315]

In The Godfather, the blurring of the line between crime and the “legitimate” economy can still seem shocking. In Scarface, the distinction seems quaintly naïve. In The Godfather, Don Vito almost loses everything over his refusal to deal in heroin. In Scarface, Tony Montana knows that coke is just another commodity in a boom economy. Michael Corleone marries the wispy, drooping Kay Adams to give his enterprise some old-fashioned, WASP class. When Tony Montana takes possession of the coked-up bombshell called Elvira Hancock, he is filling his waterbed with cash, not class. Even more excruciatingly, Scarface tells us these truths without any self-righteousness, without the consoling promise that manly discipline can save America from its fate. In the moral economy of this movie, the terms of critique have become indistinguishable from the terms of affirmation. “You know what capitalism is?” Tony answers his own question: “Getting fucked.”

Donovan put Neumann in charge of the Research and Analysis Branch of the OSS, studying Nazi-ruled central Europe. Neumann was soon joined by the philosopher Herbert Marcuse and the legal scholar Otto Kirchheimer, his colleagues at the left-wing Institute for Social Research, which had been founded in Frankfurt in 1923 but had moved to Columbia University after the Nazis came to power.

An update of the promise, that the media could create a different, even a better world, seems laughable from our perspective of experience with the technologically based democracies of markets. As a utopia-ersatz, this promise appears to be obsolete in the former hegemonial regions of North America and western and northern Europe. Now that it is possible to create a state with media, they are no longer any good for a revolution. The media are an indispensable component of functioning social hierarchies, both from the top down and the bottom up, of power and countervailing power. They have taken on systemic character. Without them, nothing works anymore in what the still surviving color supplements in a careless generalization continue to call a society. Media are an integral part of the everyday coercive context, which is termed “practical constraints.” As cultural techniques, which need to be learned for social fitness, they are at the greatest possible remove from what whips us into a state of excitement, induces aesthetic exultation, or triggers irritated thoughts.

[...]

At the same time, many universities have established courses in media design, media studies, and media management. Something that operates as a complex, dynamic, and edgy complex between the discourses, that is, something which can only operate interdiscursively, has acquired a firm and fixed place in the academic landscape. This is reassuring and creates professorial chairs, upon which a once anarchic element can be sat out and developed into knowledge for domination and control. Colleges and academies founded specifically for the media proactively seek close relationships with the industries, manufacturers, and the professional trades associations of design, orientation, and communication.

There are five ways Rushkoff thinks present shock is being experienced and responded to. To begin, we are in an era in which he thinks narrative has collapsed. For as long as we have had the power of speech we have corralled time into linear stories with a beginning, middle and ending. More often than not these stories contained some lesson. They were not merely forms of entertainment or launching points for reflection but contained some guidance as to how we should act in a given circumstance, which, of course, differed by culture, but almost all stories were in effect small oversimplified models of real life.

[...]

The medium Rushkoff thinks is best adapted to the decline of narrative is video games. Yes, they are more often than not violent, but they also seem tailor-made for the kinds of autonomy and collaborative play that are the positive manifestations of our new presentism.

 

2nd Update: Žižek responds to Chomsky's "Fantasies"

  • Žižek v. Chomsky continues: Žižek has responded to Chomsky's last comment in an article in the International Journal of Žižek Studies. You can read the entire article here, select excerpts follow. I am particularly interested in how Žižek focuses on conflicting definitions of ideology as a key factor in Chomsky's misunderstanding of Žižek's work:

For me, on the contrary, the problem is here a very rational one: everything hinges on how we define “ideology.”

[...]

This bias is ideology - a set of explicit and implicit, even unspoken, ethico-political and other positions, decisions, choices, etc., which predetermine our perception of facts, what we tend to emphasize or to ignore, how we organize facts into a consistent whole of a narrative or a theory.

  • After a rational and diplomatic refutation of Chomsky's comments, Žižek ends the essay with a parting blow:

Chomsky obviously doesn’t agree with me here. So what if – just another fancy idea of mine – what if Chomsky can not find anything in my work that goes “beyond the level of something you can explain in five minutes to a twelve-year-old” because, when he deals with continental thought, it is his mind which functions as the mind of a twelve-year-old, the mind which is unable to distinguish serious philosophical reflection from empty posturing and playing with empty words?

 

Next gen gaming on Oculus Rift, McLuhan on surveillance state, Rushkoff on viral media

The spy is the ideal tourist because he represents an inner self perfectly contained within an outer self that is adapted to any possible location or circumstance. Travel can broaden him by the width of a new sexual conquest, but for the most part, he's seen everything already. Going to the Louvre won't make him vulnerable, and he won't stammer when he buys his ticket. The pathos of the whole Bourne series lies in the way it gives us a character who's been left with the spy's invulnerable outer shell but lost the inner self it was originally meant to protect.

Newman: It has become a frightening world. We seem to be constantly under surveillance. How can we deal with this menace?

McLuhan: The new human occupation of the electronic age has become surveillance. CIA-style espionage is now the total human activity. Whether you call it audience rating, consumer surveys and so on—all men are now engaged as hunters of espionage. So women are completely free to take over the dominant role in our society. Women’s liberation represents demands for absolute mobility, not just physical and political freedom to change roles, jobs and attitudes—but total mobility.

Today, our social media amplify and accelerate word of mouth to a new level. These aren’t hushed water-cooler conversations about whatever salacious gossip we’ve seen on the news; they are publicly broadcast pronouncements about who is a hero, who is a traitor, who is a liar, or who is a fraud. In a media culture that values retweets and “likes” even more than money, stories spread and replicate less because they titillate than because they are suitable subjects for loud, definitive, 140-character declarations.

 

Update: Chomsky contra Žižek

Noam Chomsky has responded to Žižek's response:

Žižek finds nothing, literally nothing, that is empirically wrong. That’s hardly a surprise. Anyone who claims to find empirical errors, and is minimally serious, will at the very least provide a few particles of evidence – some quotes, references, at least something. But there is nothing here – which, I’m afraid, doesn’t surprise me either. I’ve come across instances of Žižek’s concept of empirical fact and reasoned argument.

For example, in the Winter 2008 issue of the German cultural journal Lettre International, Žižek attributed to me a racist comment on Obama by Silvio Berlusconi. I ignored it. Anyone who strays from ideological orthodoxy is used to this kind of treatment. However, an editor of Harper’s magazine, Sam Stark, was interested and followed it up. In the January 2009 issue he reports the result of his investigation. Žižek said he was basing the attribution on something he had read in a Slovenian magazine. A marvelous source, if it even exists.

The Guardian provides a summary for those just tuning in:

Noam Chomsky, the professional contrarian, has accused Slavoj Žižek, the professional heretic, of posturing in the place of theory. This is an accusation often levelled at Žižek from within the Anglo-Saxon empirical tradition. Even those like Chomsky who are on the proto-anarchist left of this tradition like to maintain that their theories are empirically verifiable and rooted in reality.

Žižek has countered with the side-swipe that nobody had been so empirically wrong throughout his life as Chomsky. He brought up Chomsky's supposed support for the Khmer Rouge in the 1970s and Chomsky's later self-justification that there hadn't been empirical evidence at the time of the crimes of the Khmer Rouge. It has all got rather heated and intemperate, but then, debates on the left are like that. More time is spent ripping flesh out of each other than it is trying to find a common cause against an apparently invisible and impregnable enemy. But terms have to be defined, ground has to be laid out.

 

Multiple angles on gaming's Ebert, Kubrick, and Citizen Kane

One obvious difference between art and games is that you can win a game. It has rules, points, objectives, and an outcome. Santiago might cite an immersive game without points or rules, but I would say then it ceases to be a game and becomes a representation of a story, a novel, a play, dance, a film. Those are things you cannot win; you can only experience them.

  • Ebert later clarified that he believed "anything can be art," but video games cannot be "high art". Among those who disagreed with Ebert's assessment was film director Clive Barker. Ebert responded to some of Barker's points in an article. Part of Barker's comments dealt with the importance of critics to video games:

Barker:"It used to worry me that the New York Times never reviewed my books. But the point is that people like the books. Books aren't about reviewers. Games aren't about reviewers. They are about players."

Ebert: A reviewer is a reader, a viewer or a player with an opinion about what he or she has viewed, read or played. Whether that opinion is valid is up to his audience. Books, games and all forms of created experience are about themselves; the real question is, do we as their consumers become more or less complex, thoughtful, insightful, witty, empathetic, intelligent, philosophical (and so on) by experiencing them?

  • The idiosyncrasies of video game reviews themselves have become so well known that game reviews are practically considered a genre (see this satirical take from Something Awful: If films were reviewed like video games). Earlier this month video game designer Warren Spector wrote a blog post titled Where's gaming's Roger Ebert? In the post Spector argues that gaming journalism and criticism currently is geared toward specialized groups like developers, publishers, academics, and hard-core gamers, but not "normal people":

What we need, as I said in an earlier column, is our own Andrew Sarris, Leonard Maltin, Pauline Kael, Judith Crist, Manny Farber, David Thomson, or Roger Ebert. We need people in mainstream media who are willing to fight with each other (not literally, of course) about how games work, how they reflect and affect culture, how we judge them as art as well as entertainment. We need people who want to explain games, individually and generically, as much as they want to judge them. We need what might be called mainstream critical theorists.

And they need a home. Not only on the Internet (though we need them there, too), not just for sale at GDC, but on newsstands and bookstore shelves - our own Film Comment, Sight and Sound, Cahiers du Cinema. Magazines you could buy on the newsstand. Why? Because currently, criticism of this - what little we have of it - reaches only the already converted. To reach the parents, the teachers, the politicians, we need to be where they shop. Even if you never pick up a film magazine, the fact that there are obviously serious magazines devoted to the topic makes a difference in the minds of the uninitiated.

To wonder aloud when or where the Roger Ebert of games criticism will emerge is wrongheaded. First, we must ask where is our Scorsese, our Hitchcock, our Coppola, our Tarantino? Where is gaming’s Stanley Kubrick?

A precious few developers may already be taking those first, intrepid steps along that road. Once these new developers are ascendant, once “adult” is no longer just a byword for “graphic” on this medium, perhaps then we can start to discuss a new critical grammar for games, and begin the search for its greatest practitioner.

The game industry is not waiting for its formative masterpieces to materialize from the hazy future. They're here, right now, walking among us. The future was 2002, and in many ways we have yet to surpass it. Like Citizen Kane, Metroid Prime is a landmark in both technical innovation and pure creativity.

  • Writing in the Financial Post, Chad Sapieha says that video games will never have a Citizen Kane moment. Interestingly, his argument isn't based on the artistic merits of video games, but rather on the particularities of the medium: video games become obsolete with technological advancements. A film made in the 1940s may still be available to view on DVD or other format, but a video game released just twenty years ago likely exists as only a memory.

I'd go so far as to suggest that, over time, many games released today will end up sharing more in common with stage productions than books or movies or music. They will be appreciated in the moment, then eventually disappear. People will write about and record their experiences, and those words and videos will continue on to posterity, acting as the primary means by which they are remembered by gamers of the future.

[...]

What I'm saying is simply this: Video game "classics" should be viewed as a breed apart from those of other entertainment mediums. Any attempts at comparison are fundamentally flawed thanks to unavoidable expiration dates imposed by the unstoppable evolution of hardware and advancements in game design.

Our medium is a fantastic vessel that can go places and do things others cannot. Games don’t need to beckon reflection or emotion in order to be good, and I don’t require validation from other people for the hobby to seem like a worthwhile use of my time. Indeed, Citizen Kane is incredible. It’s beautiful, thought-provoking, and inspiring … and film can keep it. Video games don’t need any of it; they never have and never will.

The problem with gaming’s incessant desire to be just like big brother Hollywood is multifarious and exceedingly annoying – like a thousand-headed hydra puffing away on an equal number of vuvuzelas. Have games or games criticism earned a place in the rarefied pantheon of unanimously beloved “mainstream” art? No, not really. Would it be cool if we had a Citizen Kane or, as Warren Spector suggests, an Ebert? I guess so.

But everyone waiting for those shining beacons of cultural acceptance to descend from on-high utterly fails to understand two key points: 1) in this day and age, creating direct analogs to those landmarks is actually impossible, and 2) games and games criticism are in the midst of a renaissance. An unstoppable explosion of evolution and creativity. The formation of an identity that is, frankly, far more exciting than film. Why aren’t we championing that to everyone with (or without) ears? Why are we instead breathlessly awaiting the day our medium suddenly and inexplicably conforms to somebody else’s standard?

Žižek contra Chomsky

  • A minor war of words has emerged between two of my favorite public intellectuals: Noam Chomsky and Slavoj Žižek. Late last month Open Culture posted audio of an interview with Chomsky (apparently from 2012). The interviewer asked for Chomsky's thoughts on Žižek (along with Derrida and Lacan) in light of Chomsky's views on the use of theory. In part, Chomsky responded:

What you’re referring to is what’s called “theory.” And when I said I’m not interested in theory, what I meant is, I’m not interested in posturing–using fancy terms like polysyllables and pretending you have a theory when you have no theory whatsoever. So there’s no theory in any of this stuff, not in the sense of theory that anyone is familiar with in the sciences or any other serious field. Try to find in all of the work you mentioned some principles from which you can deduce conclusions, empirically testable propositions where it all goes beyond the level of something you can explain in five minutes to a twelve-year-old. See if you can find that when the fancy words are decoded. I can’t. So I’m not interested in that kind of posturing. Žižek is an extreme example of it. I don’t see anything to what he’s saying. Jacques Lacan I actually knew. I kind of liked him. We had meetings every once in awhile. But quite frankly I thought he was a total charlatan. He was just posturing for the television cameras in the way many Paris intellectuals do. Why this is influential, I haven’t the slightest idea. I don’t see anything there that should be influential.

What is that about, again, the academy and Chomsky and so on? Well with all deep respect that I do have for Chomsky, my first point is that Chomsky, who always emphasizes how one has to be empirical, accurate, not just some crazy Lacanian speculations and so on… well I don’t think I know a guy who was so often empirically wrong in his descriptions in his whatever! Let’s look… I remember when he defended this demonstration of Khmer Rouge. And he wrote a couple of texts claiming: “No, this is Western propaganda. Khmer Rouge are not as horrible as that.” And when later he was compelled to admit that Khmer Rouge were not the nicest guys in the Universe and so on, his defense was quite shocking for me. It was that “No, with the data that we had at that point, I was right. At that point we didn’t yet know enough, so… you know.” But I totally reject this line of reasoning.

  • Chomsky certainly isn't the first person to accuse Žižek of substanceless sophistry, but to my knowledge he's the most prominent so far.
