Curry Chandler

Curry Chandler is a writer, researcher, and independent scholar working in the field of communication and media studies. His writing on media theory and policy has been published in the popular press as well as academic journals. Curry approaches the study of communication from a distinctly critical perspective, and with a commitment to addressing inequality in power relations. The scope of his research activity includes media ecology, political economy, and the critique of ideology.

Curry is a graduate student in the Communication Department at the University of Pittsburgh, having previously earned degrees from Pepperdine University and the University of Central Florida.

Filtering by Tag: videogames

The unreal urbanism of Pokémon Go

Earlier this month the mobile-app game Pokémon Go was released in the U.S., and the game has been ubiquitous ever since. Aside from being a sudden pop culture phenomenon, the game's success has significant implications. First, this is clearly a breakthrough moment for augmented reality. Pokémon Go is not the first augmented reality game, nor is it the most ambitious, but it has undoubtedly brought AR into mainstream consciousness. Second, the success of Pokémon Go has led me to reconsider all my previously held assumptions about the uses of mobile apps and gamification for interfacing with urban spaces. I have historically been cynical about the prospect of using mobile games or AR interfaces to interact with urban space: they usually strike me as shallow and insignificant, producing a fleeting diversion like a flash mob dance party rather than altering people's perceptions of place in any lasting or meaningful way. Pokémon Go satisfies all the requirements of my earlier preconceptions, yet despite my best critical instincts, I really like the game.

The buzz about Pokémon Go had been building on various online forums, and after it was released it was virtually impossible to avoid Pokémon Go-related posts. Save for maybe 10 minutes with a friend's Game Boy in the late 90s, I've never played a Pokémon game, and I preemptively wrote off Pokémon Go as yet another cultural fad that I would never partake in or understand. Curiosity got the best of my wife, however; she downloaded the app and we walked around our neighborhood to test it out. To my surprise, the game was a lot of fun: our familiar surroundings were now filled with digital surprises, and we were excited to see neighborhood landmarks and murals represented as Pokéstops, and wild Pokémon hanging out in the doorways of local shops. We meandered around discovering which of our local landmarks had been incorporated into the game, and each discovery increased my enjoyment of the app. Yes, the game is simple and shallow, but I was completely charmed. I downloaded the game so I could play, too.

Reactions to Pokémon Go have been as fascinating as the game's widespread adoption. Many news articles sensationalized the inherent dangers of playing the game: distracted players wandering into traffic or off of cliffs, people's homes being designated as Pokéstops and besieged by players, and traps being laid (using the game's "lures") to ambush and rob aspiring Pokétrainers. There have also been insightful critical analyses of the game. An early and oft-shared article by Omari Akil considered the implications of Pokémon Go in light of recent police shootings of black men, warning that "Pokemon Go is a death sentence if you are a black man":

I spent less than 20 minutes outside. Five of those minutes were spent enjoying the game. One of those minutes I spent trying to look as pleasant and nonthreatening as possible as I walked past a somewhat visibly disturbed white woman on her way to the bus stop. I spent the other 14 minutes being distracted from the game by thoughts of the countless Black Men who have had the police called on them because they looked “suspicious” or wondering what a second amendment exercising individual might do if I walked past their window a 3rd or 4th time in search of a Jigglypuff.

Others questioned the distribution of Pokémon across neighborhoods, suggesting that poor or black neighborhoods had disproportionately fewer Pokémon and Pokéstops. Among urbanists, however, reaction to the game has been mixed. Mark Wilson at Fastcodesign declared that Pokémon Go "is quietly helping people fall in love with their cities". Ross Brady of Architizer celebrated the game for sparking "a global wave of urban exploration". Writing for Dezeen, Alex Wiltshire boldly states that the game has "redrawn the map of what people find important about the world". CityLab contributor Laura Bliss proclaimed "Pokémon Go has created a new kind of flaneur".

Others have been more critical of the game, with Nicholas Korody at Archinect retorting: "No, Pokémon Go is not an urban fantasy for the new flaneur". At Jacobin, Sam Kriss implores readers to "resist Pokémon Go":

Walk around. Explore your neighborhood. Visit the park. Take in the sights. Have your fun. Pokémon Go is coercion, authority, a command issuing from out of a blank universe, which blasts through social and political cleavages to finally catch ‘em all. It must be resisted.

Some, like Jeff Sparrow at Overland, drew direct parallels to the Situationists.

Writing for the Atlantic, Ian Bogost meditated on "the tragedy of Pokémon Go":

We can have it both ways; we have to, even: Pokémon Go can be both a delightful new mechanism for urban and social discovery, and also a ghastly reminder that when it comes to culture, sequels rule. It’s easy to look at Pokémon Go and wonder if the game’s success might underwrite other, less trite or brazenly commercial examples of the genre. But that’s what the creators of pervasive games have been thinking for years, and still almost all of them are advertisements. Reality is and always has been augmented, it turns out. But not with video feeds of twenty-year old monsters in balls atop local landmarks. Rather, with swindlers shilling their wares to the everyfolk, whose ensuing dance of embrace and resistance is always as beautiful as it is ugly.

Pokémon Go's popularity has led to many online comparisons to the Star Trek: TNG episode "The Game," in which the crew of the Enterprise is overcome by a mind-controlling video game. The game in Star Trek is not, strictly speaking, an augmented reality game, but it does involve projecting images onto the player's vision, similar to an AR overlay. Previous gaming and gadget fads have been compared to the TNG episode, notably Google Glass (for its similarity to the eye-beaming design used to interface with the game in Star Trek) and the pervasively popular Angry Birds (as evident in this parody video). The comparison has regained cultural cachet because, unlike Angry Birds, which can be played on the couch, Pokémon Go is played in motion. This, of course, has contributed to the perception of the game's zombifying effects; we've grown accustomed to everyone's eyes being glued to a smartphone screen in our public spaces, but now there are whole flocks of people milling around with their eyes on their devices.

My cynical side is inclined to agree with the critics who see Pokémon Go's proliferation as proof positive of the pacification and banalization of our society; the visions of Orwell, Bradbury, and Phil Dick all realized at once. But there's something there that has me appreciative, even excited, about this goofy game. As my wife and I wandered our neighborhood looking for pocket monsters, we noticed several other people walking around staring at their phones. This is not an uncommon sight, but it is re-contextualized in light of Pokémon Go's popularity. "Look," my wife would say, "I bet they're playing, too." After a while she had to know for sure, and started walking up to people and asking, "Are you playing Pokémon Go?" Every person she asked was indeed playing the game. Then we were walking along with these people we'd just met, discussing play strategies, sharing Pokéstop locations, spreading word of upcoming lure parties.

One night last week, around 10:30, we went into the Oakland neighborhood, home to the campuses of both Pitt and Carnegie Mellon and a hotbed of Pokémon Go activity. When we arrived, at least 20 people sat along the wall in front of the Soldiers & Sailors Memorial, smartphones in hand. We walked around the base of the Cathedral of Learning, where dozens of people in groups of two, three, or more were slowly pacing, stopping to capture a virtual creature. We crossed the street to Schenley Plaza, where still dozens more people trekked through the grass, laughing and exclaiming and running up to their friends to share which Pokémon they had just caught. Sure, most of these people were only talking to their own groups of friends, if they were talking at all, but it was still a cool experience. For me, the greatest thing was not which monsters I caught or the XP my avatar earned; rather, it was the energy, the unspoken but palpable buzz generated by all these people walking around in the dark of a warm summer night. Yes, I was giving attention to my smartphone screen, but what I remember most from that evening are the stars, and the fireflies, and the murmuring voices. Pokémon Go is promoting a sort of communal public activity, even if the sociality it produces is liminal at best. Yes, it is still shallow, still commercial, still programmed, but it's something; there's an energy there and a potential that is worth paying attention to.

Pokémon Go is not the be-all-end-all of augmented urban exploration, nor should it be considered the pinnacle of how mobile technology can enable new ways of interfacing with city space. But the game's popularity, and my personal experience using it, have given me hope for the potential of AR apps to enrich our experience of urban spaces and engender new types of interactions in our shared environments.

 

Ludology grab bag: video games and authenticity, semiocapitalism, and geography

The postmodern condition presents a constant struggle and conflict between our own desires and a world that seems fully available to experience but devoid of concrete or objective meaning. Video games, by virtue of their most basic structure, allow easy access to the feeling that your chosen actions and goals are both informed and legitimised by the overarching rules surrounding them. This is the very definition of authenticity.

None of this is to say that games are a better or preferable experience than real life or other media, but it is to suggest they’re uniquely placed at this point in time to provide satisfying experiences. Indeed, matching the appeal of video games with the search for authenticity goes a ways toward explaining the particular trajectory of gaming’s prevalence: from being largely rejected as a toy in the production-focused late eighties and early nineties, to an explosion of mainstream acceptance as the global media and advertising machine makes up more and more of our everyday lives.

  • Forbes contributor Michael Thomsen reports on an independent video game that touched on symbolic representation, labor and productivity, and even namedrops Franco Berardi's "semiocapitalism":

In most videogames, the semiotic meaning of the system is accepted by players before they begin playing—they don’t know what tactics they’ll use to win, nor whether they’ll play long enough to do so, but they know that winning or completion is the organizing metaphor. Players aren’t often encouraged to question the values of competitive systems, but only asked to internalize the responsibility of making them work as efficiently as possible, postponing the anxious reality of failure for a few magical moments that we’ve agreed to describe as fun.

In contrast, Rehearsals and Returns overflows with signifiers placed in a system that remains indifferent to their interpretative meaning, and which consciously obscures the player’s desire to interpret them in terms of winning or losing. The system acknowledges player choices—whether you chose to tell Hillary Clinton something hateful or nice—but the game doesn’t interpret the player’s choice, nor does it tie the economy of collectible conversation pieces to any allegorical meaning. It uses the game as a sort of digital confessional chamber, in which familiar units of social and political meaning are taken out of their historical narratives and given to the player in an incomplete space meant only for self-reflection.

  • Ian Bogost recorded an interview for the Go For Rainbow podcast, discussing "gaming culture as it relates to geographical space, and when and when not to whip out the PhD cred". Full audio is available here.

Video mélange: David Harvey, Antonio Negri, and Saints Row IV

 

 

Ender's Game analyzed, the Stanley Parable explored, Political Economy of zombies, semiotics of Twitter, much more

It's been a long time since the last update (what happened to October?), so this post is extra long in an attempt to catch up.

In a world in which interplanetary conflicts play out on screens, the government needs commanders who will never shrug off their campaigns as merely “virtual.” These same commanders must feel the stakes of their simulated battles to be as high as actual warfare (because, of course, they are). Card’s book makes the nostalgic claim that children are useful because they are innocent. Hood’s movie leaves nostalgia by the roadside, making the more complex assertion that they are useful because of their unique socialization to be intimately involved with, rather than detached from, simulations.

  • In the ongoing discourse about games criticism and its relation to film reviews, Bob Chipman's latest Big Picture post uses his own review of the Ender's Game film as an entry point for a breathless treatise on criticism. The video presents a concise and nuanced overview of arts criticism, from the classical era through film reviews as consumer reports up to the very much in-flux conceptions of games criticism. Personally, I find this video sub-genre (where spoken content is crammed into a Tommy gun barrage of word bullets so that the narrator can convey a lot of information in a short running time) irritating and mostly worthless, since the verbal information is presented faster than the listener can really process it. It reminds me of Film Crit Hulk, someone who writes excellent essays with obvious insight into filmmaking, but whose aesthetic choice (or "gimmick") of writing in all caps is often a distraction from the content and a deterrent to readers. Film Crit Hulk has of course addressed this issue and explained the rationale for the choice, but considering that his more recent articles have dropped the third-person "Hulk speak" writing style, the all caps seems played out. Nevertheless, I'm sharing the video because Mr. Chipman makes a lot of interesting points, particularly regarding the cultural contexts for the various forms of criticism. Just remember to breathe deeply and monitor your heart rate while watching.

  • This video from Satchbag's Goods is ostensibly a review of Hotline Miami, but develops into a discussion of art movements and Kanye West:

  • This short interview with Slavoj Žižek in New York magazine continues a trend I've noticed since The Pervert's Guide to Ideology was released, wherein writers interviewing Žižek feel compelled to include themselves and their reactions to, and interactions with, Žižek in their articles. Something about a Žižek encounter brings out the gonzo in journalists. The NY mag piece is also notable for this succinct positioning of Žižek's contribution to critical theory:

Žižek, after all, the Yugoslav-born, Ljubljana-based academic and Hegelian; mascot of the Occupy movement, critic of the Occupy movement; and former Slovenian presidential candidate, whose most infamous contribution to intellectual history remains his redefinition of ideology from a Marxist false consciousness to a Freudian-Lacanian projection of the unconscious. Translation: To Žižek, all politics—from communist to social-democratic—are formed not by deliberate principles of freedom, or equality, but by expressions of repressed desires—shame, guilt, sexual insecurity. We’re convinced we’re drawing conclusions from an interpretable world when we’re actually just suffering involuntary psychic fantasies.

Following the development of the environment on the team's blog you can see some of the gaps between what data was deemed noteworthy or worth recording in the seventeenth century and the level of detail we now expect in maps and other infographics. For example, the team struggled to pinpoint the exact location on Pudding Lane of the bakery where the Great Fire of London is thought to have originated and so just ended up placing it halfway along.

  • Stephen Totilo reviewed the new pirate-themed Assassin's Creed game for the New York Times. I haven't played the game, but I love that the sections of the game set in the present day have shifted from the standard global conspiracy tropes seen in the earlier installments to postmodern self-referential and meta-fictional framing:

Curiously, a new character is emerging in the series: Ubisoft itself, presented mostly in the form of self-parody in the guise of a fictional video game company, Abstergo Entertainment. We can play small sections as a developer in Abstergo’s Montreal headquarters. Our job is to help turn Kenway’s life — mined through DNA-sniffing gadgetry — into a mass-market video game adventure. We can also read management’s emails. The team debates whether games of this type could sell well if they focused more on peaceful, uplifting moments of humanity. Conflict is needed, someone argues. Violence sells.

It turns out that Abstergo is also a front for the villainous Templars, who search for history’s secrets when not creating entertainment to numb the population. In these sections, Ubisoft almost too cheekily aligns itself with the bad guys and justifies its inevitable 2015 Assassin’s Creed, set during yet another violent moment in world history.

  • Speaking of postmodern, self-referential, meta-fictional video games: The Stanley Parable was released late last month. There has already been a bevy of analysis written about the game, but I am waiting for the Mac release to play it and doing my best to avoid spoilers in the meantime. Brenna Hillier's post at VG247 is spoiler-free (assuming you are at least familiar with the game's premise, or its original incarnation as a Half-Life mod), and calls The Stanley Parable "a reaction against, commentary upon, critique and celebration of narrative-driven game design":

The Stanley Parable wants you to think about it. The Stanley Parable, despite its very limited inputs (you can’t even jump, and very few objects are interactive) looks at those parts of first-person gaming that are least easy to design for – exploration and messing with the game’s engine – and foregrounds them. It takes the very limitations of traditional gaming narratives and uses them to ruthlessly expose their own flaws.

Roy’s research focus prior to founding Bluefin, and continued interest while running the company, has to do with how both artificial and human intelligences learn language. In studying this process, he determined that the most important factor in meaning making was the interaction between human beings: no one learns language in a vacuum, after all. That lesson helped inform his work at Twitter, which started with mapping the connection between social network activity and live broadcast television.

Aspiring to cinematic qualities is not bad in and of itself, nor do I mean to shame fellow game writers, but developers and their attendant press tend to be myopic in their point of view, both figuratively and literally. If we continually view videogames through a monocular lens, we miss much of their potential. And moreover, we begin to use ‘cinematic’ reflexively without taking the time to explain what the hell that word means.

Metaphor is a powerful tool. Thinking videogames through other media can reframe our expectations of what games can do, challenge our design habits, and reconfigure our critical vocabularies. To crib a quote from Andy Warhol, we get ‘a new idea, a new look, a new sex, a new pair of underwear.’ And as I hinted before, it turns out that fashion and videogames have some uncanny similarities.

Zombies started their life in the Hollywood of the 1930s and ‘40s as simplistic stand-ins for racist xenophobia. Post-millennial zombies have been hot-rodded by Danny Boyle and made into a subversive form of utopia. That grim utopianism was globalized by Max Brooks, and now Brad Pitt and his partners are working to transform it into a global franchise. But if zombies are to stay relevant, it will rely on the shambling monsters' ability to stay subversive – and real subversive shocks and terror are not dystopian. They are utopian.

Ironically, our bodies now must make physical contact with devices dictating access to the real; Apple’s Touch ID sensor can discern for the most part if we are actually alive. This way, we don’t end up trying to find our stolen fingers on the black market, or prevent others from 3D scanning them to gain access to our lives.

This is a monumental shift from when Apple released its first iPhone just six years ago. It’s a touchy subject: fingerprinting authentication means we confer our trust in an inanimate object to manage our animate selves - our biology is verified, digitised, encrypted, as they are handed over to our devices.

Can you really buy heroin on the Web as easily as you might purchase the latest best-seller from Amazon? Not exactly, but as the FBI explained in its complaint, it wasn't exactly rocket science, thanks to Tor and some bitcoins. Here's a rundown of how Silk Road worked before the feds swooped in.

  • Henry Jenkins posted the transcript of an interview with Mark J.P. Wolf. The theme of the discussion is "imaginary worlds," and they touch upon the narratology vs. ludology conflict in gaming:

The interactivity vs. storytelling debate is really a question of the author saying either “You choose” (interaction) or “I choose” (storytelling) regarding the events experienced; it can be all of one or all of the other, or some of each to varying degrees; and even when the author says “You choose”, you are still choosing from a set of options chosen by the author.  So it’s not just a question of how many choices you make, but how many options there are per choice.  Immersion, however, is a different issue, I think, which does not always rely on choice (such as immersive novels), unless you want to count “Continue reading” and “Stop reading” as two options you are constantly asked to choose between.
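Wolf's "options per choice" distinction is easy to make concrete. Here is a minimal branching-story sketch in Python; every scene name and option label is invented for illustration, not drawn from any game Wolf discusses. The point it demonstrates is his: the player picks, but only from option sets the author wrote, and the author also fixed how many options each choice offers.

    # A toy branching story: "You choose" (interaction) at nodes with
    # options; "I choose" (storytelling) at nodes with none.
    story = {
        "start": {
            "text": "You wake in a locked room.",
            "options": {"Search the desk": "desk", "Call for help": "hallway"},
        },
        "desk": {"text": "You find a key and let yourself out.", "options": {}},
        "hallway": {"text": "A guard hears you and unlocks the door.", "options": {}},
    }

    def play(node="start"):
        while True:
            scene = story[node]
            print(scene["text"])
            if not scene["options"]:
                return
            labels = list(scene["options"])
            for i, label in enumerate(labels, 1):
                print(f"  {i}. {label}")
            # The player chooses -- from a menu the author authored.
            node = scene["options"][labels[int(input("> ")) - 1]]

    play()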

Manifesto for a Ludic Century, ludonarrative dissonance in GTA, games and mindf*cks, and more

Systems, play, design: these are not just aspects of the Ludic Century, they are also elements of gaming literacy. Literacy is about creating and understanding meaning, which allows people to write (create) and read (understand).

New literacies, such as visual and technological literacy, have also been identified in recent decades. However, to be truly literate in the Ludic Century also requires gaming literacy. The rise of games in our culture is both cause and effect of gaming literacy in the Ludic Century.

So, perhaps there is one fundamental challenge for the Manifesto for a Ludic Century: would a truly ludic century be a century of manifestos? Of declaring simple principles rather than embracing systems? Or, is the Ludic Manifesto meant to be the last manifesto, the manifesto to end manifestos, replacing simple answers with the complexity of "information at play?"

Might we conclude: videogames are the first creative medium to fully emerge after Marshall McLuhan. By the time they became popular, media ecology as a method was well-known. McLuhan was a popular icon. By the time the first generation of videogame players was becoming adults, McLuhan had become a trope. When the then-new publication Wired Magazine named him their "patron saint" in 1993, the editors didn't even bother to explain what that meant. They didn't need to.

By the time videogame studies became a going concern, McLuhan was gospel. So much so that we don't even talk about him. To use McLuhan's own language of the tetrad, game studies have enhanced or accelerated media ecology itself, to the point that the idea of studying the medium itself over its content has become a natural order.

Generally speaking, educators have warmed to the idea of the flipped classroom far more than that of the MOOC. That move might be injudicious, as the two are intimately connected. It's no accident that private, for-profit MOOC startups like Coursera have advocated for flipped classrooms, since those organizations have much to gain from their endorsement by universities. MOOCs rely on the short, video lecture as the backbone of a new educational beast, after all. Whether in the context of an all-online or a "hybrid" course, a flipped classroom takes the video lecture as a new standard for knowledge delivery and transfers that experience from the lecture hall to the laptop.

  • Also, with increased awareness of Animal Crossing following the latest game's release for the Nintendo 3DS, Bogost recently posted an excerpt from his 2007 book Persuasive Games discussing consumption and naturalism in Animal Crossing:

Animal Crossing deploys a procedural rhetoric about the repetition of mundane work as a consequence of contemporary material property ideals. When my (then) five-year-old began playing the game seriously, he quickly recognized the dilemma he faced. On the one hand, he wanted to spend the money he had earned from collecting fruit and bugs on new furniture, carpets, and shirts. On the other hand, he wanted to pay off his house so he could get a bigger one like mine.

Ludonarrative dissonance is when the story the game is telling you and your gameplay experience somehow don’t match up. As an example, this was a particular issue in Rockstar’s most recent game, Max Payne 3. Max constantly makes remarks about how terrible he is at his job, even though he does more than is humanly possible to try to protect his employers – including making perfect one-handed head shots in mid-air while drunk and high on painkillers. The disparity and the dissonance between the narrative of the story and the gameplay leave things feeling off kilter and poorly interconnected. It doesn’t make sense or fit with your experience, so it feels wrong and damages the cohesiveness of the game world and story. It’s like when you go on an old-lady-only murdering spree as Niko, who is supposed to be a reluctant killer with a traumatic past, not a gerontophobic misogynist.

What I find strange, in light of our supposed anti-irony cultural moment, is a kind of old-fashioned ironic conceit behind a number of recent critical darlings in the commercial videogame space. 2007's Bioshock and this year’s Bioshock: Infinite are both about the irony of expecting ‘meaningful choice’ to live in an artificial dome of technological and commercial constraints. Last year’s Spec Ops: The Line offers a grim alchemy of self-deprecation and preemptive disdain for its audience. The Grand Theft Auto series has always maintained a cool, dismissive cynicism beneath its gleefully absurd mayhem. These games frame choice as illusory and experience as artificial. They are expensive, explosive parodies of free will.

To cut straight to the heart of it, Bioshock seems to suffer from a powerful dissonance between what it is about as a game, and what it is about as a story. By throwing the narrative and ludic elements of the work into opposition, the game seems to openly mock the player for having believed in the fiction of the game at all. The leveraging of the game’s narrative structure against its ludic structure all but destroys the player’s ability to feel connected to either, forcing the player to either abandon the game in protest (which I almost did) or simply accept that the game cannot be enjoyed as both a game and a story, and to then finish it for the mere sake of finishing it.

The post itself makes a very important point: games, for the most part, can’t pull the Mindfuck like movies can because of the nature of the kind of storytelling to which most games are confined, which is predicated on a particular kind of interaction. Watching a movie may not be an entirely passive experience, but it’s clearly more passive than a game. You may identify with the characters on the screen, but you’re not meant to implicitly think of yourself as them. You’re not engaging in the kind of subtle roleplaying that most (mainstream) games encourage. You are not adopting an avatar. In a game, you are your profile, you are the character you create, and you are also to a certain degree the character that the game sets in front of you. I may be watching everything Lara Croft does from behind her, but I also control her; to the extent that she has choices, I make them. I get her from point A to B, and if she fails it’s my fault. When I talk about something that happened in the game, I don’t say that Lara did it. I say that I did.

Anachrony is a common storytelling technique in which events are narrated out of chronological order. A familiar example is a flashback, where story time jumps to the past for a bit, before returning to the present. The term "nonlinear narrative" is also sometimes used for this kind of out-of-order storytelling (somewhat less precisely).

While it's a common technique in literature and film, anachrony is widely seen as more problematic to use in games, perhaps even to the point of being unusable. If the player's actions during a flashback scene imply a future that differs considerably from the one already presented in a present-day scene (say, the player kills someone who they had been talking to in a present-day scene, or commits suicide in a flashback), this produces an inconsistent narrative. The root of the problem is that players generally have a degree of freedom of action, so flashbacks are less like the case in literature and film—where already decided events are simply narrated out of order—and more like time travel, where the player travels back in time and can mess up the timeline.
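The "time travel" problem lends itself to a small sketch. Assuming a deliberately simple model (hypothetical, not from any actual engine) in which the game records facts established in present-day scenes, a designer must either block contradictory flashback actions or accept an inconsistent narrative:

    # Facts the player has already seen established in present-day scenes.
    established = {"mentor_alive_in_present": True}

    # What each available flashback action would imply about the present.
    implications = {
        "spare_mentor": {"mentor_alive_in_present": True},
        "kill_mentor": {"mentor_alive_in_present": False},
    }

    def resolve_flashback(action):
        implied = implications[action]
        for fact, value in implied.items():
            if established.get(fact, value) != value:
                # A novel or film never reaches this branch: its past is
                # fixed, merely narrated out of order.
                return f"{action}: blocked, contradicts present-day scenes"
        established.update(implied)
        return f"{action}: allowed"

    print(resolve_flashback("kill_mentor"))   # blocked
    print(resolve_flashback("spare_mentor"))  # allowed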

The first of the books is set to be published in early 2014. Some of the writers that will be published by Press Select in its first round have written for publications like Edge magazine, Kotaku, Kill Screen and personal blogs, including writers like Chris Dahlen, Michael Abbott, Jenn Frank, Jason Killingsworth, Maddy Myers, Tim Rogers, Patricia Hernandez and Robert Yang.

Inside Korea's gaming culture, virtual worlds and economic modeling, Hollywood's Summer of Doom continued, and more

  • I've long been fascinated by the gaming culture in South Korea, and Tom Massey has written a great feature piece for Eurogamer titled Seoul Caliber: Inside Korea's Gaming Culture. To this westerner, who has never visited Korea, the article reads almost more like cyberpunk fiction than games journalism:

Not quite as ubiquitous, but still extremely common, are PC Bangs: LAN gaming hangouts where 1000 Won nets you an hour of multiplayer catharsis. In Gangnam's Maxzone, overhead fans rotate at Apocalypse Now speed, slicing cigarette smoke as it snakes through the blades. Korea's own NCSoft, whose European base is but a stone's throw from the Eurogamer offices, is currently going strong with its latest MMO, Blade & Soul.

"It's relaxing," says Min-Su, sipping a Milkis purchased from the wall-mounted vending machine. "And dangerous," he adds. "It's easy to lose track of time playing these games, especially when you have so much invested in them. I'm always thinking about achieving the next level or taking on a quick quest to try to obtain a weapon, and the next thing I know I've been here for half the day."

[youtube=http://www.youtube.com/watch?v=Kue_gd8DneU&w=420&h=315]

Creation and simulation in virtual worlds appear to offer the best domain to test the new ideas required to tackle the very real problems of deprivation, inequality, unemployment, and poverty that exist in national economies. On that note, the need to see our socioeconomic institutions for the games that they really are seems even more poignant.

In the words of Vili Lehdonvirta, a leading scholar in virtual goods and currencies, the suffering we see today is “not some consequence of natural or physical law” it instead “is a result of the way we play these games.”

The global economy seems to be bifurcating into a rich/tech track and a poor/non-tech track, not least because new technology will increasingly destroy/replace old non-tech jobs. (Yes, global. Foxconn is already replacing Chinese employees with one million robots.) So far so fairly non-controversial.

The big thorny question is this: is technology destroying jobs faster than it creates them?

[...]

We live in an era of rapid exponential growth in technological capabilities. (Which may finally be slowing down, true, but that’s an issue for decades hence.) If you’re talking about the economic effects of technology in the 1980s, much less the 1930s or the nineteenth century, as if it has any relevance whatsoever to today’s situation, then you do not understand exponential growth. The present changes so much faster that the past is no guide at all; the difference is qualitative, not just quantitative. It’s like comparing a leisurely walk to relativistic speeds.
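The quoted point about qualitative difference survives a quick back-of-envelope check. Purely for illustration, assume a Moore's-law-style doubling of technological capability every two years:

    def growth_factor(years, doubling_period=2):
        # Capability multiplier after `years` of exponential growth.
        return 2 ** (years / doubling_period)

    print(growth_factor(10))                      # one decade: ~32x
    print(growth_factor(30))                      # 1980s to 2010s: ~32,768x
    print(growth_factor(30) / growth_factor(10))  # the gap itself compounds: 1,024x

On those (invented) assumptions, a 30-year comparison is not three times a 10-year comparison but roughly a thousand times larger, which is the force of the walk-versus-relativistic-speeds image.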

We begin with a love story--from a man who unwittingly fell in love with a chatbot on an online dating site. Then, we encounter a robot therapist whose inventor became so unnerved by its success that he pulled the plug. And we talk to the man who coded Cleverbot, a software program that learns from every new line of conversation it receives...and that's chatting with more than 3 million humans each month. Then, five intrepid kids help us test a hypothesis about a toy designed to push our buttons, and play on our human empathy. And we meet a robot built to be so sentient that its creators hope it will one day have a consciousness, and a life, all its own.

[youtube=http://www.youtube.com/watch?v=pHCwaaactyY&w=420&h=315]

"These outages are absolutely going to continue," said Neil MacDonald, a fellow at technology research firm Gartner. "There has been an explosion in data across all types of enterprises. The complexity of the systems created to support big data is beyond the understanding of a single person and they also fail in ways that are beyond the comprehension of a single person."

From high volume securities trading to the explosion in social media and the online consumption of entertainment, the amount of data being carried globally over the private networks, such as stock exchanges, and the public internet is placing unprecedented strain on websites and on the networks that connect them.

What I want is systems that have intrinsic rewards; that are disciplines similar to drawing or playing a musical instrument. I want systems which are their own reward.

What videogames almost always give me instead is labor that I must perform for an extrinsic reward. I want to convince you that not only is this not what I want, this isn’t really what anyone wants.

[youtube=http://www.youtube.com/watch?v=GpO76SkpaWQ&w=560&h=315]

This 'celebrification' is enlivening making games and giving players role models, drawing more people into development, especially indie and auteured games. This shift is proving more prosperous than any Skillset-accredited course or government pot could ever hope for. We are making men sitting in pants at their laptops for 12 hours a day as glamorous as it could be.

Creating luminaries will lead to all the benefits that more people in games can bring: a bigger and brighter community, plus new and fresh talent making exciting games. However, celebritydom demands storms, turmoil and gossip.

Spielberg's theory is essentially that a studio will eventually go under after it releases five or six bombs in a row. The reason: budgets have become so gigantic. And, indeed, this summer has been full of movies with giant budgets and modest grosses, all of which has elicited hand-wringing about financial losses, the lack of a quality product (another post-apocalyptic thriller? more superheroes?), and a possible connection between the two. There has been some hope that Hollywood's troubles will lead to a rethinking of how movies get made, and which movies get greenlit by studio executives. But a close look at this summer's grosses suggest a more worrisome possibility: that the studios will become more conservative and even less creative.

[youtube=http://www.youtube.com/watch?v=F4mDNMSntlA&w=420&h=315]

Bogost on Facebook feudalism, narrative possibilities in games, the gamification of sex

The short truth is this: Facebook doesn't care if developers can use the platform easily or at all. In fact, it doesn't seem to concern itself with any of the factors that might be at play in developers' professional or personal circumstances. The Facebook Platform is a selfish, self-made altar to Facebook, at which developers are expected to kneel and cower, rather than a generous contribution to the success of developers that also happens to benefit Facebook by its aggregate effects.

A lot of reactions to the narrative of [Bioshock] Infinite that I encountered were that it “didn’t make sense,” and that it was “being weird for the sake of being weird.”

Those reminded me of criticisms leveled at two of my favorite filmmakers: David Lynch and Stanley Kubrick. I think these comments arise because Infinite doesn’t go all the way, it hesitates. It tries to stick to conventional logic. It strews about Voxaphones to explain its abstractions.

  • Shujaat Syed at Player Effort writes about "making linear story telling interesting in video games by acknowledging the fourth wall":

At their core, video games are authoritarian. They have rules that need to be followed, and you are restricted to the gameplay systems and a story the programmers and designers have created. However, compared to other forms of media, they offer a breadth of freedom that is unmatched. I will not be speaking about the freedom of exploration. What I will be talking about is the freedom of creating a different type of narrative that is only possible through video games by breaking the 4th wall between the game and the player. This is one of our medium’s greatest advantages; however, very rarely is this power explored. With video games, we can have truly powerful forms of narrative, but at most we get ideas that could theoretically work as movies. Open-world sandbox games can dodge this because the player is free to create their own narrative alongside the main plotline, and this is a concept that is entirely unique to video games. It’s the linear story-based games where the narrative is usually much harder to distinguish from what you would get from a book or movie.

In addition to registering your decibel levels (I’m hoping mine will get a boost from the garbage truck always idling outside my window), Spreadsheets will also monitor your overall duration, frequency, and somehow, thrusts per minute. Apparently this does not require supplementary electrodes.

What’s more, you can unlock “badges” and the like. For example, to meet the “Hello Sunshine” achievement, worth 10 points, you must take on the ultimate challenge of our time: “perform morning sex.”

Multiple angles on gamification

  • This week my fiancée told me about an app she had recently installed on her phone. As she excitedly described it, users of the app can "check in" at a retail store (it sounded like your location is verified through GPS) and receive points for doing so, presumably to redeem for store purchases, though I don't recall all the details. I should also mention that this app is not Foursquare, though I am not sure how the two apps differ specifically. Apps like this exemplify the gamification trend in marketing and advertising. There is an entire wiki dedicated to gamification, with detailed pages like this one describing the various game mechanics used in gamification; a rough sketch of the check-in mechanic appears after the excerpts below.

Gamification applies basic game thinking and game mechanics to a non-gaming context. Many gamification models reward users for participating, completing defined user tasks, or achieving goals. A great example is Foursquare, which awards points and perks for "checking-in" to places you go. Although some models introduce distinguishable game-related features, gamification of online shopping includes any type of game thinking applied to an online shopping model.

Gamification makes things fun because it taps into our basic human appetite for competition, stature, and social interaction. Rather than feeling tricked or manipulated, we feel a sense of control when participating in transparent game-oriented shopping. As a result, shopping becomes more exciting and rewarding, while increasing highly sought-after engagement and customer loyalty for retailers and brands.
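As promised above, here is a rough sketch of the check-in mechanic these excerpts describe: verify approximate GPS proximity, then award points and badges. All store coordinates, point values, and badge thresholds below are invented for illustration.

    import math

    STORES = {"corner_books": (40.4406, -79.9959)}  # hypothetical store, lat/lon
    BADGES = {5: "Regular", 25: "Superfan"}          # check-in counts -> badge

    profiles = {}  # user -> {"points", "checkins", "badges"}

    def distance_m(a, b):
        # Equirectangular approximation; plenty accurate at storefront scale.
        lat = math.radians((a[0] + b[0]) / 2)
        dx = math.radians(b[1] - a[1]) * math.cos(lat) * 6_371_000
        dy = math.radians(b[0] - a[0]) * 6_371_000
        return math.hypot(dx, dy)

    def check_in(user, store, gps, radius_m=75):
        if distance_m(gps, STORES[store]) > radius_m:
            return "too far away to check in"
        p = profiles.setdefault(user, {"points": 0, "checkins": 0, "badges": set()})
        p["points"] += 10
        p["checkins"] += 1
        for threshold, badge in BADGES.items():
            if p["checkins"] >= threshold:
                p["badges"].add(badge)
        return f"+10 points ({p['points']} total)"

    print(check_in("alice", "corner_books", (40.4407, -79.9958)))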

  • This LinkedIn post by Dan Sanker describes gamification as "the application of game elements and digital game design techniques to non-game problems" and considers potential applications:

Small tools, influenced by simple game mechanics can be used to modify people’s behavior. [...] There is a long way to go to make some mundane tasks more engaging. I think the paradigm that rang true the most this week, especially after talking with the kids about their experiences – is that we need to start thinking about customers, consumers, employees and/or students less as ‘Users’ and more as ‘Players.’ Are there ways to enjoy the experience of buying, procuring, working and learning? It might be a better way for us to consider interacting with Generation Z and those who come after them.

But gamification hasn't just grabbed the attention of the corporate world. Teachers are trying to make learning more fun by introducing games into the classroom in the hope of keeping children engaged for longer. This made me think about how many banks, building societies and other financial services providers are using gamification to encourage kids to start saving or educate them about money.

In the minds of Silicon Valley’s eternal optimists, and the journalists who so unconditionally love them, gamification is the possibility of rendering intricately complex processes, such as education or health care, more effective by transforming them into games. If kids aren’t reading, goes the gamified mantra, perhaps some friendly competitive system of badges and leaderboards might provide the missing incentive. And if adults are getting a tad too heavy, just slap a gizmo on their wrists that challenges them to burn more and more calories each day and they’ll play along.

As a professor of video games, I’ve strong doubts that the same principles that compel us to save Princess Zelda or defeat Donkey Kong apply in the classroom, the boardroom, or the emergency room. Like most game scholars, I view gamification as the creation of the TED-circle nabobs, largely empty, feel-good fodder for the intellectually limp. But the idea isn’t totally useless: There are some special categories of human events, rare and far-between, whose own innate absurdities are so profound that a touch of gamification might actually do them good.

I’m talking, of course, about the Israeli-Palestinian peace process.

We are naturally drawn to entertaining, visually appealing, easily digestible information sources and the power is in our hands to choose who, when, where and on what we will engage.  Witness the rise of video consumption on mobile as part of this trend.

Gamification may be the answer but the problem is that businesses can rush into it without necessarily lifting the bonnet to see what is making it work. There are a number of services putting their hands up to execute it for you but executing without a clear view of what motivates your audience can and will prove fatal.

The concept of gamifying products and services came into being when marketers realised that loyalty programmes are becoming too banal to retain consumers. A number of leading brand names, including Hungama, Zapak, Adobe and Microsoft, have used the concept successfully to create a habit of their product amongst users.

Microsoft created a unique gamified tool that allowed users to learn the new MS Office applications and earn rewards, thus making the whole process interactive.

Epic EVE battle, Critical games criticism, indie developer self-publishing

Update, 9:18PM ET: The battle is over. After more than five hours of combat, the CFC has defeated TEST Alliance. Over 2,900 ships were destroyed today in the largest fleet battle in Eve Online's history. TEST Alliance intended to make a definitive statement in 6VDT, but their defeat at the hands of the CFC was decisive and will likely result in TEST's withdrawal from the Fountain region.

In a conversation with Whitten, he told us that the commitment to independent developers is full. There won't be restrictions on the type of titles that can be created, nor will there be limits in scope. In response to a question on whether retail-scale games could be published independently, Whitten told us, "Our goal is to give them access to the power of Xbox One, the power of Xbox Live, the cloud, Kinect, Smartglass. That's what we think will actually generate a bunch of creativity on the system." With regard to revenue splitting with developers, we were told that more information will be coming at Gamescom, but that we could think about it "generally like we think about Marketplace today." According to developers we've spoken with, that split can be approximately 50-50.

Another difference between the Xbox One and Xbox 360 is how the games will be published and bought by other gamers. Indie games will not be relegated to the Xbox Live Indie Marketplace like on the Xbox 360 or required to have a Microsoft-certified publisher to distribute physically or digitally outside the Indie Marketplace. All games will be featured in one big area with access to all kinds of games.

If anything has hurt modern video game design over the past several years, it has been the rise of 'freemium'. It seems that it is rare to see a top app or game in the app stores that has a business model that is something other than the 'free-to-play with in-app purchases' model. It has been used as an excuse to make lazy, poorly designed games that are predicated on taking advantage of psychological triggers in its players, and will have negative long term consequences for the video game industry if kept unchecked.

Many freemium games are designed around the idea of conditioning players to become addicted to playing the game. Many game designers want their games to be heavily played, but in this case the freemium games are designed to trigger a 'reward' state in the player's brain in order to keep the player playing (and ultimately entice the user to make in-app purchases to continue playing). This type of conditioning is often referred to as a 'Skinner box', named after the psychologist that created laboratory boxes used to perform behavioral experiments on animals.
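Mechanically, the 'Skinner box' pattern the excerpt describes is usually a variable-ratio reward schedule: payouts arrive unpredictably, the reinforcement pattern most strongly associated with Skinner's experiments. A bare-bones sketch, with an invented drop rate:

    import random

    def pull_lever(drop_rate=0.15):
        # Variable-ratio schedule: any pull *might* pay out, and the
        # unpredictability is what makes the loop compulsive.
        return random.random() < drop_rate

    pulls = rewards = 0
    while rewards < 3:
        pulls += 1
        if pull_lever():
            rewards += 1
            print(f"pull {pulls}: rare item ({rewards}/3)")
    # A freemium design monetizes the dry stretches: when free pulls run
    # out, it sells more ("energy", gems, timers).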

It obviously isn’t beyond the realm of possibility that, not only do financial considerations influence a game’s structure and content, financial outcomes affect a studio’s likelihood of survival in the industry, based upon the machinations of its publishing overlords. Activision killed Bizarre Creations, Eidos ruined Looking Glass Studios, EA crushed Westwood, Pandemic, Bullfrog, Origin Systems… well, the list could go on, until I turn a strange, purple color, but you get my point. And, when 3.4 million copies sold for a Tomb Raider reboot isn’t enough by a publisher’s standards, you can’t help but feel concern for a developer’s future.

This relationship between environment-learner-content interaction and transfer puts teachers in the unique position to capitalize on game engagement to promote reflection that positively shapes how students tackle real-world challenges. To some, this may seem like a shocking concept, but it’s definitely not a new one—roleplay as instruction, for example, was very popular among the ancient Greeks and, in many ways, served as the backbone for Plato’s renowned Allegory of the Cave. The same is true of Shakespeare’s works, 18th and 19th century opera, and many of the novels, movies, and other media that define our culture. More recently, NASA has applied game-like simulations to teach astronauts how to maneuver through space, medical schools have used them to teach robotic surgery, and the Federal Aviation Administration has employed them to test pilots.

The relationships between the creator, the product, and the audience are all important contexts to consider during media analysis, especially with games. This is because the audience is an active participant in the media. So if you are creating a game, you always have to keep the audience in mind. Even if you say the audience doesn’t matter to you, it won’t cease to exist, and it does not erase the impact your game will have.

Similarly, if you are critiquing or analyzing any media, you can’t ignore the creator and the creator’s intentions. Despite those who claim the “death of the author,” if the audience is aware of the creator’s intentions, it can affect how they perceive the game. Particularly, if you consider the ease with which creators can release statements talking about their work, you’ll have an audience with varying levels of awareness about the creator’s intentions. These factors all play off of each other–they do not exist in a vacuum.

When we talk about any medium’s legitimacy, be it film or videogames or painting, it’s a very historical phenomenon, inextricably tied to its artness, that allows a medium to get in on the ground floor of “legitimate” and “important.” So if we contextualize the qualities that allowed film or photography to find themselves supported through a panoply of cultural institutions, it was a cultural and political economic process that led them there.


[...]

Videogames, the kind that would be written about in 20 dollar glossy art magazines, would be exactly this. When creators of videogames want to point to their medium’s legitimacy, it would help to have a lot of smart people legitimate your work in a medium (glossy magazines, international newspapers) that you consider to be likewise legitimate. Spector concedes that ‘yes all the critics right now are online’, but the real battle is in getting these critics offline and into more “legitimate” spaces of representation. It’s a kind of unspoken hierarchy of mediums that is dancing before us here: at each step a new gatekeeper steps into play, both legitimating and separating the reader from the critic and the object of criticism.

All three games define fatherhood around the act of protection, primarily physical protection. And in each of these games, the protagonist fails—at least temporarily—to protect their ward. In Ethan’s case, his cheery family reflected in his pristine home collapses when he loses a son in a car accident. Later, when his other son goes missing, the game essentially tests Ethan’s ability to reclaim his protective-father status.

No video game grants absolute freedom; they all have rules or guidelines that govern what you can and can’t do. The sci-fi epic Mass Effect is a series that prides itself on choice, but even that trilogy ends on a variation of choosing between the “good” and “bad” ending. Minecraft, the open-world creation game, is extremely open-ended, but you can’t build a gun or construct a tower into space because it doesn’t let you. BioShock’s ending argues that the choices you think you’re making in these games don’t actually represent freedom. You’re just operating within the parameters set by the people in control, be they the developers or the guy in the game telling you to bash his skull with a golf club.

BioShock’s disappointing conclusion ends up illustrating Ryan’s point. A man chooses, a player obeys. It’s a grim and cynical message that emphasizes the constraints of its own art form. And given that the idea of choice is so important to BioShock’s story, I don’t think it could’ve ended any other way.

 

Hacker's death, wearable tech, and some Cyberpunk

His genius was finding bugs in the tiny computers embedded in equipment, such as medical devices and cash machines. He often received standing ovations at conferences for his creativity and showmanship while his research forced equipment makers to fix bugs in their software.

Jack had planned to demonstrate his techniques to hack into pacemakers and implanted defibrillators at the Black Hat hackers convention in Las Vegas next Thursday. He told Reuters last week that he could kill a man from 30 feet away by attacking an implanted heart device.

Without the right approach, the continual distraction of multiple tasks exerts a toll that disrupts performance. It takes time to switch tasks, to get back what attention theorists call “situation awareness.” Interruptions disrupt performance, and even a voluntary switching of attention from one task to another is an interruption of the task being left behind.

Furthermore, it will be difficult to resist the temptation of using powerful technology that guides us with useful side information, suggestions, and even commands. Sure, other people will be able to see that we are being assisted, but they won’t know by whom, just as we will be able to tell that they are being minded, and we won’t know by whom.

9am to 1pm: Throughout the day you connect to your Dekko-powered augmented reality device, which overlays your vision with a broad range of information and entertainment. While many of the products the US software company is proposing are currently still fairly conceptual, Dekko hopes to find ways to integrate an extra layer of visual information into every part of daily life. Dekko is one of the companies supplying software to Google Glass, the wearable computer that gives users information through a spectacle-like visual display. Matt Miesnieks, CEO of Dekko, says that he believes "the power of wearables comes from connecting our senses to sensors."

Researchers at Belgian nanoelectronics research and development center Imec and Belgium’s Ghent University are in the very early stages of developing such a device, which would bring augmented reality–the insertion of digital imagery, such as virtual signs and historical markers, into the real world–right to your eyeballs. It’s just one of several such projects (see “Contact Lens Computer: It’s Like Google Glass Without The Glasses”), and while the idea is nowhere near the point where you could ask your eye doctor for a pair, it could become more realistic as the cost and size of electronic components continue to fall and wearable gadgets gain popularity.

Speaking on the sidelines of the Wearable Technologies conference in San Francisco on Tuesday, Eric Dy, Imec’s North America business development manager, said researchers are investigating the feasibility of integrating an array of micro lenses with LEDs, using the lenses to help focus light and project it onto the wearer’s retinas.

The biggest barrier, beyond the translation itself, is speech recognition. In so many words, background noise interferes with the translation software, thus affecting results. But Barra said it works "close to 100 percent" when used in "controlled environments." Sounds perfect for diplomats, not so much for real-world conversations. Of course, Google's non-real-time, text-based translation software built into Chrome leaves quite a bit to be desired, making us all the more wary of putting our faith into Google's verbal solution. As the functionality is still "several years away," though, there's still plenty of time to convert us.

There will be limitations, however. It's easy to think that a life-sized human being, standing in your living room, would be capable of giving you a hug, for instance. But if that breakthrough is coming, it hasn't arrived yet. Holodeck creations these are not. And images projected through the magic of HoloVision won't be able to follow you into the kitchen for a snack either — not unless you've got a whole network of HoloVision cameras, anyway.

The implications of Euclid’s technology do not stop at surveillance or privacy. Remember, these systems are meant to feed data to store owners so that they can rearrange store shelves or entire showroom floors to increase sales. Malls, casinos, and grocery stores have always been carefully planned out spaces—scientifically arranged and calibrated for maximum profit at minimal cost. Euclid’s systems however, allow for massive and exceedingly precise quantification and analysis. More than anything, what worries me is the deliberateness of these augmented spaces. Euclid will make spaces designed to do exactly one thing almost perfectly: sell you shit you don’t need. I worry about spaces that are as expertly and diligently designed as Amazon’s home page or the latest Pepsi advertisement. A space built on data so rich and thorough that it’ll make focus groups look quaint in comparison.

Of course the US is not a totalitarian society, and no equivalent of Big Brother runs it, as the widespread reporting of Snowden’s information shows. We know little about what uses the NSA makes of most information available to it—it claims to have exposed a number of terrorist plots—and it has yet to be shown what effects its activities may have on the lives of most American citizens. Congressional committees and a special federal court are charged with overseeing its work, although they are committed to secrecy, and the court can hear appeals only from the government.

Still, the US intelligence agencies also seem to have adopted Orwell’s idea of doublethink—“to be conscious of complete truthfulness,” he wrote, “while telling carefully constructed lies.” For example, James Clapper, the director of national intelligence, was asked at a Senate hearing in March whether “the NSA collect[s] any type of data at all on millions or hundreds of millions of Americans.” Clapper’s answer: “No, sir…. Not wittingly.”

The drone is carrying a laptop so it can communicate with the headset, but right now the sticking point is range; since it's using wi-fi to communicate, it'll only get to around 50-100m.

"It's not a video game movie, it's a cyberpunk movie," Cargill said. "Eidos Montreal has given us a lot of freedom in terms of story; they want this movie to be Blade Runner. We want this movie to be Blade Runner."

INTERVIEWER

There’s a famous story about your being unable to sit through Blade Runner while writing Neuromancer.

GIBSON

I was afraid to watch Blade Runner in the theater because I was afraid the movie would be better than what I myself had been able to imagine. In a way, I was right to be afraid, because even the first few minutes were better. Later, I noticed that it was a total box-office flop, in first theatrical release. That worried me, too. I thought, Uh-oh. He got it right and ­nobody cares! Over a few years, though, I started to see that in some weird way it was the most influential film of my lifetime, up to that point. It affected the way people dressed, it affected the way people decorated nightclubs. Architects started building office buildings that you could tell they had seen in Blade Runner. It had had an astonishingly broad aesthetic impact on the world.

The concept was formally introduced in William Gibson's 1984 cyberpunk novel, NEUROMANCER.  Although this first novel swept the Triple Crown of science fiction--the Hugo, the Nebula, and the Philip K. Dick awards--it is not really science fiction.  It could be called "science faction" in that it occurs not in another galaxy in the far future, but 20 years from now, in a BLADE RUNNER world just a notch beyond our silicon present.

In Gibson's Cyberworld there is no warp drive and no "beam me up, Scotty."  The high technology is the stuff that appears on today's screens or that processes data in today's laboratories: Super-computer boards.  Recombinant DNA chips.  AI systems and enormous data banks controlled by multinational combines based in Japan and Zurich.

Thoughts on Oculus Rift, modding, and assessing the state of games journalism and criticism

Gaming journalism is, by some accounts, a broken field. By other accounts, its unjournalistic practices are a symptom of reporting online, where advertising revenue is minimal, at least compared to revenue from newspapers or magazines. And that isn’t exclusive to gaming journalism — most outlets, both online and in print, face an uncertain future under the weight of a change in the way we absorb news and opinion. (The change is evident in how many sites have recently undergone a redesign to better accommodate tablets: USgamer, Kotaku and Polygon, among others.)

[...]

That’s why the gaming press seems like a corrupt industry, when it should be incorruptible. Corporate apologetics, publisher-granted exclusive reviews, mostly non-hard-hitting, superfluous bits to appease the companies. All of this is how modern journalism operates. (As an experiment, check notable outlets or magazines and look for the term “sponsored content”. More sites do it than you’d think.) But when the revenue stream is one-tenth of historical norms, journalists must find ways to continue writing, and that sometimes involves looking for sponsors. It’s not optimal, it’s not prestigious, it goes against everything I learned in journalism school, but hey, money rules the world.

[blip.tv http://blip.tv/play/AYOT7SsC.x?p=1 width="720" height="433"]

Initially, I’m excited about using it for actors: there’s no reason it can’t work directly with the MVN mocap suits we use, and having actors able to see the virtual environment they’re acting in is a pretty mind-blowing concept. I may need to invest in a supply of sick-bags, though…

I’m also working on a virtual camera for the Rift, some tests of aiming cameras WITH MY FACE, the previously-mentioned preview suite, and more. Look for a post specifically about the Rift and filmmaking later this week or early next.

But for now, if you’ll excuse me, there’s a demon-filled corridor in Doom 3 that I’ve got to go be scared witless by…

Issues like over-crowding start to fade away. Of course, physical education can't be replaced (yet?), but actual problems that plague education for students, both young and old, could be eradicated completely. Suddenly, post-secondary education becomes affordable once again. Taught by real teachers to real students, with those social interactions at the core.

Political events could be attended by anyone. Viewing political discussions on the Hill is possible today through various news outlets or public broadcasting. With the integration of Oculus, you could virtually be there, watching anything and everything unfold as if you were actually present. Something like this might increase public knowledge of the workings of government, and help youth become passionate about issues that really require their attention.

With both software and hardware modders growing in number at a staggering rate, one that will presumably continue to climb, it’s safe to say that modding is the future of gaming. A single person or group of people going out of their way to improve the gaming experience for themselves and others, at no profit, was almost unimaginable during the early stages of the industry. Today, it is the norm, albeit still a relatively underground one. Yet just as the number of people who play games has risen dramatically over the years, I believe the same is destined to happen for modders. For gaming companies to solidify their foothold in the industry, cooperation with their target audience will soon be paramount.

The Ideology of Scarface, Community as PoMo masterpiece, Present Shock reviewed, etc.

[youtube=http://www.youtube.com/watch?v=YanhEVEgkYI&w=560&h=315]

[youtube=http://www.youtube.com/watch?v=7MCmBHPqz6I&w=560&h=315]

In The Godfather, the blurring of the line between crime and the “legitimate” economy can still seem shocking. In Scarface, the distinction seems quaintly naïve. In The Godfather, Don Vito almost loses everything over his refusal to deal in heroin. In Scarface, Tony Montana knows that coke is just another commodity in a boom economy. Michael Corleone marries the wispy, drooping Kay Adams to give his enterprise some old-fashioned, WASP class. When Tony Montana takes possession of the coked-up bombshell called Elvira Hancock, he is filling his waterbed with cash, not class. Even more excruciatingly, Scarface tells us these truths without any self-righteousness, without the consoling promise that manly discipline can save America from its fate. In the moral economy of this movie, the terms of critique have become indistinguishable from the terms of affirmation. “You know what capitalism is?” Tony answers his own question: “Getting fucked.”

Donovan put Neumann in charge of the Research and Analysis Branch of the OSS, studying Nazi-ruled central Europe. Neumann was soon joined by the philosopher Herbert Marcuse and the legal scholar Otto Kirchheimer, his colleagues at the left-wing Institute for Social Research, which had been founded in Frankfurt in 1923 but had moved to Columbia University after the Nazis came to power.

An update of the promise, that the media could create a different, even a better world, seems laughable from our perspective of experience with the technologically based democracies of markets. As a utopia-ersatz, this promise appears to be obsolete in the former hegemonial regions of North America and western and northern Europe. Now that it is possible to create a state with media, they are no longer any good for a revolution. The media are an indispensable component of functioning social hierarchies, both from the top down and the bottom up, of power and countervailing power. They have taken on systemic character. Without them, nothing works anymore in what the still surviving color supplements in a careless generalization continue to call a society. Media are an integral part of the everyday coercive context, which is termed “practical constraints.” As cultural techniques, which need to be learned for social fitness, they are at the greatest possible remove from what whips us into a state of excitement, induces aesthetic exultation, or triggers irritated thoughts.

[...]

At the same time, many universities have established courses in media design, media studies, and media management. Something that operates as a complex, dynamic, and edgy interplay between the discourses, that is, something which can only operate interdiscursively, has acquired a firm and fixed place in the academic landscape. This is reassuring and creates professorial chairs, upon which a once anarchic element can be sat out and developed into knowledge for domination and control. Colleges and academies founded specifically for the media proactively seek close relationships with the industries, manufacturers, and the professional trades associations of design, orientation, and communication.

There are five ways Rushkoff thinks present shock is being experienced and responded to. To begin, we are in an era in which he thinks narrative has collapsed. For as long as we have had the power of speech we have corralled time into linear stories with a beginning, middle and ending. More often than not these stories contained some lesson. They were not merely forms of entertainment or launching points for reflection but contained some guidance as to how we should act in a given circumstance. That guidance, of course, differed by culture, but almost all stories were in effect small, oversimplified models of real life.

[...]

The medium Rushkoff thinks is best adapted to the decline of narrative is video games. Yes, they are more often than not violent, but they also seem tailor-made for the kinds of autonomy and collaborative play that are the positive manifestations of our new presentism.

 

Next gen gaming on Oculus Rift, McLuhan on surveillance state, Rushkoff on viral media

The spy is the ideal tourist because he represents an inner self perfectly contained within an outer self that is adapted to any possible location or circumstance. Travel can broaden him by the width of a new sexual conquest, but for the most part, he's seen everything already. Going to the Louvre won't make him vulnerable, and he won't stammer when he buys his ticket. The pathos of the whole Bourne series lies in the way it gives us a character who's been left with the spy's invulnerable outer shell but lost the inner self it was originally meant to protect.

Newman: It has become a frightening world. We seem to be constantly under surveillance. How can we deal with this menace?

McLuhan: The new human occupation of the electronic age has become surveillance. CIA-style espionage is now the total human activity. Whether you call it audience rating, consumer surveys and so on—all men are now engaged as hunters of espionage. So women are completely free to take over the dominant role in our society. Women’s liberation represents demands for absolute mobility, not just physical and political freedom to change roles, jobs and attitudes—but total mobility.

Today, our social media amplify and accelerate word of mouth to a new level. These aren’t hushed water-cooler conversations about whatever salacious gossip we’ve seen on the news; they are publicly broadcasted pronouncements about who is a hero, who is a traitor, who is a liar, or who is a fraud. In a media culture that values retweets and “likes” even more than money, stories spread and replicate less because they titillate than because they are suitable subjects for loud, definitive, 140-character declarations.

 

Multiple angles on gaming's Ebert, Kubrick, and Citizen Kane

One obvious difference between art and games is that you can win a game. It has rules, points, objectives, and an outcome. Santiago might cite an immersive game without points or rules, but I would say then it ceases to be a game and becomes a representation of a story, a novel, a play, dance, a film. Those are things you cannot win; you can only experience them.

  • Ebert later clarified that he believed "anything can be art," but video games cannot be "high art". Among those who disagreed with Ebert's assessment was film director Clive Barker. Ebert responded to some of Barker's points in an article. Part of Barker's comments dealt with the importance of critics to video games:

Barker: "It used to worry me that the New York Times never reviewed my books. But the point is that people like the books. Books aren't about reviewers. Games aren't about reviewers. They are about players."

Ebert: A reviewer is a reader, a viewer or a player with an opinion about what he or she has viewed, read or played. Whether that opinion is valid is up to his audience. Books, games and all forms of created experience are about themselves; the real question is, do we as their consumers become more or less complex, thoughtful, insightful, witty, empathetic, intelligent, philosophical (and so on) by experiencing them?

  • The idiosyncrasies of video game reviews themselves have become so well known that game reviews are practically considered a genre (see this satirical take from Something Awful: If films were reviewed like video games). Earlier this month video game designer Warren Spector wrote a blog post titled Where's gaming's Roger Ebert? In the post Spector argues that gaming journalism and criticism are currently geared toward specialized groups like developers, publishers, academics, and hard-core gamers, but not "normal people":

What we need, as I said in an earlier column, is our own Andrew Sarris, Leonard Maltin, Pauline Kael, Judith Crist, Manny Farber, David Thomson, or Roger Ebert. We need people in mainstream media who are willing to fight with each other (not literally, of course) about how games work, how they reflect and affect culture, how we judge them as art as well as entertainment. We need people who want to explain games, individually and generically, as much as they want to judge them. We need what might be called mainstream critical theorists.

And they need a home. Not only on the Internet (though we need them there, too), not just for sale at GDC, but on newsstands and bookstore shelves - our own Film Comment, Sight and Sound, Cahiers du Cinema. Magazines you could buy on the newsstand. Why? Because currently, criticism of this - what little we have of it - reaches only the already converted. To reach the parents, the teachers, the politicians, we need to be where they shop. Even if you never pick up a film magazine, the fact that there are obviously serious magazines devoted to the topic makes a difference in the minds of the uninitiated.

To wonder aloud when or where the Roger Ebert of games criticism will emerge is wrongheaded. First, we must ask where is our Scorsese, our Hitchcock, our Coppola, our Tarantino? Where is gaming’s Stanley Kubrick?

A precious few developers may already be taking those first, intrepid steps along that road. Once these new developers are ascendant, once “adult” is no longer just a byword for “graphic” on this medium, perhaps then we can start to discuss a new critical grammar for games, and begin the search for its greatest practitioner.

The game industry is not waiting for its formative masterpieces to materialize from the hazy future. They're here, right now, walking among us. The future was 2002, and in many ways we have yet to surpass it. Like Citizen Kane, Metroid Prime is a landmark in both technical innovation and pure creativity.

  • Writing in the Financial Post, Chad Sapieha says that video games will never have a Citizen Kane moment. Interestingly, his argument isn't based on the artistic merits of video games, but rather on the particularities of the medium: video games become obsolete with technological advancements. A film made in the 1940s may still be available to view on DVD or other format, but a video game released just twenty years ago likely exists as only a memory.

I'd go so far as to suggest that, over time, many games released today will end up having more in common with stage productions than with books or movies or music. They will be appreciated in the moment, then eventually disappear. People will write about and record their experiences, and those words and videos will continue on to posterity, acting as the primary means by which they are remembered by gamers of the future.

[...]

What I'm saying is simply this: Video game "classics" should be viewed as a breed apart from those of other entertainment mediums. Any attempts at comparison are fundamentally flawed thanks to unavoidable expiration dates imposed by the unstoppable evolution of hardware and advancements in game design.

Our medium is a fantastic vessel that can go places and do things others cannot. Games don’t need to beckon reflection or emotion in order to be good, and I don’t require validation from other people for the hobby to seem like a worthwhile use of my time. Indeed, Citizen Kane is incredible. It’s beautiful, thought-provoking, and inspiring … and film can keep it. Video games don’t need any of it; they never have and never will.

The problem with gaming’s incessant desire to be just like big brother Hollywood is multifarious and exceedingly annoying – like a thousand-headed hydra puffing away on an equal number of vuvuzelas. Have games or games criticism earned a place in the rarefied pantheon of unanimously beloved “mainstream” art? No, not really. Would it be cool if we had a Citizen Kane or, as Warren Spector suggests, an Ebert? I guess so.

But everyone waiting for those shining beacons of cultural acceptance to descend from on high utterly fails to understand two key points: 1) in this day and age, creating direct analogs to those landmarks is actually impossible, and 2) games and games criticism are in the midst of a renaissance. An unstoppable explosion of evolution and creativity. The formation of an identity that is, frankly, far more exciting than film. Why aren’t we championing that to everyone with (or without) ears? Why are we instead breathlessly awaiting the day our medium suddenly and inexplicably conforms to somebody else’s standard?

In medias res: Semiology of Batman, economics of attention, hypodermic needles, magic bullets and more

So I've decided to title these roundups of interesting (to me) media-related content from around the web "In medias res". Not very original, I know, but "in the middle of things" seems appropriate.

Following the semiotics goals I defined earlier, we will explore the complex network of sign language of AAA games, comic books, the Batman universe and related pop-culture, we will explore the narrative themes behind it all and we will examine how Rocksteady implemented said sign language using semiotic principles.

Schiller elaborates on the ways in which "Corporate speech has become the dominant discourse...While the corporate voice booms across the land, individual expression, at best, trickles through tiny constricted public circuits. This has allowed the effective right to free speech to be transferred from individuals to billion dollar companies which, in effect, monopolize public communication" (pg. 45). Privatization, deregulation and the expansion of market relationships are cited by Schiller as the environment in which the national information infrastructure has been eroded (pg. 46).

  • Tomi Ahonen, apparently the person who declared mobile technology the 7th mass medium (who knew?), has declared augmented reality the 8th mass medium. The list of media, in order of appearance:

1st mass media PRINT - from 1400s (books, pamphlets, newspapers, magazines, billboards)
2nd mass media RECORDINGS - from 1890s (records, tapes, cartridges, videocassettes, CDs, DVDs)
3rd mass media CINEMA - from 1900s
4th mass media RADIO - from 1920s
5th mass media TELEVISION - from 1940s
6th mass media INTERNET - from 1992
7th mass media MOBILE - from 1998
8th mass media AUGMENTED REALITY - from 2010

The return to the “magic bullet” theory has led many Arab and Western media scholars to focus on the study of the role of social media in developing popular movements. Little or no attention is paid to folk and traditional communication outlets such as Friday sermons, coffeehouse storytellers (“hakawati”), and mourning gatherings of women (“subhieh”). These face-to-face folk communication vehicles play an important role in developing the Arab public sphere as well as in introducing change.

And this piece about a new sex-advice show on MTV mentions the "hypodermic needle" theory:

When you talk about "young viewers" as helpless victims who are targeted by a message and instantly fall prey to it, you are positing a pre-World-War-II era mass communications theory known as the hypodermic model.

This model saw mass media as a giant hypodermic needle that "injected" messages into our brains. And no brains were more susceptible to the injections than those of children.
