Curry Chandler

Curry Chandler is a writer, researcher, and independent scholar working in the field of communication and media studies. His writing on media theory and policy has been published in the popular press as well as academic journals. Curry approaches the study of communication from a distinctly critical perspective, with a commitment to addressing inequalities in power relations. The scope of his research includes media ecology, political economy, and the critique of ideology.

Curry is a graduate student in the Communication Department at the University of Pittsburgh, having previously earned degrees from Pepperdine University and the University of Central Florida.

Filtering by Tag: capitalism

Mind-controlled exoskeleton opens World Cup; AI will crash the stock market; Cortana's personality

The exoskeleton -- a system comprising a helmet implanted with a microchip that sticks out from the underside; a T-shirt loaded with sensors; metal leg braces; and a battery worn in a backpack -- is set in motion when the user envisions himself making the kick. The chip translates those electronic commands to a digital language that powers the skeleton, which then moves accordingly. The T-shirt vibrates to enhance the user's sensation of movement (and eliminate the need to look at his feet to see if he's stepping forward).

Talk about dropping the ball. Earlier today, Juliano Pinto — a 29-year-old paraplegic — successfully kicked off the 2014 FIFA World Cup by using a mind-controlled exoskeleton. But sadly, most TV networks failed to show it.

After months of hype, the official broadcast of the opening ceremonies showed only a fraction of it, while some TV networks missed the event altogether. Commentators criticized the organizers for casting aside the moment in favor of performing acts. 

The invasion of high-frequency trading machines is now forcing capitalism far away from anything either Adam Smith or the founders of the NYSE could possibly find virtuous. 

We’re not about to let robots compete in the Olympics, driverless cars race in the Indianapolis 500, or automated machines play sports like football, basketball, or baseball. So why is it we allow them to play a role in the most valuable contest of all, the worldwide stock exchange?

With crude forms of AI now entering the quant manipulator’s toolbox, we are now teetering dangerously close to a total collapse of the stock market, one that will leave many corporations and individuals financially destitute.

  • Microsoft has announced its version of Apple's Siri virtual assistant: named Cortana, after the AI character from the Halo video game series, she is coming to Windows smartphones, and as Brad Molen at Engadget reports, developers programmed her with a distinct personality:

Confident, caring, competent, loyal; helpful, but not bossy: These are just some of the words Susan Hendrich, the project manager in charge of overseeing Cortana's personality, used to describe the program's most significant character traits. "She's eager to learn and can be downright funny, peppering her answers with banter or a comeback," Hendrich said. "She seeks familiarity, but her job is to be a personal assistant." With that kind of list, it sure sounds like Hendrich's describing a human. Which is precisely what she and her team set out to do during Cortana's development: create an AI with human-like qualities.

Microsoft's decision to infuse Cortana with a personality stemmed from one end goal: user attachment. "We did some research and found that people are more likely to interact with [AI] when it feels more human," said Hendrich. To illustrate that desired human-machine dynamic, Hendrich pointed to her grandmother's experience with a Roomba vacuum: "She gave a name and a personality to an inanimate object, and it brought her joy." That sense of familiarity is exactly what Microsoft wants Windows Phone users to feel when interacting with Cortana on their own devices.

Graeber on labor and leisure; the perils of hipster economics; and the educational value of MOOCs

Right after my original bullshit jobs piece came out, I used to think that if I wanted, I could start a whole career in job counseling – because so many people were writing to me saying “I realize my job is pointless, but how can I support a family doing something that’s actually worthwhile?” A lot of people who worked the information desk at Zuccotti Park, and other occupations, told me the same thing: young Wall Street types would come up to them and say “I mean, I know you’re right, we’re not doing the world any good doing what we’re doing. But I don’t know how to live on less than a six figure income. I’d have to learn everything over. Could you teach me?”

But I don’t think we can solve the problem by mass individual defection. Or some kind of spiritual awakening. That’s what a lot of people tried in the ‘60s and the result was a savage counter-offensive which made the situation even worse. I think we need to attack the core of the problem, which is that we have an economic system that, by its very nature, will always reward people who make other people’s lives worse and punish those who make them better. I’m thinking of a labor movement, but one very different than the kind we’ve already seen. A labor movement that manages to finally ditch all traces of the ideology that says that work is a value in itself, but rather redefines labor as caring for other people.

Proponents of gentrification will vouch for its benevolence by noting it "cleaned up the neighbourhood". This is often code for a literal white-washing. The problems that existed in the neighbourhood - poverty, lack of opportunity, struggling populations denied city services - did not go away. They were simply priced out to a new location.

That new location is often an impoverished suburb, which lacks the glamour to make it the object of future renewal efforts. There is no history to attract preservationists because there is nothing in poor suburbs viewed as worth preserving, including the futures of the people forced to live in them. This is blight without beauty, ruin without romance: payday loan stores, dollar stores, unassuming homes and unpaid bills. In the suburbs, poverty looks banal and is overlooked.

In cities, gentrifiers have the political clout - and accompanying racial privilege - to reallocate resources and repair infrastructure. The neighbourhood is "cleaned up" through the removal of its residents. Gentrifiers can then bask in "urban life" - the storied history, the selective nostalgia, the carefully sprinkled grit - while avoiding responsibility to those they displaced.

Hipsters want rubble with guarantee of renewal. They want to move into a memory they have already made.

In the pedagogic trenches, MOOCs are considered a symptom of wider economic patterns which effectively vacuum resources up into the financial stratosphere, leaving those doing the actual work with many more responsibilities, and far less compensation. Basic questions about the sustainability of this model remain unanswered, but it is clear that there is little room for enfranchised, full-time, fully-compensated faculty. Instead, we find an army of adjuncts servicing thousands of students; a situation which brings to mind scenes from Metropolis rather than Dead Poets Society.

[...]

For companies pushing MOOCs, education is no different from entertainment: it is simply a question of delivering ‘content.’ But learning to think exclusively via modem is like learning to dance by watching YouTube videos. You may get a sense of it, but no-one is there to point out mistakes, deepen your understanding, contextualise the gestures, shake up your default perspective, and facilitate the process. The role of the professor or instructor is not simply the shepherd for the transmission of information from point A to point B, but the co-forging of new types of knowledge, and critically testing these for various versions of soundness and feasibility. Wisdom may be eternal, but knowledge – both practical and theoretical – evolves over time, and especially exponentially in the last century, with all its accelerated technologies. Knowledge is always mediated, so we must consciously take the tools of mediation into account. Hence the need for a sensitive and responsive guide: someone students can bounce new notions off, rather than simply absorb information from. Without this element, distance learning all too often becomes distanced learning. Just as a class taken remotely usually leads to a sea of remote students.

[...]

Marshall McLuhan was half-right when he insisted that the electronic age is ushering in a post-literate society. But no matter how we like to talk of new audio-visual forms of literacy, there is still the ‘typographic man’ pulling the strings, encouraging us to express ourselves alphabetically. Indeed, the electronic and the literate are not mutually exclusive, much as people like to pit them against each other.

  • Pettman also quotes Ian Bogost's comments on distance learning:

The more we buy into the efficiency argument, the more we cede ground to the technolibertarians who believe that a fusion of business and technology will solve all ills. But then again, I think that's what the proponents of MOOCs want anyway. The issue isn't online education per se, it's the logics and rationales that come along with certain implementations of it.

TV still sucks, we should still complain about hipsters, your job shouldn't exist

None of this could be happening at a worse time. According to the latest S.O.S. from climate science, we have maybe 15 years to enact a radical civilizational shift before game over. This may be generous, it may be alarmist; no one knows. What is certain is that pulling off a civilizational Houdini trick will require not just switching energy tracks, but somehow confronting the “endless growth” paradigm of the Industrial Revolution that continues to be shared by everyone from Charles Koch to Paul Krugman. We face very long odds in just getting our heads fully around our situation, let alone organizing around it. But it will be impossible if we no longer even understand the dangers of chuckling along to Kia commercials while flipping between Maher, “Merlin” and “Girls.”

  • Zaitchik's article name-checks pertinent critics and theorists, including Adorno's "culture industry," Postman's "Amusing Ourselves to Death," and even Jerry Mander's "Four Arguments for the Elimination of Television." Where the article was discussed on sites like Reddit and Metafilter, commenters seemed angry at Zaitchik, overly defensive, as if they felt under attack for watching "Hannibal" and "Game of Thrones." I thoroughly enjoyed Zaitchik's piece, even if it doesn't present a fully developed argument, because the perspective he presents strongly resonates with many of the philosophical foundations that have shaped my own views on media, particularly the media ecology tradition. A large part of Zaitchik's argument is that even if television content is the highest quality it has ever been, the form of television and its effects are the same as ever:

Staring at images on a little screen — that are edited in ways that weaken the brain’s capacity for sustained and critical thought, that encourage passivity and continued viewing, that are controlled by a handful of publicly traded corporations, that have baked into them lots of extremely slick and manipulating advertising — is not the most productive or pleasurable way to spend your time, whether you’re interested in serious social change, or just want to have a calm, clear and rewarding relationship with the real world around you.

But wait, you say, you’re not just being a killjoy and a bore, you’re living in the past. Television in 2014 is not the same as television in 1984, or 1994. That’s true. Chomsky’s “propaganda model,” set out during cable’s late dawn in “Manufacturing Consent,” is due for an update. The rise of on-demand viewing and token progressive programming has complicated the picture. But only by a little. The old arguments were about structure, advertising, structure, ownership, and structure, more than they were about programming content, or what time of the day you watched it. Less has changed than remains the same. By all means, let’s revisit the old arguments. That is, if everyone isn’t busy binge-watching “House of Cards.”

It’s been something to watch, this televisionification of the left. Open a window on social media during prime time, and you’ll find young journalists talking about TV under Twitter avatars of themselves in MSNBC makeup. Fifteen years ago, these people might have attended media reform congresses discussing how corporate TV pacifies and controls people, and how those facts flow from the nature of the medium. Today, they’re more likely to status-update themselves on their favorite corporate cable channel, as if this were something to brag about.

The entertainment demands of the 21st Century seem (apparently) bottomless. We’ve outsourced much of our serotonin production to the corporations which control music, sports, television, games, movies, and books. And they’ve grown increasingly desperate to produce the most universally acceptable, exportable, franchisable, exciting, boring, money-making pablum possible. Of course that is not new either… yet it continues to worsen.

Various alternative cultures have been attempting to fight it for decades. The beats, hippies, punks, and grunge kids all tried… and eventually lost. But the hipsters have avoided it altogether by never producing anything of substance except a lifestyle based upon fetishizing obscurity and cultivating tasteful disdain. A noncommittal and safe appreciation of ironic art and dead artists. No ideals, no demands, no struggle.

Rarely has the modern alternative to pop culture been so self-conscious and crippled. The mainstream has repeatedly beaten down and destroyed a half-century’s worth of attempts to keep art on a worthwhile and genuine path, but now it seems the final scion of those indie movements has adopted the ‘if you can’t beat ‘em, join ‘em’ compromise of creative death.

  • In an interview for PBS, London School of Economics professor David Graeber poses the question: should your job exist?

How could you have dignity in labor if you secretly believe your job shouldn’t exist? But, of course, you’re not going to tell your boss that. So I thought, you know, there must be enormous moral and spiritual damage done to our society. And then I thought, well, maybe that explains some other things, like why is it there’s this deep, popular resentment against people who have real jobs? They can get people so angry at auto-workers, just because they make 30 bucks an hour, which is like nowhere near what corporate lawyers make, but nobody seems to resent them. They get angry at the auto-workers; they get angry at teachers. They don’t get angry at school administrators, who actually make more money. Most of the problems people blame on teachers, and I think on some level, that’s resentment: all these people with meaningless jobs are saying, but, you guys get to teach kids, you get to make cars; that’s real work. We don’t get to do real work; you want benefits, too? That’s not reasonable.

If someone had designed a work regime perfectly suited to maintaining the power of finance capital, it’s hard to see how they could have done a better job. Real, productive workers are relentlessly squeezed and exploited. The remainder are divided between a terrorised stratum of the, universally reviled, unemployed and a larger stratum who are basically paid to do nothing, in positions designed to make them identify with the perspectives and sensibilities of the ruling class (managers, administrators, etc) – and particularly its financial avatars – but, at the same time, foster a simmering resentment against anyone whose work has clear and undeniable social value. Clearly, the system was never consciously designed. It emerged from almost a century of trial and error. But it is the only explanation for why, despite our technological capacities, we are not all working 3-4 hour days.

Žižek on post-U.S. order, Harvey on Piketty, Rushkoff's new job and doc

The "American century" is over, and we have entered a period in which multiple centres of global capitalism have been forming. In the US, Europe, China and maybe Latin America, too, capitalist systems have developed with specific twists: the US stands for neoliberal capitalism, Europe for what remains of the welfare state, China for authoritarian capitalism, Latin America for populist capitalism. After the attempt by the US to impose itself as the sole superpower – the universal policeman – failed, there is now the need to establish the rules of interaction between these local centres as regards their conflicting interests.

In politics, age-old fixations, and particular, substantial ethnic, religious and cultural identities, have returned with a vengeance. Our predicament today is defined by this tension: the global free circulation of commodities is accompanied by growing separations in the social sphere. Since the fall of the Berlin Wall and the rise of the global market, new walls have begun emerging everywhere, separating peoples and their cultures. Perhaps the very survival of humanity depends on resolving this tension.

  • Thomas Piketty's book Capital in the Twenty-First Century has received widespread media attention and enjoyed so much popular success that at times Amazon has been sold out of copies. It seems natural, then, that David Harvey, reigning champion of Marx's Capital in the twenty-first century, would comment on the work, which he has now done on his website:

The book has often been presented as a twenty-first century substitute for Karl Marx’s nineteenth century work of the same title. Piketty actually denies this was his intention, which is just as well since his is not a book about capital at all. It does not tell us why the crash of 2008 occurred and why it is taking so long for so many people to get out from under the dual burdens of prolonged unemployment and millions of houses lost to foreclosure. It does not help us understand why growth is currently so sluggish in the US as opposed to China and why Europe is locked down in a politics of austerity and an economy of stagnation. What Piketty does show statistically (and we should be indebted to him and his colleagues for this) is that capital has tended throughout its history to produce ever-greater levels of inequality. This is, for many of us, hardly news. It was, moreover, exactly Marx’s theoretical conclusion in Volume One of his version of Capital. Piketty fails to note this, which is not surprising since he has since claimed, in the face of accusations in the right wing press that he is a Marxist in disguise, not to have read Marx’s Capital.

[...]

There is, however, a central difficulty with Piketty’s argument. It rests on a mistaken definition of capital. Capital is a process, not a thing. It is a process of circulation in which money is used to make more money often, but not exclusively, through the exploitation of labor power.

  • At the 2012 Media Ecology conference in Manhattan I heard Douglas Rushkoff explain that he had stopped teaching classes at NYU because the department was not letting him teach a sufficient number of hours, all while using his likeness on program brochures. Well, Rushkoff has just been appointed to his first full-time academic post. Media Bistro reported CUNY's announcement:

Beginning this fall at CUNY’s Queens College, students can work their way towards an MA in Media Studies. Set to mold the curriculum is an expert responsible for terms such as “viral media” and “social currency.”

  • Lastly, this news made me realize that I completely missed Rushkoff's new Frontline special that premiered in February: Generation Like, which is available on the Frontline website.

Video mélange: David Harvey, Antonio Negri, and Saints Row IV


Technology, hyperemployment, and femininity

If you’re like many people, you’ve started using your smartphone as an alarm clock. Now it’s the first thing you see and hear in the morning. And touch, before your spouse or your crusty eyes. Then the ritual begins. Overnight, twenty or forty new emails: spam, solicitations, invitations or requests from those whose days pass during your nights, mailing list reminders, bill pay notices. A quick triage, only to be undone while you shower and breakfast.

[...]

Often, we cast these new obligations either as compulsions (the addictive, possibly dangerous draw of online life) or as necessities (the importance of digital contact and an “online brand” in the information economy). But what if we’re mistaken, and both tendencies are really just symptoms of hyperemployment?

[...]

Hyperemployment offers a subtly different way to characterize all the tiny effort we contribute to Facebook and Instagram and the like. It’s not just that we’ve been duped into contributing free value to technology companies (although that’s also true), but that we’ve tacitly agreed to work unpaid jobs for all these companies. And even calling them “unpaid” is slightly unfair, since we do get something back from these services, even if they often take more than they give. Rather than just being exploited or duped, we’ve been hyperemployed. We do tiny bits of work for Google, for Tumblr, for Twitter, all day and every day.

Bogost writes, “hyperemployment offers a subtly different way to characterize all the tiny effort we contribute to Facebook and Instagram and the like. It’s not just that we’ve been duped into contributing free value to technology companies (although that’s also true), but that we’ve tacitly agreed to work unpaid jobs for all these companies.” This tacit agreement, however, extends beyond social media and e-mail and is really a form of housework and maintenance for our daily lives. In that regard, I wonder if calling the cozy arrangement between digital technologies, data economies, and invisible labor “employment” runs the danger of side-stepping the deeper (gendered and racialized) antagonisms inherent in the distinction between what is considered labor and what is considered “care.”

For more than thirty years, Marxist feminists have been arguing that women’s unpaid labor–housework, reproduction, etc.–is a prerequisite for capitalist wage labor, surplus value extraction, and profit-making. Capital can extract surplus value from waged labor only because the wage laborer is supported by (extracts surplus value from) unwaged labor, mainly in the form of the wife. Gregory’s argument is that what Bogost is pointing to isn’t a new phenomenon so much as a reconfiguration of an ongoing practice: we are all our own wives and moms, so to speak. Indeed, as Bogost’s example suggests, our smartphones wake us up, not our moms, just as emails accomplish a lot of the relational work (scheduling, reminding, checking in, etc.) conventionally performed by women.

So does technology relieve the burden on women to perform certain traditionally feminine tasks? Sure! If your husband scans the news on his iPad, you no longer need to collect the morning paper. If your kids have SpongeBob SquarePants for company, you are free to leave them bathed in television glare while you check Twitter/wallow in 21st-century guilt. On the other hand, assigning a task to a computer doesn’t necessarily make it go away. Wageless work may now be more evenly distributed among men and women, but someone still has to send the reminder emails and program the vacuum bot. We haven’t escaped the reality of unpaid labor; we’ve simply spread it around.

Inside Korea's gaming culture, virtual worlds and economic modeling, Hollywood's Summer of Doom continued, and more

  • I've long been fascinated by the gaming culture in South Korea, and Tom Massey has written a great feature piece for Eurogamer titled Seoul Caliber: Inside Korea's Gaming Culture. To this westerner, who has never visited Korea, the article reads almost more like cyberpunk fiction than games journalism:

Not quite as ubiquitous, but still extremely common, are PC Bangs: LAN gaming hangouts where 1000 Won nets you an hour of multiplayer catharsis. In Gangnam's Maxzone, overhead fans rotate at Apocalypse Now speed, slicing cigarette smoke as it snakes through the blades. Korea's own NCSoft, whose European base is but a stone's throw from the Eurogamer offices, is currently going strong with its latest MMO, Blade & Soul.

"It's relaxing," says Min-Su, sipping a Milkis purchased from the wall-mounted vending machine. "And dangerous," he adds. "It's easy to lose track of time playing these games, especially when you have so much invested in them. I'm always thinking about achieving the next level or taking on a quick quest to try to obtain a weapon, and the next thing I know I've been here for half the day."

[youtube=http://www.youtube.com/watch?v=Kue_gd8DneU&w=420&h=315]

Creation and simulation in virtual worlds appear to offer the best domain to test the new ideas required to tackle the very real problems of deprivation, inequality, unemployment, and poverty that exist in national economies. On that note, the need to see our socioeconomic institutions for the games that they really are seems even more poignant.

In the words of Vili Lehdonvirta, a leading scholar in virtual goods and currencies, the suffering we see today is “not some consequence of natural or physical law”; it instead “is a result of the way we play these games.”

The global economy seems to be bifurcating into a rich/tech track and a poor/non-tech track, not least because new technology will increasingly destroy/replace old non-tech jobs. (Yes, global. Foxconn is already replacing Chinese employees with one million robots.) So far so fairly non-controversial.

The big thorny question is this: is technology destroying jobs faster than it creates them?

[...]

We live in an era of rapid exponential growth in technological capabilities. (Which may finally be slowing down, true, but that’s an issue for decades hence.) If you’re talking about the economic effects of technology in the 1980s, much less the 1930s or the nineteenth century, as if it has any relevance whatsoever to today’s situation, then you do not understand exponential growth. The present changes so much faster that the past is no guide at all; the difference is qualitative, not just quantitative. It’s like comparing a leisurely walk to relativistic speeds.

We begin with a love story--from a man who unwittingly fell in love with a chatbot on an online dating site. Then, we encounter a robot therapist whose inventor became so unnerved by its success that he pulled the plug. And we talk to the man who coded Cleverbot, a software program that learns from every new line of conversation it receives...and that's chatting with more than 3 million humans each month. Then, five intrepid kids help us test a hypothesis about a toy designed to push our buttons, and play on our human empathy. And we meet a robot built to be so sentient that its creators hope it will one day have a consciousness, and a life, all its own.

[youtube=http://www.youtube.com/watch?v=pHCwaaactyY&w=420&h=315]

"These outages are absolutely going to continue," said Neil MacDonald, a fellow at technology research firm Gartner. "There has been an explosion in data across all types of enterprises. The complexity of the systems created to support big data is beyond the understanding of a single person and they also fail in ways that are beyond the comprehension of a single person."

From high volume securities trading to the explosion in social media and the online consumption of entertainment, the amount of data being carried globally over the private networks, such as stock exchanges, and the public internet is placing unprecedented strain on websites and on the networks that connect them.

What I want is systems that have intrinsic rewards; that are disciplines similar to drawing or playing a musical instrument. I want systems which are their own reward.

What videogames almost always give me instead is labor that I must perform for an extrinsic reward. I want to convince you that not only is this not what I want, this isn't really what anyone wants.

[youtube=http://www.youtube.com/watch?v=GpO76SkpaWQ&w=560&h=315]

This 'celebrification' is enlivening the making of games and giving players role models, drawing more people into development, especially indie and auteured games. This shift is proving more prosperous than any Skillset-accredited course or government pot could ever hope for. We are making men sitting in pants at their laptops for 12 hours a day as glamorous as it could be.

Creating luminaries will lead to all the benefits that more people in games can bring: a bigger and brighter community, plus new and fresh talent making exciting games. However, celebritydom demands storms, turmoil and gossip.

Spielberg's theory is essentially that a studio will eventually go under after it releases five or six bombs in a row. The reason: budgets have become so gigantic. And, indeed, this summer has been full of movies with giant budgets and modest grosses, all of which has elicited hand-wringing about financial losses, the lack of a quality product (another post-apocalyptic thriller? more superheroes?), and a possible connection between the two. There has been some hope that Hollywood's troubles will lead to a rethinking of how movies get made, and which movies get greenlit by studio executives. But a close look at this summer's grosses suggests a more worrisome possibility: that the studios will become more conservative and even less creative.

[youtube=http://www.youtube.com/watch?v=F4mDNMSntlA&w=420&h=315]
