Curry Chandler

Curry Chandler is a writer, researcher, and independent scholar working in the field of communication and media studies. His writing on media theory and policy has been published in the popular press as well as academic journals. Curry approaches the study of communication from a distinctly critical perspective, and with a commitment to addressing inequality in power relations. The scope of his research activity includes media ecology, political economy, and the critique of ideology.

Curry is a graduate student in the Communication Department at the University of Pittsburgh, having previously earned degrees from Pepperdine University and the University of Central Florida.

Filtering by Tag: journalism

Critical perspectives on the Isla Vista spree killer, media coverage


Reuters/Lucy Nicholson

  • Immediately following Elliot Rodger's spree killing last month in Isla Vista, CA, Internet users discovered his YouTube channel and a 140-page autobiographical screed, dubbed a "manifesto" by the media. The written document and the videos documented Rodger's sexual frustration and his chronic inability to connect with other people. He specifically lashed out at women for forcing him "to endure an existence of loneliness, rejection and unfulfilled desires" and causing his violent "retribution." Commentators and the popular press framed the killings as an outcome of misogynistic ideology, with headlines such as "How misogyny kills men," "Further proof that misogyny kills," and "Elliot Rodger proves the danger of everyday sexism." Slate contributor Amanda Hess wrote:

Elliot Rodger targeted women out of entitlement, their male partners out of jealousy, and unrelated male bystanders out of expedience. This is not ammunition for an argument that he was a misandrist at heart—it’s evidence of the horrific extent of misogyny’s cultural reach.

His parents saw the digitally mediated rants and contacted his therapist and a social worker, who contacted a mental health hotline. These were the proper steps. But those who interviewed Rodger found him to be a “perfectly polite, kind and wonderful human.” They deemed his involuntary holding unnecessary and a search of his apartment unwarranted. That is, authorities defined Rodger and assessed his intentions based upon face-to-face interaction, privileging this interaction over and above a “vast digital trail.” This is digital dualism taken to its worst imaginable conclusion.

In fact, the entire 140-odd-page memoir he left behind, “My Twisted World,” documents with agonizing repetition the daily tortured minutiae of his life, and barely has any interactions with women. What it has is interactions with the symbols of women, a non-stop shuffling of imaginary worlds that women represented access to. Women weren’t objects of desire per se, they were currency.

[...]

What exists in painstaking detail are the male figures in his life. The ones he meets who then reveal that they have kissed a girl, or slept with a girl, or slept with a few girls. These are the men who have what Elliot can’t have, and these are the men that he obsesses over.

[...]

Women don’t merely serve as objects for Elliot. Women are the currency used to buy whatever he’s missing. Just as a dollar bill used to get you a dollar’s worth of silver, a woman is an indicator of spending power. He wants to throw this money around for other people. Bring them home to prove something to his roommates. Show the bullies who picked on him that he deserves the same things they do.

[...]

There’s another, slightly more obscure recurring theme in Elliot’s manifesto: The frequency with which he discusses either his desire or attempt to throw a glass of some liquid at happy couples, particularly if the girl is a ‘beautiful tall blonde.’ [...] These are the only interactions Elliot has with women: marking his territory.

[...]

When we don’t know how else to say what we need, like entitled children, we scream, and the loudest scream we have is violence. Violence is not an act of expressing the inexpressible, it’s an act of expressing our frustration with the inexpressible. When we surround ourselves by closed ideology, anger and frustration and rage come to us when words can’t. Some ideologies prey on fear and hatred and shift them into symbols that all other symbols are defined by. It limits your vocabulary.

While the motivations for the shootings may vary, they have in common crises in masculinity in which young men use guns and violence to create ultra-masculine identities as part of a media spectacle that produces fame and celebrity for the shooters.

[...]

Crises in masculinity are grounded in the deterioration of socio-economic possibilities for young men and are inflamed by economic troubles. Gun carnage is also encouraged in part by media that repeatedly illustrates violence as a way of responding to problems. Explosions of male rage and rampage are also embedded in the escalation of war and militarism in the United States from the long nightmare of Vietnam through the military interventions in Afghanistan and Iraq.

For Debord, “spectacle” constituted the overarching concept to describe the media and consumer society, including the packaging, promotion, and display of commodities and the production and effects of all media. Using the term “media spectacle,” I am largely focusing on various forms of technologically-constructed media productions that are produced and disseminated through the so-called mass media, ranging from radio and television to the Internet and the latest wireless gadgets.

  • Kellner's comments from a 2008 interview talking about the Virginia Tech shooter's videos broadcast after the massacre, and his comments on critical media literacy, remain relevant to the current situation:

Cho’s multimedia video dossier, released after the Virginia Tech shootings, showed that he was consciously creating a spectacle of terror to create a hypermasculine identity for himself and avenge himself to solve his personal crises and problems. The NIU shooter, dressed in black, emerged from a curtain onto a stage and started shooting, obviously creating a spectacle of terror, although as of this moment we still do not know much about his motivations. As for the television networks, since they are profit centers in a highly competitive business, they will continue to circulate school shootings and other acts of domestic terrorism as “breaking events” and will constitute the murderers as celebrities. Some media have begun to not publicize the name of teen suicides, to attempt to deter copy-cat effects, and the media should definitely be concerned about creating celebrities out of school shooters and not sensationalize them.

[...]

People have to become critical of the media scripts of hyperviolence and hypermasculinity that are projected as role models for men in the media, or that help to legitimate violence as a means to resolve personal crises or solve problems. We need critical media literacy to analyze how the media construct models of masculinities and femininities, good and evil, and become critical readers of the media who ourselves seek alternative models of identity and behavior.

  • Almost immediately after news of the violence broke, and word of the killer's YouTube videos spread, there was a spike of online backlash against the media saturation and warnings against promoting the perpetrator to celebrity status through omnipresent news coverage. Just two days after the killings, Isla Vista residents and UCSB students let the news crews at the scene know that they were not welcome to intrude upon the community's mourning. As they are wont to do, journalists reported on their role in the story while ignoring the wishes of the residents, as in this LA Times brief:

More than a dozen reporters were camped out on Pardall Road in front of the deli -- and had been for days, their cameras and lights and gear taking up an entire lane of the street. At one point, police officers showed up to ensure that tensions did not boil over.

The students stared straight-faced at reporters. Some held signs expressing their frustration with the news media:

"OUR TRAGEDY IS NOT YOUR COMMODITY."

"Remembrance NOT ratings."

"Stop filming our tears."

"Let us heal."

"NEWS CREWS GO HOME!"

Zimmerman media coverage, Scorcese on reading cinema, remediation in Game of Thrones, and much more

The reports are based on an ABC News interview with Juror B29, the sole nonwhite juror. She has identified herself only by her first name, Maddy. She’s been framed as the woman who was bullied out of voting to convict Zimmerman. But that’s not true. She stands by the verdict. She yielded to the evidence and the law, not to bullying. She thinks Zimmerman was morally culpable but not legally guilty. And she wants us to distinguish between this trial and larger questions of race and justice.

ABC News hasn’t posted a full unedited video or transcript of the interview. The video that has been broadcast—on World News Tonight, Nightline, and Good Morning America—has been cut and spliced in different ways, often so artfully that the transitions appear continuous. So beware what you’re seeing. But the video that’s available already shows, on closer inspection, that Maddy has been manipulated and misrepresented. Here are the key points.

In the recording heard by NBC viewers, Zimmerman appeared to volunteer the information, “This guy looks like he’s up to no good. He looks black.”

Edited out was the 911 dispatcher asking Zimmerman if the person he was suspicious of was “black, white or Hispanic,” to which Zimmerman had responded, “He looks black.”

Though Zimmerman and his attorneys have filed a lawsuit against NBC News for the malicious editing of the 911 tape, what CNN did is far worse.

NBC News was attempting to make Zimmerman look like a racial profiler. CNN, on the other hand, was attempting to make Zimmerman look like an enraged outright racist (there was no racial angle in ABC's fraud). It also took CNN far longer to retract their story than either NBC or ABC.

Moreover, on its own airwaves, CNN would allow the complete fallacy that Zimmerman had said "fucking coon" to live on.

Pulling teeth doesn’t do justice to the painful viewing experience accompanying this sort of news manufacture - making news from no news. Even the daily palaver known as Changing the Guard was spun to look like an integral prelude to the long-awaited arrival. And the waiting went on, and on, and on, and the longer it went on, the more desperate and dull the coverage became. Sometimes people complain about the high salaries enjoyed by news presenters, especially the public service variety, but by golly they earnt their crust trying, albeit failing, to sustain the suspense.

Light is at the beginning of cinema, of course. It’s fundamental—because cinema is created with light, and it’s still best seen projected in dark rooms, where it’s the only source of light. But light is also at the beginning of everything. Most creation myths start with darkness, and then the real beginning comes with light—which means the creation of forms. Which leads to distinguishing one thing from another, and ourselves from the rest of the world. Recognizing patterns, similarities, differences, naming things—interpreting the world. Metaphors—seeing one thing “in light of” something else. Becoming “enlightened.” Light is at the core of who we are and how we understand ourselves.

[...]

Or consider the famous Stargate sequence from Stanley Kubrick’s monumental 2001: A Space Odyssey. Narrative, abstraction, speed, movement, stillness, life, death—they’re all up there. Again we find ourselves back at that mystical urge—to explore, to create movement, to go faster and faster, and maybe find some kind of peace at the heart of it, a state of pure being.

Despite stormy forecasts, Hollywood appears to be too unwieldy or too unwilling to shift direction towards smaller, cheaper pictures. Guests at Comic-Con learned about upcoming studio productions including Pirates of the Caribbean 5, Thor 2, Fantastic Four 3 and a reboot of Godzilla. The director Joss Whedon came to the event to lament that "pop culture is eating itself" and called for "new universes, new messages and new icons". He then revealed the title of his next film to be Avengers: Age of Ultron.

Repeat after me: Edward Snowden is not the story. The story is what he has revealed about the hidden wiring of our networked world. This insight seems to have escaped most of the world's mainstream media, for reasons that escape me but would not have surprised Evelyn Waugh, whose contempt for journalists was one of his few endearing characteristics. The obvious explanations are: incorrigible ignorance; the imperative to personalise stories; or gullibility in swallowing US government spin, which brands Snowden as a spy rather than a whistleblower.

The video site is aiming to showcase some geek culture by pronouncing 4-10 August its first ever ‘Geek Week’ and promoting some of the genre’s top channels which cover everything from sci-fi to comics, gaming and superheroes. To do this, its own channel will be featuring videos from users like Nerdist, the official Doctor Who channel, MinutePhysics and more than a hundred others, with every day of the week hosted by a different user. It’ll even include the first trailer for the new Thor movie, The Dark World.

That said, things kept nagging me. Blackfish does raise some valuable secondary issues - how SeaWorld markets itself, how labor issues are at stake in addition to environmental ones - but as a spectator I kept wanting the film to pursue lines of analysis that it would suggest but never develop.

[...]


In short, if there's an ur-ideology to the American progressive documentary, it's that demand-side drivers of political situations (Gramsci's hegemony, ideology, what have you) don't matter, it's merely the supply side of oligopoly, big money, and corporate control. Or to be less political, as a film scholar I can't help but notice that in a film about the business of spectacle, the spectator is both crucial (SeaWorld viewers provide the vital footage of the incidents) and completely effaced.

And what of the YouTube creator? How has AdSense helped or hindered their careers? In most cases, the advertising structure has been a blessing to creators as it’s allowed them to launch careers solely through YouTube. AdSense gave us a new type of celebrity for a new generation.

Creators have had their fair share of AdSense woes in the past, though. Last year, one of YouTube’s biggest names, Ray William Johnson, entered a very public dispute with Maker Studios. Johnson claimed that Maker Studios was holding his AdSense account “hostage” even after he had terminated his contract with them.

If you watch big budget entertainments, there's no escaping these sorts of moments. The trope familiar to the Scooby-Doo generation, in which a few nagging uncertainties are resolved with a "there's just one thing I don't understand" kickoff, has now become a motif. Characters must constantly address questions on behalf of a too-curious audience awash in complexly-plotted mega-stories. The movies are trying to plug leaks in a boat before the whole thing sinks—never quite repairing it, but doing just enough to get by.

What I’m talking about here is the unavoidable shift that occurs when content is remediated—that is, borrowed from one medium and reimagined in another. In this case, the content of the book series A Song of Ice and Fire (ASOIAF) is remediated to Game of Thrones, the HBO television series. Some of the differences in this instance of remediation seem pragmatic—remembrances are turned into scenes of their own, dialogue is shortened, characters omitted or altered for the sake of brevity and clarity. I am no purist, and I recognize that with remediation comes necessary alteration for the content to suit the new medium. But other differences speak volumes about our cultural biases and expectations surrounding those with socially-othered bodies—like Tyrion, Sam, and, of course, women. What can we say about these differences? And perhaps more importantly, what do they say about us?

Why does it matter what Kubrick liked? For years I’ve enjoyed unearthing as much information as I can about his favourite films and it slowly became a personal hobby. Partly because each time I came across such a film (usually from a newly disclosed anecdote – thanks internet! – or Taschen’s incredible The Stanley Kubrick Archives book) I could use it as a prism to reveal more about his sensibilities. My appreciation of both him and the films he liked grew. These discoveries led me on a fascinating trail, as I peppered them throughout the 11 existing Kubrick features (not counting the two he disowned) I try to watch every couple of years. I’m sure a decent film festival could be themed around the Master List at the end of this article…

  • Lastly, the Media Ecology Association has uploaded some videos from their latest annual convention which was held in June. These include Dominique Scheffel-Dunand on canonic texts in media ecology and Lance Strate's talk "If not A, then E".


Multiple angles on gaming's Ebert, Kubrick, and Citizen Kane

One obvious difference between art and games is that you can win a game. It has rules, points, objectives, and an outcome. Santiago might cite an immersive game without points or rules, but I would say then it ceases to be a game and becomes a representation of a story, a novel, a play, dance, a film. Those are things you cannot win; you can only experience them.

  • Ebert later clarified that he believed "anything can be art," but video games cannot be "high art". Among those who disagreed with Ebert's assessment was film director Clive Barker. Ebert responded to some of Barker's points in an article. Part of Barker's comments dealt with the importance of critics to video games:

Barker: "It used to worry me that the New York Times never reviewed my books. But the point is that people like the books. Books aren't about reviewers. Games aren't about reviewers. They are about players."

Ebert: A reviewer is a reader, a viewer or a player with an opinion about what he or she has viewed, read or played. Whether that opinion is valid is up to his audience. Books, games and all forms of created experience are about themselves; the real question is, do we as their consumers become more or less complex, thoughtful, insightful, witty, empathetic, intelligent, philosophical (and so on) by experiencing them?

  • The idiosyncrasies of video game reviews themselves have become so well known that game reviews are practically considered a genre (see this satirical take from Something Awful: If films were reviewed like video games). Earlier this month video game designer Warren Spector wrote a blog post titled Where's gaming's Roger Ebert? In the post Spector argues that gaming journalism and criticism currently is geared toward specialized groups like developers, publishers, academics, and hard-core gamers, but not "normal people":

What we need, as I said in an earlier column, is our own Andrew Sarris, Leonard Maltin, Pauline Kael, Judith Crist, Manny Farber, David Thomson, or Roger Ebert. We need people in mainstream media who are willing to fight with each other (not literally, of course) about how games work, how they reflect and affect culture, how we judge them as art as well as entertainment. We need people who want to explain games, individually and generically, as much as they want to judge them. We need what might be called mainstream critical theorists.

And they need a home. Not only on the Internet (though we need them there, too), not just for sale at GDC, but on newsstands and bookstore shelves - our own Film Comment, Sight and Sound, Cahiers du Cinema. Magazines you could buy on the newsstand. Why? Because currently, criticism of this - what little we have of it - reaches only the already converted. To reach the parents, the teachers, the politicians, we need to be where they shop. Even if you never pick up a film magazine, the fact that there are obviously serious magazines devoted to the topic makes a difference in the minds of the uninitiated.

To wonder aloud when or where the Roger Ebert of games criticism will emerge is wrongheaded. First, we must ask where is our Scorsese, our Hitchcock, our Coppola, our Tarantino? Where is gaming’s Stanley Kubrick?

A precious few developers may already be taking those first, intrepid steps along that road. Once these new developers are ascendant, once “adult” is no longer just a byword for “graphic” on this medium, perhaps then we can start to discuss a new critical grammar for games, and begin the search for its greatest practitioner.

The game industry is not waiting for its formative masterpieces to materialize from the hazy future. They're here, right now, walking among us. The future was 2002, and in many ways we have yet to surpass it. Like Citizen Kane, Metroid Prime is a landmark in both technical innovation and pure creativity.

  • Writing in the Financial Post, Chad Sapieha says that video games will never have a Citizen Kane moment. Interestingly, his argument isn't based on the artistic merits of video games, but rather on the particularities of the medium: video games become obsolete with technological advancements. A film made in the 1940s may still be available to view on DVD or other format, but a video game released just twenty years ago likely exists as only a memory.

I'd go so far as to suggest that, over time, many games released today will end up sharing more in common with stage productions than books or movies or music. They will be appreciated in the moment, then eventually disappear. People will write about and record their experiences, and those words and videos will continue on to posterity, acting as the primary means by which they are remembered by gamers of the future.

[...]

What I'm saying is simply this: Video game "classics" should be viewed as a breed apart from those of other entertainment mediums. Any attempts at comparison are fundamentally flawed thanks to unavoidable expiration dates imposed by the unstoppable evolution of hardware and advancements in game design.

Our medium is a fantastic vessel that can go places and do things others cannot. Games don’t need to beckon reflection or emotion in order to be good, and I don’t require validation from other people for the hobby to seem like a worthwhile use of my time. Indeed, Citizen Kane is incredible. It’s beautiful, thought-provoking, and inspiring … and film can keep it. Video games don’t need any of it; they never have and never will.

The problem with gaming’s incessant desire to be just like big brother Hollywood is multifarious and exceedingly annoying – like a thousand-headed hydra puffing away on an equal number of vuvuzelas. Have games or games criticism earned a place in the rarefied pantheon of unanimously beloved “mainstream” art? No, not really. Would it be cool if we had a Citizen Kane or, as Warren Spector suggests, an Ebert? I guess so.

But everyone waiting for those shining beacons of cultural acceptance to descend from on-high utterly fails to understand two key points: 1) in this day and age, creating direct analogs to those landmarks is actually impossible, and 2) games and games criticism are in the midst of a renaissance. An unstoppable explosion of evolution and creativity. The formation of an identity that is, frankly, far more exciting than film. Why aren’t we championing that to everyone with (or without) ears? Why are we instead breathlessly awaiting the day our medium suddenly and inexplicably conforms to somebody else’s standard?

Google settles over privacy violations, Social media segregation, the era of big data, and more...

  • Google is reportedly reaching a settlement with the Federal Trade Commission over an incident in which the Internet search giant violated an agreement with the FTC by tracking Safari users' data. From the Associated Press:

Google is poised to pay a $22.5 million fine to resolve allegations that it broke a privacy promise by secretly tracking millions of Web surfers who rely on Apple's Safari browser, according to a person familiar with the settlement.

If approved by the FTC's five commissioners, the $22.5 million penalty would be the largest the agency has ever imposed on a single company.

  • Adrianne Jeffries at BetaBeat covers a BBC report on how users of specific web sites break down along racial demographics. The article misleadingly refers to "segregation" in social media, but the information and analysis by danah boyd is interesting:

Pinterest is 70 percent female and 79 percent white, according to the BBC. By contrast, black and Latino users are overrepresented on Twitter versus the general population.

Ms. Boyd theorized that there was an exodus of users from Myspace to Facebook similar to white flight to the suburbs when the U.S. desegregated schools. Facebook, the vanilla of social media sites, was approaching the makeup of the U.S. population at the time of an analysis done in 2009. That was the year that white users stopped being overrepresented and black and Latino users stopped being underrepresented.

Among companies of more than 1,000 employees in 15 out of the economy's 17 sectors, the average amount of data is a surreal 235 terabytes. That's right -- each of these companies has more info than the Library of Congress. And so, why should we care? Because data is valuable. The growth of digital networks and the networked sensors in everything from phones to cars to heavy machinery mean that data has a reach and sweep it has never had before. The key to Big Data is connecting these sensors to computing intelligence which can make sense of all this information (in pure Wall-E style, some theorists call this the Internet of Things).

  • This short post at Kethu.org presents survey data and rhetorically wonders whether social media behaviors negatively impact life enjoyment:

Consider this: 24% of respondents to one survey said they’ve missed out on enjoying special moments in person because — ironically enough — they were too busy trying to document their experiences for online sharing. Many of us have had to remind ourselves to “live in the now” — instead of worry about composing the perfect tweet or angling for just the right Instagram shot.

I’m coming to believe that classroom time is too limiting in the teaching of tools. At CUNY, we’ve seen over the years that students come in with widening gulfs in both their prior experience and their future ambitions in tools and technologies. My colleagues at CUNY, led by Sandeep Junnarkar, have implemented many new modules and courses to teach such topics as data journalism (gathering, analysis, visualization) and familiarity with programming.

Note well that I have argued since coming to CUNY that we should not and cannot turn out coders. I also do not subscribe to the belief that journalism’s salvation lies in hunting down that elusive unicorn, the coder-journalist, the hack-squared. I do believe that journalists must become conversant in technologies, sufficient to enable them to (a) know what’s possible, (b) specify what they want, and (c) work with the experts who can create that.

in medias res: bridging the "time sap" gap, DIY politics, Google thinks you're stupid, and more

  • When researchers started using the term "digital divide" in the 1990s, they were referring to inequality of access to the Internet and other ICTs. Over time the emphasis shifted from unequal access to disparities of technological competency across socioeconomic sectors. The new manifestation of the digital divide, according to a New York Times article, is reflected in whether time on the Internet is spent productively or wasted:

As access to devices has spread, children in poorer families are spending considerably more time than children from more well-off families using their television and gadgets to watch shows and videos, play games and connect on social networking sites, studies show.

The new divide is such a cause of concern for the Federal Communications Commission that it is considering a proposal to spend $200 million to create a digital literacy corps. This group of hundreds, even thousands, of trainers would fan out to schools and libraries to teach productive uses of computers for parents, students and job seekers.

A study published in 2010 by the Kaiser Family Foundation found that children and teenagers whose parents do not have a college degree spent 90 minutes more per day exposed to media than children from higher socioeconomic families. In 1999, the difference was just 16 minutes.

  • In an op-ed for the LA Times Neal Gabler writes that Obama's legacy may be disillusionment with partisan politics and a shift toward do-it-yourself democracy:

Disillusionment with partisan politics is certainly nothing new. Obama's fall from grace, however, may look like a bigger belly flop because his young supporters saw him standing so much higher than typical politicians. Yet by dashing their hopes, Obama may actually have accomplished something so remarkable that it could turn out to be his legacy: He has redirected young people's energies away from conventional electoral politics and into a different, grass-roots kind of activism. Call it DIY politics.

We got a taste of DIY politics last fall with the Occupy Wall Street sit-ins, which were a reaction to government inaction on financial abuses, and we got another taste when the 99% Spring campaign mobilized tens of thousands against economic inequality. OWS and its tangential offshoots may seem political, but it is important to note that OWS emphatically isn't politics as usual. It isn't even a traditional movement.

  • In a piece on The Daily Beast, Andrew Blum, author of the new net-centric book Tubes: A Journey to the Center of the Internet, details the condescension and furtiveness he encountered while researching Google for his book:

Walking past a large data center building, painted yellow like a penitentiary, I asked what went on inside. Did this building contain the computers that crawl through the Web for the search index? Did it process search queries? Did it store email? “You mean what The Dalles does?” my guide responded. “That’s not something that we probably discuss. But I’m sure that data is available internally.” (I bet.) It was a scripted non-answer, however awkwardly expressed. And it might have been excusable, if the contrast weren’t so stark with the dozens of other pieces of the Internet that I visited. Google was the outlier—not only for being the most secretive but the most disingenuous about that secrecy.

After my tour of Google’s parking lot, I joined a hand-picked group of Googlers for lunch in their cafeteria overlooking the Columbia River. The conversation consisted of a PR handler prompting each of them to say a few words about how much they liked living in The Dalles and working at Google. (It was some consolation that they were treated like children, too.) I considered expressing my frustration at the kabuki going on, but I decided it wasn’t their choice. It was bigger than them. Eventually, emboldened by my peanut-butter cups, I said only that I was disappointed not to have the opportunity to go inside a data center and learn more. My PR handler’s response was immediate: “Senators and governors have been disappointed too!”

When news reports focus on individuals and their stories, rather than simply facts or policy, readers experience greater feelings of compassion, said Penn State Distinguished Professor Mary Beth Oliver, co-director of the Media Effects Research Laboratory and a member of the Department of Film-Video and Media Studies. This compassion also extends to feelings about social groups in general, including groups that are often stigmatized.

"Issues such as health care, poverty and discrimination all should elicit compassion," Oliver said. "But presenting these issues as personalized stories more effectively evokes emotions that lead to greater caring, willingness to help and interest in obtaining more information."

The emphasis on "personalized stories" reminds me of Zillmann's exemplification theory, though the article makes no mention of exemplification.

The problem with living through a revolution is that you've no idea how things will turn out. So it is with the revolutionary transformation of our communications environment driven by the internet and mobile phone technology. Strangely, our problem is not that we are short of data about what's going on; on the contrary we are awash with the stuff. This is what led Manuel Castells, the great scholar of cyberspace, to describe our current mental state as one of "informed bewilderment": we have lots of information, but not much of a clue about what it means.

If, however, you're concerned about things such as freedom, control and innovation, then the prospect of a world in which most people access the internet via smartphones and other cloud devices is a troubling one. Why? Because smartphones (and tablets) are tightly controlled, "tethered" appliances. You may think that you own your shiny new iPhone or iPad, for example. But in fact an invisible chain stretches from it all the way back to Apple's corporate HQ in California. Nothing, but nothing, goes on your iDevice that hasn't been approved by Apple.

Powered by Squarespace. Background image of New Songdo by Curry Chandler.