June 8th, 2012
erickstoll

The Ecstasy of Influence

For Harper’s, Jonathan Lethem argues that all artists plagiarize and advocates for a new understanding of artistic ownership and appropriation.

In a courtroom scene from The Simpsons that has since entered into the television canon, an argument over the ownership of the animated characters Itchy and Scratchy rapidly escalates into an existential debate on the very nature of cartoons. “Animation is built on plagiarism!” declares the show’s hot-tempered cartoon-producer-within-a-cartoon, Roger Meyers Jr. “You take away our right to steal ideas, where are they going to come from?” If nostalgic cartoonists had never borrowed from Fritz the Cat, there would be no Ren & Stimpy Show; without the Rankin/Bass and Charlie Brown Christmas specials, there would be no South Park; and without The Flintstones—more or less The Honeymooners in cartoon loincloths—The Simpsons would cease to exist. If those don’t strike you as essential losses, then consider the remarkable series of “plagiarisms” that links Ovid’s “Pyramus and Thisbe” with Shakespeare’s Romeo and Juliet and Leonard Bernstein’s West Side Story, or Shakespeare’s description of Cleopatra, copied nearly verbatim from Plutarch’s life of Mark Antony and also later nicked by T. S. Eliot for The Waste Land. If these are examples of plagiarism, then we want more plagiarism.

Read the whole article here.

// Follow Read This, Not That on Tumblr / Facebook / Twitter //

May 23rd, 2012
atomvincent
BAD ART

Writing for ARTnews, Richard B. Woodward looks at the popularity of “bad” art and the complicated, contradictory notion of taste.

Cattelan, like Hirst, has hit on a formula that forecloses on the possibility of an audience’s feeling insulted. Only a tiny number of Catholics took umbrage at La Nona Ora, Cattelan’s 1999 sculpture of Pope John Paul II struck by a meteor, and even they weren’t sure why they should be offended. When the piece sold at auction in 2004 for $3 million, Cattelan’s act of smirking impiety was confirmed as a high-priced collectible. As Peter Schjeldahl wrote in the New Yorker, Cattelan’s career “reveals, or even fortifies, the fact that self-parody has become the life-support system of international art infrastructures. Make people feel smart, and they will put up with anything. The mindset cannot be outflanked or overturned, because it routinely performs those operations on itself.”

Bad taste often passes for avant-garde taste these days—so long as the artist signals “transgressive” intent. And whereas kitsch in art was once to be assiduously disdained, art that traffics in sentimentality and bathos behind a dancing veil of ironic laughter has become highly prized. Jeff Koons, John Currin, Lisa Yuskavage, Richard Prince, and Takashi Murakami are just a few of those who have learned that coy subversion can be popular and lucrative. As long as everyone is in on the joke that the art is satirizing its own historical codes of representation, there is nothing to be upset about.

Read the full article here.

// Follow Read This, Not That on Tumblr / Facebook / Twitter //

April 15th, 2012
atomvincent

Walk On, Good People

I love walking. It’s a simple statement that may seem silly because everyone is walking all the time, right? Not exactly, says Tom Vanderbilt in this four-part piece on foot travel in America. For Slate, he looks at the benefits of bipedal locomotion, pedestrian habits, walkability, auto-centric civic planning, and all manner of issues related to the most fundamental form of human transportation. I hope you’ll take Vanderbilt’s insights to heart and consider the pleasures and possibilities that come from hoofing it. While you’re reading, I’ll be taking a walk. Maybe we’ll meet on the footpath.

Walking in America is a bit like sex: Everybody’s doing it, but nobody knows how much. Bricker, of America Walks, adds that the “collection of information around walking is quite poor and inconsistent.” There are the problems of self-reporting—who can really remember, sans pedometer, how much one has walked, and who wants to admit on a survey that they never walk? There’s also little agreement, he says, on what, statistically, constitutes a walking trip. “Is walking down the hall to the bathroom a walking trip? Do you have to leave the house? Is walking to the park with your dog a walking trip? Is walking to and from the bus a walking trip? None of those things are counted.” The most accurate source of information we have comes from the U.S. Census, in the so-called “Journey to Work” questions. But these only inquire about commuting trips. What’s more, as researchers have noted, because the Census emphasizes the mode of transportation taken most often, and for the longest part of the total journey, any number of walking trips may be obscured. People who take train transit, for example, have been shown in pedometer studies to walk much more than those who drive.

This focus on work trips rather misses the point in a country where very few people could walk to work, even if they wanted. Commuting (by any method) accounts for less than 15 percent of all trips. What’s more at stake is so-called “discretionary travel,” the trips to the grocery store, to soccer practice, to the bank, and these are where we logged our greatest increases in driving. “It’s not just about how many people walk to work,” says Bricker. “It’s how many are willing to walk out the front door for any reason.” Where walking has been lost is in these short trips of a mile or less—28 percent of all trips in America—the majority of which are now taken in a car. “Let’s take that stroll,” says Bricker. “It’s missing from the cultural mindset.”

Read the first part here, and note the convenient links to the subsequent three parts at the top of the page.

January 25th, 2012
atomvincent

Youth & Flames

The culture of young drifters, the “train hoppers” and “squatters,” is widely misunderstood, dehumanized, and ignored. The youths who take to the road and rails are often written off as troublemakers and criminals, but, as with any group so marginalized, there’s much more to be found beyond the accusations of the straight world. Writing for the Boston Review, Danielle Morton investigates a handful of lives lost to a squat fire in New Orleans, and in doing so offers us a glimpse into this obscured world.

I didn’t ask many questions, and [my daughter] didn’t volunteer much. She didn’t think I’d ever understand why she had to go or what she was seeking when she left. The most she and her friends offered by way of explanation were stories about hopping trains.

Like the August evening she and her traveling buddy Joey Two Times hopped a Cadillac grainer out of Alabama. The grainer’s front and back angle to a wide V that funnels the cargo out a spigot in the bottom. The Cadillac feature is its fenced porches, fore and aft. This is where they lounged as the train pulled away from the heat of the barren yard. Within an hour or two, the moon had risen, and they were snaking through the lush southern summer singing at the tops of their lungs.

I could see that. I could feel it. And because of that, I didn’t believe that what I valued and what my daughter did were now so far apart that we no longer shared a common language. I had to admit there were moments when I admired her bravery, and her timing.

Then, in the early hours of December 28, 2010, eight people, some of whom she knew, died in that terrible conflagration in New Orleans. Suddenly we had the common language of grief. The dead—aged 17 to 29—were from Wisconsin, Texas, California, Pennsylvania, Iowa, Nebraska, and New Orleans. I knew how they died, but I didn’t know how they lived. How had they ended up in the squat on that night? What forces had pushed them onto the rails, and what had they left behind? Was I an outlier among those eight sets of parents, or were there others like me? Perhaps Marissa was right that I couldn’t understand, but this tragedy made me want to try.

I knew I’d never be allowed into this clandestine and suspicious world without my daughter as my guide. I asked her to take me there. To my surprise, she agreed.

Read the full article here.

January 23rd, 2012
atomvincent

The Perils of Pitchfork

For n+1, Richard Beck dissects Pitchfork, indie music, and the culture that surrounds them.

In Sufjan Stevens, indie adopted precious, pastoral nationalism at the Bush Administration’s exact midpoint. In M.I.A., indie rock celebrated a musician whose greatest accomplishment has been to turn the world’s various catastrophes into remixed pop songs. This is a kind of music, in other words, that’s very good at avoiding uncomfortable conversations. Pitchfork has imitated, inspired, and encouraged indie rock in this respect. It has incorporated a perfect awareness of cultural capital into its basic architecture. A Pitchfork review may ignore history, aesthetics, or the basic technical aspects of tonal music, but it will almost never fail to include a detailed taxonomy of the current hype cycle and media environment. This is a small, petty way of thinking about a large art, and as indie bands have both absorbed and refined the culture’s obsession with who is over- and underhyped, their musical ambitions have been winnowed down to almost nothing at all.

It’s usually a waste of time to close-read rock lyrics; a lot of great rock musicians just aren’t that good with words. What you can do with a rock lyric, though, is note the kinds of phrasing that come to mind when a musician is trying to fill a particular rhythmic space with words. You can see what kind of language comes naturally, and some of the habits and beliefs that the language reveals. This makes it worth pausing, just for a moment, over Animal Collective’s most famous lyric: “I don’t mean to seem like I care about material things.” The ethical lyric to sing would be, “I don’t want to be someone who cares about material things,” but in indie rock today the worst thing would be just to seem like a materialistic person. You can learn a lot about indie rock, its fans, and Pitchfork from the words “mean to seem like.”

Read the full article here.

January 18th, 2012
atomvincent

Forget About It

Jeffrey Rosen writes for The New York Times Magazine on the increasing impact the internet and social networking have on our identities and our “real-world” lives. Can we control who we are in a world that never forgets?

For most of human history, the idea of reinventing yourself or freely shaping your identity — of presenting different selves in different contexts (at home, at work, at play) — was hard to fathom, because people’s identities were fixed by their roles in a rigid social hierarchy. With little geographic or social mobility, you were defined not as an individual but by your village, your class, your job or your guild. But that started to change in the late Middle Ages and the Renaissance, with a growing individualism that came to redefine human identity. As people perceived themselves increasingly as individuals, their status became a function not of inherited categories but of their own efforts and achievements. This new conception of malleable and fluid identity found its fullest and purest expression in the American ideal of the self-made man, a term popularized by Henry Clay in 1832. From the late 18th to the early 20th century, millions of Europeans moved from the Old World to the New World and then continued to move westward across America, a development that led to what the historian Frederick Jackson Turner called “the significance of the frontier,” in which the possibility of constant migration from civilization to the wilderness made Americans distrustful of hierarchy and committed to inventing and reinventing themselves.

In the 20th century, however, the ideal of the self-made man came under siege. The end of the Western frontier led to worries that Americans could no longer seek a fresh start and leave their past behind, a kind of reinvention associated with the phrase “G.T.T.,” or “Gone to Texas.” But the dawning of the Internet age promised to resurrect the ideal of what the psychiatrist Robert Jay Lifton has called the “protean self.” If you couldn’t flee to Texas, you could always seek out a new chat room and create a new screen name. For some technology enthusiasts, the Web was supposed to be the second flowering of the open frontier, and the ability to segment our identities with an endless supply of pseudonyms, avatars and categories of friendship was supposed to let people present different sides of their personalities in different contexts. What seemed within our grasp was a power that only Proteus possessed: namely, perfect control over our shifting identities.

But the hope that we could carefully control how others view us in different contexts has proved to be another myth. As social-networking sites expanded, it was no longer quite so easy to have segmented identities: now that so many people use a single platform to post constant status updates and photos about their private and public activities, the idea of a home self, a work self, a family self and a high-school-friends self has become increasingly untenable. In fact, the attempt to maintain different selves often arouses suspicion. Moreover, far from giving us a new sense of control over the face we present to the world, the Internet is shackling us to everything that we have ever said, or that anyone has said about us, making the possibility of digital self-reinvention seem like an ideal from a distant era.

Read the full article here.

January 9th, 2012
atomvincent

Crueler Than Violence, Certain As Moonlight

Stephen Burt writes for The Nation on four new books of poetry that seek to observe the unique emotions of the contemporary world and the intersection of our interior lives with the systems that surround us.

It’s tempting, sometimes irresistible, to divide poets into movements and schools, to slot any poem that seems mildly memorable into the category New Whatever and argue that it represents our time. You can do that with these four poets if you come at them from a certain angle, an angle that they sometimes recommend; you can do the same with other contemporary poets—Claudia Rankine, Mark Nowak, Craig Santos Perez and especially C.D. Wright—who have won praise for quotation-filled, reportorial, essaylike forms. (The poet and critic Joseph Harrington has done just that in the online magazine Jacket2, announcing the age of the docu-poem, of what he prefers to call “creative nonpoetry,” whose arguments, facts and incorporated quotations—Perez, Nowak and Ossip stand among his examples—break out of any and all generic frames.) You can also find earlier precedents for these kinds of forms, too, from Ezra Pound’s Cantos (1925–69) to Muriel Rukeyser’s now undeniably influential U.S. 1 (1938); you can find poems made largely or wholly of source texts erased or altered in search of sublimity, like Ronald Johnson’s Radi Os (1977), fashioned from Paradise Lost. Yet these new poems, from Wright’s to Ossip’s—unlike those older ones—function as essays, medium-length attempts at understanding some things without explaining everything: they do not pretend to predict the whole course of our history, nor do they tell us what we should do.

Instead they are partial takes—neither songlike nor epic—on systems more complicated and fragile, and less amenable to human governance, than previous generations of writers believed. Avowedly partial, attentive to the self and to something outside the self, the essay form—or the ghost of it, or the fragments of it—makes a bracing contrast both with the lyric compression these poets refuse, and with the giant systems they critique.

Read the full article here.

January 2nd, 2012
atomvincent

Love in the Time of Algorithms

Writing for The New Yorker, Nick Paumgarten looks at the history and intricacies of online dating. Is online matchmaking an unnatural approach to finding a mate? Or is it a reasonable and functional simplification of an age-old process? Can attraction really be broken down into equations?

The process of selecting and securing a partner, whether for conceiving and rearing children, or for enhancing one’s socioeconomic standing, or for attempting motel-room acrobatics, or merely for finding companionship in a cold and lonely universe, is as consequential as it can be inefficient or irresolute. Lives hang in the balance, and yet we have typically relied for our choices on happenstance—offhand referrals, late nights at the office, or the dream of meeting cute.

Online dating sites, whatever their more mercenary motives, draw on the premise that there has got to be a better way. They approach the primeval mystery of human attraction with a systematic and almost Promethean hand. They rely on algorithms, those often proprietary mathematical equations and processes which make it possible to perform computational feats beyond the reach of the naked brain. Some add an extra layer of projection and interpretation; they adhere to a certain theory of compatibility, rooted in psychology or brain chemistry or genetic coding, or they define themselves by other, more readily obvious indicators of similitude, such as race, religion, sexual predilection, sense of humor, or musical taste. There are those which basically allow you to browse through profiles as you would boxes of cereal on a shelf in the store. Others choose for you; they bring five boxes of cereal to your door, ask you to select one, and then return to the warehouse with the four others. Or else they leave you with all five.

It is tempting to think of online dating as a sophisticated way to address the ancient and fundamental problem of sorting humans into pairs, except that the problem isn’t very old. Civilization, in its various guises, had it pretty much worked out. Society—family, tribe, caste, church, village, probate court—established and enforced its connubial protocols for the presumed good of everyone, except maybe for the couples themselves. The criteria for compatibility had little to do with mutual affection or a shared enthusiasm for spicy food and Fleetwood Mac. Happiness, self-fulfillment, “me time,” a woman’s needs: these didn’t rate. As for romantic love, it was an almost mutually exclusive category of human experience. As much as it may have evolved, in the human animal, as a motivation system for mate-finding, it was rarely given great consideration in the final reckoning of conjugal choice.

The twentieth century reduced it all to smithereens. The Pill, women in the workforce, widespread deferment of marriage, rising divorce rates, gay rights—these set off a prolonged but erratic improvisation on a replacement. In a fractured and bewildered landscape of fern bars, ladies’ nights, Plato’s Retreat, “The Bachelor,” sexting, and the concept of the “cougar,” the Internet promised reconnection, profusion, and processing power.

Read the full article here.

December 15th, 2011
atomvincent

Read This, Not That: Man and Beast

Writing for The Chronicle of Higher Education, Justin E.H. Smith looks at the wide-ranging role animals play in human culture, both as symbols of the other and as manifestations of our hidden selves. Over the course of history, we have increasingly removed ourselves, intellectually, from community with the animal world around us. What can we learn from that cultural history? And what might we lose when we cease to think of ourselves as beasts?

Before and after Darwin, the specter of the animal in man has been compensated by a hierarchical scheme that separates our angelic nature from our merely circumstantial, and hopefully temporary, beastly one. And we find more or less the same separation in medieval Christian theology, Romantic nature poetry, or current cognitive science: All of it aims to distinguish the merely animal in us from the properly human. Thus Thoreau, widely lauded as a friend of the animals, cannot refrain from invoking animality as something to be overcome: “Men think that it is essential,” he writes, “that the Nation have commerce, and export ice, and talk through a telegraph, and ride 30 miles an hour, without a doubt, whether they do or not; but whether we should live like baboons or like men, is a little uncertain.” What the author of Walden misses is that men might be living like baboons not because they are failing at something or other, but because they are, in fact, primates. Thoreau can’t help invoking the obscene and filthy beasts that have, since classical antiquity, formed a convenient contrast to everything we aspire to be.

Read the full article here.

December 14th, 2011
atomvincent

Read This, Not That: Film Criticism and Democracy in the Digital Age

Writing for Dissent, Charles Taylor ponders how the internet has ruined film criticism, democracy, and culture.

When I started as a film critic online at Salon.com, readers could click on a link that allowed them to e-mail me directly. Within a month, I heard from more readers than I had in a decade as a print critic… That all ended when the publication made it possible for readers to post directly without going through an editor. Almost immediately, I and the other writers I knew stopped hearing directly from readers. Instead, instant posting became survival of the loudest. Posturing and haranguing ruled. If the writer was female or Jewish, misogynists and anti-Semites would turn up. Why wouldn’t they? There was no editor to stop them. Bullies and bigots seized the chance to show off. And those reasonable people, the ones I and my colleagues heard from? They went nowhere near the online forums.

This “lively forum” or “spirited debate” or whatever euphemism is now used for online bullying has always been defended by the claims that balance would be restored as reasonable respondents came in to counter the blowhards. Bullet wounds can be stitched up as well, but the damage is already done. “Communication,” the virtual-reality pioneer Jaron Lanier writes in his essential book You Are Not a Gadget, “is now often experienced as a superhuman phenomenon that towers above individuals. A new generation has come of age with a reduced experience of what a person can be, and of who each person might become.”

That kind of divisiveness is what digital culture has come to specialize in on a much broader and insidious scale—and what gets held up as proof of the Web’s democratizing influence. The same progressives who bemoan the way Fox News has polarized political discourse in America, masquerading as news while never troubling its followers with anything that would disturb its most cherished and untested convictions, happily turn to the satellite radio station of their preferred genre or subgenre of music or seek out the support group or message board that fits their demographic, the political site that skews their way. Entering the realm of the other seems done solely to express rage.

The rigorous division of websites into narrow interests, the attempts of Amazon and Netflix to steer your next purchase based on what you’ve already bought, the ability of Web users to never encounter anything outside of their established political or cultural preferences, and the way technology enables advertisers to identify each potential market and direct advertising to it, all represent the triumph of cultural segregation that is the negation of democracy. It’s the reassurance of never having to face anyone different from ourselves.

Read the full article here.
