
The Image Culture 

Christine Rosen

When Hurricane Katrina struck the Gulf Coast of Mississippi, Alabama, and Louisiana in late August, images of the immense devastation were immediately available to anyone with a television set or an Internet connection. Although images of both natural and man-made disasters have long been displayed in newspapers and on television, the number and variety of images in the aftermath of Katrina revealed the sophistication, speed, and power of images in contemporary American culture. Satellite photographs from space offered us miniature before-and-after images of downtown New Orleans and the damaged coast of Biloxi; video footage from an array of news outlets tracked rescue operations and recorded the thoughts of survivors; wire photos captured the grief of victims; amateur pictures, taken with camera-enabled cell phones or digital cameras and posted to personal blogs, tracked the disaster’s toll on countless individuals. The world was offered, in a negligible space of time, both God’s-eye and man’s-eye views of a devastated region. Within days, as pictures of the squalor at the Louisiana Superdome and photographs of dead bodies abandoned in downtown streets emerged, we confronted our inability to cope with the immediate chaos, destruction, and desperation the storm had caused. These images brutally drove home the realization of just how unprepared the U.S. was to cope with such a disaster.

But how did this saturation of images influence our understanding of what happened in New Orleans and elsewhere? How did the speed with which the images were disseminated alter the humanitarian and political response to the disaster? And how, in time, will these images influence our cultural memory of the devastation caused by Hurricane Katrina?

Such questions could be asked of any contemporary disaster — and often have been, especially in the wake of the September 2001 terrorist attacks in New York and Washington, D.C., which forever etched in public memory the image of the burning Twin Towers. But the average person sees tens of thousands of images in the course of a day. One sees images on television, in newspapers and magazines, on websites, and on the sides of buses. Images grace soda cans and t-shirts and billboards. “In our world we sleep and eat the image and pray to it and wear it too,” novelist Don DeLillo observed. Internet search engines can instantly procure images for practically any word you type. On flickr.com, a photo-sharing website, you can type in a word such as “love” and find amateur digital photos of couples in steamy embrace or parents hugging their children. Type in “terror” and among the results is a photograph of the World Trade Center towers burning. “Remember when this was a shocking image?” asks the person who posted the picture.

The question is not merely rhetorical. It points to something important about images in our culture: They have, by their sheer number and ease of replication, become less magical and less shocking — a situation unknown until fairly recently in human history. Until the development of mass reproduction, images carried more power and evoked more fear. The second of the Ten Commandments listed in Exodus 20 warns against idolizing, or even making, graven images: “Thou shalt not make unto thee any graven image, or any likeness of any thing that is in heaven above, or that is in the earth beneath, or that is in the water under the earth.” During the English Reformation, Henry VIII’s advisor Thomas Cromwell led the effort to destroy religious images and icons in the country’s churches and monasteries, and was successful enough that few survive to this day. The 2001 decision by the Taliban government in Afghanistan to destroy images throughout the country — including the two towering stone Buddhas carved into the cliffs of Bamiyan — is only the most recent example of this impulse. Political leaders have long feared images and taken extreme measures to control and manipulate them. The anonymous minions of manipulators who sanitized photographs at the behest of Stalin (a man who seemingly never met an enemy he didn’t murder and then airbrush from history) are perhaps the best known example. Control of images has long been a preoccupation of the powerful.

It is understandable why so many have been so jealous of the image’s influence. Sight is our most powerful sense, much more dominant in translating experience than taste, touch, or hearing. And images appeal to emotion — often viscerally so. They claim our attention without uttering a word. They can persuade, repel, or charm us. They can be absorbed instantly and easily by anyone who can see. They seem to speak for themselves.

Today, anyone with a digital camera and a personal computer can produce and alter an image. As a result, the power of the image has been diluted in one sense, but strengthened in another. It has been diluted by the ubiquity of images and the many populist technologies (like inexpensive cameras and picture-editing software) that give almost everyone the power to create, distort, and transmit images. But it has been strengthened by the gradual capitulation of the printed word to pictures, particularly moving pictures — the ceding of text to image, which might be likened not to a defeated political candidate ceding to his opponent, but to an articulate person being rendered mute, forced to communicate via gesture and expression rather than language.

Americans love images. We love the democratizing power of technologies — such as digital cameras, video cameras, Photoshop, and PowerPoint — that give us the capability to make and manipulate images. What we are less eager to consider are the broader cultural effects of a society devoted to the image. Historians and anthropologists have explored the story of mankind’s movement from an oral-based culture to a written culture, and later to a printed one. But it is only in the past several decades that we have begun to assimilate the effects of the move from a culture based on the printed word to one based largely on images. In making images rather than texts our guide, are we opening up new vistas for understanding and expression, creating a form of communication that is “better than print,” as New York University communications professor Mitchell Stephens has argued? Or are we merely making a peculiar and unwelcome return to forms of communication once ascendant in preliterate societies — perhaps creating a world of hieroglyphics and ideograms (albeit technologically sophisticated ones) — and in the process becoming, as the late Daniel Boorstin argued, slavishly devoted to the enchanting and superficial image at the expense of the deeper truths that the written word alone can convey?

Two things in particular are at stake in our contemporary confrontation with an image-based culture: First, technology has considerably undermined our ability to trust what we see, yet we have not adequately grappled with the effects of this on our notions of truth. Second, if we are indeed moving from the era of the printed word to an era dominated by the image, what impact will this have on culture, broadly speaking, and its institutions? What will art, literature, and music look like in the age of the image? And will we, in the age of the image, become too easily accustomed to verisimilar rather than true things, preferring appearance to reality and in the process rejecting the demands of discipline and patience that true things often require of us if we are to understand their meaning and describe it with precision? The potential costs of moving from the printed word to the image are immense. We may find ourselves in a world where our ability to communicate is stunted, our understanding and acceptance of what we see questionable, and our desire to transmit culture from one generation to the next seriously compromised.

The Mirror With a Memory

The creator of one of the earliest technologies of the image named his invention, appropriately enough, for himself. Louis-Jacques-Mandé Daguerre, a Frenchman known for his elaborate and whimsical stage design in the Paris theater, began building on the work of Joseph Nicéphore Niépce to try to produce a fixed image. Daguerre called the image he created in 1837 the “daguerreotype” (acquiring a patent from the French government for the process in 1839). He made extravagant claims for his device. It is “not merely an instrument which serves to draw nature,” he wrote in 1838, it “gives her the power to reproduce herself.”

Despite its technological crudeness and often-spectral images, the daguerreotype was eerily effective at capturing glimmers of personality in its fixed portraits. The extant daguerreotypes of well-known Americans in the nineteenth century include: a young and serious Abraham Lincoln, sans beard; an affable Horace Greeley in stovepipe hat; and a dour picture of the suffragist Lucy Stone. A daguerreotype of Edgar Allan Poe, taken in 1848, depicts the writer with a baleful expression and crossed arms, and was taken not long before Poe was found delirious and near death on the streets of Baltimore.

But the daguerreotype did more than capture the posture of a poised citizenry. It also changed artists’ perceptions of human nature. Nathaniel Hawthorne’s 1851 Gothic romance, The House of the Seven Gables, has an ancient moral (“the wrong-doing of one generation lives into the successive ones”) but made use of a modern technology, daguerreotyping, to unspool its story about the unmasking of festering, latent evil. In the story, Holgrave, the strange lodger living in the gabled house, is a daguerreotypist (as well as a political radical) who says of his art: “While we give it credit only for depicting the merest surface, it actually brings out the secret character with a truth no painter would ever venture upon, even could he detect it.” It is Holgrave’s silvery daguerreotypes that eventually reveal the nefarious motives of Judge Pyncheon — and in so doing suggest that the camera could expose human character more acutely than the eye.

Oliver Wendell Holmes called the photo the “mirror with a memory,” and in 1859 predicted that the “image would become more important than the object itself and would in fact make the object disposable.” But praise for the photograph was not universal. “A revengeful God has given ear to the prayers of this multitude. Daguerre was his Messiah,” said the French poet Charles Baudelaire in an essay written in 1859. “Our squalid society rushed, Narcissus to a man, to gaze at its trivial image on a scrap of metal.” As a result, Baudelaire worried, “artistic genius” was being impoverished.

Contemporary critiques of photography have at times echoed Baudelaire’s fear. In her elegant extended essay, On Photography, the late Susan Sontag argues that images — particularly photographs — carry the risk of undermining true things and genuine experiences, as well as the danger of upending our understanding of art. “Knowing a great deal about what is in the world (art, catastrophe, the beauties of nature) through photographic images,” Sontag notes, “people are frequently disappointed, surprised, unmoved when they see the real thing.” This is not a new problem, of course; it plagued the art world when the printing process allowed the mass reproduction of great works of art, and its effects can still be seen whenever one overhears a museum-goer express disappointment that the Van Gogh he sees hanging on the wall is nowhere near as vibrant as the one on his coffee mug.

But Sontag’s point is broader, and suggests that photography has forced us to consider that exposure to images does not necessarily create understanding of the things themselves. Images do not necessarily lead to meaning; the information they convey does not always lead to knowledge. This is due in part to the fact that photographic images must constantly be refreshed if one’s attention is to continue to be drawn to them. “Photographs shock insofar as they show something novel,” Sontag argues. “Unfortunately, the ante keeps getting raised — partly through the very proliferation of such images of horror.” Images, Sontag concludes, have turned the world “into a department store or museum-without-walls,” a place where people “become customers or tourists of reality.”

Other contemporary critics, such as Roger Scruton, have also lamented this diversionary danger and worried about our potential dependence on images. “Photographic images, with their capacity for realization of fantasies, have a distracting character which requires masterly control if it is not to get out of hand,” Scruton writes. “People raised on such images ... inevitably require a need for them.” Marshall McLuhan, the Sixties media guru, offered perhaps the most blunt and apt metaphor for photography: he called it “the brothel-without-walls.” After all, he noted, the images of celebrities whose behavior we so avidly track “can be bought and hugged and thumbed more easily than public prostitutes” — and all for a greatly reduced price.

Nevertheless, photographs still retain some of the magical allure that the earliest daguerreotypes inspired. As W. J. T. Mitchell observes in What Do Pictures Want?, “When students scoff at the idea of a magical relation between a picture and what it represents, ask them to take a photograph of their mother and cut out the eyes.” As objects, our photographs have changed; they have become physically flimsier as they have become more technologically sophisticated. Daguerre produced pictures on copper plates; today many of our photographs never become tangible things, but instead remain filed away on computers and cameras, part of the digital ether that envelops the modern world. At the same time, our patience for the creation of images has also eroded. Children today are used to being tracked from birth by digital cameras and video recorders and they expect to see the results of their poses and performances instantly. “Let me see,” a child says, when you take her picture with a digital camera. And she does, immediately. The space between life as it is being lived and life as it is being displayed shrinks to a mere second. Yet, despite these technical developments, photographs remain powerful because they are reminders of the people and things we care about. They are surrogates carried into battle by a soldier or by a traveler on holiday. They exist to remind us of the absent, the beloved, and the dead. But in the new era of the digital image, they also have a greater potential for fostering falsehood and trickery, perpetuating fictions that seem so real we cannot tell the difference.

Vanishing Commissars and Bloodthirsty Presidents

Human nature being what it is, little time passed after photography’s invention before a means for altering and falsifying photographs was developed. A German photographer in the 1840s discovered a way to retouch negatives, Susan Sontag recounts, and, perversely if not unpredictably, “the news that the camera could lie made getting photographed much more popular.”

One of the most successful mass manipulators of the photographic image was Stalin. As David King recounts in his riveting book, The Commissar Vanishes: The Falsification of Photographs and Art in Stalin’s Russia, image manipulation was the extension of Stalin’s paranoiac megalomania. “The physical eradication of Stalin’s political opponents at the hands of the secret police was swiftly followed by their obliteration from all forms of pictorial existence,” King writes. Airbrush, India ink, and scalpel were all marshaled to remove enemies such as Trotsky from photographs. “There is hardly a publication from the Stalinist period that does not bear the scars of this political vandalism,” King concludes.

Even in non-authoritarian societies, early photo falsification was commonly used to dupe the masses. A new exhibit at the Metropolitan Museum of Art in New York, “The Perfect Medium: Photography and the Occult,” displays a range of photographs from the late-nineteenth- and early-twentieth-century United States and Europe that purport to show ghosts, levitating mediums, and a motley array of other emanations that were proffered as evidence of the spirit world by devotees of the spiritualism movement popular at the time. The pictures, which include images of tiny heads shrouded in smoke and hovering over the furrowed brows of mediums, and ghosts in diaphanous robes walking through gardens, are “by turns spooky, beautiful, disturbing, and hilarious,” notes the New York Times. They create “visual records of decades of fraud, cons, flimflams and gullibility.”

Stalin and the spiritualists were not the only people to manipulate images in the service of reconstructing the past — many an angry ex-lover has taken shears to photos of a once-beloved in the hope that excising the images might also excise the bad memories the images prompt. But it was the debut of a computer program called Photoshop in 1990 that allowed the masses, inexpensively and easily, to begin rewriting visual history. Photoshop and the many copycat programs that have followed in its wake allow users to manipulate digital images with great ease — resizing, changing scale, and airbrushing flaws, among other things — and they have been both denounced for facilitating the death of the old-fashioned darkroom and hailed as democratic tools for free expression. “It’s the inevitable consequence of the democratization of technology,” John Knoll, who created Photoshop with his brother Thomas, told Salon.com. “You give people a tool, but you can’t really control what they do with it.”
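The ease Knoll describes is worth making concrete. As a purely illustrative sketch (not Photoshop itself, which is a proprietary application), the few lines of Python below use the freely available Pillow imaging library to perform the kinds of edits described above: resizing, changing scale, and “airbrushing” a blemish by blurring it away. The file names are hypothetical placeholders.

```python
# A minimal sketch of "Photoshop-style" edits, using the Pillow
# imaging library (pip install Pillow). File names are placeholders.
from PIL import Image, ImageFilter

original = Image.open("portrait.jpg")

# Resize / change scale: shrink the picture to half its dimensions.
half_size = original.resize((original.width // 2, original.height // 2))

# "Airbrush" a flaw: blur a small rectangular region and paste it back.
box = (100, 100, 160, 160)  # left, upper, right, lower, in pixels
patch = original.crop(box).filter(ImageFilter.GaussianBlur(radius=4))
retouched = original.copy()
retouched.paste(patch, box)

retouched.save("portrait_retouched.jpg")
```

That such alterations take a dozen lines of freely available code, rather than a darkroom and years of craft, is precisely the democratization Knoll describes.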

For some people, of course, offering Photoshop as a tool is akin to giving a stick of dynamite to a toddler. Last year, The Nation published an advertisement that used Photoshop to superimpose President Bush’s head over the image of a brutal and disturbing Richard Serra sculpture (which itself borrows from Goya’s painting, “Saturn Devouring One of His Children”) so that Bush appeared to be enthusiastically devouring a naked human torso. In contrast to the sickening image, the accompanying text appears prim: www.pleasevote.com. As this and other images suggest, Photoshop has introduced a new fecklessness into our relationship with the image. We tend to lose respect for things we can manipulate. And when we can so readily manipulate images — even images of presidents or loved ones — we contribute to the decline of respect for what the image represents.

Photoshop is popular not only because it allows us visually to settle scores, but also because it appeals to our desire for the incongruous (and the ribald). “Photoshop contests” such as those found on the website Fark.com offer people the opportunity to create wacky and fantastic images that are then judged by others in cyberspace. This is an impulse that predates software and whose most enthusiastic American purveyor was, perhaps, P. T. Barnum. In the nineteenth century, Barnum barkered an infamous “mermaid woman” that was actually the moldering head of a monkey stitched onto the body of a fish. Photoshop allows us to employ pixels rather than taxidermy to achieve such fantasies, but the motivation for creating them is the same — they are a form of wish fulfillment and, at times, a vehicle for reinforcing our existing prejudices.

Of course, Photoshop meddling is not the only tactic available for producing misleading images. Magazines routinely airbrushed and retouched photographs long before picture-editing software was invented. And of course even “authentic” pictures can be staged, like the 1960s Life magazine pictures of Muhammad Ali that showed him training underwater; in fact, Ali couldn’t even swim, and he hadn’t done any underwater training for his prizefights before stepping into the pool for that photo opportunity. More recently, in July 2005, the New York Times Magazine raised eyebrows when it failed to disclose that the Andres Serrano photographs accompanying a cover story about prisoner interrogation were in fact staged images rather than straightforward photojournalism. (Serrano was already infamous for his controversial 1989 photograph, “Piss Christ.”) The Times public editor chastised the magazine for violating the paper’s guidelines that “images in our pages that purport to depict reality must be genuine in every way.”

But while Photoshop did not invent image fraud, it has made us all potential practitioners. It enables the average computer user to become a digital prankster whose merrymaking with photographs can create more than silly images — it can spawn political and social controversy. In a well-reported article published in Salon.com in 2004, Farhad Manjoo explored in depth one such controversy: an image that purportedly showed an American Marine reservist in Iraq standing next to two young boys. One boy held a cardboard sign that read, “Lcpl Boudreaux killed my Dad then he knocked up my sister!” When the image found its way to the Council on American-Islamic Relations (CAIR), Manjoo reports, it seemed to prove the group’s worst fears about the behavior of American soldiers in Iraq. An angry press release soon followed. But then another image surfaced on various websites, identical to the first except for the text written on the cardboard sign, which now read, “Lcpl Boudreaux saved my Dad then he rescued my sister!” The authenticity of both photos was never satisfactorily proven, and, as Manjoo notes, the episode serves as a reminder that in today’s Photoshop world, “pictures are endlessly pliable.” (Interestingly, CAIR found itself at the center of a recent Photoshop scandal, the Weekly Standard reported, when it was shown that the organization had Photoshopped a hijab, or headscarf, onto several women in a picture taken at a CAIR event and then posted the doctored image on the organization’s website.)

Just as political campaigns in the past produced vituperative pamphlets and slogans, today Photoshop helps produce misleading images. The Bush-Cheney campaign was pilloried for using a Photoshopped image of a crowd of soldiers in the recent presidential election; the photo duplicated groups of soldiers to make the crowd appear larger than it actually was. The replicated faces of the soldiers recalled an earlier and cruder montaged crowd scene, “Stalin and the Masses,” produced in 1930, which purported to show the glowering dictator, in overcoat and cap, standing before a throng of loyal communists. (Other political campaigns — and university publicity departments — have also reportedly resorted to using Photoshop on pictures to make them seem more racially diverse.) Similarly, an image that appeared to show Jane Fonda addressing a Seventies-era anti-war crowd, with a young and raptly admiring John Kerry looking on, was a product of Photoshop sorcery, yet it circulated widely on the Internet during the last presidential election as evidence of Kerry’s extreme views. The doctored image fooled several news outlets before its questionable provenance was revealed. (Another image of Kerry and Fonda, showing them both sitting in the audience at a 1970 anti-war rally, was authentic.)

Photoshop, in effect, democratizes the ability to commit fraud. As a result, a few computer programmers are creating new digital detection techniques to uncover forgeries and manipulations. The Inspector Javert of digital fraud is Dartmouth computer science professor Hany Farid, who developed a software program that analyzes the pattern of pixels in digital images. Since all digital pictures are, in essence, a collection of codes, Farid’s program ferrets out “abnormal patterns of information that, while invisible to the eye, are detectable by computer” and that represent possible tampering, according to the New York Times. “It used to be that you had a photograph, and that was the end of it — that was truth,” Farid said last July. “We’re trying to bring some of that back. To put some measure of guarantee back in photography.”
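Farid’s actual techniques are statistical and belong to his research; the essay says only that his software seeks out “abnormal patterns of information.” But the underlying idea — that tampering leaves machine-detectable regularities invisible to the eye — can be illustrated with a deliberately crude sketch. The Python fragment below (a toy under stated assumptions, not Farid’s program) hunts for exact copy-paste cloning, the kind of duplication used in the soldier-crowd photo above, by fingerprinting small pixel blocks and flagging repeats.

```python
# Toy forensic check: exact copy-paste cloning (e.g., a duplicated crowd)
# leaves identical pixel blocks that a computer can find even when the
# eye cannot. This is NOT Hany Farid's algorithm, which relies on far
# subtler statistical cues; it illustrates the principle only.
import hashlib
from collections import defaultdict
from PIL import Image  # pip install Pillow

def find_cloned_blocks(path, block=16):
    img = Image.open(path).convert("RGB")
    seen = defaultdict(list)  # block fingerprint -> positions where it occurs
    for y in range(0, img.height - block + 1, block):
        for x in range(0, img.width - block + 1, block):
            tile = img.crop((x, y, x + block, y + block))
            digest = hashlib.sha1(tile.tobytes()).hexdigest()
            seen[digest].append((x, y))
    # Any fingerprint that occurs more than once marks candidate duplication.
    # (Flat regions such as clear sky also repeat, so a real tool must
    # filter low-variance blocks and handle recompression and resampling.)
    return [positions for positions in seen.values() if len(positions) > 1]

# Hypothetical usage, with a placeholder file name:
# for group in find_cloned_blocks("crowd_photo.jpg"):
#     print("identical blocks at", group)
```

Real forgeries are rarely pixel-exact after JPEG recompression, which is why research-grade detectors like Farid’s work in statistical rather than literal terms; the sketch shows only why a doctored file can betray itself to a machine.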

But the digital manipulation of images can also be employed for far more enlightened purposes than removing models’ blemishes and attacking political opponents. Some artists use Photoshop merely to enhance photographs they take; others have made digital editing a central part of their art. The expansive images of the German photographer Andreas Gursky, whose photos of Montparnasse, the Tokyo Stock Exchange, and a 99-cent store make use of digital alteration, prompt us to look at familiar spaces in unfamiliar ways. The portraits taken and Photoshopped by artist Loretta Lux are “mesmerizing images of children who seem trapped between the nineteenth and twenty-first centuries, who don’t exist except in the magical realm of art,” according to a New York Times critic. Here the manipulation of the image does not intrude. It illuminates. In these pictures, the manipulation of the image at least serves an authentic artistic vision, a vision that relies on genuine aesthetic and critical standards. Ironically, it is these very standards that a culture devoted to the image risks compromising.

The MTV Effect

The still images of daguerreotyping and photography laid the groundwork for the moving image in film and video; as photography did before them, these technologies prompted wonder and sweeping claims about the merits of this new way of seeing. In 1915, after a screening of filmmaker D. W. Griffith’s The Birth of a Nation, Woodrow Wilson declared that it was “like writing history with lightning” (a judgment Griffith promptly began using in his promotional efforts for the film). Moving images are as powerful as photos, if not more so. Like photographs, they appeal to emotion and can be read in competing ways. Yet moving images change so rapidly and so often that they arrest our attention and tax the brain’s ability to absorb what we are seeing. They are becoming a ubiquitous presence in public and private life — so much so that Camille Paglia, an astute critic of images, has called our world “a media starscape of explosive but evanescent images.”

The moving image, like the photograph, can also be marshaled to prove or disprove competing claims. During the legal and political debate surrounding the case of Terri Schiavo, for example, videotape of her movements and apparent responsiveness to loved ones became central in this family dispute-turned-national drama. Those who argued for keeping Schiavo alive used the footage as evidence that she did indeed have feelings and thoughts that rendered attempts to remove her feeding tube barbaric and immoral. Those who believed that she should be left to die (including her husband) thought the tape “grossly deceptive,” because it represented a misleading portrait of Schiavo’s real condition. Most of the time, her husband and others argued, Terri did not demonstrate awareness; she was “immobile, expressionless.” In the Schiavo case, the moving image was both alibi and accuser.

Most Americans consume moving images through the media of television and movies (and, to a lesser degree, through the Internet and video games). In recent years, in what many observers have called “the MTV effect,” those moving images have become more nimble and less demanding of our attention. Jumping quickly from image to image in hastily edited segments (in some cases as quickly as one image every one-thirtieth of a second), television and, to a lesser extent, movies offer us a constant stream of visual candy. Former Vice President Al Gore’s new for-profit public access television channel, Current TV, is the latest expression of this trend. The network’s website lists its upcoming programming in tiny time increments: “In 1 min,” “In 3 min,” “In 10 min,” and so on. Reviewing the channel’s first few broadcasts, New York Times television critic Alessandra Stanley noted the many techniques “designed to hold short attention spans,” including a “progress bar” at the bottom of the screen that counts down how much time is left for each of the segments — some of which last as little as 15 seconds.

According to enthusiasts of television, the speed and sophistication of moving images allows new and improved forms of oral storytelling that can and should replace staler vehicles like the novel. Video game and television apologist Steven Johnson, author of Everything Bad is Good for You, dreams of a world of “DVD cases lining living room shelves like so many triple-decker novels.” If television is our new form of narrative, then our storytelling skills have declined, as anyone who has watched the new raft of sitcoms and dramas that premiere (and then quickly disappear) each fall on the major networks can attest. (Shows like The Sopranos are perhaps the rare exception.) In fact, television doesn’t really “tell stories.” It constructs fantasy worlds through a combination of images and words, relying more on our visual and aural senses and leaving less to the imagination than oral storytelling does. Writing some years ago in the journal Media & Values, J. Francis Davis noted that although television is in one sense a form of storytelling, the most important messages that emanate from the screen “are those not verbalized — the stories and myths hidden in its constant flow of images.”

It is precisely those hidden stories in the moving image that excite critics like NYU professor Mitchell Stephens. In The Rise of the Image, The Fall of the Word, Stephens argues that the moving image offers a potential cure for the “crisis of the spirit” that afflicts our society, and he is enthusiastic about the fact that “the image is replacing the word as the predominant means of mental transport.” Stephens envisions a future of learning through synecdoche, using vivid and condensed images: “A half second of the Capitol may be enough to indicate the federal government, a quick shot of a white-haired woman may represent age. The part, in other words, will be substituted for the whole so that in a given period of time it will be possible to consider a larger number of wholes.” He quotes approvingly the prediction of movie director Ridley Scott, who declares: “Film is twentieth-century theater, and it will become twenty-first-century writing.”

Perhaps it will. But Stephens, like other boosters of the image, fails to acknowledge what we will lose as well as gain if this revolution succeeds. He says, for example, “our descendants undoubtedly will still learn to read and write, but they undoubtedly will read and write less often and, therefore, less well.” Language, too, will be “less precise, less subtle,” and books “will maintain a small, elite audience.” This, then, is the future that prompts celebration: a world where, after a century’s effort to make literacy as broadly accessible as possible — to make it a tool for the masses — the ability to read and write is once again returned to the elite. Reading and writing either become what they were before widespread education — a mark of privilege — or else antiquarian preoccupations or mere hobbies, like coin collecting.

Stephens also assumes that the people who will be absorbing these images will have a store of knowledge at their disposal with which to interpret them. A quick shot of a white-haired woman might effectively be absorbed as symbolizing “age” to one person, as Stephens says, but it could also reasonably prompt ideas such as “hair dye,” “feebleness,” or “Social Security” to another. As Camille Paglia observes of her own students, “young people today are flooded with disconnected images but lack a sympathetic instrument to analyze them as well as a historical frame of reference in which to situate them.” They lack, in other words, a shared language or lexicon that would allow them to interpret images and then communicate an understanding of what they are seeing.

Such a deficit will pose a unique challenge for cultural transmission from one generation to the next. How, in Stephens’s future world of the moving image, will history, literature, and art be passed down to the next generation? He might envision classrooms where children watch the History Channel rather than pore over dull textbooks. But no matter how much one might enjoy the BBC’s televised version of Pride and Prejudice, it is no substitute for actually reading Austen’s prose, nor is a documentary about the American Constitutional Convention as effective at distilling the political ideals of the early American republic as reading The Federalist Papers. Moving images are a rich aid to learning and understanding, but their victory as the best means of forming rigorous habits of mind is by no means assured.

In addition, Stephens accepts uncritically the claim that the “old days” of written and printed culture are gone (or nearly so) and assumes that video is the language that has emerged, like some species evolving through a process of natural selection, to take its place in the culture. He does not entertain the possibility that the reason the moving image is replacing the written word is not because it is, in fact, a superior form for the communication of ideas, but because the moving image — more so than the written word — crudely but intoxicatingly satisfies our desire for stimulation and immediate gratification.

Like any good techno-enthusiast, Stephens takes the choices that we have made en masse as a culture (such as watching television rather than reading), accepts them without challenge, and then declares them inevitable. This is a form of reasoning that techno-enthusiasts often employ when they attempt to engage the concerns of skeptics. Although rhetorically useful in the short-term, this strategy avoids the real questions: Did things have to happen this way rather than that way? Does every cultural trend make a culture genuinely better? By neglecting to ask these questions, the enthusiast becomes nearly Panglossian in his hymns to his new world.

There is, of course, a long and thorough literature critical of television and the moving image, most notably the work of Neil Postman, Jerry Mander, and Marie Winn. And as with photography, from its earliest days there have been those who worried that television might undermine our appreciation for true things. “Television hangs on the questionable theory that whatever happens anywhere should be sensed everywhere,” E. B. White wrote in The New Yorker in 1948. “If everyone is going to be able to see everything, in the long run all sights may lose whatever rarity value they once possessed, and it may well turn out that people, being able to see and hear practically everything, will be specially interested in almost nothing.” Others are even blunter. As Roger Scruton writes, “Observing the products of the video culture you come to see why the Greeks insisted that actors wear masks, and that all violence take place behind the scenes.” It is possible, in other words, to see too much, and in the seeing lose our grasp on what is real. Television is the perfect vehicle for this experience, since it bombards us with shocking, stimulating, and pleasant images, all the while keeping us at a safe remove from what we are seeing.

But the power the moving image now exercises over modern American life has grown considerably in recent years. It is as if the Jumbotron television screen that looms over Times Square in New York has replicated and installed itself permanently in public space. Large screens broadcasting any number of images and advertisements can be found in most sports arenas, restaurants, and shopping malls; they even appear in a growing number of larger churches. The dentist’s and doctor’s office are no longer safe havens from a barrage of images and sounds. A walk through an airport terminal is now a gauntlet of moving images, as televisions bolted into ceilings or walls blare vacuous segments from CNN’s dedicated “airport programming”; once on board a plane, we’re treated to nonstop displays of movies and TV options like “NBC In Flight.” The ubiquity of television sets in public space is often explained as an attempt to entertain and distract, but in fact it seems more successful at annoyance or anesthetization. For people who wish to travel, eat, or pray in silence, there are few options beyond the deliciously subversive “TV-B-Gone” device, a universal remote control the size of a key chain that allows users to turn off televisions in public places. Considering the number of televisions currently in use, however, it would take an army of TV-B-Gone users to restore peace and quiet in public space.

One of the more startling developments in recent years is the moving image’s interjection into the classical concert hall. In 2004, the New York Philharmonic experimented with a 15-by-20-foot screen that projected enormous images of the musicians and conductor to the audience during performances of Wagner and Brahms. The orchestra trustee who encouraged the project was blunt about his motivation: “We want to increase attendance at concerts, change the demographics,” he told the New York Times. “And the younger generation is more responsive to visual stimuli.” A classical music industry consultant echoed the sentiment. “We have to recognize that this is a visual generation,” he said. “They are used to seeing things more than they are used to hearing things.” Symphonies in Vancouver, San Diego, Omaha, Atlanta, and Philadelphia have all tried using moving images during concerts, and some orchestras are resorting to gimmicks such as projecting works of art during performances of Mussorgsky’s “Pictures at an Exhibition,” or broadcasting images of space during Holst’s “The Planets.”

Among those less than pleased with the triumph of the moving image in the concert hall are the musicians themselves, who are haplessly being transformed into video stars. “I found it very distracting,” a violinist with the New York Philharmonic said. “People might as well stay home with their big-screen TVs,” said another resignedly. “It’s going the route of MTV, and I’m not sure it’s the way to go.” What these musicians are expressing is a concern for the eclipse of their music, which often requires discipline and concentration to appreciate, by imagery. The images, flashing across a large screen above their heads, demand far less of their audience’s active attention than the complicated notes and chords, rhythms and patterns, coming from their instruments. The capitulation of the concert hall to the moving image suggests that in an image-based culture, art will only be valuable insofar as it can be marketed as entertainment. The moving image redefines all other forms of expression in its image, often leaving us impoverished in the process.

Brain Candy

Concern about the long-term effects of being saturated by moving images is not merely the expression of quasi-Luddite angst or cultural conservatism. It has a basis in what the neurosciences are teaching us about the brain and how it processes images. Images can have a profound physiological impact on those who view them. Dr. Steven Most, a postdoctoral fellow at Yale University, recently found that graphic images can “blind” us by briefly impairing the brain, often for as long as one-fifth of a second. As his fellow researcher explained to Discovery News: “Brain mechanisms that help us to attend to things become tied up by the provocative image, unable to orient to other stimuli.”

Another study by researchers at the Center for Cognitive Science at Ohio State University found that, for young children, sound was actually more riveting than images — overwhelmingly so, in some cases. The research findings, which were published in Child Development, showed that “children seem to be able to process only one type of stimuli at a time” and that “for infants, sounds are preferred almost exclusively,” a preference that continues up until at least age four. In their book Imagination and Play in the Electronic Age, Dorothy and Jerome Singer argue that “the electronic media of television, film and video games now may contribute to the child’s development of an autonomous ongoing consciousness but with particular constraints. Looking and listening alone without other sensory inducements,” they write, “can be misleading guides to action.”

Research into the function of the primary visual cortex region of the brain suggests that it is not alarmist to assume that constant visual stimulation of the sort broadcast on television might have profound effects on the brains of children, whose neurological function continues to develop throughout childhood and adolescence. One study conducted at the University of Rochester and published in the journal Nature in 2004, involved, weirdly enough, tracking the visual processing patterns of ferrets that were forced to watch the movie The Matrix. The researchers found some surprising things: The adult ferrets “had neural patterns in their visual cortex that correlated very well with images they viewed,” according to a summary of the research, “but that correlation didn’t exist at all in very young ferrets, suggesting the very basis of comprehending vision may be a very different task for young brains versus old brains.” The younger ferrets were “taking in and processing visual stimuli” just like the adult ferrets, but they were “not processing the stimuli in a way that reflects reality.”

These kinds of findings have led to warnings about the long-term negative impact of moving images on young minds. A study published in 2004 in the journal Pediatrics, for example, found a clear link between early television viewing and later problems such as attention deficit/hyperactivity disorder, and recent research has suggested troubling, near-term effects on behavior for young players of violent video games. In short: Moving images — ubiquitous in homes and public spaces — pose challenges to healthy development when they become the primary object of children’s attention. Inculcating the young into the image culture may be bad for their brains.

The Closing of the PowerPoint Mind

A culture that raises its children on the milk of the moving image should not be surprised when they prove unwilling to wean themselves from it as adults. Nowhere is the evidence of this more apparent than in the business world, which has become enamored of and obedient to a particular image technology: the computer software program PowerPoint.

PowerPoint, a program included in the popular “Microsoft Office” suite of software, allows users to create visual presentations using slide templates and graphics that can be projected from a computer onto a larger screen for an audience’s benefit. The addition of an “AutoContent Wizard,” which is less a magician than an electronic duenna, helpfully ushers the user through an array of existing templates, suggesting bullet points and summaries and images. Its ease of use has made PowerPoint a reliable and ubiquitous presence at board meetings and conferences worldwide.

In recent years, however, PowerPoint’s reach has extended beyond the business office. People have used PowerPoint slides at their wedding receptions to depict their courtship as a series of “priority points” and pictures. Elementary-school children are using the software to craft bullet-point-riddled book reports and class presentations. As a 2001 story in the New York Times reported, “69 percent of teachers who use Microsoft software use PowerPoint in their classrooms.”

Despite its widespread use, PowerPoint has spawned criticism almost from its inception, and has been called everything from a disaster to a virus. Some claim the program aids sophistry. As a chief scientist at Sun Microsystems put it: “It gives you a persuasive sheen of authenticity that can cover a complete lack of honesty.” Others have argued that it deadens discussion and allows presenters with little to say to cover up their ignorance with constantly flashing images and bullet points. Frustration with PowerPoint has grown so widespread that in 2003, the New Yorker published a cartoon that illustrated a typical job interview in hell. In it, the devil asks his applicant: “I need someone well versed in the art of torture — do you know PowerPoint?”

People subjected endlessly to PowerPoint presentations complain about its oddly chilling effect on thought and discussion and the way the constantly changing slides easily distract attention from the substance of a speaker’s presentation. These concerns prompted Scott McNealy, the chairman of Sun Microsystems, to forbid his employees from using PowerPoint in the late 1990s. But it was the exegesis of the PowerPoint mindset published by Yale emeritus professor Edward Tufte in 2003 that remains the most thorough challenge to this image-heavy, analytically weak technology. In a slim pamphlet titled The Cognitive Style of PowerPoint, Tufte argued that PowerPoint’s dizzying array of templates and slides “weaken verbal and spatial reasoning, and almost always corrupt statistical analysis.” Because PowerPoint is “presenter-oriented” rather than content or audience-oriented, Tufte wrote, it fosters a “cognitive style” characterized by “foreshortening of evidence and thought, low spatial reasoning ... rapid temporal sequencing of thin information ... conspicuous decoration ... a preoccupation with format not content, [and] an attitude of commercialism that turns everything into a sales pitch.” PowerPoint, Tufte concluded, is “faux-analytical.”

Tufte’s criticism of PowerPoint made use of a tragic but effective example: the space shuttle Columbia disaster. When NASA engineers evaluated the safety of the shuttle, which had reached orbit but faced risks upon reentry due to tiles that had been damaged by loose foam during launch, they used PowerPoint slides to illustrate their reasoning — an unfortunate decision that led to very poor technical communication. The Columbia Accident Investigation Board later cited “the endemic use of PowerPoint briefing slides instead of technical papers as an illustration of the problematic methods of technical communication at NASA.” Rather than simply a tool that aids thought, PowerPoint changes the way we think, forcing us to express ourselves in terms of its own functionalities and protocols. As a result, only that which can be said using PowerPoint is worth saying at all.

Pseudo-Events and Pseudo-Culture

Although PowerPoint had not yet been created when he published his book, The Image, in 1961, historian Daniel Boorstin was nevertheless prescient in his warnings about the dangers of a culture that entrusted its rational decision-making to the image. Boorstin argued that by elevating image over substance and form over content, society was at risk of substituting “pseudo-events” for real life and personal image-making for real virtue. (He described in detail new efforts to create public images for the famous and not-so-famous, a process well illustrated by a Canon camera commercial of several years ago that featured tennis star Andre Agassi insouciantly stating, “Image is everything.”)

“The pseudo-events which flood our consciousness are neither true nor false in the old familiar senses,” Boorstin wrote, but they have created a world “where fantasy is more real than reality, where the image has more dignity than its original.” The result was a culture of “synthetic heroes, prefabricated tourist attractions, [and] homogenized interchangeable forms of art and literature.” Images were wildly popular, Boorstin conceded, but they were, in fact, little different from illusions. “We risk being the first people in history to have been able to make their illusions so vivid, so persuasive, so ‘realistic’ that they can live in them,” he wrote.

Other critics followed Boorstin. In The Disappearance of Childhood, Neil Postman wrote about the way the “electronic and graphic revolutions” launched an “uncoordinated but powerful assault on language and literacy, a recasting of the world of ideas into speed-of-light icons and images.” Images, Postman worried, “ask us to feel, not to think.” French critic Roland Barthes fretted that “the image no longer illustrates the words; it is now the words which, structurally, are parasitic on the image.” In a more recent iteration of the same idea, technology critic Paul Virilio identified a “great threat to the word” in the “evocative power of the screen.” “It is real time that threatens writing,” he noted, “once the image is live, there is a conflict between deferred time and real time, and in this there is a serious threat to writing and to the author.”

Real events are now compared to those of sitcom characters; real tragedies or accidents are described as being “just like a movie” (a practice Susan Sontag first noticed in the 1970s). Even the imagination is often crippled by our image-based culture. For every creative artist (like Gursky) using Photoshop there is a plethora of posturing and shallow artists like Damien Hirst, who once proudly told an interviewer that he spent more time “watching TV than ever I did in the galleries.”

Is it possible to find a balance between naïve techno-enthusiasm for the image culture and the “spirit of bulldog opacity,” as McLuhan described it, which fueled undue skepticism about new technologies in the past? Perhaps devotees of the written word will eventually form a dwindling guild, pensioned off by universities and governments and think tanks to live out their days in quiet obscurity as the purveyors of the image culture expand their reach. But concern about a culture of the image has a rich history, and neither side can yet claim victory. In the preface to his book, The Essence of Christianity, published in 1843, Feuerbach complained that his own era “prefers the image to the thing, the copy to the original, the representation to the reality, appearance to being.”

Techno-enthusiasts are fond of reminding us, as if relating a quaint tale of reason’s triumph over superstition, that new technologies have always stirred controversy. The printing press unnerved the scholastic philosophers and religious scribes whose lives were paced to the tempo of the manuscript; later, the telephone was indicted by a cadre fearful of its threat to conviviality and face-to-face communication, and so on. The laborious copiers of manuscripts did indeed fear the printing press, and some traditionalists did vigorously resist the intrusions of the telephone. But at a time of great social hierarchy, much of this was driven by an elite disdain for the democratizing influence of these technologies and their potential for overturning social conventions (which indeed many of them did). Contemporary criticism of our image-saturated culture is not criticism of the means by which we create images (cameras, television, video). No one would seriously argue for the elimination of such technologies, as those who feared Gutenberg’s invention did when they destroyed printing presses. The critique is an expression of concern about the ends of an image-based culture, and our unwillingness as yet to consider whether those ends might be what we truly want for our society.

Nor is concern about the image culture merely a fear of losing our grip on what is familiar — that known world with its long history of reliance on the printed word. Those copyists who feared the printing press were not wrong to believe that it would render them obsolete. It did. But contemporary critics who question the proliferation of images in culture and who fear that the sheer number of images will undermine the sensibility that creates readers of the written word (replacing them with clever but shallow interpreters of the image) aren’t worried about being usurped by image-makers. They are motivated largely by the hope of preserving what is left of their craft. They are more like the conservationist who has made the forest his home only to discover, to his surprise, that the animals with which he shares it are rapidly dwindling in number. What he wants to know, in his perplexed state, is not “how do I retreat deeper into the forest?” but “how might I preserve the few survivors before all record of them is lost?”

So it is with those who resist an image-based culture. As its boosters suggest, it is here to stay, and likely to grow more powerful as time goes on, making all of us virtual flâneurs strolling down boulevards filled with digital images and moving pictures. We will, of course, be enormously entertained by these images, and many of them will tell us stories in new and exciting ways. At the same time, however, we will have lost something profound: the ability to marshal words to describe the ambiguities of life and the sources of our ideas; the possibility of conveying to others, with the subtlety, precision, and poetry of the written word, why particular events or people affect us as they do; and the capacity, through language, to distill the deeper meaning of common experience. We will become a society of a million pictures without much memory, a society that looks forward every second to an immediate replication of what it has just done, but one that does not sustain the difficult labor of transmitting culture from one generation to the next.

Christine Rosen, "The Image Culture," The New Atlantis, Number 10, Fall 2005, pp. 27-46.

We are always in need of definitions whenever we want to explore why cultures change. We are pressed to come up with answers as to what culture might be and how the idea of culture might fit into a nutshell. The general applicability of the answer we struggle to devise invites theoretical formulas and abstraction from specific historical developments. It also, as a result, cautions us to choose fields from which to cull the situations and conflicts that may help deliver the concepts we want to grasp, and invites us to understand the theory of culture as shaped by how events unfold and how society moves along. In particular, one may have in mind what the American philosopher Ralph Waldo Emerson once wrote about Napoleon (to us French, our favourite dictator) in a book he devoted to figures of historical importance, Representative Men: “Such a man was wanted, and such a man was born.”[1] This strikes a negative note, as does a quote from Napoleon himself that Emerson unearthed from the vast body of memoirs the Napoleonic era has handed down to us. Napoleon is reported to have once declared: “My hand of iron […] was not at the extremity of my arm; it was immediately connected with my head.”[2] The remark and the quote hold a tentative definition of culture: culture begins when sheer force is mitigated by intellect, intellect itself being shaped by a response to facts and, we hope, as Emerson hopes, abstracted from fact by ethical imperative. On top of this, we feel Emerson’s attempt at rationality is run through by doubt: what if one could never discriminate between intellect and action? What if ethics could never disengage us from the cogs of history, and were incapable of controlling an ongoing process that leads to disaster and apocalypse? Whenever one tries to define culture, culture breaks down into its many components: it splinters into action and responsibility, and we feel there might never be a connection between them. There lies Emerson’s historical pessimism, which is hard to tone down.

In recent years, a debate has come to the foreground, for reasons that have to do with our increasingly globalized world: are there any values left? If such a thing as culture exists, there might be precise contents of an ethical sort that we want to pin down. Might not this sense of emptiness be the result of a crisis of value, as if the very idea of value had been swept away? This is what the French cultural critic Hubert Damisch thinks has happened, in a recent contribution to a volume aptly titled Which Values for Our Time, published by the Gulbenkian Foundation of Lisbon. Damisch sums up his inquiry as follows: “Crisis of values, or crisis of value?”[3] The suggestion is of course that value is no longer visible on the horizon of our history to come, that the trend should be resisted, and that intellectual resistance is what we need. The awareness that values are hard to come by is by no means new among philosophers and cultural critics. In book seven of Plato’s Republic, humankind is looking at the walls of a cave, watching the shadows dance there, and being taught that our poor sight precludes the perception of good and evil, and of the difference between them. Now that the walls of the cave have turned into television screens, one image is chased away by the next, while our sense of global responsibility dissolves into thin air, even though all the fields of human action hold perspectives of responsibility within them. Culture, like values, is a plenum and a void, a constant expectation, and in the end something impossible when one looks at results and facts.

We should keep in mind Jacques Derrida’s anthropology of culture, and the degree to which it identifies conflict as the prime mover within our cultural narratives. In a major contribution to a Cerisy conference in Normandy in 1980, titled “On a Newly Arisen Superior Tone in Philosophy”4, Derrida opposes two sets of attitudes: seeking rationality and seeking mystery. He views culture as the competition between the Aufklärer and the mystics, and suggests that the two trends in cultural discourse might eventually reach some kind of truce through their interaction. No doubt he was trying to hold historical pessimism at a distance by suggesting that something might be gained in the historical development of cultures if rationality were capable of reading through the language of mysticism and curbing the influence of those he chose to call the mystagogues, in whom he saw a danger for democracy and human dignity. Cultures change, and when they do, if we abide by Derrida’s critical thinking, they are pulled in opposite directions. They change so as to eliminate reason, even, as Derrida puts it, to emasculate it, and we must, as a result, apply pressure to preserve amity and to uphold the values of democracy. To be sure, Derrida’s onslaught upon mystery is no onslaught upon religious values: in the current context of globalized liberal economies and environmental overuse there are many other targets we might think of, such as religious fundamentalism, terrorism, and the emergence of a global self-appointed elite, although Derrida’s inquiry was begun some thirty years ago, and he never gets very precise about what should be indicted.

Disaster and Apocalypse

Our globalizing societies offer alternatives to an ideal world. In particular, market mechanisms and the rise of global capital have impoverished some non-European nations, while Europe has, in recent years, worked to thin the flow of immigration while downsizing out of their jobs the low-skilled workers of a once predominantly industrial economy that has now turned to services. As a result, local communities, both in Europe and in the United States, have been impoverished within a glitzier context of affluence. In China as elsewhere, industrial activity has surged, while working conditions have never been worse among the former peasants driven to urban areas. Globalization may well pass for an agenda of disaster and social apocalypse, as Joseph Stiglitz has demonstrated5. Welfare and human rights have hardly benefited from the promise economic liberalism keeps harping on, and human development has been restricted to the rising middle classes of China or India, if we look at the most significant examples. Richard Rorty, meditating on social hope, has brought home the idea that globalization has been a blow to democracy. He wrote the following in an essay published in 1993: “We now have a global overclass which makes all the major economic decisions, and makes them entirely independently from the legislatures, and a fortiori of the will of the voters, of any given country”6. Rorty’s remark comes as an apposite reminder that there is no such thing as a world government, a fact we all tend to overlook. The ideology of economic growth heralds human development, but delivers little in terms of the strengthening of local communities, in rising nations as well as in Western ones. Might not this ideology be the most recent embodiment of the pseudo-thinking the mystagogues parade as rationality for us to kneel to?

Communities, we hear, have gone global, which means they are now glocal. The portmanteau word means more than it seems to say. On the one hand, the buzzword suggests that local communities may be strengthened by globalization; on the other, it suggests that local communities are shaped, in ways that cannot all be positive, by the advance of global liberalism. One of the unsought effects of glocalization, however, may well be that cultural interference with distant or unknown communities emerges from the pressure of global liberalism, dissolving national, or even nationalist, perspectives and favouring international contacts. Let us be cautious here: international interaction, in the context of globalizing economic exchange, may well be nothing other than buying and selling, one more version of materialism, without national values being cross-fertilized.

Globalization cannot control the rise of a new conservatism, in spite of the surge of optimism that accompanies it in some areas, if we look at the poor condition of welfare systems across developed countries and elsewhere. As Habermas has pointed out, “modernity sees itself as dependent exclusively upon itself”7, and utopian ideals are increasingly wiped out of the Zeitgeist. Globalization is in dire need of strengthening, not exhausting, utopian energies. If it proves incapable of renewing them, the road globalization takes may well be what recent evidence suggests it is: a hurdle race, with one winner, a few good athletes, and vast crowds of anonymous losers. Jacques Derrida has pointed out that we need peace in culture, and that peace can be achieved when the mystagogues accept interaction with rationality. Rationality, however, is not for him an empty bottle, or an instrument by which societies may solve practical questions. Rationality involves moral choice, and one may well suggest that Habermas’s notion that utopian ideals have to be upheld is the best way to reorder and refashion global liberalism. No doubt the culture wars must go on, to stem the current backlash and its related traumas: terrorism East and West, political violence within national borders and without, the religious fundamentalism which has found its ecotope in globalization, in Israel, in the Arab world, in the United States, and elsewhere, while environmental disasters from North to South take their toll upon communities. Cultures change as a result of globalization, for reasons that have to do with the systemic risks that globalization runs through them, risks which are supra-human but which, for that very reason, have to be identified, deconstructed, and eliminated, although we know this cannot be the work of a single generation. Indifference as well as naïveté ought to be avoided. If utopian values are used up, as Habermas thinks they are, because they are targeted, then they must be invigorated.

No doubt any such invigoration, if it is to have pragmatic efficiency, requires specific measures and precautions. Intellectual clarity can help, and meditation upon what is and what is not scientific can be an asset. It is true that odium has been cast on the precautionary principle by some scholars of environmental studies. In a fairly recent issue (2004) of the M.I.T. Press quarterly Global Environmental Politics, the scholars Emery Roe and Michel Van Eeten condemned the precautionary principle in matters of environmental policy on the grounds that scientific evidence for it is insufficient, calling instead for empirical knowledge, supposed to be an index of what is and what is not scientific8. Has globalization reshaped the image of science in academia, making us wistful once again and inviting us to find peace of mind in a belated version of science reminiscent of the nineteenth century, when science was largely considered to rely on empirical observation, whatever that might mean? Empiricism and dogmatic thinking are birds of a feather. More open intellectual attitudes are necessary to face the risks globalization imposes on our environment. Doubt, in particular, may be protective in this respect; without it, scientific thinking can be stultified. Science cannot be independent of the general interest and social respect, and it requires critical detachment to shelter us from the systemic dangers inherent in its objects of inquiry and in the applicability of its fundamental findings. In scientific knowledge as well, the culture wars loom large, though they tend to be overlooked. These wars may lead both ways: to cultural changes that will crush social hope, and to cultural changes that will uplift a sense of community and cooperation.

The Secularization of Value

The values of science, therefore, should be secularized, and scientists should avoid generating systems which hold within them dangers that might express their potential for destruction. The French philosopher and Stanford scholar Jean-Pierre Dupuy has pointed out that the atomic bombing of Japan was the result of systemic danger, in an amazing remark: “Why was the bomb ever used? Because it existed, quite simply”9. The implication is that science too, and what was at one point presented as an advance of the civilized mind, may lead to pragmatic consequences that reshape thinking and emasculate it, if we return to the Derrida proposition that the mystagogues are able to emasculate rationality (let us pardon Derrida’s male chauvinism if we can). Human thinking involves systemic dangers, and one therefore has to rethink thinking in different terms, which has been the task of modern philosophy. Perhaps we might suggest at this point that cultural change involves thinking rationality in secularized terms. This means that technology may well lead us astray, tethered as it is to a scientific knowledge we tend to view as total, whereas any inquiry into the results of science tends to demonstrate that science is provisional, that its propositions will sooner or later be refined or redefined, and that intellectual inquiry, whatever its field, rarely comes to conclusions that will never be reworded or revised. Knowledge is an ongoing process, and if we keep this in mind, we secularize science instead of projecting it onto the higher plane of superior frozen truths. Science, like any other human adventure, unfolds through time, and taking this into consideration helps science respond to social needs.

Political scientists are struggling for secular views, as John Rawls has amply demonstrated. Behind his praise of democracy as both a condition and an effect of economic and political liberalism, one finds an attempt to define the nature of rationality as the mainspring of social hope. It is striking, when reading Rawls, to realize the extent to which rationality is assessed in conjunction with its effects upon social organization, which yields workable political conceptions of justice. In his second major opus, Political Liberalism, Rawls defines political rationality as outcome-centered, and this leads to a list of primary goods, which reads as follows:

  • basic rights and liberties […];

  • freedom of movement and free choice of occupation against a background of diverse opportunities;

  • powers and prerogatives of offices and positions of responsibility in the political and economic institutions of the basic structure;

  • income and wealth;

  • and finally, the social bases of self-respect.10

Rawls’ agenda relies on the traditions of the common-sense philosophy of the English-speaking world and on the theoretical culture of pragmatism, which he found ready for use in his New England intellectual environment. Nowhere do we find perspectives disconnected from and independent of day-to-day preoccupations. Rawls wants to harness human development to democracy, to wring democracy out of economic growth, while there is an increasing belief, in this century, that our globalized economies hold the promise of democracy only as an expectation that will always be contradicted by fact. Just recently, in a major contribution to the debate, the Slovenian philosopher Slavoj Zizek pointed out that China allies a vicious use of the bludgeon in Tibet with the logic of the European stock market, and that this betrays the belief that democracy is an obstacle to economic growth. Zizek’s assumption is that our global culture might thus be brought to understand that democracy is no longer needed to back human development, which might lead global cultural change in the wrong direction11. Democracy has to be maintained as a horizon of belief, and as the sole teleology worthy of respect. Rawls helps us understand that teleology should be one version of practicality, though we tend to think that any political teleology is an empty promise. His contribution to political philosophy views rationality not as a belated version of theology, but as a tool that may help deliver collective results, following in the footsteps of American intellectual traditions which assess values in terms of their pragmatic consequences rather than in terms of otherworldly conceptual exploration.

What if, beyond this sound conception of political values and the organic laws that frame them, human culture were unresponsive, thus precluding cultural change and sustainable development? It is this situation that Samuel Huntington examines, leaving little room for hope, suggesting that cultures cannot change, or will change only slowly and with difficulty, on the grounds that society will not change and that there is no connection between assumptions, beliefs, and the economic and political opportunities that the modern liberal state offers if we are willing to grasp them. Huntington’s dream is to get rid of cultural obstacles to economic development, while it remains unclear whether there is any strong belief in the virtues of democracy in what he has to say. His answer does not set out to demonstrate that democracy has to be left out of his global picture; in his view, if progress is not fast enough, it is because those cultures which resist progress as seen from Massachusetts are obstacles to be removed, but Huntington is no clear analyst of how culture and democracy might hinge together. “We define culture,” Huntington writes, “in purely subjective terms as the values, attitudes, beliefs, orientations, and underlying assumptions prevalent among people in a society”12. His vision of culture leaves one notion unmentioned: what about solidarity, the cornerstone of Richard Rorty’s vision of social hope? It may well be that this is one value the modern liberal state has eroded, and that solidarity is a basic asset to the communities forming the less developed countries of Africa, Latin America, and parts of the Asian world, where welfare is weak and institutionalized education poorly developed, and where, for political reasons, states are not ready to reach out to populations and areas left to their own resources and inventiveness in terms of welfare. Huntington’s discourse, as a result, is a perfect illustration of the New Conservatism that Habermas has targeted. Modernity, in Huntington’s world-view, is seen as totally dependent on itself. Beliefs, in particular, are taken to task in his definition of culture. What if beliefs were an adequate instrument of the progress Huntington has in mind, a notion empty enough, and one he parades to conceal his conservative views? Inherited ideas and attitudes are more of a survival kit than an obstacle to social cohesiveness. One hardly knows, when reading Huntington, whether progress, the norm of his perspective, is one serious academic case of mystagogic thinking, or whether it may have practical applicability. It is arguable that progress, with Samuel Huntington, is an abstract notion.

Asian culture turns out to be an epistemological obstacle for many political scientists. Once considered incapable of generating economic growth, Asian values are now seen as an asset in the ongoing economic race, with growth rates in some quarters of the Asian world that dwarf Europe and the United States alike. Can one blame yesterday’s economic stagnation on these values, and now say that some basic values of Asian cultures are the lever of change helping the so-called miracle economies make headway? There may well be an emphasis on hard work in Chinese culture, but one cannot see how this is specifically Chinese, rather than American or British. Lucian Pye, a prominent M.I.T. scholar in Chinese studies, has suggested that Taoism and the belief in good fortune, supposed to be specific to Chinese culture (although I am aware this might be challenged), have produced an outgoing, dynamic character in the Chinese people, which makes them ready to grasp any opportunity likely to turn to their advantage. Pye’s view of Chinese culture may easily be taken to task, as he implies that Chinese culture leaves no room for introspection. This is most probably a typical misconception of the kind New England protestant culture likes to bring home. Pye, in particular, writes the following when considering the reasons for China’s rapid expansion: “This stress of the role of fortune makes for an outward-looking and highly reality-oriented approach to life, not an introspective one”13. This is, we guess, an academic version of the prejudice insisting that the Chinese have no soul and no interest in an inner life. Economists, on the other hand, go for a more mundane vision of China’s development, insisting on its capacity to attract foreign investors14. This is also quite true of many other rising Asian economies besides China.

These observations, however, lead us to want to extend our definition of culture. Culture is not simply a cluster of beliefs and attitudes standing outside the realm of economic and political development; it is probably much more than beliefs and attitudes. It encompasses what we might call material culture, in the sense that attitudes matter in economic development, which is no big news if we refer to Max Weber’s understanding of the ethic of capitalism, shaped as it is by the sense of insecurity that goes with the necessity of devising for oneself advancement in this world, the better to advance in the next one, or in the higher, more sophisticated one of the rich oriental spiritual heritage. No wonder, then, that Derrida should suggest that a connection is to be established between rationality and mystery. In Derrida’s view of how rationality and mystery interact, one finds an abiding agreement occurring, and this is of course desirable for establishing peace in what he calls culture, which to him is more of a socially encompassing substance than a mere individual determinant of behaviour.

Lucian Pye is interesting as an analyst of Chinese social development, not for whatever certainties he may have in store for us, but for the scepticism his propositions will cause in most areas of the academic world, and across disciplines. Examining the reasons for China’s economic advance, he writes that “[...] the driving force in Chinese capitalism has always been to find out who needs what and to satisfy that market need”15. One might meditate for quite a while on whether markets are simply out there for anyone to grab, or whether one should shape markets, create needs, and respond to one’s ambition to grow by being inventive. Nevertheless, Pye views the Chinese economy as a simplistic answer to world needs and the capacity to adapt to them, whereas the West is seen as technology-driven and culturally more sophisticated: “Western firms seek to improve their products, strengthen their organizational structures, and work hard to achieve name recognition”16. One wonders whether Chinese firms have not always tried to do precisely this, something that can only be generalized with a vast, highly educated workforce, which China is trying to build through adequate investment in higher education. This path is promising, from what we can judge when considering our Chinese students in European institutions of higher learning.

Cultural Change and Universities

If, therefore, cultures change, not just private cultures but also public ones, as we increasingly suspect cultures to be collective assets, university education has a major role to play in the process. We, as academics, whether experienced or aspiring, must address the issue of what a university education ought to be like. So far in this discussion, we have acknowledged that academics should avoid voicing social prejudice, and this has not always been accomplished, to say the least. Jacques Derrida has meditated extensively on this, with a view to promoting the role education might play in defending the values of democracy, no doubt because Derrida’s understanding of the effects of academic training is combined with the idea of a political education for youth. This is easily understood when one looks at the moral paralysis of the German university system and its many graduates who embraced Nazism and provided the Nazi regime with its most destructive propagandists and functionaries. Habermas, however, is clear on this point: German universities cannot be blamed for what befell. He points out that the number of students in Germany was halved under Nazism, dropping from 121,000 in 1933 to below 60,000 right before the Second World War17. One reason for this moral paralysis, although Derrida is not explicit on the point, is that universities tend to over-specialize knowledge, which has caused the decline of humanistic study. Habermas offers similar views, though cast in a more sociological mould. To Derrida, higher education should be critical of whatever rationality wants to assess. He calls this “the university without conditions”, which for him involves an ambitious agenda thus defined: “the primal right to say anything, be it in the name of fiction and of knowledge as experiment, and the right to speak publicly, and to publish this”18. Habermas offers a more precise version of what ought to be done, and of what has been insufficiently accomplished so far: integrating humanistic study and technical expertise to curb the specialization of knowledge19.

This may sound vague, and we wonder where it might lead, for one doubts whether knowledge, in its various disciplines, can efficiently refrain from becoming specialized. This is why Derrida comes up with more practical propositions as to the contents and orientations of higher education in the book he published in 2001, L’Université sans condition. There are seven such propositions, all having to do with what one might call the architecture of knowledge, all answering the need to redefine humanistic study, which should come alongside more specialized training, whether in established scholarly disciplines or in the training of students for professions outside the academic world. The new humanities should, according to Derrida, deal with what he calls “the history of man”, which calls on us to devote more attention than has so far been devoted to human rights, be they for men or women. To him, these rights are “legal performatives”20, which sounds otherworldly owing to the weight of abstraction in the phrase. Basically, however, this might mean that these rights are to be upheld because they can be applied to the various fields of human activity. Furthermore, we must bear in mind that these so-called “legal performatives” are performatives because they hold within them an applicability that may be constantly expanded, in practical terms, to various areas of cultural practice, among them, of course, science and business, two areas of higher education that are growing to meet the social needs of human development.

The idea of democracy comes second in Derrida’s architecture of the new humanities. It comes second for reasons of clarity in the presentation of the programme he has in mind; yet the idea of democracy is no afterthought, because it runs, let us be reminded, through the whole of his oeuvre as a philosopher. Let us note that democracy, as far as Derrida has anything to say about it, is not tethered to nationhood. Nationhood is dangerous, and one may easily understand this in the light of European history, and of Asian history too. From this we can infer that cultural change in the future should not rely on national traditions, and that, in this respect, globalization offers opportunities for positive cross-fertilization. Derrida’s meditation here hinges on the concept of sovereignty. While sovereignty is a desirable goal for each and every one of us, the idea is viewed as misleading, as it has often been a concept without practical consequences, though we may still hope that sovereignty will remain a horizon of belief for individuals, and a value that will guide collective decisions. Yet if Derrida invites us to abide by this concept, he also believes that any collective formalization of the idea of sovereignty should avoid reliance on the nation-state, which may too easily lead to a betrayal of individual dignity.

Derrida then focuses on the necessity of recuperating the authority of teaching, and of literature; his proposals here are not easily understood. One suspects, reading them, that teaching as well as literature have to do with amity, a concept that emerges from Derrida’s body of work. This is not a norm; it is not prescriptive; nor can it be strictly defined as a doctrine or a set of mandatory rules. We gather it is to be understood as an opening to otherness on the part of the teacher, and a celebration of respect for the other person, which involves inventiveness and the by-passing of any sort of regulation that defines the other person in some way that might lead to a position of authority of a colonial or exploitative nature. It is certainly an attitude of respect, one that elbows aside the very notion of authority, “routs it”, as Derrida says21. The university, therefore, should constitute an idea that transcends any specialized discourse on the technicalities of education; it consists in letting the other reach out toward his or her potential for self-development. The institutional strength of higher education springs, in Derrida’s view, from the interaction between the person who teaches and the person being taught, allowing the latter to live his or her aspirations to the full. Derrida’s ideal is so elevated that it transcends any definition one might come up with. It is certainly a call to confront the normative nature of higher education in order to recuperate a lost sense of human warmth, one eliminated by the technocratic complexities of institutions seeking intellectual identity in the measurement of student skills and in students’ willingness to comply with such measurement. One also cannot rule out that a backlash has been under way in higher education itself, owing to the rising number of first-generation graduates from the less educated groups of our national cultures. From the sixties onwards this has been more of an opportunity for universities to fulfil their cultural mission than a serious obstacle to the growth of higher education, and one can argue that Derrida was balking at the pessimistic discourse one hears in most academic circles today, ill-grounded as it is in the relative accessibility of higher education.

The challenges that higher education has to face, in the context of an ever-increasing cross-fertilization of cultures, point to one underlying question that surfaces from an examination of current economic and social trends. Is what we call culture tethered to social and economic factors? The question is by no means new; it was handed down to us by the industrial revolutions of the nineteenth century and by Marxist theory. We now tend to believe that culture is a mode of collective representation that one may disengage from submission to social and economic facts. On this point, the French sociologist Emile Durkheim referred to real structures, which he saw as disconnected from institutions or working facts22. Much thought must still be devoted to whether the autonomy of culture as collective representation is radical or merely relative with respect to economic factors. We are also hard pressed to determine whether, in this framework of analysis, such autonomy is or is not hampered by the necessities of those real structures and by the institutions that shape them, and even, perhaps, discreetly justify them. Hence Stiglitz’s view that one must respond to a democratic deficit, and Derrida’s view that one must face the serious issue of a democratic deficit in higher education. The question is not benign, and it calls forth an autonomy of the mind to bend social realities and economic factors to purposes that do not derive from them.
