25 on 25

Duke experts imagine the shape of scholarship and the state of our world over the next quarter-century

Since its launch twenty-five years ago, Duke Magazine has identified itself with the values, aspirations, and intellectual intensity of a great university. Despite its stature and achievements, Duke is constantly asking hard questions of itself, constantly reinventing itself, constantly anticipating shifts in what we learn and how we learn. In that forward-looking spirit, we asked twenty-five Duke leaders and scholars to project ahead twenty-five years and imagine what life will be like in the year 2034.

All of these essayists have appeared in our pages, some of them with regularity, as profile subjects, expert voices, or newsmakers. Some emphasize a satisfying continuity in the areas in which they work. Others embrace the excitement of dramatic change. Their responses tell us a lot about the trajectory of fields ranging from literature to neuroscience. They also tell us, of course, about Duke's ever-increasing intellectual reach.


 

Future of the University

The More Things Change

By Richard H. Brodhead, president

If you come to Duke in 2009 not having visited since 1984, you'll find it profoundly changed and profoundly the same. This paradox, a sign of health in a great university, arises from two facts. As places of discovery that attract the most creative minds from this country and around the world, universities are in touch with new developments and are natural homes of innovation. But our fundamental work is largely constant: promoting curiosity and inquiry, bringing intelligence to bear on the world's complex problems, training people for positions of responsibility, and challenging students to discover and live up to their full promise.

But change and permanence don't just coexist on campus. At best, change supplies us with new ways to perform our constant mission. Twenty-five years ago, the personal computer was a fantasy and the Internet a mystery known to a few research scientists. Since then, information technology has altered virtually every aspect of daily life, creating, among other things, new tools for analysis and diagnosis in medical research, new means for retrieving knowledge of the past, and new ways for faculty members and students to be in closer, more continuous communication.

The other major change in American research universities in the last twenty-five years, the extension of global reach, has had a similar effect. Duke once drew students principally from North Carolina, then from the region, then from the nation, and now from around the world. This new fact, too, has created new ways to perform our classic function. A key benefit universities afford is the chance for talented people to meet, to stimulate and learn from one another, and to discover how to accomplish things together, at work and at play. Nowadays, and into the future, a Dukie will inevitably learn about and learn to collaborate with people from around the world—daily practice for a life in a global society.

What will the next twenty-five years bring? In spring 2009, the economic crisis looks to be the main thing that will shape the future university. But Duke, which was created on the eve of the Great Depression, has outlasted many a business cycle, and other changes may prove far more enduring.

I expect the next decades to be a time of profound transformation in higher education. The historical model was based on separation and specialization; the new model will be about creating connections and interactions among a wide variety of separate domains. In a world where all important problems have plural dimensions, faculty members will be less content to inhabit islands of specialized expertise but will reach across old boundaries to form new communities of inquiry; students won't learn to narrow their curiosity but to access and synthesize knowledge across multiple domains.

In place of the classic separation of "the academic" from the real world, academic study and real-world experience will become more tightly connected, each enriching and building on the other. On campus, curricular and what is now called extracurricular activity will be understood as complementary ways to teach fundamental skills of self-discipline, initiative, teamwork, and leadership. And people who "go to Duke" will form ever-denser connections with people away from campus, in this city and region, and around the world.

Because the future will require different things from higher education, the Duke of 2034 will be profoundly different from the Duke of today. But behind these innovations, our core mission will be recognizably the same. This will still be a place dedicated to expanding the reach of human understanding. This will still be a place dedicated to the flowering of individual human potential. And this will still be a place where the lived interactions of a face-to-face community form the heart of teaching and learning.

Brodhead, who holds a Ph.D. in English from Yale University, where he was a professor and Dean of Yale College, was inaugurated the ninth president of Duke in 2004.

 

Information Technology

Instant Access to the Cloud

By John Board, associate professor of electrical and computer engineering and computer science

Twenty-five years ago, one would have been hard-pressed to predict that the "Internet," a small confederation of computers found mostly in computer-science departments, would evolve into an indispensable tool of education (and everything else). Only a true early-1980s visionary could have seen the original "brick phones"—the first "portable" cell phones, marvels of their day—and imagined our GPS-enabled, media-streaming, IM- and e-mail-flooded, camera-equipped, tilt-sensing smart phones. Projecting ahead to 2034, how will information technology affect how students learn at Future Duke?

First, don't panic. The future of computing and information technology will not involve bulky cyborg explants. We will expect continuous active and passive access to the world's collective and constantly updated "cloud" of information—the totality of human knowledge at a given instant. We will indeed use a variety of devices to access and contribute to the cloud, but these devices will be discreet. Many will be ubiquitous in the world around us; others we will carry, wear, or have implanted. There will still be a role for keyboards and other specialized devices to aid in the production of large amounts of text, but most interactions with the information cloud will be through alternate, more natural means that will make it unnecessary for us to go to a "computer." (Who knows, maybe voice recognition will even work by then!)

Students in 2034 will have grown up in the cloud, and they will hardly think twice about the "T" in IT. A century ago, electric motors made possible a host of new inventions, but each motor asserted itself on the user, requiring frequent maintenance and attention. Eventually, engineers learned to design maintenance-free motors that simply "disappeared" into the devices in which they play a vital role. Old-timers in 2034 recounting tales of downloading firmware patches and updating device drivers will draw the same looks of incredulity that great-grandpa gets today when he describes checking the commutator brushes for sparks.

Nearly every manufactured item, and many natural ones, will be tagged with sensors and enough networking smarts to relay, at a minimum, the existence, location, and overall state of the item to the cloud. It may very well have a camera and microphone, to capture the state of its local environment, as well. This network will be largely self-organizing and self-maintaining. The cloud and the devices we use to access it will augment reality: If we choose, our earpiece will be able to whisper the name and major of the casual acquaintance passing us in the quad; information about the dietary value of the meal we're assembling in the Great Hall will project onto our tray. During a lecture, supplemental information about a term we are unfamiliar with, or alternate solution techniques for a problem under discussion, may discreetly appear in the margins of our notes.

Today, social-networking sites allow us and our "friends" to capture, preserve, and carefully index selected still photos of us at our best and not-so-best (whether we want them to or not). User-generated, casually recorded video is in its infancy today. Twenty-five years from now, the output of hundreds of millions of fixed and mobile video cameras combined with automated video indexing and archiving will fundamentally redefine privacy in ways that could make 1984's Big Brother seem to be an amateur. Students in 2034 will be among the first to have grown up with carefully indexed multimedia records that have captured nearly all moments of their lives—flattering and otherwise. Faculty members and students alike will be struggling with what this means for the future of humanity.

"Telepresence" technologies will permit compelling "in-person" interactions among groups of people, regardless of geography. This will doubly challenge the residential university model: Why should I have to move to Durham to take Duke's classes? Why shouldn't I just assemble a smorgasbord of the best courses from all over the planet? Duke will accommodate aspects of both objections with aggressive remote-presence programs of our own, but our administrators will continue to wax eloquent on the value of four years spent on our idyllic campus.

Many on the present-day faculty bemoan reliance on simple Web searches and "amateur" productions like Wikipedia for serious scholarship, but by 2034, the tools of editorial judgment and peer review will have been applied to the complete corpus of traditional academic output. Imagine a Wikipedia successor in which article summaries expand (when desired) to monograph length, with exhaustive annotated bibliographies. All the discipline-specific measurements of the quality of an article (reviews, relevance scores, citation counts) will be carried as part of its metadata.

Accessing every opinion ever written (or spoken or filmed) about the causes of the French Revolution will not be the challenge facing students in a freshman history course; the job of the academy will be, even more than today, to teach students how to home in on high-quality information. Though libraries as repositories of printed information will be largely obsolete, librarians will be more important than ever to index, archive, and teach navigation in the cloud. Strap on your sensors, and don't forget, the cloud remembers all!

Board B.S.E. '82, M.S. '82, who also serves as associate chief information officer for the Office of Information Technology, has taught at Duke since 1987.

 

American English

ne1's Guess

By Julie Tetel Andresen, associate professor of English

No group of adults has ever said, "We sure do like the way the kids are talking these days." But the fact is, the way the kids are talking now will, in large part, determine how English will be spoken in twenty-five years. Urban teens in Baltimore are reported to be using the construction Peep yo to mean Look at this/him. I mention this mostly because it delights me, but also to make the point that innovations occur in specific places among specific groups of speakers.

If you are not experiencing delight with the Baltimore innovation, but rather distaste or puzzlement, let me remind you that speakers in the seventeenth century would have experienced similar feelings upon hearing The house is being built (the "proper" verb form at the time being The house is a-building), while a sentence such as We are being reasonable would have been downright strange and probably unthinkable. (The construction present progressive of "to be" + adjective was an innovation of the twentieth century.)

Will the construction Peep yo become a form of Standard English twenty-five years from now? Probably not. For one thing, the expression is used only in a specific, small area. For another, yo is a pronoun, and new pronouns rarely enter a language. But stranger things have happened.

The word-formation process du jour in American English is blending, that is, combining two existing words to make a new word. A couple of blends formed in the Middle English period (1150 to 1500) have survived into Modern English, e.g., scrawl (sprout + crawl) and quaver (quake + waver), as well as a couple from Early Modern English (1500 to 1800), dumbfound (dumb + confound) and apathetic (apathy + pathetic).

The first half of the twentieth century gave us smog, motel, and brunch. But the fun really starts in the last few decades. We now have prequel, dramedy, infotainment, and blog, along with rockumentary, shockumentary, mockumentary, and while we're at it, mockbuster. We have clothes (unitard, tankini), food (frappuccino), flavors (strawbermelon), pets (labradoodle), and even people (Brangelina). Blends occur in all parts of speech: ginormous, geektastic, negatory.

Last year, I was driving along the highway, channel surfing, and I caught a DJ saying, "We're gonna party in the Caribbean and turn Jamaica into Boozachusetts." My first thought was, "Ahhh, we're in the Golden Age of Blends." My second thought was of the blend that uses another part of the Bay State, namely the term Masshole, which is how one person from Massachusetts greets another or is an insult from a New Yorker (so I have been told).

Blends are everywhere. My YMCA not only has a gymnasium, it also has a fitnasium. Q: What do you call the relationship between two straight men who really like each other? A: a bromance, of course. And as a sign of our fusion times, I have just heard of the new religion Chrislam.

Two things are certain: 1) Some of these blends will become standardized and live on (I vote for bridezilla); and 2) in twenty-five years, some blends, particularly those of companies or brand names, will sound dated. Microsoft may become known as MS, just as GE has replaced General Electric in common parlance.

Here we come to acronyms, the other robust word-formation process that mushroomed in the twentieth century, either pronounced as words, amphetamine (alpha methyl phenyl ethyl amine), or pronounced as a sequence of letters: VCR, MSG, IUD, and BLT. Some may be pronounced both ways, as in ROTC. A slew has, indeed, come out of army talk: AWOL, RADAR, SNAFU, MIA, and GP (pronounced jeep).

The 1960s gave us LSD, JFK, LBJ (think: the "Initials" song from Hair), which has led to the slimmed-down W and now to O. Reinforced by text-messaging, the kids are talking in abbreves, just as earlier generations began to speak punctuation: He's a quote-unquote body builder slash trainer. If you don't know what an abbreve is, ask someone under the age of thirty.

So here is my prediction for a possible utterance twenty-five years from now: "Peep this. Some dude a long time ago thought the word bridezilla would last. Hah! Total fail. But I like it." Pause. "JK."

Tetel Andresen is the author of Linguistics in America, 1769-1924: A Critical History, and has recently completed a new book, Linguistics Reimagined: Language Study for the 21st Century. She has also written eighteen works of historical fiction.

 

Transportation

Forward Motion

By Henry Petroski, Aleksandar S. Vesic Professor of civil engineering and history

As much as urban planners might wish that the personal automobile would disappear from America's streets and highways, that is not likely to happen in the foreseeable future. But, if automotive research and development goes according to technical, financial, and political plan, very few vehicles powered by internal-combustion engines burning fossil fuels will remain in use twenty-five years from now. Hybrid, all-electric, and fuel-cell vehicles will dominate the highways in 2034. All of this will have been made possible because the infrastructure necessary to support these vehicles will have been built. Gas stations will have been converted to dispense a variety of nonfossil fuels—including hydrogen—and parking meters will have been equipped to serve as battery-recharging hitching posts. The few internal-combustion vehicles that do remain on the road will be burning biofuels or compressed natural gas.

This is the good news, but it does not come without some bad. Already, the decline in fossil-fuel use has resulted in a decline in tax revenue intended to fund highway maintenance projects. So far, alternative fuels have been largely subsidized, but creative new means of green taxation will likely accompany the rise in use of clean energy sources in the coming decades. By 2034, older bridges will have been replaced, many with structures that use form and color to add an artistic flair to their engineering. There will be so many notable pieces of highly visible infrastructure that they will have made our constructed landscape into one grand outdoor sculpture garden.

Because automobiles and trucks will be generally smaller and lighter, roads and bridges will deteriorate at a much slower rate. Most interstate highways will have been retrofitted with so-called smart pavements, which not only will heal themselves of cracks and potholes but also will interact with smart vehicles to keep them automatically in their lanes and at safe distances from each other. Driver intervention will be necessary only when changing lanes and entering or exiting the highway. Bridges, tunnels, and other critical structures will also be smart, giving warning of any impending failure.

There will be free ultra-high-speed wireless Internet access available from any car on any road anywhere in the country. We can speculate about the scenery through which those cars will be driving: Farmland along our highways will be interspersed with clusters of wind turbines hundreds of feet tall, situated to take advantage of prevailing winds and to feed the power generated directly into the national grid. Some farmland and much desert will be given over to vast fields of solar-power plants. The roofs of energy-self-sufficient buildings will be fitted with small-scale wind turbines and solar panels. Many of the vehicles on the road will also have solar panels built into their roofs.

Since so many vehicles will be powered by electricity, city streets will be pollution-free, and concerns over global warming will have been allayed. There will be no smog, and people will breathe freely. Busy streets will be as quiet as a country lane because cars and trucks will emit no noise above a barely audible hum. Since this means that blind people may not hear vehicles approaching intersections, vehicles will be equipped with warning devices that will emit constantly changing soft electronic tones that indicate traffic conditions in the street.

Infrastructure comprises more than roads, bridges, and intersections, of course. The American Society of Civil Engineers occasionally issues report cards on some fifteen categories of the built environment—ranging from airports to wastewater-treatment plants—that it grades on the basis of condition, performance, capacity, and other criteria. On the latest report card, nothing received a grade higher than C+, and the nation's overall infrastructure grade-point average was equivalent to a D, which was defined as "poor."

Reviving such concepts as a Civilian Conservation Corps and a Works Progress Administration could go a long way toward reclaiming, improving, and supplementing aging roads, pipelines, and other essential networks with bold new projects. But our future physical infrastructure will earn respectable grades only if efforts to rebuild the existing one are based upon a financial system sound enough to keep funding and maintaining it properly. Ultimately, then, what physical America looks like in 2034 will depend as much on financial conditions, public policy, politics, and political will as it will on engineering and technology.

Petroski has written fourteen books on engineering and design, including The Toothpick: Technology and Culture, published in 2007.

 

Economics

Rationality Lost

By Dan Ariely, James B. Duke Professor of behavioral economics at the Fuqua School of Business

It is hard to remember how politicians used to go about legislating policies. Let's take a quick trip down memory lane and examine public policy as it was twenty-five years ago, before the era of experimental policy we live in now, an approach that values and nurtures social experimentation as a way to design policies.

As you may recall, in 2008-09, a global financial crisis was brought on by lack of transparency, conflicts of interest, terrible bank-lending policies, complex financial instruments, and unenforced government regulations. Above all, an almost religious belief that humans and the market are perfectly rational was a major contributor to this historic catastrophe and the five-year economic recession that followed.

In retrospect, it sounds ridiculous, but before the 2008 crisis, rational economics—now largely restricted to university courses taught by a few academics—was the only guiding light that politicians used when designing taxation strategies, policies, and institutions. When the economic tragedy of 2008 hit, it illustrated beyond any doubt that relying on the assumption of perfect rationality is dangerous; this realization, in turn, ushered in a new era for behavioral economics.

After the initial shock, this painful and expensive crisis taught businesses and policymakers to recognize three main lessons: 1) Human beings have many irrational tendencies, fallibilities, and quirks; 2) we often have bad intuition and a limited understanding of our irrational tendencies; and 3) if we want to create effective policies, we shouldn't rely on our intuition to find recommendations, nor on the assumption that people behave rationally. Instead, we should ground our recommendations in how people actually behave.

With these three lessons in mind, the business and policy landscape changed dramatically over the next twenty-five years. As expected, businesses led the charge and questioned their basic assumptions about the relationship between salary and productivity, the value of meetings, and the problems with conflicts of interest. After spending a few years watching in awe as business productivity improved, government policymakers followed suit by implementing experiments with the Education-Forward Initiative (formerly No Child Left Behind). These experiments produced several striking findings. They showed that basing teachers' salaries on student performance had minor short-term benefits and caused substantial long-term damage to teacher and student motivation, that creating interest in education was more important than grades, and that shifting the curriculum focus from calculus to statistics and probability had a wonderful impact on students.

Between 2019 and 2025, experimental policy, which became a more established field, examined the simplification of the income tax (with huge social and tax benefits) and studied incentives that encouraged people to get routine medical exams (with tremendous long-term benefits). Currently up for debate are the benefits and disadvantages of socialized health care.

In 2009, this might have devolved into an ideological debate about right and wrong. But given our realization of how little we truly understand ourselves and the systems we design, combined with an appreciation for the benefits of detailed experimental investigations and empiricism, we are trying to solve this important issue by setting up multiple experimental programs. In some of these, the levels of copay are low, in some medium, and in some high. Some of these programs focus on specialists, while others focus on the family doctor. And some focus on treating patients; others, on preventive care and education.

Rhode Island, whose size and demography make it a good laboratory, currently runs many of these experiments, and in the next few years, we will have a much better idea of the costs and benefits, both short-term and long-term, associated with all of these approaches. Based on these, we will pick the best policy to adopt. In the end, thanks to experimental policy, our decisions will be based on solid empirical testing and data, not on a misconception of rationality.

In light of these improvements, maybe the financial meltdown of 2008 was really a blessing in disguise: It forced us to think more carefully about the assumptions we were making across the board and helped us discard the theoretical assumption of perfect rationality and replace it with empirical data.

Ariely's first book, Predictably Irrational: The Hidden Forces That Shape Our Decisions, was due out in a revised edition in May. Ariely Ph.D. '98 is also a visiting professor at MIT's Media Lab.

 

Women's Studies

At Home in the Academy

By Jean O'Barr, professor emerita of the practice of women's studies

The early advocates for women's studies pursued two goals simultaneously: They revolutionized research by finding new topics, with new methods, using new theories, and they launched courses and programs to transmit these findings, doing it all with an interdisciplinary perspective.

In general, the large public universities fostered feminist scholarship and established women's studies programs first. Many regional universities and liberal-arts colleges set up programs soon thereafter. The single-sex schools, male and female, came later to women's studies. As women's studies became a recognized discipline on campus after campus, with majors, tenured faculty members, and degree programs, a shift occurred. Over the past four decades, women's studies began to look like other disciplines, hunkering down into a familiar academic pattern: specialized language, a focus on selected topics, and increasing separation from other disciplines. Its evolution is neither surprising nor alarming.

Some future foci are clear. Women's studies will continue to engage more and more global material. It will continue to analyze women in all their multiple dimensions. It will continue to ask questions about institutional patterns that negate women's presence, especially when the practice of placing a few women in prominent positions appears to belie the invisibility of most women.

Other future directions are less clear and may depend on at least three factors. One is the campus context: Each women's studies program nationally or internationally grows out of its own context. A second is the shifting nature of higher education, increasingly corporate and preprofessional. A third comes from the internal dynamics of the field itself and the second and third generation of scholars who will lead it. There are likely to be as many answers as there are programs, campuses, and scholars as we move farther into the twenty-first century.

One distinctive characteristic of women's studies as a field has been how campus-specific each program is. Every program, its faculty and curriculum, grew out of the situation of women on that campus—and each continues to be shaped by it. Thus, at a major research university like the University of Michigan, with its long history of social-science research and system of interdisciplinary funding, women's studies has an outstanding, multidimensional global-activism project. Documenting the activities of leaders of the women's movement around the world enables scholars to compare the first wave of feminism in the U.S. and Europe with later waves around the globe. Such diversity, while healthy, makes generalizations across programs difficult.

Contemporary higher education is under financial stress, stress that leads to larger classes and more teaching and service by adjunct faculty members. Elite institutions rely on international star faculty members who negotiate time away from the classroom. Under these and similar conditions, how do undergraduate students get the classroom time and space to develop the self-knowledge and reflective capacity that have been the hallmark of women's studies? What about the gendered dynamics of all classrooms?

Some universities, like Cal Poly or Virginia Tech, continue to address these core concerns by sponsoring projects on women in science, mathematics, and engineering. Using the insights of humanities and social-science research to investigate the status of women in the natural sciences, the projects they sponsor demonstrate the continued need to investigate women's lives throughout the life cycle and across boundaries of race, ethnicity, sexuality, and class.

Disciplinary specialization is a hallmark of maturity in a field—and simultaneously a challenge for a field's future. As a field becomes refined, does it—should it—stay in conversation with other fields? How does it interpret its own history? What priorities are given to theoretical development, teaching effectiveness, campus presence? The facile answer is that a department can do all three. The reality is more complex: On some campuses, this conversation about the relationships among foundations, evolution, and current priorities is vibrant; on others, it is marginalized; on still others, ignored.

Nationally, women's studies began in 1969. A forty-year-old discipline is a young discipline. It will take another twenty-five years for women's studies to make meaning of its life experiences and achieve the status of a wise elder in the academy.

O'Barr was the founding director of women's studies at Duke and the Margaret Taylor Smith Director of women's studies from 1997 to 2001. She was dean of continuing education from 1971 to 1983. On the Duke faculty since 1969, she planned to retire in May.

 

Black Popular Culture

Appropriation and Reinvention

By Mark Anthony Neal, professor of black popular culture in the Department of African & African American Studies

The election of Barack Hussein Obama speaks volumes about the extent to which America has become a decidedly multicultural nation. But does it, as some pundits surmise, signal the end of a distinctive black popular culture? Certainly, Obama's success highlights what many scholars of black culture have known for some time: African-American culture is intimately related to what we scholars might define as mainstream Americana, neatly packaged with a Motown-era backbeat and the sass drawn from black women's vernacular.

What many folks see as reflecting black popular culture is a commercialized, sanitized, airbrushed version. This year is the thirtieth anniversary of the miniseries Roots. Think about the startling nature of that show when it first aired: slavery, lynchings, Jim Crow—the underbelly of American race relations—on prime time. I read recently there's going to be a collector's edition DVD with commentary and other extras. I think of it as a user-friendly introduction to black life in America.

So you have things coming out of the everyday experiences of black people that gain a larger visibility because of an artist, because of a TV show, and then the ball starts rolling: It becomes part of somebody's marketing plan, divorcing it from its origins. No one understands this better than Obama.

Whatever I feel about the substance of the man, politically or otherwise, part of his success was achieved by presenting him as a brand that resonated with different audiences at one and the same time. His use of Motown music, for example, was brilliant in that regard.

There is something very superficial at play, not necessarily the man, but what people read in the man. I have been asked, "Is he seen as 'less black' than other African-American candidates?" I wouldn't say "less black"; I would say less threatening—someone those in mainstream white America could imagine as their neighbor.

The sudden visibility of the first black First Family also has put a spotlight on some of the mundane, everyday practices within black communities that have not reached the mainstream or that have been appropriated but misunderstood. A good example is the media sensation caused by the "fist pound" (although neophytes in the media called it a "bump") between Obama and his wife, Michelle, after the close of the primary season.

I was among the dozens of scholars of black culture who received calls from the media in search of the cultural genealogy of what some whites interpreted as a threatening gesture but which, in the black community, was simply an intimate public moment between a husband and wife. As I told many reporters, it is something I do daily with my own daughters, an act of affection. The campaign season was full of such questions; the Obama presidency is akin, for some, to having the first black family move into an all-white neighborhood.

All of this bodes well for those of us who make meaning in both the mundane and the exceptional in African-American life and culture. There's little doubt in my mind that Obama's presidency—and its long-term influence and implications—will usher in an exciting period in the study of black popular culture. It also promises to provide an unprecedented opportunity, and inspiration, for black artists and entertainers, as they scout and interpret new cultural terrain.

Like Amiri Baraka's notion of the "changing same" in music—something akin to an improvisation on a known entity that produces new music at once different and familiar—true black popular culture is not something you buy over the counter. It's something that is renewable, produced as a reaction or antidote to market forces. So, just when you think it's mainstream, something else comes along that's clearly saying, No, it's not. What makes black popular culture so compelling as a subject for study—and what will continue to do so—is its ability to remake itself, time and time again, even as some of its very tenets are continuously appropriated as emblems of American culture.

Thirty years ago, when you could hear a Motown recording on any radio station in America, hip-hop suddenly emerged from the Bronx. Now, of course, it has been appropriated as an emblem of black and American culture. But rest assured that in some other quadrant of the culture, some group of kids is creating something new that very much resists such incorporation. It is that aspect of black popular culture that makes it so exciting for those of us who study it, and there's little indication that that will change in the next twenty-five years.

Neal is the author of four books, most recently New Black Man, published in 2005; he is a contributor to many online news outlets, and his blog, Critical Noir, appears at http://blogs.vibe.com.

 

Religion

God Only Knows

By Sam Wells, dean of Duke Chapel and research professor of Christian ethics at Duke Divinity School

A generation or two ago, there were perhaps four strands of religion in global perspective. In the developing world, global faiths (e.g., Christianity) and local religions (e.g., animism) were integral to daily life. In Western Europe, secularity was ascendant in mind and discourse, but spirituality was tangible in heart and culture. The decline of formal religion, individually or culturally, had entailed neither the collapse of morality nor the demise of interest in ritual, spirituality, and the supernatural, as pre-twentieth-century commentators had often assumed it would.

In the U.S., mainline and Catholic Christianity were very strong; evangelical Christianity was becoming more self-aware and was being increasingly harnessed to political platforms, sometimes emerging in fundamentalist political programs. By contrast, in the communist world, official hostility to most religious expression had significantly, but not entirely, eradicated religious sensibility.

The most notable thing, in retrospect, is the degree to which these four strands ran parallel to and independently of one another, rather than intermingling and merging.

Today, several striking themes emerge out of the energy and anxiety that arise from the increasing overlapping of these four strands. The energy stems from a number of sources, including the rise of the charismatic and prosperity gospel movements in the developing world, as a result of the cultural crossover between the U.S. and societies with huge wealth disparity; the enormous potential of China and Russia for rediscovering religious dimensions of life and culture, and the unknown of whether the traditions that become dominant will be indigenous or imported; and the slowly emerging tendency of evangelical Americans to engage with global poverty and the environmental crisis.

The anxiety includes lament at the decline in Arab and/or Muslim cultural integrity and resentment at American and/or Western ascendancy, focused on the Israeli-Palestinian political impasse, which erupts into violence locally and globally. There is also anger in developing societies, especially in Africa, toward Western assumptions that global Christianity should incorporate perceived secular values (particularly in relation to sexuality) into its ethic. And there is fear in Western Europe of the loss of the postwar secular consensus (the belief that religion is confined to the private and personal sphere and has no significant purchase on the public domain), in the face of the cultural resistance of immigrant groups—a fear that finds expression in militant secular tracts, hapless efforts at legislation, and clumsy assertions of free speech.

In a generation's time, one can only assume that this cross-fertilization of cultures will continue to increase, with corresponding dimensions of energy and anxiety. It is often said that if you want to make God laugh, you tell him your plans—so prediction is a fool's game. But the following are among the key unknowns, the resolution of which is likely to shape significantly the character of religion in 2034:

  • Whether nations with large Muslim populations, such as Iran, Pakistan, Turkey, and Indonesia, can harmonize democracy and cultural adaptation to Western ascendancy with abiding Muslim faith, or find resources in their own traditions to sustain them amid diversity and conflict. If not, the consequences may be explosive; 
  • Whether mainline Christianity, so entangled in Europe with secularity and in America with cultural dominance, will lose its doctrinal and ethical identity in the West and dissolve into a mass of symbols with no content, therapies with no center, and rhetoric with no handle on truth; 
  • Whether the Israeli-Palestinian question will remain a focus of international and inter-religious tension or become a symbol of religious and global hope and cooperation; 
  • Whether rapid and grassroots-led communication, based on the visual and tangible rather than the verbal and cerebral, will make popular religion inherently eclectic and spirituality a syncretistic pick-and-mix.

Reports of the death of religion have turned out to be greatly exaggerated. The secular consensus established in Western Europe and largely assumed by American intellectuals and public officials, but not by the American public as a whole, is proving hard to export to the rest of the world. Will this secular consensus become global, will it disintegrate (perhaps violently), or will it be transformed and enriched in international, intercultural, and inter-religious encounter? Those are the key questions for global religion in the next twenty-five years.

Wells is the author of several books on theological ethics, most recently Speaking the Truth, published in 2008. He was ordained in the Church of England in 1991.

 

Music

Eclectic Notes

By Harry Davidson, professor of the practice of music and director and conductor of the Duke Symphony Orchestra

It is safe to assume that in the year 2034, we will be listening to all music that has been and will be preserved (in whatever form). Our super-technological age has made listening a boundless activity that transcends time and space: At the touch of a few buttons, we can access music written five centuries, or five minutes, ago—from any place on Earth. For many, music is omnipresent, waking us in the morning, accompanying us in our cars or on walks, in stores and places of business, and in most modes of entertainment and relaxation. At no time in history has so much music been accessible to so many people: truly a golden age of listening.

We have no reason to think there will be any interruption to this golden age. Instead, we can expect further expansion. As the world gets smaller, there will be more extensive cross-cultural fertilization of musical styles and genres. As a professor, I have often marveled at the ease with which my students absorb and assimilate a tremendous variety of music. Classifications seem to mean less and less to them as they accommodate an ever more diverse musical experience. While they clearly have their personal preferences and strong opinions about what appeals to them at any given time, I find them marvelously open to new musical encounters. Since they are the future, I can only believe that musical eclecticism will flourish. The appellation "crossover artist" will become an anachronism.

In the West, what we commonly call classical music will be as alive and well as ever, despite all the dire predictions of its diminishing influence or demise. I have no worries for Bach, Mozart, Beethoven, Brahms, or any of the other great composers; they will always have their ardent advocates and admirers. (Needless to say, I hope to be among them at age seventy-eight.) There are more orchestras (professional, community, and youth), opera companies, and chamber ensembles than ever before, and, while some may have trouble in the economic climate that is sure to be with us for a good portion of these next twenty-five years, most will survive and transform themselves with the times.

In my own lifetime, I have witnessed huge changes in the way people listen. Even so, live performance has been the constant because there is no replacement for the simultaneous communal and individual experience of perceiving music that way. It can be said that music, by its very nature as a temporal art form, helps teach us to live in and experience the present. Vinyl LP records and cassette tapes, even though still collected, are no longer the standard, and even CDs are on their way out. Today, the computer and the MP3 player are the reigning music purveyors—that is, until something else comes along.

None of these media will completely disappear, but, naturally, the most recent digital formats will dominate. My Duke composer colleague, Anthony Kelley '87, A.M. '90, however, suggests that "we are already on the way to the elimination of hard media; thus the trend will continue until the databases of music will be accessible, wirelessly, with the touch of a button, or maybe in some cases from a neural impulse." Wow! Composers and conductors can hear music in their heads, but for any listener to be able to do so—I can't wait!

Musical notation has evolved quite a bit since Guido d'Arezzo invented the staff, sometime around the year 1000. The composition process today is as likely to involve a computer, keyboard synthesizer, and printer as a piano, lined paper, and a pencil with a large eraser. Hence, it seems reasonable, as Kelley suggests, that the mingling of acoustic and electronic sound sources, as well as written text with improvisation, will continue to expand the tone colors and forms that music presents to performers and listeners. Our time has also placed a great premium on "seeing" music in addition to "hearing" it (witness websites like MySpace). Narrative seems crucially important today, and while this will continue to be true, I sense something of a subtle, gradual, increased awareness of the more abstract potential of the art.

Putting aside all of this aural augury, my view and hope on the subject of future music listening is best summed up by the great twentieth-century American composer Aaron Copland in his book What to Listen for in Music, published seventy years ago: "Since it is our combined reaction as listeners that most profoundly influences both the art of composition and interpretation, it may truthfully be said that the future of music is in our hands."

Davidson is in his tenth season as orchestra conductor. The Duke Symphony Orchestra consists of more than ninety musicians, mostly students.

 

Biodiversity

Going, Going,...

By Stuart Pimm, Doris Duke Professor of conservation ecology

This century will surely be remembered as the time the Earth bit back—not that Mother Nature hadn't been a little testy before now. In the fourteenth century, plague spread more easily as the population both grew and became more concentrated in urban areas. When Europeans began to travel widely to other parts of the world, they took diseases with them to vulnerable continents—smallpox to the Americas, for example. And, there were plenty of regional examples of cultures, some sophisticated, that declined precipitously, abandoning long-occupied sites where people had abused the ecological services nature had supplied them.

Now, early in this century, those kinds of formerly isolated occurrences have become undeniable global problems. Human numbers have reached almost seven billion and continue to grow; human aspirations grow faster. We Americans are the third-most populous country; our per-capita impacts far exceed those of any rival, and those rivals desperately want to catch up.

Except when well paid to do otherwise, scientists agree that humanity faces a historically unprecedented set of intertwined threats from our global environmental impacts. Global use of fossil fuels puts so much carbon dioxide into the air that we are close to doubling the natural level. Fertilizer production has already doubled the amount of nitrogen in the biosphere.

Changes to the land have damaged two-thirds of the planet's grazing lands, turning them into areas too dry to support our domestic animals. We're shrinking forests, particularly moist, tropical forests. Only remnants will remain by mid-century. This deforestation contributes a fifth of our emissions of carbon dioxide. (The details are nasty, too: Moving into African forests to cut trees, loggers fed on chimps and other primates. Butchering bush meat is a crude business, and carelessness with bloody meat introduced HIV into humans, with disastrous results.) Across the world's oceans, almost every major fishery has been reduced far below the levels at which it can provide the maximum harvest.

Twenty-five years from now, the optimist in me thinks we'll merely have the same set of worries that we have now. The pessimist thinks that we'll have made such a mess of the planet that massive global changes will be unavoidable. The next twenty-five years will decide whether or not we leave a much-damaged planet to our children.

In theory, all of these problems are reversible, though recovery might take centuries—all, that is, except the loss of biological diversity. Already, the damage we've done has deprived us of species forever. In tropical forests, I have seen the devastation of extinction firsthand. I once looked for the po'o uli and the alala, two species of unique Hawaiian birds, on research trips. The former is extinct; the latter survives only in captivity.

The future is potentially grimmer. Habitat destruction, exacerbated by the changing climate, may push a third or even half of all life on Earth permanently out of existence. Such losses will be unavoidable in twenty-five years, unless we act differently. Jurassic Park is fiction.

Certainly, there are narrowly defined ecological issues involved in preventing extinctions, and standard disciplinary science can handle these problems well. The major challenges lie elsewhere. Preventing extinctions requires attention to stabilizing human population growth, for example, slowing deforestation, and changing our attitudes about global stewardship.

Breaking through disciplinary lines is essential to finding solutions to the environmental crisis. Environmentalists must engage with the faith community, as conservation is fundamentally an ethical and religious endeavor. While many believe that damaging the environment is a sin, others argue a God-given right to plunder, or deny completely that problems exist.

Environmentalists must also engage with the business world: Figuring out how to get the most species protected for the smallest land purchase, or how to capture the market value of naturally occurring ecological solutions, is a complex task. Policymakers must be brought into the process, as well. Even those who understand the consequences of a warming planet might not see beyond overly technical fixes. The tree, after all, is the simplest and cheapest means of soaking up carbon and protecting species.

What academics decide in ivory towers, or politicians in corridors of power, does not always translate into practical, local actions. These issues belong to no one discipline or field, and only by crossing the boundaries between them will we find solutions to the greatest threats to Earth's future.

Pimm's books include The World According to Pimm: A Scientist Audits the Earth.

 

Learning

Radical Transformation

By Cathy N. Davidson, John Hope Franklin Humanities Institute Professor of interdisciplinary studies and Ruth F. Devarney Professor of English

Alvin Toffler has said that in the twenty-first century, literacy isn't just reading, writing, and arithmetic but the ability to "learn, unlearn, and relearn." Given the ever-increasing rapidity and magnitude of change on a global scale, we all need to master the precious and formidable skill of being able to stop in our tracks, discard the road map that has failed us, and try a different route on the unpredictable journey ahead. That takes imagination, creativity, an ability to work with others, and fearlessness in the face of uncertain success.

I don't imagine the challenges of 2034 will be any less daunting than those of 2009. But I would hope that a quarter of a century from now, we will be doing a better job of training our students for unlearning the future.

Not much in our current educational system prepares us for the task. In fact, one could argue that data mastery (in any field), the ability to absorb and evaluate information, and the skill to use existing paradigms to solve problems predict good grades but do not necessarily prepare students to respond effectively to the unexpected twists and turns that, inevitably, lie ahead.

At most universities and colleges across America, the curriculum is divided into disciplines and departments with relatively rigid requirements for graduation within and across those divisions. In that sense, higher education looks pretty much the same in 2009 as it looked twenty-five years ago, in 1984—and not that much different, to be frank, from the way it did in 1884. We basically are continuing with a nineteenth-century model of discipline-based expertise—a machine-age education based on an antiquated model of a hardwired, localized, compartmentalized, and hierarchical brain. Yet we are in the midst of a world-altering digital revolution that, in the words of inventor Douglas Engelbart (he invented your computer mouse), makes it essential that we "shift our paradigms about paradigms."

So, what would education look like if we took seriously the injunction to learn, unlearn, and relearn? Methodologically, unlearning typically happens when one is confronted with the irrefutable uselessness of one's present repertoire of skills to comprehend or cope with an important situation. Traumas such as a serious illness, the death of a loved one, or the loss of one's life savings require us to find resources we didn't know we had. Spending significant time having to negotiate a different culture calls up the same reflection about the inadequacy of one's own skills and the urgent need to develop new ones, which is one reason we encourage students to study abroad. How could we replicate that form of disorientation in the familiar confines of the classroom?

Here's one of my fantasies for radical educational transformation in 2034. Once a semester, students will be required to take one course outside their major or minor that is picked from the equivalent of a digital hat. Since this is a fantasy, I'm not worrying about the practicalities, so I'll say every course, no matter how specialized, goes into the digital hat. It might be algebraic geometry, advanced Arabic, econometrics, or eighteenth-century English poetry. By hook or crook, students must learn something they didn't plan on learning, with seemingly no relevance to their chosen career path, and, in many cases, something for which they are utterly, abysmally, and delightfully unprepared. Grades will be collaborative, as will learning; the lowest grade anyone earns will be what everyone receives. All students must show that they contributed substantively to a final project that demonstrates mastery of the material. So it is everyone's job to help everyone else learn the subject and to figure out how everyone can contribute to his or her maximum ability. Maybe the digital-hat student doesn't know the finest twists and turns of abstract algebra, but is a genius at managing collaborative teams or is a brilliant science writer who can explain the significance of the team project to non-mathematicians.

It would certainly shake up our concept of learning if a quarter of our educational lives was spent in the intellectual deep end, with our fellow students there helping us learn to swim. On the professorial level, the digital-hat requirement would force us to rethink what the major might look like with a quarter fewer courses required of each student and to re-imagine the role of the university itself, because of the redistribution of courses and faculty members required by this random assignment of courses. It would also demand a different pedagogy, since a quarter of one's students could be presumed to have little preparation for or even interest in the subject matter.

What would learning look like without the safety net of prerequisites? U.S. Department of Labor statistics indicate the average person changes careers three to seven times. So, this educational shake-up more closely resembles the challenges ahead than does the present system of overly prescriptive majors and minors. Would our experiment in strategic failure be interesting? Indubitably. The method evokes the concept of katsu in Zen Buddhism, the thunderous blast that turns the acolyte's life around. The important part, as in all education, isn't the blast itself, but the reverberations that continue after, inspiring imagination, confidence, and daring for a future of unlearning.

Davidson cofounded the Humanities, Arts, Sciences, and Technology Advanced Collaboratory (HASTAC) and blogs at Cat in the Stack, http://www.hastac.org.

 

Foreign Policy

Competition and Compromise

By Peter D. Feaver, Alexander F. Hehmeyer Professor of political science and public policy

The U.S. is the sole superpower, the only state that is a global heavyweight in every category of power that matters for international relations—hard (coercive) and soft (cooptive) power, diplomatic and cultural, economic and military. This was more true before the spectacular financial collapse of 2008-09. But it is still true today, though there is serious debate about how much lasting damage the bursting of the financial bubble will do to the American economy and, thus, to American international standing.

Twenty-five years ago, the long-term prognosis might have been grimmer. Although President Ronald Reagan had been reelected on the promise that it was morning in America, in fact the Doomsday Clock indicated it was just three minutes to midnight. The U.S. and the Soviet Union were locked in a seemingly permanent nuclear arms race. Even if we were lucky enough to avoid nuclear Armageddon, many scholars believed that the U.S. was the victim of what historian Paul Kennedy has called "imperial overstretch," that the combination of Reagan's economic, military, and foreign policies would surely usher in an American decline like that of other great powers before it. Few thought the Soviet Union would collapse; many thought China might rise (albeit not as quickly as it has); and almost everyone believed that the big winner would be Japan. It didn't work out that way.

Thus, it is wise to take predictions of American doom (or American boom) with a grain of salt. That does not mean forecasting is a fool's errand. In theory, forecasts about the future should guide the "grand strategy" of a state—the plan leaders develop that takes into account where they are in the world, the challenges and opportunities they confront, and where they would like to be. Though it happens less often in real life than in academic seminars, long-range forecasts do shift grand strategies.

Nixon and Kissinger, for instance, quite deliberately pursued a grand strategy aimed at turning the bipolar U.S.-Soviet rivalry into a tri-polar game in which China tilted toward us. This was based on their conviction that there was an opportunity for such a gambit—the Sino-Soviet bloc was an unhappy marriage of rivals—and their fear that the U.S. would lose out in the long run if it didn't seize the opportunity.

President George W. Bush's grand strategy in Asia was similarly shaped by long-term forecasting. The Bush administration sought to deal with China's likely rise over the next several decades by giving it an equity stake in the existing global order—seeking to make it a "responsible stakeholder" that benefited from, rather than chafed at, existing rules of the game. (For example, the U.S. backed China's entry into the powerful World Trade Organization, despite concerns about China's trade policies, hoping the reward would also be a lever to nudge China away from mercantilist policies.)

Likewise, Bush sought to deal with the likely rise of India (and to hedge against China's rise) by pursuing a strategic partnership with India, even to the point of appeasing India by dropping long-standing sanctions against its nuclear program.

These are far-seeing strategies that, if successful, will shape the course of international relations for decades. Beyond that, however, it is hard to think of reasonable foreign-policy moves that hinge on twenty-five-year forecasts and also make sense in the short term.

To be sure, there are many important things that need to be done, but they are the foreign-policy equivalent of "diet and exercise," because they make sense regardless of which twenty-five-year forecast you believe. They include improving U.S. energy efficiency and developing alternative energy sources, improving our fiscal posture, reinvesting in economic fundamentals, and dealing with looming entitlement gaps so the material foundation of American power is preserved. I am assuming we will take those sensible (if painful) steps because we have to.

If we do not, all bets are off. If we do, then we are likely to remain the world's most influential state. Europe is an economic superpower, but its capacity to project military power or to act decisively is limited. Russia is a force to be reckoned with when oil is at $140 a barrel, but it faces severe demographic and economic challenges. China is potentially a superpower, but to reach that level, it must overcome enormous internal tensions and contradictions.  

Japan will matter, but less and less over the long run unless it overcomes its own demographic challenge—a shrinking population means that, over time, the country's share of global production will also shrink. India and Brazil have potential, but they have had potential for decades and always seem to find a way to under-realize that potential. No one has America's cultural appeal and capacity for self-renewal.

No commentary about betting on the future can close without citing Niels Bohr's observation that "Prediction is very difficult, especially about the future." Political scientists are much better at retrodiction—predicting and explaining the past.

Nevertheless, it is a fairly safe prediction that in twenty-five years, the U.S. will occupy a more compromised position in world affairs than it does now—stiffer peer competition, less leverage, and more constraints on what we can do internationally. But it is also a fairly safe prediction that the U.S. will be on the very short list of states that matter globally—with global reach and global influence—and probably will be at the top of that list.

Feaver was special adviser for strategic planning and institutional reform on the National Security Council staff at the White House from June 2005 to July 2007.

 

Literary Canon

Firing Up the Canon

By Frank Lentricchia, Katherine Everett Gilbert Professor of literature and theater studies

When Duke Magazine asked me to address the future of the literary canon, I agreed, but probably I shouldn't have. The truth is that I've never given much thought to the question of canon formation, not even during the years of the culture wars, when it was important to declare yourself for the canon as it stood (in which case you were said to be on the "right") or for "opening it up" (in which case you were said to be on the "left").

I've been a lifelong Democrat, have been called a Marxist by some high-placed defenders of the traditional humanistic curriculum (including Lynne Cheney, the controversial former head of the National Endowment for the Humanities), and worked for Barack Obama's election. And (I almost wrote "but") I teach only so-called canonical writers. I've never thought that political and aesthetic values were in conflict, or understood how they could be. For me, at any rate, they are not.

I say "so-called canonical" because my specialty is modernism, and it seems wrong to me to call "canonical" writers who have been around for only 100 years or less. I teach the writers I'm passionate about and refuse to put on my syllabus—because I fear that I'd teach them badly—writers for whom I can summon no passion. Here's my more or less complete list of the former: Beckett, Chekhov, Conrad, DeLillo, Eliot, Ellison, Faulkner, Ibsen, Joyce, Kafka, Lampedusa, Mann, Pound, Stevens, Woolf, Yeats. (But what about…? And how about…? How can you possibly not…?)

To say that I teach these writers is misleading: I teach a very few select texts by masters of aesthetic excellence. "Masters," you may recall, was a highly contested word among cultural warriors. I used the word because the texts I teach seem to me to tell us about the world in ways too dense, too particular, too wayward to be captured by ideological abstractions from any place on the political spectrum. They tell us about our being in the world by creating pleasurably difficult artistic structures and textures in and through which—and only in and through which—we are given access to unique visions.

I'd say, if pushed, that they give us the most complex portraits out there of what it means to be human, not generally, but in specific historical circumstances, while recognizing that my colleagues in the other disciplines—in art history, in classics, in anthropology, in history, in philosophy—will offer strong claims of their own that I'd not be able to successfully turn back, even if I had the desire to do so, which I do not.

Where will my so-called modernist canon be in twenty-five years? Likely, just where it is today. The unanswerable question is, where will it be in two or three hundred years? I'm confident of this: Changes in my "canon" and the classic Western canon (Homer, Virgil, Dante, Shakespeare, Milton, etc.) won't be effected by individuals at Duke or any other institution of "higher" learning engaged with each other in polemical struggle, but by long, slow, historical processes over which no individual, or university, can exercise determinative action.

Lentricchia A.M. '63, Ph.D. '66 has written more than twenty books of fiction and of literary criticism. His most recent novel, The Italian Actress, was published in 2008.

 

Journalism

Who Will Pay?

By Susan Tifft '73, Eugene C. Patterson Professor of the practice of journalism and public policy studies, DeWitt Wallace Center for Media and Democracy

Asking whether journalism will exist a quarter of a century from now is like asking Americans at the depth of the Great Depression whether they could envision a life without Hoovervilles and apple sellers. Hope springs eternal, but present indicators aren't promising. With the Tribune Company, owner of the Los Angeles Times and Chicago Tribune, in bankruptcy, the Rocky Mountain News and Cincinnati Post extinct, and the stock of The New York Times Company hovering near $4 a share, the specter of out-of-work journalists sporting "Will Report for Food" signs seems chillingly real.

That's bad news for democracy. Our country needs a robust press to act as a check on public and private power. If there is a silver lining, it is that while the business of journalism is undergoing a painful transformation, journalism is at a thrilling juncture.

Never before have there been such sophisticated tools at the disposal of modern-day Woodwards and Bernsteins. Computers make it possible to analyze public records and distill telling insights from disparate bits of data. Social media such as Facebook, MySpace, and Twitter help reporters find sources, news tips, photos, and eyewitnesses to important events. Technology has extended the reach of even modest news operations. With the click of a mouse, an American working in New Delhi can instantly access his hometown daily in Des Moines and check out local blogs for the latest doings in his old neighborhood.

News consumers, too, have been transformed. The innate human desire to know what's going on, and to share it, has found infinite expression on the Web, through mainstream media news sites, blogs, "tweets," and video- and photo-sharing sites like Flickr and YouTube. There is a ravenous hunger for news, in every way that term can be defined. In an increasingly connected and globalized world, that craving is sure to grow between now and 2034.

Whether news outlets as we currently understand them will be around to satisfy that need remains to be seen. Newspapers, which generate most of the reliable, independently verified news in this country, are an endangered species, hobbled by the Internet's ability to atomize audiences and lure advertisers to free sites such as Craigslist. It's ironic, but even as the Internet has greatly expanded newspapers' readership, it has choked off their economic lifeline.

As a result, there will be fewer newspapers in twenty-five years. Indeed, we may not even use that word. (The American Society of Newspaper Editors recently proposed substituting "News" for "Newspaper" in its name.) The ones that survive will be wholly online (a move just made by The Christian Science Monitor). TV and radio will also migrate to the Web, with text and visual and audio reporting delivered to anything that has a screen (laptops, PCs, flat-screen TVs, iPhones). By 2020, according to the Pew Internet & American Life Project, most people in the world will connect to the Internet through a mobile device, so it stands to reason that future news junkies will read and view and hear the news through something the size of a human palm. Already, USA Today, The Washington Post, The Wall Street Journal, and The New York Times can be downloaded and read on the portable e-book device Kindle.

These are all innovations in news delivery. They don't address the more critical question of who or what will pay for the hard work of actually reporting credible news and analysis. Without news institutions or, at the very least, independent journalists steeped in the values of truth-seeking and verification, Google and Yahoo would have little trustworthy news to aggregate, radio talk-show hosts would have few facts to twist, and bloggers would be reduced to swapping rumor and gossip (something many specialize in anyway). Especially at risk is the expensive investigative journalism that is so critical to holding government and business to account.

Who will pay for high-quality news in the future? Perhaps press lords patterned after News Corporation's Rupert Murdoch and Mexican billionaire media investor Carlos Slim. Perhaps foundations, philanthropists, readers, and viewers, who have long supported public broadcasting and are now underwriting GlobalPost, ProPublica, and other intriguing experiments. Perhaps news consumers, who will pool donations to "hire" individual journalists to cover a particular story or a beat, a model now being tested at Spot.Us. Perhaps "serious news" will become a niche product like expensive industry newsletters, marketed by providers like The New York Times to an elite audience willing to pay top dollar. Perhaps news websites, currently free, will throw up tollbooths and collect fees for entry.

At the end of last semester, I asked my students what they saw in their crystal balls. A surprising number were horrified at the prospect of ink-on-paper journalism disappearing. Some worried that in a 24-7 media world, accuracy would give way to speed; others that news would become more partisan and opinionated. One predicted that news shows would merge with game shows: American Idol meets 60 Minutes. But, overall, there was confidence that, by the time they reached their forties, reliable news would still be available, and the ethics and values of good journalism would endure.

Tifft '73, who planned to retire as Patterson professor in May, conducts research on media ownership and ethics. The Sanford Institute has established the Susan Tifft Undergraduate Teaching/Mentoring Award in her honor. She is a member of Duke Magazine's Editorial Advisory Board.

 

Health Care

Global Imperative

By Victor Dzau, chancellor for health affairs, Duke Medicine, and president and CEO of Duke University Health System

It is understandable that meaningful reform of the U.S. health-care system is front and center on the minds of legislators, business leaders, health-care professionals, and the millions of people across the country who voted for the kind of reform outlined by the Obama campaign. The need for this reform in the near term is undeniable. However, I believe the next few decades are also critically important to making progress on the global-health front.

Leading academic medical centers like Duke's have a unique opportunity to exercise leadership in extending the scope of global-health initiatives. These will include not only improving the treatment of disease, but also training health-care providers and conducting research that can be effectively translated and applied for the benefit of people around the world. I, and others, including Michael Merson, a physician and director of the Duke Global Health Institute, believe a number of factors will drive a greater focus on the importance of global health over the next few decades.

First, we'll see a growing conviction that people everywhere should have access to life-saving medicines. Nowhere has this become more evident than in the provision of anti-retroviral drugs for people with HIV/AIDS around the world. Some estimates suggest that, owing to collective international funding of programs that provide access to medication, millions of lives have been saved and a major impact has been made in preventing maternal-fetal transmission of the disease. With the real and emerging threats of global pandemics, I believe this conviction will grow stronger in coming years, along with a greater sense of urgency in the response of government and industry to the need for medicines and vaccines.

The importance of global research collaborations will become increasingly evident, providing the greatest opportunity for academic medical centers to take leadership roles. Duke Medicine, which is on the verge of some exciting, mission-based research collaborations with strong and respected partners in Asia, Africa, and elsewhere, is in an excellent position to be one of those leaders.

The rapid growth of clinical studies being conducted globally is an indication that clinical research is undergoing the same globalization process that can be seen in other enterprises. It is also clear that the intellectual and scientific firepower that can be accessed through broader collaborations with investigators worldwide will have far greater yields than limiting collaborations within a given country or region. And, unless the U.S. Food and Drug Administration makes fundamental changes that modernize and streamline the clinical-trials process, I would look for Asia to become the hub for clinical research.

Finally, foreign governments that recognize that economic and social development cannot occur without a healthy population will expand global-health research and care-delivery programs. At the World Economic Forum meeting in January, which I attended as a Governor for the Healthcare Industry, it was clear that concern is rising around the world about the economic and security threats posed by an unhealthy population. It is becoming more widely understood that cardiovascular disease, obesity, diabetes, hypertension, and other chronic diseases are on the rise globally, especially in India, China, Latin America, and urban centers in sub-Saharan Africa. Add to this an aging global population, health risks associated with poverty and gender inequity, and emerging infectious diseases, and the need for greater investments in health-care infrastructure abroad is obvious.

With poor health becoming a security risk and a barrier to economic growth, I would look for countries to become more aggressive in seeking strategic partnerships with the health-care industry, academic medical centers, and others that can identify and pursue improved public-health and disease-prevention strategies. At Duke, we are already working on public-private partnerships to address these issues.

The world is changing at a pace that almost defies belief. And, as threats to the health of people around the globe increase, so do the opportunities to have an impact. No doubt, as we look to the future at a time of financial upheaval, significant investments in global medicine carry great risks, but the setbacks that will result from inaction are too great to ignore.

Dzau holds an M.D. from McGill University. His research has focused on the molecular and genetic mechanisms of cardiovascular disease. Before coming to Duke in 2004, he was a professor of medicine at Harvard Medical School and chaired the department of medicine at Brigham and Women's Hospital in Boston.

 

Privacy

Nothing's Secret

By Karla Holloway, James B. Duke Professor of English and professor of law

"Instantaneous photographs and newspaper enterprise have invaded the precincts of private and domestic life, and numerous mechanical devices threaten to make good the statement that what is whispered in the closet shall be proclaimed from the housetops…. Gossip is no longer the resource of the idle and the vicious but has become a trade which is pursued with industry as well as effrontery… The details of sexual relations are spread in the columns of our daily papers."

These are not the complaints of the twenty-first century, or even of the twentieth. They are the words of two Boston barristers, Louis Brandeis and Samuel Warren, who published their essay "The Right to Privacy" in the Harvard Law Review in 1890. They argued for a legal tort that would protect modern man from the kinds of outrageous violations of private personhood that the media's advancing technologies and "unscrupulous" interests threatened.

The right to privacy that allows us to associate our homes with sanctuary and that places the highest value on an "inviolable personhood"—a nineteenth-century notion of "the right to be let alone"—is a right we still understand and desire today. But in twenty-five years, I suspect that we will barely recognize how public—and indeed how violated—our inviolable personhood has become. In fact, we have encouraged some of this slippage. The threat of terrorism has been enough to persuade us to accept warrantless searches. And when these searches became virtual in the form of warrantless wire-tapping, it seemed merely a change of venue rather than a loss of constitutional rights. In the battle between privacy and national security, the nation followed its fears.

Brandeis and Warren were prescient when they named the kinds of devices and institutions that not only continue to threaten privacy (think of paparazzi, the Internet, Facebook, and MySpace) but also will, in a peculiar reversal of value, eventually relegate the idea of privacy to an archaic notion of years past.

By eagerly embracing new, increasingly necessary or seemingly urgent technologies, this era's modern man has contributed to the accelerated attenuation of the private. The desirable efficiencies of electronic communications displace concerns for their porous nature.

Consider one of the most recent versions of the private body rendered public—"sexting," or sending nude photographs via text messages. Did you allow the image to be taken on a cell phone? Did you send the image to ten of your very best friends? Just one? It doesn't matter. The production and form of transmission diminish the potential of an after-the-fact privacy claim. Our interest in joining virtual communities takes precedence over the caution that reasonably accompanies the insecurity of these technologies. And consumers can take no comfort in the fact that Facebook and Google seem to adopt the best model of corporate citizenship as they negotiate their ownership of users' information. The negotiations over how your information might be used become hot news-items-of-the-day but quickly fade from attention. They are more a customer-relations vehicle than a substantive engagement with consumer rights.

The visible body is not the only body at risk. Its very matter is no less vulnerable. When the U.S. Ninth Circuit Court of Appeals ruled in United States v. Kincade (2004) that a parolee had to provide a DNA sample for CODIS—the national DNA databank—it was a harbinger of the era to come. The Ninth Circuit cited a Supreme Court case, California v. Greenwood (1988). In that case, the Court held that household trash left curbside for collection was not private property because it was effectively and purposefully turned over to a third party (the trash collectors).

In his dissent from Kincade, Judge Alex Kozinski warned of the decision's potential reach, noting that "you can't go anywhere without leaving a bread-crumb trail of DNA matter." Thanks to twenty-first century science, even "garbage" DNA (naturally discarded matter like saliva, blood, hair, and skin particles) carries significant information about our genetic identities. Some "body's" trash can easily become our government's evidentiary treasure.

The potential for harm does not lie only in what we purposefully or inadvertently discard. In our eager embrace of DNA ancestry searches, we blithely swab our cheeks and send the sample through the mails, essentially cooperating in the public circulation of the most intimate dimensions of our identity. And those whose curiosity encourages their cooperation with enterprises that promise to unlock our genetic code are no different from those who surrender their DNA to determine, say, paternity. Whether it's Maury or National Geographic's Genographic Project, you might just as well have turned over both your own genetic information and your family's to any government entity that can prove a "special interest." You may indeed learn some facts about your genetic makeup, but the government may use that information in any way that it deems appropriate. And as the science advances, the material you sent in for one purpose may end up being used for something entirely different—from medical research for the public good to tracking down a suspected criminal in your family, using the very genetic markers you made available for discovery.

Given the twenty-first century spectacles that already inhabit the virtual public square, as well as our increasingly eager, and arguably careless, contribution to their content, the idea of "a right to privacy" is already greatly diluted. In twenty-five years, I predict its social traces will reveal only faint reminders of the right those previous centuries were vigorously determined to claim. 

Holloway, who holds appointments in English, law, and African & African American Studies, is at work on a new book, tentatively titled Private Bodies/Public Texts: Race, Gender, and a Narrative Bioethics.

 

Neurosciences

Wearable Robots

By Miguel A. L. Nicolelis, Anne W. Deane Professor of neuroscience and professor of neurobiology, biomedical engineering, and psychology

One of the most exciting—and promising—areas of basic and applied research in modern neuroscience is the field of brain-machine interface (BMI). By creating a way to link living brain tissue to a variety of artificial tools, BMIs have made it possible for non-human primates to use the electrical activity produced by hundreds of neurons, located in multiple regions of their brains, to directly control the movements of a variety of robotic devices, including prosthetic arms and legs.

Over the past decade, neuroscientists at the Duke University Center for Neuroengineering (DUCN) have been at the forefront of BMI research. Their work raises the hope that in the not-too-distant future, patients suffering from a variety of neurological disorders that lead to devastating levels of paralysis may be able to recover their mobility by harnessing their own brain impulses to directly control sophisticated neuroprostheses.

Work at Duke and in laboratories around the world has provided many reasons to feel optimistic about this potentially revolutionary scenario in rehabilitation medicine. In 2003, we proved that, with the BMI, monkeys could make a robotic arm reach out and grasp objects simply by thinking about doing so. Just over a year ago, we demonstrated that a monkey could go even further: From the DUCN in Durham, the monkey was able to make a robot located halfway around the world walk on a treadmill by using her brain activity to control and guide it.

But this is only the beginning. The success of these experiments demonstrates that by working at the convergence of multiple fields of inquiry (computer science, microelectronics, and robotics, for example), modern brain researchers are quickly acquiring the ability to translate major scientific breakthroughs into a variety of promising new clinical tools designed to assist patients. These results may well lead to major advances in mitigating the clinical effects of neurological dysfunctions caused by spinal-cord injuries or such conditions as Parkinson's disease.

For example, one of our recent studies using animals has led to the development of a new prosthetic device that holds great potential for treating Parkinson's disease. In our study, the new device was attached to the surface of the spinal cord in mice and rats that had depleted levels of dopamine and exhibited the impaired motor skills characteristic of the advanced stages of Parkinson's disease. When the device was activated, the dopamine-depleted animals' slow, stiff movements were replaced, within seconds, with the active motor behaviors of healthy animals. (The study is explained in detail in the March 20 cover article in the journal Science.)

The realization of this and other technological advances over the next twenty-five years may well be accelerated by the Walk Again Project, an international consortium of leading research centers organized by the DUCN. The Walk Again Project represents a new paradigm for scientific collaboration among the world's academic institutions, bringing together a global network of science and technology experts, distributed among all the continents, to achieve a key humanitarian goal.

The project's central goal is to develop and implement the first BMI capable of restoring full mobility to patients suffering from a severe degree of paralysis. This lofty goal will be achieved by building a neuroprosthetic device that uses a BMI as its core, allowing the patients to capture and use their own voluntary brain activity to control the movements of a full-body prosthetic device. This "wearable robot," also known as an "exoskeleton," will be designed to sustain and carry the patient's body according to his or her mental will.

In contributing their key intellectual assets and expertise to achieving such transcendent humanitarian feats, these scientists will once again prove the value of science as a fundamental agent of social transformation.

Nicolelis is also codirector of the Duke Center for Neuroengineering. A native of São Paulo, Brazil, he received his M.D. and Ph.D. in neurophysiology from the University of São Paulo. He was named one of the "Scientific American 50" in 2004. He holds three U.S. patents.

 

Race Relations

Hold the Hallelujahs

By Paula D. McClain, professor of political science and professor of public policy and African and African American Studies

The mere fact that Barack Obama, a black man of mixed race, was elected President is a statement of how far the U.S. has come from its sin of slavery and the stain of racism. But we should not delude ourselves into thinking that this milestone, as magnificent and wonderful as it is, signals the end of issues of race or racism in the U.S.

President Obama won 52.9 percent of the vote and 365 electoral votes, a significant victory. Yet, we need to get behind those—and other—numbers to get a true sense of whether we have progressed to the point that we can throw up our hands and shout, "Hallelujah!"

Democratic candidates have consistently failed to receive the majority of the white vote. Obama was no different. Nationally, a majority of whites who voted chose John McCain (55 percent to 43 percent). Obama's proportion of the white vote, however, was two percentage points higher than the white vote for John Kerry (41 percent), so progress was made. Moreover, in some states—for example, California, Connecticut, and other traditionally Democratic states—a majority of whites did vote for Obama. He was also able to pull a simple majority of white voters in traditionally Republican states, such as Iowa and Colorado.

In addition, although Obama lost the majority of white voters, he did carry the majority of white voters (54 percent) in a particular demographic category: whites between the ages of eighteen and twenty-nine. This is significant, as it might signal that younger whites—most of whom were raised in an era of integration and many of whom have a racially diverse set of friends—were more likely to support a black man for President than were other cohorts of whites. This suggests that the future of race relations in the U.S. is slated for continued improvement over time.

Obama's presidency may help to improve race relations on another level, as well. That the visual face of the U.S., domestically and internationally, is now a black face will make the presence of blacks and other individuals of color in certain occupations, leadership roles, and other aspects of social and political life more natural and commonplace.

I remain cautious, however, about making broad predictions about the state of race relations in twenty-five years, because so many areas of structural inequality in the U.S. have existed for so long. For example, data from the American Housing Survey, conducted by the Census Bureau, indicate that blacks with income and credit scores similar to those of whites are charged higher interest rates on mortgage loans and are more likely to be pushed into the subprime mortgage market than are similarly situated whites. Given the recent collapse of the subprime market, the disparity in homeownership between whites (75.1 percent own homes) and blacks (47.4 percent own homes) will be even greater.

Disparities in the area of education are also startling. Data from the U.S. Department of Education indicate that 72 percent of white students who graduate from high school are "college ready," compared with only 37 percent of black high-school graduates. Moreover, the racial composition of a school is highly correlated with the number of advanced placement (AP) courses offered and who occupies seats in these classes. Blacks and Latinos enroll in AP courses at half the rate of white students.

In the area of health, blacks have consistently had higher infant-mortality rates (14 per 1,000) compared with whites (6 per 1,000). This gap has continued for decades in a country with what some say is the best health-care system in the world. Research on toxic-waste sites shows that these sites are more likely to be located in areas with high concentrations of nonwhite populations and that 56 percent of those living within 1.5 miles of a toxic-waste site are nonwhite.

Even this abbreviated list of challenges and concerns offers strong evidence that disparities are broad and deep. President Obama, if reelected in 2012, will have a total of only eight years to reverse structural inequalities developed over a far longer period of time. Even if he devotes all his attention and all the nation's resources to addressing these inequities, which is not possible, much would remain to be done after he leaves office. And as the nation becomes increasingly multiracial over the next twenty-five years, the imperative to confront these inequities will be even more important if the nation is to continue moving forward.

We must not be lulled into turning a blind eye to all that remains to be done. President Obama's election was one important milestone on the long, steady march toward equality, but we still have miles to go.

McClain is codirector of the Center for the Study of Race, Ethnicity, and Gender in the Social Sciences and just completed a two-year term as Academic Council chair. The fifth edition of her book Can We All Get Along? Racial and Ethnic Minorities in American Politics will be published in July.

 

Medical Education

Beyond the Basic Diagnosis

By Nancy Andrews, dean of the Duke School of Medicine and vice chancellor for academic affairs

"Modern" medical education is about to have an important birthday. In many ways, it began with the Flexner Report, which was published in 1910 by the Carnegie Foundation for the Advancement of Teaching. The Flexner Report was revolutionary in its assertion that medical practice should be based on scientific understanding. For the past ninety-nine years, most aspiring physicians have followed Flexner's prescription of two years of preclinical science education, followed by two years of clinical experiences. In the 1960s, the Duke School of Medicine uniquely modified this formula by requiring only one year of didactic science education, allowing students to use the extra year for research.

Biomedical science advanced rapidly over the twentieth century. Microorganisms were shown to be disease-causing agents, and new drugs and vaccines were discovered to combat them. Inborn errors of metabolism, diseases caused by a lack of specific enzymes, were described as genetic defects in the body's biochemistry. Even protean mental illnesses were described with strict criteria and catalogued accordingly. Throughout the decades, medical education focused on diseases as discrete, largely independent alterations of the human condition. Students were taught that a sick patient usually had one disease, but over time, might acquire additional diseases layered on top.

A century after the Flexner Report, medicine and medical education are again undergoing a transformation. Diagnosis is no longer binary; we can't simply say one does, or does not, have a disease. A growth in the intestine, for example, may have features that fall anywhere along a spectrum, from entirely benign to precancerous, to cancerous, to metastatic. In part owing to our ability to detect pathological processes early and follow them over time, medicine is becoming less disease-centered and more patient-centered. We can now examine a patient's genes, use a rapidly expanding repertoire of genomic knowledge to predict whether she is at risk for medical problems, and take advantage of that genetic information. Although we do this now in a rather primitive way, we're learning fast.

The Flexner Report also marked the start of an era in which allopathic medicine, based on the natural sciences, became more widely accepted than homeopathic, chiropractic, and osteopathic medicine. Here, too, old lines are blurring today, as we accept that traditional healing approaches can also be effective in promoting wellness. Acupuncture is a good example—although poorly understood in scientific terms, its benefits are now widely recognized by mainstream physicians. Approaches referred to as "alternative" or "integrative" medicine are increasingly being added to the modern physician's treatment playbook.

Medical students expecting their degrees in 2034 will understand how our bodies function in health and in disease far better than ever before. Traditional tests for cholesterol, blood sugar, and other indicators of well-being likely will be supplanted by complex metabolomics and proteomics tests that monitor thousands, rather than dozens, of biochemical clues. The complex data sets developed through genomics, metabolomics, proteomics, and other yet-to-be-defined "-omics" (a neologism referring to the totality of a biological system) will allow physicians and patients to work together in unprecedented ways to predict an individual's personal risk and prevent the onset of clinical symptoms through preemptive lifestyle changes and proactive treatments.

Increasingly, traditional x-rays and scans will be supplanted by imaging studies that not only show the anatomical structure of our parts but also allow us to visualize the biological activities of the brain, heart, and other organs in real time. Scientists will take advantage of futuristic approaches such as nanobiology, structure-guided synthetic chemistry, and function-guided synthetic biology to develop designer medicines that have very precise, laser-like effects. It's not beyond imagination that we might even find ways to repopulate our bodies with health-promoting, rather than disease-promoting, germs.

Instead of relying on black-and-white definitions of a normal state and an ill state, medical students of the future will learn to appreciate a complex gray scale of health and un-health. Rather than dealing with relatively late-stage disease complications, the emphasis likely will be on preventing and reversing pathological processes before they have a chance to do great harm. In fact, it's likely that the very terms "health" and "disease" will be anachronisms twenty-five years from now.

For the Duke School of Medicine, this means giving our students more opportunities to learn about human health in settings where groundbreaking discoveries happen and are rapidly translated for clinical use. This future is high tech, certainly. But I hope that we will continue to admit students who truly care about others and liberally share their skills and their intelligence. And I hope that no matter how advanced medical technology gets, it will still be true that a simple touch—a doctor's hand on a patient's arm—will have a very powerful effect, and that our graduates will use that power generously and compassionately.

Andrews, who was dean for basic sciences and graduate studies at Harvard Medical School from 2003 to 2007, is the first female dean of a top-tier medical school.

 

Artificial Materials

Optical Wonders

By David R. Smith, William Bevan Professor of Electrical and Computer Engineering

We recognize a material when we see it, though it's a fuzzy sort of intuitive concept. A piece of glass or plastic would serve as an example of a material, as might a section of copper wiring or part of an aluminum can. When we make the assertion that a substance is a material, we have certified that the substance has inherent physical properties from which we can assess how strong it is, how transparent or opaque it is, how well it conducts electricity, and so on. These properties are intrinsic, in the sense that if we cut or break the material into ever smaller pieces, all of the pieces will have identical physical properties.

This broad description provides a good working definition of a conventional material. However, we can imagine a much wider class of engineered artificial materials formed by carefully arranging other, smaller objects together in prescribed patterns. The concept of an artificial material has a long and well-established history in materials science and engineering (consider concrete used to construct buildings or carbon-fiber composites used in automobiles and aircraft). The application of this concept to optics (the study of visible light) or, more broadly, electromagnetism (the study of the entire spectrum of light, including infrared, ultraviolet, x-rays, radio waves, and microwaves) over the past decade has resulted in spectacular and profound advances.

There are at least two ways to make use of artificial materials. We can, for example, produce a material with properties unlike any found in nature. Alternatively, we can gain precise control over the properties of a material and vary them throughout a volume to create new devices. Two examples make clear the distinction: negative-index materials and invisibility cloaks. When rays of light, visible to us or not, penetrate a material, they bend. The amount they bend—or refract—is determined by the material's index of refraction, an inherent physical property. In all conventional materials, the index of refraction is positive. However, there is no fundamental restriction that prevents a material's index of refraction from being negative—such materials simply don't happen to exist in nature. Russian physicist Victor Veselago predicted the possibility of negative-refractive-index materials in 1968. His theory showed that if such materials were found, nearly all of electromagnetism and optics would be turned on their heads!
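
To see where that reversal comes from, a minimal sketch is enough, assuming nothing more than Snell's law for a light ray crossing the boundary between two media with indices n1 and n2 (the angles theta_1 and theta_2 are measured from the surface normal):

% Snell's law at a plane interface (illustrative sketch only):
\[
  n_1 \sin\theta_1 = n_2 \sin\theta_2
  \quad\Longrightarrow\quad
  \sin\theta_2 = \frac{n_1}{n_2}\,\sin\theta_1 .
\]
% With n_2 > 0, as in every conventional material, the refracted ray crosses
% to the opposite side of the normal from the incident ray. With n_2 < 0,
% theta_2 changes sign and the ray bends back to the same side it arrived
% from -- the "wrong-way" refraction Veselago described.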

Veselago's prediction was realized in 2000, when negative refraction was experimentally confirmed in an artificial material, demonstrating the power of engineered materials to realize properties difficult or impossible to achieve with naturally occurring materials. To underscore their singularity, artificially structured materials are now generally referred to as metamaterials, indicating they have properties beyond those that can be found in naturally occurring materials.  

Metamaterials enable us to control light in entirely new ways. For centuries, optical devices have concentrated on sculpting the surfaces of simple materials to control light. The lenses in eyeglasses, cameras, or telescopes, for example, are usually made from glass or plastics, relying on the shape of the front and back surfaces to bend light rays. Light bends at the first surface, travels in a straight line through the material, and bends again at the second surface. With metamaterials, we can use every bit of the lens material—not just the surfaces—to steer and manipulate light in unprecedented ways.

As an example, our research team at Duke has designed an electromagnetic "invisibility cloak," a structure that, with some limitations, can render an object, and itself, invisible from detection over a narrow band of wavelengths. The cloak is designed so that light waves striking it are channeled around an object "hidden" inside (picture water flowing around a rock in a stream) and then emerge on the other side, giving the appearance of uninterrupted continuity.

But it gets much more exciting: In theory, we can build metamaterials so that they incorporate the means necessary to change or modulate their makeup, even after they have been constructed. The metamaterial elements that have commonly been employed for cloaks and other negative-index materials are actually composed of tiny electrical circuits, just as you might find in any electronic device. In principle, these metamaterial elements can have active components—like transistors or amplifiers—integrated into the design, allowing the material properties to be dynamically controlled. So, while the lens of today has its focal length and aperture fixed by geometry, the lens of tomorrow might be able to achieve a wide range of optical properties as the parameters of each constituent metamaterial circuit are tuned.

A single metamaterial lens may be able to mimic thousands of lenses by individually addressing each of the underlying metamaterial elements, much in the same way that we control the properties of each of the thousands of pixels in a flat-screen television to produce an image. The result might be a single eyeglass lens that could be programmed for any type of vision and, at the same time, serve as a magnifying and zoom lens.

As the science of metamaterials continues to draw from such fields as traditional optics, photonics, physics, materials engineering, electrical engineering, and nanotechnology, the opportunities continue to expand. If the past decade has been an indication, we have much to look forward to in the coming twenty-five years!

Smith, who is also a visiting professor in the physics department at Imperial College, London, received his B.S. and Ph.D. in physics from the University of California at San Diego. While a postdoctoral researcher there, he demonstrated the first negative-index material, leading to the development of the field of metamaterials. His work was twice named among the "Top Ten Breakthroughs" by Science magazine, in 2004 and 2006, while metamaterial cloaking was selected as one of the "Top Ten Emerging Technologies" by Technology Review in 2007. Smith was named one of the "Scientific American 50" researchers and policymakers in 2006.

 

Art

More and Less

By Kristine Stiles, professor of contemporary art

ART WILL BE....2009-2034

Art will be more principled, less cynical;

more contemplative, less impetuous;

more introspective, less spectacular;

more accommodating, less belligerent;

more accountable, less self-interested;

more expressive, less conceptual;

more psychic, more and more;

more aesthetic, less kitschy;

more poetic, less theoretical;

more cosmopolitan, less nationalistic;

more local, less global.

There will be increased attention to depth of content over spectacle.

Art will be individualistic, yet less self-sufficient;

relational, yet less interpersonal;

soulful, yet not less cerebral;

techno-corporeal, yet not less phenomenological;

about animals, yet less about an animal;

environmental, yet less about nature;

more about violence, more and more;

proficient, yet less expert;

unusual and stubbornly determined, yet less skilled;

more about labor, yet less about work;

more about race, yet less about racism.

There will be increased attention to socialization and the mechanics of emotion.

Art will include more drawings; more prints;

more paintings, more collages;

more architecture, fewer sculptures;

more performance, more theatrical productions;

more installation, fewer earthworks;

more photography, video, and multimedia, more and more, more and more;

more animation and film, more and more;

more virtual reality, avatars, and hypermedia, more and more;

more archives, fewer books;

more artists, fewer curators;

more specialists, fewer public.

There will be an increase in the visualization of trauma and need.

Artists will attend more & more to human/animal interfaces;

genomics, genetics & telematics;

micro & macro technology from Nano & Endo to the Very Large Array;

neuro & cognitive sciences, the neurobiology  of vision & the engineering of visual technologies;

robotics in culture, medicine, industry & space;

exploration of the oceans & dynamic social systems related to quantum & chaos theory;

military industrial, congressional, informational, communicational complex & Paul Virilio's Pure War (1983);

surveillance & the semiotics of advertising & the media;

global economies, exchange & exploitation;

all feminisms, genders & sexual identities;

all spiritual & religious philosophies & questions of conscience & consciousness.

There will be an increase in creativity, rather than mere production.

Art will be more aesthetic, less ironical;

more on the peripheries, less in centers;

more unified and include more and more diversity;

more collaborative, less connected;

more erotic, less pornographic;

more real, less symbolic;

more sentient, less self-consciously political;

more socialist, less capitalist;

more questioning, less certain;

more sober, less earnest.

There will be an increase in all scholarship on world art.

Stiles is the author of numerous books and articles on global contemporary art, specializing in performance art, destruction, violence, and trauma. Dan Perjovschi is a Romanian artist whose work, along with that of Lia Perjovschi, his wife, was featured in a retrospective curated by Stiles at the Nasher Museum of Art in 2007-08.

 

Criminal Law

Burden of Truth

By James E. Coleman Jr., John S. Bradway Professor of the practice of law

John Balint is charged with murder. Balint claims self-defense. He testifies that he honestly thought the man he killed was threatening him with a gun. The evidence at the trial shows that the man was not armed. The issue for the jury is whether Balint honestly but also reasonably believed the dead man had a gun. In rebuttal, the prosecutor offers the results of the Brain Electrical Oscillations Signature (BEOS) test performed on Balint. In his closing argument, the prosecutor asks the jury: "So, which do you believe, Mr. Balint's buzzing brain or his lying eyes?"

In twenty-five years, this question may be unremarkable in criminal trials around the world.

In June 2008, a judge in India convicted a woman of murdering her former fiancé, based in part on evidence generated by the BEOS test. According to the neuroscientist who invented BEOS, the technology is capable of identifying, through electrical activity in the region of the brain where memories are stored, events that a person has actually experienced or conduct in which the person has engaged.

In the case of the woman in India who was convicted using BEOS, investigators hooked her to the machine through electrodes attached to her head and read aloud to her what they thought had happened on the night her former fiancé was killed. As the woman sat silently, listening as the facts were read to her, regions of her brain lit up. The judge who convicted her concluded that the test demonstrated the woman had "experiential knowledge" that only the killer would have had.

When I first heard about a person being convicted of murder using the BEOS test, I assumed the test was some type of fake lie detector concocted to obtain a confession from the woman. Stories abound about police investigators using fake lie detectors. One version of the story is that investigators place a colander on the head of their suspect with wires connected to the office copying machine. Whenever the suspect gives a "wrong" answer, one of the investigators hits the print button and the "lie detector" spits out a copy that reads: "He's lying." Eventually, the suspect confesses.

Whether BEOS turns out to be a fake lie detector remains to be seen. Although a committee of government-appointed experts concluded that the BEOS technology is "unscientific," it is gaining a foothold as an investigatory tool in several Indian states.

American researchers currently are trying to perfect brain-scan lie detectors, using electroencephalogram (EEG) and magnetic resonance imaging (MRI) technologies. One neuroscientist used "brain fingerprinting" to show that an Oklahoma death-row inmate who claimed to be innocent did not have knowledge of several key facts about the murder for which he was convicted. Ultimately, however, the court summarily rejected the uncorroborated test results as legally insufficient, and the inmate was subsequently executed.

I doubt that BEOS, brain fingerprinting, or other types of brain-scanning lie detectors will be accepted anytime soon in criminal trials in this country; the constitutional obstacles will be great, and courts generally are suspicious of lie detectors and truth serum. But an interdisciplinary group at Duke has been formed to consider the feasibility of using neurobiology to identify false and unreliable testimony. Such technology might be useful in a wrongful-conviction investigation, for example, to demonstrate with more conventional corroborating evidence that a witness who recants childhood testimony of sexual abuse is telling the truth. Or that an eyewitness whose unreliable testimony put an innocent person on death row could not have seen the homicide.

Twenty-five years from now, which are you going to believe: his buzzing brain or his lying eyes?

Coleman is director of the Duke Center for Criminal Justice and Professional Responsibility and codirector of the Duke Law Wrongful Convictions Clinic.

 

College Sports

Game Plan

By Mike Krzyzewski, head coach, men's basketball

I feel privileged to have been a part of Duke for nearly thirty years and, of course, throughout these twenty-five years of Duke Magazine. For me, and my family, Duke is home.

It is difficult to say how the next twenty-five years will change college athletics. The changes will likely be numerous, both on the national scale and within our institution, but the nature of these changes is nearly impossible to predict. What I do know is that there will be change. What I hope is that some things don't change.

So, for me, there is more meaning in discussing the things that Duke athletics will not change. In the face of the perpetual flux that occurs in most industries, and certainly in higher education, my goal for Duke would be to never change the fundamentals of who we are, of what makes Duke a great institution for learning. That means recognizing the fundamental values that make up the Duke identity and saying that, no matter how our environment changes, these are the things that will remain steadfast.

In the fall of 2006 and in the wake of great adversity at Duke, our athletics department developed a list of nine principles that define Duke athletics—the things that are essential to who we are. A crowd of 850 student-athletes, coaches, and administrators gathered in Cameron Indoor Stadium to listen as those principles were, one by one, revealed by former Duke student-athletes, alumni who represent great success stories in their respective fields. The words we selected to define us are: education, respect, integrity, diversity, sportsmanship, commitment, loyalty, accountability, and excellence. Duke athletics is all of these things. Without any one of them, we lose sight of who we collectively are.

Things change. There is no doubt about that. And neither people nor institutions can be naïve about that reality. In fact, change can be a terrific thing. It can make you better, it can help you identify limits and break through them, it can lift you to a higher level. Being adaptable allows you to take advantage of a changing environment. Duke will plan to be adaptable in many ways, but not in our dedication to the values we have identified as our foundation.

As I personally embark on my next twenty-five years, I know that my role at the university will change, too. But, for my family and me, Duke will continue to be home. And my goal for myself is the same as my goal for the future of Duke athletics, that I maintain and strengthen who I am and the standards I hold dear in the face of the many changes that will occur around me.

My hope is that everyone who has been touched and made better by Duke University and Duke athletics will insist upon giving back to our great institution. Two of the best ways to do this are to continue to live lives based on those nine principles that were ingrained in us during our time at Duke, and to remain active in life at the university. If we can make this commitment, the student-athletes of tomorrow will be reminded of what Duke has always been and, together, we will remain insistent upon what Duke will always be.

Krzyzewski has won three national championships in nearly three decades as men's basketball coach. He led Team U.S.A. to Olympic gold at the 2008 Beijing games.

 

Leadership

Adaptive Interdependence

By Alma Blount, director of the Hart Leadership Program and lecturer in public policy

Leadership is a word that holds great yearning. We wait for it. We wish for it. We study it and keep our fingers crossed that it comes our way. But good leadership in twenty-five years won't be different from what it is right now, or from what it was 2,500 years ago. It is right in front of us, ready for us to see. There is an ordinary quality to it. Leadership is ours when we choose to practice it. Though the activity of leadership may not be easy, its essence is simple and pure. Leadership is about being present.

Being present involves seeing deeply into things as they are, becoming adept at reading complex contexts, and developing an instinct for when and how to raise difficult questions. Especially when our attention is directed to public life, being present requires "inner work"—reflective space to cultivate equanimity in the midst of competing demands, and to keep distractions at bay. Inner work allows us to hone an instinct for when to go against the grain, how to pace the turbulence, and how to defy expectations in healthy ways that build adaptive capacity in groups, institutions, and social systems. This is sobering work, yet there is also great freedom in learning to take this kind of responsibility.

Who could possibly know what our circumstances will be in twenty-five years? Life is giving us constant clues right now about how we must adapt. But I do have hunches about what we may face a quarter-century from now.

I believe that the truth of our interdependence will no longer be avoidable. We will encounter increasing complexity in every direction. Daily life will continually confront us with difficult issues of diversity and value conflicts. And the principles and practices of sustainability will no longer be the purview of a few. Sustainability will be the core necessity and context of life for all of us.

So, what we will need are people who can exercise unglamorous, garden-variety leadership in multiple directions to help us reexamine our values, learn new habits, and find new ways of doing business.

Leadership can help us do these things by fostering collective learning. Skillful leadership mobilizes people to learn their way to the next stage of adaptation. The learning process itself, which is inherently improvisational and experiential, builds the self-reliance and resilience of the group. Leadership calls us to common purpose so we can create what Rabbi Jonathan Sacks calls "fellowships of joint responsibility." In dealing with differences, the leadership art is to become an excellent "border crosser," to help all of us simultaneously become more fully who we are, and at the same time, build a bigger "we."

Leadership can also help by teaching us to dive into the difficult. Leadership is about learning to work productively with value conflicts, and to see them as central resources in the problem-solving process. It requires much patience and courage. The art of leadership is cutting through complexity in order to locate what is simple, clear, integrative, and unifying.

Finally, leadership can help us find satisfaction and staying power in "just doing the work." We are all being called to practice interdependence, to commit ourselves to something far larger than our individual lives. If we accept the call and engage in the practice and pay attention, we are bound to cultivate insight along the way. And if we develop a little wisdom, it will come from the ordinary, daily quality of our common commitment. Although this leadership work is urgent and uncertain, it may bring us the hopeful joy that comes from living in "the homeland of interwoven pronouns," as Mexican poet Octavio Paz called it—the place "where I am you are us."

Blount teaches on the topics of civic participation, grassroots politics, systems analysis, and problem solving within organizations.

 

Student Life

Goodbye Books, Hello Quidditch

By Sue Wasiolek, assistant vice president for student affairs and dean of students

Student life at Duke in 2034? At first, I thought it rather difficult even to begin to imagine. But, upon reflection, I realized it should not be that great a challenge; I have been at Duke and have watched this place change for more than thirty-five years. So, as I reflect on the past, I will try to anticipate the future:

  • Duke will have won multiple bowl games and national championships. As a result, students will continue to love Duke football and basketball, especially since both David Cutcliffe and Mike Krzyzewski will still be coaching here. In addition, Duke will be the first national champion in a new coed varsity sport: Quidditch!
  • Students will have to camp out to secure a space in the parking lots for traditional tailgating before football games. Tailgating as we've come to know it will have died.
  • With a welcome shift in the economy, Duke students will finally be living in the New Campus; in fact, it will be referred to as the "Not So New Campus."
  • Student organizational meetings will take place online. There will no longer be any need to reserve classroom space.
  • Similarly, with Duke going global, many more classes will be online.
  • Wilson Recreation Center will have doubled in size and will still be overcrowded! A new rec center will exist on the New Campus, as will outdoor playing fields. The number of club sports teams will have doubled.
  • The Chronicle will still be published on a daily basis, but will be available only online.
  • There will be plenty of parking because students will no longer bring their cars to campus. Buses will not be needed, either. Instead, with the total greening of Duke, everyone will ride a bike.
  • Shooters II, a notorious off-campus nightclub, will no longer exist, as it will have lost its liquor license. In its place will be a giant smoothie bar, Quenchers II, with meditation, massage, acupuncture, and yoga.
  • Perkins and Lilly libraries will be "party locations" for semiformals and campus-wide social events. With all research being done online, books will no longer be needed.
  • GPS systems will allow all members of the Duke community to know where every other member of the Duke community is at any time. The Duke Community Standard will include a new violation: disabling your personal GPS so that you can run and hide.
  • Duke will be completely smoke-free.
  • The entire Bryan Center will be devoted to the Refectory, currently the "green" dining option at the divinity and law schools. With students more focused on health and nutrition, the Refectory will be the only eatery on campus, superseding all fast-food options—except for Pauly's and Joseph's hot dogs.
  • The campus will be as vibrant and beautiful as ever, with students studying on the quads and the Bryan Center plaza, always picking up their trash.
  • Students from the University of North Carolina at Chapel Hill will be trekking to Durham, primarily to take advantage of all the cool events and activities taking place downtown.
  • Dean Gerald Wilson will still be the prelaw adviser, preparing to celebrate his 100th birthday.

Wasiolek '76, M.H.A. '78, LL.M. '94, known on campus as "Dean Sue," is coauthor of Getting the Best Out of College, published in 2008.
