2016-06-30

The virus struck quickly and without warning. After the first case was detected in Philadelphia, reports came in from cities as far away as Rome, Kinshasa, Karachi, and Beijing. Within a year, 5 billion people were dead. Within a decade, wild animals roamed the streets of our major cities.

Science-fiction fans and movie buffs will recognize this scenario as coming from "12 Monkeys," Terry Gilliam's 1995 movie starring Bruce Willis as a time traveler sent back to our era to discover the origins of a plague that drove a human remnant underground. But Bill Joy, the chief scientist at Sun Microsystems, makes such sci-fi fears chillingly real. In an article in the April issue of Wired magazine called "Why the Future Doesn't Need Us," Joy describes a future in which "our most powerful 21st-century technologies," working singly and together, threaten the survival of the human race.

If you're tempted to write off Joy's warning as so much science fiction, bear in mind that Joy is one of the few people outside science fiction who's actually thinking about the threats and moral quandaries posed by our culture's uncritical embrace of technology.

According to Joy, robotics, genetic engineering, and nanotechnology (incredibly small machines that operate at the molecular level to, say, augment the human immune system) pose a greater threat to our survival than nuclear, biological, and chemical weapons. The heightened threat posed by these 21st-century technologies can be summed up in one word: control. Once they're deployed, it becomes difficult to control how they're used.

In the case of nanotech and robotics (not just intelligent machines that can do our work for us but also the eventual merging of human with machine), part of the difficulty is built into the system. The very quality that makes these technologies so promising--their ability to create copies of themselves without human intervention--also makes them dangerous. Any harmful by-products of these machines--say, environmental damage--can potentially be multiplied beyond our ability to respond, much like the killer virus in "12 Monkeys."

If these threats sound too distant and Star Trekkie for you to relate to, there are more mundane reasons to be concerned. As Joy points out, whereas the old weapons of mass destruction required access to hard-to-get raw materials and protected information, the new mass destruction will be within the reach of small groups and even individuals. Knowledge, not enriched uranium, will be the key to these technologies.

And if the possibility of a psycho or fanatic having his finger on the figurative button isn't enough to scare you, there's always what Joy calls the "oops! factor." "Dr. Strangelove" and "Fail Safe" notwithstanding, deploying a 20th-century weapon of mass destruction inadvertently is next to impossible. There are procedures in place to keep that from happening--aided by the fact that, until now, these weapons have been the almost exclusive possession of nation-states.

This won't be the case with knowledge-based mass destruction. Not only will robotics, genetic engineering, and nanotechnology be in private hands; it's also a lot easier to accidentally release a harmful organism into the environment than it is to set off a nuclear weapon. The image that once again comes to mind is "12 Monkeys," minus the malice of the David Morse character, who spreads the virus deliberately.

Speaking of sci-fi nightmare scenarios, Beliefnet columnist Robert Wright applauded Joy for trying to bring attention to threats posed by "microscopic things that can be inconspicuously made and transported and, once unleashed, whether intentionally or accidentally, can keep on truckin'." But, he added, by including "nanotechnology and super-robots and other far-off threats" in the mix, Joy made it possible for people "to dismiss the whole issue as sci-fi rantings."

That may be true. I don't know whether the kind of technology depicted in Joy's article or in works of science fiction will ever come to pass. Joy seems to think so, and I'll defer to him on that score. (For that matter, 60 years ago, no one anticipated communications satellites in orbit 22,300 miles above the Earth. No one, that is, except Arthur C. Clarke.)

In any case, whether or not these technologies are doable isn't really the issue. The issue is whether we think about the possible consequences of these technologies before we rush headlong to develop them. And when it comes to technology, you have to go to the movies to hear someone ask, as Jeff Goldblum's character did in "Jurassic Park," whether we were so preoccupied with whether we could do something that we never stopped to ask whether we should.

There are exceptions to this uncritical embrace. Last week, thousands of people protested at a meeting of biotech firms in Boston. And Jeremy Rifkin, the author of "The Biotech Century: Harnessing the Gene and Remaking The World," has been writing about the downside of biotechnology, including genetic engineering, for decades. But genetic engineering is the only one of Joy's technologies that regularly receives attention. And part of that attention comes from biotech's involvement with hot-button right-to-life issues, such as abortion and fetal-tissue research.

Part of the uncritical embrace of technology is ingrained, almost at the genetic level, in the American character. Americans worship at the altar of progress. We believe that being able to do something better is always a good thing. Americans are the definitive pragmatists.

In this respect, American Christians, particularly Protestant evangelicals, are no exception. Historically, they have viewed technology as morally neutral, its morality determined by how it's used. If a technology can be adapted to further their mission, they see that technology as a gift from God. Religious broadcasters, such as Paul Crouch of the Trinity Broadcasting Network, have promoted the use of direct-broadcast satellite dishes as a way to spread the message of TBN. As sociologist Grant Wacker has written, "Pentecostalism"--the fastest-growing type of evangelicalism--"seeks out the Garden of Eden equipped with a satellite dish."

This attitude toward technology, and the fact that, like most people, Christians don't understand the kind of technologies Joy is writing about, means that we aren't likely to hear from people of faith on the dangers posed by the technologies of the 21st century. And that's unfortunate because, as observers since de Tocqueville have noted, most Americans take their moral and ethical cues from religion--in particular, Protestant Christianity.

So science fiction occupies the void left by the people we usually turn to for enlightened commentary. Science-fiction writers are asking questions that no one else is. And they're hardly knee-jerk Luddites. Like Joy, they understand that the technologies they're writing about now are different--and scary. We're not talking about making better ways to entertain ourselves (like HDTV or MP3) or even better methods of communication, which are merely refinements on existing technology.

Robotics, genetic engineering, and nanotechnology promise--or threaten--to change the relationship between man and nature. If some of the people quoted by Joy are to be believed, these technologies could redefine what it means to be human. The innovations can be as seemingly innocuous as a world where genetic defects become optional, as in "Gattaca." Or, as Danny Hillis, the co-founder of Thinking Machines Corporation, imagines, a world where we'll live to the age of 200 by replacing parts of our body with silicon.

Before any of this happens, it might be nice to ask if that's what we want. And since so many otherwise thoughtful people have taken a pass on the question, we've got little choice but to head to the science-fiction section of our local bookseller or video store.
