The virus struck quickly and without warning. After the first case was detected in Philadelphia, reports came in from cities as far away as Rome, Kinshasa, Karachi, and Beijing. Within a year, 5 billion people were dead. Within a decade, wild animals roamed the streets of our major cities.

Science-fiction fans and movie buffs will recognize this scenario as coming from "12 Monkeys," Terry Gilliam's 1995 movie starring Bruce Willis as a time traveler sent back to our era to discover the origins of a plague that drove a human remnant underground. But Bill Joy, the chief scientist at Sun Microsystems, makes such sci-fi fears chillingly real. In an article in the April issue of Wired magazine called "Why the Future Doesn't Need Us," Joy describes a future in which "our most powerful 21st-century technologies," working singly and together, threaten the survival of the human race.

If you're tempted to write off Joy's warning as so much science fiction, bear in mind that Joy is one of the few people outside science fiction who's actually thinking about the threats and moral quandaries posed by our culture's uncritical embrace of technology.

According to Joy, robotics, genetic engineering, and nanotechnology (incredibly small machines that operate at the molecular level to, say, augment the human immune system) pose a greater threat to our survival than nuclear, biological, and chemical weapons. The heightened threat posed by these 21st-century technologies can be summed up in one word: control. Once they're deployed, it becomes difficult to control how they're used.

In the case of nanotech and robotics (not just intelligent machines that can do our work for us but also the eventual merging of human with machine), part of the difficulty is built into the system. The very quality that makes these technologies so promising--their ability to create copies of themselves without human intervention--also makes them dangerous. Any harmful by-products of these machines--say, environmental damage--can potentially be multiplied beyond our ability to respond, much like the killer virus in "12 Monkeys."

If these threats sound too distant and Star Trekkie for you to relate to, there are more mundane reasons to be concerned. As Joy points out, whereas the old weapons of mass destruction required access to hard-to-get raw materials and protected information, the new mass destruction will be within the reach of small groups and even individuals. Knowledge, not enriched uranium, will be the key to these technologies.

And if the possibility of a psycho or fanatic having his finger on the figurative button isn't enough to scare you, there's always what Joy calls the "oops! factor." "Dr. Strangelove" and "Fail Safe" notwithstanding, deploying a 20th-century weapon of mass destruction inadvertently is next to impossible. There are procedures in place to keep that from happening--aided by the fact that, until now, these weapons have been the almost exclusive possession of nation-states.

This won't be the case with knowledge-based mass destruction. Not only will robotics, genetic engineering, and nanotechnology be in private hands; it's also a lot easier to accidentally release a harmful organism into the environment than it is to set off a nuclear weapon. The image that once again comes to mind is "12 Monkeys," minus the malice of the David Morse character, who deliberately spreads the virus.

Speaking of sci-fi nightmare scenarios, Beliefnet columnist Robert Wright applauded Joy for trying to bring attention to threats posed by "microscopic things that can be inconspicuously made and transported and, once unleashed, whether intentionally or accidentally, can keep on truckin'." But, he added, by including "nanotechnology and super-robots and other far-off threats" in the mix, Joy made it possible for people "to dismiss the whole issue as sci-fi rantings."

That may be true. I don't know whether the kind of technology depicted in Joy's article or in works of science fiction will ever come to pass. Joy seems to think so, and I'll defer to him on that score. (For that matter, 60 years ago, no one anticipated communications satellites in orbit 22,500 miles above the Earth. No one, that is, except Arthur C. Clarke.)

In any case, whether or not these technologies are doable isn't really the issue. The issue is whether we think about the possible consequences of these technologies before we rush headlong to develop them. And when it comes to technology, you have to go to the movies to hear someone ask, as Jeff Goldblum's character did in "Jurassic Park," whether, in our rush to see if we could do something, we should stop to ask ourselves if we should.