Interview: How much do we need to know?
by Bill Joy

To keep risky information and technologies out of the hands of bioterrorists, we should price catastrophe into the cost of doing business rather than regulate it, says Bill Joy. Things judged to be dangerous would be expensive, and the most expensive would be withdrawn.


Originally published in New Scientist, June 17, 2006. Reprinted with permission on KurzweilAI.net, July 10, 2006.

Interview by Gregory T. Huang

Technology doesn't make everyone happy. Just ask computer scientist Bill Joy, who has pioneered everything from operating systems to networking software. These days the Silicon Valley guru is best known for preaching about the perils of technology with a gloom that belies his name. Joy's message is simple: limit access to information and technologies that could put unprecedented power into the hands of malign individuals (what is sometimes called asymmetric warfare). He is also translating that message into action: earlier this year, his venture-capital firm announced a $200 million initiative to fund projects in biodefence and pandemic preparedness. Gregory T. Huang caught up with Joy at the recent Technology, Entertainment and Design (TED) conference in Monterey, California.

Do you think your fears about technological abuse have been proven right since your Wired essay?

When I wrote that essay in 2000, I was very concerned about the potential for abuse. Throughout history, we dealt with individuals through the Ten Commandments, cities through individual liberty, and nation states through mutual non-aggression plus an international bargain to keep the peace. Now we face an asymmetric situation where technology is so powerful that it extends beyond nations to individuals — some with revenge on their minds. On 11 September 2001 I was living in New York City. Our company had a floor in a building that went down. I had a friend on a plane that crashed. That was a huge warning about asymmetric warfare and terrorism.

Did we learn the right lesson?

We can't give up the rule of law to fight an asymmetric threat, which is what we seem to be doing at the moment, because that is to give up what makes us a civilisation. A million-dollar act causes a billion dollars' damage and then a trillion-dollar response that makes the problem worse. September 11 was essentially a collision of early 20th-century technology: the aeroplane and the skyscraper. We don't want to see a collision of 21st-century technology.

What would that sort of collision look like?

A recent article in Science said the 1918 flu virus is too dangerous to FedEx: if you want to work on it in a lab, you reconstruct it yourself. We can do this because new technologies tend to be digital: you can download the gene sequences of pathogens from the internet. So individuals and small groups super-empowered by access to self-replicating technologies are clearly a danger. They could cause a pandemic.

Why do pandemics pose such a huge danger?

AIDS is a sort of pandemic, but it moves slowly. We don't have much experience with the fast-moving varieties. We are not very good as a society at adapting to things we don't have gut-level experience with. People don't understand the magnitude of the problem: in terms of the number of deaths, there's a factor of 1000 between a pandemic and a normal flu season. Public policy has not been constructive, and scientists continue to publish pathogen sequences, which is really quite dangerous.

Why is it so dangerous?

If, in turning AIDS into a chronic disease, making cocktails of antivirals for flu, or using systems biology to construct broad-spectrum cures for many diseases, we make the tools universally available to people of bad intent, I don't know how we will defend ourselves. We have only a certain amount of time to come to our senses and realise that some information has to be handled differently. We can reduce the risk greatly without losing much of our ability to innovate. I understand why scientists are reluctant, but it's the only ethically responsible thing to do.

So more technology is making the problem worse?

Unfortunately, yes. We need more policy.

What would that look like?

We could use the very strong force of markets. Rather than regulate things, we could price catastrophe into the cost of doing business. Right now, if you want approval for things, you go through a regulatory system. If we used insurance and actuaries to manage risk, we might have a more rational process. Things judged to be dangerous would be expensive, and the most expensive would be withdrawn. Drugs would make it to market on economic estimates of risk, not regulatory evaluations of safety. This process could also be used to make companies more liable for the environmental consequences of their products. It's both less regulation and more accountability.
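
To make the arithmetic behind this idea concrete, here is a minimal sketch. It is an illustration, not Joy's actual proposal: the premium rule (probability × damages × a loading factor), the product names and every figure in it are hypothetical.

```python
# Illustrative sketch of Joy's market-based idea: price expected
# catastrophe losses into a product rather than regulating it.
# Every name, probability and dollar figure here is hypothetical.

MAX_VIABLE_PREMIUM = 5e6  # hypothetical: above this, the product is withdrawn

def annual_premium(p_catastrophe: float, damages: float, loading: float = 2.0) -> float:
    """Actuarially fair premium (probability x damages), times a loading
    factor covering the insurer's overhead and uncertainty."""
    return p_catastrophe * damages * loading

products = {
    # name: (annual probability of catastrophic misuse, damages in dollars)
    "benchtop gene synthesiser": (1e-3, 1e11),
    "desktop sequencer": (1e-5, 1e9),
}

for name, (p, damages) in products.items():
    premium = annual_premium(p, damages)
    verdict = "withdrawn" if premium > MAX_VIABLE_PREMIUM else "viable"
    print(f"{name}: premium ${premium:,.0f}/yr -> {verdict}")
```

Under these made-up numbers the riskier product's premium ($200 million a year) dwarfs any plausible profit, so the market, not a regulator, withdraws it, while the lower-risk product remains viable.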

How are you combating the threat of pandemics?

We recently raised $200 million for biodefence and pandemic preparedness. We have started out focusing on bird flu. We need several antivirals, better surveillance, rapid diagnostics and new kinds of vaccines that can be manufactured quickly. If we fill these gaps, we can reduce the risk of a pandemic.

Do other technological advances excite you?

I have great confidence that we will extend the limits of Moore's law to give us another factor of 100 in what computer chips can do. If a computer costs $1000 today, we can have that for $10 in 2020. The challenge is: will we develop educational tools to take advantage of such devices? That's a great force for peace.
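
Joy's figures are easy to sanity-check. The sketch below assumes, hypothetically, that price/performance halves every two years, one common reading of Moore's law; over the 14 years from this 2006 interview to 2020, that compounds to roughly the factor of 100 he cites.

```python
# Sanity check of Joy's Moore's-law arithmetic (hypothetical cadence:
# price/performance halving every two years).
years = 2020 - 2006            # the horizon Joy mentions in this 2006 interview
doublings = years / 2          # ~7 doublings at a two-year cadence
factor = 2 ** doublings        # ~128x improvement
print(f"~{factor:.0f}x improvement; a $1000 machine for about ${1000 / factor:.0f}")
# Prints ~128x and ~$8 -- on the order of Joy's "factor of 100" and "$10 in 2020".
```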

Another area that gives us hope is new materials. The world's urban population is expected to more than double to 6 billion this century. We need clean water, energy and transportation. Carbon nanotubes have incredible properties, and can be applied to develop fuel cells, make clean water, or make ethanol for electric-powered transport. My company has dedicated $100 million to this.

How do you see the increasing connectedness of human societies affecting innovation?

It's diffusing ideas at an incredible rate. With communications and search tools you can find out extraordinary things: you see companies doing interesting work, and you can learn huge amounts very quickly. We can write a worldwide research briefing paper in an hour if we shut the door and unplug the telephone. That's something you couldn't do before.

What's the downside?

It's like putting a stick in a hornet's nest. We have religious and secular societies coming into contact, pre-Enlightenment values conflicting with Enlightenment values. It will be a messy process of change. Technology has brought western pop culture to the rest of the world. I'm not a fan of it, but the values it has brought to the world actually offend people in cultures that have been around for longer than my particular set of world views.

Will the human race survive the next 100 years?

We have to make it through a pandemic to understand the nature of that sort of threat. Whether we do that before we unleash the technology, I'm not sure. Either way, I don't believe we will become extinct this century, though we could make a pretty big mess. I hope we can do some sensible things. It is not enough to do great science and technology; we also need sensible policy. We still think that if we find true things and publish them, good things happen. We should not be that naive.

If you could ask the god of technology one question, what would it be?

It seems that a perfect immune system is a disadvantage. If you are perfectly immune, you cannot evolve: a lot of evolution occurs because of selective pressure that a perfect immune system would prevent. That would leave the abusers of biotechnology with an advantage over the defenders, because society needs to be vulnerable in order to evolve. My question is: is that true? If it is, it would prove that we had better limit access to some information. It would mean not only that we cannot make a perfect immune system, but that building one would be a bad idea.

© 2006 New Scientist

   
 
