Bruce Schneier: Beyond Fear

Doug Kaye: Hello and welcome to IT Conversations, a series of interviews with experts in today's hot topics in information technology. I am your host, Doug Kaye, and my guest today is Bruce Schneier, founder and CTO of Counterpane Internet Security and an internationally renowned security expert and author. Described by The Economist as a security guru, Bruce is well known as a security critic and commentator, an opinion I'm sure you will agree with shortly.

His first best-selling book, "Applied Cryptography," explained how the arcane science of secret codes actually works and was described by Wired as "the book the National Security Agency wanted never to be published." His book on computer and network security, "Secrets and Lies," was called by Fortune "a jewel box of little surprises you can actually use." But with his latest book, "Beyond Fear: Thinking Sensibly about Security in an Uncertain World," Bruce has crossed over into the mainstream of personal, corporate and national security.

Hello, Bruce! Thanks for joining me today!

Bruce Schneier: Well, thanks for having me!

Doug Kaye: I want to get into the issues of the latest book, but many of our listeners know you as a crypto and network security guy, and they submitted dozens of questions they'd like me to ask you. I'd like to ask just two of these before we get into the new book. Paul Bissex in Massachusetts asked, "What do you think is the role of cryptography and authentication in fighting spam and viruses?" In particular, he asked what you think about recent developments such as SPF, Yahoo's Domain Keys and Microsoft's proposal for a CPU-intensive puzzle-solving algorithm as a possible solution for spam.

Bruce Schneier: You know, spam is a tough problem, and it's really an economic problem. Authentication doesn't do any good because a lot of spam these days is being sent from stolen accounts. We already have blacklists that block spamming accounts, so spammers have learned they have to hack into the computers of regular people and send spam from there. So an authentication system will only authenticate who the victim is, the victim who has been hacked. Puzzles I am more optimistic about. They try to attack the economic problem: can we make it economically nonviable to send spam? But again, you've got the hacked-computer problem, right? If I'm a spammer and I am going to hack into one thousand or ten thousand computers and have them send the spam, then the puzzles aren't going to do much good. It's a tough issue. It's a tough issue because the economics are such that sending spam makes sense; that's the unfortunate reality. If we all stopped buying, if we all ignored it, then it would go away, but as amazing as it might seem to the listeners of this, there are people who respond to spam ads, and that makes it a valid marketing tool because the costs of spam are so low.
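[Editor's note: the CPU-intensive puzzle idea discussed above is what is now usually called proof-of-work, best known from Adam Back's Hashcash: the sender must find a value whose hash has a required number of leading zero bits, which is expensive to compute but nearly free to verify. A minimal sketch in Python follows; the difficulty level and message format here are illustrative assumptions, not the details of Microsoft's actual proposal.]

```python
import hashlib
from itertools import count

def solve_puzzle(message: str, difficulty_bits: int = 20) -> int:
    """Find a nonce so that SHA-256(message + nonce) falls below a target,
    i.e. starts with `difficulty_bits` zero bits. Costly for the sender."""
    target = 1 << (256 - difficulty_bits)  # hashes below this value qualify
    for nonce in count():
        digest = hashlib.sha256(f"{message}:{nonce}".encode()).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce

def verify_puzzle(message: str, nonce: int, difficulty_bits: int = 20) -> bool:
    """A single hash to check -- the work asymmetry is the whole point."""
    target = 1 << (256 - difficulty_bits)
    digest = hashlib.sha256(f"{message}:{nonce}".encode()).digest()
    return int.from_bytes(digest, "big") < target
```

At 20 bits a sender pays roughly a million hash attempts per message, negligible for one email but ruinous at spam volume. Note that this sketch also illustrates Schneier's objection: a spammer running it on ten thousand hacked machines pays the cost with other people's CPUs.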

Doug Kaye: Do you see any technological solution on the horizon that might be an answer for those of us who don't want to respond?

Bruce Schneier: I don't get much spam. I use a service called Postini -- I'm not affiliated with them in any way, I'm just a satisfied customer -- and they're really good at blocking spam; I get very little. Those are going to be the sorts of solutions.

The spam blockers are getting better; spam is getting better, but so are the blockers. It's an arms race, and I think we're doing pretty well. I'm happy with the amount of spam I get, which is almost none. For people who don't use spam blockers, it's a problem. I don't think there's a solution to spam. I think there are a lot of things we can do and are doing, and it's never going to go away until the economics change. You don't get as much spam in physical mail because postage stamps are expensive, but as postage gets cheaper, you see more of it. We are seeing more telephone spam because the price of telephone calls has been dropping considerably over the past decade, and we're seeing the same thing in email. So I see it as an economic problem, not one that is going to go away with technology.

Doug Kaye: Another listener, in the UK, asked, "What do you think are the strengths and weaknesses of biometrics as an authentication technology?" That's obviously a broad question, but…

Bruce Schneier: It's a good question, and I like biometrics as authentication. Biometrics has got a lot of press, at least in the United States after 9/11, as a counter-terrorist tool. If we could just pick terrorists out of crowds at airports, we could catch them and stop their nefarious plans. That I'm very skeptical about. But for authentication I think biometrics is good, and it's important for listeners to understand the difference. Some systems use biometrics as an identification tool: "Can we scan the crowd and identify the bad guys?" That tends not to work. Biometrics works very well as an authentication tool: type your user name into a computer, put your thumbprint down, and that authenticates you. They are very different applications. Biometrics has a really good place as an authentication tool.
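[Editor's note: the identification-versus-authentication distinction above comes down to base rates. Even a very accurate scanner, pointed at a crowd in which the people being searched for are extremely rare, produces mostly false alarms. A quick back-of-the-envelope calculation makes the point; the accuracy figures, passenger count and watch-list size below are illustrative assumptions, not data from the interview.]

```python
# Suppose a face scanner is 99.9% accurate both ways: it flags a
# watch-listed person 99.9% of the time (sensitivity) and mistakenly
# flags an innocent person only 0.1% of the time (false positive rate).
sensitivity = 0.999
false_positive_rate = 0.001

# An airport screens 1,000,000 passengers, 10 of whom are on the list.
passengers = 1_000_000
actually_listed = 10

true_alarms = actually_listed * sensitivity                           # ~10
false_alarms = (passengers - actually_listed) * false_positive_rate  # ~1000

precision = true_alarms / (true_alarms + false_alarms)
print(f"Alarms that are real: {precision:.1%}")  # roughly 1 in 100
```

Authentication flips the base rate: the system compares one claimed identity against one stored template, so the same error rates are perfectly usable, which is why a thumbprint plus a user name works well.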

I have seen biometric access controls for computers that work very well. At my company, Counterpane, we use biometric access controls for our secure operations centers; actually, we use hand geometry, we use a password and then we use a physical token. So we're mixing up different authentication techniques, and I am really in favor of that. A biometric is something that's hard to lose. Passwords are easy to forget, easy to write down. Physical tokens are easy to lose. Your thumbprint, you know, you don't lose it very often!

Doug Kaye: Ten years ago you wrote a book entitled "Protect Your Macintosh."

Bruce Schneier: You're probably the only reader, you know.

Doug Kaye: Yeah, well, it's part of my job! And your career seems to have taken you from cryptography to network security to, let's call it, big-picture security now. What came before all of that? How did you get involved with crypto in the first place?

Bruce Schneier: I've always believed that security is a mindset, and you're right, my career has been an endless series of generalizations, because I think they're all…all of a piece. They are very similar. I think people who are good at security look around the world as they wander through their day and see security systems and see ways to subvert them. In a sense, they're hackers in the truest sense. "How does this system work?" "How can I use it?" "How can I abuse it?" Whether they're walking into a store and looking at the cameras, walking into a voting booth or just wandering through the streets. They don't act on these ideas, but they certainly never stop thinking about them. I have done that forever, and I have also been interested in mathematics, and in cryptography since I was a kid.

Doug Kaye: I spoke recently with Phil Zimmermann of PGP, and he had a similar background. He even remembers a specific book he read as a kid about secret codes and invisible writing with lemon juice and things like that.

Bruce Schneier: Oh, sure, I read that same book! Back when we were kids, there were only, like, three or four of those books, so after you had read them, you were done. But, yeah, a lot of those books sparked our imaginations as kids.

One of the things I'd like to do at some time in my career is write a kids' book on cryptography and secret codes because I think that is important. I think kids need to be excited about the possibilities of privacy, secrecy and codes, and they are inherently interesting.

Doug Kaye: On the other hand, I also spoke recently with Dan Geer, who was lamenting the fact that we're about to get to a point where security experts are people who are actually trained in security from day one, they get a degree in it and so forth, and they may not have some of the breadth of background that the early pioneers of security technology have.

Bruce Schneier: I don't know, I'm less pessimistic. I really do believe that security is a mindset and that people with the mindset will have the breadth, because you can't help it. They'll get training in their specific area of expertise, but they will have the breadth of experience, the breadth of analysis, the breadth of the way the world works. I think there are security experts who are just trained, and they're not very good, and I meet a lot of them all the time, but you can tell when someone has passion, when someone truly understands how security works. He might not know the math. He might not know how the operating system works. He might not know about locksmithing and how locks work or how voting machines work, but he gets security. And those sorts of people, I think, we will have in our field forever, and that's what makes the field great and that's what makes the field grow. So Dan is right; the majority of people will just be practitioners who have been trained and who don't have passion, but that's probably true in every industry.

Doug Kaye: You had a perfectly good reputation as a network security expert. You had grown Counterpane into a well-respected business. Why did you decide to write "Beyond Fear," a book that appears, at least, to go far beyond the scope of Counterpane's typical business?

Bruce Schneier: I really wrote "Beyond Fear" because we're living in a silly security season in our country. We're seeing so much nonsense after September 11th, and so many people saying things about security and about terrorism that just make no sense, so I wanted to contribute to the debate.
I wanted to write a book that people can read and then understand security.

They don't have to agree with the conclusions; one of the things I say in the book is that security is personal, that there often are no answers. But I wanted people to at least understand how to ask questions, how to look at a security system, how to evaluate it, because we're being asked to take our shoes off at airports. We might be asked to live with a national ID card. We're being asked to support invasions of foreign countries. We're being asked to support all sorts of domestic and foreign policy in the name of security, and I'd like people to ask, "Does this make sense? Should we do this?"

Doug Kaye: The book is filled with an amazing array of facts and examples. Are these examples that you've been collecting over the course of your career, or did you find most of them just as you decided to sit down and research and write this book?

Bruce Schneier: It's both. A lot of the examples I've been collecting. Even when I was writing about computer security, if you read "Secrets and Lies," I tended to write in analogies; I tended to explain computer security using noncomputer analogies because people got them better. So, once you start working in analogies, you do wander around the world and you see security everywhere, so I like using those examples.

So when I started writing the book, I used those examples, and one of the things I do when I write books is I send drafts to people, hundreds of people read drafts of my book in various stages. So they started suggesting examples, and then I would research them.

So the process fed on itself, and the good ones would stay and the bad ones would go away, or I would find better examples, so I have this treasure trove of security examples that illustrate different points.

Doug Kaye: I'm going to read some of my favorites of those examples.

Bruce Schneier: Oh, excellent!

Doug Kaye: …as a way to just get you to respond and flesh them out and explain what they mean.

First, one that sounds at least like something the geeks in the audience might understand, and that is, "Complex systems have even more security problems when they are nonsequential and tightly coupled." What do you mean by that?

Bruce Schneier: That's a phrase coming out of systems theory, and it's fascinating. The book that I'd recommend people read is called "Normal Accidents" by a professor named Charles Perrow, and he talks about failures, not security failures, he talks mostly about accidents. But what he points out is that systems tend to fail catastrophically when they have these characteristics.

So a simple system would be like a row of dominos; you push one down and they all fall. That's sequential. It fails predictably.

A nonsequential system is one where there are feedback loops and a small failure can get bigger. You think of a nuclear power plant, where a little failure here…or actually an even better example is the blackout from last August on the East Coast. It was a small failure at one power station in Michigan that dropped a lot of the Northeast, because the system was nonlinear, because one cause could have many effects, which could feed back on the cause, which could have more effects.

And tight coupling is similar. Tight coupling means that stuff happens very, very fast. If something is loosely coupled, think of the airlines as a great example. There are nonlinearities going on, but it is loosely coupled, so a delay in Seattle might cause a delay in Chicago, which might cause a delay on the East Coast, but it doesn't cause planes to crash into each other. Because there is loose coupling, you can mitigate the problems. You've been on the ground when you've been told, "There is a hold in…oh…Chicago O'Hare, and we're not going to take off yet." If the system were tightly coupled, you couldn't do those mitigating things.

Doug Kaye: You'd run out of fuel airborne.

Bruce Schneier: Right, which is what used to happen. You used to circle a lot more, but the system got more loosely coupled.

When I look at security failures, these are the sorts of things that I look at in systems: systems that are nonlinear and tightly coupled will tend to fail worse. They might not be more insecure at the front end, but an attacker could do more damage by attacking them because of these characteristics. Blowing up a power station might plunge ten thousand people into darkness, but in a tightly coupled nonlinear system, you could get tens of millions of people, hundreds of millions of people, in a blackout, and that's the difference.

Doug Kaye: Sticking with this theme about the manner in which systems fail, a couple of short quotes here. You said, in reference to 9/11 and some other situations, that "the systems didn't fail in the way the designers expected," and along with that the concept that "attackers exploit the rarity of failures." I think they are two related concepts.

Bruce Schneier: They are, and when you think about failures, you always have to think about what you are not thinking about, in that paradoxical way. One of those things is rarity. If a system almost never fails, then when it does fail, no one will believe it.

We have all had the experience of calling up a bank or utility company complaining about a bill and being told that "the computer never makes mistakes," right? Well, the computer does make mistakes. It just makes them rarely enough that no one knows what to do when the mistakes happen.

The power failure in August was an example of that. There were computer failures that the people in the control room did not notice because they are so rare. And an attacker can exploit that because people aren't ready for it. One of the many unfortunate horrors of September 11th is that many people didn't know to evacuate the buildings because no one believed they would collapse. Now, that might have been a perfectly reasonable belief. I don't know enough about structural engineering to say one way or another, but the fact that nobody believed it meant the death toll was that much higher. If a building failed every week, everyone would know; you'd get out of the building. But it never happened before, so people didn't know.

Doug Kaye: Now, a recurring concept in your book is probably typified by this example: "A terrorist who wants to create havoc will not be deterred by airline security; he will simply switch to another attack and bomb a shopping mall."

Bruce Schneier: This is, I think, really important. I just did a hearing two days ago on Capitol Hill about CAPPS II, about airline profiling, and one of the things I'm always struck by is how good we are at defending against what the terrorists did last year. We're spending a lot of money shoring up our airlines; we're now talking about shoring up trains. And money that we spend that simply causes the bad guys to change their tactics is money wasted.

You have a red and a blue door, and the terrorists go through the red door, and you say, "We must secure the red door," so they go through the blue door the next time. What did you actually buy?

And one of my fears is that we spend lots and lots of money securing the airlines, and the terrorists move to the shopping malls or movie theaters or crowded restaurants or any of the things they do in Israel; that there are just so many targets that taking the target the terrorist happened to pick last year and securing it just sort of ignores the real problem.

Doug Kaye: You also described what happened as a result of us building stronger and stronger bank vaults that were ultimately more impervious to dynamite.

Bruce Schneier: Right, and it's actually a great example! As bank vaults became impervious to dynamite, the preferred tactics switched to kidnapping bank presidents who knew the combination.

Doug Kaye: Not something we anticipated.

Bruce Schneier: No, and we saw this in South Africa. As cars had more and more antitheft provisions that made them harder to steal, car thieves turned to carjacking, a much more dangerous crime for the victim.

So we have again an example of security causing the attackers to change their tactics. And it might be a good thing, because the number of carjackings is going to be less than the number of car thefts, but it might not be, and we have to remember that there is a feedback loop between the attacker and the defender: once an attacker does an attack and a defender modifies his defenses, the attacker will then modify his attack, and back and forth.

Going back to the bank vault analogy, after bank robbers started kidnapping bank presidents and bank tellers, the vault companies invented time locks, a lock that the bank president couldn't open even if you put a gun to his head. And the point of those time locks was to save the lives of the bank executives and the bank executives' families, because all the bank robbers knew that kidnapping the bank president or his wife or children wouldn't help because of the time lock. So again, a defensive tactic evolved to respond to an attacker tactic, which evolved to respond to a defensive tactic.

Doug Kaye: This is another fundamental principle that I learned from this book, which is that, in a sense, that is a losing game. If we attempt to anticipate every possible attack, we're going to be virtually unsuccessful, and in fact we're much better off trying to ferret out the attacker and keep him from participating in virtually any attack.

Bruce Schneier: Yeah, I'm often asked by the press, by legislators, "What do you recommend?" because I spend a lot of time saying, "This won't work, that won't work, or this won't work." And they always ask, "Well, what will work? What should we do?" And I think we should spend money at either end of the spectrum. The money we spend on intelligence, investigation, going after the terrorists will work regardless of what their next plan is. Defending the airlines only works if their next plan involves airlines, but going after the terrorists works regardless of what their next plan is, and I think that's important. So I am very much in favor of those sorts of things: hiring linguists at the FBI, getting investigative teams in place, tracking terrorist funding, interdicting communications; that stuff also works.

And then I also advocate stuff at the back end, emergency response, first response, whether it is police, fire, medical, spending money there helps us, again, regardless of what the terrorist plans are.

We can never anticipate the terrorist plans. Inasmuch as we do, they will change them, so by definition you can't anticipate them. We can't all say, "My God! They're going to attack the rail system! Let's secure the rail system," because then they'll decide to attack something else.

Doug Kaye: So what are some examples of the expenditures that the United States is now making that you think aren't well justified?

Bruce Schneier: You know, generally all of the big-budget IT systems. CAPPS II, the airline profiling system, is a complete waste of money. I think fingerprinting foreigners at the border, if you actually sit down and think about it, makes absolutely zero sense.

My complaints are largely about the big-ticket items: TIA (Total Information Awareness), where we are going to get information on every American and try to pick terrorists out of the crowds, and the face-scanning-at-airports program. This kind of stuff tends to be very, very expensive and doesn't produce useful results.

Doug Kaye: In a recent Newsweek interview, you are quoted as saying that homeland security measures are an enormous waste of money. Now, is that the category that you consider homeland security?

Bruce Schneier: Yeah, that's the category. It's trying to secure the targets. There are just too many targets, and if you sit down and count up all the places where 100 or more people gather in close proximity, like restaurants, movie theaters, sporting events, schools, trains, buses, crowded intersections, you rapidly realize that there are an enormous number of them in this country and you can't possibly secure them. Remember, the most amazing thing to me about the airline security measures in the months after 9/11 was the enormous lines. So here we are trying to make our airports safe, yet we bunch people into these huge crowds before security.

Doug Kaye: Making them essentially targets?

Bruce Schneier: Making them essentially targets! Now, there wasn't anything. Another thing we have to remember, which is very hard to remember in our sort of fear-laden society, is that terrorism hardly ever happens. Very often I hear people from the administration saying, "Our policies are working because in the two-and-a-half years since 9/11, nothing else has happened," and I think about it and say, "Well, nothing happened in the two-and-a-half years before 9/11 either. You did not have any policies. What does that prove?" What it proves is that terrorist attacks are very, very rare and that we're spending a lot of money on something that hardly ever happens. Now, we can decide to do that. We as a nation tend to worry about spectacular and rare events rather than common events, like spouse abuse, automobile crashes, things that kill lots and lots of people every year, and we tend to focus on the spectacular and rare, but we should realize we're doing that.

Doug Kaye: Here's my favorite quote in the whole book, and I know that you probably know which one it is!

Bruce Schneier: Actually, I don't! I can't wait!

Doug Kaye: That "more people are killed every year by pigs than by sharks, which shows just how good we are at evaluating risks."

Bruce Schneier: That was actually a fun quote. I actually went to the government web site, which has death statistics from various things. You can see how many people die from lightning, from heart disease, from anything, and the results are surprising. People tend to worry about the wrong things.

We worry about what's in the news. I tell my friends that if it's in the newspaper, don't worry about it, because it means it hardly ever happens. It's news. News hardly ever happens; that's why it's news! When something stops being in the newspaper, then worry about it.

Doug Kaye: There's a huge emotional component to all of this, and you address this somewhat in the book. You say, for example, probably an important quote, that "people make bad security tradeoffs when they are scared."

Bruce Schneier: Right, and the emotional part really can't be belittled. It's important. As a security professional I often pooh-pooh it, but that's wrong, and I have sort of learned not to do it. The reason we're spending money on terrorism, which has killed nobody in the past two-and-a-half years, and not on automobile crashes, which have killed 40,000 people in each of the past dozen or so years, is an emotional reason. We are emotionally scared; this emotionally worries us more.

And that's important, because security is a reality and a feeling. I try to talk about this in the book. I'm not a psychologist, and it's getting away from my area of expertise, but it's important not to belittle it, not to forget it. The example I always like to think about is to imagine that you're living in the DC area during the weeks when the snipers were at large. Now, if you run the math, they doubled the murder rate in the counties they were operating in during the weeks they were at large. You would probably have been in more danger moving into the Washington DC inner city than you were where you were when the sniper was about. Yet the reaction was completely out of proportion. School principals canceled outdoor events; people were afraid to leave their homes. So you can imagine that you're living there, and your daughter is afraid to walk to school because of the sniper, and you can sit her down with graphs and charts and explain to her the mathematics and why she is safe, or you can just walk her to school. Honestly, if walking her to school makes her feel better, it's the right thing to do, even though it might not make sense from a security perspective. My hope is that through my writings and my speaking, people can be educated to get beyond their fears, but you can't belittle them.

Doug Kaye: Now, you can take this pragmatism quite a long distance, and as you do that, I find that I start to question it myself. Here's an example that I got from something you wrote in Wired magazine. I'll just paraphrase this. You wrote that 2,978 people in the US died from terrorism in 2001, but 3,454 -- that's more people -- died from malnutrition in that same year. Now, obviously we could have saved all of those who died of malnutrition for a whole lot less money than what we are spending to avoid a repeat of 9/11, but as a society, we're about as far from acting that way as I could possibly imagine. How rationally should we be thinking?

Bruce Schneier: I think the more rationally we think, the better we're going to spend our money. I view myself as a consumer of security: "How much am I getting versus what am I giving up?" And it doesn't work this way. When Congress is spending the $100 to $200 billion to invade Iraq, you cannot say to them, "Well, let's not invade Iraq and let's cure malnutrition instead." They won't do that. So it really doesn't work that way, but it should. We are spending $200 billion -- that's the number I hear as the estimate when all is said and done -- to invade Iraq. Are we $200 billion safer because of it? That's the real question to ask. Not "Was it a good idea?" not "How bad was Saddam Hussein?" but "As security consumers, did we get our money's worth?" I want to look at malnutrition, at automobile safety, at spouse abuse the same way. How many lives can I save? How many lives can I improve? How much can I better society for the money? Because in the end, I only have so much money. I am a consumer. I have to make buying decisions.

Do I want to buy more nutritional food or do I want to go out to the movies? I have to make that choice; I only have one paycheck. I'd like it if we on the national scale could think the same way. I don't think it's possible, but I'd like it if we could.

Doug Kaye: Now, we all make these personal decisions that you described, but on a national scale we are making much larger decisions, at least larger in terms of budgets, and the political system to me is very much an amplifier of that individual fear-based system.

Bruce Schneier: Yeah, I agree, 100%.

Doug Kaye: So there's a positive feedback loop there, unfortunately, where we feel a small amount of fear, and that translates into a very large expenditure of money based upon the fears of individuals.

Bruce Schneier: Right.

Doug Kaye: Is there a solution to that?

Bruce Schneier: Probably not. Unfortunately, I think this is the way politics works, because if you're a politician, you have to appear strong. You can't be weak on terrorism, even if it makes sense.

Doug Kaye: How can we help our political leaders make strong, but rational, decisions?

Bruce Schneier: I write, I speak; that's what I do. I try to educate. I spend a lot of time on Capitol Hill talking to people and a lot of time writing my book. I give copies out. I write op-ed pieces, so that's what I try to do. I think education is the way to do it. It's not a fast solution and it might not work, but it's, I think, our only hope.

Doug Kaye: Another quote, something that you wrote relative to this topic, was "When the US government says that security against terrorism is worth curtailing individual civil liberties, it is because the cost of that decision is not borne by those making it." Talk about that.

Bruce Schneier: Yeah, I spent a lot of time on that in some of my writings; I've done op-ed pieces on that. Security is multifaceted. There are many, many different threats in many areas, and right now we are focusing on the terrorist threat and ignoring other threats.

As a general rule, the people who make security decisions make security decisions that are rational to them. Security is a tradeoff, and they are going to make a personal tradeoff. So when you see a lot of these intrusive government systems, it's because the people making the decisions aren't the ones being intruded upon; they are the ones doing the intruding. So their tradeoff is inherently different than yours or mine might be.

Doug Kaye: But, on the other hand, there is also the inappropriate political influence. I love this quote, too: "Did you ever wonder why tweezers were confiscated at security checkpoints, but matches and cigarette lighters, actual combustible materials, were not? If the tweezers lobby had more power, I am sure they would have been allowed on board as well." That's a great one!

Bruce Schneier: And it's true. The government wanted to ban laptops, but the airlines said, "No, you can't do that. Our business travelers will leave us." Security decisions as a general rule are subservient to other decisions, and I think that's as it should be; I don't think that's bad. Security is rarely a driver. It is much more often subservient.

Doug Kaye: What is your impression so far of the 9/11 hearings that are being conducted in Washington?

Bruce Schneier: There's not a lot of news; I mean, most of the stuff that is being said was said on September 12th. Yeah, there's a lot of finger-pointing, and certainly there were failures of intelligence. There were failures of communication. I am not convinced they were massive failures of intelligence.

This is one event, it's a singular event, and it's very hard to make generalizations from it. And sure, a bunch of things went wrong, but if they went right, other things would have gone wrong.

I personally feel that 9/11 was, more than anything else, a very, very horrible coincidence, that things just failed in exactly the right way. And there were many opportunities for things to go right, and if they had, we would have patted ourselves on the back and said it was a success even though the exact same stuff was in place.

It's important to realize that neither success nor failure necessarily shows a systemic problem. This is just a singular event.

Doug Kaye: Well, Bruce, I want to thank you for being with me today and I want to encourage everyone to go out and, if you haven't already, get a copy of "Beyond Fear" and read it. It really is one of the best books to come out in the last year or so.

Bruce, thank you very much!

Bruce Schneier: Thanks for having me; this was fun!

Doug Kaye: And thanks to all of you for listening to IT Conservations. This edition was recorded on April 16, 2004. My guest has been Bruce Schneier. You will find him all over the web, but a good staring point is probably his weblog, which you can find at www.schneier.com; actually, that's his home page. My name is Doug Kaye, and I hope you'll join me the next time for another edition of IT Conversations. This interview and many others are provided by ITConversations, please visit their website at http://www.itconversations.com

Learn languages from TV shows, movies, news, articles and more! Try LingQ for FREE
Bruce Schneier: Beyond Fear

Doug Kaye: Hello and welcome to IT Conversations, a series of interviews with experts in today's hot topics in information technology. I am your host, Doug Kaye, and my guest today is Bruce Schneier, founder and CTO of Counterpane Internet Security and an internationally renowned security expert and author. Described by The Economist as a security guru, Bruce is well known as a security critic and commentator, an opinion I'm sure you will agree with shortly.

His first best selling book, "Applied Cryptography" explained how the arcane science of secret codes actually works and was described by Wired as, "the book the National Security Agency wanted never to be published." His book on computer and network security called "Secrets and Lies" was called by Fortune, "a jewel box of little surprises you can actually use." But with this latest book, "Beyond Fear: Thinking Sensibly about Security in an Uncertain World," Bruce has crossed over into the mainstream of personal, corporate and national security.

Hello Bruce! Thanks for joining me today!

Bruce Schneier: Well, thanks for having me!

Doug Kaye: I want to get into the issues of the latest book, but many of our listeners know you as a crypto and network security guy, and they submitted dozens of questions they'd like me to ask you. I'd like to just ask you two of these before we get into the new book.

Paul Bissex in Massachusetts asked, "What do you think is the role of cryptography and authentication in fighting spam and viruses?" In particular, he asked what you think about recent developments such as SPF, Yahoo's DomainKeys and Microsoft's proposal for a CPU-intensive puzzle-solving algorithm as a possible solution for spam.

Bruce Schneier: You know, spam is a tough problem, and it's really an economic problem. Authentication doesn't do any good because a lot of spam these days is being sent from stolen accounts. We already have blacklists that block spamming accounts, so spammers have learned they have to hack into the computers of regular people and send spam from there. So an authentication system will only authenticate who the victim is, the victim who has been hacked. Puzzles I am more optimistic about. They try to attack the economic problem: can we make it economically nonviable to send spam? But again, you've got the hacked-computer problem, right? If I'm a spammer and I am going to hack into one thousand or ten thousand computers and have them send the spam, then the puzzles aren't going to do much good.
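The CPU-intensive puzzle idea works roughly like Hashcash: the sender must find a nonce such that the hash of the message stamp plus the nonce has a run of leading zero bits, which costs many hash attempts, while the receiver verifies with a single hash. A minimal illustrative sketch (the function names and parameters are mine, not from any specific proposal):

```python
import hashlib
from itertools import count

def mint(stamp: str, bits: int = 16) -> int:
    """Sender's cost: try nonces until sha256(stamp:nonce) has `bits`
    leading zero bits -- about 2**bits hash attempts on average."""
    for nonce in count():
        digest = hashlib.sha256(f"{stamp}:{nonce}".encode()).digest()
        if int.from_bytes(digest, "big") >> (256 - bits) == 0:
            return nonce

def verify(stamp: str, nonce: int, bits: int = 16) -> bool:
    """Receiver's cost: a single hash."""
    digest = hashlib.sha256(f"{stamp}:{nonce}".encode()).digest()
    return int.from_bytes(digest, "big") >> (256 - bits) == 0

# A legitimate sender pays a fraction of a second per message; a spammer
# sending millions pays millions of times that -- unless, as Schneier
# points out, the work is done on hacked machines the spammer doesn't pay for.
nonce = mint("to:alice@example.com")
assert verify("to:alice@example.com", nonce)
```

Raising `bits` makes minting exponentially more expensive while verification stays constant, which is exactly the economic asymmetry the puzzle approach is after.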

It's a tough issue. It's a tough issue because the economics are such that sending spam makes sense; that's the unfortunate reality. If we all stop buying, if we all ignored it, then it would go away, but as amazing as it might seem to the listeners of this, there are people who respond to spam ads, and that makes it a valid marketing tool because the costs of spam are so low.

Doug Kaye: Do you see any technological solution on the horizon that might be an answer for those of us who don't want to respond?

Bruce Schneier: I don't get much spam. I use a service called Postini -- I'm not affiliated with them in any way, I'm just a satisfied customer -- and they're really good at blocking spam; I get very little. Those are going to be the sorts of solutions.

The spam blockers are getting better; spam is getting better, but so are blockers, and it's an arms race, and I think we're doing pretty well. I'm happy with the amount of spam I get, which is almost none. For people who don't use spam blockers, it's a problem.

I don't think there's a solution to spam. I think there are a lot of things we can do and are doing, and it's never going to go away until the economics change. You don't get as much spam in physical mail because postage stamps are expensive, but as postage gets cheaper, you see more of it.

We are seeing more telephone spam because the price of the telephone calls has been dropping considerably over the past decade, and we're seeing the same thing in email.

So I see it as an economic problem, not one that is going to go away with technology.

Doug Kaye: Another listener in the UK asked, "What do you think are the strengths and weaknesses of biometrics as an authentication technology?" That's obviously a broad question, but.

Bruce Schneier: It's a good question, and I like biometrics as authentication. Biometrics has got a lot of press, at least in the United States after 9/11, as a counter-terrorist tool. If we could just pick terrorists out of crowds at airports, we could catch them and stop their nefarious plans. That I'm very skeptical about. But for authentication I think biometrics is good, and it's important for listeners to understand the difference.

Some systems use biometrics as an identification tool. "Can we scan the crowd and identify the bad guys?" That tends not to work. Biometrics work very well as an authentication tool. Type your user name into a computer, put your thumbprint down, and that authenticates you. They are very different applications. Biometrics has a really good place as an authentication tool.
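The gap between the two applications comes down to base rates. A back-of-the-envelope calculation makes it concrete (the numbers here are mine, chosen only for scale; they are not from the interview):

```python
# One-to-many identification: scan every traveler against a watch list.
false_positive_rate = 0.001   # a very generous 99.9%-accurate scanner
travelers_per_year = 10_000_000
actual_terrorists = 10        # assume even this many pass through

false_alarms = (travelers_per_year - actual_terrorists) * false_positive_rate
true_hits = actual_terrorists * (1 - false_positive_rate)

# Roughly 10,000 false alarms for every ~10 genuine hits: almost every
# alarm is wrong, so the people staffing the system learn to ignore it.
assert false_alarms > 100 * true_hits

# One-to-one authentication is different: "is this thumbprint Alice's?"
# is a single comparison per claim, so the same error rate produces about
# one false rejection or acceptance per thousand logins -- manageable.
```

Because the bad guys are so rare in the scanned population, even a highly accurate identifier is swamped by false positives, while the same technology is perfectly serviceable for verifying a claimed identity.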

I have seen biometric access controls for computers that work very well. At my company, Counterpane, we use biometric access controls for our secure operation centers; actually, we use hand geometry, we use a password and then we use a physical token. So we're mixing up different authentication techniques, and I am really in favor of that. It's something that's hard to lose. Passwords are easy to forget, easy to write down. Physical tokens are easy to lose. Your thumbprint, you know, you don't lose it very often!
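Why mix factors? If defeating each factor is roughly independent of defeating the others, an attacker's odds multiply together. A toy calculation (the per-factor probabilities are invented purely for illustration):

```python
from math import prod

# Illustrative (made-up) probabilities that an attacker defeats each
# factor on a given attempt: something you are, know, and have.
p_defeat = {"hand_geometry": 0.01, "password": 0.05, "token": 0.001}

# If the factors fail independently, the attacker must beat all three:
p_all_three = prod(p_defeat.values())          # 5e-07
assert p_all_three < min(p_defeat.values())

# The mix also covers different loss modes: a forgotten password or a
# lost token alone no longer equals a breach.
```

The independence assumption is the crux; factors that fail together (say, a password written on the back of the token) don't multiply this way.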

Doug Kaye: Ten years ago you wrote a book entitled, "Protect your Macintosh."

Bruce Schneier: You're probably the only reader, you know.

Doug Kaye: Yeah, well it's part of my job! And your career seems to have taken you from cryptography to network security to, let's call it, big-picture security now. What came before all of that? How did you get involved with crypto in the first place?

Bruce Schneier: I've always believed that security is a mindset, and you're right, my career has been an endless series of generalizations because I think they're all…all of a piece. They are very similar. I think people who are good at security look around the world as they wander through their day and see security systems and see ways to subvert them. In a sense, they're hackers in the truest sense. "How does this system work?" "How can I use it?" "How can I abuse it?"

Whether they're walking into a store and looking at the cameras, walking into a voting booth or just wandering through the streets. They don't act on these ideas, but they certainly never stop thinking about them. I have done that forever, and I have also been interested in mathematics and in cryptography since I was a kid.

Doug Kaye: I spoke recently with Phil Zimmermann of PGP, and he had a similar background. He even remembers a specific book he read as a kid about secret codes and invisible writing with lemon juice and things like that.

Bruce Schneier: Oh, sure, I read that same book! Back when we were kids, there were only, like, three or four of those books, so after you had read them, you were done. But, yeah, a lot of those books as a kid sparked our imagination.

One of the things I'd like to do at some time in my career is write a kids' book on cryptography and secret codes because I think that is important. I think kids need to be excited about the possibilities of privacy, secrecy and codes, and they are inherently interesting.

Doug Kaye: On the other hand, I also spoke recently with Dan Geer who was lamenting the fact that we're about to get to a point where security experts are people who are actually trained in security from day one, they get a degree in it and so forth, and they may not have some of the breadth of background that the early pioneers of security technology have.

Bruce Schneier: I don't know, I'm less pessimistic. I really do believe that security is a mindset and that people with the mindset will have the breadth because you can't help it. They'll get training in their specific area of expertise, but they will have the breadth of experience, the breadth of analysis, the breadth of the way the world works.

I think there are security experts who are just trained, and they're not very good, and I meet a lot of them all the time, but you can tell when someone has passion, when someone truly understands how security works. He might not know the math. He might not know how the operating system works. He might not know about locksmithing and how locks work or how voting machines work, but he gets security. And those sorts of people, I think, we will have in our field forever, and that's what makes the field great and that's what makes the field grow.

So Dan is right; the majority of people will just be practitioners who have been trained and who don't have passion, but that's probably true in every industry.

Doug Kaye: You had a perfectly good reputation as a network security expert. You had grown Counterpane into a well-respected business. Why did you decide to write "Beyond Fear," a book that appears at least to go far beyond the scope of Counterpane's typical business?

Bruce Schneier: I really wrote "Beyond Fear" because we're living in a silly security season in our country. We're seeing so much nonsense after September 11th, and so many people saying things about security, about terrorism that just makes no sense, so I wanted to contribute to the debate. I wanted to write a book that people can read and then understand security.

They don't have to agree with the conclusions; one of the things I say in the book is that security is personal, that there often are no answers. But I wanted people to at least understand how to ask questions, how to look at a security system, how to evaluate it because we're being asked to take our shoes off at airports. We might be asked to live with a national ID card. We're being asked to support invasions of foreign countries. We're being asked to support all sorts of domestic and foreign policy in the name of security, and I'd like people to ask, "Does this make sense? Should we do this?"

Doug Kaye: The book is filled with an amazing array of facts and examples. Are these examples that you've been collecting over the course of your career, or did you find most of them just as you decided to sit down and research and write this book?

Bruce Schneier: It's both. A lot of examples that I've been collecting. Even when I was writing about computer security, if you read, "Secrets and Lies," I tended to write in analogies, I tended to explain computer security using noncomputer analogies because people got them better. So, once you start working in analogies, you do wander around the world and you see security everywhere, so I like using those examples.

So when I started writing the book, I used those examples, and one of the things I do when I write books is I send drafts to people, hundreds of people read drafts of my book in various stages. So they started suggesting examples, and then I would research them.

So the process fed on itself, and the good ones would stay and the bad ones would go away or be replaced by better examples, so I have this treasure trove of security examples that illustrate different points.

Doug Kaye: I'm going to read some of my favorites of those examples.

Bruce Schneier: Oh, excellent!

Doug Kaye: …as a way to just get you to respond and flesh them out and explain what they mean.

First, one that sounds at least like something the geeks in the audience might understand, and that is, "Complex systems have even more security problems when they are nonsequential and tightly coupled." What do you mean by that?

Bruce Schneier: That's a phrase coming out of systems theory, and it's fascinating. The book that I'd recommend people read is called "Normal Accidents" by a professor named Charles Perrow, and he talks about failures, not security failures, he talks mostly about accidents. But what he points out is that systems tend to fail catastrophically when they have these characteristics.

So a simple system would be like a row of dominos; you push one down and they all fall. That's sequential. It fails predictably.

A nonsequential system is where there are feedback loops and a small failure can get bigger. You think of a nuclear power plant, where a little failure here…or actually even better is the blackout from last August on the East Coast. It was a small failure at one power station in Michigan that dropped a lot of the northeast because the system was nonlinear, because one cause could have many effects, which could feed back on the cause, which could have more effects.

And tight coupling is similar. Tight coupling means that stuff happens very, very fast, so if something is loosely coupled, think of the airlines as a great example. There are nonlinearities going on, but it is loosely coupled, so a delay in Seattle might cause a delay in Chicago, which might cause a delay on the East Coast, but it doesn't cause planes to crash into each other. Because there is loose coupling, you can mitigate the problems. You've been on the ground when you've been told, "There is a hold in…oh… Chicago O'Hare and we're not going to take off yet." If the system was tightly coupled, you couldn't do those mitigating things.

Doug Kaye: You'd run out of fuel airborne.

Bruce Schneier: Right, which is what used to happen. You used to circle a lot more, but the system got more loosely coupled.

When I look at security failures, these are the sorts of things that I look at in systems, that systems that are nonlinear and tightly coupled will tend to fail worse. They might not be more insecure at the front end, but an attacker could do more damage by attacking it because of these characteristics. Blowing up a power station might plunge ten thousand people into darkness, but in a tightly coupled nonlinear system, you could get tens of millions of people, hundreds of millions of people, in a blackout, and that's the difference.
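Perrow's point can be seen in a toy model: treat tight coupling as a high probability that a failure propagates to the next component before anyone can intervene. This is a sketch of the intuition only, not a model of any real power grid:

```python
import random

def mean_outage(n_nodes: int, p_spread: float, trials: int = 5000,
                seed: int = 42) -> float:
    """One node fails; each failed node takes out the next with
    probability p_spread. Returns the mean fraction of the system lost."""
    rng = random.Random(seed)
    total_failed = 0
    for _ in range(trials):
        failed = 1
        while failed < n_nodes and rng.random() < p_spread:
            failed += 1
        total_failed += failed
    return total_failed / (trials * n_nodes)

loose = mean_outage(100, p_spread=0.50)  # failures usually stay local
tight = mean_outage(100, p_spread=0.99)  # one fault often cascades system-wide
assert tight > 10 * loose
```

With loose coupling the expected outage stays at a couple of nodes; push the propagation probability toward 1 and a single fault routinely takes down most of the system, which is the asymmetry an attacker can exploit.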

Doug Kaye: Sticking with this theme about the manner in which systems fail, a couple of short quotes here. You said, in reference to 9/11 and some other situations, that "the systems didn't fail in the way the designers expected," and along with that the concept that "attackers exploit the rarity of failures." I think they are two related concepts.

Bruce Schneier: They are, and when you think about failures, you always have to think about what you are not thinking about, in that paradoxical way. One of the things is rarity. If a system almost never fails, then when it does fail, no one will believe it.

We have all had the example of calling up a bank or utility company complaining about a bill and being told that "the computer never makes mistakes," right? Well, the computer does make mistakes. It just makes them rare enough that no one knows what to do when the mistakes happen.

The power failure in August was an example of that. There were computer failures that the people in the control room did not notice because they are so rare. And an attacker can exploit that because people aren't ready for it.

One of the many unfortunate horrors of September 11th is that many people didn't know to evacuate the buildings because no one believed they would collapse. Now, that might be a perfectly reasonable belief. I don't know enough about structural engineering to say one way or another, but the fact that nobody believed that meant the death toll was that much higher. If a building failed every week, everyone would know; you'd get out of the building, but it never happened before, so people didn't know.

Doug Kaye: Now a recurring concept in your book is probably typified by this example: "A terrorist who wants to create havoc will not be deterred by airline security; he will simply switch to another attack and bomb a shopping mall."

Bruce Schneier: This is, I think, really important. I just did a hearing two days ago on Capitol Hill about CAPPS II, about airline profiling, and one of the things I'm always struck by is how good we are at defending against what the terrorists did last year. We're spending a lot of money shoring up our airlines, we're now talking about shoring up trains. And money that we spend that simply causes the bad guys to change their tactics is money wasted.

You have a red and a blue door, and the terrorists go through the red door, and you say, "We must secure the red door," so they go through the blue door the next time. What did you actually buy?

And one of my fears is that we spend lots and lots of money securing the airlines, and the terrorists move to the shopping malls or movie theaters or crowded restaurants or any of the things they do in Israel; that there are just so many targets that taking the target the terrorist happened to pick last year and securing it just sort of ignores the real problem.

Doug Kaye: You also described what happened as a result of us building stronger and stronger bank vaults that were ultimately more impervious to dynamite.

Bruce Schneier: Right, and it's actually a great example! As bank vaults became impervious to dynamite, the preferred tactics switched to kidnapping bank presidents who knew the combination.

Doug Kaye: Not something we anticipated.

Bruce Schneier: No, and we saw this in South Africa. As cars had more and more antitheft provisions that made them harder to steal, car thieves turned to carjacking, a much more dangerous crime for the victim.

So we have again an example of security causing the attackers to change their tactics. And it might be a good thing, because the number of carjackings is going to be less than the number of car thefts, but it might not be, and we have to remember that there is a feedback loop between the attacker and defender: once an attacker does an attack and a defender modifies his defenses, the attacker will then modify his attack, and back and forth.

Going back to the bank vault analogy, after bank robbers started kidnapping bank presidents and bank tellers, the vault companies invented time locks, a lock that the bank president couldn't open even if you put a gun to his head. And the point of those time locks was to save the lives of the bank executives and the bank executives' families because all the bank robbers knew that kidnapping the bank president or his wife or children won't help because of the time lock.

So again, now a defensive tactic evolved to respond to an attacker tactic, which evolved to respond to a defensive tactic.

Doug Kaye: This is another fundamental principle that I learned from this book, which is that, in a sense, this is a losing game. If we attempt to anticipate every possible attack, we're going to be virtually unsuccessful, and, in fact, we're much better off trying to ferret out the attacker and keep him from participating in virtually any attack.

Bruce Schneier: Yeah, I'm often asked by the press, by legislators, "What do you recommend?" because I spend a lot of time saying, "This won't work, that won't work, or this won't work."

And they always ask, "Well, what will work? What should we do?"

And I think we just spend money at either end of the spectrum. The money we spend on intelligence, investigation, going after the terrorists will work regardless of what their next plan is. Defending the airlines only works if their next plan involves airlines, but going after the terrorists works regardless of what their next plan is, and I think, that's important.

So I am very much in favor of those sorts of things: hiring linguists at the FBI, getting investigative teams in place, tracking terrorist funding, interdicting communications, that stuff also works.

And then I also advocate stuff at the back end, emergency response, first response, whether it is police, fire, medical, spending money there helps us, again, regardless of what the terrorist plans are.

We can never anticipate the terrorist plans. Inasmuch as we do, they will change them, so by definition you can't anticipate them. We can't all say, "My God! They're going to attack the rail system! Let's secure the rail system," because then they'll decide to attack something else.

Doug Kaye: So what are some of the examples of the expenditures that the United States is now making that you think aren't well justified?

Bruce Schneier: You know, generally all of the big-budget IT systems. CAPPS II, the airline profiling system, is a complete waste of money. I think fingerprinting foreigners at the border, if you actually sit down and think about it, makes absolutely zero sense.

My complaints are largely for the big-ticket items, TIA (Total Information Awareness) where we are going to get information on every American and try to pick terrorists out of the crowds, the face scanning at airports program. This kind of stuff tends to be very, very expensive and doesn't produce useful results.

Doug Kaye: In a recent Newsweek interview, you are quoted as saying that homeland security measures are an enormous waste of money. Now, is that the category that you consider homeland security?

Bruce Schneier: Yeah, that's the category. It's trying to secure the targets. There are just too many targets, and if you sit down and count up all the places 100 or more people gather in close proximity, like restaurants, movie theaters, sporting events, schools, trains, buses, crowded intersections, you rapidly realize that there are an enormous number of them in this country and you can't possibly secure them.

Remember, the most amazing thing to me about the airline security measures in the months after 9/11 was the enormous lines. So here we are trying to make our airports safe, yet we bunch people in these huge crowds before security.

Doug Kaye: Making them essentially targets?

Bruce Schneier: Making them essentially targets! Now, there wasn't anything.

Another thing we have to remember, which is very hard to remember in sort of our fear-laden society, is terrorism hardly ever happens. Very often I hear people from the administration saying, "Our policies are working because in the two-and-a-half years since 9/11, nothing else has happened," and I think about it and say, "Well, nothing happened two-and-a-half years before 9/11 either. You did not have any policies. What does that prove?"

What it proves is that terrorist attacks are very, very rare and that we're spending a lot of money on something that hardly ever happens. Now, we can decide to do that. We as a nation tend to worry about spectacular and rare events rather than common events, like spouse abuse, automobile crashes, things that kill lots and lots of people every year, and we tend to focus on the spectacular and rare, but we should realize we're doing that.

Doug Kaye: Here's my favorite quote in the whole book, and I know that you probably know which one it is!

Bruce Schneier: Actually, I don't! I can't wait!

Doug Kaye: That "more people are killed every year by pigs than by sharks, which shows just how good we are at evaluating risks."

Bruce Schneier: That was actually a fun quote. I actually went to the government web site, which actually has death statistics from various things. You can see how many people die from lightning, from heart disease, from anything, and the results are surprising. People tend to worry about the wrong things.

We worry about what's in the news. I tell my friends that if it's in the newspaper, don't worry about it, because it means it hardly ever happens. It's news. News hardly ever happens; that's why it's news! When something stops being in the newspaper, then worry about it.

Doug Kaye: There's a huge emotional component to all of this, and you addressed this somewhat in the book. You say, for example, probably an important quote, that "people make bad security tradeoffs when they are scared."

Bruce Schneier: Right, and the emotional part really can't be belittled. It's important. As a security professional, I often pooh-pooh it, but that's wrong, and I've sort of learned not to do it. The reason we're spending money on terrorism that has killed nobody in the past two-and-a-half years, and not spending money on automobile crashes that have killed 40,000 people in each of the past dozen or so years, is an emotional one. We are emotionally scared; this emotionally worries us more.

And that's important because security is a reality and a feeling. I try to talk about this in the book. I'm not a psychologist. It's getting away from my area of expertise, but it's important not to belittle it, not to forget it.

The example I like to always think about is to imagine that you're living in the DC area during the weeks when the snipers were at large. Now, if you run the math, they doubled the murder rate in the counties where they were operating during the weeks they were at large. You would probably have been in more danger moving into the Washington DC inner city than staying where you were when the sniper was about. Yet, the reaction was completely out of proportion. School principals canceled outdoor events, people were afraid to leave their homes. So you can imagine that you're living there, and your daughter is afraid to walk to school because of the sniper, and you can sit her down with graphs and charts and explain to her the mathematics and why she is safe, or you can just walk her to school. Honestly, if walking her to school makes her feel better, it's the right thing to do even though it might not make sense from a security perspective.

My hope is that through my writings and my speaking, people can be educated to get beyond their fears, but you can't belittle them.

Doug Kaye: Now you can take this pragmatism quite a long distance, and as you do that, I find that I start to question it myself. Here's an example that I got from something you wrote in Wired magazine. I'll just paraphrase this.

You wrote that 2,978 people in the US died from terrorism in 2001, but 3,454 -- that's more people -- died from malnutrition in that same year. Now obviously we could have saved all of those who died of malnutrition for a whole lot less money than what we are spending to avoid a repeat of 9/11, but as a society, we're about as far from acting that way as I could possibly imagine. How rationally should we be thinking?

Bruce Schneier: I think the more rationally we think, the better we're going to spend our money. I view myself as a consumer of security, "How much am I getting versus what am I giving up?"

And it doesn't work this way. When Congress is spending the $100 to $200 billion to invade Iraq, you cannot say to them, "Well, let's not invade Iraq and let's cure malnutrition instead." They won't do that. So it really doesn't work that way, but it should.

We are spending $200 billion, that's the number I hear as the estimate when all is said and done, to invade Iraq. Are we $200 billion safer because of it? That's the real question to ask. Not, "Was it a good idea?" not, "How bad was Saddam Hussein?" but, "As security consumers, did we get our money's worth?"

I want to look at malnutrition, at automobile safety, at spouse abuse the same way. How many lives can I save? How many lives can I improve? How much can I better society for the money? Because in the end, I only have so much money. I am a consumer. I have to make buying decisions.

Do I want to buy more nutritional food or do I want to go out to the movies? I have to make that choice; I only have one paycheck. I'd like it if we on the national scale could think the same way. I don't think it's possible, but I'd like it if we could.
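Schneier's "security consumer" framing can be made concrete with the figures quoted in this interview. The comparison below is mine, and it uses only the round numbers mentioned above, not precise budget data:

```python
# Figures quoted in the interview: 2001 US deaths, and the ~$200B
# estimate Schneier cites for the cost of invading Iraq.
deaths_terrorism_2001 = 2_978
deaths_malnutrition_2001 = 3_454
iraq_cost = 200e9  # dollars

def dollars_per_life(cost: float, lives: int) -> float:
    """The price a security consumer implicitly pays per life saved,
    assuming (very generously) the spending prevents all those deaths."""
    return cost / lives

# Even crediting the $200B with preventing another 9/11-scale attack
# outright, the implied price is roughly $67 million per life:
implied = dollars_per_life(iraq_cost, deaths_terrorism_2001)
assert 6.0e7 < implied < 7.0e7
```

The point of the exercise is the consumer question itself: whatever the true numbers, you can ask what each dollar of a given security measure buys and compare it against the alternatives.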

Doug Kaye: Now, we all make these personal decisions that you described, but on a national scale we are making much larger decisions, at least larger in terms of budgets, and the political system to me is very much an amplifier of that individual fear-based system.

Bruce Schneier: Yeah, I agree, 100%.

Doug Kaye: So, there's a positive feedback loop there, unfortunately, where we feel a small amount of fear, and that translates into a very large expenditure of money based upon the fears of individuals.

Bruce Schneier: Right.

Doug Kaye: Is there a solution to that?

Bruce Schneier: Probably not. Unfortunately, I think this is the way politics works, because if you're a politician, you have to appear strong. You can't be weak on terrorism, even if it makes sense.

Doug Kaye: How can we help our political leaders make strong, but rational, decisions?

Bruce Schneier: I write, I speak, that's what I do, I try to educate. I spend a lot of time on Capitol Hill talking to people and a lot of time writing my book. I give copies out. I write op-ed pieces, so that's what I try to do. I think education is the way to do it. It's not a fast solution and it might not work, but it's, I think, our only hope.

Doug Kaye: Another quote, something that you wrote relative to this topic, was "When the US government says that security against terrorism is worth curtailing individual civil liberties, it is because the cost of that decision is not borne by those making it." Talk about that.

Bruce Schneier: Yeah, I spent a lot of time on that in some of my writings; I've done op-ed pieces on that. Security is multifaceted. There are many, many different threats in many areas, and right now we are focusing on the terrorist threat and ignoring other threats.

As a general rule, the people who make security decisions make security decisions that are rational to them. Security is a tradeoff, and they are going to make a personal tradeoff. So when you see a lot of these intrusive government systems, it's because the people making the decisions aren't the ones being intruded upon; they are the ones doing the intruding. So their tradeoff is inherently different than yours or mine might be.

Doug Kaye: But, on the other hand, there is also the inappropriate political influence. I love this quote, too. "Did you ever wonder why tweezers were confiscated at security checkpoints, but matches and cigarette lighters, actual combustible materials, were not? If the tweezers' lobby had more power, I am sure they would have been allowed on board as well." That's a great one!

Bruce Schneier: And it's true. The government wanted to ban laptops, but the airlines said, "No, you can't do that. Our business travelers will leave us." Security decisions as a general rule are subservient to other decisions, and I think that's as it should be; I don't think that's bad. Security is rarely a driver. It is much more often subservient.

Doug Kaye: What is your impression so far of the 9/11 hearings that are being conducted in Washington?

Bruce Schneier: There's not a lot of news; I mean, most of the stuff that is being said was said on September 12. Yeah, there's a lot of finger pointing, and certainly there were failures of intelligence. There were failures of communication. I am not convinced they were massive failures of intelligence.

This is one event, it's a singular event, and it's very hard to make generalizations from it. And sure, a bunch of things went wrong, but if they went right, other things would have gone wrong.

I personally feel that 9/11 was, more than anything else, a very, very horrible coincidence that things just failed in exactly the right way and there were many opportunities for things to go right, and if they did, we would have patted ourselves on the back and said it was a success even though the exact same stuff was in place.

It's important to realize that neither success nor failure necessarily shows a systemic problem. This is just a singular event.

Doug Kaye: Well, Bruce, I want to thank you for being with me today and I want to encourage everyone to go out and, if you haven't already, get a copy of "Beyond Fear" and read it. It really is one of the best books to come out in the last year or so.

Bruce, thank you very much!

Bruce Schneier: Thanks for having me; this was fun!

Doug Kaye: And thanks to all of you for listening to IT Conversations. This edition was recorded on April 16, 2004. My guest has been Bruce Schneier. You will find him all over the web, but a good starting point is probably his weblog, which you can find at www.schneier.com; actually, that's his home page.

My name is Doug Kaye, and I hope you'll join me the next time for another edition of IT Conversations.

This interview and many others are provided by IT Conversations; please visit their website at http://www.itconversations.com