People of ACM - Peter Neumann

May 30, 2013

Peter G. Neumann is Senior Principal Scientist at the SRI International Computer Science Laboratory, where his research focuses on security, cryptography, fault tolerance, reliability, safety, software engineering methodology, applications of formal methods, and risk avoidance. He is currently working on two five-year projects for the U.S. Defense Advanced Research Projects Agency (DARPA) on making computers and networks more trustworthy (secure, reliable, resilient, and so on), with Robert N. Watson, a computer security researcher at the University of Cambridge Computer Laboratory, leading a collaborating team in the UK.

Neumann chairs the ACM Committee on Computers and Public Policy. He received ACM's Outstanding Contribution Award for his Inside RISKS column in Communications of the ACM and the ACM RISKS Forum newsgroup, which inform computing professionals about the consequences of software failures. He also received the 2013 Computing Research Association (CRA) Distinguished Service Award, in part for advancing the fields of computer-related risk and the socially responsible use of information technologies. He is a Fellow of ACM, IEEE, AAAS, and SRI. His 1995 book, Computer-Related Risks (ACM Press and Addison-Wesley), remains timely; many of the problems described there continue to recur, and many of the needed remedies have yet to be adequately adopted.

A tireless advocate for computer security, Neumann has testified before the U.S. Congress and has advised government agencies such as the GAO, FAA, and IRS on public policy issues. He holds Bachelor's, Master's, and PhD degrees from Harvard, and was involved in the development of the Multics file system at Bell Labs before moving to SRI in 1971.

Given your work on the Clean Slate Program, do you feel, as John Markoff described you in an October 2012 New York Times article, like "a voice in the wilderness" in your efforts to improve computer security?

The remarkable reach of the ACM Risks Forum readership suggests that there is a sizable inner circle of people who have genuinely internalized the dangers of failing to anticipate all of the things that can go wrong in computer-related systems. Also, I have various colleagues who pursue holistic approaches, such as seeking overall system trustworthiness (encompassing security, but also reliability, system survivability, human safety, and much more) rather than merely "trust" (especially having to trust entities that are not trustworthy, or even unknown).

Thus, I certainly do not feel like a voice in the wilderness in that respect. However, I do feel that the rest of the world (including many hardware, software, and applications people in the computer industry, managers, politicians, governments, and casual users) needs a much greater understanding of the shortcomings of our existing infrastructures, the potential risks, the dangers of malware and bogus URLs, and so on. It is always frustrating to me that security flaws such as buffer overflows, and inherently limited authentication techniques such as passwords, are still pervasive, because techniques for avoiding them have long been available.
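To make the buffer-overflow remark concrete, here is a minimal C sketch (an illustration only, not code from Neumann's projects): an unbounded strcpy into a fixed-size buffer can overrun adjacent memory, while a bounded copy cannot. The function and buffer names are hypothetical.

    #include <stdio.h>
    #include <string.h>

    /* Unsafe: strcpy performs no bounds checking, so any input of 16
       or more characters overruns buf and corrupts adjacent memory. */
    void copy_unsafe(const char *input) {
        char buf[16];
        strcpy(buf, input);              /* classic buffer overflow */
        printf("%s\n", buf);
    }

    /* Long-available remedy: bound every write to the destination's
       size; snprintf also guarantees NUL termination. */
    void copy_bounded(const char *input) {
        char buf[16];
        snprintf(buf, sizeof buf, "%s", input);  /* truncates, never overruns */
        printf("%s\n", buf);
    }

    int main(void) {
        /* copy_unsafe is shown only as the anti-pattern; calling it with
           long input is undefined behavior, so only the bounded copy runs. */
        copy_bounded("a deliberately long string that would overflow buf");
        return 0;
    }

The same discipline of validating lengths at every interface generalizes well beyond this toy example, which is part of why the persistence of such flaws is so frustrating.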

Are you optimistic about the potential for changing the computer security culture by pursuing ideas from other sciences like biological systems?

Biological paradigms are of considerable interest conceptually, but they are only part of the puzzle. There is always a danger in relying on analogies: thinking in terms of analogies that do not exactly mirror the situation at hand often leads to oversimplifications and misguided would-be solutions. Overall, computer science still has much to learn from other disciplines, such as engineering, because software engineering is not really an engineering discipline. Computer science also has much to offer some of those other disciplines, as in the case of computational biology.

How has your exposure to art in your early life and as a musician throughout your career influenced your ideas about designing and implementing computer security?

My discussion in 1952 with Albert Einstein focused extensively on complexity and certainly had a huge influence on my professional and personal life.

That discussion also included considerations of art (including my mother's portrait of Einstein and my parents' involvements in the art world) and even complexity in art and, to a very considerable extent, in music: for example, tracing the evolution of complexity from Gregorian chants to Bach to Mozart to Beethoven.

The discipline required in music is somewhat similar to that in computer science, particularly in coordinating and integrating left-brain activities (logical, linear thinking) and right-brain activities (intuitive thinking), and in the need for the concurrent development of both sides for meaningful creativity. (For example, see my article, "Psychosocial Implications of Computer Software Development and Use: Zen and the Art of Computing," in Theory and Practice of Software Technology, D. Ferrari, M. Bolognani, and J. Goguen, editors, North-Holland, 1983, 221-232.)

Perhaps most important in developing large computer systems is Einstein's belief that everything should be made as simple as possible, but no simpler. It is often the oversimplifications that result in system failures. On the other hand, much of my R&D has been devoted to the predictable composition of well-designed components such that the components and the compositions appear to be simple (even if they are masking considerable complexity).
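One common way such masking shows up in practice is the opaque-handle idiom in C: a component exposes only a small interface while its representation stays hidden, so callers compose against apparent simplicity no matter what the implementation does internally. The sketch below is purely illustrative; the counter type and its functions are hypothetical, not drawn from Neumann's work.

    #include <stdlib.h>

    /* Public interface: deliberately small.  Callers never see the
       representation, so the component looks simple regardless of how
       complex its implementation becomes. */
    typedef struct counter counter;          /* opaque handle */

    counter *counter_new(void);
    void     counter_add(counter *c, long n);
    long     counter_value(const counter *c);
    void     counter_free(counter *c);

    /* Hidden implementation: locking, overflow checks, or persistence
       could be added here later without touching any caller. */
    struct counter { long total; };

    counter *counter_new(void)             { return calloc(1, sizeof(counter)); }
    void counter_add(counter *c, long n)   { c->total += n; }
    long counter_value(const counter *c)   { return c->total; }
    void counter_free(counter *c)          { free(c); }

    int main(void) {
        counter *c = counter_new();
        if (!c) return 1;                  /* calloc can fail */
        counter_add(c, 41);
        counter_add(c, 1);
        long v = counter_value(c);         /* 42 */
        counter_free(c);
        return v == 42 ? 0 : 1;
    }

Because each component's visible behavior is fully described by its small interface, components like this can be combined predictably, which is the composition property the answer above alludes to.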

What advice would you give to budding technologists who are considering ways to improve computer security?

Security cannot be achieved by concentrating only on security. Think about trustworthiness overall, not just how to avoid a few nasty design flaws and programming bugs. Think holistically about well-conceived specifications, higher assurance, uses of applicable theoretical results, sensible and soundly based software engineering approaches, incisive use of formal methods where most applicable, and so on. Trustworthiness is a total-system measure that must address hardware, software, and system development and procurement, as well as user behavior, faults, failures, errors, and attackers.

Don't rely only on your own expertise: develop team efforts with strong synergies and common interests that overlap constructively. Read the literature, and understand past mistakes and successes. Follow the ACM Risks Forum, and contribute to it. Check out my website, http://www.csl.sri.com/neumann, and dig into some of our ongoing work on clean-slate system and network architectures. Remember that there are no easy answers, and that "simplicity" is inherently complex.