Monday, October 04, 2004

Kernighan, debugging and assurance

I read a great Kernighan quote at Marquee de Sells today: "Everyone knows that debugging is twice as hard as writing a program in the first place. So if you're as clever as you can be when you write it, how will you ever debug it?" For those who aren't familiar with Kernighan, he helped design the awk programming language and coauthored the first book on the C programming language.

I was struck by the security implications of programmers writing computer programs that they aren't smart enough to debug. The analysis and testing that make up a security evaluation are comparable to debugging, and yet many of the evaluators I've worked with are not as smart as many of the programmers I know. In my experience, evaluator qualifications amount to a college or university degree in engineering or computer science plus a few years of apprenticeship, and that is considered sufficient for evaluating source code. Is this realistic?

Thinking about debugging and security brought me to the concept of a reference monitor, which enforces the authorized access relationships (i.e., the policy) between the subjects and the objects of a system.[1]

One implementation of the reference monitor concept was called a reference validation mechanism. Early reference validation mechanisms were called security kernels: the combination of hardware, firmware, and software that implements the reference monitor concept.[2] Three design requirements were placed on these mechanisms: 1) the mechanism must be tamper proof, 2) it must always be invoked, and 3) it must be small enough to analyze and test with complete assurance.[3]
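To make those requirements concrete, here is a minimal sketch of the idea in Python. The subjects, objects, and policy are invented for illustration, and a real reference validation mechanism would of course sit below the application layer in hardware, firmware, and software rather than in application code like this.

```python
# Minimal reference monitor sketch (illustrative only; the policy,
# subjects, and objects are hypothetical).

# The policy: the set of authorized (subject, object, access) triples.
POLICY = {
    ("alice", "payroll.db", "read"),
    ("alice", "payroll.db", "write"),
    ("bob",   "payroll.db", "read"),
}

def reference_monitor(subject, obj, access):
    """Mediate one access request against the policy.

    Requirement 2 says every access must pass through this check;
    requirement 3 is why it should stay small and simple enough to
    analyze and test exhaustively.
    """
    return (subject, obj, access) in POLICY

def read_object(subject, obj):
    # All reads are funneled through the monitor (requirement 2).
    if not reference_monitor(subject, obj, "read"):
        raise PermissionError(f"{subject} may not read {obj}")
    return f"<contents of {obj}>"

print(read_object("bob", "payroll.db"))  # allowed by the policy
try:
    read_object("bob", "secrets.db")     # not in the policy
except PermissionError as err:
    print(err)
```

The point of the sketch is its shape, not its policy: the check itself is the one piece of the system that has to remain exhaustively verifiable.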

It seems to me that Kernighan's point adds a caveat to the third requirement: the mechanism must be small and simple enough to analyze and test with complete assurance.

Footnotes

1. Anderson, J. P., Computer Security Technology Planning Study, ESD-TR-73-51, vol. I, ESD/AFSC, Hanscom AFB, Bedford, Mass., October 1972 (NTIS AD-758 206).

2. Anderson, Computer Security Technology Planning Study.

3. U.S. Department of Defense, Department of Defense Trusted Computer System Evaluation Criteria, DoD 5200.28-STD, December 1985.