Roland Ewald

Software Engineer · Researcher



Notes on 'Security Engineering', part 1

2025-03-30


I am currently reading the classic textbook Security Engineering1 by the late Ross Anderson;2 here are some personal highlights from part I (eight chapters on the basics, covering everything from psychology to cryptography):

  • The abbreviation for access-control list (ACL) is pronounced “ackle”. This reminds me of other well-known technical abbreviations with dedicated English pronunciations, such as SQL (pronounced “sequel”, partly for historical reasons) and WSDL (“whistle”).

  • Names are even harder when securing distributed systems. This reminded me of Patrick McKenzie’s (a.k.a. patio11) classic essay about invalid assumptions programmers make regarding names. The Needham naming principles,3 formulated by Roger Needham, focus on security-relevant aspects of names: for example, when they are bound to a principal, how they are resolved, or how easy it is to validate them. The book then adds even more problems in the follow-up section,10 and in particular mentions “Zooko’s triangle”, formulated here as:

No naming system can be globally unique, decentralised and human-meaningful.
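The triangle is easy to illustrate with a few familiar naming systems (my own sketch, not from the book; the classifications are the commonly cited ones):

```python
# Zooko's triangle: a naming system achieves at most two of the three
# properties (globally unique, decentralised, human-meaningful).
naming_systems = {
    # system:                (unique, decentralised, meaningful)
    "DNS names":             (True,  False, True),   # centrally coordinated
    "PGP key fingerprints":  (True,  True,  False),  # not memorable
    "nicknames":             (False, True,  True),   # can collide
}

for name, props in naming_systems.items():
    # No system in the table gets all three properties at once.
    assert sum(props) <= 2, f"{name} would violate Zooko's triangle"
    print(name, "->", props)
```

(Systems like Namecoin have been proposed as counterexamples, but whether they truly “square” the triangle is debated.)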

  • Even the basic discussion of cryptography (chapter 5) gives, besides a very brief theory overview, a lot of practical advice and argues that most systems are broken by bad defaults and other engineering problems, like overcomplicated APIs or unforeseen side channels, and not by cryptanalysis4:

Very few attacks on systems nowadays involve cryptanalysis in the sense of a mathematical attack on the encryption algorithm or key. […] Most attacks nowadays exploit the implementation.

  • I really like the overall ethos of the book: to successfully secure a distributed system—which does not just consist of hardware and software, as it is operated by people in a specific context—security engineering also needs to consider psychology, economics, game theory, and other related sciences. This is not only relevant for threat modeling, but also for change management or the assessment of new threats, like cryptanalysis breakthroughs. In chapter 5,5 for example, the book reminds us to stay calm when we read that “someone broke TLS”:6

When someone discovers a vulnerability in a cryptographic primitive, it may or may not be relevant to your application. Often it won’t be, but will have been hyped by the media – so you will need to be able to explain clearly to your boss and your customers why it’s not a problem.

  • Something I found surprising on a personal level is that apparently crime is not declining; it is just moving online (and thus becomes invisible in crime statistics). From chapter 2:7

There is less of that liability dumping now, but the FBI still records much cybercrime as ‘identity theft’ which helps keep it out of the mainstream US crime statistics.

and from chapter 8:8

There’s very little police action against cybercrime, as they found it simpler to deter people from reporting it. […] this enabled them to claim that crime was falling for many years even though it was just moving online like everything else.

I’m sure the second part of the book, focusing on more specific topics, will also be a fun read.

  1. The book is freely available, the last edition is from 2020. 

  2. Here is a nice obituary that highlights some of his work outside academia. 

  3. Section 7.4.1 from page 260 in the book (p. 305 in the PDF). 

  4. This classic xkcd comic makes a similar point :-) 

  5. Section 5.3.3 from page 162 in the book (p. 209 in the PDF). 

  6. This point nicely illustrates why we need to look at the whole system: the fact that such discoveries are “hyped by the media” is caused by human psychology (people being scared by something outside their control), by economic incentives (scary stories sell better in an ‘attention economy’), and even by the incentives of the security researchers discovering the mathematical attack (they need to deliver impactful research to improve the odds of future funding). 

  7. Section 2.3.2 from page 48 in the book (p. 96 in the PDF). 

  8. Section 8.6.6 from page 304 in the book (p. 348 in the PDF). 

  9. See footnote on page 23 in the book (p. 71 in the PDF). 

  10. Section 7.4.1 from page 262 in the book (p. 307 in the PDF).