The Laws (was the principles) of secure information systems design

I've been revising the principles, and came up with this. It's an early version.

As ever, corrections and suggestions are welcome.

Calling them Laws is perhaps a bit overreaching, but on reflection I thought that's mostly what they are: break them and the system won't be secure.

I will put the Laws up on the 'net shortly, hopefully with a link for suggestions and comments.

The Laws of secure information systems design:

Law 0: It's all about who is in control

Law 1: Someone else is after your data

Law 2: If it isn't stored it can't be stolen

Law 3: Only those you trust can betray you

Law 4: Attack methods are many, varied, ever-changing and eternal

Law 5: The entire system is subject to attack

Law 6: A more complex system has more places to attack

Law 7: Holes for good guys are holes for bad guys too

Law 8: Kerckhoffs's Principle rulez! - usually...

Law 9: A system which is hard to use will be abused or unused

Law 10: Design for future threats

Law 11: Security is a Boolean

Law 12: People offering the impossible are lying

Law 13: Nothing ever really goes away

Law 14: "Schneier's law c" [1] holds illimitable dominion over all... including these laws

-- Peter Fairbrother

[1] "

a: Anyone, from the most clueless amateur to the best cryptographer, can create an algorithm that he himself can't break. It's not even hard.

b: What is hard is creating an algorithm that no one else can break, even after years of analysis.

c: And the only way to prove that is to subject the algorithm to years of analysis by the best cryptographers around."