Podcast: Crypto-Gram, August 15, 2007: Disaster planning - You live in the safest society in the history of mankind.
from the Aug 15, 2007 Crypto-Gram Newsletter
by Bruce Schneier
* Assurance
eVote testing: It begins with a presumption of security: If there are no known vulnerabilities, the system must be secure. If there is a vulnerability, then once it's fixed, the system is again secure...
Yet again and again we react with surprise when a system has a vulnerability.
Once you stop thinking about security backward, you immediately understand why the current software security paradigm of patching doesn't make us any more secure. If vulnerabilities are so common, finding a few doesn't materially reduce the quantity remaining. A system with 100 patched vulnerabilities isn't more secure than a system with 10, nor is it less secure. A patched buffer overflow doesn't mean that there's one less way attackers can get into your system; it means that your design process was so lousy that it permitted buffer overflows, and there are probably thousands more lurking in your code.
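To make the buffer-overflow point concrete, here is a minimal sketch (not from the newsletter; the function names are illustrative) of the vulnerable pattern and the "patch" for one call site. Schneier's argument is that the second function fixes only this one instance; a design process that produced the first likely produced many more like it.

```c
#include <assert.h>
#include <stdio.h>
#include <string.h>

/* Vulnerable pattern: strcpy trusts the caller's input length.
   Any name longer than 15 bytes writes past the end of buf. */
void greet_unsafe(const char *name) {
    char buf[16];
    strcpy(buf, name);            /* no bounds check -> overflow */
    printf("hello, %s\n", buf);
}

/* Patched pattern: copy at most buflen-1 bytes and always
   null-terminate. This closes one hole; it says nothing about
   the thousands of similar call sites that may still exist. */
int greet_safe(char *buf, size_t buflen, const char *name) {
    if (buflen == 0) return -1;   /* nowhere to write even '\0' */
    strncpy(buf, name, buflen - 1);
    buf[buflen - 1] = '\0';       /* strncpy may not terminate */
    return 0;
}
```

The patch is purely local: it changes one call site, not the habit of passing unchecked lengths around, which is exactly why counting patched vulnerabilities tells you little about how many remain.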
Brian Snow of the NSA said the agency couldn't use modern commercial systems, given their backward security thinking. Assurance was his antidote:
"Assurances are confidence-building activities demonstrating that:
"1. The system's security policy is internally consistent and reflects the requirements of the organization,
"2. There are sufficient security functions to support the security policy,
"3. The system functions to meet a desired set of properties and *only* those properties,
"4. The functions are implemented correctly, and
"5. The assurances *hold up* through the manufacturing, delivery and life cycle of the system."
* Avian Flu and Disaster Planning
If an avian flu pandemic broke out tomorrow, would your company be ready for it?
It's not that organizations don't spend enough effort on disaster planning, although that's true; it's that this really isn't the sort of disaster worth planning for.
There is a sweet spot, though, in disaster preparedness. Some disasters are too small or too common to worry about. And others are too large or too rare.
It makes no sense to plan for total annihilation of the continent, whether by nuclear or meteor strike: that's obvious.
You can only reasonably prepare for disasters that leave your world largely intact. If a third of the country's population dies, it's a different world. The economy is different, the laws are different -- the world is different. You simply can't plan for it; there's no way you can know enough about what the new world will look like. Disaster planning only makes sense within the context of existing society.
The proper place for bird flu planning is at the government level.
The key is preparedness. Much more important than planning, preparedness is about setting up social structures so that people fall into doing something sensible when things go wrong. Think of all the wasted effort -- and even more wasted *desire* -- to do something after Katrina because there was no way for most people to help. Preparedness is about getting people to react when there's a crisis. It's something the military trains its soldiers for.
You live in the safest society in the history of mankind.
length: 63:19
PS: this is my cheat sheet of Bruce Schneier's Podcast:
http://www.schneier.com/crypto-gram-0708.html