Dan Bricklin's elegant essay on the lessons for system design and the use of on-line and other information sources (Learning From Accidents and a Terrorist Attack) is very informative and makes some excellent points about the ability and availability of the general public as participants in disaster recovery. It nicely validates what every IT prophet has been saying, in one form or another, since the early 90s: increases in communications and information-processing capability will lead to more consumption of that resource, enabling organizations to respond quickly to outside changes, and indeed enabling “spontaneous” organizations to form quickly to address issues. We communicate rather than plan, and can mobilize quickly. Dan makes some great points about what this means for system and component design.
However, there is one problem with using open tools such as RSS feeds, blogs, wikis and open conference calls: their very openness makes them a path for a future terrorist. A group of terrorists wanting to do something akin to the 9/11 attack could now learn from what happened then, and include a number of on-line participants whose role is to spread misinformation, increase fear and divert resources. There were instances of misinformation during 9/11 (I remember news items about a car full of bombs being stopped on some bridge in New York, for instance), and the news channels normally apply some form of fact-checking. While the Wikipedia model works great when there is time, and people can check changes against a contributor's past behavior, I think we should be careful with too much openness in a time-pressured situation. Some form of validation needs to be in place, perhaps a form of peer validation that draws on the very increase in the public's communications capability one wants to tap into in the first place.
(Also posted at RISKS Digest 23.52)