The history of information security offers some interesting parallels to the evolution of quality thinking, but also some critical differences. Both the similarities and the differences provide insight into the growing challenge of Internet malware. As was the case with the quality revolution, the early practitioners of information security were technical experts, laboring in obscurity even as they battled the enemy. The quality statisticians, however, waged war against innate waste and error to reduce unintentional variation; the infotech security wizards have been fighting in an escalating and increasingly nasty war against an enemy with destructive intent. Early viruses generally reflected benign, often humorous attempts to demonstrate the fallibility of both humans and computers. Later, more malicious efforts appeared as a way to demonstrate technical superiority in a game of one-upmanship. Increasingly, malware reflects clear goals such as financial gain or social protest. The intentionality of malware provides a clear distinction between the two movements, but the lessons of the quality revolution remain relevant to malware nonetheless.
Crosby’s notions of cost map well onto security. Firms invest both in prevention (education) and ongoing appraisal (penetration testing). Likewise, security failures lead to both internal costs (lost productivity) and external costs (fines and damaged reputations). To date, no one has made a compelling case comparable to Crosby’s to assert that “information security is free,” but that may, in fact, be correct. Understanding all of these costs offers an important step toward improving security; however, in this case, the objective is not hard quantification but a change in executive and organizational mind-sets.
To move from the back room to the boardroom, information security specialists should employ a common, rigorous framework for quantifying the bottom-line impact of security breaches. Too frequently, senior executives are aware only of the cost side of the equation. They see growing investments in software tools designed to catch problems but rarely see hard quantification of the benefits of these controls. Even more unusual is a quantification of the negative effects of excessive controls. Tight policies limiting use of new devices or unapproved applications offer greater security, but they also stifle innovation — something hard to quantify.
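One common, rigorous framework of the kind described is annualized loss expectancy (ALE) from classical risk analysis, which expresses a threat's bottom-line impact as the cost of a single incident times its expected yearly frequency. The sketch below illustrates the arithmetic; the scenario and all dollar figures are hypothetical assumptions, not data from this article.

```python
# Annualized loss expectancy (ALE), a standard risk-quantification formula:
#   ALE = SLE (cost of one incident) * ARO (expected incidents per year).

def annualized_loss_expectancy(single_loss_expectancy: float,
                               annual_rate_of_occurrence: float) -> float:
    """Expected yearly loss from a given threat."""
    return single_loss_expectancy * annual_rate_of_occurrence

# Hypothetical scenario: a stolen laptop exposing customer records.
sle = 250_000.0   # assumed cost per incident: response, fines, reputation
aro = 0.2         # assumed frequency: once every five years

ale = annualized_loss_expectancy(sle, aro)
print(f"ALE: ${ale:,.0f} per year")   # ALE: $50,000 per year
```

Expressed this way, a $50,000 expected annual loss gives executives a number to weigh directly against the yearly cost of the controls that would prevent it — including, in principle, the harder-to-quantify innovation lost to overly tight policies.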
After quantifying the full costs of information security, companies need to focus on the root causes. Although “denial of service” attacks and botnets grab the headlines, the root causes of many security breaches include both benign and malicious insiders with access to sensitive information. So, just as retail stores check departing employees for stolen merchandise, data loss prevention systems watch for data moving on a corporate network. If an employee deliberately (or accidentally) tries to e-mail a spreadsheet of credit card numbers or product release plans, the system bars the door. Related monitoring tools scrutinize employees’ use of company data, watching for suspicious frequency, unexplained volume, or a particular combination of data. Other security tools help users make good security decisions, such as warning them if a hyperlink in an e-mail message appears fraudulent.
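The outbound check described above can be sketched with two standard techniques: a regular-expression scan for card-number-shaped digit strings, filtered through the Luhn checksum to weed out random digits. This is a minimal illustration of the idea, not how any particular commercial data loss prevention product works; real systems inspect many data types and channels.

```python
import re

# Digit strings of 13-16 digits, optionally separated by spaces or hyphens.
CARD_PATTERN = re.compile(r"\b(?:\d[ -]?){13,16}\b")

def luhn_valid(number: str) -> bool:
    """Luhn checksum: true for well-formed payment-card numbers."""
    digits = [int(ch) for ch in number if ch.isdigit()]
    checksum = 0
    for i, d in enumerate(reversed(digits)):
        if i % 2 == 1:          # double every second digit from the right
            d *= 2
            if d > 9:
                d -= 9
        checksum += d
    return checksum % 10 == 0

def contains_card_number(text: str) -> bool:
    """Flag text that appears to contain a valid card number."""
    return any(luhn_valid(m.group()) for m in CARD_PATTERN.finditer(text))

outbound = "Q3 figures attached; card 4111 1111 1111 1111 for testing."
if contains_card_number(outbound):
    print("blocked: possible card number in outbound message")
```

The checksum step matters: flagging every long digit string (order IDs, phone numbers) would bury analysts in false positives, which is exactly the kind of overly restrictive control whose hidden costs the article warns about.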
In the words of Dorian Shainin, IT security experts need to “talk to the parts” rather than stay in the back room running system diagnostics. Users — the parts in information networks — rarely understand the true vulnerability inherent in the connectivity of the Internet. Rather than keeping users in the dark and assuming that malware protection programs such as Norton and McAfee have everything under control, we need to educate frontline employees. Following the lesson of Taguchi, companies must empower employees with user-friendly malware protection tools. Information security cannot be the sole domain of the technical team. Users need to understand the real threats, and technicians need to appreciate the loss of benefits that results from overly restrictive controls. Taking a lesson from Six Sigma, security specialists need to focus on stakeholder buy-in, and to recruit advocates in the user community. Cybersecurity is everyone’s job, not just that of the CIO or security specialist.