The latest Information Security Breaches Survey, released last month by PwC (PricewaterhouseCoopers) and BERR (the UK government department for Business, Enterprise & Regulatory Reform, formerly the DTI, the Department of Trade & Industry), provides a wealth of statistical information and commentary on the current state of information security in the UK. Other similar surveys, most of which are acknowledged in the PwC/BERR one, include:
- The Information Security Forum survey of its members, benchmarking them against each other, the ISF Standard of Good Practice plus standards such as ISO27k and COBIT;
- The Global State of Security Survey, an online survey by PwC plus CIO and CSO Magazines;
- The Computer Security Institute (CSI) Computer Crime and Security Survey, originally run in conjunction with the Federal Bureau of Investigation (FBI);
- Ernst & Young's Global Information Security Survey of its customers.
When reading such surveys, it is always worth taking note of the survey methods first, since the size of the sample and the way in which data are collected can significantly affect the results. The PwC/BERR survey, for example, was a telephone survey of UK organizations. The team selected organizations using a specific rationale which, they feel, gave a 'representative' sample, but as this was not true random sampling, simple statistical techniques for extrapolating from the sample to the whole population are not entirely valid. Clearly, the sample was UK-only and, strictly speaking, says nothing about the state of information security outside the UK. Likewise, other surveys mostly use self-selected samples (e.g. an online survey that anyone can complete) or other selection techniques (such as E&Y's customers) which affect the validity of the results and, in part, account for differences between the various surveys' findings. Similarly, surveys with small sample sizes are likely to have wide margins of error, although this is seldom fully acknowledged in the survey reports.
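To put rough numbers on that last point, here is a minimal sketch of the standard margin-of-error calculation for a reported proportion. It assumes simple random sampling and the normal approximation, which, as noted above, most of these surveys do not strictly satisfy, so treat it as an upper bound on precision rather than a definitive figure:

```python
import math

def margin_of_error(p: float, n: int, z: float = 1.96) -> float:
    """Approximate 95% margin of error for a sample proportion.

    p -- the reported proportion (e.g. 0.5 for '50% of respondents')
    n -- the sample size
    z -- z-score for the confidence level (1.96 for 95%)
    Assumes simple random sampling from a large population.
    """
    return z * math.sqrt(p * (1 - p) / n)

# A survey of 100 organizations reporting a 50% incidence rate:
print(round(margin_of_error(0.5, 100), 3))   # ~0.098, i.e. roughly +/- 10 points
# The same reported figure from 1000 respondents:
print(round(margin_of_error(0.5, 1000), 3))  # ~0.031, roughly +/- 3 points
```

In other words, a 100-respondent survey reporting "50% suffered a breach" is really saying "somewhere between about 40% and 60%", which helps explain why small surveys so often disagree with each other.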
Another factor to consider is the way in which questions are asked. As any scientist or market survey specialist knows only too well, posing a question in a particular way affects the results obtained. Subtle changes of wording or framing can change the way questions are interpreted. 'Leading questions', for example, imply that a certain answer is expected and therefore bias the results. Even the sequence of questions and the questionnaire headings can be important. This is a complex topic even for specialist professionals and dedicated market survey companies such as MORI.
That said, provided we understand the constraints, surveys are still interesting. They are a worthwhile source of numbers to support business cases. Those proposing investments are naturally biased towards whatever they are proposing, so using biased statistics is no big deal to them. On the other side of the table, though, those appraising investment proposals should probably dig a little deeper to get a fairer picture. Unfortunately, that requires background knowledge of the field and time to go digging. I've never yet met a senior manager in the commercial world who takes anything other than a glancing interest in investment proposals, but perhaps they should.
Personally, I'm more interested in the survey texts than the statistics. The careful way in which statistics are presented and interpreted often exposes curious anomalies, such as the age-old question of whether we should fear insiders or outsiders most. For many years, the prevailing wisdom was that outsider attacks were more numerous and therefore of more concern, but about five years ago the tide turned towards the view that insider attacks may be less frequent but are more damaging. The true take-home lesson from several surveys over several years is that we need to beware of insiders AND outsiders.
Surveys also provide interesting quotations, particularly when survey respondents mention security breaches. The PwC/BERR survey, for example, mentions 'staff received additional training' as a typical part of the response to many security incidents, betraying the reactive nature of many security programs. Do you use surveys and news of security incidents elsewhere to inform your security risk analyses and security awareness programs? Learning from and avoiding others' mistakes sure beats learning from your own!
Regards,
Gary Hinson CISSP
Passionate about security awareness
www.NoticeBored.com Creative awareness materials
www.ISO27001security.com ISO/IEC 27000 standards