Data, data everywhere, but nobody knows where it is. How much time have you spent looking for a particular document in an unstructured shared repository? When people collaborate, they often name documents and directories haphazardly, which does not make them any easier for others to find. Not knowing where the data is when you need it is a classic availability problem.
When data is temporarily lost, productivity drops, tempers flare, and frustration abounds. A lack of document management is not only a productivity problem, but a security one as well. Misplaced sensitive information may end up in a directory whose access controls are inappropriate for the type of data it contains.
Information that is moved to an improper directory has, at a minimum, been copied from one location to another. Were other copies of the data also made? Did the information flow through other channels such as email or peer-to-peer software? The unknown in this regard is very disconcerting.
A recent report produced by Verizon indicates that 66% of the data breaches they investigated involved information placed in areas easily accessible to the attacker but whose existence was unknown to the victim. They refer to this as the “Unknown Unknowns” of their data breach investigations. Some reasons these “Unknowns” might occur are:
- Users mistakenly copying information from one location to another with weaker access controls.
- A lack of explicit directives regarding the controls necessary for the information.
- Poor security architecture design when systems are integrated and/or upgraded.
Recommended countermeasures include:
- Establishing policies, procedures, processes, and training regarding sensitive information. Inform users regarding the necessary access controls for each data type as well as the approved locations and access controls for storing the information. Appropriate storage media and file types should also be explicitly identified.
- The use of file and folder naming conventions. Establish a structure which makes sense and is easy for users to follow. Use access controls to protect the integrity of shared documents.
- Conducting periodic file searches using system tools (such as grep or Windows Search) to identify documents in places where they should not be. Perform the search using sensitive keywords, or words that should not appear in files within a particular directory.
- Using tools to perform dirty word searches to detect inappropriate information flows. Some intrusion detection systems and packet sniffing software might be useful in this regard. At a minimum, monitor information flows at the network edge to determine if sensitive information is inappropriately leaving the system.
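The periodic keyword search described above can be automated with a short script. Below is a minimal sketch in Python: it walks a directory tree and reports any files containing terms from a "dirty word" list. The keyword list and the scanned path are hypothetical placeholders; a real deployment would use your organization's own data classification terms (project code names, "confidential" markings, and so on) and run on a schedule against the shared repositories you want to audit.

```python
import os

# Hypothetical keyword list -- replace with your organization's own
# sensitive terms (classification markings, project code names, etc.).
DIRTY_WORDS = ["confidential", "ssn", "salary"]

def scan_directory(root):
    """Walk a directory tree and return (path, matched_keywords) pairs
    for every file whose contents contain a dirty word."""
    findings = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                # Read as text, ignoring undecodable bytes, so binary
                # files do not abort the scan.
                with open(path, "r", encoding="utf-8", errors="ignore") as fh:
                    text = fh.read().lower()
            except OSError:
                continue  # unreadable file; skip and move on
            hits = [word for word in DIRTY_WORDS if word in text]
            if hits:
                findings.append((path, hits))
    return findings

if __name__ == "__main__":
    # "/shared/public" is a placeholder for a directory that should
    # never hold sensitive material.
    for path, hits in scan_directory("/shared/public"):
        print(f"{path}: matched {', '.join(hits)}")
```

Substring matching this simple will produce false positives (e.g. "ssn" inside an unrelated word), so a production version might use word-boundary regular expressions and restrict scanning to specific file types.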
All sensitive information within an organization needs to be explicitly identified. Users of the information should be made aware of the different sensitive data types and be provided guidance on approved handling methods. Technical controls that monitor information flows or flag sensitive information in inappropriate locations help counteract mistakes, oversights, and perhaps even malicious activity. Knowing the location of your sensitive data is the first step toward protecting its confidentiality, integrity, and availability.