PIIscan: Find and Secure Unencrypted Personal Data

SecurityMetrics PIIscan Helps You Comply with Security Standards and Mandates. 

What is PII, and why do I need to find it?

Personally Identifiable Information (PII) is data kept by an organization which can be used to “distinguish or trace an individual’s identity,” according to NIST. For example, PII could include names, birth dates, birth places, mothers’ maiden names, or social security numbers. “Linked PII” is any information that is linkable to an individual, like educational, medical, employment, or financial information.

Storing these types of information unencrypted on your systems and devices can leave your organization open to fines and make you more vulnerable to data theft.

Organizations can manually search for PII on their systems and devices, but doing so is time-consuming, tedious, and expensive in terms of working hours.

Sensitive Data Discovery Tool: SecurityMetrics PIIscan

PIIscan was created to help organizations quickly find and secure unencrypted PII on their systems. The data discovery tool is now widely available and helps organizations and businesses of all sizes comply with data security mandates and standards in the US and EU. 

This scanner runs light, but performs a big job. According to Product Manager Kai Whitaker, “PIIscan is designed to be quick, small, and powerful. Organizations find value and increase their security through the effective scanning that PIIscan provides.”

SEE ALSO: SecurityMetrics Releases PIIscan

Unencrypted PII hides in unexpected places

Of all the organizations that conducted first-time data discovery scans with SecurityMetrics PIIscan, 61% found unencrypted PII in their networks. Many times, this sensitive data shows up in accounting, marketing, or other unexpected areas or departments.

Caches of unencrypted PII are highly valuable to data thieves. PIIscan searches systems, hard drives, and attached storage devices for unencrypted sensitive data. If it finds unencrypted sensitive data, it provides the path to the file where that information is located.

GDPR, PCI DSS, and HIPAA

If you are fulfilling the requirements of security standards and mandates like the EU’s General Data Protection Regulation (GDPR), the Payment Card Industry Data Security Standard (PCI DSS), or the Health Insurance Portability and Accountability Act (HIPAA), it’s important to know where PII is on your systems and whether it’s encrypted or not.

PIIscan searches not only for PII, but also for payment card data like primary account numbers and magnetic stripe track data. PIIscan finds the following information:

USA Social Security Numbers (SSN)
UK National Insurance Numbers (NINO)
Canada Social Insurance Numbers (SIN)
Australian Tax File Numbers (TFN)
Australian Business Numbers (ABN)
Primary account numbers (PAN)
Magnetic stripe track data
Protected Health Information (PHI)
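
As a rough illustration of how this kind of discovery works, here is a minimal Python sketch of pattern-based matching for two of the data types above. It is not PIIscan’s actual detection logic; the regular expressions, the Luhn check, and the sample text are simplified assumptions for illustration only.

```python
import re

# Simplified patterns for illustration; real discovery tools use much more
# robust rules and context checks to reduce false positives.
SSN_RE = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")        # USA SSN, dashed form
PAN_RE = re.compile(r"\b(?:\d[ -]?){13,16}\b")       # candidate payment card number

def luhn_valid(candidate: str) -> bool:
    """Return True if the digit string passes the Luhn checksum used for PANs."""
    digits = [int(c) for c in candidate if c.isdigit()]
    checksum = 0
    for i, d in enumerate(reversed(digits)):
        if i % 2 == 1:       # double every second digit from the right
            d *= 2
            if d > 9:
                d -= 9
        checksum += d
    return len(digits) >= 13 and checksum % 10 == 0

def scan_text(text: str, source: str) -> list:
    """Return (source, type, match) tuples for possible unencrypted sensitive data."""
    findings = [(source, "SSN", m) for m in SSN_RE.findall(text)]
    findings += [(source, "PAN", m) for m in PAN_RE.findall(text) if luhn_valid(m)]
    return findings

sample = "Order note: card 4111 1111 1111 1111, applicant SSN 123-45-6789"
for source, kind, match in scan_text(sample, "sample.txt"):
    print(f"{source}: possible {kind} -> {match}")
```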

SEE ALSO: GDPR 101 Part 1: Should I Be Worried?

More tips to help you find and protect PII:


1. Monitor your PII data flow
To help find PII flows you might not immediately know about, create and regularly update a PII flow diagram that tracks the processes you go through as you receive, use, store, or transmit sensitive data.

This will help you see where PII enters and exits your organization.

Here are some areas where unprotected PII may be hiding:
  • Printers often store old jobs, which could include sensitive data
  • Error logs frequently record sensitive numbers in plaintext, for example during a failed authentication attempt
  • Accounting and marketing departments may have email or paper forms with PII
  • Web browser cache may store PII inadvertently

2. Secure and Encrypt PII
When possible, avoid using and storing PII. You can also avoid storing sensitive data by using tokenization or outsourcing sensitive data handling to a third party.

But if you do need to keep data, make sure to find and encrypt PII. All electronic PII that is received, stored, handled, or transmitted in your systems and work devices must be encrypted. Industry best practice would be to use AES-128, AES-256, or better.
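
As a rough illustration of what encrypting a file containing PII might look like, here is a minimal Python sketch using AES-256-GCM via the widely used cryptography package. Key management (generating, storing, and rotating keys) is the hard part in practice and is deliberately glossed over here; the file name and key handling below are assumptions for illustration only.

```python
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def encrypt_file(path: str, key: bytes) -> None:
    """Encrypt a file with AES-256-GCM, writing nonce + ciphertext to <path>.enc."""
    with open(path, "rb") as f:
        plaintext = f.read()
    nonce = os.urandom(12)                     # 96-bit nonce, unique per encryption
    ciphertext = AESGCM(key).encrypt(nonce, plaintext, None)
    with open(path + ".enc", "wb") as f:
        f.write(nonce + ciphertext)

def decrypt_file(path: str, key: bytes) -> bytes:
    """Decrypt a file produced by encrypt_file and return the plaintext."""
    with open(path, "rb") as f:
        blob = f.read()
    nonce, ciphertext = blob[:12], blob[12:]
    return AESGCM(key).decrypt(nonce, ciphertext, None)

key = AESGCM.generate_key(bit_length=256)      # in practice, store this in a key vault
encrypt_file("patient_export.csv", key)        # hypothetical file name
print(decrypt_file("patient_export.csv.enc", key)[:40])
```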

3. Segment Your Networks
While not all mandates require network segmentation, it’s considered security best practice to keep your networks that handle sensitive data like PII separate from your other networks.

Whether done physically or through firewall implementation, make sure systems that receive, store, handle, or transmit sensitive data are kept separate from the rest of your network, and verify that separation with regular “segmentation checks.”
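
Below is a minimal Python sketch of the kind of basic connectivity test a segmentation check performs: from a host on an out-of-scope network, attempt to reach systems in the sensitive segment and flag anything that answers. The host addresses and ports are hypothetical, and a real segmentation check (for example, one performed for PCI DSS) is far more thorough than this.

```python
import socket

# Hypothetical addresses and ports of systems in the sensitive (in-scope) segment.
SENSITIVE_HOSTS = ["10.10.1.5", "10.10.1.6"]
PORTS_TO_TEST = [22, 443, 3389, 1433]

def reachable(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within the timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Run this from a machine on a network that should NOT reach the sensitive segment.
for host in SENSITIVE_HOSTS:
    for port in PORTS_TO_TEST:
        if reachable(host, port):
            print(f"SEGMENTATION GAP: {host}:{port} is reachable from this network")
```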

Learn more about sensitive data discovery tools, or call us about a PCI audit or HIPAA audit.




GDPR 101 Part 1: Should I Be Worried?


What you need to know now about the EU’s General Data Protection Regulation (GDPR).  

Gary Glover
SVP, Assessments
CISSP, CISA, QSA, PA-QSA
With the EU’s GDPR compliance date looming (May 25, 2018), businesses are in varying states of readiness and awareness. Many are likely wondering, should I be worried? What changes do I need to make? What does it mean to be GDPR compliant?

This post is the first of a three-part series in which we will cover basics and requirements of the GDPR. This series is based on our recent “GDPR 101” Webinar. You can watch and listen here.

Who does the EU GDPR apply to?


The EU GDPR applies to any organization that handles the Personally Identifiable Information (PII) of European Union (EU) citizens. Whether an organization is in America, Europe, or somewhere else in the world—the GDPR may apply if it handles the PII of EU citizens.

Following GDPR guidelines will be very important for such companies or organizations. It’s also important to understand that cloud services will not be exempt from the GDPR.

SEE ALSO: Complying with the GDPR: What You Should Know

What is GDPR?


The GDPR replaces the 1995 EU Data Protection Directive. The new GDPR legislation is meant to unite and harmonize privacy laws across the EU. Before the GDPR, different businesses throughout the EU did slightly different things for data protection.

After four years of preparation and debate, the GDPR was approved by the EU Parliament on April 14, 2016. It entered into force 20 days after its publication in the EU Official Journal and will be directly applicable in all member states roughly two years later, on May 25, 2018. After this date, organizations that are not following the GDPR could potentially face severe fines.

At this time, no one can guarantee how severe fines will be, or what types of businesses may be examined for non-compliance first, but after May 25th GDPR becomes enforceable.

GDPR Compliance


Some aspects of the GDPR are easy to interpret. For example, the GDPR says that data owners are required to have an opt-in choice presented to them before a company can begin storing, processing or transmitting their personal information. This requirement is clear, and one could easily determine whether or not that requirement has been met.
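
As a simple illustration, here is a minimal Python sketch of how a developer might record that opt-in choice so consent can be demonstrated later. The field names and structure are assumptions for illustration, not a GDPR-mandated format.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class ConsentRecord:
    """One data subject's opt-in decision, kept so consent can be demonstrated later."""
    subject_id: str
    purpose: str          # what the data will be used for, as presented to the subject
    granted: bool
    recorded_at: datetime

def record_opt_in(subject_id: str, purpose: str, granted: bool) -> ConsentRecord:
    return ConsentRecord(subject_id, purpose, granted, datetime.now(timezone.utc))

# Example: only proceed with processing if consent was explicitly granted.
consent = record_opt_in("user-123", "email marketing", granted=True)
if not consent.granted:
    raise PermissionError("No opt-in consent; do not store or process this data")
```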

However, other aspects are more difficult to interpret. For example, the GDPR requires “data protection by design and by default.” It’s difficult to know whether you are perfectly compliant with a specific GDPR requirement based on that statement alone.

Even though GDPR compliance isn’t currently as well-defined as Payment Card Industry Data Security Standard (PCI DSS) compliance, it’s important to be aware, be concerned, and be reasonable. It’s impossible to say with absolute clarity that an entity is 100% compliant with GDPR, because associated testing procedures are not specifically defined. Perhaps this will come later; various supervisory authorities are working on checklists and similar guidance, which indicates that there will likely be more specific audit protocols as time goes on.

For the time being, you can actively and carefully address GDPR regulations, document your efforts, collect your results, and show risk analysis/assessment results.

Why Should I Care about GDPR?


GDPR guidelines state that an entity can face fines of up to 20 million Euros or 4% of their Global Annual Turnover (AKA “revenue” in the U.S.), whichever is greater. Note that this is the maximum fine amount, and there doesn’t appear to be additional guidance describing a specific fine structure for various types of data compromise or general lack of preparation, other than the regulation stating that a fine could be less than 4% (e.g., 2% of revenue or 10 million Euros).
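
As a quick worked example of that maximum (hedged, since actual fines are set by supervisory authorities case by case), here is the "greater of 20 million Euros or 4% of turnover" calculation in a few lines of Python:

```python
def gdpr_max_fine(annual_turnover_eur: float) -> float:
    """Upper tier of GDPR fines: the greater of 20 million EUR or 4% of global turnover."""
    return max(20_000_000, 0.04 * annual_turnover_eur)

print(gdpr_max_fine(1_000_000_000))   # 40,000,000 EUR: 4% of turnover exceeds the 20M floor
print(gdpr_max_fine(200_000_000))     # 20,000,000 EUR: the 20M floor applies
```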

We want to reiterate that we’re not saying the sky is falling. But, you should be aware of these regulations and make plans for any necessary changes.

Part 2 of this blog series will go into more depth on terms and definitions, but it’s important to understand the difference between Data Processors and Data Controllers and know that the GDPR rules and requirements apply to both of them:


  • Data Controller: Entities or individuals that need to process personal data in order to do business. They determine the purposes for which and the manner in which the personal data is processed. 
  • Data Processors: Processors take and/or process personal data on behalf of the Controller. 


When Do I Need to Worry about GDPR?


You have until May 25, 2018 to start complying with GDPR regulations. Right now, we don’t know what types of organizations the governing bodies will go after, or how aggressively. All we know is that after May 25 of this year, they can.

If your company has poor security practices that endanger personal information, it makes sense that you could get in trouble under these EU laws and regulations. On the other hand, if your company takes data security seriously and is actively moving towards alignment with the GDPR or other data security standards, you will naturally fare better.

Remember, May 25, 2018 is not the end of the world. We all tend to fear the worst when a line is drawn in the sand, but someone has to draw one to get us all moving.

As security professionals, it’s our job to help companies clear up security issues. Our experience shows that addressing security and compliance problems may take time. The community has known about this regulation for two years now, so ignoring these regulations will not make them go away. Get started soon and you will see real progress.

Showing real progress in securing PII is important because this demonstrates you’re working towards compliance. If you were to experience a data breach but couldn’t show any proactive work towards security, enforcement of the regulation could be stricter.

If you’re looking to learn more about the GDPR, the Information Commissioner’s Office (ICO) is a UK organization that was set up to uphold information rights for UK citizens.

SEE ALSO: PIIscan: Find and Secure Unencrypted Personal Data

Part 2 of The GDPR 101 Blog Series


Watch for part 2 of our GDPR 101 blog series, which will cover specific terms, requirements, and details of the GDPR.

Gary Glover (CISSP, CISA, QSA, PA-QSA) is Senior VP of Security Assessments at SecurityMetrics with over 10 years of PCI audit experience and 25 years of Star Wars quoting skills. May the Force be with you as you visit his other blog posts.




How to Start a HIPAA Risk Analysis


A step-by-step process and template to help you start along your risk analysis journey. 

George Mateaki
CISSP, CISA, QSA, PA-QSA
Find the HIPAA risk analysis template here.

A risk analysis is the first step in an organization’s Security Rule compliance efforts. It’s the “physical” check-up that ensures all security aspects are running smoothly and any weaknesses are addressed. And contrary to popular belief, a HIPAA risk analysis is not optional.

The HHS issued guidance for risk analysis requirements that explains in additional detail the purpose of a risk analysis.

“Conducting a risk analysis is the first step in identifying and implementing safeguards that comply with and carry out the standards and implementation specifications in the Security Rule. Therefore, a risk analysis is foundational…”

A risk analysis is foundational to your security. You can’t be HIPAA compliant without one.

What is a HIPAA risk analysis?

A risk analysis is a way to assess the potential vulnerabilities, threats, and risks to protected health information (PHI) at your organization. Though the HHS did not specify an exact risk analysis methodology, they do require certain elements be present in a risk analysis, which we’ll talk about later, namely:

  • Scope analysis
  • Data collection
  • Vulnerabilities/threat identification
  • Assessment of current security measures
  • Likelihood of threat occurrence
  • Potential impact of threat
  • Risk level
  • Periodic review/update as needed

Risk Analysis Methodology

There are a variety of methods to conduct a HIPAA risk analysis, but I’ve described the method I’ve found to work best below. This is a condensed version of the method I use during onsite HIPAA Risk Analyses.

Please understand, conducting a complete and thorough risk analysis is extremely difficult to do yourself. I recommend contracting with a HIPAA auditor to help you. The problem is that most people simply don’t know where to look, or they bypass things because they don’t understand data security. If the Risk Analysis is foundational to your security, then you don’t want to overlook key Risk Analysis elements. (Learn the pros and cons of a HIPAA audit)
So let’s dive a little deeper into the methodology of how to conduct a risk analysis.

Step 1: Define scope by defining PHI flow in your environment

To identify your scope ("scope" meaning: the areas of your organization you need to secure), you have to understand how patient data flows within your organization. If you know all the places your organization houses, transmits, and stores PHI, you'll be able to better safeguard those potentially vulnerable places.

There are four main parts to consider when defining your scope.
  • Where PHI starts or enters your entity
  • What happens to it in your system
  • Where PHI leaves your environment
  • Where potential or existing leaks are
Where PHI enters your environment
In the PHI lifecycle, it’s important to identify all PHI inputs. By doing this, you can make sure you identify exactly where security should begin at your organization.

When considering the origination of PHI, think of both new and existing patient records. PHI can originate anywhere from patients filling out their own information on paper forms to business associates faxing you requests for more information about a current or former patient.

Here’s a list of places to get you started in the documentation of where PHI enters your environment.
  • Email: How many computers do you have, and who can log on to each computer?
  • Texts: How many mobile devices do you have, and who owns them?
  • EHR entries: How many staff members do you have entering in data?
  • Faxes: How many fax machines do you have?
  • USPS: How is incoming mail handled?
  • New patient papers: How many papers are patients required to fill out, and where? Front desk? In the examination room?
  • Business associate communications: How do business associates communicate to you?
  • Databases: Do you receive marketing databases of potential patients to reach out to?

What happens to PHI in your environment, including where it is stored
It’s not enough just to know where PHI begins. You must know exactly what happens to it once it enters your environment. Does it go directly to accounting? Is it automatically stored in your EHR? If it is emailed, is it encrypted?

To adequately understand what happens to PHI in your environment, you must record all hardware, software, devices, systems, and data storage locations that touch PHI in any way.

Here’s a list of places to get you started.
  • Filing cabinets
  • Mobile devices
  • EHR/EMR systems
  • Calendar software
  • Email
  • Servers
  • Workstations
  • Networked medical devices
  • Laptops
  • Computers
  • Operating systems
  • Applications
  • Encryption software
How does PHI leave your environment?
A lot of workforce members forget that they must protect PHI throughout its entire lifecycle. And that includes when it leaves your hands. If PHI leaves your organization, it is your job to ensure it is transmitted or destroyed in the most secure way possible. You, along with your business associate, are responsible for how the business associate handles your PHI.

Here are some things to consider when PHI leaves your environment.
  • Business associates
    • Encrypted transmission
    • Minimum necessary
    • Lifecycle with the BA
  • Recycling companies
  • Trash bins on computers

Where does PHI leak?
Now that you are the expert on what happens during the PHI lifecycle, it’s time to find the gaps. These gaps in security and environment weaknesses are the whole reason we define scope. Weaknesses make it possible for unsecured PHI to leak into or out of your environment.

The best way to find all possible leaks is by creating a PHI flow diagram. Essentially, a PHI flow diagram documents all the information you found above, and lays it out in a graphical format. It’s a lot easier to understand PHI trails when looking at a diagram.
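
If you would rather generate the diagram programmatically than draw it by hand, here is a minimal sketch using the graphviz Python package (which requires the Graphviz binaries to be installed). The nodes and flows shown are hypothetical examples, not a prescribed format.

```python
from graphviz import Digraph

# Hypothetical PHI flows: (where the data comes from, where it goes next).
flows = [
    ("Patient intake form", "Front desk workstation"),
    ("Front desk workstation", "EHR system"),
    ("EHR system", "Billing / accounting"),
    ("Billing / accounting", "Business associate (clearinghouse)"),
]

diagram = Digraph("phi_flow", comment="Where PHI enters, moves, and leaves")
for source, destination in flows:
    diagram.edge(source, destination)

diagram.render("phi_flow", format="png", cleanup=True)  # writes phi_flow.png
```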

We’ll discuss environment weaknesses further in Step 2.

SEE ALSO: PIIscan: Find and Secure Unencrypted Personal Data

Step 2: Identify Vulnerabilities, Threats, and Risks to Your Patient Data


Now that you know how PHI flows in your organization, and can better understand your scope, you have to find the problems within that scope. For each of the identified areas above, you must identify:
  • What vulnerabilities exist in the system, application, process or people
  • What threats, internal, external, environmental and physical, exist for each of those vulnerabilities
  • What is the probability of each threat triggering a specific vulnerability? This is the risk.
As you think about your vulnerabilities, threats, and risks, keep in mind these categories in particular:
  • Digital: (e.g., setting a weak password on an EHR system)
  • Physical: (e.g., not shredding PHI, inaccessibility of facility)
  • Internal: (e.g., employee checks personal email and downloads malware)
  • External: (e.g., hacker trying to breach your remote access software)
  • Environmental: (e.g., fire destroys the building your backups are kept in)
  • Negligent: (e.g., employee accidentally leaving patient data visible on an examination room computer)
  • Willful: (e.g., employee snooping on celebrity, ex-spouse/companion, or family member)
Download and print this HIPAA Risk Analysis worksheet to help you jot down your ideas.

What are your vulnerabilities?

A vulnerability is a flaw in components, procedures, design, implementation, or internal controls. Vulnerabilities can be fixed.

The HHS explains further, “Vulnerabilities, whether accidentally triggered or intentionally exploited, could potentially result in a security incident, such as inappropriate access to or disclosure of ePHI. Vulnerabilities may be grouped into two general categories, technical and nontechnical. Non-technical vulnerabilities may include ineffective or non-existent policies, procedures, standards or guidelines. Technical vulnerabilities may include: holes, flaws or weaknesses in the development of information systems; or incorrectly implemented and/or configured information systems.”

Examples of vulnerabilities I’ve seen while conducting a HIPAA risk analysis:
  • Unpatched operating system software
  • Website coded incorrectly
  • No office security policies
  • Misconfigured or no firewall
  • Computer screens in view of public patient waiting areas
What are your threats?
A threat is the potential for a person or thing to trigger a vulnerability. Generally, it’s difficult for threats to be controlled. Even though most remain out of your control to change, they must be identified in order to assess the risk. Physical location, organization size, and systems all have the potential to be a threat.

According to the HHS, “There are several types of threats that may occur within an information system or operating environment. Threats may be grouped into general categories such as natural, human, and environmental.”

Examples of threats I’ve seen while conducting a HIPAA risk analysis:
  • Geological threats, such as landslides, earthquakes, and floods
  • Hackers downloading malware onto a system
  • Inadvertent data entry or deletion of data
  • Power failures
  • Chemical leakage
  • Workforce members
  • Business associates
What are your risks?
Risks are the probability that a particular threat will exercise a particular vulnerability, and the resulting impact on your organization.

Let me explain with an example.

In a system that allows weak passwords, the vulnerability is the weak password policy itself. The threat is that a hacker could crack the password and break into the system. The risk is the resulting exposure of unprotected PHI in your system.

According to the HHS, “risk is not a single factor or event, but rather it is a combination of factors or events (threats and vulnerabilities) that, if they occur, may have an adverse impact on the organization.”

Examples of risks I’ve seen while conducting a HIPAA risk analysis:
  • Remote access to a PHI system with a weak password. There is an extremely high probability (“high” risk) that an external hacker will brute force the password and gain access to the system.
  • Windows XP machine with access to the Internet. There is an extremely high probability (“high” risk) that an external hacker will exploit security flaws (there is no longer support for WinXP) using malicious software and gain access to PHI.
As we talk about vulnerabilities, threats, and risk, I want to reiterate my plea with you to consult a security professional. Even above-average compliance superstars only have a minimal understanding of vulnerabilities and threats. It’s crucial to ask a professional for help with your risk analysis.

Step 3: Analyze HIPAA Risk Level and Potential Impact

Now that you’ve identified any possible security problems in your organization (and there should be a lot), you need to bring that list back to reality. It’s time to decide what risks could and will impact your organization. This risk and impact prioritization is a crucial part of your risk analysis that will eventually translate to your risk management plan.

To analyze your risk level, you must first consider the following:
  • Likelihood of occurrence: Just because you are threatened by something doesn’t necessarily mean it will have an impact on you. For example, an organization in Texas and an organization in Vermont technically could both be struck by a tornado. However, the likelihood of a tornado striking Texas is a lot higher than Vermont. So, the Texas-based organization’s tornado risk level will be a lot higher than the Vermont-based organization’s.
  • Potential impact: What effect would the particular risk you are analyzing have on your organization? For example, while a computer screen might accidentally show PHI to a patient in the waiting room, it probably won’t have as big of an impact as a hacker attacking your unsecured Wi-Fi and stealing all your patient data.
Every vulnerability and associated threat should be given a risk level. I typically rate mine as ‘high’, ‘medium’, or ‘low’. By documenting this information, you’ll have a prioritized list of all security problems at your organization.
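
Here is a minimal Python sketch of how that prioritization might be recorded; the 1-to-3 scoring scale and the thresholds are assumptions for illustration, not an HHS-mandated formula.

```python
from dataclasses import dataclass

@dataclass
class RiskItem:
    vulnerability: str
    threat: str
    likelihood: int   # 1 (rare) to 3 (likely)
    impact: int       # 1 (minor) to 3 (severe)

    @property
    def score(self) -> int:
        return self.likelihood * self.impact

    @property
    def level(self) -> str:
        return "high" if self.score >= 6 else "medium" if self.score >= 3 else "low"

risks = [
    RiskItem("Weak remote access password", "External brute-force attack", 3, 3),
    RiskItem("Screens visible in waiting room", "Patient views another patient's PHI", 2, 1),
]

# Prioritized list: highest-scoring risks first, ready for the risk management plan.
for item in sorted(risks, key=lambda r: r.score, reverse=True):
    print(f"{item.level.upper():6} ({item.score}) {item.vulnerability} -> {item.threat}")
```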

Download this risk analysis template worksheet to help you start documenting your risks.

Step 4: Identify Top Security Measures Based on Top HIPAA Risks

Now that you have a prioritized list of all your security problems, it’s time to start mitigating them! Starting with the top-ranked risks, identify the security measure that fixes each problem.

For example, if your risk is employees throwing PHI in the trash, your security measure could be quarterly employee security training and replacing trashcans with shredders.

Technically, once you’ve documented all the steps you’ll take, you’re done with the Risk Analysis! The implementation phase of fixing your security problems is actually part of your risk management plan (another crucial step towards HIPAA compliance.)

Step 5: Rinse, Repeat

A risk analysis is truly a rinse and repeat process. One of the most important parts of your risk analysis is documentation. If you don’t document steps 1-4, you can’t prove to the HHS that you’ve done a complete and thorough risk analysis. They will want to see documentation, your risk management plan, and monthly progress on addressing the items identified in that risk management plan.

There is a lot to do, and it can be overwhelming. Don’t try to do it all at once, but start now and schedule time each week or at least once per month to work on your HIPAA compliance.

George Mateaki (CISSP, CISA, QSA, PA-QSA) is a Security Analyst at SecurityMetrics with an extensive background in Information Security and 20+ years in IT.


2017 PCI DSS Data Breach Trends

What can we learn about PCI compliance from data breaches in 2017?  

Noncompliance with PCI Requirements

2017 was a year marked by massive hacks like Equifax, rampant malware like WannaCry and Petya, notable vulnerabilities like KRACK, as well as changes to and guidance about the Payment Card Industry Data Security Standard (PCI DSS).

But one thing remains the same year after year: complying with the PCI DSS makes organizations more secure. Failure to comply with even one of the PCI requirements can set a company up for an inevitable data breach and theft. Our Forensics department has compiled statistics from its own PCI data breach investigations in 2017. This PCI Data Breach Visualization shows which specific compliance failures contributed to the data breaches our Forensics department investigated.



Most Common Contributors to a Data Breach

Noncompliance with PCI Requirement 10, “Implement Logging and Log Monitoring,” was the issue most frequently associated with a data breach, appearing in 73% of all breaches SecurityMetrics investigated.

Log monitoring systems (e.g., Security Information and Event Management [SIEM] tools) track network activity, inspect system events, alert you to suspicious activity, and store records of user actions that occur inside your systems. They can help warn you about a data breach by providing data known as event logs, audit records, and audit trails. Regular log monitoring means a quicker response time to security events and a more effective security program.
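
A full SIEM does much more, but here is a minimal Python sketch of the basic idea behind log monitoring: scan an authentication log for repeated failed logins and raise an alert. The log format, file path, and alert threshold are assumptions for illustration only.

```python
import re
from collections import Counter

# Matches the failed-login lines written by OpenSSH in a typical syslog auth log.
FAILED_LOGIN = re.compile(r"Failed password for (?:invalid user )?(\S+) from (\S+)")
ALERT_THRESHOLD = 10   # failed attempts from one source before alerting

def check_auth_log(path: str) -> None:
    """Count failed logins per source IP and flag anything over the threshold."""
    failures = Counter()
    with open(path) as log:
        for line in log:
            match = FAILED_LOGIN.search(line)
            if match:
                user, source_ip = match.groups()
                failures[source_ip] += 1
    for source_ip, count in failures.items():
        if count >= ALERT_THRESHOLD:
            print(f"ALERT: {count} failed logins from {source_ip}")

check_auth_log("/var/log/auth.log")   # typical location on Debian/Ubuntu systems
```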

In second place for contribution to data breaches is noncompliance with PCI requirement 11, “Conduct Vulnerability Scans and Penetration Testing.” These scans and tests are critical for finding vulnerabilities and testing your in-place security measures. There are different types of vulnerability scans and penetration tests depending on your business or PCI audit needs.

2017 Forensic Data Takeaways and Tips

The average organization was vulnerable for 1,549 days
  • The “window of vulnerability” is the time during which a system, environment, software, and/or website could potentially be exploited by an attacker. 
  • To reduce the window of vulnerability, implement and monitor standard security controls as found in the PCI DSS. Testing your security through traditional means like penetration testing and vulnerability scanning is a great place to start. 
Cardholder data was captured for an average of 237 days
  • This is the window of time during which data is being recorded, gathered, and/or stored from an unauthorized source.
Cardholder data was exfiltrated for an average of 264 days
  • “Exfiltration” is the unauthorized transfer of data from a system (e.g., exporting).
45% of organizations were breached through remote access
  • Remote access remains one of the most common hacking vectors because businesses often configure their remote access application insecurely. 
  • Limit which employees have remote access and implement multi-factor authentication.
21% of organizations were breached through malicious code
39% of organizations had memory-scraping malware on their system
97% of organizations had firewalls in place at the time of compromise, but at least 15% of firewalls did not meet PCI requirements
  • Make sure firewalls are configured correctly and assign someone to monitor your firewall logs daily. You need to document your firewall-related policies and procedures, as well as regularly review and test your firewall rule sets. 

Now is the time to look back at forensic lessons and apply them to your current security practices. You can look forward to increased security in 2018 if you adjust your security practices to more closely follow the Payment Card Industry Data Security Standard.


Employee Data Security Training: Tabletop Exercises


Learn how to prepare for a data breach by conducting drills, exercises, and trainings. 

David Ellis
SVP, Investigations
CISSP, PFI, QSA
Massive data breaches—and their devastating aftermath—are increasing in frequency. Businesses have taken renewed interest in planning for inevitable attacks on, and possible breaches of, their cardholder data environment. In this post, we’ll cover best practices for holding “tabletop exercises” like drills, discussions, and trainings, to make sure your business is prepared and protected in the event of a breach.

In its Guide for Cybersecurity Event Recovery, the National Institute of Standards and Technology (NIST) states that your incident response plan will have six phases:

  1. Preparation
  2. Identification 
  3. Containment
  4. Eradication
  5. Recovery
  6. Lessons Learned


This article focuses on parts of Phase 1, “Preparation,” because it’s different from the other five phases: it is the foundation of your entire incident response plan. Technically, you should always be in Phase 1, holding regular training, drills, and incident response plan reviews. You will perform the vast majority of your planning and work during this phase, and you should:


  • Ensure your employees are properly trained regarding their incident response roles and responsibilities in the event of a data breach.
  • Develop incident response drill scenarios and conduct mock data breaches, at least annually, to evaluate the effectiveness of your incident response plan.
  • Ensure that all aspects of your incident response plan (training, execution, hardware and software resources, etc.) are approved, funded, available, and inventoried, in advance.


Your response plan should be thoroughly written, explaining everyone’s roles and responsibilities in detail, and you should document to whom it is distributed as well as the dates they received training regarding their role(s). Then the plan must be tested in order to ensure that your employees will perform as they were trained. The more prepared your employees are, the less likely they’ll make critical mistakes in the event of an actual breach incident.

SEE ALSO: 5 Things Your Incident Response Plan Needs

Types of Cyber Security Training Exercises


The point of running security incident response exercises is to increase awareness, test training effectiveness, and start discussions. Everyday drills and exercises can be as short as 15 minutes, while large-scale coordinated drills can last up to a day or two. Your annual training should include at least one large-scale drill, while smaller tabletop drills may be conducted more frequently according to your company’s needs.

DISCUSSION-BASED EXERCISE
In a discussion-based tabletop exercise, you and your staff discuss response roles in hypothetical situations. A discussion-based tabletop exercise is a great starting point because it doesn’t require extensive preparation or resources, while still testing your team’s understanding of their incident response roles in potential real-life scenarios without risk to your organization. However, this exercise doesn’t fully test your incident response plan or your team’s actual response actions.

SIMULATION EXERCISE
In a simulation exercise, your team tests their incident responses through a live walk-through that has been highly choreographed and planned. This exercise allows participants to experience how events actually happen in semi-real time, helping your team better understand their roles. Simulation exercises require more time to plan and coordinate, while still not completely testing your team’s capabilities.

PARALLEL TESTING
In parallel testing, your incident response team actually tests their incident response roles in a safe test environment. Parallel testing is the most realistic simulation possible and provides your team with the best feedback about their roles. However, parallel testing is more expensive and requires more planning time than other exercises because you need to simulate an actual production environment (e.g., segregated systems and networks).

Data Security Training Tips

Before running through your exercises, consider these questions:


  • Have your security policies and incident response plan been approved by appropriate management?
  • Has everyone been trained on your security policies?
  • Does the Incident Response Team understand their roles, including making any required notifications?
  • Are all Incident Response Team members prepared to participate in mock drills?


When designing your tabletop exercise, prepare the following exercise information:


  • A facilitator guide that documents your exercise’s purpose, scope, objective, and scenario
  • A list of questions to address your exercise’s objectives
  • A participant briefing that includes the exercise agenda and logistics information
  • A participant guide that includes the same information as the facilitator guide, without the facilitator guide questions (or it may include a shorter list of questions designed to prepare participants)
  • An after-action report that documents the evaluations, observations, and lessons learned from your tabletop exercise staff


As you conduct your exercises, keep an eye out for a few things:

Task Timing:  Note how long certain tasks and operations seem to take under pressure. How long does it take to disconnect all of your potentially affected systems from the internet? How quickly can the team get a formal statement together? How quickly can they pull together a list of affected customers? If you do experience a data breach, there may be requirements for how soon you need to report it—especially if the suspected breach includes either HIPAA or PCI data.

Increased Volume:  You should test the ability of departments (like your call center, IT department, website, etc.) to expand and meet the demands of a data breach’s aftermath. Can your IT team handle an increase in internal requests? How many customer support calls can you realistically handle? Who will deal with the increase in customer questions on your website or your social media accounts?

Snags in the Plan:  Expect that some things may not go as planned.  This is not a cause for panic, as discovering issues that you were not prepared for is one of the primary reasons for conducting mock breach exercises.  Simply watch out for anomalies, note them, and address them either during the course of the mock exercise, or through the after-action process. This will also help you to develop contingency plans and alternative action scenarios. For instance, if someone plays a critical role in the incident response plan but happens to be out of office that day, what will you do?

After conducting a mock data breach exercise, be sure to set up a debrief meeting to discuss response successes and weaknesses. Your team’s input will help you know where and how to make necessary revisions to your incident response plan and training processes.

LEARN MORE: PCI AUDIT, DATA SECURITY

A Different Kind of Prevention


While it would be great to be able to prevent each and every data breach before it happens, in today’s world it’s often just not possible. While you still need to take all appropriate security measures and comply with all PCI and/or HIPAA requirements, it’s important to keep a “not if, but when” mindset when creating your incident response plan—remember, the designers of the Titanic didn’t think that it could sink. These exercises are an important part of your company’s security habit, and are intended to test your response plan, increase confidence, identify related strengths and weaknesses, and decrease collateral damage.

David Ellis (GCIH, QSA, PFI, CISSP) is Director of Forensic Investigations at SecurityMetrics with over 25 years of law enforcement and investigative experience. Check out his other blog posts.