Top Mistakes to Avoid When Building an Enterprise DLP Solution
18 cybersecurity pros weigh in on the considerations companies should make and the mistakes to avoid when building an enterprise data loss prevention solution.
There's much to consider when it comes to building an enterprise DLP solution, from understanding where your data exists to proper data classification, accounting for shadow IT, and more. First and foremost, you need to start with a clear data loss prevention strategy, evaluate existing systems and solutions, perform a complete data audit, and then take integration into consideration before you even begin the process of building a DLP solution. But these bare essentials don't begin to scratch the surface of the full range of considerations companies must weigh when building a DLP solution.
That's why it's not surprising that many companies overlook important considerations and make some potentially damaging mistakes in the process. To help you avoid common mistakes, we reached out to a panel of cybersecurity pros and asked them to answer this question:
"What are the biggest mistakes companies make when building an enterprise Data Loss Prevention solution?"
Meet Our Panel of Cybersecurity Pros:
Read on to find out what mistakes you could be making in building an enterprise DLP solution (and how you can avoid them).
Faith Kubicki is the Content Marketing Manager for IntelliChief, a company that provides enterprise-level document and data management solutions for mid- and large-size businesses.
"One of the biggest mistakes that we see is..."
When customers opt for cold-site recovery packages to cut costs. This is the least expensive option – and it's better than nothing – but it doesn't offer the best functionality. With cold-site recovery, documents are still configured but not accessible while equipment is set up and brought back online. The complete recovery process can take days, depending on the circumstances, and companies just can't be without their data for that long.
A hot site recovery package, on the other hand, involves more setup (and a higher investment), but eliminates delays with recovering lost data. Hot site recovery offers a near-exact replica that includes instant backups of all your data. When it comes to important enterprise-level documents like sales orders, invoices, and customer contracts, it's crucial to have all this information easily accessible. Lost data can translate to late invoice payments (and excessive late fees), duplicated payments, and excessively long order fulfillment times (which in turn erode customer relationships). All of this considered, it's crucial to work a full hot site recovery package into your data loss prevention plan.
Boris Shiklo is the CTO at ScienceSoft.
"There is a triad of crucial mistakes connected with DLP systems..."
1) Lack of a data inventory before integrating a DLP system into your IT environment. Do you know what data is critical to your business and where it is stored? Do you know how many servers, firewalls, and computers you have? If not, a DLP system won't cover all data sources and won't provide a sufficient level of security.
2) Neglect of fine-tuning. Purchasing a DLP solution will never add any further security if you don't adjust it to your IT environment. Only a fine-tuned DLP system will properly discover and categorize the data that is worth monitoring.
3) Underinvestment in employees' security education. Businesses have to invest in training their staff; it will prevent a colossal number of data breaches inadvertently caused by staff members.
Chetan is the CTO and co-founder of ShiftLeft. He is a serial entrepreneur with 20+ years of experience in authoring and architecting mission-critical software. His expertise includes building web-scale distributed infrastructure, personalization algorithms, complex event processing, fraud detection, and prevention in investment/retail banking domains.
"The biggest mistake organizations make when thinking about DLP is..."
Failing to address accidental data leakage via the applications their organization builds and maintains. Traditional DLP is focused on stopping malicious or accidental data leakage caused by user behaviors, such as via email or file-sharing services. However, Uber lost 57M personal records last fall because their development team inadvertently pushed hardcoded AWS credentials to GitHub. From there, Uber's data breach was initiated simply by logging into AWS with legitimate credentials.
With the adoption of microservices and shortening development cycles, developers leaking sensitive data and credentials to a code repository (GitHub), a logging service (Splunk), an external object storage service (S3), or an analytics platform (MixPanel) is increasingly common. To tackle this problem, organizations need to understand the entry and exit points of their applications and how sensitive data instances and variables are handled in source code.
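One common mitigation for the credential-leak scenario described above is scanning code before it reaches a repository. A minimal sketch in Python, assuming a hypothetical pre-commit hook; the AWS access key ID pattern (20 uppercase alphanumerics starting with "AKIA") is illustrative, and real scanners ship many more rules plus entropy checks:

```python
import re

# Illustrative credential patterns; production scanners cover many more.
SECRET_PATTERNS = {
    "aws_access_key_id": re.compile(r"\bAKIA[0-9A-Z]{16}\b"),
    "private_key_header": re.compile(r"-----BEGIN (?:RSA |EC )?PRIVATE KEY-----"),
}

def find_secrets(text):
    """Return (rule_name, matched_string) pairs for every suspected secret."""
    hits = []
    for name, pattern in SECRET_PATTERNS.items():
        for match in pattern.finditer(text):
            hits.append((name, match.group(0)))
    return hits

# A pre-commit hook would reject this change before it ever reaches GitHub.
diff = 'aws_key = "AKIAABCDEFGHIJKLMNOP"  # hardcoded credential'
for rule, value in find_secrets(diff):
    print(f"blocked commit: {rule} -> {value}")
```

In practice this kind of check runs in CI or as a git hook, so leaked credentials are caught before a push rather than after a breach.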
Erich Anderson serves as senior program manager within Global Services at Digital Guardian, bringing extensive knowledge around insider threats and data loss prevention program implementation to customers. He joined Digital Guardian after a successful tenure at the Federal Bureau of Investigation, serving as program manager, where he was influential in the creation of National Policy and Minimum Standards for Insider Threat.
"The biggest mistake organizations make when thinking about DLP is..."
Many programs look at data loss prevention as a burden on business operations, citing performance impacts to the computer or network as an example. Rather than using the solution to empower personnel to make educated decisions and protect company assets, executives tend to blame DLP for all their problems. I think this is a big mistake. I've always liked this analogy: why do you put brakes on a car? Many people will say, to make the car stop. While this is true, I look at brakes as a way for the car to go faster: without brakes and other safety features, there is a higher probability the car will crash. I see DLP the same way. Programs that enable DLP actually allow the organization to operate efficiently by determining how data is being used and where data is going. When there is a concern, the program is able to provide the details needed to properly respond and remediate. With formalized processes, a content lifecycle, and procedures, a DLP solution can and will speed up the organization.
Brian is the CISO & VP of Technology Innovation at Verodin. He has over two decades of experience in the security industry. After getting his start in security with the Defense Information Systems Agency (DISA) and later Bell Labs, Brian began the process of building security startups and taking multiple companies through successful IPOs and acquisitions.
"Validate your security effectiveness across people, process, and tech to..."
Quickly determine what's working, what's not, and prioritize the remediation of gaps.
Put in place a mechanism to provide configuration assurance – as you start remediation, you need to know that the changes you are making are working.
Beyond manual efforts, leverage continuous automation to ensure that things that are working continue to work as expected, thus allowing you to manage by exception and be alerted when things devolve from a known good state.
Mike Meikle is a Partner at secureHIM, a security consulting and education company.
"There are two big mistakes companies make in purchasing and implementing data leak / data loss prevention tools..."
The first is the lack of a data inventory and audit before a data loss prevention (DLP) tool is purchased. If a company does not know where its data resides, who its owners are, whether the data it stores is critical or non-critical, and what data security regulatory requirements must be met, then procuring and implementing a DLP tool before all these questions are answered is a path to failure.
The next mistake that is commonly made is to treat the implementation of a DLP tool as a technology project, not a business program. When an enterprise commits to the implementation of a DLP product, it must realize that once the tool is in place the hard work begins. Data will have to be discovered, classified, and categorized based on a variety of factors on an ongoing basis. It will move from a project to a long-term program that must remain staffed for the life of the product. If this is not done, then the tool will eventually fall into disuse as staff is reassigned to other initiatives and executives place other priorities on the information technology department. A DLP tool and its program must be aligned with the business and have a business owner for it to be successful.
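The ongoing discovery-and-classification work described above can be sketched as a small rule engine. This is a toy, assuming hypothetical sensitivity labels and two naive content patterns; real programs rely on dedicated discovery tools, validators, and human review:

```python
import re

# Hypothetical rules mapping a sensitivity label to a content pattern,
# ordered from most to least sensitive. Real rule sets are far richer.
CLASSIFICATION_RULES = [
    ("restricted", re.compile(r"\b\d{3}-\d{2}-\d{4}\b")),        # SSN-like
    ("confidential", re.compile(r"\b\d{4}(?:[ -]?\d{4}){3}\b")),  # card-like
]

def classify(document_text):
    """Return the most sensitive label whose pattern matches, else 'internal'."""
    for label, pattern in CLASSIFICATION_RULES:
        if pattern.search(document_text):
            return label
    return "internal"

# A data inventory pairs each location with its discovered classification.
inventory = {
    "hr/payroll.txt": "Employee SSN: 123-45-6789",
    "notes/meeting.txt": "Q3 roadmap discussion",
}
for path, text in inventory.items():
    print(path, "->", classify(text))
```

The point of the sketch is the program shape, not the patterns: classification has to be re-run as data moves and changes, which is why this is a long-term program rather than a one-time project.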
Aaron is Vice President of The Norris Group, which specializes in California and Florida hard money lending, note investments, and real estate investor resources.
"My most recent experience with data loss came from a screw up I didn't see coming..."
Our web host was charging us extra for daily backups. What I didn't fully understand, as a small business owner doing a ton of other things, is that each daily backup was only saved for 24 hours. We had an incident when we moved to a new server: somehow, crypto-mining software got installed, and we had to enlist the help of a security firm to remove it and lock down the site. It was only then that I found out the backups from our host were completely worthless if an issue went uncaught for more than 24 hours. Whoops!
CEO of DFHeinz, Daryl Heinz, has helped companies implement data innovation strategies for the past two decades. His 60-person data consultancy currently supports the Chan Zuckerberg Initiative, NASA, and Universal Studios in building higher-impact data strategies. Heinz launched a global data user group called DUGTalks in 2018 to raise awareness of using open source to secure a competitive advantage for any business.
"The biggest mistakes companies make when building enterprise DLP solutions are..."
1) Companies fail to leverage free, future-proof open-source DLP technologies.
Today's Fortune 500 companies use software reusability principles combined with open-source security solutions to build their own stack (e.g., Apache Flink for in-memory data processing and Apache Metron for cybersecurity analytics). This allows them to preserve all data assets while also laying the foundation for secure big data, machine learning, and AI solutions.
The Result: Gone are the days of writing one-off code from scratch.
Companies that fail to use data to increase the value of their business, or that license big data DLP solutions without first evaluating their unique use case needs, will pay an average of 80% more than their competitors to preserve, analyze, and customize their data. This cuts into profit margins and defeats the purpose of pursuing big data innovation within an enterprise. Additional software updates will need to be purchased year after year, and data ownership rights are often at risk. The companies that will end up ahead of their competitors are building scalable, future-proof data systems today and investing in training their talent to build and maintain their own stack.
Alexandra Zelenko is a Marketer and Technical Writer at DDI Development company.
"Information security is challenging and requires..."
Significant effort, not only from the IT department but from other departments as well. Some common mistakes:
1. Companies do not establish information security as a business component.
Building a Data Loss Prevention solution doesn't just mean installing software and supervising it. A solid data protection strategy should start with a business need. Companies should train their employees to perform business activities securely and establish policies and procedures for them to adhere to.
2. Companies do not devote attention to insider threats.
Not giving enough importance to employees' attitudes, user behavior, and data transfer patterns – and putting the spotlight on external threats instead – leads to loopholes in data security. There are also the negligence and human error factors that are part of insider threats and need to be addressed with proper tools and policies.
Robert Douglas is the owner & President of PlanetMagpie IT Consulting in Silicon Valley. He's worked in the IT industry for 30+ years, consulting for everything from new startups to major enterprises like Microsoft.
"One of the biggest mistakes companies make with Data Loss Prevention is..."
Not training their employees as part of the solution.
Employees are the single biggest factor in data loss. Whether it's taking a hard drive off site, having their laptop stolen, using an insecure Wi-Fi network vulnerable to hacking, or clicking on a suspicious email, their behaviors constitute the highest risk for data loss across the board.
Now, this almost never happens maliciously. Employees are busy, doing a lot with their time. Slip-ups happen. Shortcuts are taken. Nine out of ten times, they just don't know any better.
That's where the company's responsibility comes in. Your IT department must make sure all employees know how to safeguard the data entrusted to them.
Neglecting the employees' behavior with data opens ALL businesses to data loss. Fortunately, this is a fixable problem... if you include employee security training in your plans.
Michael Hall, CISO at DriveSavers, Inc. is in charge of data security, developing protocols to handle critical and encrypted data for corporations and government agencies. With 23 years' experience in data recovery technology, focusing on high-end arrays, he has successfully recovered data from over 17,000 storage devices.
"One issue that has become more significant over time is..."
The risk of using unvetted third-party data recovery providers.
Company-owned devices often hold security-sensitive electronically stored information (ESI), including critical IP, financial databases, accounting files, e-mail exchanges, customer records, PCI, PII, and PHI. Most corporations have a dynamic layered security practice, which incorporates multiple security controls to protect sensitive data. The reputational and financial consequences of lost or corrupted data require it. If a storage device fails, resulting in lost or corrupted digital data, few corporations have the internal resources to recover that data – especially in the case of physical damage or electromechanical failure. The device must be sent to a third-party data recovery vendor.
Most of the data recovery industry does not meet best practice standards to ensure data protection through cybersecurity; therefore, data recovery service providers must be classified as high-risk vendors. If a corporation does not perform due diligence before engaging the services of a data recovery vendor, it runs the risk of a data breach that will result in major financial and reputational damage.
Steve James is a Marketing Lead with Opus Consulting, Western Canada's largest Managed Service Provider for SME companies.
"Too little time spent on testing..."
With the array of threats now attacking businesses, fully test-bedding your loss prevention against the widest possible range of dangers – both man-made and natural – takes time, but is crucial to the security of your company records. Imagine moving into a fantastic office suite, filling all of your desk drawers and filing cabinets with precious data, then buying the cheapest padlock possible to lock your main office door – and not even trying it first to see if the key fits or the lock works!
Trave Harmon is the Chief Executive Officer at Triton Computer Corporation.
"The greatest mistake that businesses make when it comes to a DLP solution is..."
Failing to consider all the possible avenues of losing data.
You must consider any and every possible means to get data and information off your systems as a potential leak. We see a lot of businesses that only think about USB drives or email, or utilize the most basic encryption for their network.
We find many businesses consider this a passive solution and not an active one. We always recommend that businesses take an active role and/or outsource their IT and intellectual property protection and monitoring. Far too many businesses have this set-it-and-forget-it mentality.
You should have, at minimum, a review every 90 days about important data and how it is protected.
Dennis Chow is the CISO at SCIS Security, a cybersecurity-focused firm in Houston. Dennis has led clients in predictive analytics enabled MSSP solutions across the Energy and Healthcare verticals.
"The three largest mistakes or assumptions that I see in the enterprise environment for DLP solutions are..."
1. That the DLP solution is going to see any encrypted traffic over the wire. Not all agents act the same on the stack, and you may miss critical files exfiltrating your network if your solution does not act as a man-in-the-middle.
2. That the DLP solution or agent does not have any form of fuzzy learning or machine learning. Without a basic machine learning model built-in, you're only as good as your signatures. If your environment enforces a specific access standard on files or objects, you could be missing out on critical visibility.
3. That the DLP solution is going to protect your BYOD and cloud use cases. Enterprises need to be wary that the solution selected or built needs to incorporate APIs and other visibility for cloud solutions and anything that needs to be managed by an MDM.
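The first two assumptions above can be illustrated with a toy example: a signature-only inspector catches a plaintext card number but misses the same data after trivial base64 encoding, which is the kind of blind spot an agent has when it cannot see inside encrypted or encoded traffic and has nothing beyond its signatures. The pattern below is illustrative, not a production rule:

```python
import base64
import re

# A naive card-number signature; real DLP rules also validate checksums etc.
CARD_SIGNATURE = re.compile(r"\b\d{4}(?:[ -]?\d{4}){3}\b")

def signature_scan(payload: str) -> bool:
    """Signature-only inspection: flags plaintext matches, nothing else."""
    return bool(CARD_SIGNATURE.search(payload))

message = "card: 4111 1111 1111 1111"
encoded = base64.b64encode(message.encode()).decode()  # trivial obfuscation

print(signature_scan(message))  # True  - plaintext match
print(signature_scan(encoded))  # False - same data, missed by the signature
```

TLS-encrypted traffic has the same effect as the encoding step here, only stronger – hence the need for man-in-the-middle inspection on the wire or detection logic that goes beyond static signatures.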
Tyler Riddell is the Vice President of Marketing for eSUB with over 15 years of experience in marketing, product management, advertising, and public relations. He has a proven track record of successful go-to-market and corporate communication programs in multiple vertical tech markets.
"Sometimes companies try too hard to..."
Create the right type of data protection system for their company, and instead create a very over-sensitive alarm system. This can tire out an IT department that is always reacting to false alarms and small tech problems. Make sure that your regulations and protocols are efficient yet flexible.
Lee Reiber is a globally recognized expert in digital forensic investigations, author of Mobile Forensics Investigations: A Guide to Evidence Collection, Analysis, and Presentation, and Chief Operating Officer of Oxygen Forensics, headquartered in Alexandria, Virginia.
"The most important – and most frequently unaddressed – component of an enterprise data loss prevention program is proper employee education..."
As the handlers of your company’s most sensitive details, it is crucial that employees clearly understand which types of data are sensitive, how and when they may be accessed, and the potential repercussions if your policies and protocols are not properly followed. Employees who bring their own mobile devices to work will think twice before hitting send on an email or sharing a file if they understand that a slip-up may result in their phone being seized in the event of an investigation. Don’t hesitate to work with your corporate communications team to ensure your plan’s policies are effectively written and deployed before an incident occurs.
Randy Nieves is the Sr. Vice President of Product Management of Cal Net, A NexusTek Company.
"Cybersecurity threats are growing more sophisticated by the day..."
Small to medium-sized businesses with limited IT resources have quickly become the prime target for today's cybercriminal. The biggest mistake you can make as a business owner is to assume you're too small for hackers to target. Whether you're challenged with meeting IT budget demands, fighting new threat adversaries, or are just looking for more efficiency around your compliance and security initiatives, your success depends on your ability to maintain a secure Data Loss Prevention strategy.
Today, the conversation has changed from buying backup technology that is maintained by internal IT. Now, business owners just want to know their data is secure and that their business can run uninterrupted in the event of a disaster like a cyberattack. Modern security strategies must incorporate multiple tiers of protection and deal with bad actors on a daily basis. Ransomware, for example, changes daily, so protecting the perimeter of your network must be done proactively. In other words, don't make the mistake of thinking your firewall is something you can set and forget.
RJ Martino is the President and CEO of Scale Technology.
"A good data loss prevention solution for enterprise environments starts with identifying critical systems..."
Identifying critical systems is both difficult and time consuming. It is a big mistake for executives not to prioritize their critical systems. Any executive can simply say it's all critical. But when everything is critical, nothing is. When everything is critical, the cost is exorbitant. When everything is critical, testing a restoration is nearly impossible.
Your leaders need to spend a fair amount of time prioritizing your most important systems. This decision shouldn't be taken lightly.
Then, your technical team needs to identify which systems rely on other systems. You may be surprised how often a technical team backs up a critical backend but forgets to back up the front-end application.
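That dependency check can be sketched programmatically. Assuming a hypothetical map of which systems each system relies on, this computes the full dependency closure of every critical system and flags anything missing from the backup set:

```python
# Hypothetical dependency map: each system -> the systems it relies on.
DEPENDS_ON = {
    "web_frontend": ["order_api"],
    "order_api": ["orders_db", "auth_service"],
    "auth_service": ["users_db"],
}

def dependency_closure(system):
    """Return the system plus everything it transitively depends on."""
    seen = set()
    stack = [system]
    while stack:
        current = stack.pop()
        if current not in seen:
            seen.add(current)
            stack.extend(DEPENDS_ON.get(current, []))
    return seen

def missing_backups(critical_systems, backed_up):
    """Systems required by a critical system but absent from the backup set."""
    required = set()
    for system in critical_systems:
        required |= dependency_closure(system)
    return required - set(backed_up)

# The frontend is marked critical, but its backend databases were forgotten.
print(missing_backups(["web_frontend"], ["web_frontend", "order_api"]))
```

Running a check like this after each prioritization review makes the "backed up the backend, forgot the frontend" mistake (or its reverse) visible before a restoration is ever attempted.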