
The Best Tools & Practices for Better Data Visibility and Monitoring

by Juliana De Groot on Friday, August 7, 2020


As the adage goes, you can't secure what you can't see. With that in mind, we asked 21 security experts what they think are the best tools and practices for data visibility and monitoring.

21 Security Pros Share the Best Tools & Practices for Better Data Visibility and Monitoring

With companies managing rapidly growing volumes of data from a variety of disparate sources, security monitoring is increasingly complex as the security perimeter is difficult to define and constantly evolving. In one 2019 survey of more than 300 IT professionals who manage public and private cloud deployments, 87% of respondents said the lack of visibility into cloud workloads obscures security and hinders business value. "Nearly seven out of 10 respondents said public cloud monitoring is more difficult than monitoring data centers and private cloud environments, and less than 20% said their organizations can properly monitor public cloud environments," Dark Reading reports.

While cloud computing certainly makes data visibility and monitoring more complex, it's not the only challenge. The sheer volume of data companies manage from a multitude of sources is difficult to aggregate in a meaningful way to derive valuable insights and detect potential security concerns – at least without the proper tools. To find out what tools and best practices today's security leaders turn to for better data visibility and monitoring, we reached out to a panel of security professionals and asked them to answer this question:

"What are the best tools and practices for better data visibility and monitoring?"

Read on to find out what our panel had to say about the best tools and practices for better data visibility and monitoring.


Quincy Smith

@ampjarcom

Quincy is part of the marketing team at Ampjar and currently lives in Shanghai. He's passionate about solo travel, strong coffee, and IPAs.

"The best thing we've done for our data is invest in creating a singular source of truth..."

Like most companies, we use multiple tools to measure usage, marketing effectiveness, and customer communication – this results in numerous pieces of code being deployed on our site that don't always report the same data (for example, our chat tool Intercom might classify a user as coming via direct traffic whereas our analytics tool might call it a referral).

Having this type of variance is incredibly frustrating and can make accurate and meaningful reporting hard to achieve.

To combat this, we have worked hard to implement Segment across our site and tools to ensure only one piece of code is used to track everything and that all of the data pushed to our tools is the same. Segment essentially gives us a single definition of the data and converts everything into the format each of our tools expects – this means there is little to no discrepancy because there is only one data source.

The result has been much cleaner and more reliable reporting, as well as less development maintenance, since we only have to adjust or add Segment code rather than separate pieces for each tool.
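To make the single-source-of-truth idea concrete, here is a minimal sketch of what one tracking call looks like with Segment's Python library (analytics-python); the write key, user ID, and event properties are placeholders, and the destination tools that receive the event are configured inside Segment rather than in code.

```python
import analytics  # pip install analytics-python

# One write key, one tracking call -- Segment fans the event out to every
# connected destination (analytics, chat, email, etc.) in a consistent format.
analytics.write_key = "YOUR_SEGMENT_WRITE_KEY"   # placeholder

# Identify the user once with shared traits.
analytics.identify("user_123", {"email": "jane@example.com", "plan": "pro"})

# Track an event once; every downstream tool sees the same data.
analytics.track("user_123", "Signed Up", {"source": "landing_page"})

analytics.flush()   # send queued events before the script exits
```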


Maryanne Steidinger

@msteidinger

Maryanne Steidinger is the Head of Marketing at Webalo.

"The best tools and practices for better data visibility & monitoring are highly dependent upon the environment upon which you are working..."

My response is based on the 'real-time' world of manufacturing operations. The needs within this area are to provide context (meaning) to the data so that workers or management can make decisions based upon the results or outcome.

With any data monitoring project, the ability to get as much relevant data as possible in order to make good decisions is critical. If you are operating within a manufacturing environment, this means integrating with the appropriate applications, such as ERP (Enterprise Resource Planning), MES (Manufacturing Execution Systems), Maintenance Management (Enterprise Asset Management), or CRM (Customer Relationship Management), as these systems hold information such as customer data, delivery data, and incoming supplier management data that may all have relevance to decisions such as 'is this material good,' 'what steps do I need to take next as a result of this non-conformance,' or 'what do I do with this incoming material?'

For process monitoring, dashboards are ubiquitous for providing context into operations data. Depending upon the tool you use, these dashboards can be real-time or they can be latent (such as those provided by a business intelligence product). Sometimes you'll see dashboards and other KPIs (key performance indicators) displayed on mobile devices; this allows a task worker or knowledge worker to make immediate, informed decisions without being tied to a line or area. Mobile is extremely important, especially if the worker is not at the plant or site but is remote and needs to make a decision on the repair of an asset (such as a pump) or perform a specific maintenance procedure.
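To give a sense of what adding context to raw machine data can look like, below is a minimal sketch of one common plant-floor KPI, OEE (Overall Equipment Effectiveness), computed from hypothetical shift data; the field names and figures are invented for the example and are not tied to any particular MES or dashboard product.

```python
# Toy OEE calculation: OEE = Availability x Performance x Quality,
# a KPI commonly surfaced on manufacturing dashboards.
shift = {
    "planned_minutes": 480,   # scheduled production time
    "downtime_minutes": 45,   # unplanned stops
    "ideal_cycle_secs": 30,   # ideal seconds per part
    "parts_produced": 820,
    "parts_defective": 12,
}

run_time = shift["planned_minutes"] - shift["downtime_minutes"]
availability = run_time / shift["planned_minutes"]
performance = (shift["ideal_cycle_secs"] * shift["parts_produced"] / 60) / run_time
quality = (shift["parts_produced"] - shift["parts_defective"]) / shift["parts_produced"]

oee = availability * performance * quality
print(f"Availability {availability:.1%}, Performance {performance:.1%}, "
      f"Quality {quality:.1%}, OEE {oee:.1%}")
```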

Overall, you should try to select a solution that fits the way you do business, versus changing your business to try to accommodate a tool. Selecting a product that reinforces your Good Manufacturing Practices not only 'institutionalizes' but 'digitizes' your operations for standardization. A product that can integrate with existing applications, extend their performance with rich visualizations, mobile capabilities, and drill-downs, capture barcodes and other identifiers, and use technologies such as GIS (geographic information systems) enriches the user experience for data visibility and process monitoring.


Matt Hall

@Bocada

Matt Hall is CEO of Bocada, a backup reporting software company used by more Fortune 500 organizations than any other solution to simplify data reporting operations, reduce IT risk, and support independent audit and compliance oversight.

"One of the best practices associated with data monitoring and oversight is..."

The upfront automation of data mining, normalization, and aggregation. It's easy to speak about data reporting and visualization. The reality, however, is that anyone monitoring activities that come from disparate platforms, software, or other tools will be greatly taxed by the sheer amount of manual effort it takes to pull data from each of these tools one by one, format and simplify the data so it's all aligned, and then synthesize it for easy interpretation. It's time-intensive and fraught with human error.

True business strategy and insights come from being able to proactively monitor and visualize data. Yet when your team is stuck with only manual approaches to collecting and aggregating data, they have little time for these higher-value activities. As a result, seeking tools tailor-made to your core industry or functional area that automate data collection and aggregation across your most-used platforms will go a long way to freeing up time for activities that truly impact business performance.
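To illustrate what normalization and aggregation mean in practice, here is a minimal sketch that pulls status records from two hypothetical sources with different schemas and reduces them to one aligned summary; the export formats and field names are invented for the example and not taken from any particular backup or monitoring tool.

```python
from collections import Counter

# Hypothetical exports from two different tools, each with its own schema.
tool_a_jobs = [{"job": "db01", "status": "Success"}, {"job": "fs02", "status": "Failed"}]
tool_b_jobs = [{"name": "mail01", "result": 0}, {"name": "web03", "result": 1}]

def normalize_a(row):
    return {"asset": row["job"], "ok": row["status"].lower() == "success"}

def normalize_b(row):
    return {"asset": row["name"], "ok": row["result"] == 0}   # 0 means success here

# Aggregate everything into one aligned view, regardless of the source format.
unified = [normalize_a(r) for r in tool_a_jobs] + [normalize_b(r) for r in tool_b_jobs]
summary = Counter("success" if r["ok"] else "failure" for r in unified)
print(summary)   # Counter({'success': 2, 'failure': 2})
```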


Ilia Sotnikov

@Netwrix

Ilia Sotnikov is an accomplished expert in cybersecurity and IT management and VP of Product Management at Netwrix, a vendor of information security and governance software.

"Whatever marketers say, a tool itself cannot deliver value for better data visibility and security..."

Tangible benefits lie in the well-established processes that include those tools and help you better control your data. I'd like to highlight three best practices as absolute must-haves:

  • Regularly classify data based on its sensitivity – at least twice a year. That way, you'll be able to focus your security efforts on protecting your most valuable assets. Many organizations rely on users to classify data, but automated classification eliminates human error and streamlines the process.
  • Ensure that access rights are granted only to authorized staff who need to access and work with sensitive content. Look for identity and access management solutions to control permissions, both as part of an ongoing process and ad hoc.
  • Get rid of stale data every 90 days, so you don't take responsibility for data you no longer need. Deploy an automated solution that can find stale data and collaborate with the data owners to determine which data can be archived or permanently deleted (a rough sketch of this step follows the list).
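As a rough illustration of the stale-data step, here is a minimal sketch that flags files not modified in the last 90 days under a given directory; the path and threshold are placeholders, and a real deployment would hand the results to data owners rather than act on them automatically.

```python
import os
import time

STALE_AFTER_DAYS = 90
ROOT = "/data/shares"          # placeholder path
cutoff = time.time() - STALE_AFTER_DAYS * 24 * 3600

stale = []
for dirpath, _dirnames, filenames in os.walk(ROOT):
    for name in filenames:
        path = os.path.join(dirpath, name)
        try:
            if os.path.getmtime(path) < cutoff:
                stale.append(path)
        except OSError:
            pass   # file removed or unreadable; skip it

# Report candidates for archival or deletion to the data owners.
for path in stale:
    print(path)
```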


Nik Whitfield

@nikwhitfieldUK

Nik Whitfield is the CEO of Panaseer. A computer scientist and security technology entrepreneur, Nik has built advanced cyber security monitoring platforms for the world's most prominent commercial organizations. He's worked for over a decade with CISOs of global firms in the US and UK, building behavioral analytics and threat intelligence management tools.

"Last month, we commissioned a study where we asked 200 enterprise security leaders about their biggest cyber challenges..."

The vast majority (89%) said that they were struggling with visibility and insight into trusted data. Nearly a third (31%) were concerned that a lack of visibility will impact their ability to adhere to regulations.

Complex and fragmented IT environments have compounded the visibility challenges for security teams. These issues are being exacerbated by the sheer number of security tools in use: enterprise security teams are grappling with an average of 57.1 discrete security tools, and over a quarter of respondents (26.5%) claimed to be running 76+ security tools across their organization.

Buying more tools does not equate to enhanced security. In many cases, they can actually impair visibility and cause bigger headaches as they often integrate poorly, have overlapping functionality and gaps in coverage. Also, because we lack visibility across security controls and technical assets, the tools we buy aren't fully switched on.

To resolve the visibility issue, we need to look to continuous controls monitoring, which enables automation and gives security leaders the ability to see gaps in real time and make faster decisions on the best steps to enhance their overall cyber risk posture.


David Mitroff, Ph.D.

@PiedmontAve

David Mitroff, Ph.D. is a business consultant, marketing expert, author, and keynote speaker who founded Piedmont Avenue Consulting, Inc., where he advises on leveraging new technology to create brand awareness, strengthen customer loyalty, and generate new business leads.

"In the age of information and technology, we have to deal with and manage a large amount of data..."

Much of this data is so sensitive and important that we need secure data visibility and monitoring solutions. There are many software and online data tools available that help you achieve better data visibility and monitoring while maintaining its confidentiality.

Digital Guardian's (DG) data visibility and control is the best software for the management and storage of your data. It encrypts your information through a five-step inspection of a document. It is quite efficient at identifying and monitoring PCI, PII, and PHI data through automatic content inspection. DG sets alerts for policy violations in the system to provide real-time visibility into data transmission. It provides strong protection against security threats at the endpoints, controls document access, and stops data theft.
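As a generic illustration of the kind of pattern-based content inspection described above (a deliberately simplistic sketch, not how Digital Guardian itself is implemented; real products validate matches with checksums, context, and far more patterns):

```python
import re

# Very rough patterns for illustration only.
PATTERNS = {
    "possible_card_number": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
    "possible_ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "email_address": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
}

def inspect(text):
    """Return the sensitive-data categories detected in a piece of text."""
    return [label for label, pattern in PATTERNS.items() if pattern.search(text)]

print(inspect("Card 4111 1111 1111 1111 billed to jane.doe@example.com"))
# ['possible_card_number', 'email_address']
```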


Catherine Guzevataya

@promodo_en

Catherine Guzevataya is a web analyst in the Promodo digital marketing agency.

"It all depends on the tasks that you set and the sources of data collection for visualization..."

If you don't have much data and it is already stored in aggregated form, Google Data Studio will meet your needs. Data is imported into your report directly from the source, and it is quite easy to compare different periods. How current the report is depends on how often the data is updated at the source. Basically, you can use only the default elements for visualizations. Another plus is that the tool is free. Access is open to any Google account and can be provided with a link.

For more complex tasks, I recommend considering Microsoft Power BI. There, you can import aggregated data from the source into your reports and process that data directly within the tool. It's usually used to process large volumes of data. In addition to the default features, there is a visualization gallery and the ability to create custom visualizations.

Updating the data in the report also depends on the frequency of data updates in the source, and you can customize your schedule. Access can be opened to a Microsoft account, but only if you have a Pro account for $10 per month. The ability to control access levels and use different visualizations for different users (depending on need) is another advantage of Microsoft Power BI.

Tableau is another popular visualization tool for processing large volumes of data. To use these reports, you need to purchase a license – an annual subscription is $12 (over 100 licenses). The tool has a variety of visualization elements. But as with Data Studio, it is better suited to data that has been processed separately beforehand (for example, in an ETL tool).


Jessica Thiele

@VL_OMNI

Jessica Thiele is the Director of Marketing at VL OMNI, a point-to-multipoint serverless data integration platform capable of capturing business rules for a fully automated supply chain and technology stack.

"Most businesses, no matter the vertical, employ between 3-5 applications..."

Increasingly, these applications are web-based and in the cloud, and many come with varying degrees of analytics built right in.

But good (and great) data visibility and monitoring doesn't happen in individual applications. Think about it: if you're logging in to 3-5 different applications and exploring their analytics, not only are you not getting a complete, unified picture of your company's performance, but you're going to waste valuable time logging into each application and digesting the information each contains.

The most important tool for better data visibility and monitoring is the unification of all the data between all of your applications with data integration. Data integration, the automated movement of data between applications, is absolutely integral to business intelligence. Without it, you're working with an incomplete picture, at best. And great data integration isn't out of reach, either: depending on your business size, data volume and movement, and overarching strategic goals, everything from a simple plug-and-play integration to strategic data integration that approaches your entire business as a strategic endeavor is available, from free to reasonable monthly fees. (The trick is choosing the right solution – many businesses pick plug-and-play integrations or other self-serve approaches, only to replace them within weeks after realizing they won't do what the business requires.)


Nick Orser

@Verato_Software

Nick Orser is the director of product marketing for Verato, a company that develops patient matching solutions.

"As organizations pursue initiatives that rely on having complete, connected, and secure data, they cannot lose sight of..."

The tools that fundamentally enable their data to be complete, connected, and secure. In this vein, organizations should consider upgrading their master person index (MPI) tools to ensure that all their consumer data is associated with the right people so they can gain 360-degree views of their consumers. SaaS MPI tools in particular are gaining ground with innovative organizations who need to leverage new, disparate data sources for analytics, marketing, and consumer experience initiatives, but who don't want to deal with the hassle of their legacy master data management (MDM) tools.
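To show the flavor of the matching problem an MPI solves (a deliberately naive sketch, not Verato's algorithm; the records and threshold are invented for the example):

```python
from difflib import SequenceMatcher

def similarity(a, b):
    return SequenceMatcher(None, a.lower().strip(), b.lower().strip()).ratio()

def likely_same_person(rec1, rec2, threshold=0.85):
    """Toy match: compare normalized names and require identical birth dates."""
    name_score = similarity(rec1["name"], rec2["name"])
    return rec1["dob"] == rec2["dob"] and name_score >= threshold

a = {"name": "Katherine O'Neil", "dob": "1984-02-11"}
b = {"name": "Kathrine ONeil",   "dob": "1984-02-11"}
print(likely_same_person(a, b))   # True -- a candidate for the same master record
```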


Monica Eaton-Cardone

@Monica_Eaton

Monica is the co-founder and COO of Chargebacks911, a global risk mitigation firm helping online merchants optimize chargeback management through offices in North America, Europe, and Asia.

"Logentries is a useful tool that helps businesses easily understand their log data from day one..."

It provides an in-depth visual analysis of data trends, management and performance tools, as well as security features. Retrace is another great tool that allows you to explore your logging fields, customize your log properties, trace web transactions, and track your messages. It combines APM, errors, logs, metrics, and monitoring in a single dashboard.

ManageEngine OpManager is an additional tool that provides workflow automation, virtualization monitoring, and network traffic analysis. These tools will help you identify the cause of any software error. Best of all, they can be used for as little as $10 per month, meaning one needn't break the bank to reap these benefits.

A few practices to keep in mind include filtering out unnecessary data, classifying the most important systems and making sure those systems have the greatest number of alerts, testing your monitoring alert system, developing a process for resolving alerts, and, most importantly, documenting everything.
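To make the filtering and threshold ideas concrete, here is a minimal, hypothetical sketch; the event stream, severity labels, and threshold are invented for the example.

```python
from collections import Counter

# Hypothetical stream of monitoring events: (severity, system).
events = [
    ("info", "web01"), ("error", "db01"), ("error", "db01"),
    ("debug", "web01"), ("error", "db01"), ("warning", "auth01"),
]

NOISE = {"debug", "info"}       # filter out unnecessary data first
ALERT_THRESHOLD = 3             # errors per system before an alert fires

filtered = [(sev, system) for sev, system in events if sev not in NOISE]
errors_per_system = Counter(system for sev, system in filtered if sev == "error")

for system, count in errors_per_system.items():
    if count >= ALERT_THRESHOLD:
        print(f"ALERT: {system} logged {count} errors - open and document an incident")
```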


Clare Bittourna

@gocodal

Clare Bittourna is a Chicago-based digital marketer, currently in a role as the Marketing Designer at Codal, a UX design & development agency. She is a forward-thinking, creative self-starter with a passion for compelling digital experiences.

"My favorite tool for monitoring and data is Nagios..."

Nagios is a great tool because it reduces the risk of unexpected downtime through early detection of potential failures. It also provides complete monitoring of storage systems, is very flexible, and has highly modular functionality.
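For a sense of how Nagios sees a check, here is a minimal sketch of a Nagios-style plugin: an external script that prints one status line and exits with 0 (OK), 1 (WARNING), 2 (CRITICAL), or 3 (UNKNOWN). The disk-usage thresholds are placeholders.

```python
#!/usr/bin/env python3
# Minimal Nagios-style check plugin for disk usage on the root filesystem.
import shutil
import sys

WARN, CRIT = 80, 90          # percent-used thresholds (placeholders)

try:
    usage = shutil.disk_usage("/")
    pct_used = usage.used / usage.total * 100
except OSError as exc:
    print(f"DISK UNKNOWN - {exc}")
    sys.exit(3)

if pct_used >= CRIT:
    print(f"DISK CRITICAL - {pct_used:.0f}% used")
    sys.exit(2)
elif pct_used >= WARN:
    print(f"DISK WARNING - {pct_used:.0f}% used")
    sys.exit(1)
print(f"DISK OK - {pct_used:.0f}% used")
sys.exit(0)
```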


Kayla Kelly

@PayproCorp

Kayla Kelly is a Marketing Manager at Paypro.

"My coverall advice on both fronts is to find a SaaS partner that can grow with your company..."

When you're looking for a SaaS product to cover your team, it's certainly understandable to look for features that will serve you today. But you should also look to the future. What kinds of features will your company need as it grows 2x, 5x, and even 10x? Or is your company planning for slow or no growth, meaning you need software that can ensure the features you love today will be around for the long term?

Whatever your company's goals, data is going to be one tool you leverage along your journey. Just as we've long passed the days of desktop calculators and cabinets full of paper-based calculations, we're already moving past the growth experts who have 3-5 subscriptions to different software platforms and spoon-feed their clients the data.

Every company should look to a SaaS product or two that can cover their data needs today, and serve them as they grow into the future.


Gabriel Meira Figueiredo

@SlicingDice

Gabriel Meira Figueiredo is the Inbound specialist for SlicingDice.

"The best tool for data visibility and monitoring is..."

Datadog, because it's built for hybrid cloud environments, which can help companies monitor the performance of their network, tools, and applications. It offers flexibility through its API, and Datadog is easy to install and operate. Companies should have an end-to-end view of their entire data environment across different locations, data centers, and cloud-based solutions, and Datadog has simple functionality to help them achieve that.


Nacho Lafuente

@Datumize

Nacho Lafuente is the CEO and Co-Founder at Datumize. He is a Computer Science Engineer with 15+ years in the IT sector. He gained expertise in software development, QA, consulting, presales, sales, and PM. Eventually, he decided to realize the vision around data that he had accumulated over years of working with enterprise mission-critical systems and founded Datumize to make it happen.

"Start gathering your non-structured and non-stored data, thinking of network transactions, industrial networks, WiFi technology, distributed databases as..."

New sources of relevant information for your security management. Innovative technologies for data capturing are helping to integrate data from fancier sources, so you have your data lake nourished with new and better data.


Alexandra Zelenko

@DDI_Development

Alexandra Zelenko is a Marketing and Technical Writer at DDI Development company, which delivers web and mobile digital solutions for a wide range of business verticals.

"Being an innovative solution in network management and monitoring, a network monitoring switch enables security technologies to..."

Retrieve exactly the right data at the right time and gain visibility into the entire network, rather than a potentially distorted view of a subset of the network. In addition to that, network monitoring switches provide complete network visibility by aggregating, filtering, and replicating traffic so all tools get the data they need.

That's why the network monitoring switch is one of the best practices for delivering the right data at the right time to security tools. Network engineers and security professionals can collect information across hard-to-reach network ports and describe which security and monitoring tools need particular data. What's more, they can define and filter the data so that each security tool receives just the data required for analysis.


Mat Steinlin

@peerfit

Mat Steinlin is Director of Security for Peerfit. He is an IT management executive with 18 years of broad experience across industries, currently overseeing security for the digital health company. Mat leads on the principle that people, not technologies, are the most critical asset of any organization.

"The trend to SaaS (Software-as-Service) based applications is an excellent development for businesses..."

It reduces IT-related maintenance work and allows companies to focus on their core competencies. As beneficial as this trend is, it comes with additional challenges for today's IT departments: How do we genuinely integrate these solutions into our enterprise? How are we achieving overarching visibility between these disjointed SaaS solutions? How do we ensure a compliance-ready security posture in this decentralized setup? There is not a single tool or solution which addresses all of these questions. I want, however, to highlight one of the core solutions every enterprise should call their own in such an environment.

SIEM (Security Information and Event Management) software empowers companies to get deep insights and visibility into their environment, far beyond pure IT security. Collecting log files in a centralized SIEM enables security and IT teams to correlate security events, cross-reference platform-diverse data, create threshold-based alerting, and perform all-encompassing trend analysis.
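As a toy illustration of the kind of correlation a SIEM rule expresses (the log format, window, and threshold below are invented for the example, not any particular SIEM's rule language):

```python
from collections import defaultdict
from datetime import datetime, timedelta

# Hypothetical, already-parsed authentication events from a central log store.
events = [
    {"ts": "2020-08-07T10:00:05", "src": "203.0.113.7", "outcome": "failure"},
    {"ts": "2020-08-07T10:00:31", "src": "203.0.113.7", "outcome": "failure"},
    {"ts": "2020-08-07T10:01:02", "src": "203.0.113.7", "outcome": "failure"},
    {"ts": "2020-08-07T10:01:40", "src": "198.51.100.4", "outcome": "success"},
]

WINDOW = timedelta(minutes=5)
THRESHOLD = 3                      # failed logins per source within the window

failures = defaultdict(list)
for e in events:
    if e["outcome"] == "failure":
        failures[e["src"]].append(datetime.fromisoformat(e["ts"]))

for src, times in failures.items():
    times.sort()
    if times[-1] - times[0] <= WINDOW and len(times) >= THRESHOLD:
        print(f"ALERT: {len(times)} failed logins from {src} within {WINDOW}")
```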

From my perspective, the key benefits are:

  • Ensuring and demonstrating IT compliance
  • Preventing potential security breaches
  • Tracing events across the entire enterprise
  • Reducing the impact of security events
  • Consolidating analysis, reporting, and data retention in a single solution
  • Increasing efficiency and enabling proactive alerting
  • Delivering state-of-the-art evidence and artifacts during certifications

I would encourage every company to take a good look at adding a SIEM to its IT footprint. A SIEM is not a set-it-and-forget-it solution and requires constant adjustments and work to keep the produced insights meaningful.


Arawan Gajajiva

@matillion

Arawan Gajajiva is an experienced data and cloud technologist with over 20 years of technical expertise. Arawan has helped various companies across different industries and verticals to build and maintain their data architecture. Presently, Arawan is the Principal Architect at Matillion, an ETL solution purpose-built for cloud data warehouses.

"A best practice for data visibility and monitoring for security professionals is to implement security controls..."

Security controls help define the processes that enterprises can use to prevent, address, audit, and recover from data breaches. You should implement these controls in the following categories:

  • Preventive Controls help prevent a security incident from occurring; examples include data masking/encryption, regular security scans at all levels of your data architecture, and creating data storage zones with associated user-level access (a small masking sketch follows this list);
  • Detective Controls help identify a security incident in progress, with things like data loss prevention services and audit logging; and
  • Corrective Controls limit the extent of the impact if a security incident occurs (think automated backups).
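Here is a minimal sketch of the data-masking idea mentioned under preventive controls; the masking rules are simplistic placeholders, and a real program would cover many more data types.

```python
import re

def mask_card(value):
    """Keep only the last four digits of anything that looks like a card number."""
    digits = re.sub(r"\D", "", value)
    return "*" * (len(digits) - 4) + digits[-4:] if len(digits) >= 12 else value

def mask_email(value):
    local, _, domain = value.partition("@")
    return (local[0] + "***@" + domain) if domain else value

record = {"card": "4111 1111 1111 1111", "email": "jane.doe@example.com"}
masked = {"card": mask_card(record["card"]), "email": mask_email(record["email"])}
print(masked)   # {'card': '************1111', 'email': 'j***@example.com'}
```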


Gene Villeneuve

@Tehama_io

Gene is a seasoned IT leader with more than two decades of experience. He is passionate about helping customers and partners adopt technology for better business outcomes. He directs the strategy, operations, and market growth of the Tehama cloud-based platform. Gene also sits on the board for the Canadian Advanced Technology Alliance (CATA) Cyber Council.

"With recent data breaches like Capital One and Desjardins taking over the news, cybersecurity is top of mind for enterprises of all sizes..."

While there are many essential factors in data security, there are three things that are crucial for companies to maintain a healthy security posture.

1. Control identities so they can't be copied. Any employee who needs access to sensitive data to do their work requires extra security measures to secure and monitor that data. Providing access to sensitive data exposes enterprises to new risks. To manage these risks, businesses typically assign access privileges through identities that tell the corporate network the individual trying to log on has the authority to do so. But what if a privileged identity falls into the wrong hands? In fact, stolen or compromised identities are at the heart of many major data breaches. That was the case with global IT outsourcing and consulting giant Wipro in an attack that unfolded over multiple months. Attackers employed a legitimate and widely used remote access tool on hacked Wipro devices to break into client systems. Nearly a dozen other companies were targeted during the same period.

2. Reduce the chance of malware infection from endpoint devices, so there is no risk of malware transferring to the virtual machines or, more importantly, to corporate data assets and systems. A cloud-based workspace alone isn't enough. For true malware protection, a virtual workspace must be equipped with features such as:

  • Clean OS images that have never touched the Internet. Laptops used for email and web browsing are highly susceptible to malware and identity theft. Leveraging virtual OSes that haven't been exposed in those ways removes this risk. Isolating virtual OS instances from the Internet is the purest form of protection.
  • Isolation from the outside world. Remote users must access virtual desktops using network protocols that do not directly attach to clean virtual desktops. Using presentation-layer protocols such as PCoIP and HTTPS for desktop access removes the threat of malware intrusion from potentially infected endpoint devices.

3. Create a Zero-Trust Network. What makes a virtual environment "zero-trust"? Four things:

  1. Dynamic firewall rules. These allow firewalls to adapt and respond intelligently to shifting conditions.
  2. Segmented Network Access. This prevents even properly authenticated individuals from accessing network assets they should not, and it stops any malware that gains a foothold from browsing the network looking to expand: segmented networks eliminate the potential for lateral movement.
  3. A firewall with dynamic IP/port resolution. This allows network administrators to specify which IP addresses and ports users can access. To simplify processing, firewall rules remain inactive until the moment they're needed, barring access until the system can authenticate the user (a rough just-in-time sketch follows this list).
  4. Lastly, audit trails and access logs. These support regulation and compliance by enabling monitoring and forensic analyses – shedding light on who has done what within the environment.
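To illustrate items 1 and 3 above, here is a rough, hypothetical sketch of a just-in-time firewall rule that is only activated once a user authenticates. It assumes an nftables 'inet filter' table whose input chain drops traffic by default and stands in for whatever control plane a real zero-trust platform would use; the authentication and audit functions are placeholders.

```python
import subprocess

def is_authenticated(user):
    # Placeholder: in practice this check would go to your identity provider / MFA.
    return user in {"alice"}

def audit_log(message):
    # Placeholder: in practice this would be shipped to the central log store / SIEM.
    print("AUDIT:", message)

def allow_after_auth(user, src_ip, port):
    """Activate a narrow allow rule only once the user has authenticated.

    Assumes an nftables 'inet filter' table whose 'input' chain drops traffic
    by default; requires root and the nft CLI. Adapt to your own firewall tooling.
    """
    if not is_authenticated(user):
        return False
    subprocess.run(
        ["nft", "add", "rule", "inet", "filter", "input",
         "ip", "saddr", src_ip, "tcp", "dport", str(port), "accept"],
        check=True,
    )
    audit_log(f"granted {user} tcp/{port} from {src_ip}")   # keep the trail
    return True

# Example: allow_after_auth("alice", "203.0.113.7", 22)
```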


Teresa Rule

@RNT_ProServices

Teresa L Rule, PMP, CISA, is the President and Co-Founder of RNT Professional Services, LLC. The Oklahoma native is a USMC Veteran with over 20 years' experience in cybersecurity project management. Clients include commercial, small business, and federal organizations in multiple industry sectors.

"Secure data management requires a multi-faceted approach..."

The initial facet must be the use of secure hardware to host the data and with which to build the network. The second facet must be a strong governance, risk, and compliance (GRC) team within the organization. The GRC team establishes business rules around the usage of data, as well as policies and procedures that define which organizational roles have access to which data. The third element should be a monitoring tool that tracks who accesses which data. This tool should be able to track both external intrusions and internal usage that falls outside normal behavior. The Wraith tool by Vanguard Infrastructures is an excellent example of this functionality. Validation of functionality and data security within an organization should include a routine quarterly assessment from a licensed third-party assessor. Together, a closely integrated information security process with unique tools can mitigate known risks such as insider threats and industrial espionage and ensure a better cybersecurity posture.
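As a simple illustration of flagging internal usage that falls outside normal behavior (the baseline method and numbers below are invented for the example and have nothing to do with the Wraith tool itself):

```python
from statistics import mean, pstdev

# Hypothetical counts of records each user accessed per day; the last entry is today.
history = {
    "alice": [110, 95, 120, 105, 98],
    "bob":   [20, 25, 18, 22, 850],
}

def flag_outliers(history, z_limit=3.0):
    flagged = []
    for user, counts in history.items():
        baseline, today = counts[:-1], counts[-1]
        mu, sigma = mean(baseline), pstdev(baseline) or 1.0
        if (today - mu) / sigma > z_limit:
            flagged.append((user, today))
    return flagged

print(flag_outliers(history))   # [('bob', 850)] -- usage far outside normal behavior
```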


Gabe Turner

@securitybaron

Gabe Turner is Director of Content at Security Baron, a website dedicated to cybersecurity and the Internet of Things.

"For better data visibility and monitoring for security professionals, I recommend a few things..."

Authentication: Security professionals should set up two-factor or multi-factor authentication so users can securely log into their accounts using a passcode, or biometrics like facial and fingerprint recognition.
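One common way to deliver that second factor is a time-based one-time password (TOTP). A minimal sketch using the pyotp library (the account name and issuer are placeholders) might look like this:

```python
import pyotp   # pip install pyotp

# Enrollment: generate a per-user secret and share it via a QR code / authenticator app.
secret = pyotp.random_base32()
totp = pyotp.TOTP(secret)
print(totp.provisioning_uri(name="jane@example.com", issuer_name="ExampleCorp"))

# Login: after the password check, require the current 6-digit code.
code = totp.now()              # in real life the user types this from their app
print("second factor accepted:", totp.verify(code))   # True
```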

Audit trail: You should also track who accesses computer systems and what applications they use on those systems.

Risk assessments: Periodically, security professionals should monitor potential threats to the security of their devices and perform software updates. Professionals should check for cyber-attacks on systems, laptops, hardware, data from customers as well as any intellectual property of the company. There are a bunch of different tools you can use like Cyber Health Check and vsRisk – Risk Assessment Tool from IT Governance USA, a website all about IT and cybersecurity.


Caroline Gerenyi

@v5visallo

Caroline Gerenyi is the Head of Marketing at Visallo.

"One of the most important trends right now in data visibility for security is investigative link analysis..."

This is a relatively new way for investigators and analysts to connect the dots in their data and see otherwise hidden connections through intuitive visualizations.

Link analysis tools don't replace the role of the analyst, but augment their hard-earned experience and intuition with data-driven insights and analysis that would be difficult, if not impossible, to discover otherwise (a toy sketch of the idea appears after the list below).

Investigative link analysis tools assist analysts in many critical ways, including:

  • Providing much greater insight by helping them discover complex hidden connections in massive amounts of data
  • Enabling them to work faster and achieve better situational awareness using an extensive collection of visualization tools
  • Ensuring nothing is missed by aggregating and analyzing data of all types into a single, searchable, investigation system
  • Promoting collaborative work across investigative teams, leading to more rigorous conclusions
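As a toy illustration of the underlying idea (built here with the open-source networkx library on invented entities, not with any commercial link analysis product):

```python
import networkx as nx   # pip install networkx

# Toy entity graph built from disparate records: people, phones, accounts.
G = nx.Graph()
G.add_edge("Person: A. Smith", "Phone: 555-0101")
G.add_edge("Phone: 555-0101", "Person: B. Jones")
G.add_edge("Person: B. Jones", "Account: XK-2291")
G.add_edge("Person: C. Reyes", "Account: XK-2291")

# A connection that is hard to spot in flat tables falls out of the graph.
path = nx.shortest_path(G, "Person: A. Smith", "Person: C. Reyes")
print(" -> ".join(path))
```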

Tags: Data Privacy, Data Security
