December 6, 2022

Organizations are struggling with mounting data losses, increased downtime, and rising recovery costs due to cyberattacks — to the tune of $1.06 million in costs per incident. Meanwhile, IT security staffs are stalled on getting defenses up to speed.

That’s according to the 2022 Dell Global Data Protection Index (GDPI) survey of 1,000 IT decision-makers across 15 countries and 14 industries, which found that organizations that experienced disruption have also suffered an average of 2TB data loss and 19 hours of downtime. 

Most respondents (67%) said they lack confidence that their existing data protection measures are sufficient to cope with malware and ransomware threats. A full 63% said they are not very confident that all business-critical data can be reliably recovered in the event of a destructive cyberattack.

Their fears seem founded: Nearly half of respondents (48%) experienced a cyberattack in the past 12 months that prevented access to their data (a 23% increase from 2021) — and that’s a trend that Colm Keegan, senior consultant for data protection solutions at Dell Technologies, says will likely continue. 

“The growth and increased distribution of data across edge, core data center and multiple public cloud environments are making it exceedingly difficult for IT admins to protect their data,” Keegan explains.

On the protection front, most organizations are falling behind; for instance, 91% are aware of or planning to deploy a zero-trust architecture, but only 12% are fully deployed.

And it’s not just advanced defense that’s lacking: Keegan points out that 69% of respondents stated they simply cannot meet their backup windows to be prepared for a ransomware attack.

Data Protection Strategies Face Headwinds

One of the primary reasons data protection strategies are failing is a lack of visibility into where data resides and what it is — a problem exacerbated by the rapid, ongoing adoption of cloud-native apps and containers. More than three-quarters of survey respondents said there is a lack of common data protection solutions for these newer technologies.

“Seventy-two percent said they are unable to keep up with what their developers are doing in the cloud — it’s basically a blind spot for them,” Keegan says.

Claude Mandy, chief evangelist of data security at Symmetry Systems, a provider of hybrid cloud data security solutions, agrees that a lack of visibility is the primary reason current data-protection strategies fail.

“Organizations simply do not know what data they have, where it is, let alone how it is protected,” he says. “Unfortunately, a lot of the data-protection failures are preventable by simply knowing the answers to these questions.”

He adds that the problem is worsened by the constant change of data within an organization. From his perspective, the sheer scale and complexity of millions of individual data objects across thousands of data stores in multiple clouds, multiplied by a seemingly infinite combination of roles and permissions for thousands of user and machine identities, would be challenging for chief information security officers (CISOs) to secure even if they were static. They're not, so the situation is even more challenging.

To boot, in a lot of cases, organizations are using multiple data security tools for different silos of information, with no overarching integration between them.

“The billions of objects form over months or years, and change constantly,” Mandy says. “This is further exacerbated through continuous data flows, privilege creep, data sprawl, and organizational churn, resulting in [visibility] to data that is far from ideal.”

Zero-Trust Implementation Lags, Despite Interest

Zero trust is growing in popularity in enterprise security because not trusting users by default works well to reduce risk. Indeed, virtually all the GDPI respondents indicated they intend to implement zero trust into their environments at some point.

However, actual deployment is not happening at a rapid pace — as mentioned, only 12% of respondents indicated they have fully deployed a zero-trust architecture in their environments. According to researchers, the main problem is a critical shortfall in IT skills, particularly as it relates to cyber recovery and data protection.

Widely reported shortages of trained cybersecurity professionals are driving the industry to come up with creative recruiting and training solutions, but just 65 cybersecurity professionals are in the workforce for every 100 available jobs, a recent study shows.

“If you don’t have cybersecurity professionals on staff, it’s virtually impossible to make progress on deploying a zero-trust framework, unless, of course, you rely on partners to help you get there,” Keegan says. “Now consider the demand for these resources in the market. Like supply chain constraints, demand is high, and the supply is low.”

Patrick Tiquet, vice president of security and architecture at Keeper Security, a provider of zero-trust and zero-knowledge cybersecurity software, says that zero-trust management can be challenging even with staff on board.

“Implementation of [zero trust] is currently a common data-protection strategy,” he explains. “However, for [zero trust] to be effective, access and roles must first be configured correctly.”

This means ensuring the right people have access to the right data and resources within the zero-trust architecture. Roles must be scoped tightly enough to protect the data each role can access — and correctly configuring access to data just one time ("set it and forget it," in other words) is not enough.

“The organization must maintain and manage data access through the lifecycle of the data, and as the organization grows,” Tiquet adds. “Organizations must make sure that, as teams grow and change, the access given to a specific role is still appropriate.”

Vendor Consolidation on the To-Do List

Keegan says it’s likely there will be some retooling at organizations in terms of platforms — many survey respondents (85%) said they believe they would see a benefit through reducing the number of data protection vendors they work with.

“The research tends to support this sentiment,” he adds. “For example, those using a single data protection vendor had far fewer incidents of data loss than those using multiple vendors.”

Likewise, the cost of data loss incidents resulting from a cyberattack was approximately 34% higher for organizations working with multiple data protection vendors than for those using a single vendor, according to the survey.

John Bambenek, principal threat hunter at Netenrich, a security and operations analytics software-as-a-service (SaaS) company, says the current spate of M&A and consolidation in the cybersecurity market speaks to those drivers — but warns that vendors trying to be all things to all security problems has its own downsides.

“The larger tech firms get, the less ability they have to innovate and solve problems, leading to opportunities for new vendors to step in with new solutions,” he explains. “It’s [a cycle] we see of M&A frenzy and stagnation, then new companies enter to innovate — and more M&A.”

Keegan, meanwhile, says he is hearing calls in the analyst community that organizations need to consider shifting their investments from cybersecurity prevention to resiliency.

“This means accepting the inevitability that security breaches will occur,” he notes. “Moreover, companies need to have a plan that enables them to recover their critical data and business applications in a timely manner to meet their service level objectives.”
