Orgs Can Reduce Breach Costs by 70% with Faster Detection
The impact of “dwell time” on the cost of data breaches can be significant, according to new research: a 2X improvement in the time it takes to detect and respond to an attack translates to a roughly 70% lower business impact.
The research, from the Aberdeen Group, also found that by the time a vulnerability is disclosed, roughly 80% of relevant exploits already exist, but only 70% of vendor-provided patches are available.
The analysis, based on data provided by Verizon covering more than 1,300 data breaches investigated between 2014 and 2016, found that half of breaches were detected within 38 days (the median), while the mean detection time was 210 days. The mean was skewed by a long tail of incidents that took as long as four years to uncover.
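The gap between the 38-day median and the 210-day mean is exactly what a long-tailed distribution produces. A quick sketch with invented detection times (not the actual Verizon dataset) shows how a few multi-year outliers inflate the mean while leaving the median nearly untouched:

```python
# Illustrative only: a small, invented sample of detection times in days,
# not the real breach data. A few long-tail outliers pull the mean far
# above the median, as in the Aberdeen analysis.
from statistics import mean, median

detection_days = [5, 12, 20, 38, 45, 90, 150, 400, 730, 1460]

print(median(detection_days))  # middle value; robust to outliers
print(mean(detection_days))    # long tail inflates the mean
```

This is why the median ("half of detections took up to 38 days") is the more representative figure for a typical breach.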
“The business impact from a data breach is the greatest at the beginning of the exploit, when records are first compromised,” said Barbara Kay, senior director of product and solutions marketing at McAfee, which sponsored the report. “That’s logical, since attackers want to get in and out with the goods (your data) in as little time as possible. Most responders are closing the barn door well after the horse has gone, when most of the damage has already been done.”
The data suggests that cybersecurity practitioners can better protect business value by implementing strategies that prioritize faster detection, investigation and response to incidents.
For instance, attackers have become increasingly adept at morphing the footprint of their malicious code to evade traditional signature-based defenses. But advanced pre-execution analysis of code features, combined with real-time analysis of code behaviors, is now being used to identify previously unknown malware without signatures, before it has the opportunity to execute.
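To make the idea concrete, here is a toy sketch of signature-less, pre-execution triage: rather than matching a known-malware hash, it scores a binary blob on static features such as byte entropy and embedded process-injection API names. The feature list and thresholds are invented for illustration and are far simpler than any real product's model:

```python
# Toy pre-execution triage: flag samples on static features instead of
# signatures. The API list and entropy cutoff are illustrative only.
import math
from collections import Counter

SUSPICIOUS_APIS = [b"VirtualAllocEx", b"WriteProcessMemory", b"CreateRemoteThread"]

def byte_entropy(data: bytes) -> float:
    """Shannon entropy in bits per byte; packed/encrypted payloads run high."""
    counts = Counter(data)
    total = len(data)
    return -sum((n / total) * math.log2(n / total) for n in counts.values())

def looks_suspicious(data: bytes, entropy_cutoff: float = 7.0) -> bool:
    """Flag high-entropy samples or those embedding injection-related APIs."""
    if byte_entropy(data) > entropy_cutoff:
        return True
    return any(api in data for api in SUSPICIOUS_APIS)

print(looks_suspicious(b"hello world, nothing to see here"))  # benign text
print(looks_suspicious(b"xx" + b"WriteProcessMemory" + b"xx"))  # flagged
```

The point of the approach is that a morphed sample changes its hash but not necessarily its statistical and behavioral features, so feature-based scoring survives the mutation that defeats signatures.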
On the containment front, advanced endpoint defense capabilities now allow potentially malicious code to load into memory, but block it from making system changes, spreading to other systems or performing other typically malicious behaviors. This approach provides immediate protection and buys additional time for intelligence-gathering and analysis.
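A minimal sketch of that "contain, don't just kill" pattern: suspect code is allowed to run, but any attempt at a system-changing action is denied and recorded as evidence for responders. The action names and policy here are invented for illustration, not drawn from any real endpoint product:

```python
# Toy containment policy: permit benign actions, deny and log
# system-changing ones. Action names are illustrative only.
BLOCKED_ACTIONS = {"write_registry", "modify_file", "spawn_remote_process"}

class ContainmentLog:
    def __init__(self):
        self.events = []  # denied attempts, kept as intel for analysts

    def attempt(self, action: str, target: str) -> bool:
        """Return True if the action is allowed; record denials."""
        if action in BLOCKED_ACTIONS:
            self.events.append((action, target))
            return False
        return True

log = ContainmentLog()
print(log.attempt("read_file", "/etc/hostname"))        # benign: allowed
print(log.attempt("write_registry", "HKLM\\Run\\evil")) # blocked, logged
print(log.events)
```

Letting the sample run while denying it persistence is what buys the analysis time the article describes: the denied-action log itself becomes the intelligence.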
For data center and cloud security, some of the same endpoint tactics can be applied to server and virtual workloads to protect against both known and unknown exploits. Aberdeen Group also suggested that users can improve their results through shielding and virtual patching. These techniques have been around for years, but are especially helpful when assets are centralized.
Virtual patching is another strategy, the firm said. This establishes a policy that is external to the resources being protected, to identify and intercept exploits of vulnerabilities before they reach their intended target. In this way, direct modifications to the resource being protected are not required, and updates can be automated and ongoing.
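The mechanism can be sketched in a few lines: a filter sits in front of a vulnerable service and drops requests matching known exploit patterns, so the service is protected without being modified. The patterns below (a classic path-traversal probe and a script-injection marker) and the handler are illustrative only:

```python
# Toy virtual patch: an external policy wraps an unmodified handler.
# The exploit patterns are illustrative, not a production rule set.
import re

# Rules live outside the protected resource, so they can be updated
# automatically as new vulnerabilities are disclosed.
EXPLOIT_PATTERNS = [re.compile(r"\.\./"), re.compile(r"<script>", re.I)]

def virtual_patch(handler):
    """Wrap a request handler with an external exploit-blocking policy."""
    def guarded(request: str) -> str:
        if any(p.search(request) for p in EXPLOIT_PATTERNS):
            return "403 Forbidden (virtual patch)"
        return handler(request)
    return guarded

@virtual_patch
def vulnerable_handler(request: str) -> str:
    # Stands in for the unmodified, still-vulnerable application.
    return f"200 OK: served {request}"

print(vulnerable_handler("/reports/q3.pdf"))    # normal traffic passes
print(vulnerable_handler("/../../etc/passwd"))  # exploit attempt blocked
```

Because the policy is external to the application, the protected resource needs no code change or downtime, which is the property the article highlights.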
And finally, security designs that use fewer policy enforcement points (i.e., enforcing at selected points in the enterprise network, as opposed to taking the time to apply vendor patches on every system) are a good best practice.
“As an industry, we are spending more and working harder to shorten the time advantage of the attacker,” Kay said. “Modern tools and thoughtful practices in endpoint and data center infrastructure complement the analytics and automation investments that are transforming the security operations center (SOC): technologies such as anomaly detection and threat intelligence correlation.”