(re)Building a Security Program – part 3
October 16th, 2015 - Posted By: Stew Stremel
Just the Facts, Ma'am: Where's the Data?
Data is king, especially when trying to make trade-offs between competing perspectives.
Frequently, an organization will recognize that it has a gap in an area and quickly purchase a tool to solve the issue. Then, due to lack of time and shifting priorities, it never fully implements the tool. This leaves the organization in an even worse position: management thinks the problem has been "solved" when in fact it hasn't, all while money is being spent on tools that aren't fully used.
In this case, we surveyed the tools in use and divided them up into two categories:
- proactive: tools that actively remove or block a security risk
- detective: tools that report on the coverage and usage of other tools
What we found was that the proactive tools were deployed fairly well. However, there was a big gap in the detective tools, and that gap was allowing security risks to slowly build up over time.
We chose to lean on the vulnerability scanning tool as the biggest driver of visibility into where the organization was exposed, and we implemented a weekly scanning process covering nearly all the devices in the organization. The challenge then became how to measure and prioritize.
Vulnerabilities are constantly being discovered, so there is a steady inflow of new findings. We chose to look at vulnerabilities in three age buckets:
- 0-45 days old (blue)
- 45-90 days old (orange)
- more than 90 days old (red)
We then created a ratio of the number of vulnerabilities in each age bracket to the number of devices authenticated to during the weekly scan. This showed us that, on average, we currently had 26 sev4/sev5 vulnerabilities per device that had been present for more than 90 days. This "exposure ratio" was indicative of things that had simply been building up over time (patch deployments failing, Adobe and Java not being centrally managed, users installing unsupported products, etc.).
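To make the metric concrete, here is a minimal sketch of the calculation in Python (my choice of language; the post doesn't tie this to any particular tooling). The field names and sample rows are assumptions for illustration only, not the scanner's actual export format.

```python
from datetime import date

# Hypothetical weekly scan export; field names and sample rows are
# assumptions for illustration, not the scanner's actual schema.
findings = [
    {"device": "host-01", "severity": 5, "first_seen": date(2015, 6, 1)},
    {"device": "host-01", "severity": 4, "first_seen": date(2015, 9, 20)},
    {"device": "host-02", "severity": 5, "first_seen": date(2015, 5, 15)},
]
devices_authenticated = 2  # devices the weekly scan actually logged in to


def bucket(age_days):
    """Map a vulnerability's age in days to the three buckets above."""
    if age_days <= 45:
        return "blue"    # 0-45 days old
    if age_days <= 90:
        return "orange"  # 45-90 days old
    return "red"         # more than 90 days old


scan_date = date(2015, 10, 16)
counts = {"blue": 0, "orange": 0, "red": 0}
red_sev4_sev5 = 0

for f in findings:
    age = (scan_date - f["first_seen"]).days
    b = bucket(age)
    counts[b] += 1
    # The exposure ratio only counts sev4/sev5 findings older than 90 days.
    if b == "red" and f["severity"] >= 4:
        red_sev4_sev5 += 1

# Exposure ratio: >90-day sev4/sev5 vulnerabilities per authenticated device.
exposure_ratio = red_sev4_sev5 / devices_authenticated
print(counts, "exposure ratio: %.1f" % exposure_ratio)
```

Tracking a first-seen date per finding is what makes the age buckets, and therefore the week-over-week trend, possible.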
In the weekly trend you can see the monthly releases of new patches from Microsoft, Adobe, Oracle, and others show up as vulnerabilities and then get patched in the environment. In addition, we prioritized key clean-up activities each month out of the >90-day pool to drive the ratios back down.
Now comes the truly hard part: wash, rinse, repeat. However, I think the organization is in a great place to build on its successes as it tackles key issues in each capability block.