I have been consistently tracking a fun metric around vulnerabilities since March 19, 2024. Before that I would occasionally mention it during talks or in chat, but I don’t think I formally blogged about it or tracked the exact number. So here we are to discuss the prevalence of vulnerabilities in security software, the very thing designed to protect us. As best I recall, 10 to 20 years ago I was pretty consistent in saying around 3.5% of all vulnerabilities we tracked in OSVDB / VulnDB were in security software. While that is a relatively low percentage, it still represents a lot of vulnerabilities in absolute terms.
As of this blog post, there are at least 22,411 publicly disclosed vulnerabilities in security software.
Back in June 2014, I gave a presentation at Shakacon on the history of vulnerabilities and updated the metric accordingly. Toward the end I mentioned vulnerabilities in security software, and at the time the figure was 2%. So jumping three percentage points since then is significant. Why did that happen though? I am fairly confident it was not a serious uptick in new vulnerabilities being introduced into such software. Instead, it highlights an interesting challenge in tracking metadata.
When vulnerabilities are disclosed they get imported into a queue for analysis. This happens at any real vulnerability database that does its own analysis; in the past, Secunia, SecurityFocus, and IBM’s X-Force all did this, as did OSVDB and VulnDB. As best I know, OSVDB, turned VulnDB, is the only database to ever track this in any fashion. Doing so requires flagging ‘Security Software’ in metadata. Can you honestly say you know every single piece of security software out there? How about over 20 years, with dozens of analysts of varying experience? That inconsistency created the gap between two and five percent.
So recently I did a somewhat comprehensive audit, trying to update and flag as many entries as warranted. Many were easy, since we had recently identified and flagged them as such, but historical entries were not. A quick search in VulnDB for e.g. $SecuritySoftwareName yielded many entries that needed the update. When I started the audit, the total was in the mid-to-high 3% range. Searching for specific words prone to appear in the names of such software expanded it further; “Fire” and “Bastion” are good examples, the latter more prevalent in Chinese software.
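As a rough illustration of that kind of keyword pass, here is a minimal sketch in Python. The keyword list and function name are hypothetical, since VulnDB’s actual tooling is not public, and every hit would still need an analyst to confirm the product’s purpose before flagging:

```python
# Hypothetical keyword-based audit pass over entry titles.
# Keywords and helper names are illustrative, not VulnDB's actual tooling.
KEYWORDS = ["firewall", "antivirus", "bastion", "sandbox"]

def candidate_entries(titles, keywords=KEYWORDS):
    """Return titles containing a security-themed keyword.

    These are candidates only; a substring match proves nothing about
    the product's purpose, so each hit still needs manual review.
    """
    hits = []
    for title in titles:
        lowered = title.lower()
        if any(kw in lowered for kw in keywords):
            hits.append(title)
    return hits

titles = [
    "Acme Firewall Pro Remote Code Execution",
    "ExampleCMS SQL Injection",
    "Bastion Host Manager Privilege Escalation",
]
# The first and third titles match; the second does not.
print(candidate_entries(titles))
```

Substring matching like this is deliberately greedy, which matches the audit described above: it is cheaper to review false positives by hand than to miss unflagged entries.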
After that audit of VulnDB entries, the percentage jumped considerably, most recently landing at 5.01%. Thus, this blog post, since it broke 5%! But is that all? Absolutely not. Is it that clean cut to track this? Again, absolutely not. So let’s start with the history of tracking since early 2014:
| Date | Security Software Vulns | Percentage vs All Vulns |
| --- | --- | --- |
| 2026-04-22 | 22,388 | 5.01% |
| 2026-01-01 | 21,095 | 4.88% |
| 2025-12-22 | 20,092 | 4.67% |
| 2025-12-01 | 18,746 | 4.39% |
| 2025-11-03 | 17,327 | 4.09% |
| 2025-10-13 | 16,293 | 3.87% |
| 2025-01-01 | 14,224 | 3.70% |
| 2024-03-19 | 12,498 | 3.56% |
| 2014-06-01 | n/a | 2% |
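As a back-of-the-envelope check on the table, the implied size of the whole database falls out of each row, and the growth of the flagged share is easy to compute. These are derived figures only, not official VulnDB numbers:

```python
# Back-of-the-envelope math from the 2026-04-22 row of the table above.
security_vulns = 22_388      # flagged security software entries
share = 0.0501               # 5.01% of all tracked vulnerabilities

# Implied size of the entire database at that date (~446,866 entries).
total_vulns = security_vulns / share
print(round(total_vulns))

# Growth in the flagged share since the 2024-03-19 baseline (3.56%),
# measured in percentage points.
print(round((0.0501 - 0.0356) * 100, 2))
```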
As noted, that is not all, even counting only pure security software, meaning software whose entire purpose is to provide security in some form. There are going to be packages that do just that, are lesser known, and do not have names that scream their purpose. So consider 5.01% to be the absolute minimum. I further believe it to be accurate, with so few false positives as to be statistically irrelevant, and certainly not enough to offset the unflagged entries that should be counted.
The other side of the equation, alluded to above, is the question of “is it that clean cut to track this?” I said no, but why? Right now the criterion is that the sole purpose of the software is to add security. But these days a lot of software has significant security functionality built in. If software is exclusively about offering sandbox functionality, then it counts. But what about software that has recently added that functionality as a defense-in-depth measure? The software itself is not “security”, but the vulnerability is in functionality explicitly provided as “security”. Should that count?
The Linux Kernel is not security software; its primary purpose is to provide an operating system. But what if the vulnerability is in the native iptables / firewall system? This becomes a sort of philosophical and pedantic debate, the kind we love to have and do more often than most would believe. I can argue this both ways, so I am curious what my readers think. Please, sincerely, comment below with your thoughts! Rather than track only the ‘software’, is it better and more helpful to track the ‘functionality’ instead, when it is clear cut?
To help put this in perspective, as of several weeks ago, here are some numbers based on title searches, where some of the entries are flagged as security software but many are not:
- 1,183 ‘sandbox’
- 700 ‘sandbox bypass’
- 193 ‘sandbox escape’
As you can see, that would add some portion of 2,000 more vulnerabilities, which would push the 5.01% total up a fair amount. And that is just a single word that is trivial to search for! Then add iptables and similar functionality embedded in parent software. So, food for thought, and I welcome you to join this exercise.
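To make “a fair amount” concrete, here is a rough illustration of how the share moves if some fraction of those ~2,000 sandbox-related entries were also flagged. Every number below is an assumption derived from this post, not an audited VulnDB figure:

```python
# Rough illustration only: how the 5.01% figure would move if some share
# of the ~2,000 sandbox-related title hits were also flagged.
security_vulns = 22_388
total_vulns = round(security_vulns / 0.0501)  # implied database size

for added in (500, 1_000, 2_000):
    new_share = (security_vulns + added) / total_vulns * 100
    print(f"+{added}: {new_share:.2f}%")
```

Even the full 2,000 only moves the needle from 5.01% to roughly 5.46%, but as the post argues, that is one keyword out of many.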
