VulnCon: NVD Symposium, Answers, and More Concerns

Yesterday, at the inaugural VulnCon, Tanya Brewer from the NVD gave a presentation that was listed on the agenda as “NVD Symposium”. At the talk, her slides began with a header “The National Vulnerability Database: Exploring Opportunities”. However, neither the symposium nor the opportunities were the primary topics that most people were interested in. Fortunately for the crowd, Tanya, the NVD Program Manager for the last four and a half years, addressed the elephant in the room first: what is going on with NVD, why is the backlog of vulnerabilities awaiting analysis growing rapidly, and why is almost no analysis being done?

Tom Alrich has already published his thoughts on the matter. I will share my notes on the topic, along with my own perspective as a vulnerability database manager and some additional information to put some of her comments in context. Tanya definitely brought some interesting general tidbits about NVD’s operation to light that help piece together some longer-standing questions I have had. I attended the conference on my own dime and do not represent my day job in this post.

Tanya began by offering some reassurances that, according to the people I chatted with during and after the talk, did not go over well. “We’re not completely shut down…” and “We’re taking care of priority things… KEV, Patch Tues, Ivanti… we’re getting critical things taken care of, just not as much as we were.” First, I feel this is disingenuous. The vulnerabilities covered by “KEV and Ivanti” number in the couple-dozen range, which does not represent any real output and certainly not all issues of interest. If this was meant to provide any level of comfort, it did not.

Second, I simply do not think this is an accurate statement. Saying that NVD is covering Patch Tuesday is deceptive at best. Unless she was using a different meaning for “taking care of”, then looking at the March 12 Patch Tuesday, over two weeks ago, we see plenty of Adobe vulnerabilities that are awaiting analysis. Moving to Microsoft, we see the same thing. SAP? Yep, once again awaiting analysis. You cannot say a database is “taking care of priority things”, specifically call out Patch Tuesday, and not have analyzed those vulnerabilities from weeks ago. That is why so many people are confused and frustrated with NVD right now; they aren’t doing the minimum required analysis for high-profile issues.
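If you want to check this for yourself, below is a minimal sketch that looks up a CVE’s analysis status via the public NVD CVE API 2.0. The endpoint, the cveId parameter, and the vulnStatus field are based on my own use of that API and its public documentation; treat this as an illustration rather than a supported tool, and adjust if the API has changed.

```python
# Minimal sketch: check whether a given CVE is still "Awaiting Analysis" in NVD,
# using the public NVD CVE API 2.0. Endpoint and field names are based on my own
# use of the API; adjust if they have changed. An API key raises the rate limit
# but is not required for a one-off lookup.
import sys
import requests

NVD_CVE_API = "https://services.nvd.nist.gov/rest/json/cves/2.0"

def nvd_status(cve_id: str) -> str:
    """Return the NVD vulnStatus (e.g. 'Awaiting Analysis', 'Analyzed') for a CVE."""
    resp = requests.get(NVD_CVE_API, params={"cveId": cve_id}, timeout=30)
    resp.raise_for_status()
    vulns = resp.json().get("vulnerabilities", [])
    if not vulns:
        return "not found in NVD"
    return vulns[0]["cve"].get("vulnStatus", "unknown")

if __name__ == "__main__":
    # Pass any CVE ID from a recent Patch Tuesday on the command line.
    print(nvd_status(sys.argv[1]))
```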

Root-cause Analysis

So why did this happen? Tanya described it as “we ended up with a little bit of a perfect storm last year”, which basically frames this as a rare event that, once past, we should not see again. However, that turned around immediately when she said that in May or June “we saw things coming”. So whatever this series of events was, it could be forecast to some degree yet was still somehow unavoidable. That should be worrisome to those relying on NVD: this could happen again and, worse, NVD may not be able to stop it even when they know a problem is coming.

When Will It Be Fixed?

This is the question everyone wants answered, and the news is good on the surface. Tanya said that they are “actively reallocating personnel, working with other agencies”, that it will “take us a couple more weeks to get moving”, and that they are “in the process of fixing the problem”. She was very adamant, though, and this was reassuring to a degree, that “the NVD is not shutdown, it will not be shutdown.” Further reassurance came in the form of her saying “we’ll make it robust again, we’ll make it grow, we’re not going away”. In fact, her presentation was explicitly about that growth, but as Tom points out in his blog linked at the start, there are concerns there too.

For those needing a concrete answer directly from NVD, Tanya said that a written statement is coming in as soon as a couple of weeks. She said that it is “hard to write in a concise manner what happened” but the statement will go out in a press release and be posted on the NIST news site. Unfortunately, Tanya hit her time limit and a lot of questions were left outstanding. I had a colleague approach her after the talk to ask one question of mine that I still wanted details on. My question, posed via the conference Discord, was “Can you tell us more about this ‘perfect storm’ you saw coming in May/June 2023, that led to a near shutdown of NVD for 30+ days in 2024?” Her answer, paraphrased by my colleague, was “that there was a lot of red tape in terms of getting funding approved. Wasn’t very clear beyond that. It always has been an issue”. I have problems with this answer, and the fact that, according to my colleague, she quickly changed subjects only strengthens my concerns. I’ll talk more about all of this later in the post.

This part of the talk lasted about five minutes before moving quickly into the ‘Symposium’ part, which really should have been titled ‘Consortium’, along with areas where she plans for NVD to grow and improve.

NVD Growth?

While the presentation did cover many areas of potential growth within NVD, and some of it is optimistic for the community, it came with a gut-punch disclaimer. Tanya said that the NVD is “considering opportunities for growth and enrichment that should be addressed within the next one to five years”. It seems clear to me that people can expect a return to the status quo at some point in the coming weeks or months, but any significant features and growth are not coming immediately.

Some of the features and growth highlighted:

  • Ability for outside parties to submit CPE data to the dictionary in ways that scale. This is something that frustrates one team at my day job, as we have to predict CPEs every day for products that are found vulnerable, but do not appear in the CPE dictionary.
  • Setting up email alerting based on products, directly on the NVD site. While this can be handy, email alerting does not scale for the vast majority of organizations out there. Further, that same feature is already freely available on other sites.
  • Supporting JSON 5.0 for data ingestion. This was a big one that many in the audience were extremely interested in, and seemed quite annoyed that it wasn’t coming sooner.
  • Supporting new types of data such as EPSS and NIST Bugs Framework data (see the EPSS sketch after this list).
  • Including pURLs in NVD data.
  • Additional translations beyond English and Spanish.
  • … and several other potential improvements.
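On the EPSS point, as a rough illustration of the kind of external enrichment data NVD would be ingesting, here is a minimal sketch that pulls EPSS scores from FIRST’s public API. The endpoint and response fields reflect my understanding of that API and say nothing about how NVD actually plans to consume EPSS.

```python
# Minimal sketch: fetch EPSS scores for a few CVE IDs from FIRST's public EPSS
# API. Endpoint and field names reflect my understanding of that API; this is
# illustrative only and does not describe NVD's planned ingestion.
import requests

EPSS_API = "https://api.first.org/data/v1/epss"

def epss_scores(cve_ids: list[str]) -> dict[str, float]:
    """Return {CVE ID: EPSS probability} for the given CVE IDs."""
    resp = requests.get(EPSS_API, params={"cve": ",".join(cve_ids)}, timeout=30)
    resp.raise_for_status()
    return {row["cve"]: float(row["epss"]) for row in resp.json().get("data", [])}

if __name__ == "__main__":
    # Log4Shell as a well-known example ID.
    print(epss_scores(["CVE-2021-44228"]))
```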

The next portion of the talk included J’aime Maynard, the Consortia Agreements Officer at the Technology Partnerships Office (TPO), who is directly involved in the upcoming consortium initiative. J’aime talked about some of the bureaucracy of how it will work, what organizations can participate, and some of the red tape that surrounds it including signing a Cooperative Research and Development Agreement (CRADA) with NIST.

Bottlenecks, Red Tape, Solutions, and Concerns

Jumping back to the question of when things will resume and the associated challenges, there is more to be said. A lot of this information and perspective came out of the Q&A that occurred during the talk.

One of the bottlenecks the NVD faces is generation of CPE, “taking 65% of their analysis time”. Tanya indicated that they are currently training more people to do CPE analysis, but it will take time to work through the backlog (4,581 at the time of this post). She said there are “only so many people in the world trained for this kind of thing”, but that too struck me as disingenuous. Fairly accurate programmatic prediction of CPEs based on vendor and product names was not difficult for us almost ten years ago. Training someone to do CPE, especially with the full CPE dictionary there as a reference, should not represent a significant hurdle.
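To illustrate what I mean by programmatic prediction, here is a toy heuristic that builds a candidate CPE 2.3 string from a vendor, product, and version. This is a sketch for illustration only, not NVD’s process and not the system my team runs; a real pipeline would validate candidates against the official CPE dictionary and handle far more edge cases.

```python
# Toy heuristic: predict a candidate CPE 2.3 formatted string from vendor,
# product, and version strings. Illustrative only; a real pipeline would check
# candidates against the official CPE dictionary.
import re

def _normalize(value: str) -> str:
    """Lowercase, collapse whitespace to underscores, drop stray punctuation."""
    value = value.strip().lower()
    value = re.sub(r"\s+", "_", value)
    return re.sub(r"[^a-z0-9_.+-]", "", value)

def predict_cpe(vendor: str, product: str, version: str = "*", part: str = "a") -> str:
    """Build a CPE 2.3 string: part 'a' = application, 'o' = OS, 'h' = hardware."""
    return "cpe:2.3:{}:{}:{}:{}:*:*:*:*:*:*:*".format(
        part, _normalize(vendor), _normalize(product), _normalize(version)
    )

if __name__ == "__main__":
    # Prints: cpe:2.3:a:microsoft:internet_explorer:11.0:*:*:*:*:*:*:*
    print(predict_cpe("Microsoft", "Internet Explorer", "11.0"))
```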

When asked about supporting CVSSv4, Tanya replied that they “are not ready for it this moment, want to work through consortium to do that in short term.” She added that the NVD is in the “process of implementing v4 internally. Working on training internal staff to do assessments.” This includes building the ability to consume CVSSv4 scores from, e.g., CNAs. She assured the crowd that NVD will continue to provide CVSSv3.1 scores “as long as agencies require them.” She was also asked why NVD does not provide temporal scoring, to which she replied that historically they did not “for the sake of operational need of not having to keep that threat intel updated” and are “not ready to offer support for that” as “their cadence for updating that isn’t compatible.”
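For context on what consuming CVSSv4 from CNAs involves at the data level, below is a minimal sketch that parses and sanity-checks the base metrics of a CVSS v4.0 vector string. It validates structure only and does not compute a score; actual scoring follows FIRST’s official specification and calculator.

```python
# Minimal sketch: parse and validate the base metrics of a CVSS v4.0 vector
# string (e.g. as published by a CNA). Structure check only; it does NOT
# compute a score -- use FIRST's official CVSS v4.0 calculator for that.

# Allowed values for the eleven CVSS v4.0 base metrics.
BASE_METRICS = {
    "AV": {"N", "A", "L", "P"},   # Attack Vector
    "AC": {"L", "H"},             # Attack Complexity
    "AT": {"N", "P"},             # Attack Requirements
    "PR": {"N", "L", "H"},        # Privileges Required
    "UI": {"N", "P", "A"},        # User Interaction
    "VC": {"N", "L", "H"},        # Vulnerable system Confidentiality
    "VI": {"N", "L", "H"},        # Vulnerable system Integrity
    "VA": {"N", "L", "H"},        # Vulnerable system Availability
    "SC": {"N", "L", "H"},        # Subsequent system Confidentiality
    "SI": {"N", "L", "H"},        # Subsequent system Integrity
    "SA": {"N", "L", "H"},        # Subsequent system Availability
}

def parse_cvss4(vector: str) -> dict[str, str]:
    """Split a CVSS:4.0 vector into {metric: value}, validating base metrics."""
    prefix, _, rest = vector.partition("/")
    if prefix != "CVSS:4.0":
        raise ValueError(f"not a CVSS v4.0 vector: {vector!r}")
    metrics = dict(item.split(":", 1) for item in rest.split("/") if item)
    for name, allowed in BASE_METRICS.items():
        if metrics.get(name) not in allowed:
            raise ValueError(f"missing or invalid base metric {name}")
    return metrics

if __name__ == "__main__":
    print(parse_cvss4("CVSS:4.0/AV:N/AC:L/AT:N/PR:N/UI:N/VC:H/VI:H/VA:H/SC:N/SI:N/SA:N"))
```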

A Note on Budgets

At some point, I am not sure if my note-taking broke down or the conversation around the Q&A got disjointed, but Tanya said that “only senior executives in the gov can spend quickly, and the NVD director is not that.” This was a curious comment and leans into one unsubstantiated rumor that “the NVD was defunded”. I explicitly asked in Discord for her to confirm or deny if the NVD had been defunded in any way, but that question was never put to her. This was of specific interest to me because, via a FOIA request, I happen to know their 2019 calendar year budget for “personnel, materials, computers, and applicable expenditures” was a whopping $6,066,924.85. Even if the NVD had their budget cut in half, that is still an obscene amount of money to run that database. I say that as someone who has been managing a vulnerability database in some form or another since 1993, 13 of those years professionally.

One other thing Tanya mentioned was that at its largest, the NVD team was 21 people. For the sake of argument, let’s say that was the team size in 2019 with the budget above. Cutting out two million for infrastructure, which is beyond generous, leaves roughly $4 million, meaning each of the 21 people could have enjoyed a $190,476 salary. I think it is safe to say that while the director may earn that, the rest of the team would not. So where is all this money going, and why are we facing this near shutdown now? To help answer that question, I put in a new FOIA request on March 21 for their 2023 and 2024 budgets. While I am curious what Tanya makes, I learned my lesson after including that in a request in 2005 and upsetting the director at the time. A subsequent FOIA request to NIST for NVD information was suddenly met with an exorbitant fee to complete.

Other Bits of Information

Chris Turner, a senior advisor to NVD, was present and answered some of the questions. One resulted in him sharing that “NVD does checks between CPE/CWE data submitted [..] and their perspective.” He said the NVD spot checks “5% of data coming in from providers”. By providers, I assume he means CNAs. This really causes me more concern when put in the context of that budget. What is this large team doing if they are checking that little data? Trust me, I know that doing vulnerability disclosure analysis is a painful process. The team I work with, smaller than NVD’s but apparently more capable, fully analyzes and enriches every disclosure you see in NVD and, on average in 2023, an additional 22 vulnerabilities per day.

There was considerable debate and rumbling in the conference chat and after the presentation about NVD CVSS scoring. The presenters said that when “information comes from the CVE list, e.g. if Linux Kernel vuln with a CVSS score, that is what is used in NVD data.” They added that “if they analyze [the vulnerability] and there are nuances about implementation, they may score differently.” This is certainly a pain point for any vulnerability database doing CVSS scoring, ours included.

One thing Tanya said that was germane to this was (slightly paraphrased): “NIST answers to US federal agencies first, but doesn’t intend to draw the border there. They have state/local/city/tribal governments using their data. [..] It’s not a question if the Department of Defense (DoD) uses their data, it is a question of how many departments of the DoD use it. [..] That is 21 departments around the world. [..] The NVD gets 160 countries a month using their data. They know they have a global audience and try to serve it as best they can.” I can assure you, Tanya is correct, and the pains NVD feels on CVSS scoring and the pushback from stakeholders are real.

I look forward to more people sharing their takeaways from the NVD presentation and information. Note: the quotes above were taken roughly verbatim during the presentation and I believe I captured them accurately.

[Update 1h after publication: There was a panel talk, unfortunately not well titled, that included Tanya Brewer. In that panel discussion she mentioned that during the pandemic, the volume of emails NVD received tripled and never subsided. Despite that, the number of staff available to handle that load stayed the same. When Tanya was asked if there was a root-cause analysis of why the volume increased, she replied that they didn’t have the resources to do that. Finally, I will leave you with one quote from that panel that is refreshing to see as far as the honesty and intent to improve.]

“NVD is not the best database. If it was, I wouldn’t be putting together a consortium. There is a lot of room for NVD to improve.” – Tanya Brewer

[3/29/2024 Update: Cyberscoop has published an article titled “Plan to resuscitate beleaguered vulnerability database draws criticism”.]

[4/26/2024 Update: NIST/NVD has released a new statement that uses different words, but does not appear to give any new information.]
