Buckle Up For the DoD's Software Fast Track ATO (SWFT)
A look at the DoD's emerging expedited software cybersecurity procurement process, Software Fast Track (SWFT)
If you’re like me, you likely have been following headlines about the new DoD software procurement cyber program dubbed “Software Fast Track (SWFT).” This program aims to improve the Pentagon's legacy purchasing process and how it assesses and authorizes the software the DoD procures and uses.
The DoD and the Federal government more broadly represent one of the largest buyers of IT and software in the industry, spending billions a year on IT procurement and consumption. This often involves cybersecurity processes to get assurance around the software they purchase and place on their networks.
This is where SWFT comes in, aiming to replace the existing Authority to Operate (ATO) process used by the DoD. For those unfamiliar with ATOs, they are part of NIST's broader Risk Management Framework (RMF) and the focus of publications such as NIST 800-37.
In this article, I will take an early look at the emerging SWFT process, what it is likely to involve based on recent public comments from DoD leaders and RFIs, and explain some of the underlying concepts and requirements for those unfamiliar with them.
Authority to Operate (ATO) 101
While I won’t be diving deep into ATOs in this article, for those unfamiliar, ATO stands for Authority to Operate. Before a system and/or software is allowed to go into production environments in the DoD/Federal space, it must receive an ATO from an Authorizing Official (AO).
This historically involves implementing a subset of the NIST 800-53 security controls, assessing that implementation, and producing an accompanying Body of Evidence (BOE) of artifacts related to the controls and the broader security program, such as Incident Response, Business Continuity, Access Control, and much more, based on the 800-53 security control families.
The fact that ATOs have a bad reputation isn’t so much because the security controls are inadequate, but because the process itself has typically been done manually, with static, snapshot-in-time assessments of control implementations, static paper-based documents in Word/PDF/Excel, and a time-consuming, cumbersome experience for stakeholders such as Information System Security Officers (ISSOs), System/Software/Mission Owners, Security Control Assessors (SCAs), Authorizing Officials (AOs), and the software vendors involved.
I want to emphasize that this problem isn’t isolated to just the DoD/Federal community, but Governance, Risk and Compliance (GRC) more broadly, which I laid out in my recent article “GRC Is Ripe for a Revolution: A look at why GRC lives in the dark ages and how it can be fixed”.
Software Fast Track
Enter the DoD’s Software Fast Track (SWFT).
While the ATO process has long been a pain point in the Federal community, the recent presidential administration change has brought about transitions among key Federal and DoD leaders, including the DoD CIO/CISO, who is now Katie Arrington.
Ms. Arrington isn’t new to the DoD, having held various roles before her current one, including CISO from 2019-2020, and she also made a congressional bid of her own. She has now returned as CIO and seems intent on implementing drastic change, which arguably aligns with a broader deregulatory and innovation push, such as the Executive Order tied to deregulation.
Ms. Arrington began providing insights into the forthcoming SWFT process at a recent public sector event. Below are some quotes from a DefenseScoop article.
While her comments are a bit of a generalization, since the "paperwork" she refers to below reflects the rigor of security control implementation and assessment meant to ensure DoD/Federal systems and data are secure, she isn’t entirely wrong either.
As I cited in my GRC Revolution article above, GRC is fundamentally broken. In an age of Cloud, DevOps, APIs, CI/CD Pipelines, Automation, and now AI-driven development, GRC lives in static documentation, system sampling, snapshot-in-time assessments, cumbersome methodologies, and questionable impact in terms of risk reduction due to how the activities are carried out.
Nuances around ATOs aside, the DoD is clearly adamant about replacing the existing process with a new one, and that is where SWFT comes in.
So what does/will SWFT entail?
We’re getting early signs of that with a recently released RFI for SWFT Tools.
Below is an excerpt from that RFI:
Here are the six questions the RFI asks, which really indicate what sort of information, artifacts, and more the DoD intends to use to assess the risk of software rather than relying on the traditional ATO process:
For those unfamiliar with some of these items, I wanted to explain them briefly.
Secure Software Development & Software Supply Chain Security
The first question is about secure software development and software supply chain security (SSCS). I will set the software development piece aside, because we will touch on it next, but SSCS is a broad domain, and entire books can be, and have been, written on it. I know this because I wrote one of the leading books on the topic, Software Transparency: Supply Chain Security in an Era of a Software-Driven Society.
There are many SSCS frameworks, guides, and publications from organizations such as NIST, Cloud Security Alliance (CSA), Center for Internet Security (CIS), Cloud Native Computing Foundation (CNCF), CISA/NSA, and others. I covered many of these in my book, and they have some similarities as well as some differences.
The DoD seems to be trying to gauge what industry references and standards organizations are adhering to and aligning with in this area.
NIST 800-218 Secure Software Development Framework (SSDF)
Next up on the list of questions is NIST’s SSDF, which focuses on secure software development, as the name implies. For those unfamiliar with the SSDF, it already existed prior to the Cybersecurity Executive Order (EO) 14028 and was updated as part of the downstream activities that followed it. It deals with implementing security throughout the software development lifecycle (SDLC).
Rather than reinventing the wheel as we often do in Cyber, SSDF leaned into existing industry frameworks around secure SDLC, such as the Building Security in Maturity Model (BSIMM) and OWASP’s Software Assurance Maturity Model (SAMM), among others.
The SSDF organizes its practices into four key groups: Prepare the Organization (PO), Protect the Software (PS), Produce Well-Secured Software (PW), and Respond to Vulnerabilities (RV).
Each of these groups includes defined practices and activities that align with securing software throughout the SDLC, from design and development through deployment and sustainment.
These wide-ranging practices include defining security requirements, implementing security toolchains, preventing tampering, identifying vulnerabilities in code, and having vulnerability disclosure and remediation processes in place.
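To make that structure a bit more concrete, here is a rough illustrative sketch in Python of how those practice groups might be mapped to the kinds of checks mentioned above. The group names come from NIST 800-218 itself, but the example activities are my own paraphrased placeholders, not an official mapping.

```python
# Illustrative only: a rough mapping of NIST SSDF (800-218) practice groups
# to example SDLC activities. The group names come from the SSDF; the example
# activities are paraphrased placeholders, not an official mapping.
ssdf_practice_groups = {
    "PO - Prepare the Organization": [
        "Define security requirements for software development",
        "Implement supporting security toolchains (SAST, SCA, signing)",
    ],
    "PS - Protect the Software": [
        "Protect code and build artifacts from unauthorized access and tampering",
        "Archive and protect each software release",
    ],
    "PW - Produce Well-Secured Software": [
        "Review and analyze human-readable code",
        "Test executable code to identify vulnerabilities",
    ],
    "RV - Respond to Vulnerabilities": [
        "Identify and confirm vulnerabilities on an ongoing basis",
        "Operate a vulnerability disclosure and remediation process",
    ],
}

if __name__ == "__main__":
    for group, examples in ssdf_practice_groups.items():
        print(group)
        for example in examples:
            print(f"  - {example}")
```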
Software Bill of Materials (SBOM)
Third on the list of questions indicating where SWFT is headed, and coinciding with Ms. Arrington's public comments, are SBOMs.
For those not familiar with SBOMs, they are far from a new concept. They primarily saw evangelism from organizations such as NTIA, CISA, and, to some extent, the DoD. SBOMs are often compared to a “list of ingredients” or “food label” for software, showing what individual components and libraries actually make up an application, software, or product.
SBOMs saw a lot of initial momentum after SolarWinds/Log4j and coming out of the Cyber EO I discussed above. They were part of initial conversations related to CISA’s Secure Software Attestation requirements, and even a previous year’s draft of the National Defense Authorization Act (NDAA), which is more DoD-specific.
Ironically, the SBOM requirement was subsequently stripped from both and drew many industry criticisms around its real value, along with challenges around SBOM data completeness, quality, disparate outputs from existing SBOM tools, and producing SBOMs for complex systems/software such as Software-as-a-Service (SaaS).
For those unfamiliar with SBOMs, there are primarily two SBOM formats: Software Package Data Exchange (SPDX) and CycloneDX, championed by the Linux Foundation and OWASP, respectively. Each format has its own nuances, and to some extent the slow adoption of SBOMs can be tied to the competing standards.
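To make the "ingredients list" idea tangible, below is a minimal, hypothetical CycloneDX-style SBOM expressed as a Python dictionary. The component names and versions are made up, and real SBOMs produced by actual tooling carry far more metadata (hashes, licenses, suppliers, dependency relationships, and so on).

```python
# A minimal, hypothetical CycloneDX-style SBOM, purely for illustration.
# The component names/versions are made up; real SBOMs generated by actual
# tooling include much more metadata (hashes, licenses, suppliers,
# dependency relationships, and so on).
sbom = {
    "bomFormat": "CycloneDX",
    "specVersion": "1.5",
    "version": 1,
    "components": [
        {
            "type": "library",
            "name": "requests",
            "version": "2.31.0",
            "purl": "pkg:pypi/requests@2.31.0",
        },
        {
            "type": "library",
            "name": "log4j-core",
            "version": "2.17.1",
            "purl": "pkg:maven/org.apache.logging.log4j/log4j-core@2.17.1",
        },
    ],
}

# Print the "ingredients list" for the application.
for component in sbom["components"]:
    print(f'{component["name"]} {component["version"]} ({component["purl"]})')
```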
I have written extensively about SBOMs, their formats, how they can be operationalized, and more, in articles such as:
Understanding OWASP’s SBOM Maturity Model: Not all SBOMs are Created Equal
Vulnerability Exploitability eXchange Explained: How VEX Can Make SBOMs Actionable
Nonetheless, despite lackluster industry adoption and concerns around their completeness, quality, and operationalization for risk management and procurement, the DoD seems set on pivoting to SBOMs in place of the existing ATO process.
One irony is that the initial complaint about the ATO process is that it is "static" while the risks are dynamic. The same can be said of SBOMs: they change with every new software release, as dependencies and components change and new code is written.
How the DoD avoids SWFT’s use of SBOMs also being static and snapshot-in-time remains to be seen. To prevent this, the DoD will need a new SBOM every time there is a change to a product being used, across all DoD software suppliers following the SWFT process. That is arguably tens of thousands of products, with even more versions likely existing across the complex, diverse DoD networks and systems. This will be a challenge even if/when the DoD settles on one or more SBOM management tools/vendors to help it consume and utilize SBOMs as part of SWFT.
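To illustrate why an SBOM is itself a snapshot in time, here is a small hypothetical sketch that diffs the component lists of two releases of the same product. The components are invented, and real SBOM drift tooling would also have to handle transitive dependencies, version ranges, and format differences.

```python
# Hypothetical sketch: diff the component lists of two SBOMs from successive
# releases of the same product. Component names/versions are made up.
old_release = {("requests", "2.28.0"), ("urllib3", "1.26.12"), ("pyyaml", "6.0")}
new_release = {("requests", "2.31.0"), ("urllib3", "2.0.4"), ("cryptography", "41.0.3")}

added = new_release - old_release
removed = old_release - new_release

print("Components added or upgraded in the new release:")
for name, version in sorted(added):
    print(f"  + {name} {version}")

print("Components removed or replaced since the last release:")
for name, version in sorted(removed):
    print(f"  - {name} {version}")
```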
Additionally, another challenge with SBOMs is that there are still many unanswered questions when it comes to SaaS, where software is deployed on top of complex cloud environments and underlying software and infrastructure. Where does the software vendor's SBOM begin, and where does the CSP, platform provider, etc.'s SBOM end?
Walter Haydock and I tried to make the case for a SaaSBOM several years ago. I know some, such as OWASP, have worked hard on this, but last I knew it remained an open question. Our article from four years ago, “The Case for a SaaS Bill of Materials,” can be found here.
Aside from the components involved, the DoD will need to (or should) understand the vulnerabilities associated with those components, which involves more than just CVEs and extends to other vulnerability databases and identifiers such as GitHub Security Advisories, OSV, and more, especially in the face of the recent struggles with NVD and the CVE program.
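As a rough sketch of what that enrichment could look like, the snippet below queries the public OSV API for known vulnerabilities affecting a single SBOM component. The package and version are arbitrary examples, and a real pipeline would iterate over every component in an SBOM and merge results from multiple sources (OSV, GHSA, NVD, vendor advisories, and so on).

```python
import json
import urllib.request

# Rough sketch: query the public OSV API for known vulnerabilities affecting
# a single component pulled from an SBOM. The package/version here are
# arbitrary examples; a real pipeline would loop over every SBOM component
# and merge results from multiple vulnerability data sources.
query = {
    "package": {"name": "requests", "ecosystem": "PyPI"},
    "version": "2.25.0",
}

request = urllib.request.Request(
    "https://api.osv.dev/v1/query",
    data=json.dumps(query).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

with urllib.request.urlopen(request) as response:
    results = json.loads(response.read())

for vuln in results.get("vulns", []):
    aliases = ", ".join(vuln.get("aliases", []))
    print(f'{vuln["id"]} ({aliases}): {vuln.get("summary", "no summary")}')
```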
Furthermore, the DoD will need to understand whether vulnerabilities have known exploitation, are likely to be exploited, have compensating controls, and are reachable, both at source and runtime, to optimize vulnerability management efforts and avoid a TON of toil and frustration for both the DoD and the software vendor community. These are problems that extend well beyond SBOMs to vulnerability management more broadly, and to the entire ecosystem, not just the DoD.
It also remains to be seen what level of risk the DoD will deem acceptable: how many vulnerabilities, what severities, what remediation timelines, and so on, all tied to its risk tolerance. That tolerance will likely exist on a spectrum related to the intended use case of the software and the clearance level, impact level, and types of environments the software will be utilized in.
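To make the risk tolerance point concrete, here is a hypothetical policy-as-code sketch that fails a submission when findings exceed invented severity and remediation-timeline thresholds. Every number and finding below is a placeholder; whatever thresholds SWFT ultimately adopts, if any, are unknown and would likely vary by impact level and environment.

```python
from datetime import date

# Hypothetical policy-as-code sketch. Every threshold and finding below is an
# invented placeholder; the actual risk tolerances SWFT will adopt (if any)
# are unknown and would likely vary by impact level and intended environment.
MAX_AGE_DAYS = {"CRITICAL": 0, "HIGH": 30, "MEDIUM": 90}

findings = [
    {"id": "EXAMPLE-0001", "severity": "HIGH", "disclosed": date(2024, 1, 15)},
    {"id": "EXAMPLE-0002", "severity": "MEDIUM", "disclosed": date(2023, 11, 2)},
]

def violations(findings, as_of):
    """Return findings that exceed the (placeholder) remediation timelines."""
    out = []
    for finding in findings:
        limit = MAX_AGE_DAYS.get(finding["severity"])
        if limit is not None and (as_of - finding["disclosed"]).days > limit:
            out.append(finding)
    return out

failed = violations(findings, as_of=date.today())
print("PASS" if not failed else f"FAIL: {[f['id'] for f in failed]}")
```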
It's safe to say that a lot remains to be seen in how the DoD implements this SBOM requirement, and how effectively it does so, if it is to avoid becoming another performative, checkbox exercise.
Risk Assessments, Artifacts, and Automation
Fourth on the list are questions around risk assessments, artifacts, and automation. This is a broad area and can involve previously mentioned publications, such as the SSDF, as well as artifacts such as SBOMs.
The DoD seems to be looking to better understand how the industry conducts software risk assessments, what artifacts are consumed and produced in that process, and what level of automation is and can be used.
Artifact and Information Sharing
Fifth on the list of questions, the DoD asks whether these artifacts could be provided to the DoD for DoD-led risk assessments, and if not, what artifacts the industry recommends the DoD use instead.
This section is focused on equipping the DoD to know what artifacts to request from software vendors, in what format, how those artifacts can and should be exchanged, and what methods may be involved.
This section may seem trivial, but it is anything but for various reasons.
Artifacts such as risk assessments, penetration test reports, and SBOMs often contain sensitive data about vulnerabilities, weaknesses, and flaws in the vendor's products. That data could be exploited to impact the DoD and any other organization using the vendor's products and software, representing a key part of the software supply chain.
For the SWFT process to be effective, efficient, and “fast,” as the name implies, processes and tooling must be used to enable timely sharing of the artifacts and the ability to ingest and assess the data they communicate. Otherwise, the process will suffer from the same bottlenecks and challenges of historical ATO processes.
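As a purely hypothetical sketch of what machine-readable artifact exchange might look like, the snippet below bundles an SBOM and a self-attestation into a single submission payload with integrity hashes. The payload structure and field names are invented for illustration; no actual SWFT intake API or schema has been published.

```python
import hashlib
import json

# Purely hypothetical sketch of packaging artifacts for automated exchange.
# The payload structure, field names, and any intake endpoint are invented
# for illustration; no actual SWFT submission API or schema has been published.
def fingerprint(artifact: dict) -> str:
    """Hash an artifact so the receiver can verify it arrived unmodified."""
    return hashlib.sha256(json.dumps(artifact, sort_keys=True).encode()).hexdigest()

def build_submission(vendor: str, product: str, version: str,
                     sbom: dict, attestation: dict) -> dict:
    """Bundle an SBOM and a self-attestation into one machine-readable payload."""
    return {
        "vendor": vendor,
        "product": product,
        "version": version,
        "artifacts": [
            {"type": "sbom", "sha256": fingerprint(sbom), "content": sbom},
            {"type": "attestation", "sha256": fingerprint(attestation), "content": attestation},
        ],
    }

# Example usage with made-up placeholder content:
payload = build_submission(
    "ExampleCo", "ExampleApp", "1.4.2",
    sbom={"bomFormat": "CycloneDX", "components": []},
    attestation={"framework": "NIST 800-218", "attested": True},
)
print(json.dumps(payload, indent=2))
```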
Automated Accelerated Software Verification
In the same vein, the sixth and final question focuses on how organizations (and ideally tools/products) can support the automated and accelerated exchange of information in support of the DoD’s SWFT program, which is aiming for an accelerated software verification process.
The AI Angle
Another aspect of the SWFT process that isn’t totally clear yet, but will certainly play a part, is the use of AI. This is primarily seen in the RFI titled “SWFT Automation and Artificial Intelligence.” Among Ms. Arrington's public comments are those around using AI/LLMs to evaluate vendors' artifacts and submissions, see below:
What this fully looks like remains to be seen, and it will be interesting to see whether the use of LLMs is tied to reviewing the submissions and attestations, SBOMs, source code, etc., but the emphasis on AI aligns with the overall goal of moving faster.
Below are the four questions the AI RFI focuses on:
So the key here is accelerating SWFT assessments with AI, while addressing potential challenges with automation and the data required, such as information about the vendor, their SBOM(s), the DoD buyer(s) and use cases, and other third-party sources.
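Since nothing concrete has been published about how AI will actually be used here, the sketch below is entirely speculative. It simply assembles the kinds of inputs the RFI hints at (vendor info, an SBOM summary, the artifacts provided) into a review prompt for some unspecified LLM, with the review_with_llm function standing in for whatever model or service would actually be used.

```python
# Entirely speculative sketch of using an LLM to triage a vendor submission.
# The inputs, prompt, and review_with_llm stand-in are all hypothetical;
# nothing has been published about how SWFT will actually use AI.
def build_review_prompt(vendor: dict, sbom_summary: str, artifacts) -> str:
    """Assemble a review prompt from the submission's key inputs."""
    return (
        "You are assisting a software risk assessment.\n"
        f"Vendor: {vendor['name']} (attests alignment with: {vendor['attestation']})\n"
        f"SBOM summary: {sbom_summary}\n"
        f"Artifacts provided: {', '.join(artifacts)}\n"
        "Identify gaps, inconsistencies, and follow-up questions for the assessor."
    )

def review_with_llm(prompt: str) -> str:
    # Placeholder: in practice this would call whatever model or service the
    # DoD selects. Here we just echo the prompt back for illustration.
    return f"[model output would appear here]\n---\n{prompt}"

prompt = build_review_prompt(
    vendor={"name": "ExampleCo", "attestation": "NIST 800-218 (self-attested)"},
    sbom_summary="142 components, 3 with known unremediated vulnerabilities",
    artifacts=["SBOM (CycloneDX)", "pen test report", "secure development attestation"],
)
print(review_with_llm(prompt))
```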
Closing Thoughts
We’re in the very early stages of the DoD’s overhauled software security verification process, SWFT. That said, it clearly signals the DoD’s intention to fundamentally change the way it authorizes software products for use with DoD systems and data, and to pivot away from the existing control-heavy ATO process.
There is A LOT that can be debated about this so far, and many will validly argue that the ATO process itself isn’t broken, but how it is executed is: manual, inefficient, failing to take advantage of automation, reliant on snapshot-in-time assessments, and all of the other issues I mentioned above. But that would be an entire piece of its own.
Either way, the writing is on the wall for the future direction of DoD software verification.
This also aligns with other related organizations undergoing massive transformations, such as FedRAMP and its pivot to “FedRAMP 20x,” which emphasizes automation, continuous assessment, and so on.
Among the questions I have is whether the DoD will also conduct third-party assessments of vendors' NIST 800-218 compliance/alignment, or merely accept self-attestations. That would be interesting, given Ms. Arrington and the DoD have also moved toward using 3PAOs for CMMC/800-171 compliance, after learning the hard way that self-attestations have... problems, to put it lightly. It is a double-edged sword: third-party attestations against another framework such as the SSDF would take time and not expedite cyber and procurement activities, but self-attestations aren’t always to be trusted either.
One big open question remains, though: if the RMF ATO process for DoD-procured software needs to be blown up due to inefficiencies and ineffectiveness, what about the use of RMF ATOs for actual DoD systems and government off-the-shelf (GOTS) software produced by DoD/Federal contractors and system integrators?
If I had to bet, it would be headed in a similar direction. Otherwise, we would have a massive contradiction: the existing process is used internally by the DoD for its systems and software, but a leaner, more product-centric process is used for externally procured software.
We will see if I’m right and where the future of SWFT goes.
Debates aside, one thing is irrefutable:
The existing processes and methodologies do not move at the current pace of software development or delivery and are arguably creating MORE risk for DoD systems while keeping mission owners, warfighters, and citizens alike using antiquated legacy software and products rather than commercially leading offerings.
All at a time when we’re struggling to outpace our nearest competitor in China.
I can assure you that Chinese nation-state attackers aren’t concerned with your System Security Plan (SSP) or if you’ve checked all the boxes for your Authorizing Official (AO).
I'm curious to see how Infrastructure as Code fits into this mix. Is that software, or does it stay stuck in traditional processes as "virtual hardware"?