CISA - Secure Software Development Attestation Final Form
A look at the final version of CISA's Secure Software Development Attestation Form, which will be required for companies to sell software to the Federal government.
If you’re interested in Vulnerability Management, you can check out my upcoming book “Effective Vulnerability Management: Managing Risk in the Vulnerable Digital Ecosystem” on Amazon. It is focused on infusing efficiency into risk mitigation practices by optimizing resource use with the latest best practices in vulnerability management.
After much industry buzz and speculation, the Cybersecurity and Infrastructure Security Agency (CISA) released the final version of its Secure Software Development Attestation Form.
For those who haven’t been following along, CISA previously issued versions of the form for public comment and feedback. The form is part of a broader effort derived from Cybersecurity EO 14028 (formally titled “Improving the Nation’s Cybersecurity”), which dedicated Section 4 to software supply chain security in the fallout of incidents such as SolarWinds, which impacted multiple Federal agencies as well as commercial industry.
As a result of the EO, the Office of Management and Budget issued two memorandums, M-22-18 “Enhancing the Security of the Software Supply Chain through Secure Software Development Practices” and M-23-16 “Update to Memorandum M-22-18”.
I have covered both 22-18 and 23-16 in depth in previous articles, but the summary is that they require Federal agencies to obtain self-attestation forms from their software suppliers. Suppliers must attest to complying with Government-specified secure software development practices drawn from NIST’s Secure Software Development Framework (SSDF).
If you’re unfamiliar with the NIST SSDF, I covered it in depth in a previous article as well, but in summary, it draws from existing industry frameworks focused on secure software development, such as OWASP SAMM and BSIMM. For those unfamiliar with SAMM and BSIMM, you can read my article where I break them down here.
That said, if you’re a Federal agency or procurement official, you need to be familiar with the CISA Self-Attestation Form and its requirements, and the same goes if you’re a software supplier. So let’s dig in and take a look, including some items that have changed from the previous draft or have potentially been overlooked.
What software does it apply to?
First off, let’s clarify what software the form does and doesn’t apply to.
The form states that it applies to software that meets any of the below conditions:
The software was developed after September 14th, 2022
The software was developed prior to September 14th, 2022, but was modified by a major version change (e.g. using a semantic versioning schema of Major.Minor.Patch, the software version number goes from 2.5 to 3.0) after September 14th, 2022 (see the sketch after this list)
The producer delivers continuous changes to software code (as is the case for SaaS products or other products using CI/CD)
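For illustration, here is a minimal Python sketch of how the “major version change” trigger could be checked under a Major.Minor.Patch scheme; the version strings are hypothetical examples, not anything prescribed by the form:

```python
# Minimal sketch: flag whether a release is a "major version change"
# under a Major.Minor.Patch (semantic versioning) scheme.
# The version strings below are hypothetical examples.

def is_major_change(old_version: str, new_version: str) -> bool:
    """Return True if the major component incremented (e.g., 2.5.0 -> 3.0.0)."""
    old_major = int(old_version.split(".")[0])
    new_major = int(new_version.split(".")[0])
    return new_major > old_major

print(is_major_change("2.5.0", "3.0.0"))  # True  -> back in scope for attestation
print(is_major_change("2.5.0", "2.6.0"))  # False -> minor update, not a trigger
```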
So we can see right away that it will apply to any products that are actively maintained, given September 2022 is already well behind us. Additionally, we can anticipate some gaming by software vendors around what counts as a “major version change” to try and skirt the requirements and minimize the compliance burden they face for each new version of software (is everything a patch or minor update now?)
It is great to see the Government not let SaaS off the hook with this specific mention, given the rise of software supply chain attacks impacting SaaS companies ranging from some of the largest, most well-resourced companies on the planet (e.g. Microsoft, Okta) to small SaaS-based startups.
The Federal government of course has FedRAMP for the security and compliance of SaaS-based software, but a recent GAO report highlights that agencies are using SaaS that isn’t FedRAMP authorized. This isn’t surprising, given FedRAMP has only ~300 approved SaaS offerings in its marketplace after over a decade, in a market of tens of thousands of SaaS suppliers.
Additionally, the FedRAMP security control baselines are not a 1:1 match against SSDF, which focuses on secure software development practices, but that’s another conversation.
Former DoD Chief Software Officer Jason Weiss did a great analysis comparing the two (FedRAMP and SSDF).
We also know that most modern software development environments utilize agile and DevSecOps methodologies, including CI/CD platforms, to iteratively deliver software, so it is good to see the CISA form account for this as well.
What doesn’t it apply to?
We covered what software the attestation form does apply to, but what doesn’t it apply to?
Per the form, the following categories are not in scope based on 22-18 and 23-16:
Software developed by Federal agencies
Open-source software that is freely and directly obtained by a Federal agency
Third-party open source and proprietary components that are incorporated into the software end product used by the agency
Software that is freely obtained and publicly available
There are some interesting aspects to break down here in terms of what isn’t covered by the form’s requirements and the implications of that.
For example, software developed by Federal agencies is, on the surface, a logical exclusion, given the OMB memos are focused on software suppliers to the Federal government. That said, as mentioned above, NIST RMF and 800-53, which are used for compliance of Federal IT systems and software, aren’t a direct match to the NIST SSDF, which focuses on secure software development. The Federal government is no stranger to having its own systems and software compromised (*OPM has entered the chat*).
This means Federally developed software has a gap that needs to be addressed to ensure alignment with secure software development practices. Also, most software isn’t developed by the Federal government directly, but through its massive government contracting ecosystem, to the tune of tens of billions of dollars a year. Does the exclusion cover contractor-developed software for the Federal government? Or does it only exclude software the Federal government developed strictly with Federal employees?
The second item, excluding open source software (OSS), is also logical, given OSS maintainers and contributors are NOT suppliers, which I have previously written about extensively.
That said, OSS still poses risk just like proprietary software, and not just in the form of CVEs, which are lagging indicators of risk, but also via other signals such as the origin of contributors, poor hygiene, lack of maintenance and more. For a deeper dive, please see the OSS Security Top 10 Risks and the article where I break down those risks in great detail.
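As a rough illustration of what looking beyond CVEs might involve, here is a minimal Python sketch that pulls a couple of maintenance-hygiene signals from the public GitHub API; the repository name is a placeholder, and a real assessment (e.g. OpenSSF Scorecard) weighs many more signals than these:

```python
# Minimal sketch: check a couple of non-CVE risk signals for an OSS
# dependency via the public GitHub API. The repo below is a placeholder.
from datetime import datetime, timezone

import requests

def maintenance_signals(owner: str, repo: str) -> dict:
    """Fetch last-push recency and contributor count as rough hygiene signals."""
    base = f"https://api.github.com/repos/{owner}/{repo}"
    meta = requests.get(base, timeout=10).json()
    contributors = requests.get(f"{base}/contributors?per_page=100", timeout=10).json()

    pushed_at = datetime.fromisoformat(meta["pushed_at"].replace("Z", "+00:00"))
    days_since_push = (datetime.now(timezone.utc) - pushed_at).days
    return {
        "days_since_last_push": days_since_push,  # staleness ~ lack of maintenance
        "contributor_count": len(contributors),   # rough bus-factor proxy
        "archived": meta.get("archived", False),  # explicitly unmaintained
    }

print(maintenance_signals("example-org", "example-repo"))  # placeholder repo
```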
Additionally, as mentioned above, Federally developed software is excluded, as is, potentially, software the Government had developed by Federal contractors. If so, this is also concerning, given studies show that modern codebases are composed of 70-80% OSS components and that over 90% of modern codebases contain at least some OSS components.
This includes critical infrastructure, and for a great breakdown of that topic, see security researcher Chinmayi Sharma’s “How Digital Infrastructure is Built on a House of Cards”.
That said, there do seem to be some controls that apply to third-party components included in suppliers’ products. The form mentions that “software producers who utilize third party components in their software are required to attest that they have taken specific steps to minimize the risks of relying on such components in their products”. We will discuss those steps in a later section of this article.
Lastly, there is the mention of “software that is freely obtained and publicly available”. This of course refers to Free and Open Source Software (FOSS), and it is another interesting exclusion. Again, OSS maintainers and contributors are not suppliers in the typical sense, so it makes sense for the CISA attestation requirements not to apply to them.
That said, many modern software product companies are pursuing a product-led growth strategy, which includes making versions of their products freely available and open source, to get customers engaged with their products and ideally move on to paid-tier versions of the product.
This means these free and open source versions of products are not within scope for the secure development requirements, despite still posing risk to the agency, especially if production data is put into them or they are integrated into the broader enterprise architecture, which often does occur with pilots and proof-of-concepts/value (PoC/V).
How is it submitted and by whom?
One early industry critique of the form was that the only apparent submission option was a static PDF document.
That has now changed: there will be a URL, “https://softwaresecurity.cisa.gov”, that software suppliers can use. Presumably, Federal agencies will have access to the site for products they’re using, and may also be able to access forms for suppliers that other agencies use, minimizing the churn and toil of software suppliers needing to provide these forms to agencies individually, n times over.
The local PDF is still an option as well; it can be filled out by the software producer and submitted to the agency via the email address of the agency’s provided POC.
For those who prefer Markdown, as many Developers do, supply chain security startup Chainguard published the requirements in Markdown format here.
One key deviation from the previous version to call out is that the form can be signed not just by the CEO but, alternatively, by someone they designate. It states:
“By signing, that individual attests that the software in question is developed in conformity with the secure software development practices delineated in this form”.
Some have been upset that the form allows the CEO to designate someone else to sign, setting up a situation where that person may become a scapegoat should the attestation information prove incorrect, a product be compromised, an agency be impacted and so on.
It is of course peculiar, given CISA themselves, in their Secure-by-Design guidance and public commentary, have called for CEO and/or C-Suite involvement in the security of their products. Some have speculated that the change is a result of lobbying efforts; of course, we will never know.
Additionally, much like the prior draft, the form allows software producers to demonstrate their alignment with the requirements by using a Third Party Assessor Organization (3PAO). The 3PAO must be FedRAMP certified/approved and has to use the NIST guidance that covers the elements of the form (e.g. the SSDF) as part of its assessment.
It also specifically states that agencies will properly safeguard assessments and not post them publicly. This makes sense, given they may highlight deficiencies in a software producer’s security posture and potential attack vectors.
What’s required?
Now that we have discussed what software is in scope of the attestation requirements, let’s take a look at the actual requirements called out in the form.
The form allows suppliers to submit either:
New Attestations
Attestations Following Extension or Waiver
Revised Attestations
The attestations can be:
Company-wide
Individual products
Multiple Products or specific Product Versions
Something worth calling out is a footnote on the duration of attestations. While I’m not a lawyer, I interpret it as the form and attestations applying to all future versions of a product unless the supplier specifically informs the Government otherwise. This is key for suppliers to keep in mind: any future versions of the software will be bound by previously made attestations.
The form goes on to have some data and signature blocks and lays out specific required practices derived from SSDF. They include:
The software is developed and built in secure environments. Those environments are secured by the following actions, at a minimum:
Separating and protecting each environment involved in developing and building software
Regularly logging, monitoring and auditing trust relationships used for authorization and access:
To any software development and build environments
Among components within each environment
Enforcing multi-factor authentication and conditional access across the environments relevant to developing and building software in a manner that minimizes security risk
Taking consistent and reasonable steps to document, as well as minimize use or inclusion of software products that create undue risk within the environments used to develop and build software
Encrypting sensitive data, such as credentials, to the extent practicable and based on risk
Implementing defensive cybersecurity practices, including continuous monitoring of operations and alerts and, as necessary, responding to suspected and confirmed cyber incidents
The software producer makes a good-faith effort to maintain trusted source code supply chains by employing automated tools or comparable processes to address the security of internal code and third-party components and manage related vulnerabilities
The software producer maintains provenance for internal code and third-party components incorporated into the software to the greatest extent feasible
The software producer employs automated tools or comparable processes that check for security vulnerabilities. In addition:
The software producer operates these processes on an ongoing basis and prior to product, version, or update releases;
The software producer has a policy or process to address discovered security vulnerabilities prior to product release; and
The software producer operates a vulnerability disclosure program and accepts, reviews, and addresses disclosed software vulnerabilities in a timely fashion and according to any timelines specified in the vulnerability disclosure program or applicable policies.
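To make the “automated tools or comparable processes” requirement above concrete, here is a minimal sketch of a CI release gate in Python; the scan_results.json file and its schema are hypothetical stand-ins for the output of whatever SAST/SCA tooling a producer actually runs:

```python
# Minimal sketch: fail a release when a vulnerability scan reports
# unaddressed high-severity findings. The input file and its schema
# are hypothetical stand-ins for a real scanner's output.
import json
import sys

BLOCKING_SEVERITIES = {"CRITICAL", "HIGH"}

def release_is_blocked(results_path: str) -> bool:
    """Return True if any high-severity finding is still unaddressed."""
    with open(results_path) as fh:
        findings = json.load(fh)  # assumed: list of {"id", "severity", "status"}
    unaddressed = [
        f for f in findings
        if f["severity"] in BLOCKING_SEVERITIES and f["status"] != "resolved"
    ]
    for finding in unaddressed:
        print(f"Blocking finding: {finding['id']} ({finding['severity']})")
    return bool(unaddressed)

if __name__ == "__main__":
    # Exit non-zero so the CI pipeline fails the release step.
    sys.exit(1 if release_is_blocked("scan_results.json") else 0)
```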
Many of these practices seem common and expected of any modern organization developing digital products, such as separating development and production environments, utilizing MFA and implementing logging and monitoring.
That said, there is a fair bit of ambiguous language at play that no doubt will be capitalized on by software suppliers, as well as complicating any potential legal liabilities.
For example, who determines, and what constitutes, “reasonable steps” when it comes to documenting and minimizing the use and inclusion of software products that create undue risk within environments?
What defines “to the extent practicable” when it comes to encrypting sensitive data?
What is a “good faith effort” when it comes to maintaining trusted source code supply chains and addressing the security of internal code and third-party components and their associated vulnerabilities?
Will this be as simple as using SAST and SCA tooling, and will it apply only to direct dependencies, or also to transitive ones? Will vulnerability remediation be driven by base CVSS scores or account for aspects such as known exploitation (e.g. KEV), exploitation probability (e.g. EPSS) and reachability/execution of the components in the code?
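For what a more risk-informed approach could look like, here is a minimal Python sketch that layers KEV membership, EPSS probability, and reachability on top of the base CVSS score; the thresholds and the second finding are illustrative assumptions, not an official scoring scheme:

```python
# Minimal sketch: prioritize vulnerabilities by real-world risk signals
# rather than base CVSS alone. Thresholds here are illustrative only.
from dataclasses import dataclass

@dataclass
class Finding:
    cve_id: str
    cvss: float      # base CVSS score
    in_kev: bool     # listed in CISA's Known Exploited Vulnerabilities catalog
    epss: float      # EPSS probability of exploitation (0.0-1.0)
    reachable: bool  # is the vulnerable code actually invoked?

def priority(f: Finding) -> str:
    if f.in_kev and f.reachable:
        return "fix-now"              # known exploited and reachable
    if f.epss >= 0.1 or (f.cvss >= 9.0 and f.reachable):
        return "fix-this-sprint"
    return "backlog"                  # severe on paper, low real-world risk

findings = [
    Finding("CVE-2021-44228", 10.0, True, 0.97, True),   # e.g., Log4Shell
    Finding("CVE-2023-0000", 9.8, False, 0.01, False),   # hypothetical, unreachable
]
for f in findings:
    print(f.cve_id, "->", priority(f))
```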
The form goes on to provide a mapping of the attestation requirements to the related Cyber EO subsection and related SSDF practices and tasks.
Critiques and Changes
Some have pointed out some fundamental changes in the form from the previous drafts as well as potential gaps, such as:
No mention of SBOMs (which is peculiar, given CISA is the agency championing the use of SBOMs)
No emphasis on the need for Threat Modeling, despite Threat Modeling being emphasized in other guidance such as Secure-by-Design from CISA
No mention of Memory Safe Programming Languages
No use of the phrase “Secure-by-Design”
The obvious pivot from requiring the signature of the CEO or COO to now the CEO or a “Designee”, opening the door to shirking the responsibility and accountability of the C-Suite
Now what?
With the publication of the final attestation form, the official deadlines laid out in OMB’s M-23-16 now begin. OMB officially approved the form on March 8th, 2024, which means the requirements take effect:
June 8th, 2024 for Critical Software (as defined by NIST; see their definition here)
September 8th, 2024 for all other software in scope for the form
So the clock officially starts for Federal agencies to begin collecting the forms from their software suppliers by the dates above, and for software suppliers selling to the Federal government to be prepared to attest to the requirements laid out in the form by those same dates.
So, I recommend getting to work.