Public Sector Compliance Conundrums
A look at how the Federal and Defense communities grapple with the tension among cybersecurity, innovation, and compliance (FedRAMP, ATOs, and self-attestations)
Welcome to Resilient Cyber!
If you’re interested in FREE content around AppSec, DevSecOps, Software Supply Chain and more, be sure to hit the “Subscribe” button below.
Join 5,000+ other readers ensuring a more secure digital ecosystem.
If you’re interested in Vulnerability Management, you can check out my book “Effective Vulnerability Management: Managing Risk in the Vulnerable Digital Ecosystem” on Amazon. It is focused on infusing efficiency into risk mitigation practices by optimizing resource use with the latest best practices in vulnerability management.
If you’ve been in or around the public sector technology space for any length of time, you’ve inevitably seen the industry wrestle with competing priorities: ushering in digital modernization and innovation while contending with growing complexity around cybersecurity and compliance.
We see it manifesting in several areas, such as:
The Government’s consumption of cloud service offerings (CSOs) from cloud service providers (CSPs)
Internally developed and maintained IT systems managed under the Risk Management Framework (RMF) that receive an Authority to Operate (ATO)
Externally procured software in the Government’s immense software supply chain, governed by the Cybersecurity Executive Order (EO), Office of Management and Budget (OMB) memos, and self-attestation forms around secure software development.
In this article we will discuss several of these concurrent efforts and pain points, and try to unpack the complex intersection between the Government’s desire to digitally modernize, using software to transform how Federal Civilian and Defense entities operate, and the constantly evolving and complex cybersecurity threat landscape it must navigate along the way.
To set the tone, we will quote a couple of key documents that emphasize the importance of software across both the Federal and Defense communities.
DoD Software Modernization Strategy (February 2022):
“The Department’s adaptability increasingly relies on software and the ability to securely and rapidly deliver resilient software capability is a competitive advantage that will define future conflicts. Transforming software delivery times from years to minutes will require significant change to our processes, policies, workforce and technology.”
Executive Order on Improving the Nation’s Cybersecurity (Cyber EO 14028 May 2021):
“In the end, the trust we place in our digital infrastructure should be proportional to how trustworthy and transparent that infrastructure is, and to the consequences we will incur if that trust is misplaced”
Here we see the inherent tension between speed on one hand and governance, security, and resilience on the other.
It is a conflict the public sector continues to grapple with as it seeks to usher in innovative technologies and capabilities from the private sector and accelerate internal development and deployment timelines, all while meeting existing and emerging compliance and security requirements - and therein lies the challenge.
So let’s take a look.
FedRAMP
First up on the list is the Federal Risk and Authorization Management Program (FedRAMP), which aims to “provide a standardized approach to security authorizations” for cloud service offerings (CSOs).
For those unfamiliar, FedRAMP is the program the Federal government uses to securely adopt cloud services and offerings, and it is oriented around NIST SP 800-53 and its associated security control baselines (Low, Moderate, and High).
Cloud Service Providers (CSPs) undergo an assessment and authorization process, receiving either a Provisional Authority to Operate (P-ATO) from the FedRAMP Program Management Office (PMO) or an agency-issued ATO.
FedRAMP itself is a complex program with various stakeholders, such as CSPs, Third Party Assessment Organizations (3PAOs), advisors/consultants, the FedRAMP PMO, and the agencies using and authorizing the cloud service offerings.
FedRAMP originated from a 2011 memo from the Federal CIO titled “Security Authorizations of Information Systems in Cloud Computing Environments”. Much has changed since that 12-year-old memo established FedRAMP, and the more recent “Modernizing FedRAMP” memo serves as a conduit to help the program align with the evolution of the industry.
Additionally, in 2022 we saw Congress pass the FedRAMP Authorization Act, codifying FedRAMP in law along with specifying requirements for the program and its operations. I previously covered the act in an article titled “Dissecting the FedRAMP Authorization Act”, which can be found here.
Federal documents such as the FedRAMP modernization memo state:
“The purpose of the FedRAMP program is to increase Federal agencies’ adoption of and secure use of the commercial cloud, while focusing cloud service providers and agencies on the highest value work and eliminating redundant effort.”
The unfortunate reality, though, is that despite the program existing for over a decade, and in a market of tens of thousands of CSOs, the FedRAMP Marketplace at the time of this writing lists a paltry 337 offerings.
There’s also the reality that, due to the cumbersome and complex nature of the FedRAMP process (and the DoD Security Requirements Guide (SRG) on the DoD front), it isn’t uncommon for organizations to seek workarounds that let them leverage cloud services without going through FedRAMP, leaving a slew of shadow SaaS and cloud usage flying under the radar of the very program created to govern it.
The government is aware of this challenge, which is why the FedRAMP Modernization Memo states:
“The FedRAMP marketplace must scale dramatically to enable Federal agencies to work with many thousands of different cloud-based services that can accelerate key agency operations while allowing agencies to directly manage a smaller IT footprint.”
FedRAMP has long been cited as too cumbersome and costly for SMB CSPs, and many have bemoaned that despite the good intentions of the program, it is actually hindering the Government’s adoption of innovative commercial cloud.
Because of this, we have begun to see organizations establish Platform-as-a-Service (PaaS) offerings, where they get the platform FedRAMP’d and then onboard new commercial software companies, letting them inherit the platform’s FedRAMP authorization rather than pursuing their own, saving time and money.
While this has created commercial business opportunities, it’s obviously an unintended side effect of a process that has been cost- and resource-prohibitive for most commercial CSPs, keeping them out of reach of Federal and Defense organizations looking to use cloud as part of their technology efforts.
The Government is aware of these challenges, which is why we have seen things such as the FedRAMP Authorization Act, the FedRAMP modernization memo, and the FedRAMP Emerging Technologies Prioritization Framework, as well as the formation of the Federal Secure Cloud Advisory Committee (FSCAC), all aimed at making the program faster, more scalable, and more accessible.
The FSCAC, which met this month (May 2024), emphasized in its meeting that it is “aiming to reduce barriers to entry, and speed certifications”.
This also doesn’t account for the fact that the DoD has its own process, oriented around the DoD Security Requirements Guide and involving DISA, which adds additional controls and control enhancements to existing FedRAMP authorizations so that CSOs can be used for DoD data at Impact Levels 2, 4, 5, and 6.
Not only do these cumbersome processes lead to shadow SaaS consumption across the entire Federal/Defense landscape, they also lead software companies to revert to self-hosted and on-premises models to avoid the expense and toil of FedRAMP and the SRG - ironically causing both sets of requirements to impede the very thing they set out to facilitate: the Government’s adoption of secure cloud service offerings.
Authority-to-Operate (ATO)
Next up on the list of challenges the Federal and Defense community continues to face, at the intersection of accelerating digital modernization and maintaining cybersecurity, is the ATO process.
For those unfamiliar, ATO as defined by NIST is:
“The official management decision given by a senior Federal official or officials to authorize operation of an information system and to explicitly accept the risk to agency operations, agency assets, individuals, other organizations and the Nation based on the implementation of an agreed-upon set of security and privacy controls”
That’s a long way of saying that an ATO is the decision by a Government Authorizing Official (AO) to allow a system and/or software to go into production.
Much like FedRAMP, the ATO is routinely pointed to as a massive pain point in the Federal and Defense communities’ ability to meet the goals laid out in the software modernization strategy we opened this article with (e.g. speed, scale, and pace of relevance).
We’ve seen Federal agencies and programs look to tackle the ATO challenges in a variety of ways.
Software Factories
While the definition of a “Software Factory” will vary, here is a great blog with a definition I can live with:
“A DoD Software Factory is a software development pipeline that embodies the principles and tools of the larger DevSecOps movement with a few choice modifications that conform to the extremely high threat profile of the DoD and Defense Industrial Base (DIB)”.
As pointed out in the blog, “establishing a software factory” is one of the plays laid out in the DoD DevSecOps Playbook.
For those not ingrained in DoD and Federal technology culture, the closest parallel is the emergence and growth of Platform Engineering in the commercial space, where programs build robust platforms, often consisting of cloud-native technologies, Kubernetes, containers, CI/CD pipelines, and open source software, all leveraging Agile and DevSecOps methodologies to iteratively deliver software at speed and scale within and beyond an organization.
There’s now a burgeoning Software Factory community in the DoD, with then-Deputy Chief Information Officer (CIO) for Information Enterprise Lily Zeleke quoted as saying there were nearly 50 across the Department at the end of 2023.
There’s definitely debate about what actually makes a DoD Software Factory, whether there should be so many, whether they should be consolidated around the most capable, and so on, but we will leave that debate aside for the purposes of this article.
The growth of SWFs has now led to a “Software Factory Coalition”, led by great folks such as Angelica Phaneuf, CISO of the Army Software Factory.
The group aims to enhance innovation efforts by sharing discoveries and conducting bi-monthly gatherings to build a vibrant community. It also aims to grow and nurture a software community in the Department by hosting quarterly conference events with results-oriented activities, as well as an annual summit, all of which can be found on their “SFC Events” page.
I’ve had the opportunity, with past companies and now my own digital services firm Aquia, to support some of the SWFs, such as Platform One and Kobayashi Maru; other notable SWFs include Kessel Run and BESPIN on the USAF side. Each SWF positions itself with a unique mission, purpose, and often an area of expertise.
These SWFs all play a part in accelerating both the Department’s adoption of technologies such as cloud, microservices, Kubernetes, containers, and CI/CD pipelines, and the speed at which it can develop and deliver software, making them core to the DoD’s Software Modernization Strategy cited earlier.
They also help tackle the compliance burden through things such as inherited security controls, streamlined security processes, hardened images/containers, and pipelines with robust security tooling and automation (a minimal sketch of such a pipeline gate follows below).
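To make the “pipelines with robust security tooling and automation” point concrete, here is a minimal sketch of a pipeline gate that scans a container image and fails the build on high-severity findings. The image name is hypothetical, and the scanner choice (Trivy) is simply one common option, not a mandated SWF standard:

```python
#!/usr/bin/env python3
"""Illustrative pipeline gate: scan a container image and fail the build
on high-severity findings. Tool choice and thresholds are assumptions
for this sketch, not a prescribed Software Factory standard."""
import json
import subprocess
import sys

IMAGE = "registry.example.mil/my-app:latest"  # hypothetical hardened image


def scan_image(image: str) -> dict:
    # Trivy's JSON output mode; trivy must be installed on the runner.
    proc = subprocess.run(
        ["trivy", "image", "--format", "json", "--quiet", image],
        capture_output=True, text=True, check=True,
    )
    return json.loads(proc.stdout)


def count_findings(report: dict, severities=frozenset({"HIGH", "CRITICAL"})) -> int:
    # Trivy reports a top-level "Results" list, each with "Vulnerabilities".
    count = 0
    for result in report.get("Results", []):
        for vuln in result.get("Vulnerabilities") or []:
            if vuln.get("Severity") in severities:
                count += 1
    return count


if __name__ == "__main__":
    findings = count_findings(scan_image(IMAGE))
    print(f"{findings} HIGH/CRITICAL findings in {IMAGE}")
    sys.exit(1 if findings else 0)  # non-zero exit fails the pipeline stage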
While this section has focused on the DoD SWF community, it is worth noting that Federal Civilian agencies have also established similar platforms, even if they don’t adopt the DoD’s industrial “factory” moniker.
Commercial Platform-as-a-Service (PaaS)
Similar to the internal Federal/Defense PaaS known as “Software Factories” are commercial PaaS offerings, which look to accelerate Federal/Defense mission owners’ access to commercial SaaS and software solutions.
As discussed throughout this article, several cumbersome compliance requirements (e.g. FedRAMP, the DoD SRG, CMMC, et al.) make it cost-prohibitive and time-consuming for commercial software companies to sell to the Federal government. The same requirements ironically mean that Federal mission owners can’t access the latest commercial software innovations, due to the lengthy timelines and costly requirements involved.
The most notable example of a commercial PaaS is Game Warden, offered by Second Front (2F). It’s dubbed a “DevSecOps platform that delivers SaaS to government at speed with built-in accreditation and security at its core.”
In a testament to the total addressable market and the product-market fit of 2F’s Game Warden, they just announced a $70M Series C, which included new investors Salesforce Ventures and Battery Ventures in addition to existing investors NEA, Moore Strategic Investors, and Artis.
This comes on the heels of 475% revenue growth and over 350% customer growth for the 2F team over the previous 18 months.
Commercial PaaS offerings such as Game Warden aim to alleviate the pain points of commercial SaaS companies looking to sell into the Federal and Defense markets, while also accelerating Federal mission owners’ access to innovative commercial software solutions, all while meeting the rigorous security and compliance requirements the public sector demands.
Continuous ATO/Ongoing Authorization
Another prominent concept and term in the discussion around accelerating software development, delivery and integration in the Federal/Defense community is that of Continuous ATO (cATO)/Ongoing Authorization.
Examples in popular Government and Government Contractor (GovCon) tech media include:
Legacy ATO process is slowing software upgrades at DoD, experts say (NextGov 2024)
Today’s battles happen at the pace of software. The Pentagon needs to hit the accelerator (DefenseOne 2024)
Can a New Information-Security Approach Save the Navy $1B a Year? (DefenseOne 2023)
In fact, the Defense Innovation Board (DIB) (not to be confused with the Defense Industrial Base (DIB)) even did a study titled “Lowering Barriers to Innovation”, where they cite cATO as a strategy to “ensure rapid software updates without the requirement for a new ATO”.
In the same report, they made a recommendation for the DoD CIO to employ a policy of direct reciprocity, allowing ATOs for products to be used across different DoD organizations. The recommendation focused on cloud-based SaaS products, but it is worth noting that the DoD CIO just published the “Cybersecurity Reciprocity Playbook” in May 2024, in an effort to get DoD organizations to “accept each other’s security assessments”.
Ironically enough, reciprocity is another longstanding NIST concept, one that has existed in RMF for quite some time, yet the DoD CIO had to literally release a playbook directing DoD organizations to do what they are already allowed to do. As stated by my friend Jacob Horne, “It’s become way too easy to write off behavioral tendencies as compliance”.
If you recall, Federal/DoD systems require an ATO to operate in production or with production data. While the term “cATO” hasn’t traditionally existed in NIST RMF terminology, it has taken hold in the community, and it builds on an existing NIST concept: Ongoing Authorization.
The term Ongoing Authorization has existed for over a decade in NIST terminology. The NIST Risk Management Framework (RMF) is documented in SP 800-37, “Risk Management Framework for Information Systems and Organizations”.
The latest 800-37 emphasizes Ongoing Authorization throughout the document, discussing the use of automation for RMF and a move toward real-time or near real-time risk-based decision processes for senior leaders. The goal is to enable near real-time, risk-based decision making through effective continuous monitoring (ConMon), reducing cost and driving efficiency.
Ongoing Authorization aims both to drive efficiencies in resources and risk management and to move away from the legacy three-year ATO cycle, where an initial authorization is granted, a subset of controls is assessed annually, and a full re-authorization occurs every 36 months.
That cadence is inefficient for several reasons: cybersecurity threats don’t operate on timescales of months or years, controls aren’t static in today’s dynamic and ephemeral cloud-native architectures, and software delivery now moves in minutes to hours. The push for Ongoing Authorization moves compliance toward that future, aligning it with DevSecOps.
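As an illustration of what automation-driven ConMon can look like in practice, here is a minimal sketch that polls AWS Config rule compliance so posture data could feed a near real-time, AO-facing view. It assumes AWS Config rules are already deployed and credentials are configured; it is a pattern sketch, not an accredited ConMon capability:

```python
"""Minimal ConMon sketch: poll AWS Config rule compliance so authorization
decisions can draw on near real-time posture rather than a triennial
snapshot. Assumes deployed AWS Config rules and configured boto3
credentials; illustrative only."""
import boto3

config = boto3.client("config")


def noncompliant_rules() -> list[str]:
    # DescribeComplianceByConfigRule is paginated; filter to failures only.
    rules = []
    paginator = config.get_paginator("describe_compliance_by_config_rule")
    for page in paginator.paginate(ComplianceTypes=["NON_COMPLIANT"]):
        for item in page["ComplianceByConfigRules"]:
            rules.append(item["ConfigRuleName"])
    return rules


if __name__ == "__main__":
    failing = noncompliant_rules()
    # In a real pipeline this would feed an AO-facing dashboard or alerting.
    print(f"{len(failing)} non-compliant rules: {failing}")
```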
In the same vein, the Defense community has begun to rally around the term Continuous ATO (cATO for short). Several of the aforementioned SWFs have achieved or are pursuing cATOs, and the DoD CIO has even released a “cATO Memo” in which they state:
“cATO represents a challenging but necessary enhancement of cyber risk approach in order to accelerate innovation while outpacing expanding cybersecurity threats.”
They also emphasize some core aspects required to achieve a cATO, such as:
ConMon - continuously monitoring and assessing all of the security controls within the information system’s security baseline, including common controls. They point out that all security controls need to feed into a system-level dashboard for real-time visibility to the AO(s) involved.
Active Cyber Defense - the ability to respond to cyber threats in real or near real time. This means going beyond patching and having the ability to deploy appropriate countermeasures, in real or near real time, to thwart cyber adversaries.
Secure Software Supply Chain - this points to the increased prevalence of software supply chain attacks such as SolarWinds, Microsoft, and Log4j, and the increased complexity of the modern software supply chain, which includes both commercial products and open source software components. The memo emphasizes SBOMs, approved software platforms, and approved development pipelines (a short SBOM inspection sketch follows below).
The memo stresses that to achieve a cATO, the system needs to embrace the DoD DevSecOps Strategy and align with the DevSecOps Reference Design, which can be found in the DoD CIO Library.
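On the SBOM point, here is a small sketch of the kind of automated check an SBOM makes possible, using CycloneDX JSON (one common SBOM format). The file name is hypothetical, and this is an illustration of the pattern rather than any memo-mandated tooling:

```python
"""Sketch: inspect a CycloneDX SBOM and flag components with no pinned
version. CycloneDX JSON keeps components under a top-level "components"
array; the SBOM file path here is hypothetical."""
import json

# Hypothetical SBOM produced at build time (e.g. by a pipeline stage).
with open("sbom.cyclonedx.json") as f:
    sbom = json.load(f)

for component in sbom.get("components", []):
    name = component.get("name", "<unnamed>")
    version = component.get("version")
    purl = component.get("purl", "")
    if not version:
        print(f"WARNING: {name} ({purl}) has no pinned version")
    else:
        print(f"{name}=={version}")
```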
If you’re looking for an in-depth history of the origins and evolution of cATO, you can find it in the “cATO Manifesto” from Rise8 and their founder Bryon Kroger, who helped coin the term in 2018 at the Kessel Run SWF. That said, as stressed in the manifesto itself, it built on momentum from leaders such as GSA 18F’s Accelerated ATO on cloud.gov, NGA’s ATO-in-a-day for their GEOINT Services Platform, and others.
It’s also worth pointing out that there are a lot of opinions on how cATO/OA should be done. The manifesto mentioned above even calls out “Shadow ATO”, where organizations seek the benefits of cATO, continuously deploying software to production, without doing the actual work laid out in the DoD CIO’s cATO memo.
I can attest to this reality, as I have supported programs with signed cATOs that lacked sufficient documentation, such as System Security Plans (SSPs) and Security Requirements Traceability Matrices (SRTMs), which document the responsibility for security controls between mission owners and platforms (e.g. fully inherited, partially inherited, or fully up to the mission/system owner).
LinkedIn has been an incredibly popular venue for ATO reform discussions, such as posts by former Army Enterprise Cloud Management Agency (ECMA) Director Paul Puckett, who was also involved with NGA, an early pioneer of ATO modernization and innovation as mentioned above.
One peculiar and off-putting aspect of the increased recognition of ATO pain is that many suddenly see it as a business opportunity and/or a platform to espouse perspectives, some of which aren’t grounded in actual experience or expertise, but such is life. We now see many organizations positioning themselves as having cATO expertise, or pitching their product offerings as part of helping the Government achieve cATO.
Despite all of the hype around cATO and its positioning in popular Government media outlets and on LinkedIn as the panacea for longstanding ATO woes, some fundamental aspects of the conversation are still missing.
Most seem to gloss over the fact that to obtain a cATO, you first need to be capable of achieving an initial authorization. This means you still need the full body of evidence (BOE), as well as the additional capabilities and criteria laid out in the DoD CIO memo.
Furthermore, while cATO/OA does help the government move compliance toward DevSecOps, Agile, and cloud with iterative software delivery to production, much deeper problems remain unaddressed, such as:
The longstanding dependence on subpar legacy GRC platforms such as eMASS and Xacta, which lack robust APIs, automation, and integrations with cloud platforms and CI/CD tooling
The fact that much of the ATO activity still occurs in a manual, bespoke fashion, moving at a glacial pace, with static documentation largely in formats such as PDF and Excel rather than as-code (e.g. OSCAL; see the sketch after this list)
The pervasive practice of system sampling and other inefficient and ineffective approaches that don’t give the full picture of the system being authorized or the full scope of its attack surface
Failure to lean into innovative cloud-native services and third-party products that facilitate near real-time reporting of compliance and security posture and can help inform assessment, authorization and ConMon activities
A massive workforce gap between the folks building the systems, who live in Git, CI/CD, Kubernetes, cloud, microservices, and code, and the assessor and compliance community, who live in Excel, Word, and legacy GRC tools and generally lack expertise or experience in the technologies used to build the systems, making assessments and authorizations very painful and often requiring lengthy discussions and education just to help assessors understand how something works
And more.
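To illustrate the “as-code” alternative flagged in the list above, here is a simplified sketch that generates an OSCAL-style component definition programmatically rather than maintaining control narratives in a PDF. The field layout follows NIST’s OSCAL JSON models in pared-down form, and the component and control narratives are hypothetical:

```python
"""Sketch of control documentation "as-code": a pared-down OSCAL-style
component definition generated programmatically. Simplified for
illustration; real OSCAL documents would be validated against NIST's
published schemas."""
import json
import uuid
from datetime import datetime, timezone


def implemented_requirement(control_id: str, description: str) -> dict:
    return {
        "uuid": str(uuid.uuid4()),
        "control-id": control_id,
        "description": description,
    }


component_definition = {
    "component-definition": {
        "uuid": str(uuid.uuid4()),
        "metadata": {
            "title": "Example Mission App Component Definition",
            "last-modified": datetime.now(timezone.utc).isoformat(),
            "version": "0.1",
            "oscal-version": "1.1.2",
        },
        "components": [{
            "uuid": str(uuid.uuid4()),
            "type": "software",
            "title": "example-mission-app",  # hypothetical system
            "description": "Containerized web service on an inherited platform.",
            "control-implementations": [{
                "uuid": str(uuid.uuid4()),
                "source": "NIST SP 800-53 rev5 (moderate baseline)",
                "description": "Controls implemented at the application layer.",
                "implemented-requirements": [
                    implemented_requirement(
                        "ac-2",
                        "Accounts managed via platform SSO; no local users."),
                    implemented_requirement(
                        "au-2",
                        "Structured audit events shipped to the platform SIEM."),
                ],
            }],
        }],
    }
}

# Machine-readable output that tooling (and assessors) can diff and query.
print(json.dumps(component_definition, indent=2))
```

Because the output is structured data rather than prose in a PDF, it can be version-controlled, diffed in pull requests, and consumed by GRC tooling, which is precisely the gap called out above.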
That said, these problems aren’t just impacting the Government; there is a wide gap in general between the GRC/compliance/assessment community and the engineering and development teams building modern technologies, architectures, and applications.
I’ve written extensively about these problems in places such as:
“Compliance is Cumbersome, Cloud Can Help” (PluralSight)
Books “Software Transparency” and “Effective Vulnerability Management” (Wiley)
“DevSecOps: Are We There Yet?” (DevSecOps D.C.)
Despite all of the challenges and items I mention above, it is worth pointing out that the Government technology community is increasingly waking up to the fact that the way it has done cybersecurity and compliance in the past won’t get it where it wants to go in the future.
That legacy compliance culture is incredibly out of touch with technology innovations and modern capabilities, and the challenges it imposes on achieving digital modernization and getting real outcomes and capabilities into the hands of citizens, warfighters, system owners, and society are now being called out at the highest levels of Government.
These are all good things, and we should champion them, fostering a community of compliance disruptors, sharing lessons learned, taking risks, and moving toward being a capable mission enabler rather than an impediment that our engineering and development peers dread engaging with.
Software Supply Chains, Secure Software Development and Self-Attestations
Last up on the list of items I will discuss in this article are software supply chains, secure software development, and self-attestations.
For those who have been paying attention, software supply chain attacks have been rising exponentially. This includes attacks against commercial software suppliers as well as widely used open source software (OSS) components (and since commercial software is overwhelmingly composed of OSS components, the two are intertwined).
The Government has woken up to this and has arguably led the charge in securing the software supply chain, from Section 4 of the Cybersecurity Executive Order (EO) to NIST’s Secure Software Development Framework (SSDF), OMB Memos M-22-18 and M-23-16, and now CISA’s Secure Software Development Attestation Form.
If you’re new to any of these topics or want a deeper dive, I have hyperlinked each of them above to articles where I break them down individually in great detail.
The summary of these activities is that the Federal government is trying to use its massive purchasing power (tens of billions of dollars of IT spend annually) to force systemic change in the software supply chain.
It is pushing to have ALL software suppliers that sell to the U.S. Federal Government self-attest that they have followed the secure software development practices derived from the NIST SSDF and laid out in the CISA self-attestation form.
This of course isn’t a silver bullet and is often fodder for naysayers, but it is a push to give the Government some level of assurance that its software suppliers are implementing widely cited best practices when developing the digital products the Government procures.
On one hand, the self-attestation model represents a soft approach, letting suppliers attest to doing the work rather than imposing a third-party audit, which is often cited as too costly, cumbersome, and heavy-handed (see FedRAMP, which uses a 3PAO model and has roughly 300 authorized services despite existing for over a decade).
On the flip side, self-attestation brings concerns of false claims and squishy attestations. Many saw this play out in the U.S. Defense Industrial Base (DIB), which operated under a self-assessment/attestation model and saw many suppliers wrecked by malicious actors and nation-states despite claims of security controls being in place. That failure is now ushering in the Cybersecurity Maturity Model Certification (CMMC), which includes a third-party assessment (feels like we’re going in a circle here, right?).
So while the Government is actively taking measures to mitigate software supply chain attacks, or at a minimum gain some level of assurance about its suppliers, it is also introducing net new complexity and work for software companies working with the Government, which inevitably runs contrary to its goals of enabling faster, easier, streamlined access to and use of commercial technologies.
Conclusion
While each of these topics is incredibly complex and could be an entire article or series of articles in itself (which I’ve done, and have also integrated into two books I’ve published), it is clear the Government has challenges.
On one hand it is looking to streamline access to innovative commercial software and capabilities, and on the other it is looking to address emerging cybersecurity threats and compliance requirements.
The two goals are in an inevitable point of friction, as they can impede one another.
That said, with the right combination of people, process, and technology (emphasis on the first), we can help the Government achieve secure digital transformation and meet the two goals, independently and mutually, in a way that serves our citizens, national security interests, and society as a whole.
Software is an imperative part of our society and increasingly powers everything we rely on. Our ability to maintain a competitive advantage on this front matters domestically, in terms of trust in our institutions and serving fundamental Government functions such as helping our most vulnerable populations, and it allows us to keep pace with geopolitical adversaries and those looking to disrupt the U.S. position in the global hierarchy.
We’re living at the intersection of a Government desperately trying to move out of an industrial era, modernize its processes, technology, and culture, and move toward a digital-native future - and I couldn’t be more thrilled to get to play a small part in it.