Dissecting the FedRAMP Authorization Act
Drastic improvement, or too early to tell? What other improvements can be made in the Federal Cloud Security space?
U.S. Federal lawmakers recently introduced a piece of legislation known as the “FedRAMP Authorization Act”, which we will discuss in this article.
If you’ve worked with Cloud and the U.S. Federal Government for any length of time, you’re inevitably familiar (potentially painfully so) with the Federal Risk and Authorization Management Program (FedRAMP).
To put it succinctly, FedRAMP provides a standardized approach to security authorizations for Cloud Service Offerings within the Federal ecosystem. (There is also an additional framework dubbed the Security Requirements Guide (SRG), aka FedRAMP+, which builds on FedRAMP for the U.S. Department of Defense, often further delaying the DoD’s adoption of innovative cloud services, but that is beyond the scope of this article and a topic for another day).
FedRAMP was formally established in 2011, aimed at “providing a cost-effective, risk-based approach for the adoption and use of cloud services by the federal government”. Whether it has done either of those is certainly debatable: we are over a decade from its inception and it boasts a mere 287 authorized Cloud Service Offerings (CSOs) in the FedRAMP Marketplace at the time of this writing, in an industry of tens of thousands of CSOs, coupled with the fact that FedRAMP timelines can range from 12-18 months and cost several hundred thousand dollars to complete.
That said, despite these metrics, FedRAMP has undeniably met some of its intended goals, such as providing a standardized process to assess CSOs for Federal use, helping establish an industry-standard compliance framework, and enriching the discussion around cloud governance and secure use.
FedRAMP builds on existing statutory and regulatory requirements such as the Federal Information Security Modernization Act (FISMA), which requires organizations to protect federal information. Additionally, it ties to Office of Management and Budget (OMB) Circular A-130, which states that when agencies implement the previously mentioned FISMA, they do so with NIST standards and guidance.
FedRAMP leverages said NIST standards and security controls, such as NIST 800-53 Security and Privacy Controls for Information Systems and Organizations, as the basis of the FedRAMP baseline security controls for Low, Moderate and High categorized systems, each of which has its own respective security control baseline (also derived from NIST guidance, FIPS 199 in particular).
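The FIPS 199 categorization logic behind those Low/Moderate/High baselines boils down to a “high water mark”: a system’s overall impact level is the highest of its confidentiality, integrity, and availability ratings. A minimal sketch of that rule (my own illustration, not official FedRAMP or NIST tooling):

```python
# FIPS 199 "high water mark" categorization sketch.
# Illustrative only -- not an official FedRAMP or NIST tool.

LEVELS = ["low", "moderate", "high"]

def categorize(confidentiality: str, integrity: str, availability: str) -> str:
    """Return the overall system impact level: the highest of the three ratings."""
    ratings = (confidentiality, integrity, availability)
    for r in ratings:
        if r not in LEVELS:
            raise ValueError(f"unknown impact level: {r}")
    # The high water mark is the maximum rating across C, I, and A.
    return max(ratings, key=LEVELS.index)

# A single "moderate" rating drives the whole system to the Moderate baseline.
print(categorize("low", "moderate", "low"))
```

This is why so many systems land at Moderate or High: one elevated rating on any of the three objectives pulls the entire baseline up with it.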
Despite some of the challenges mentioned above, the FedRAMP program has been absolutely fundamental to the U.S. government and DoD’s adoption of cloud, and inarguably tied to publications such as the government’s Cloud First and Cloud Smart policies, which are part of the broader Federal Cloud Computing Strategy.
Cloud has been emphasized by nearly every Federal IT/Security leader when it comes to digital modernization, innovation and even national security (recently taking a prominent role in the DoD’s Zero Trust Strategy, which we covered last week here).
Building on its role within the government’s adoption of Cloud, lawmakers are introducing the FedRAMP Authorization Act to reform FedRAMP and help it continue to serve its fundamental role in the government’s use of cloud services and providers.
So let’s take a look at the FedRAMP Authorization Act and see where it may help, where it may not, and what else can be implemented to accelerate and improve the Government’s use of secure cloud service offerings.
FedRAMP Authorization Act
The FedRAMP Authorization Act opens with some key findings by Congress, which we will summarize below.
Findings
The bill opens by emphasizing how critical cloud is to expediting the modernization of legacy IT systems across Federal departments and agencies, and how it aids in job creation, with the cloud computing market growing threefold since 2004, creating more than 2 million jobs and contributing more than $200 billion to the U.S. GDP. Cloud has been emphasized through the previously mentioned Cloud First/Smart strategies of prior U.S. presidential administrations and is also a key part of the current President’s Cybersecurity Executive Order (EO), “Improving the Nation’s Cybersecurity”.
The bill touts FedRAMP’s role in the government’s secure authorization and re-use of cloud products and services, and its reduction of burden and costs for both agencies and cloud companies looking to enter the Federal market. It also points to the nearly 300 authorized CSOs in the FedRAMP marketplace that have been re-used 2,700 times across various agencies (whether 300 authorizations is something to brag about is another story, and I will touch on that later in this article).
Congress states that providing a legislative framework for FedRAMP, including new authorities for GSA, OMB and Federal agencies, will help expedite authorizations for CSOs, further enable re-use of FedRAMP authorized CSOs across agencies, further reduce burden and costs for CSPs seeking FedRAMP authorization, and provide more transparency and dialogue between industry and the Federal Government.
FedRAMP Board/Independent Assessment
The bill goes on to reiterate existing aspects of FedRAMP, such as the Joint Authorization Board (JAB) and Independent Assessors. To summarize, the primary governance and decision-making body for FedRAMP is the JAB, which includes representatives from DoD, GSA and DHS. FedRAMP also has involvement from OMB and NIST, as mentioned previously.
In addition to the JAB, FedRAMP makes use of Independent Assessors which analyze, validate and attest to CSPs’ compliance with the FedRAMP security controls and requirements. These assessors are generally referred to as Third Party Assessment Organizations (3PAOs). CSPs seeking FedRAMP authorization work with 3PAOs to have their cloud services assessed prior to an authorization decision. You can find out more about FedRAMP Assessors at the official page here.
Key Changes/Modernizations
Now that we’ve discussed some of the primary aspects of FedRAMP and the proposed legislation’s status quo aspects, let’s look at some of the new areas it aims to usher in and discuss their merit.
Federal Secure Cloud Advisory Committee
The proposed bill aims to establish a “Federal Secure Cloud Advisory Committee”. It would be made up of no more than 15 members, including representatives from both the public and private sectors, appointed by the GSA Administrator (or their representative), who will act as the Chair of the committee. This committee’s makeup will include:
The GSA Administrator/Rep
1 representative each from CISA and NIST
2 current CISOs from Federal agencies (I propose a nomination below)
Federal Agency Chief Procurement Officer
1 individual from an independent assessment firm
5 representatives from unique businesses that provide cloud computing services and products, 2 of which must be from small businesses as defined by the Small Business Act
2 other representatives of the Federal Government, as the Administrator deems necessary to bring balance and expertise to the overall committee.
The group’s aim will be to “ensure effective and ongoing coordination of agency adoption, use, authorization, monitoring, acquisition and security of cloud computing products and services to enable agency mission and administrative priorities”. This includes improving the FedRAMP authorization process, increasing agency re-use of authorizations, finding ways to reduce the burden and cost associated with FedRAMP for CSPs, and improving the number of FedRAMP authorizations among small businesses.
The committee will bring a diverse group of public and private industry expertise, addressing a myriad of perspectives from compliance, security, procurement/acquisition and business.
This will be a welcome addition of expertise that can hopefully advocate effectively for changes that accelerate the Government’s authorization and adoption of cloud, which arguably hasn’t kept pace with industry, as evidenced by the current FedRAMP marketplace metrics and other points we will make below.
Presumption of Adequacy
One aspect that has drawn a lot of interest and commentary from industry is Section (e) under 3613 “Roles and Responsibilities of Agencies”. In addition to requiring agencies to use FedRAMP authorized services, leverage existing authorization data (which will seem duplicative in a moment) and provide authorization artifacts to the FedRAMP PMO, there is a section titled “Presumption of Adequacy” that is getting attention.
This section states that “The assessment of security controls and materials within the authorization package for a FedRAMP authorization shall be presumed adequate for use in an agency authorization to operate”.
What this is striving to do, in layman’s terms, is avoid the all-too-common duplicative security and compliance assessments that occur, where despite the existence of a FedRAMP authorization, agencies make CSPs jump through hoops and provide documentation and data that the FedRAMP authorization process has already validated. This unfortunate reality is cumbersome and costly for cloud providers and agencies alike and should be avoided.
One industry commentator was Dave Powner of MITRE’s Center for Data-Driven Policy, who stated “The presumption of adequacy clause is a really big deal. It creates a new standard for cloud risk determination and now FedRAMP authorization can be reused without further oversight and agencies can feel safe about not getting penalized.”
While I don’t know Dave, and this isn’t a personal attack against him or MITRE, which I have tremendous respect for, I think this sort of language can be dangerous or misleading for reasons I will discuss below. Not being penalized from a compliance perspective is one thing, but there is still a significant amount of responsibility and due diligence required of agencies consuming FedRAMP authorized cloud services, going well beyond the FedRAMP authorization checkbox, and I will touch on those in this article. It is also worth pointing out that while this presumption of adequacy may help with agency re-use of FedRAMP authorizations, it does nothing to help the initial intake bottleneck that is the current FedRAMP process itself, which has kept the marketplace to fewer than 300 offerings after a decade. The Cyber EO opened the door to alternatives where appropriate, and presumption of adequacy makes much more sense in that context, as the initial bottleneck is much more of a problem than post-authorization re-use when it comes to scaling the success and effectiveness of FedRAMP.
Recommendations & Improvements
Below are some proposed recommendations and improvements from my perspective. While these generally wouldn’t be appropriate for an avenue like the FedRAMP Authorization Act itself, perhaps they can be taken into consideration by the forthcoming Federal Secure Cloud Advisory Committee or other senior security leaders in the Federal and DoD ecosystem when it comes to secure cloud computing to enable mission outcomes.
I suggest these as someone who’s been handling compliance and securing Cloud workloads and environments for DoD and Federal Civilian agencies for the last 7 years, including as a Federal Employee with the Navy and, briefly, on the GSA FedRAMP team as a Technical Representative to the JAB.
I’ve led Cloud Security efforts for Federal Civilian agencies such as GSA/FedRAMP and Centers for Medicare & Medicaid (CMS), along with the DoD for the Navy, USAF and Space Force, using their Cloud Security Requirements Guide (SRG) for workloads across Impact Levels 2, 4, 5 and 6. I’ve also co-authored the Cloud Security Alliance’s Cloud Incident Response Framework and SaaS Governance Best Practices for Cloud Customers guide, hold various industry certifications such as AWS Security, Azure Security and the CSA Certificate of Cloud Auditing Knowledge (CCAK), and regularly speak, write and consult on the topic of cloud security.
I make these recommendations in genuine belief that current programs/teams, efforts and policies are well intended, but unfortunately falling short from both an innovation and a security perspective and can be improved.
I agree that Cloud is critical to the way we modernize Federal/DoD IT systems, and that is why I am hopeful we can continue to evolve how we authorize and secure cloud to make those modernization goals a reality.
Compliance Framework Reciprocity
One unfortunate reality of the cybersecurity landscape is that we’re compliance framework rich but implementation poor.
This problem isn’t isolated to the Federal space; it seems like every time there is a major incident or situation, out comes another framework. This creates a disjointed patchwork landscape that leaves organizations and vendors alike frustrated and confused. One way to improve this is through the reciprocity and reconciliation of disparate frameworks where similar, or even the same, security controls exist.
This problem exists in the context of FedRAMP because many CSPs have already made the investment of time and resources to comply with other frameworks, such as SOC 2, ISO, the Cloud Security Alliance’s Cloud Controls Matrix (CCM) and others. However, despite those investments, which provide commercial cloud consumers assurances, the Federal government has its own unique compliance framework in FedRAMP, with its own unique security controls. This means CSPs have to go through the FedRAMP process despite potentially already adhering to compliance frameworks with overlapping and similar security controls.
The Federal Government recognized this problem, and it was even pointed out in the Cybersecurity EO. Section 3 of the EO focuses on Modernizing Federal Government Cybersecurity, and its sub-section (v) specifically calls for mapping relevant compliance frameworks onto FedRAMP requirements where appropriate and allowing them to be used as a substitute.
In fact, I even wrote about this in a 2021 NextGov article titled “Executive Order Hints at FedRAMP Alternatives”.
However, that article was written nearly 18 months ago and this has not been done, or at least not publicly shared or pushed for, to my knowledge. This is despite a tremendous amount of momentum behind other Cyber EO priorities, such as Software Supply Chain Security and Zero Trust, each of which has seen a significant amount of public dialogue, memos, strategies and so on.
So here we are, 18 months later with the Government’s adoption of innovative Cloud technologies still being hamstrung by problems such as this one, among the others we will discuss. This lack of attention directly contradicts the emphasis on accelerated cloud adoption in artifacts such as the Cloud First/Smart strategies we previously discussed.
The Government wouldn’t even need to start from scratch on this front, as the Cloud Security Alliance has already done much of the legwork with the CCM I mentioned above, which cross-walks various compliance frameworks, including NIST 800-53 Revision 5 (which is what FedRAMP uses), against frameworks such as ISO and PCI DSS, among others.
Tackling this problem would broaden the ecosystem of innovative CSP’s and cloud service offerings the Federal government can use while also minimizing the burden and cost associated with doing so - but it needs to be done for that to occur.
This isn’t to say all of the above-mentioned frameworks and resources are a one-to-one match, but much overlap does exist, and the desire to fully close gaps between them should be weighed against risk. In security we often take a one-directional look at risk: we look at the risk of failing to fully comply with framework “x”, while totally neglecting the risk of technical stagnation, obsolescence and being avoided or sidestepped (which frequently happens to the security community).
Compliance-as-Code/Codified Documentation
One major drawback with the way compliance is generally handled is the use of static documents and paperwork in the form of PDF and Word documents. These documents are lengthy, cumbersome and largely outdated soon after their creation, and rarely if ever revisited or used by the actual system owners or engineers whose names are on them. The irony is that this occurs in a world where, as an industry, we’re largely moving to an as-Code paradigm. Our infrastructure is code, in the form of declarative formats such as AWS CloudFormation, Terraform, Kubernetes Helm Charts and container manifests.
Despite everything being captured as code, we still for some reason handle all of our compliance documentation in a static format. Luckily this tide is changing, and thankfully it is being led by none other than FedRAMP themselves, in collaboration with peers at NIST. It is being done through what is known as the Open Security Controls Assessment Language (OSCAL), championed by a friend I admire, Dr. Michaela Iorga of NIST.
OSCAL is a set of formats, expressed in XML, JSON and YAML, that allow for machine-readable representations of control catalogs and baselines as well as the standard compliance artifacts required under FedRAMP (and RMF and 171/CMMC), such as System Security Plans (SSPs).
This machine-readability allows for extensible architectures, tooling and integrations that can remove much of the human toil associated with compliance documentation and let machines do what they do best: automation. It also allows for automating activities such as control assessments, updates and remediations, as we start to see synergy between codified compliance artifacts and the declarative architectures and environments that technologies such as Cloud and Kubernetes facilitate.
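To give a toy sense of that machine-readability, a few lines of Python can enumerate every control in a catalog (this is a drastically simplified, hypothetical catalog snippet, not the full OSCAL schema):

```python
import json

# A drastically simplified, hypothetical OSCAL-style catalog snippet.
# Real OSCAL catalogs (e.g. NIST 800-53 Rev 5) carry far more metadata.
catalog_json = """
{
  "catalog": {
    "metadata": {"title": "Example Baseline"},
    "groups": [
      {"id": "ac", "title": "Access Control",
       "controls": [
         {"id": "ac-2", "title": "Account Management"},
         {"id": "ac-17", "title": "Remote Access"}
       ]},
      {"id": "au", "title": "Audit and Accountability",
       "controls": [{"id": "au-6", "title": "Audit Record Review"}]}
    ]
  }
}
"""

def list_controls(catalog: dict) -> list[str]:
    """Walk every group in the catalog and collect control IDs."""
    ids = []
    for group in catalog["catalog"].get("groups", []):
        for control in group.get("controls", []):
            ids.append(control["id"])
    return ids

controls = list_controls(json.loads(catalog_json))
print(controls)  # machine-readable: trivially diffed, filtered, or fed to tooling
```

The point isn’t the ten lines of Python; it’s that once the catalog is structured data rather than a Word document, this kind of query, diff and automation becomes trivial.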
For an example of what OSCAL can do, be sure to check out the “Awesome OSCAL” GitHub repo, which includes a collection of content, such as the Center for Internet Security (CIS) Critical 18 Security Controls and CMS’ Acceptable Risk Safeguards (ARS) control baselines in OSCAL format. The repo also includes a robust set of OSCAL tooling from organizations such as EasyDynamics (their OSCAL REST API service), GSA (OSCAL Tools) and GRC innovator RegScale, whose community edition platform integrates with security and compliance tools via an API to keep compliance documents continuously up-to-date. Lastly, the repo includes a trove of OSCAL blogs from industry leaders and innovators.
These innovative platforms and technologies are increasingly being adopted by industry, and even by entities such as the 3PAOs that assess for FedRAMP compliance. It would be in the Government’s best interest to do the same, and rip and replace legacy GRC tools such as eMASS (DoD) and Archer (FedCiv), which have simply failed to keep pace with the direction and innovation in commercial industry around GRC. These new solutions are API-centric, integrate closely with the Cloud platforms themselves, where most modern workloads are hosted, and can integrate with technologies such as CI/CD pipelines to update and modify control compliance in real-time. This is much closer to the DevSecOps era we find ourselves in than the legacy, disconnected GRC tooling of the past, which is a major time sink, gives a false sense of security, and needs to go by the wayside.
FedRAMP themselves shared that they accepted their first OSCAL System Security Plan (SSP) from a CSP (AWS) in an article here. This trend of moving towards machine-readable compliance artifacts needs to accelerate, while Federal/DoD efforts ramp up to unshackle themselves from legacy GRC vendors that didn’t keep pace with the age of DevOps and Cloud-native architectures driven by APIs and a velocity that legacy systems simply aren’t suited for.
Genuine Continuous Monitoring
Another gap or area of improvement based on my experience in the Federal Cloud and more broadly, compliance space, is our antiquated approach to Continuous Monitoring.
It’s no secret that the Federal/DoD ecosystem approaches cybersecurity through a largely compliance (aka checkbox) lens. It’s a sentiment that has been voiced, in a negative context, by countless senior leaders and industry practitioners. (For an example, look at the Navy’s CIO, Aaron Weis, who is pushing for a shift from checkbox compliance to Cyber Readiness).
In environments where configurations, architectures and code are changing dynamically, we still largely monitor our security controls and posture in a static, snapshot-in-time fashion.
Security control baselines, as discussed above, consist of hundreds of controls (and often thousands of sub-controls, control enhancements, associated control implementation statements and so on). Yet as an industry we generally handle “Continuous Monitoring” by reviewing controls on an annual basis, often with a core set of controls or by sub-dividing the control baseline - say, 350 controls split into thirds, each of which is reviewed in a single year over a 3-year cyclical authorization period.
This means the security controls you reviewed in year 1 of the authorization don’t get revisited until year 3, during a new full assessment/authorization, and the latter two-thirds of controls don’t get reviewed until years 2 and 3. Of course, some controls are reviewed annually if they’re deemed “critical controls”, or critical enough to warrant an annual review, but even that process is often still manual.
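The arithmetic of that cadence is easy to sketch (a hypothetical 350-control baseline and simple round-robin split, purely for illustration; real programs pick subsets by criticality):

```python
# Sketch of the "thirds" review cadence described above: a control baseline
# split across a 3-year authorization cycle, so each control is looked at
# only once per cycle. Illustrative only.

def yearly_tranches(control_ids: list[str], years: int = 3) -> dict[int, list[str]]:
    """Round-robin the baseline so each control is reviewed once per cycle."""
    tranches = {year: [] for year in range(1, years + 1)}
    for i, cid in enumerate(control_ids):
        tranches[(i % years) + 1].append(cid)
    return tranches

baseline = [f"ctrl-{n}" for n in range(1, 351)]  # hypothetical 350-control baseline
plan = yearly_tranches(baseline)
print({year: len(ids) for year, ids in plan.items()})
# roughly 117 controls reviewed per year -- meaning roughly two-thirds of the
# baseline goes unexamined in any given year
```

Stated that plainly, the gap is obvious: in any single year, the majority of the baseline receives no review at all.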
For examples of this, look no further than FedRAMP’s own Continuous Monitoring and Annual Assessment Guidance.
The glaring problem with this approach to anyone with a rudimentary knowledge of modern IT systems is that your environment doesn’t just change annually, or in a 36 month period, it changes constantly, in an ongoing fashion - hence why we need real Continuous Monitoring.
We continue to take this approach despite the existence of innovative vendor solutions, OSS tooling and cloud-native services that can analyze and validate security control compliance in near real-time, on an ongoing basis.
For examples of these, look at AWS’ Audit Manager or Azure’s Defender for Cloud - each of which boast a robust set of industry compliance frameworks and benchmarks that can assess a cloud environment iteratively, on-demand. There are also innovative third-party solutions in the market to address the need for multi-cloud coverage.
Many of the CSPs, particularly SaaS providers, in the FedRAMP marketplace are of course hosted in underlying FedRAMP compliant IaaS environments such as AWS, Azure and GCP. Given this reality, why do we continue to validate control compliance in a manual, inefficient, cumbersome fashion?
Not only is it inefficient but it is also insecure - providing no real assurance on the compliance posture of a CSP’s environment.
We could be utilizing these cloud-native services and third-party vendor tools to have a far better understanding of a CSP’s hosting environment, configurations, security and compliance posture - but we don’t. (This situation exists among most DoD/Federal IT systems and their ATOs as well, by the way, not just CSPs falling under the FedRAMP purview).
There are also tools and technologies to support this same rigor in Cloud-native containerized/Kubernetes environments, such as Compliance-as-Code/Policy-as-Code, by inspecting container and Kubernetes manifests prior to deployment to a runtime environment, as well as monitoring them at runtime to avoid configuration drift and, therefore, compliance deviations.
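A bare-bones flavor of that pre-deployment inspection looks something like the following (a hand-rolled check standing in for real policy engines such as OPA/Gatekeeper or Kyverno; the Pod and policy are hypothetical):

```python
# Minimal policy-as-code sketch: flag Kubernetes Pod manifests that request
# privileged containers before they ever reach a cluster.
# Hand-rolled for illustration; real deployments use engines like OPA or Kyverno.

def find_violations(manifest: dict) -> list[str]:
    """Return a list of policy violations for a Pod manifest."""
    violations = []
    for c in manifest.get("spec", {}).get("containers", []):
        sc = c.get("securityContext", {})
        if sc.get("privileged", False):
            violations.append(f"container '{c['name']}' runs privileged")
        if sc.get("runAsNonRoot") is not True:
            violations.append(f"container '{c['name']}' does not enforce runAsNonRoot")
    return violations

pod = {
    "apiVersion": "v1",
    "kind": "Pod",
    "metadata": {"name": "demo"},
    "spec": {"containers": [
        {"name": "app", "securityContext": {"privileged": True}}
    ]},
}

for v in find_violations(pod):
    print("DENY:", v)  # a CI/CD gate would fail the pipeline here
```

Run in a pipeline (or as an admission controller), checks like this turn control compliance into something evaluated on every deployment rather than once a year.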
Making this shift will help usher in broader success for the push towards Continuous ATO (cATO)/Ongoing Authorization and get rid of the subjective 3-year authorization window that is largely security theatre focused on compliance checkboxes and manual activities.
I have four young kids, and our current approach would be akin to peeking in their bedrooms to check that the room is clean once a month or year and then walking around the house with a false sense of assurance that it stays that way between those visits.
It’s simply foolish.
Our adversaries don’t operate this way and our approach to compliance and security shouldn’t either, especially given the modern tooling and technologies we could be using.
The uncomfortable truth is that there is nothing “Continuous” about how we do Continuous Monitoring other than it being continuously ineffective and inadequate.
Pre-Hardened Environments
Another current shortfall is the lack of resources for CSPs looking to get FedRAMP authorized. I’m not talking about lengthy PDFs and Word documents; I’m talking about pre-hardened environments and reference architectures that comply with the FedRAMP security control baselines and can be taken and deployed rapidly by aspiring CSPs, such as SaaS providers, tremendously expediting their timeline to authorization.
FedRAMP has long been viewed as cost-prohibitive for SMB CSPs, a point further emphasized in the FedRAMP Authorization Act we kicked off this article with.
The current ecosystem requires a CSP aspiring to FedRAMP authorization to start from scratch, configuring an existing hosting environment to align with FedRAMP or moving into an environment that does (you see this offered by “FedRAMP-in-a-Box” MSPs who focus on the FedRAMP CSP market - buyer beware with some of these, and be sure to do your own due diligence on vendors’ promises).
But why can’t FedRAMP and the Government provide these resources themselves? For a concrete example, take a look at the DoD Cloud Infrastructure-as-Code (IaC) effort by the DISA Hosting and Compute Center (HACC). It offers IaC templates, called “baselines”, that “use automation to generate preconfigured, pre-authorized, Platform-as-a-Service focused environments”.
These IaC templates get System Owners (Mission Owners in DoD Speak) up and running in the Cloud in a compliant fashion very rapidly, deploying their workloads in hyper-scale CSP’s such as AWS, Azure and GCP.
Why can’t FedRAMP/the Government offer something similar to help get SMBs up and running in FedRAMP compliant cloud environments? This model would work particularly well for SaaS providers looking to go through FedRAMP and building in the leading IaaS providers’ environments.
Third-party MSPs and businesses provide these services, but of course at a cost. If the Government wanted to help SMBs be more successful in FedRAMP and save them time and money, this is a specific example of how to do so.
Remember those cloud-native services I mentioned above that facilitate near real-time compliance assessments? The DoD’s pre-hardened IaC templates make use of those to maintain constant visibility of compliance posture, and FedRAMP could do the same for its CSPs.
Industry services vendors already offer these innovative solutions through a combination of IaC and cloud-native services to accelerate FedRAMP timelines and also improve security posture. I know this because my own firm, Aquia Inc. has a Zero-to-FedRAMP (Z2F) offering we’ve cultivated through our expertise in both FedRAMP compliance and cloud engineering.
Show me your “CODEFAX” aka SBOM or SaaSBOM
Forgive me for the cheesy pun but I’m a dad after all, so dad jokes are a given.
Another significant blindspot in the Federal cloud ecosystem is the focus on underlying IaaS environments, instances, VMs, hosting architectures, configurations and so on, without looking at the code that actually powers the modern cloud ecosystem. For example, FedRAMP reviews and approves additional cloud-native services, features and functionality from CSPs, but do they ask to see the CSP’s SBOM/Software-as-a-Service BOM (SaaSBOM) that powers those cloud-native services?
The answer is generally no.
This means we, as the Federal government and Federal/Defense Industrial Base (DIB), have little to no visibility into the software makeup, use of OSS components and the potential vulnerabilities or risks associated with those services or their interactions. Those risks may come in the form of traditional CVEs (lagging indicators of risk) or, more maturely, leading indicators of risk such as project health, mean-time-to-remediation, contributor activity, national origin, and the list goes on. All of these are critical given that modern applications are largely composed of OSS components.
SaaS was initially not part of the conversation around SBOMs and software asset inventory, but that is changing. Co-author Walter Haydock (who has a blog titled “Deploying Securely”, which I recommend subscribing to) and I made the case for SaaSBOMs in 2021, in this article, and many in the industry seem to be rallying around the concept.
Luckily this is another area that is changing with the introduction of the Cyber EO and, subsequently, OMB Memorandum M-22-18, which requires third-party vendors to self-attest (and potentially be attested by a 3PAO) that they follow secure software development practices, such as NIST’s Secure Software Development Framework (SSDF) and NIST’s Software Supply Chain Guidance, and also potentially provide artifacts such as SBOMs/SaaSBOMs, which show the software component makeup of their software, even if delivered as SaaS, and shine a light on potential vulnerabilities or components that introduce unacceptable risks.
This is increasingly becoming possible today, too, with the leadership of SBOM formats such as CycloneDX, which provides native support for a SaaSBOM to help inventory the services, endpoints, data flows and classifications that power cloud-native applications.
The use of SaaSBOMs can help us understand complex systems such as SaaS, including the inventory of all services and their reliance on other services, endpoint URLs, data classifications and the flow of data between those services. It can also help the Federal government understand the software component makeup of said SaaS services, and lets us move beyond verifying policies and processes to looking at the raw output - the software itself - and its security posture.
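To give a stripped-down sense of what a CycloneDX SaaSBOM’s `services` entries carry, here is a simplified fragment built in Python (the field names follow the CycloneDX spec, but the services, endpoints and classifications are hypothetical, and this is not a complete or schema-validated BOM):

```python
import json

# Simplified CycloneDX-style SaaSBOM fragment: services, endpoints, and data
# classifications. Field names follow the CycloneDX spec, but the content is
# an illustrative sketch, not a complete or schema-validated BOM.
saasbom = {
    "bomFormat": "CycloneDX",
    "specVersion": "1.4",
    "services": [
        {
            "name": "billing-api",  # hypothetical service
            "endpoints": ["https://api.example.com/billing"],
            "data": [{"flow": "bi-directional", "classification": "PII"}],
        },
        {
            "name": "report-generator",  # hypothetical service
            "endpoints": ["https://api.example.com/reports"],
            "data": [{"flow": "outbound", "classification": "public"}],
        },
    ],
}

# An agency reviewer could trivially query this machine-readable inventory,
# e.g. to flag every service that touches PII:
pii_services = [
    s["name"] for s in saasbom["services"]
    if any(d["classification"] == "PII" for d in s["data"])
]
print(json.dumps(pii_services))
```

That kind of one-line query - “show me every service handling PII and where the data flows” - is exactly the visibility agencies lack today when all they receive is a static authorization package.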
SSPM/SaaS Security
One other critical gap in the Federal Government’s approach to Cloud Security has been a myopic focus on IaaS. One quick look at the Cybersecurity EO, security requirements and the general dialogue of your usual Federal industry outlets, publications and events (often pay-to-play, inviting Federal speakers and only representatives from industry firms willing to pay to be present, regardless of actual expertise, but I digress) shows that nearly all conversations about Cloud Security focus on securing IaaS (e.g. AWS, Azure and GCP). This is despite the reality that agencies are consuming ~3 IaaS providers but tens to hundreds of SaaS offerings. Many of these SaaS offerings aren’t even FedRAMP authorized, generally due to the issues discussed above, such as the time, cost and resources required to become authorized. This means there is widespread unaccounted-for SaaS usage, and most large enterprises, commercial and federal alike, would struggle to provide a truly accurate SaaS usage inventory.
Agencies would benefit tremendously from a SaaS Governance program, as I discuss in this article here. Yet most agencies don’t generally govern SaaS, aside from finger-wagging that only FedRAMP authorized CSPs (including SaaS) should be used.
The only SaaS Governance program in the Federal space that I am aware of is at Centers for Medicare & Medicaid (CMS), under the thought leadership of the agency’s CISO, Robert Wood, who I think would be a great candidate for the Federal Secure Cloud Advisory Committee the FedRAMP Authorization Act calls for. He’s both a sitting Federal CISO and a forward-thinking Cloud Security leader that the committee could benefit from having.
When it comes to securing SaaS, most agencies lean on FedRAMP authorization as the gold standard of security. This flies in the face of the Shared Responsibility Model, which still places requirements on the cloud consumer, even in the case of SaaS. Sure, FedRAMP compliance is a good first step, but many activities remain with the cloud consumer. These activities go well beyond paper documentation and static artifact reviews and require actual rigor in securing the SaaS environment in key areas such as Identity and Access Management (IAM), Data Security, and Security Operations (SecOps), just to name a few.
This is, unfortunately, the state of how the Federal ecosystem approaches SaaS security, despite the growth and innovation coming from vendors such as Grip Security, Axonius, AppOmni, Netskope, and Obsidian, all of which have brought SaaS Security Posture Management (SSPM) tooling to market. SSPM tooling provides automated, continuous monitoring of cloud-based SaaS applications to ensure security and compliance and to prevent misconfigurations. Yet the use of SSPM tooling in the Federal ecosystem is minor, and for most agencies non-existent.
Our approach to “ConMon” is broken because it relies on snapshot-in-time assessments and paperwork artifacts rather than real-time assessment and validation of configurations, control compliance, and posture. Agencies are failing to take advantage of SSPM tooling here, much as they have, as I mentioned in the IaaS context, with the Cloud Security Posture Management (CSPM) tools that leading CSPs offer for their IaaS environments.
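The core mechanic of the continuous monitoring that SSPM and CSPM tools provide is simple to sketch: on every poll, compare the live tenant settings against an approved baseline and flag drift, rather than trusting a point-in-time paperwork snapshot. The setting names and the fetch function below are hypothetical stand-ins for a real SaaS admin API call:

```python
from datetime import datetime, timezone

# Approved security baseline for a hypothetical SaaS tenant.
BASELINE = {
    "mfa_required": True,
    "public_sharing_enabled": False,
    "session_timeout_minutes": 15,
}

def fetch_current_settings() -> dict:
    # In a real deployment this would call the SaaS provider's admin API.
    # Hard-coded here so the sketch is self-contained.
    return {
        "mfa_required": True,
        "public_sharing_enabled": True,   # drifted from the baseline
        "session_timeout_minutes": 15,
    }

def detect_drift(baseline: dict, current: dict) -> list:
    """Return one finding per setting that no longer matches the baseline."""
    findings = []
    for setting, expected in baseline.items():
        actual = current.get(setting)
        if actual != expected:
            findings.append({
                "setting": setting,
                "expected": expected,
                "actual": actual,
                "observed_at": datetime.now(timezone.utc).isoformat(),
            })
    return findings

for f in detect_drift(BASELINE, fetch_current_settings()):
    print(f"DRIFT: {f['setting']} expected={f['expected']} actual={f['actual']}")
```

Run on a schedule, this turns “ConMon” into an actual control: a misconfiguration surfaces within one polling interval instead of at the next annual assessment.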
CISA even released a “Cloud Security Technical Reference Architecture” as part of the Cyber EO, almost none of which touches on SaaS; it largely focuses on securing IaaS and PaaS environments. I call this out in my article about the reference architecture and its shortfalls, which generated a response from CISA Director Jen Easterly on LinkedIn in 2021.
It’s still your data.
You can’t outsource accountability, regardless of the responsibility model.
In our conversation around Software Supply Chain Security and Cybersecurity Supply Chain Risk Management (C-SCRM), SaaS is, or should be, a critical piece. There have been several notable SaaS data breaches and security incidents that impacted thousands of organizations and millions of users. One clear example was the Twilio security incident, which impacted roughly 130 other organizations, many of which were SaaS providers themselves. You start to see the cascading impact SaaS can have due to its ubiquitous use across organizations and the pervasive connectivity not only between SaaS and enterprises but between SaaS and SaaS as well.
However, when it comes to securing Cloud, and securing SaaS, we largely look at these CSPs and offerings in isolation: analyzing the offering or provider, their vulnerability scans, configurations, and so on. I want to suggest we take a more holistic (overused term, I know) look at the App-to-App integrations that are pervasive in most enterprise environments, including SaaS-to-SaaS and SaaS-to-IaaS. Some vendors, such as Astrix, have begun to lead the charge on this front, recognizing the risk associated with our ubiquitous App-to-App integrations, where a compromise of one vendor or app can cascade across our ecosystems and environments. This should be a key part of the software supply chain conversation, moving beyond specific software components and vulnerabilities to understanding the broader ecosystem of connectivity we now find ourselves in.
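One way to reason about this cascading risk is to model App-to-App integrations (OAuth grants, API tokens, webhooks) as a directed graph and compute the “blast radius” of a compromised app, that is, every downstream app reachable through the integration chain. The app names below are hypothetical; the traversal is a plain breadth-first search:

```python
from collections import deque

# Hypothetical App-to-App integration graph: each key grants access to the
# apps in its list (via OAuth tokens, API keys, webhooks, etc.).
INTEGRATIONS = {
    "identity-provider": ["crm", "chat", "ci-platform"],
    "crm": ["marketing-automation", "data-warehouse"],
    "chat": ["ticketing"],
    "ci-platform": ["artifact-registry"],
    "ticketing": [],
    "marketing-automation": [],
    "data-warehouse": [],
    "artifact-registry": [],
}

def blast_radius(graph: dict, compromised: str) -> set:
    """Return every app reachable downstream from a compromised app (BFS)."""
    seen = set()
    queue = deque([compromised])
    while queue:
        app = queue.popleft()
        for downstream in graph.get(app, []):
            if downstream not in seen:
                seen.add(downstream)
                queue.append(downstream)
    return seen

print(sorted(blast_radius(INTEGRATIONS, "identity-provider")))
```

In this toy graph, compromising the identity provider exposes all seven downstream apps, which is exactly the ecosystem-level view that per-offering assessments in isolation never surface.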
For those looking to learn more about SaaS Security, I presented on the topic at a Federal agency forum and the recording is here.
Moving Forward
So where do we go from here? It is clear that FedRAMP is here to stay and is going to be a key part of the Federal (and Defense) approach to secure cloud consumption. The codification into law will further help bolster its significance and the addition of the Federal Secure Cloud Advisory Committee will bring a stellar group of public and private sector expertise to innovate the way the government consumes and secures cloud service offerings.
All of that said, there are still tremendous gaps and areas for improvement, as discussed above. These include the initial authorization process, consideration of other industry compliance frameworks, shifting away from static paperwork compliance artifacts, implementing real Continuous Monitoring, and maturing beyond viewing Cloud Security primarily through the lens of IaaS.
I hope some of these recommendations are taken into consideration in the Federal space. I truly appreciate all of the hard work from government and industry to get us this far, but let’s not stop here. As always, I am available to chat about these topics and more; please feel free to reach out, and thanks for reading!