I. Introduction
Imagine a world where traffic lights work in unison, monitor traffic flow, and adjust their timing to reduce congestion. This type of coordination is possible with Internet of Things (IoT) devices. New computational capabilities called edge and fog computing have been employed to make IoT devices a reality on a large scale. Edge computing works by completing computations on the network’s “edge,” meaning close to the devices’ sensors or on the device itself. Fog computing allows for computations between the cloud (remote servers) and the devices. The Department of Defense has recently stated that the federal government needs fog computing capabilities. Furthermore, the federal government is considering pilot programs for fog computing. It is only a matter of time before fog computing is frequently used across the government, as it is “the missing-link architecture for IoT deployment.”
However, new technology brings new risks. IoT devices may be susceptible to cyberattacks not characteristic of other technologies. For example, researchers were able to hack and inject a computer worm into one IoT device, a “smart lamp,” and eventually infect every single lamp in the network. The vulnerability the researchers found is unlike any seen in other technologies: it was unique to IoT ad hoc networking capabilities, which cannot be monitored or stopped by traditional Internet-based security measures yet are fundamental to the benefits these technologies provide. As a result, the researchers determined that hacking into just one lamp could leave entire city sectors dark at the wish of a malicious actor. Now, imagine a scenario where, instead of lamps, it is a group of drones, cameras, or a nuclear plant. It is not difficult to conceive the devastating impact a malicious actor could have by hacking into a network of a thousand highly capable devices.
The U.S. government, across political lines, cares about ensuring cybersecurity at the highest levels of government. Cybersecurity affects all aspects of life, from national security to people’s privacy and security. The executive branch continuously invests resources and effort in improving national cybersecurity efforts.
The General Services Administration (GSA) is the federal agency that manages the procurement of cloud-based technology for the federal government. The agency ensures the standardized procurement of secure cloud technology across the federal government through the Federal Risk and Authorization Management Program (FedRAMP), which serves as a procurement standard. FedRAMP was created as a cost-effective approach “for the adoption and use of cloud services” by executive agencies. The program’s goals are to streamline and reduce the cost of the procurement process for cloud technology and to increase the use of more secure technology across the federal government. FedRAMP was “designed to provide a standardized approach to security assessment, authorization, and monitoring of cloud computing.”
The FedRAMP program incorporates and relies on technical security controls developed by the National Institute of Standards and Technology (NIST). Specifically, FedRAMP uses NIST’s Special Publication (SP) 800-53, which serves as a technical regulatory framework that agencies, including GSA, can adopt. NIST publications are updated periodically. The newest final version of SP 800-53, Revision 5, was released in 2020 and included technical controls for IoT and edge devices for the first time. However, the FedRAMP program was based on NIST SP 800-53 Revision 4, which does not include updates specific to IoT edge computing technology.
At least in one instance in 2022, a year before FedRAMP adopted the new NIST SP 800-53 Revision 5, the FedRAMP program authorized an IoT edge device for use by federal agencies. While NIST SP 800-53 Revision 5 addresses some novel issues of IoT, fog computing has yet to be addressed by NIST or the FedRAMP program. As such, the FedRAMP program, as of now, does not have the necessary technical controls and protections to ensure the safe procurement of fog technology. If the program authorizes technologies for which it lacks sufficient requirements, it leaves those technologies vulnerable. Because FedRAMP authorizes the procurement of cloud-related technologies for the rest of the federal government, it risks vulnerable technology being used by federal agencies.
FedRAMP needs to incorporate security measures for fog computing. Although fog computing is still an emerging technology, it is a crucial component of the evolving cloud paradigm, and FedRAMP must ensure its secure procurement. This Note proposes adopting a fully management-based regulatory framework, instead of the current combination of risk-based and management-based approaches, for acquiring fog-edge cloud technology.
Part I of this Note gives an overview of the issue. Part II analyzes the background of fog computing and the unique cybersecurity threats it faces. Next, Part III explores how FedRAMP operates and how NIST regulations and special publications factor into FedRAMP. Part IV analyzes regulatory gaps in applying FedRAMP to fog computing technology. Lastly, this Note discusses various solutions that the GSA should implement to procure secure fog computing technology.
II. Definitional Analysis of Fog, Edge, and Cloud Computing
Cloud, fog, and edge computing have different architectural properties. Cloud computing processes and stores data remotely rather than on the device itself. Fog computing is an architectural model that allows for computation between the device and the cloud layer, unlike standard architectures, in which computations happen exclusively in the cloud or on the device. Figure 1 demonstrates the structure of the architecture. Fog computing brings the “cloud closer to [the] ground,” meaning that computation occurs on the network rather than in the cloud architecture. The network where the computation takes place consists of “fog nodes,” which can be “routers, switches, gateways, [or] access points” used to complete computations across the network. Additionally, fog nodes enable a “distributed” approach to computing: tasks from the cloud central-management center are distributed across intermediate fog nodes to reduce the energy the network must use to complete computations. Figure 2 demonstrates the fog nodes and their relation to the cloud architecture. Finally, edge computing allows for computation close to the source of the data rather than sending it to the cloud for processing.
IoT devices often use fog computing capabilities for on-the-spot decision-making. For example, fog computing can make surveillance video processing more private and better tailored to specific tasks. Instead of uploading all of the camera’s imagery directly to the cloud, the fog layer would send only data “filter[ed]” by the fog nodes to include the necessary information. For instance, if the cameras are programmed to send only information about unlawful activity, fog computing would process camera imagery in real time and send only the requested information to the cloud. Because the processing takes place close to the data source, it decreases processing delays and provides real-time updates.
Fig. 1: Fog Computing Architecture Diagram
Fig. 2: Fog Architecture
[Please refer to page 708 in Public Contract Law Journal 54:3 Spring 2025 to view Fig. 1 & 2]
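To illustrate the filtering role that fog nodes play in the surveillance example above, the following sketch shows, in simplified Python, how a node might process events locally and forward only reportable ones to the cloud. The event fields, rule name, and confidence threshold are hypothetical illustrations rather than part of any NIST or FedRAMP specification, and a real fog node would add networking, authentication, and other layers.

```python
# A minimal, illustrative sketch of the fog-node "filtering" idea described above.
# All names (CameraEvent, is_reportable, fog_node_filter) are hypothetical; real
# fog deployments involve far more complex node software, networking, and security.

from dataclasses import dataclass
from typing import List


@dataclass
class CameraEvent:
    camera_id: str
    timestamp: float      # seconds since epoch
    label: str            # e.g., "pedestrian", "vehicle", "unlawful_activity"
    confidence: float     # classifier confidence between 0 and 1


def is_reportable(event: CameraEvent) -> bool:
    """Fog-node rule: forward only events the cloud actually needs."""
    return event.label == "unlawful_activity" and event.confidence >= 0.8


def fog_node_filter(events: List[CameraEvent]) -> List[CameraEvent]:
    """Process raw camera events locally and keep only reportable ones."""
    return [e for e in events if is_reportable(e)]


if __name__ == "__main__":
    raw_events = [
        CameraEvent("cam-01", 1700000000.0, "vehicle", 0.95),
        CameraEvent("cam-01", 1700000005.0, "unlawful_activity", 0.91),
        CameraEvent("cam-02", 1700000010.0, "pedestrian", 0.70),
    ]
    to_cloud = fog_node_filter(raw_events)
    # Only the single reportable event would be uploaded; the rest stays local.
    print(f"{len(to_cloud)} of {len(raw_events)} events forwarded to the cloud")
```

The point of the sketch is simply that most raw data never leaves the local network, which is what reduces latency and bandwidth use.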
From a regulatory perspective, fog and cloud computing each have their own definitions and characteristics set by the regulating agency. NIST is a federal agency established to promote innovation and industrial competitiveness in the United States, and it provides measurements and standards for various technologies. NIST considered the fog paradigm in its Special Publication (SP) 500-325, defining fog computing as “a layered model for enabling ubiquitous access to a shared continuum of scalable computing resources.” Furthermore, NIST defines fog nodes as physical or virtual components that provide computing resources to devices. Fog nodes operate on a stand-alone basis or communicate amongst themselves in clusters to provide scalability. Additionally, NIST SP 800-145 defined cloud computing, a definition that FedRAMP adopted. The SP defines cloud computing as “a model for enabling ubiquitous, convenient, on-demand network access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications, and services) that can be rapidly provisioned and released with minimal management effort or service provider interaction.”
In summary, from a technical engineering perspective, fog computing uses a “decentralized” architecture, which can help minimize the amount of information uploaded to the cloud. From a regulatory perspective, fog and cloud computing have their own properties and definitions created by the technical advisory agency, NIST.
III. Security Concerns of Fog Computing and Its Role in the Future of the Internet of Things
A. Fog Concerns
Standard cloud security measures cannot be directly implemented in fog computing due to some unique fog features. Some features of the fog paradigm, like where the data is processed and handled, create different security and privacy issues in fog devices. The following section highlights some common security areas and explains why cloud security measures are inapplicable.
Authentication: Fog nodes are part of a more extensive fog network that requires each node to be authenticated before it joins the network. The usual cloud authentication models, like Public-Key Infrastructure (PKI), the standard framework for managing the digital certificates and keys used to secure communications, cannot be used with fog computing models because fog nodes establish a “trust” relationship with IoT devices and other fog nodes.
Communication: Fog node communication does not mirror standard network communication because fog is a distributed system. Fog nodes must therefore secure communications both among themselves and with IoT devices. Thus, it is challenging to implement existing communication security techniques in the fog computing environment.
Data Privacy: Because of the way fog nodes work, it is difficult to establish a centralized control system. As a result, just one compromised node can serve as an entry point for malicious actors into the network, undermining users’ data privacy. Lastly, due to the heavy communication load among fog nodes, there is a higher chance of privacy leakage.
Fog development: Fog computing has large potential, but the technology is still in the early stages of development. Furthermore, fog computing lacks a standard architecture, making it challenging to implement within an existing cloud paradigm.
B. Fog, Edge, Cloud Complex—Making “Internet of Things” a Possibility
The IoT is a network of devices, like appliances, vehicles, drones, and other “things,” that are connected to one another and to the Internet. Because IoT devices require large amounts of computational power, the current reliance on cloud computing alone is insufficient to ensure widespread adoption of the IoT. Fog computing is an essential component in the evolution of cloud infrastructure, helping complete computations outside of the cloud and make the IoT a reality. A more precise term for the interoperability of the different types of architecture required to enable the IoT is the Cloud to Things (C2T) continuum, which refers to the ability to share computational power among fog, edge, and cloud. There is an opportunity to integrate fog into the C2T. However, difficulties may arise in integrating these technologies because of differing software and hardware compatibilities (one set designed for the cloud, the other designed for fog). Furthermore, the C2T collects data far too complex and voluminous for manual management of the infrastructure, resulting in a need for more autonomous system management. Lastly, the shift in the paradigm is significant because it creates a need to connect many different devices, like IoT devices, nodes, and sensors, within the C2T architecture.
To summarize, not only do fog and cloud computing have different definitions, capabilities, technical properties, and security risks, but the problem is further complicated by the fact that fog computing will eventually become integrated within cloud solutions, creating a problem that the current NIST and FedRAMP regulatory framework might not be able to address.
IV. Implications of Cloud and Fog Procurement for the Federal Government and the Role of FedRAMP in the Growing New Market
Cloud technology has considerable potential, with global cloud computing revenue growing from $109 billion to $344 billion between 2015 and 2020. Federal agencies have consistently increased their spending on cloud technology, with spending projected to grow. Government procurement spending on cloud computing totaled $41.5 billion between the 2018 and 2022 fiscal years. The Department of Defense alone spent $3 billion on cloud computing contracts in 2022. The fog computing market is also expected to grow as a new critical technology, with its market value projected to increase from $80 million in 2023 to $2.34 billion by 2032. Finally, federal agencies’ use of FedRAMP authorizations increased by an average of sixty percent between 2019 and 2023.
Fog computing is becoming an essential part of modern technology. With the growth of data from IoT devices, cloud capabilities alone cannot address processing and computational needs. Fog computing addresses that problem by providing additional processing capabilities that offload cloud services and bring storage closer to the data’s origin. Some potential applications of fog computing include smart cars exchanging information with city centers to reduce traffic accidents, the Internet of Drones assisting human workers in disaster relief without a human drone operator, and built-in sensors administering patients’ drugs as their vitals change in real time. While fog computing is a distinct technology that works independently from cloud computing, the two often work in tandem, with fog technology taking the load off the cloud system.
Fog computing capabilities have caught the eye of some federal agencies, which have launched exploratory programs into the technology. For example, the Department of Defense (DoD) has issued statements about procuring fog technologies for experimental programs.
FedRAMP could serve a great purpose and help procure some of these newer technologies for the federal government, not just cloud products. However, FedRAMP is a “government-wide program that provides a standardized approach to security assessment, authorization, and continuous monitoring for cloud products and services.” The FedRAMP program was recently used to authorize and certify IoT edge devices before it had the necessary technical controls for procuring IoT, extending beyond its goal of authorizing cloud products. As such, fog computing could also be authorized through FedRAMP, but potentially without some necessary technical controls.
V. Legal Framework, the Big Overview: FISMA, FedRAMP, and NIST
A. FISMA
The Federal Information Security Management Act of 2002 (FISMA) requires federal agencies to create, document, and enforce a program that ensures the security of their information and systems, including those managed by other agencies, contractors, or outside sources. FISMA consists of two phases: (1) the development of specific security standards; and (2) compliance for the public and private sectors. FISMA tasked NIST with creating the Risk Management Framework (RMF) for federal information systems, which outlines three types of mandates: (1) the risk levels of information systems (low, moderate, high); (2) the types of systems in each category; and (3) the types of controls or technical requirements that information systems must implement to achieve the necessary security level. The RMF strongly emphasizes reliance on a continuous management process to ensure that risk is adequately managed and processed.
NIST developed Federal Information Processing Standard (FIPS) 199 to address the first and second RMF mandates and FIPS 200 to address the third. NIST Special Publication 800-53, as required by FIPS 200, defines the specific security controls that service providers must implement.
FISMA directed NIST to provide standards and guidelines allowing agencies to choose various solutions and approaches to address security risks. Such flexible security solutions are necessary to ensure that agencies tailor their security programs to their needs while complying with the law.
Lastly, FISMA requires the Office of Management and Budget (OMB) to oversee the implementation of guidelines and standards on information security systems, which also applies to contractors providing services on behalf of the government, such as cloud computing. As a result of this requirement, FISMA also mandates OMB oversight of the FedRAMP program.
B. FedRAMP
The Federal Risk and Authorization Management Program (FedRAMP) is a federal program that creates a standard approach to security assessment for cloud computing, authorizes federal agencies to use these services, and mandates continuous monitoring of those services. FedRAMP was codified in statute through the FedRAMP Authorization Act, part of the FY23 National Defense Authorization Act (NDAA). The FedRAMP program “promotes the adoption of secure cloud services across the federal government by providing a standardized approach to security assessment, authorization, and continuous monitoring for cloud products and services.” Cloud Service Providers (CSPs) that want to do business with the U.S. government must be authorized through the FedRAMP program to ensure compliance with FISMA. This allows the use of certified cloud-based technologies across the federal government. All federal agencies must meet FedRAMP requirements when they use cloud services, and the providers of those services must comply with FedRAMP security requirements.
C. NIST and Its Role in FedRAMP
NIST is a federal agency that establishes standards and guidelines to define and describe the technical properties of new and vital technologies. NIST serves as a technical advisor for the FedRAMP program, and the program is based on NIST’s special publications. When NIST releases updates, FedRAMP must periodically update its program to incorporate those publications. NIST SP 800-53 describes the specific security controls that cloud service providers must have, and NIST provides recommendations to FedRAMP on which controls should be included in the program and how to apply them. FedRAMP drew its assessment criteria and controls from NIST SP 800-53. The latest version of FedRAMP has a total of 323 controls in its moderate baseline and 156 controls in its low baseline, which are based on the NIST SP 800-53 special publication. These controls are organized into twenty families of security controls, with each individual control “involv[ing] aspects of policy, oversight, supervision, manual processes, and automated mechanisms that are implemented by systems or actions by individuals.”
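For readers who want a concrete picture of what a control baseline is, the following simplified sketch represents a handful of SP 800-53-style controls grouped by family and counts how many fall into hypothetical low and moderate baselines. The control identifiers and baseline assignments are illustrative examples only; they do not reproduce the actual FedRAMP baselines or the NIST control catalog.

```python
# Illustrative sketch of FedRAMP-style baselines as selections of NIST SP 800-53
# controls grouped by family. The sample data below is simplified and hypothetical.

from collections import defaultdict

# (control_id, family, in_low_baseline, in_moderate_baseline) -- sample entries only
SAMPLE_CONTROLS = [
    ("AC-2", "Access Control", True, True),
    ("AC-6", "Access Control", False, True),
    ("AU-2", "Audit and Accountability", True, True),
    ("CM-8", "Configuration Management", True, True),
    ("IR-4", "Incident Response", False, True),
]


def baseline_counts(controls):
    """Count how many sample controls each baseline selects, per control family."""
    counts = defaultdict(lambda: {"low": 0, "moderate": 0})
    for control_id, family, in_low, in_moderate in controls:
        if in_low:
            counts[family]["low"] += 1
        if in_moderate:
            counts[family]["moderate"] += 1
    return counts


if __name__ == "__main__":
    for family, tally in baseline_counts(SAMPLE_CONTROLS).items():
        print(f"{family}: low={tally['low']}, moderate={tally['moderate']}")
```

The sketch is only meant to show that a baseline is a selection over a larger catalog: the moderate baseline selects more controls than the low baseline, which is why the two figures (323 and 156) differ.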
Furthermore, the FedRAMP continuous monitoring component is based on NIST Special Publication 800-137, titled Information Security Continuous Monitoring for Federal Information Systems and Organizations. The overall Risk Management Framework was documented by NIST in Special Publication 800-37. NIST serves as a technical advisor in implementing NIST SP 800-37 and NIST SP 800-137, a publication specific to federal information systems, which cloud service providers follow. The FedRAMP Continuous Monitoring Guide recommends that agencies monitor and assess security risks for cloud service providers on an ongoing basis. According to the guide, if agencies identify issues with their cloud service providers, they implement escalation levels to determine how to remedy deficiencies. The continuous monitoring framework ensures that security systems operate in the intended way.
VI. Issues with the Existing Regulatory Framework: Current Problems with FedRAMP and Application to Fog Computing
Two main issues arise at the intersection of current technological development and the existing FedRAMP/NIST legal framework: (1) problems with NIST technical requirements; and (2) existing FedRAMP issues, which make it challenging to implement new technology.
A. NIST Complications
NIST regulation of the cloud sector is rooted in a risk-based approach. The NIST SP 800-37 RMF, used in cloud regulation, provides a framework for securing technology. It suggests two methods for initial control selection: a baseline-controls approach, available through SP 800-53B, or an organization-generated controls approach. The RMF acknowledges that its controls can be applied to different types of technologies; because it is technology-neutral, it does not require adjusting the process to accommodate specific technologies.
First, NIST has separate frameworks for defining cloud and fog technologies. Unlike cloud computing, for which NIST developed SP 800-53, fog computing has no NIST security controls framework. Because of the way fog computing works, it presents a different set of vulnerabilities from those presented by cloud computing. Fog computing, unlike cloud computing, also has no established guidelines on cyber-threat mitigation. This creates a regulatory gap, as NIST has yet to develop any controls for the emerging fog technology.
The issue is compounded for NIST because it currently regulates the cloud using specific controls that are likely unfit for fog computing. Fog computing is a developing technology that would improve the use and efficiency of the IoT-to-cloud solution by processing information close to the data source. While fog has large potential, it is still in the early research and implementation stages. Additionally, challenges exist when implementing fog into the existing cloud paradigm due to the lack of a standard architecture for fog computing. Given the early stage of this technology’s development and the lack of a standardized fog architecture, a more flexible regulatory approach is necessary.
B. FedRAMP Issues
Certification: Both components of the FedRAMP program—certification and ongoing monitoring—have problems. The lack of certain common technical controls in the FedRAMP program leads to security vulnerabilities in the cloud. Furthermore, some agencies struggle to apply cloud definitions to the program. FedRAMP enables agencies to choose the requisite security controls for their cloud solutions, yet this flexibility leads to a lack of standardized contract terms, diminishing the advantages of “do once, use many times” and potentially compromising the security of these solutions. In addition, some agencies do not consistently implement all the elements mandated by FedRAMP authorization, making the program less effective. Lastly, implementation and authorization through the FedRAMP process is time-consuming and resource-intensive, posing a “major or moderate challenge” to both agencies and service providers.
Ongoing monitoring issues: The second component of the FedRAMP program, ongoing monitoring, is also plagued with difficulties. There have been instances of failure in implementing the continuous monitoring portion of FedRAMP. There are also instances in which federal agencies fail to include any monitoring language in their contractual agreements. Ongoing monitoring also requires a high level of technical expertise from the government to ensure compliance, which is not always easy to obtain. Furthermore, CSPs do not have a strong incentive to disclose noncompliance, affecting the government’s ability to review whether the implemented controls prevent risks and to revise them as needed.
Other problems: FedRAMP has two other important issues that slow down the technology adoption process: compliance workload and interagency friction. To sustain FedRAMP compliance, CSPs must use manual systems and document over a thousand pages of work product. Additionally, agencies that try to rely on an existing authorization do not always trust the specific third-party assessment organizations used in the authorization process.
VII. The Solution to FedRAMP Deficiencies for Edge and Fog Computing
Fog computing has not yet matured. Additionally, the complexity of the legal landscape and the difficulty of FedRAMP implementation and use raise questions about how to incorporate fog computing into the FedRAMP framework. Finally, the issues with monitoring are also prompting a rethink of FedRAMP. There are three primary responses to address gaps created by rapidly changing technological advancements: “extend extant laws,” “create new law,” and “reassessing the legal regime.” Each one has its drawbacks and advantages.
The “extending extant laws” approach is used when it is possible to extend an existing regulatory framework or case law to the new technology. “Creating new law” means creating new rules, and this method is appropriate when a technology issue does not fit into existing regulatory frameworks. In extreme circumstances, reassessing the legal regime is necessary; it mostly happens when an assumption upon which the laws were created is no longer reliable. Due to the difficulties of FedRAMP as it applies to cloud computing, and because fog computing changes what was assumed about cloud computing (at a minimum, fog is the missing puzzle piece that enables the IoT by offloading computing from the cloud), reassessing the legal regime is necessary to address these existing issues.
A. Reassessing the Legal Regime Is the Solution to Fog Computing Procurement
Reassessing the legal regime is appropriate when there is a complete gap in the regulatory regime, meaning the regulation relies on a mistaken assumption about a technology. A need to change the regime often arises when such a gap leaves the issue unaddressed. Additionally, legal regimes can be reassessed following a “tech-fostered social change.”
In the case of FedRAMP, there is an incorrect assumption that cloud computing is a fully developed technology, leaving fog computing subject to regulatory gaps. As described in the “Fog, Edge, Cloud Complex” section, fog computing is not completely developed, and cloud computing will further evolve to incorporate elements of fog and edge computing. FedRAMP was created on the assumption that it would procure cloud services alone but has now been extended to the IoT.
B. Using Machine Learning and IoT Regulation as a Guiding Light
For two reasons, the existing regulatory framework for machine learning provides a good model for regulating developing technologies, especially fog computing. First, machine learning, like fog computing, has a heterogeneous core, meaning the technology has many potential uses. Second, machine learning is a powerful tool within fog computing technology; it can help offload processing from IoT devices.
Two implementation methods are standard when reassessing legal regimes: shifting regulatory powers from one institution to another or creating new institutions. Where regulatory gaps exist within the current regime, new agencies are not required, and the problem can usually be addressed by assigning the new problems to existing regulators. The biggest concern with regulating machine learning is building up the regulating agency’s technical capabilities, through human capital and experienced data science professionals, to ensure that the regulator can keep up with the pace of the technology. Fog computing, as a developing technology, might pose the same challenge for its regulator.
Because machine learning has many types of applications, regulating it calls not for a “one-size-fits-all” solution, but rather for an “agility, flexibility, and vigilance” framework. Agility in the context of machine learning means that regulators need to participate actively in the regulation process. This requires regulators to update regulations constantly and incrementally while building the capacity to understand evolving industries and technology. Flexibility means that the laws must accommodate diverging applications. Flexible rules can either focus on the outcomes the regulator is trying to achieve, rather than on specific standards-based regulations, or require the regulator to focus on risks identified and evaluated by the firm. The latter approach is known as management-based regulation. Lastly, vigilance asks that regulators maintain rigorous oversight to ensure that firms’ engagement with the regulations does not atrophy, meaning that firms continue their evaluation processes rather than merely filling out paperwork. Vigilance requires regulators to actively assess the industry’s management efforts and to have the technical capacity to do so.
The key to regulating machine learning lies in management-based regulation. Under a management-based approach, the regulator requires firms to engage in internal evaluation of their systems, assess potential issues, and create “internal responses to correct them.” Regulators under this approach also need not have the same level of knowledge as the firms they regulate.
The management-based approach is dynamic in nature and requires firms to engage in “systemic managerial activities” to identify issues. Usually, this approach requires firms to “develop [] management plan[s], monitor for potential risks, produce internal procedures and training to address [the identified] risks, and to maintain documentation” and, depending on the regulations, subjects firms to third-party auditing. Regulators, in turn, have to be actively involved with the industry and gather information from it in order to adopt these newer, flexible approaches to regulation.
Furthermore, regulators must have the capacity to assess how well firms identify and manage the established risks and comply with the regulations. Regulators also need to consider whether the newest innovations are worth the risk they may pose to the public, meaning that procuring the newest technologies with unknown risks might be against the government’s interest.
C. Implementation of a Pure-Management-Based Approach for FedRAMP: Creating a New Framework
Implementing the new management-based FedRAMP framework is a multistep process involving NIST, GSA, industry, a new FAR clause, and implementing agencies.
FedRAMP should transition to a purely management-based approach to procure fog technology. FedRAMP is currently a mix of performance-based and management-based regulation: the certification portion, which focuses on specific technical controls, corresponds to the performance-based aspect, while ongoing monitoring supplies the management-based portion. Like machine learning, fog computing has many applications, making it impossible to regulate every single aspect of the technology. Under the new, fully management-based approach, FedRAMP must assess, audit, and document how fog and cloud providers work. This approach does not require GSA or implementing federal agencies to have the same knowledge of the technology as the companies in order to measure all the outcomes. As such, the program’s assessment should focus on evaluating the compliance program and imposing stricter disclosure requirements on firms so that, once a vulnerability becomes evident, the firms have an incentive to manage it.
1. NIST’s Role
NIST should develop a new framework and establish a robust compliance regime managed by the GSA. To comply with FISMA, NIST must create requirements that fog technology needs to meet, replacing the management component of SP 800-53, its specific controls, and SP 800-37. First, NIST would need to release a single management-based special publication, as has been recommended for other evolving technologies like artificial intelligence (AI). The new NIST publication, similar to the existing NIST AI publication, needs to articulate, with specificity and clarity, the types of systems and processes that fog computing firms must have in place, and to create an evaluation for the regulator (GSA or agencies) of the management component of the program, ensuring that any deviations are caught, evaluated, and corrected, if necessary. The new NIST framework should focus on managing risk and identifying issues, instead of relying on the technical controls established in SP 800-53. Furthermore, the standard should incorporate third-party auditors as active participants in the qualification process and ongoing monitoring to accommodate this developing technology while the government builds an internal knowledge base and the capability to understand fog. Lastly, NIST should pay particular attention to detail and describe as precisely as possible the type of evaluation and the particular program elements that companies must have in place.
In addition, given the current issues with the amount of compliance work that cloud service providers must perform, NIST should assist in developing and advising on an automated monitoring program. Some sources recommend solutions like the Continuous Diagnostics and Mitigation (CDM) program for a more automated approach to managing cybersecurity risks. The CDM program enables continuous monitoring by integrating dashboards into the solution, which provide near-real-time assessment of the government’s cloud environment.
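As a rough illustration of what such automated monitoring entails, the sketch below runs a few automated checks against a hypothetical snapshot of a cloud environment and produces pass/fail results of the kind a dashboard could display in near real time. The check names, thresholds, and data shapes are invented for illustration and do not reflect the CDM program’s actual interfaces or data model.

```python
# A simplified sketch of an automated, near-real-time monitoring feed of the kind a
# CDM-style dashboard relies on. All check names and thresholds are hypothetical.

from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Callable, Dict, List


@dataclass
class MonitoringResult:
    check_name: str
    passed: bool
    checked_at: str


def run_checks(system_state: Dict, checks: Dict[str, Callable[[Dict], bool]]) -> List[MonitoringResult]:
    """Evaluate each automated check against the current system state snapshot."""
    now = datetime.now(timezone.utc).isoformat()
    return [
        MonitoringResult(name, check(system_state), now)
        for name, check in checks.items()
    ]


if __name__ == "__main__":
    # Hypothetical snapshot of a cloud environment's security posture.
    state = {"unpatched_hosts": 3, "days_since_last_scan": 12, "mfa_enabled": True}

    checks = {
        "patching-current": lambda s: s["unpatched_hosts"] == 0,
        "scan-within-30-days": lambda s: s["days_since_last_scan"] <= 30,
        "mfa-enforced": lambda s: s["mfa_enabled"],
    }

    for result in run_checks(state, checks):
        status = "PASS" if result.passed else "FAIL"
        print(f"[{status}] {result.check_name} at {result.checked_at}")
```

The value of automating checks like these is that compliance evidence is generated continuously by the system itself rather than assembled manually into lengthy documentation.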
2. GSA’s Role
GSA would need to take a multistep approach. First, the GSA must collaborate with NIST on the new special publication and incorporate it into FedRAMP, similar to how the program functions now. Second, the GSA must create a document describing the contractor’s compliance requirements and ongoing authorization procedures and conduct periodic spot checks of compliance. Third, the GSA should maintain a list of authorized third-party assessment organizations (3PAOs) that help perform initial and periodic assessments of cloud systems, similar to how the program functions now. These third-party assessment organizations are accredited companies that examine whether cloud service providers comply with FedRAMP as part of continuous monitoring efforts. However, an updated list of independent third-party organizations should be considered to include organizations that end-use agencies trust. This could be achieved through an inquiry conducted by the GSA to authorize and establish a revised list of vendors. Lastly, through the FedRAMP program, GSA should randomly select companies for testing by third-party assessment organizations throughout the fiscal year to ensure ongoing compliance with the requirements and to make informed, risk-based decisions about cloud services.
The certification component of the program should remain in place, but with some modifications. While the annual assessment should be retained and conducted by the authorizing agency, GSA should establish and incorporate metrics to ensure consistent evaluations, especially since a fully management-based approach might produce variation among solutions. As under the existing FedRAMP program, contractors must submit monthly reporting summaries of any program vulnerabilities and track those vulnerabilities.
Lastly, the GSA should implement and oversee the new automated monitoring program. The biggest compliance issue faced by contractors is the cost and time involved in a manually driven program. A House bill proposed in the past would have tasked the GSA with automating the compliance aspect of the FedRAMP program. Automation would greatly benefit the program, increasing efficiency and reducing compliance burdens on companies.
3. Additional Hooks to Ensure Compliance: False Claims Act
To ensure continuous compliance with the new FedRAMP program, contractors must be incentivized to maintain their internal compliance programs. Along with all of the suggested program changes, the new FedRAMP should mandate compliance certification, adding a hook to False Claims Act (FCA) liability in cases of noncompliance. The FCA provides a strong incentive for contractors to honor their certifications of compliance because it imposes monetary liability on companies and criminal liability on individuals. FCA liability in cybersecurity matters is consistent with the existing regulatory regime, since the Act has been used with increasing frequency in cybersecurity cases, and it would ensure that contractors engage with evaluation and compliance. Once companies have completed the authorization, their contracts with the federal agencies they serve should incorporate a standardized FAR provision outlining FCA liability for violators.
The new program will undoubtedly burden the industry. However, the new management-based FedRAMP program will likely encourage the industry to adapt by establishing compliance teams, tracking technology issues, and implementing regular internal assessments to ensure compliance with the new requirements. The particular contractor mandates and the specific actions a contractor must take could be incorporated directly into the contract through a standardized FAR clause once certified contractors are deemed compliant with the regulation.
4. Implementing Agencies
Because agencies using FedRAMP may have issues complying with FedRAMP monitoring practices, the GAO recommended that OMB establish a monitoring process and hold agencies accountable for authorizing cloud services through FedRAMP. Others have recommended providing more specific core terms to support the program’s execution while remaining focused on its principles. These terms could be incorporated directly into the contractual agreements that agencies have with cloud service providers.
This new regulation would be within OMB’s powers as authorized by FISMA because, under FISMA, the Director is to “revise or repeal operational directives that are not in accordance with the director’s policies.” The funds and yearly program needs would similarly be authorized annually by the NDAA, as is done now under the FedRAMP Authorization Act. A bill would be the most likely vehicle for implementing the comprehensive changes described above. It would need to include the language of the new FAR clauses to be incorporated into contracts. Additionally, the bill should introduce a new FAR clause for mandatory disclosures, a mandate to automate the FedRAMP monitoring process, and a requirement that the GSA establish a new list of third-party assessment organizations.
D. Alternative Methods of Regulation and Why They Are Not Optimal
Implementing a management-based approach aligns with the evolving technology and provides the flexibility the government needs when employing cybersecurity regulations. Unlike a specific controls-based approach, a management-based approach would provide more flexibility, helping ensure that the federal government can procure the most advanced technology and implement changes incrementally, without having to wait and regulate every new possible combination of cloud-edge-fog computing.
The management-based approach has myriad benefits. It supplies what the risk-based approach fails to provide for FedRAMP—continuous oversight and regulation across many domains—and allows for a flexible approach to regulation, an excellent fit for FedRAMP, which currently struggles with a lack of oversight.
There are other alternative approaches; however, they do not address the existing technology-pacing issues within FedRAMP. One possible solution would be to keep the existing FedRAMP framework and extend it to fog computing by creating a NIST special publication with technical controls specifically for fog computing. This approach would be the simplest, but it would likely not work because of the architectural differences between fog and cloud computing. Furthermore, this framework would still have pacing issues because implementing NIST-developed technical controls into the FedRAMP program takes time, likely rendering the controls ineffective as fog and cloud computing infrastructures continue to develop. Lastly, because fog is a developing technology, even if NIST were to issue a special publication, it might still fail to capture developing fog risks.
An additional alternative approach would be to create a new law with a tech-specific approach. However, this approach also does not address the significant problems with the existing framework. Tech-specific laws are more effective in preventing actors from evading regulations because they provide clear, specific rules for compliance. Additionally, tech-specific laws have a shorter lifespan, which means that when the technology evolves, the law automatically stops applying to the evolved version of the technology. The tech-specific approach has a few drawbacks: tech-specific laws “increase avoision,” and they might be overinclusive, especially when used in the early days of a new technology. In the case of FedRAMP and fog and edge procurement, there are concerns regarding the new technology: fog computing is still in development, and very rigid and rigorous regulations would be ineffective.
VIII. Conclusion
FedRAMP, as it stands, is not equipped to deal with the rapid development of technology, specifically fog computing. Technology is advancing quickly, and oversight presents a significant challenge; the current system is insufficient to address future needs. FedRAMP should apply to fog computing, but the program needs to change. Pure management-based regulation can make the program flexible and adaptable to the federal government’s changing needs.