Our products, especially the ASRs, collect, store and may analyze certain types of personal or identifying information regarding individuals who interact with the ASRs. The regulatory framework for privacy and security issues is rapidly evolving worldwide and is likely to remain uncertain for the foreseeable future. Federal and state government bodies and agencies have in the past adopted, and may in the future adopt, laws and regulations affecting data privacy, which in turn affect the breadth and type of features that we can offer to our clients. In addition, our clients have separate internal policies, procedures and controls regarding privacy and data security with which we may be required to comply. Because the interpretation and application of many privacy and data protection laws are uncertain, it is possible that these laws may be interpreted or applied in a manner that is inconsistent with our current data management practices or the features of our products. If so, in addition to the possibility of fines, lawsuits and other claims and penalties, we could be required to fundamentally change our business activities and practices or modify our products, which could have an adverse effect on our business. Additionally, we may become a target of information-focused or data collection attacks, and any inability to adequately address privacy and security concerns, even if unfounded, or to comply with applicable privacy and data security laws, regulations, and policies, could result in additional cost and liability to us, damage our reputation, inhibit sales, and adversely affect our business. Furthermore, the costs of compliance with, and other burdens imposed by, the laws, regulations, and policies that are applicable to the businesses of our clients may limit the use and adoption of, and reduce the overall demand for, our products. Privacy and data security concerns, whether valid or not, may inhibit market adoption of our products, particularly in certain industries and foreign countries. If we are not able to adjust to changing laws and regulations, our business may be harmed.
As our operations and business grow, we may become subject to or affected by new or additional data protection laws and regulations and face increased scrutiny or attention from regulatory authorities. In the United States, certain states have also adopted privacy and security laws and regulations that govern the privacy, processing and protection of health-related and other personal information. For example, the California Consumer Privacy Act, as amended by the California Privacy Rights Act (collectively, the "CCPA"), requires covered businesses that process the personal information of California residents to, among other things: provide certain disclosures to California residents regarding the business's collection, use, and disclosure of their personal information; receive and respond to requests from California residents to access, delete, and correct their personal information, or to opt out of certain disclosures of their personal information; and enter into specific contractual provisions with service providers that process California resident personal information on the business's behalf. Similar laws have been passed in other states and continue to be proposed at the state and federal level, reflecting a trend toward more stringent privacy legislation in the United States. The enactment of such laws could impose potentially conflicting requirements that would make compliance challenging.
Furthermore, the Federal Trade Commission ("FTC") also has authority to initiate enforcement actions against entities that make deceptive statements about privacy and data sharing in privacy policies, fail to limit third-party use of personal information, fail to implement policies to protect personal information or engage in other unfair practices that harm customers or that may violate Section 5(a) of the Federal Trade Commission Act (the "FTC Act"). Failing to take appropriate steps to keep consumers' personal information secure can constitute an unfair act or practice in or affecting commerce in violation of Section 5(a) of the FTC Act. The FTC expects a company's data security measures to be reasonable and appropriate in light of the sensitivity and volume of consumer information it holds, the size and complexity of its business, and the cost of available tools to improve security and reduce vulnerabilities. Additionally, federal and state consumer protection laws are increasingly being applied by the FTC and state attorneys general to regulate the collection, use, storage, and disclosure of personal or personally identifiable information, through websites or otherwise, and to regulate the presentation of website content.
We are also or may become subject to rapidly evolving data protection laws, rules and regulations in foreign jurisdictions. For example, in Europe, the European Union General Data Protection Regulation (the "GDPR") went into effect in May 2018 and imposes strict requirements for processing the personal data of individuals within the European Economic Area ("EEA") or in the context of our activities within the EEA. Companies that must comply with the GDPR face increased compliance obligations and risk, including more robust regulatory enforcement of data protection requirements and potential fines for noncompliance of up to €20 million or 4% of the annual global revenues of the noncompliant company, whichever is greater. Among other requirements, the GDPR regulates transfers of personal data subject to the GDPR to third countries that have not been found to provide adequate protection to such personal data, including the United States, and the efficacy and longevity of current transfer mechanisms between the EEA and the United States remain uncertain. Case law from the Court of Justice of the European Union ("CJEU") states that reliance on the standard contractual clauses, a standard form of contract approved by the European Commission as an adequate personal data transfer mechanism, alone may not necessarily be sufficient in all circumstances and that transfers must be assessed on a case-by-case basis. On July 10, 2023, the European Commission adopted its Adequacy Decision in relation to the new EU-US Data Privacy Framework ("DPF"), rendering the DPF effective as a GDPR transfer mechanism for transfers to U.S. entities self-certified under the DPF. We expect the existing legal complexity and uncertainty regarding international personal data transfers to continue. In particular, we expect the DPF Adequacy Decision to be challenged, and international transfers to the United States, and to other jurisdictions more generally, to continue to be subject to enhanced scrutiny by regulators.
Since the beginning of 2021, following the end of the transition period after the UK's departure from the European Union, we have also been subject to the United Kingdom General Data Protection Regulation and the Data Protection Act 2018 (collectively, the "UK GDPR"), which imposes obligations separate from but similar to those under the GDPR, with comparable penalties, including fines of up to £17.5 million or 4% of a noncompliant company's global annual revenue for the preceding financial year, whichever is greater. On October 12, 2023, the UK Extension to the DPF came into effect (as approved by the UK Government) as a data transfer mechanism for transfers from the UK to U.S. entities self-certified under the DPF. As we continue to expand into other foreign countries and jurisdictions, we may become subject to additional laws and regulations that may affect how we conduct business.
The regulatory framework for AI is rapidly evolving as many federal, state and foreign government bodies and agencies have introduced or are currently considering additional laws and regulations. Additionally, existing laws and regulations may be interpreted in ways that affect the operation of our AI technologies. As a result, implementation standards and enforcement practices are likely to remain uncertain for the foreseeable future; we cannot yet determine the impact that future laws, regulations, standards, or market perceptions of their requirements may have on our business, and we may not always be able to anticipate how to respond to these laws or regulations.
Already, certain existing legal regimes (e.g., those relating to data privacy) regulate certain aspects of AI, and new laws regulating AI are expected to enter into force in the United States and the EU in 2024. In the United States, the Biden administration issued a broad Executive Order on the Safe, Secure, and Trustworthy Development and Use of Artificial Intelligence (the "2023 AI Order"), which sets out principles intended to guide AI design and deployment for the public and private sectors and signals increased governmental involvement in and regulation of AI. The 2023 AI Order established certain new requirements for the training, testing and cybersecurity of sophisticated AI models and large-scale compute centers used to train AI models. The 2023 AI Order also instructed several other federal agencies to promulgate additional regulations regarding the use and development of AI within specific timeframes from the date of the 2023 AI Order. Already, agencies such as the Department of Commerce and the FTC have issued proposed rules governing the use and development of AI. Legislation related to AI has also been introduced at the federal level and is advancing at the state level. For example, the California Privacy Protection Agency is currently in the process of finalizing regulations under the CCPA regarding the use of automated decision-making. Such additional regulations may impact our ability to develop, use and commercialize AI technologies in the future.
In Europe, on March 13, 2024, the European Parliament passed the EU Artificial Intelligence Act ("EU AI Act"), which establishes a comprehensive, risk-based governance framework for artificial intelligence in the EU market. The EU AI Act will enter into force twenty days after its publication in the Official Journal of the EU and will be fully effective two years later. The EU AI Act will apply to companies that develop, use and/or provide AI in the EU; includes requirements around transparency, conformity assessments and monitoring, risk assessments, human oversight, security, accuracy, and general purpose AI and foundation models; and provides for fines for breach of up to 7% of worldwide annual turnover. In addition, on September 28, 2022, the European Commission proposed two Directives seeking to establish a harmonized civil liability regime for AI in the EU (the "Liability Directives"), in order to facilitate civil claims in respect of harm caused by AI and to include AI-enabled products within the scope of the EU's existing strict product liability regime. Once fully applicable, the EU AI Act and the Liability Directives will have a material impact on the way AI is regulated in the EU. Further, in Europe we are subject to the GDPR, which regulates our use of personal data for automated decision-making, including individual profiling, that results in a legal or similarly significant effect on an individual, and provides rights to individuals in respect of that automated decision-making. Recent case law from the CJEU has taken an expansive view of the scope of the GDPR's requirements around automated decision-making and introduced uncertainty in the interpretation of these rules. Specifically, the CJEU has expanded the scope of automated decision-making under the GDPR by finding that automated decision-making activities can fall within the GDPR's restrictions on those activities even if the decision producing the required legal or similarly significant effect for the individual is carried out by a third party. The EU AI Act and the developing interpretation and application of the GDPR in respect of automated decision-making, together with developing guidance and/or decisions in this area, may affect our use of AI and our ability to provide, improve or commercialize our services, require additional compliance measures and changes to our operations and processes, result in increased compliance costs and potential increases in civil claims against us, and could adversely affect our business, operations and financial condition.
It is possible that further new laws and regulations will be adopted in the United States and in other non-U.S. jurisdictions, or that existing laws and regulations, including competition and antitrust laws, may be interpreted in ways that would limit our ability to use AI for our business, or that would require us to change the way we use AI in a manner that negatively affects the performance of our products, services, and business. We may need to expend resources to adjust our products or services in certain jurisdictions if applicable laws, regulations, or decisions are not consistent across jurisdictions. Further, the cost to comply with such laws, regulations, or decisions, and/or guidance interpreting existing laws, could be significant and would increase our operating expenses (such as by imposing additional reporting obligations regarding our use of AI). Such an increase in operating expenses, as well as any actual or perceived failure to comply with such laws and regulations, could adversely affect our business, financial condition and results of operations.
Although we work to comply with applicable laws, regulations and standards, as well as our contractual and other legal obligations, these requirements are evolving and may be modified, interpreted and applied in an inconsistent manner from one jurisdiction to another, and may conflict with one another or with other legal obligations with which we must comply. Any failure or perceived failure by us or our employees, representatives, contractors, consultants, collaborators, or other third parties to comply with such requirements or to adequately address privacy and security concerns, even if unfounded, could result in additional cost and liability to us, damage our reputation, and adversely affect our business and results of operations.