MoFo’s State + Local Government Enforcement Newsletter
Morrison Foerster’s State and Local Government Enforcement team is pleased to provide our bimonthly newsletter summarizing some of the most important and interesting developments from state attorneys general (“State AGs”) across the country, as well as from local government agencies and legislative bodies, with links to primary resources. This month’s topics include the following:
A coalition of 40 State AGs[1] sent a letter to Congressional leaders opposing a proposed 10-year ban on states enforcing any state law or regulation addressing artificial intelligence (“AI”). The ban was included by the House Energy and Commerce Committee as an amendment to the budget reconciliation bill. The State AGs argue that the 10-year moratorium “would be sweeping and wholly destructive of reasonable state efforts to prevent known harms associated with AI.” Specifically, they argue that the ban would eradicate any state-level framework already in place, without any regulatory scheme to replace or supplement such framework. The State AGs allege that the bill would directly harm consumers and prevent State AGs from “fulfilling their mandate to protect consumers.”
The letter also outlines legislation that states have passed to address a number of harms and risks associated with AI. It highlights states that have enacted laws designed to (1) protect against AI-generated explicit material, (2) combat deepfakes, (3) curb spam phone calls and texts, (4) impose AI disclosure requirements, and (5) protect individuals’ identities in AI-generated content. The State AGs note that these laws and regulations have been developed over years of careful consideration and input from consumers, industry, and advocates. They also warn that unforeseen risks associated with the technology are likely to arise in the coming years. Finally, the State AGs stress that, in the face of Congressional inaction, states are likely to be the forum for addressing the real-world harms of AI.
The proposed moratorium faces further procedural hurdles before it officially becomes part of the budget bill, and it remains unclear whether it ultimately will be enacted. Even if enacted, it likely would be subject to a number of constitutional challenges. But the federal effort to limit the states’ role with regard to AI reflects a desire to allow companies to accelerate their use of AI without having to navigate a patchwork of different state regulations. Some lawmakers have suggested that the moratorium would give Congress time and space to craft a federal framework to regulate AI. In the meantime, however, businesses should continue complying with state regulations and expect State AG enforcement in this area to remain a priority.
As enforcement by the Consumer Financial Protection Bureau (“CFPB”) faces uncertainty, State AGs are stepping up to fill the perceived enforcement gap in consumer financial protection. One way State AGs have begun to take action is by seeking to expand and strengthen their respective states’ consumer protection laws.
For example, on March 13, 2025, New York State AG (“NYAG”) Letitia James proposed the Fostering Affordability and Integrity through Reasonable Business Practices Act, or FAIR Business Practices Act (“FAIR Act”). The FAIR Act would authorize the NYAG to seek civil penalties and restitution against businesses for a wide array of scams, including AI-based schemes, online phishing scams, hard-to-cancel subscriptions, junk fees, and data breaches. The FAIR Act would close loopholes that the NYAG alleges make it “too easy for New Yorkers to be scammed.”
Similarly, on April 16, 2025, Michigan AG Dana Nessel urged members of the Michigan House Judiciary Committee to strengthen the state’s consumer protection laws following a setback in her efforts to challenge the courts’ interpretation of those laws. Michigan AG Nessel argued that courts have interpreted the safe-harbor provision of the Michigan Consumer Protection Act (“MCPA”) so broadly as to essentially exempt regulated industries and licensed professionals.[2]
Currently, a case involving an investigation into the insulin prices of a large pharmaceutical company is pending before the Michigan Supreme Court. The Court held oral arguments last fall on whether the pharmaceutical company’s sale of insulin was exempt from the MCPA. The Court recently requested a second round of oral arguments, which prompted AG Nessel to seek a legislative solution. Her office later testified in support of Senate Bill 134, which would authorize the Michigan AG to investigate regulated businesses.
As State AGs look to expand their jurisdictional powers to address practices that they deem harmful to consumers, it remains critical for companies to monitor consumer complaints and maintain compliance with state laws.
On April 16, 2025, a district court judge in the Eastern District of Kentucky granted a 60-day stay in a case brought by 18 Republican State AGs challenging the U.S. Securities and Exchange Commission’s (“SEC”) crypto enforcement strategy. The State AGs allege that “the SEC has sought to unilaterally wrest regulatory authority away from the States through an ongoing series of enforcement actions targeting the digital asset industry” without Congressional authorization. The SEC sought the stay, arguing that the work of its newly created Crypto Task Force on digital assets “could facilitate the potential resolution of this case.” The coalition of State AGs consented to the stay.
The SEC has filed similar requests in other crypto-focused cases, reflecting the agency’s shift away from aggressive crypto enforcement toward guidance, rulemaking, and deregulation. Earlier this year, Acting SEC Chairman Mark Uyeda established the Crypto Task Force, led by Commissioner Hester Peirce, to develop “a comprehensive and clear regulatory framework for crypto assets.” The Crypto Task Force’s stated goal is to provide a regulatory framework applicable to the crypto industry within the existing “statutory framework provided by Congress” and any future framework that may evolve through Congress.
Democratic State AGs, on the other hand, continue to regulate cryptocurrency. For instance, in January 2025, the NYAG filed a lawsuit to recover $2.2 million worth of cryptocurrency held in digital wallets and stolen in a remote job scam. In April 2025, after the U.S. Department of Justice announced the dismantling of federal criminal cryptocurrency fraud enforcement efforts, NYAG James called for federal legislation to strengthen regulation of cryptocurrencies and digital assets. Similarly, last year, the California Department of Financial Protection and Innovation reached a $1.5 million settlement with a cryptocurrency platform that it alleged had engaged in the unqualified offer and sale of securities via its cryptocurrency interest-earning program. Although cryptocurrency companies may face less scrutiny from the federal government going forward, companies should continue to monitor and comply with state law to avoid potential enforcement actions at the state level.
More than a year ago, on March 6, 2024, the SEC issued rules requiring public companies to provide climate-related disclosures in their registration statements and annual reports. These climate disclosure rules mandated that companies disclose their direct and indirect greenhouse gas emissions. Stakeholders immediately filed petitions seeking an administrative stay pending judicial review of the rules; the petitions were consolidated in the Eighth Circuit, and the SEC issued an order staying the rules in April 2024.
More recently, on March 27, 2025, the SEC voted to end its defense of the climate disclosure rules. On April 24, 2025, in response to a motion to hold the case in abeyance filed by several intervenors, including 18 states[3] and the District of Columbia, the Eighth Circuit issued a 90-day stay, ordering the SEC to determine whether it “intends to review or reconsider the rules.” Despite the SEC’s decision not to defend the rules and the pause in the lawsuit, the rules have not been repealed or rescinded. Additionally, states such as California, New York, Colorado, New Jersey, and Illinois have proposed legislation requiring climate-related disclosures. The proposed legislation in each of these states goes further than the SEC rules, requiring more robust annual public disclosures of emissions.
Despite the indications that the SEC climate disclosure rules are unlikely to be implemented in the near term, companies should be mindful of applicable state disclosure requirements and watch for additional state legislation mandating such disclosures.
State AGs are increasingly focused on how companies protect children’s privacy and ensure their safety in digital spaces. For example, on April 17, 2025, the New Jersey AG (“NJAG”) sued a popular messaging application for allegedly misleading parents about the effectiveness of its safety controls and downplaying the risks to children on the platform. The complaint alleges that the platform was designed to appeal to children and to encourage “unmoderated engagement among users.” It further alleges that messages child users received from “friends” on the platform were not automatically scanned for explicit content, contrary to how the platform’s “Safe Direct Messaging” feature was advertised. The lawsuit also alleges that the platform “actively chose not to” verify the date of birth of new users, allowing children under 13 to register for accounts. This lawsuit follows similar suits from the NJAG, including a multistate suit in 2024 against a popular social media and video platform for allegedly failing to protect children from the harms of excessive time spent on the app. Notably, these suits were brought under New Jersey’s Consumer Fraud Act (“CFA”), which is similar to a number of other states’ laws specifically focused on online safety; those laws, however, have been the subject of extensive challenges, with some having been enjoined.
Other State AGs are similarly targeting platforms that handle young people’s data and personal information. On April 29, 2025, the Michigan AG sued a streaming platform for allegedly collecting the data and personal information of its underage users and sharing it with third parties without the notice or parental consent required by law. The Michigan AG cites violations of the Children’s Online Privacy Protection Act (“COPPA”), which requires platforms to notify parents about the data they collect from their children and to obtain parental consent before collecting it.
Companies that offer online services or other connected technologies, particularly services that appeal to children and teens, should take note of the shifting regulatory landscape. The law governing youth privacy and data remains dynamic and will likely continue to evolve in the coming months. Companies operating online services should continue to monitor federal and state developments while considering industry best practices.
[1] The letter was signed by the Attorneys General of California, Colorado, Tennessee, New Hampshire, Vermont, American Samoa, Arizona, Arkansas, Connecticut, Delaware, District of Columbia, Hawaii, Illinois, Indiana, Kansas, Louisiana, Maine, Maryland, Massachusetts, Michigan, Minnesota, Mississippi, Nevada, New Jersey, New Mexico, New York, North Carolina, North Dakota, Ohio, Oklahoma, Oregon, Pennsylvania, Rhode Island, South Carolina, South Dakota, Utah, U.S. Virgin Islands, Virginia, Washington, and Wisconsin.
[2] Two Michigan Supreme Court decisions, Smith v. Globe Life Insurance Co., 597 N.W.2d 28 (Mich. 1999), and Liss v. Lewiston-Richards, Inc., 732 N.W.2d 514 (Mich. 2007), essentially preclude the Michigan AG from investigating suspected consumer protection violations by regulated industries and licensed professionals.
[3] The 18 state intervenors are Massachusetts, Arizona, Colorado, Connecticut, Delaware, Hawaii, Illinois, Maryland, Michigan, Minnesota, Nevada, New Mexico, New York, Oregon, Rhode Island, Vermont, Washington, and Wisconsin; the District of Columbia also joined the motion.