The Federal Trade Commission put businesses that handle children’s data and Americans’ most sensitive personal information on notice that they’ll be top enforcement targets under the agency’s new leadership.
Statements from FTC regulators this week reinforced a shift in the agency’s priorities under the Trump administration to abandon broader definitions of consumer harm and instead foster AI innovation while also focusing its enforcement on laws such as the Children’s Online Privacy Protection Act (COPPA).
“The commission is committed to protecting consumers’ privacy and security interests while promoting competition and innovation,” FTC Commissioner Melissa Holyoak said Tuesday at the annual Global Privacy Summit hosted in Washington, D.C., by the IAPP, an association for data governance professionals.
“We’ll do that by enforcing the laws we have—and not by stretching our legal authorities,” Holyoak said. “And we’ll continue to do it by taking a flexible, risk-based approach to privacy enforcement that balances potential privacy harms, consumer expectations, legal obligations, business needs, and competition.”
Commitment to a more balanced regulatory approach was echoed by her colleague, Christopher Mufarrige, newly appointed director of the FTC’s Bureau of Consumer Protection, during the Interactive Advertising Bureau’s Public Policy and Legal Summit also held on Tuesday.
Mufarrige criticized what he said was the Biden administration’s skepticism toward technology and its benefits for consumers. He emphasized that the FTC will now be “much more in favor of innovation” and will focus on “actual, concrete harms” to consumers.
“The commission’s role is to reinforce market practices, not replace them,” he said.
Compliance
Holyoak said the FTC should focus its enforcement on three laws under its jurisdiction: COPPA; the Fair Credit Reporting Act; and the Gramm-Leach-Bliley Act, which governs financial institutions.
While the FTC will continue to go after companies with misleading data-sharing or deficient security practices, data brokers and other businesses that sell Americans’ sensitive data in bulk to foreign adversaries and bad actors will also be targeted.
To do so, Holyoak hinted at future “opportunities” to partner with the Department of Justice as it enforces the recently enacted rule restricting the bulk transfer of sensitive information to certain countries.
“We need stronger enforcement against companies that sell, transfer, or disclose Americans’ sensitive personal data, like precise geolocation data,” she said.
Such information “can be exploited and poses significant—and frankly unacceptable—risks to our national and economic security,” Holyoak added.
The DOJ released its initial guidance for how to comply with its final rule on April 11, a few days after it went into effect. The rule limits US companies’ transfers of certain types of data to countries that pose national security concerns—including China, Cuba, North Korea, Russia, Iran, and Venezuela.
More broadly, FTC regulators also encouraged companies to actively engage with the agency’s requests for information.
Mufarrige addressed frustrations from attorneys who felt they weren’t treated fairly by the Biden-era FTC despite cooperating with Civil Investigative Demands (CIDs), a tool that allows the agency to compel businesses to share documents or answer specific questions.
“Folks will be getting a fair shake,” he said, warning that businesses will still have to “robustly comply with our CID requests.”
“It’s important for everyone to understand that I’m expecting folks to be responsive. So if we’re asking for things, we really want to see responsiveness from those targets,” he said.
CIDs now will be “much more streamlined” and “targeted,” Mufarrige added, acknowledging that these requests create costly burdens, especially for smaller businesses.
Kids’ Privacy
Children’s privacy and safety online is one of Holyoak’s “top priorities as a commissioner,” she said, adding that the agency “must use every tool that Congress has given it” to regulate companies with products that could negatively affect kids online.
Primary among them is COPPA, she said, pointing to the agency’s amended children’s privacy rule that was published in the Federal Register on Tuesday.
By April 2026, platform operators will need to publish data retention schedules, obtain separate parental consent for disclosure of children’s personal information to third parties, and comply with enhanced data security requirements, she said.
The commission also will continue to study emerging technologies and their impact on children’s online activities, including AI-powered chatbots.
AI
One of the most visible shifts in enforcement priorities concerns artificial intelligence.
Looking ahead, the agency will focus on “how AI is used to facilitate frauds and scams” rather than challenging the technology itself, Mufarrige said.
Cases such as the one accusing the firm Rytr of providing its subscribers with the “means and instrumentalities” to produce false and deceptive AI-generated content are prime examples of the agency relying on overly broad definitions of consumer harm or injury, he said. The Rytr case was settled in December.
“Focusing our attention on specific instances of concrete injury, which is what we didn’t do with Rytr, that’s what you’re going to be seeing moving forward,” Mufarrige said.