The FTC bans AI impersonations of individuals — and unveils greater powers to win stolen money back
FTC says it's fighting in the public's corner
The Federal Trade Commission (FTC) has moved to ban the use of AI tools to spoof individuals, and has announced greater powers to recover stolen money from scammers.
The agency said that it is "taking this action in light of surging complaints around impersonation fraud, as well as public outcry about the harms caused to consumers and to impersonated individuals."
The rise of public generative AI tools such as ChatGPT has meant that cybercriminals can spoof brands and organizations with greater accuracy and ease. Fake images, voices and videos of high-profile figures, known as deepfakes, can be generated in moments, and they have been proliferating at a worrying rate.
New powers
The FTC also said that it is "seeking comment on whether the revised rule should declare it unlawful for a firm, such as an AI platform that creates images, video, or text, to provide goods or services that they know or have reason to know is being used to harm consumers through impersonation."
FTC Chair Lina M. Khan added that the agency wants to expand its proposed impersonation rule to cover individuals, not just governments and businesses, in order to "[strengthen] the FTC’s toolkit to address AI-enabled scams impersonating individuals.”
The commission said it is making these expansions in response to public feedback on its previous proposals, with comments that "pointed to the additional threats and harms posed by impersonation of individuals."
The FTC claims that the expansion "will help the agency deter fraud and secure redress for harmed consumers."
It has also finalized the Government and Business Impersonation Rule, which will arm the agency with better weapons to fight scammers who abuse AI to spoof real entities.
It will now be able to file cases directly in federal court to force cybercriminals to return money made from impersonation scams. The FTC believes this is a significant step, as it claims that a previous Supreme Court ruling (AMG Capital Management LLC v. FTC) "significantly limited the agency’s ability to require defendants to return money to injured consumers."
Threat actors who misuse logos, email addresses, or web addresses, or who falsely imply affiliation with a business or government, can now be taken to court by the FTC, allowing it to "directly seek monetary relief."
The commission approved the final rule by a 3-0 vote, and it will be published in the Federal Register.