AI scams, including ‘robot lawyer,’ target of new FTC crackdown


The Federal Trade Commission announced Tuesday a new law enforcement effort called Operation AI Comply. As part of the sweep, it took action against multiple companies that used artificial intelligence to “supercharge” deceptive products and services.

The cases involved AI-generated fake reviews, “the world’s first robot lawyer,” and online storefront schemes.

The FTC said in a statement that consumers lost tens of millions of dollars, lured by the promise that AI-enabled problem-solving and automation would save them time and money and, in the e-commerce cases, lead to increased earnings.

Ultimately, the companies didn't deliver on their claims and knew they were deceiving customers.

“Using AI tools to trick, mislead, or defraud people is illegal,” said Lina M. Khan, the agency’s chair. “The FTC’s enforcement actions make clear that there is no AI exemption from the laws on the books.”

The FTC’s complaint against Rytr, a company offering an AI writing service, describes how subscribers could generate what were effectively fake reviews with no basis in the user’s input. The FTC said that, in many cases, the AI-generated reviews included false information that could deceive consumers interested in purchasing a particular product. Some of Rytr’s subscribers created thousands of reviews that potentially featured inaccuracies.


The FTC argued that Rytr offered a service capable of disseminating a “glut of fake reviews that would harm both consumers and honest competitors.” It has proposed barring Rytr from advertising, promoting, marketing, or selling any service for generating consumer reviews or testimonials in the future. The agency banned AI-generated and fake reviews in August.

As part of Operation AI Comply, the FTC took action against DoNotPay, a company that told consumers its AI robot could help them “sue for assault without a lawyer” and “generate perfectly valid legal documents in no time.” While the company billed the service as “the world’s first robot lawyer,” it didn’t conduct testing to compare its AI chatbot to a human lawyer. Nor did it have attorneys on staff.

The company also told customers that its service could check a small business’s website for violations of federal and state law, a feature the FTC said was not effective.

DoNotPay told Mashable in a statement that the FTC’s complaint related to services that “a few hundred” customers used, and which have “long been discontinued.” The company noted that it settled the case without admitting liability.

The FTC also filed complaints against three companies that preyed on people looking to open online storefronts, including on TikTok, Walmart, Amazon, and Etsy. These businesses typically charged a significant fee to start an online store powered by proprietary software and AI that supposedly could boost their earnings. Some customers were required to purchase inventory that didn’t sell.

One company, FBA Machine, promised customers they could operate a “7-figure business” that would be “risk-free.” It falsely guaranteed refunds to customers who fell short of recouping their initial investment, which the FTC said ranged from tens of thousands to hundreds of thousands of dollars. In total, customers lost nearly $16 million to the scheme.

Another company, Ascend Ecom, told customers that they could start stores that would eventually produce a five-figure monthly income, thanks to its “cutting edge” AI tools. The FTC estimated that the scheme defrauded customers of at least $25 million.

UPDATE: Sep. 25, 2024, 11:55 a.m. UTC. This story has been updated to include a statement from DoNotPay.