Hackers are using artificial intelligence to mine unprecedented troves of personal information dumped online in the past year, along with unregulated commercial databases, to trick American consumers and even sophisticated professionals into giving up control of bank and corporate accounts.
Armed with sensitive health information, calling records and hundreds of millions of Social Security numbers, criminals and operatives of countries hostile to the United States are crafting emails, voice calls and texts that purport to come from government officials, co-workers or relatives needing help, or familiar financial organizations claiming to protect accounts that the scammers actually intend to drain.
“There is so much data out there that can be used for phishing and password resets that it has reduced overall security for everyone, and artificial intelligence has made it much easier to weaponize,” said Ashkan Soltani, executive director of the California Privacy Protection Agency, the only such state-level agency.
The losses reported to the FBI’s Internet Crime Complaint Center nearly tripled from 2020 to 2023, to $12.5 billion, and a number of sensitive breaches this year have only increased internet insecurity. The recently discovered Chinese government hacks of U.S. telecommunications companies AT&T, Verizon and others, for instance, were deemed so serious that government officials are being told not to discuss sensitive matters on the phone, some of those officials said in interviews. A Russian ransomware gang’s breach of Change Healthcare in February captured data on millions of Americans’ medical conditions and treatments, and in August, a small data broker, National Public Data, acknowledged that it had lost control of hundreds of millions of Social Security numbers and addresses now being sold by hackers.
Meanwhile, the capabilities of artificial intelligence are expanding at breakneck speed. “The risks of a growing surveillance industry are only heightened by AI and other forms of predictive decision-making, which are fueled by the vast datasets that data brokers compile,” U.S. Consumer Financial Protection Bureau Director Rohit Chopra said in September.
With no federal privacy legislation to stem the flood, national security experts fear that foreign spy agencies will keep vacuuming up everything they need to hack, recruit or blackmail officials with sensitive missions, debts and embarrassing personal secrets. “Six or seven years ago, people said there was too much data; adversaries don’t know what to do with it,” CFPB Senior Counsel Kiren Gopal told The Washington Post. “Now they have AI tools to sift through for things that are actually useful.”
It is far from clear what the arrival of President-elect Donald Trump’s administration will mean for privacy efforts. His campaign platform does not mention the topic but does commit to mass deportation of immigrants and slashing regulations, which suggests that the government will be a major consumer of location data and will not be inclined to limit its collection. Spokespeople for Trump didn’t respond to emailed questions.
Regulators aren’t waiting to find out. Chopra’s consumer bureau on Tuesday proposed restricting the sale of sensitive but nonfinancial data, such as Social Security numbers, phone numbers and street addresses, the same way that credit and salary histories are limited. Under the new rules, such data could not be sold for marketing, but only for approved purposes such as employment background checks, law enforcement needs or identity verification.
“The rule would ban misuse of our sensitive personal identifiers,” Chopra said.
Financial data such as income levels would also get more protection, with brokers required to give consumers access and the right to delete inaccurate information, the same way big credit bureaus must. The rules are now subject to a three-month comment period before taking effect.
Sen. Ron Wyden (D-Ore.), who had sought the CFPB’s action for years, said he was concerned about its prospects for survival under the new administration. “Unfortunately, it will be up to Trump’s CFPB to finalize this,” he said.
For its part, the Federal Trade Commission unveiled settlements Tuesday with two major data brokers, prohibiting them from selling users’ sensitive location data. The agency has been empowered by a recent judge’s decision that carelessness with sensitive data can be considered an unfair trade practice.
And the Justice Department, which previously accused three data brokers of deliberately selling lists of vulnerable seniors to deceptive marketers, is pushing out rules aimed at stopping adversarial countries from obtaining data about people serving in the U.S. military abroad. Researchers have been able to identify troops by the movement of their devices or data on special interests for marketing, and other countries have used stolen data to identify undercover intelligence agents.
Such administrative and legal actions come amid a dearth of federal privacy legislation, which has stalled largely because Republicans and Democrats have disagreed about what to put in it. A Republican-dominated Congress that includes members concerned about government snooping might be in a better position to get something passed. While some states have passed laws specifically protecting information that might reveal visitors to abortion providers, Republicans have expressed concerns about data on police officers and others. Rep. Gus M. Bilirakis (R-Fla.), for one, has backed a national privacy rights standard as a bulwark against Big Tech companies, and House Energy and Commerce Committee Chair Cathy McMorris Rodgers (R-Wash.) has been a major proponent of compromise legislation.
Even if all that and more comes to pass — and Trump adviser Elon Musk’s threat to wipe out the CFPB remains unfulfilled — so much data is now available about so many people that any government action is likely to have limited effect. Soltani himself, a former FTC technologist and widely recognized privacy expert, nearly fell for one of the schemes he has been working against.
A caller who appeared to be phoning from a Google support number warned the California official that someone might be trying to take over his email account by adding a new recovery number. He told Soltani his previous recovery number and asked whether the recently added number was legitimate. When Soltani said no, the caller created a ticket for the issue and sent Soltani an email to confirm he had control over the account. Then he said he would send a code to Soltani’s phone to make sure he controlled the right recovery number, and asked Soltani to read it back.
Though he believed the person was trying to help him, Soltani told him he never repeated one-time codes to anyone. Eventually, the scammer disconnected. Only then did Soltani piece things together: the slight delay before each time the caller spoke was probably AI converting text to speech; Soltani’s real phone number and email address had come from a database or a leak; and the one-time code would have handed his account to the scammer.
Horror stories have flooded the consumer bureau from stalking victims and those unable to make major purchases because of fraud committed using their names and personal details. In other cases, the sea of publicly available data has allowed hackers to target the personal online accounts of executives, which they can use to get into corporate networks, said Mark Weatherford of Gretel, who advises dozens of security companies and was an early cybersecurity leader at the Department of Homeland Security.
“Attacks that were once laborious and required elaborate, time-consuming manual research can now be carried out quickly and precisely by criminals who leverage machine learning to aggregate intimate details from vast online databases,” Weatherford said.
AI is also being used to impersonate executive voices to dupe colleagues.
Soltani’s agency is preparing to enforce a California law that, beginning in 2026, will let consumers have the agency order all commercial data brokers registered in the state to delete some data about them, instead of having to contact each broker separately. Data brokers will still be able to sell information related to fraud prevention. The provision is among the toughest of the data privacy laws on the books in 19 states and will be the first of its kind to take effect.
One key argument the data brokering industry has marshaled in fighting those laws is that the sector has a free-speech right to provide information that is public but obscure. That defense took a hit in a closely watched case last week, when a federal judge ruled brokers’ speech rights were not violated by a New Jersey law requiring them to delete the home addresses of judges and police officers who demand it.
Soltani said he hopes that state laws could lead to a strong national privacy standard, the same way Florida’s Do Not Call registry eventually went national, even if the new proposed consumer bureau regulations fail. Otherwise, the horror stories will keep coming.