The U.S. Federal Communications Commission on Thursday outlawed robocalls that contain artificial intelligence-generated voices, citing their capacity to misinform voters.

The ruling, which takes effect immediately, targets robocalls made with AI voice-cloning tools under the Telephone Consumer Protection Act. The 1991 law regulates junk calls that use artificial and prerecorded voice messages.

“Bad actors are using AI-generated voices in unsolicited robocalls to extort vulnerable family members, imitate celebrities, and misinform voters. We’re putting the fraudsters behind these robocalls on notice,” FCC Chairwoman Jessica Rosenworcel said in a statement announcing the unanimous decision.

The ruling will give state attorneys general new tools to target perpetrators behind these calls “and ensure the public is protected from fraud and misinformation,” Rosenworcel said in the statement.

The ruling comes after AI-generated robocalls impersonating President Joe Biden sought to discourage people from voting in last month's New Hampshire primary election. The New Hampshire attorney general announced this week that the calls were traced to companies in Texas.

“We need to protect voters from deepfake AI technology disenfranchising them or manipulating their thought processes,” Theresa Payton, CEO of cybersecurity company Fortalice Solutions, told VOA.

“This is a step in the right direction, and I would like to see more debate and open discussion on what the guard rails for the usage of this technology should be,” added Payton, who served as White House chief information officer during the George W. Bush administration.

Under the new regulation, the FCC can fine companies that use AI voices in their calls or block the service providers that carry the calls.

The agency has previously used the law to hold robocallers accountable for election interference, including imposing a $5 million fine on two conservative hoaxers who used robocalls to try to reduce Black voter turnout in the 2020 presidential election.

Deepfake AI technology has also been used in attempts to influence voters around the world.

Just days before Slovakia's elections in September 2023, for instance, an audio deepfake circulated online imitating a politician who appeared to discuss how to rig the election.

The FCC had been considering banning AI voices in robocalls due to a rise it saw in these types of calls. In January, a bipartisan group of 26 state attorneys general urged the FCC to go ahead with a ruling.

While Payton views the ruling as a positive development, she cautioned that foreign actors could still use AI-generated robocalls in attempts to influence elections.

“This is going to be helpful as it relates to American political operatives, but I am not clear how this would be enforced against non-Americans that may choose to abuse this technology and interfere in our political process,” she said.