(NEWSnet/AP) — President Joe Biden's administration is pushing the tech industry and financial institutions to shut down a market of abusive sexual images made with artificial intelligence technology.

Generative AI tools have made it simple to transform someone's likeness into a sexually explicit AI deepfake and share those images across chatrooms or social media. Victims have little recourse to stop it.

In the absence of federal legislation, the White House is seeking voluntary cooperation from companies. By committing to a set of specific measures, officials hope, the private sector can curb the creation, spread and monetization of such nonconsensual AI images, including explicit images of children.

“As generative AI broke on the scene, everyone was speculating about where the first real harms would come. And I think we have the answer,” said Biden's chief science adviser Arati Prabhakar, director of the White House's Office of Science and Technology Policy.

She described to The Associated Press a “phenomenal acceleration” of nonconsensual imagery fueled by AI tools and largely targeting women and girls in a way that can upend their lives.

A document shared with AP urges action from AI developers and payment processors, financial institutions, cloud computing providers, search engines and gatekeepers, namely Apple and Google, that control what gets onto mobile app stores.

The private sector should step up to “disrupt the monetization” of image-based sexual abuse, the administration said, particularly by restricting payment access to sites that advertise explicit images of minors.

Prabhakar said many payment platforms and financial institutions already say they won't support businesses that promote abusive imagery.


Copyright 2024 NEWSnet and The Associated Press. All rights reserved.