White House Announces New Private Sector Voluntary Commitments to Combat Image-Based Sexual Abuse
President Biden and Vice President Harris have been committed to addressing gender-based violence in all of its forms since Day One of the Biden-Harris Administration. As we celebrate the progress made over the 30 years since the Violence Against Women Act became law, we continue to take on gender-based violence wherever it occurs, including by taking action to counter the risks and harms posed by technologies such as artificial intelligence (AI).
Image-based sexual abuse—both non-consensual intimate images (NCII) of adults and child sexual abuse material (CSAM), including AI-generated images—has skyrocketed, disproportionately targeting women, children, and LGBTQI+ people and emerging as one of the fastest-growing harmful uses of AI to date. This abuse has profound consequences for individual safety and well-being, as well as broader societal impacts from its chilling effect on survivors’ participation in their schools, workplaces, and communities. President Biden and Vice President Harris have taken broad and comprehensive action to address the range of AI’s risks to safety, security, and trustworthiness. In her remarks in London before the AI Safety Summit, Vice President Harris specifically underscored image-based sexual abuse as an urgent threat exacerbated by AI that calls for global action.
Today, following the White House Call to Action to Combat Image-Based Sexual Abuse, the Biden-Harris Administration is announcing new voluntary commitments from AI model developers and data providers to reduce AI-generated image-based sexual abuse. Efforts to combat image-based sexual abuse will continue to evolve. AI model developers and data providers are taking a variety of actions to curb the creation of NCII and CSAM, including the following:
- Adobe, Anthropic, Cohere, Common Crawl, Microsoft, and OpenAI commit to responsibly sourcing their datasets and safeguarding them from image-based sexual abuse.
- Adobe, Anthropic, Cohere, Microsoft, and OpenAI commit to incorporating feedback loops and iterative stress-testing strategies into their development processes to guard against their AI models generating image-based sexual abuse.
- Adobe, Anthropic, Cohere, Microsoft, and OpenAI, when appropriate and depending on the purpose of the model, commit to removing nude images from AI training datasets.
These actions build on the voluntary commitments from leading AI companies that the Biden-Harris Administration announced last year. Today’s commitments represent a step forward across industry to reduce the risk that AI tools will generate abusive images. They are part of a broader ecosystem of private sector, academic, and civil society organizations’ efforts to identify and reduce the harms of non-consensual intimate images and child sexual abuse material.
In addition, since the White House Call to Action in May, several companies have taken further actions to combat image-based sexual abuse and have reaffirmed efforts to prevent its creation, distribution, and monetization:
- Cash App and Square are curbing payment services for companies producing, soliciting, or publishing image-based sexual abuse, including through additional investments in resources, systems, and partnerships to detect and mitigate payments for image-based sexual abuse.
- Cash App and Square commit to expanding participation in industry groups and initiatives that share signals on sextortion and other known forms of image-based sexual abuse, improving detection and limiting payment services for this activity.
- Google continues to take actions across its platforms to address image-based sexual abuse, including updates in July to its search engine to further combat non-consensual intimate images.
- GitHub, a Microsoft company, has updated its policies to prohibit the sharing of software tools that are designed for, or that encourage, promote, support, or suggest in any way, the use of synthetic or manipulated media to create non-consensual intimate imagery.
- Microsoft is partnering with StopNCII.org to pilot efforts to detect and delist duplicates of survivor-reported non-consensual intimate imagery in Bing’s search results; developing new public service announcements to promote trusted, authoritative resources about image-based sexual abuse for victims and survivors; and continuing to demote low-quality content across its search engine.
- Meta continues to prohibit the promotion of applications or services to generate image-based sexual abuse on its platforms, has incorporated solutions like StopNCII and TakeItDown directly into its reporting systems, and announced in July that it had removed around 63,000 Instagram accounts attempting to engage in financial sextortion scams. Meta also recently expanded its existing partnership with the Tech Coalition to include sharing signals about sextortion activity via the Lantern program, helping to disrupt this criminal activity across the wider internet.
- Snap Inc. commits to strengthening reporting processes and promoting resources for survivors of image-based sexual abuse through in-app tools and on its websites.
Complementing today’s announcements, leading private sector companies, civil society organizations, and researchers are announcing a set of voluntary principles to combat image-based sexual abuse in an effort led by the Center for Democracy and Technology, the Cyber Civil Rights Initiative, and the National Network to End Domestic Violence. Through a multi-stakeholder working group, they will continue to identify interventions to prevent and mitigate the harms caused by the creation, spread, and monetization of image-based sexual abuse. In addition, numerous companies have committed to the industry-leading “Safety by Design” principles and mitigations outlined in April by Thorn and All Tech is Human for preventing the misuse of generative AI in furthering child sexual abuse. Thorn plans to publish its first round of transparency reports from the Safety by Design commitments later this year.
Standing with survivors has been a hallmark of the President and Vice President’s careers in public service. As we celebrate 30 years of the Violence Against Women Act, we recognize there is more work to do to build a world where women, children, and all people are free from sexual violence, harassment, and abuse, online and offline. We will continue to welcome voluntary actions from industry that prioritize the safety and well-being of women and children.