Although tech platforms can help keep us connected, create a vibrant marketplace of ideas, and open up new opportunities for bringing products and services to market, they can also divide us and wreak serious real-world harms. The rise of tech platforms has introduced new and difficult challenges, from tragic acts of violence linked to toxic online cultures, to deteriorating mental health and wellbeing, to harms to the basic rights of Americans and of communities worldwide.
Today, the White House convened a listening session with experts and practitioners on the harms that tech platforms cause and the need for greater accountability. In the meeting, experts and practitioners identified concerns in six key areas: competition; privacy; youth mental health; misinformation and disinformation; illegal and abusive conduct, including sexual exploitation; and algorithmic discrimination and lack of transparency.
One participant explained the effects of anti-competitive conduct by large platforms on small and mid-size businesses and entrepreneurs, including restrictions that large platforms place on how these businesses' products operate, which can stifle innovation. Another participant highlighted that large platforms can use their market power to engage in rent-seeking, which can drive up consumer prices.
Several participants raised concerns about the rampant collection of vast troves of personal data by tech platforms. Some experts tied this to problems of misinformation and disinformation on platforms, explaining that social media platforms maximize “user engagement” for profit by using personal data to display content tailored to keep users’ attention—content that is often sensational, extreme, and polarizing. Other participants sounded the alarm about risks for reproductive rights and individual safety associated with companies collecting sensitive personal information, from where their users are physically located to their medical histories and choices. Another participant explained why mere self-help technological protections for privacy are insufficient. And participants highlighted the risks to public safety that can stem from information recommended by platforms that promotes radicalization, mobilization, and incitement to violence.
Multiple experts explained that technology now plays a central role in access to critical opportunities like job openings, home sales, and credit offers, but that too often companies’ algorithms display these opportunities unequally or discriminatorily target some communities with predatory products. The experts also explained that this lack of transparency means the algorithms cannot be scrutinized by anyone outside the platforms themselves, creating a barrier to meaningful accountability.
One expert explained the risks of social media use for the health and wellbeing of young people. While technology provides some young people with the benefits of social connection, prolonged social media use has significant adverse clinical effects on many children’s and teens’ mental health. The expert also raised concerns about the amount of data collected from apps used by children, and the need for better guardrails to protect children’s privacy and to prevent addictive use and exposure to detrimental content. Experts also highlighted the magnitude of illegal and abusive conduct hosted or disseminated by platforms, such as child sexual exploitation, cyberstalking, and the non-consensual distribution of intimate images of adults, conduct for which platforms are currently shielded from liability and therefore lack adequate incentive to reasonably address.
The White House officials closed the meeting by thanking the experts and practitioners for sharing their concerns. They explained that the Administration will continue to work to address the harms caused by a lack of sufficient accountability for technology platforms. They further stated that they will continue working with Congress and stakeholders to make bipartisan progress on these issues, and that President Biden has long called for fundamental legislative reforms to address these issues.
Attendees at today’s meeting included:
- Bruce Reed, Assistant to the President & Deputy Chief of Staff
- Susan Rice, Assistant to the President & Domestic Policy Advisor
- Brian Deese, Assistant to the President & National Economic Council Director
- Louisa Terrell, Assistant to the President & Director of the Office of Legislative Affairs
- Jennifer Klein, Deputy Assistant to the President & Director of the Gender Policy Council
- Alondra Nelson, Deputy Assistant to the President & Head of the Office of Science and Technology Policy
- Bharat Ramamurti, Deputy Assistant to the President & Deputy National Economic Council Director
- Anne Neuberger, Deputy National Security Advisor for Cyber and Emerging Technology
- Tarun Chhabra, Special Assistant to the President & Senior Director for Technology and National Security
- Dr. Nusheen Ameenuddin, Chair of the American Academy of Pediatrics Council on Communications and Media
- Danielle Citron, Vice President, Cyber Civil Rights Initiative; Jefferson Scholars Foundation Schenck Distinguished Professor in Law and Caddell and Chapman Professor of Law, University of Virginia School of Law
- Alexandra Reeve Givens, President and CEO, Center for Democracy and Technology
- Damon Hewitt, President and Executive Director, Lawyers’ Committee for Civil Rights Under Law
- Mitchell Baker, CEO of the Mozilla Corporation and Chairwoman of the Mozilla Foundation
- Karl Racine, Attorney General for the District of Columbia
- Patrick Spence, Chief Executive Officer, Sonos
Principles for Enhancing Competition and Tech Platform Accountability
Alongside the event, the Biden-Harris Administration announced the following core principles for reform:
- Promote competition in the technology sector. The American information technology sector has long been an engine of innovation and growth, and the U.S. has led the world in the development of the Internet economy. Today, however, a small number of dominant Internet platforms use their power to exclude market entrants, to engage in rent-seeking, and to gather intimate personal information that they can use for their own advantage. We need clear rules of the road to ensure small and mid-size businesses and entrepreneurs can compete on a level playing field, which will promote innovation for American consumers and ensure continued U.S. leadership in global technology. We are encouraged to see bipartisan interest in Congress in passing legislation to address the power of tech platforms through antitrust legislation.
- Provide robust federal protections for Americans’ privacy. There should be clear limits on the ability to collect, use, transfer, and maintain our personal data, including limits on targeted advertising. These limits should put the burden on platforms to minimize how much information they collect, rather than burdening Americans with reading fine print. We especially need strong protections for particularly sensitive data such as geolocation and health information, including information related to reproductive health. We are encouraged to see bipartisan interest in Congress in passing legislation to protect privacy.
- Protect our kids by putting in place even stronger privacy and online protections for them, including prioritizing safety by design standards and practices for online platforms, products, and services. Children, adolescents, and teens are especially vulnerable to harm. Platforms and other interactive digital service providers should be required to prioritize the safety and wellbeing of young people above profit and revenue in their product design, including by restricting excessive data collection and targeted advertising to young people.
- Remove special legal protections for large tech platforms. Tech platforms currently have special legal protections under Section 230 of the Communications Decency Act that broadly shield them from liability even when they host or disseminate illegal, violent conduct or materials. The President has long called for fundamental reforms to Section 230.
- Increase transparency about platforms’ algorithms and content moderation decisions. Despite their central role in American life, tech platforms are notoriously opaque. Their decisions about what content to display to a given user and when and how to remove content from their sites affect Americans’ lives and American society in profound ways. However, platforms are failing to provide sufficient transparency to allow the public and researchers to understand how and why such decisions are made, their potential effects on users, and the very real dangers these decisions may pose.
- Stop discriminatory algorithmic decision-making. We need strong protections to ensure algorithms do not discriminate against protected groups, such as by failing to share key opportunities equally, by discriminatorily exposing vulnerable communities to risky products, or through persistent surveillance.