Social media companies have harmed our economy, government, social fabric, and public square. The January 6, 2021 insurrection at the United States Capitol, which was fueled not just by partisan networks like Parler but also by household social media platforms like Facebook and Twitter, has made it clear that government intervention and better oversight are urgently needed.
While many people still understand these problems only in general terms, such as “social media makes us polarized” or “there’s no privacy online,” a growing and strengthening movement is working to hold these companies accountable. Too often, the voices of these organizers, researchers, and civil society groups are missing from the discussion about how to develop better public policy, track online mis- and disinformation, and hold platforms accountable through public advocacy campaigns.
Last year, we commissioned an independent report from ORS Impact to gain an in-depth understanding of the policy ideas and issues these organizations are pursuing and to create a comprehensive view of current efforts to address these problems at their roots. Reports like this are an important part of our work at Democracy Fund: we use them to make informed, ongoing decisions about our strategy and to highlight the vital work of grassroots organizations. We are publishing this report to help funders and organizations interested in platform accountability work understand the field as it stands today and develop effective strategies and programs of their own.
Three major learnings from the report will inform Democracy Fund’s Platform Accountability strategy:
1. The algorithms behind social media platforms often amplify existing inequalities along the lines of race, class, and gender, and allow bad actors, both foreign and domestic, to manipulate public opinion. Existing laws and legal precedent make it difficult to regulate algorithms through public policy. For example, current interpretations of the First Amendment generally protect algorithms as a form of speech. Moreover, Section 230 of the Communications Decency Act absolves social media companies of responsibility for the content their users publish on their platforms, under the theory that the threat of liability for what users post would make platforms act as speech police rather than open forums for free expression. In practice, the platforms have used this protection to avoid all responsibility for hate speech and mis/disinformation that manipulates public opinion and undermines elections. They have also used Section 230’s liability protection as a shield against transparency and due process in their moderation practices.
Important grantees and partners in this area include:
- Groups like Change the Terms, Color of Change, and MediaJustice, which have led public campaigns to pressure the platforms to adopt stronger policies against hate groups, white supremacists in particular.
- Free Press Action and the Lawyers’ Committee for Civil Rights Under Law, which have proposed comprehensive legislation to protect user data, prevent algorithmic discrimination, give users more control over their own data, empower robust enforcement of regulations, and more.
- Ranking Digital Rights, which provides a public index ranking social media companies by their transparency and integrity.
2. Journalists, researchers, and other investigators face difficulties when they try to understand how the platforms distribute and amplify information, and it is very difficult for everyday users to know who is behind the political advertising they see. The platforms have offered very little access to internal data, and as a public, we can’t solve problems we don’t understand. Opportunities include the potential for research institutions to partner with one another to collect data about how the platforms operate and to act as data brokers between platforms and researchers. Challenges include the need for more qualitative data from the platforms about how they develop policy and make decisions about their algorithms and content moderation processes.
Important grantees and partners in this area include:
- The NYU Online Political Ads Transparency Project, which has created a free tool that allows users and researchers to track the sources of political advertising on platforms.
- The Stigler Committee on Digital Platforms, which has argued that the Federal Trade Commission could be empowered to access platforms’ databases, perform its own research on platform impacts, and grant selective access to independent researchers.
- The German Marshall Fund, which advocates for new legislation (such as the Honest Ads Act) modeled on the existing law that requires politicians to disclose the funding sources of their TV ads.
3. There is a need for coordination among grantees, funders, and partners to distribute important civic information at scale by leveraging the tools of social media. At present, there are few viable ideas for large-scale intervention, which underscores the need for more research, strategy, and relationship-building. Major efforts in this space include the 2020 Elections Research Project, a first-of-its-kind collaboration between Facebook and outside academic researchers to study Facebook and Instagram’s impact on political participation and the shaping of public opinion; the Civic Information API, which aggregates essential information on local representatives and elections to empower developers and inform everyday people; and the Voting Information Project, which helps voters find reliable information on where to vote and what issues are on their ballots.
Important collaborations in this area include:
- The Social Science Research Council, which supports scholars, generates new research, and connects researchers with policymakers, nonprofits, and citizens.
- The Google News Initiative and Facebook Journalism Project, both of which provide monetary and in-kind support to help local news publishers connect with their communities and adapt their business models for the digital age.
- The Facebook Civil Rights Audit, which Facebook initiated after a campaign led by groups like Free Press and Color of Change pressured the company to take civil rights issues on its platform more seriously.
The ORS Impact report will inform Democracy Fund’s grantmaking strategy and how we build networks among grantees that cut across the traditional divides between researchers, civil society organizations, advocates, and policymakers. The report offers a snapshot of the field at a critical time for platform accountability work, giving a fuller understanding of the current context. Our sister organization, Democracy Fund Voice, will undertake a similar review in the coming months for its Media Policy strategy, including in-depth interviews with several grantees mentioned in this report about how the challenges of 2020 have affected their work.
To learn more about our Digital Democracy program, contact Paul Waters, associate director, Public Square Program at pwaters [@] democracyfund.org.