
Fighting for an internet that is safe for all: how structural problems require structural solutions

September 30, 2020

In 2017, a college student named Taylor Dumpson achieved what many young scholars dream of: she was elected student body president. She was the first African-American woman to hold the office at American University in Washington, D.C., and many celebrated her election as a sign of growing racial equity in higher education.

But day one of her presidency was anything but triumphant. The night before, a masked man had hung bananas inscribed with racist slogans around campus. The neo-Nazi website The Daily Stormer then picked up news reports of the incident and directed a “troll army” to flood the Facebook and email accounts of Dumpson and AU’s student government with hateful messages and threats of violence. Dumpson feared being attacked while attending class and carrying out her duties as president, and she was later diagnosed with post-traumatic stress disorder.

Two years later, the Lawyers’ Committee for Civil Rights Under Law helped Dumpson win a major lawsuit against her harassers. Building on the D.C. Human Rights Act of 1977, Dumpson’s legal team successfully argued that the harassment she faced online limited her access to a public accommodation, her university. It was a significant victory for online civil rights, but her case raises an important question: why weren’t there laws or policies to protect her in the first place?

Part of the problem is that civil rights laws have yet to be updated for the 21st century. “No one predicted the internet when they wrote these laws,” says David Brody, a lead attorney in Dumpson’s case. “Only just now are these laws getting applied to the internet,” he added. A 2020 Lawyers’ Committee report that Brody co-authored shows that laws preventing discrimination online vary widely from state to state, leaving large gaps in civil rights protections.

The second part of the problem is that social media platforms are designed to optimize for engagement, that is, to keep people on the platform as long as possible. This sounds like a reasonable business goal, but the result is that the platforms’ algorithms often elevate the most extreme or offensive content, like racist threats against an African-American student body president, simply because it gets the quickest and most intense reactions. While Brody and the Lawyers’ Committee did not pursue this issue in the Taylor Dumpson case, experts agree that it is a major structural barrier to ensuring civil rights in the 21st century. Optimizing for engagement too often means optimizing for outrage, giving extremists and hate groups the tools to spread and popularize their destructive ideologies.

Deeply rooted problems like these have created an internet that is often unsafe and unjust, particularly for people of color and women, who have long borne the brunt of online harms. They are left with an impossible choice: stay on social media and accept daily threats and harassment, or leave the platforms altogether and give up on participating in the 21st-century public square. In 2014, Black feminist bloggers like I’Nasah Crockett, Sydette Harry, and Shafiqah Hudson warned of the rise of online hate and disinformation, two full years before “alt-right” groups and Russia-funded “troll armies” wreaked havoc on public discourse during the 2016 U.S. presidential election.

The harassment of people of color and women on platforms owned by Facebook, Google, and Twitter illustrates larger problems that should concern us all. The digital tools and technologies we have come to depend on are largely owned by private companies driven to maximize profits, even at the expense of the civil rights protections guaranteed under U.S. law and the Constitution. When clicks and viral posts are prioritized at any cost, democracy suffers.

Policymakers must recognize that we need to update our civil rights laws, and create new laws where necessary, to fulfill our nation’s constitutional promises. Within the private sector, tech companies must take it upon themselves to track and combat discrimination on their platforms and stop the spread of online hate. When they do not, we must build public movements to hold them accountable and demand equal access to civil rights protections. Structural problems require structural solutions, and Democracy Fund grantees are putting forward a range of them.

The Digital Democracy Initiative is proud to fund groups like the Lawyers’ Committee, Data for Black Lives, and MediaJustice, which work to fill gaps in law and public policy, as well as groups like Stop Online Violence Against Women and Color of Change, whose work exposing and combating coordinated hate and harassment specifically centers the concerns of people of color and women.

Democracy Fund supports coalition building, independent research, and policy development that hold platforms accountable to the public interest, not just their own profits. If you would like to get involved, here are three things you can do: 

  1. Learn more about root causes. Take a look at our systems map to gain a greater understanding of the interconnected nature of the issues we’re working on. 
  2. Support organizations working on these issues. This is incredibly important, particularly as budgets are strained during the COVID-19 pandemic. See our grantee database for the full list of organizations Democracy Fund is supporting. 
  3. Look for ways to make your voice heard. Grantees like Free Press and Color of Change regularly organize petitions to hold tech platforms accountable.

To learn more about our work, contact Paul Waters, associate director, Public Square Program, at pwaters [@] democracyfund.org. 
