Statement

Statement From Democracy Fund President Joe Goldman 

November 7, 2020

Today, Joe Biden was projected the winner of the 2020 presidential election. He will serve as the 46th president of the United States, and Kamala Harris will be the first Black and South Asian woman to serve as vice president. The fact that we have a projected winner today in light of the unprecedented challenges we have faced as a nation demonstrates the resilience of our democratic institutions and the successful execution of our electoral process.

Our nation’s ability to run an election amid a global pandemic was not a foregone conclusion just a few days ago. It is remarkable that this election proceeded with few reports of issues in election administration and limited instances of the violence we feared. Most notably, it is worth celebrating that more than 150 million Americans exercised their right to vote — many for the first time — resulting in a once-in-a-century level of turnout. And despite political pressure, it is a testament to the endurance of our democracy that all of their votes are being counted. 

These remarkable accomplishments are due to the tireless efforts of dedicated public servants and civil society groups who have worked against all odds to ensure the integrity of our electoral systems, improve voter access, and inspire the public to get engaged. We also cannot overstate how important it is that hundreds of thousands of Americans, particularly Black and Brown women, answered the call to serve their democracy as organizers, volunteers, and poll workers. We are incredibly grateful to all of our grantees who made this possible, and look forward to celebrating their hard work in the coming weeks and months.

But despite these accomplishments, the results of this election have not brought the threats to our democracy to an end — in many ways, this election has revealed the true depths of the challenges we face.

The rhetoric coming from President Trump and his supporters baselessly questioning the validity of this election is appalling and undemocratic. Their worst rhetoric — singling out votes from historically disenfranchised communities for extra scrutiny — requires swift and strong condemnation. With no credible evidence of fraud to date, the myriad legal challenges brought forth are unlikely to change the outcome. They are instead intended to foment public unrest and to undermine the long-term health of our democratic institutions.

Americans must reject these efforts. Leaders within both parties and within the media – along with trusted nonpartisan leaders within our religious, business, and veteran communities – must work to assure voters of the facts in order to facilitate a peaceful transition of power.

It is time to begin to repair the damage done by an administration bent on tearing down democratic norms and questioning our foundational institutions. We now have an opportunity to create a democracy worthy of the trust of every American and shift the underlying, toxic dynamics that have poisoned our political system. From the federal government to the local level, we must undertake a new era of reforms to foster effective, inclusive government institutions able to deliver for the American people. 

Make no mistake: creating a more open and just democracy remains an enormous undertaking. We must not be daunted, but let the urgency of this moment energize and focus us for the work ahead.

Statement

Count Every Vote: Statement From Democracy Fund President Joe Goldman 

November 4, 2020

This election season brought unprecedented turnout across the country. Millions of voters cast their ballots through early voting, vote-by-mail, and yesterday, at the polls. They turned out despite a global pandemic that required new processes and protocols. Moreover, early reports suggest that, with the support of thousands of election officials and volunteers, the country had a relatively smooth Election Day, without widespread intimidation, interference, or administrative problems.

But we do not yet know who will occupy the White House, nor who will hold many congressional, state, and local positions across the country. Now is the time to count every vote and listen to the voice of every voter. Politicians — including President Trump — have had their say over many months, but now the will of the American people must be heard.

Authoritarian calls to end the counting of legitimate ballots have no place in our democracy. The President’s calls to do so undermine the will of the people and violate a basic principle of democracy. But this moment is what many experts, civic organizations, and media outlets have spent months preparing for. We remain confident the system was built for this moment.

The two most important things Americans can do over the next 24 hours are: 

  • Be patient and let every vote be counted: A core principle of American democracy is that we choose our leaders – our leaders do not choose their voters. We must be patient and count every vote because every vote counts. 
  • Support and trust election officials: These state and local civil servants have been administering elections for years. We must give them time to count every vote and verify the results.

We’re going to know a lot more in the next few days. There are thousands of civic leaders and democracy champions — including highly qualified election officials — who are ensuring this process proceeds according to our highest democratic ideals. The campaigns can say what they want, but the votes of the American people will determine who takes office in January. Alongside our partners, Democracy Fund will be working until the final hour to ensure every vote is counted and the American people have the final say.

Blog

Fighting for an internet that is safe for all: how structural problems require structural solutions

September 30, 2020

In 2017, a college student named Taylor Dumpson achieved what many young scholars dream of: she was elected student body president. She was the first African-American woman to hold the office at American University in Washington, D.C., and news of her election was celebrated by many as a sign of growing racial equity in higher education.

But day one of her presidency was anything but triumphant. The night before, a masked man hung bananas inscribed with racist slogans around campus. The neo-Nazi website The Daily Stormer then picked up news reports of the incident and directed a “troll army” to flood the Facebook and email accounts of Dumpson and AU’s student government with hateful messages and threats of violence. Dumpson feared being attacked while carrying out her duties as president and attending class; she was later diagnosed with post-traumatic stress disorder.

Two years later, the Lawyers’ Committee for Civil Rights Under Law helped Dumpson win a major lawsuit against her harassers. Building on the D.C. Human Rights Act of 1977, Dumpson’s legal team successfully argued that the harassment she faced online limited her access to a public accommodation, her university. It was a significant victory for online civil rights, but her case raises an important question: why weren’t there laws or policies to protect her in the first place?

Part of the problem is that civil rights laws have yet to be updated for the 21st century. “No one predicted the internet when they wrote these laws,” says David Brody, a lead attorney in Dumpson’s case. “Only just now are these laws getting applied to the internet,” he added. A 2020 Lawyers’ Committee report that Brody co-authored shows that laws preventing discrimination online vary widely state-to-state, leaving large gaps in civil rights protections online. 

The second part of the problem is that social media platforms are designed to optimize for engagement — to keep people on their platform as long as possible. This sounds like a reasonable business goal, but the result is that oftentimes the platforms’ algorithms elevate the most extreme or offensive content, like racist threats against an African-American student body president, simply because it gets the quickest and most intense reactions. While Brody and the Lawyers’ Committee did not pursue this issue in the Taylor Dumpson case, experts agree that it is a major structural barrier to ensuring civil rights in the 21st century. Optimizing for engagement too often means optimizing for outrage, providing extremists and hate groups tools to spread and popularize their destructive ideologies.

Deeply rooted problems like these have created an internet that is often unsafe and unjust, particularly for people of color and women, who have long borne the brunt of online harms. They are left with an impossible choice: stay on social media and accept daily threats and harassment, or leave the platforms altogether, giving up on participating in the 21st century public square. In 2014, Black feminist bloggers like I’Nasha Crockett, Sydette Harry, and Shafiqah Hudson warned of the rise of online hate and disinformation – two full years before “alt-right” groups and Russia-funded “troll armies” wreaked havoc on public discourse during the 2016 U.S. presidential election.

The harassment of people of color and women on platforms owned by Facebook, Google, and Twitter illustrates larger problems that should concern us all. The digital tools and technologies we have come to depend on are largely owned by private companies driven to maximize profits — even at the expense of the civil rights protections guaranteed under U.S. laws and the Constitution. When clicks and viral posts are prioritized at any cost, democracy suffers.

Policymakers must recognize that we need to update our civil rights laws, and create new laws where necessary, to fulfill our nation’s constitutional promises. Within the private sector, tech companies must take it upon themselves to track and combat discrimination on their platforms and stop the spread of online hate. When they do not, we must build public movements to hold them accountable and demand equal access to civil rights protections. Structural problems require structural solutions, and Democracy Fund grantees are putting them forward.

The Digital Democracy Initiative is proud to fund groups like the Lawyers’ Committee, Data for Black Lives, and MediaJustice, which work to fill gaps in law and public policy — as well as groups like Stop Online Violence Against Women and Color of Change, whose work exposing and combatting coordinated hate and harassment specifically centers the concerns of people of color and women.

Democracy Fund supports coalition building, independent research, and policy development that hold platforms accountable to the public interest, not just their own profits. If you would like to get involved, here are three things you can do: 

  1. Learn more about root causes. Take a look at our systems map to gain a greater understanding of the interconnected nature of the issues we’re working on. 
  2. Support organizations working on these issues. This is incredibly important, particularly as budgets are strained during the COVID-19 pandemic. See our grantee database for the full list of organizations Democracy Fund is supporting. 
  3. Look for ways to make your voice heard. Grantees like Free Press and Color of Change regularly organize petitions to hold tech platforms accountable.

To learn more about our work, contact Paul Waters, associate director, Public Square Program, at pwaters [@] democracyfund.org. 

Blog

Legal Clinic Fund Expands Support for Local Newsrooms with Five New Grants to First Amendment Clinics

September 23, 2020

As law students across the country return to the classroom, many are also putting their education to work through legal clinics where they can help advance critical issues facing our democracy. From San Juan, Puerto Rico, to Cleveland, Ohio, First Amendment law students are helping defend local journalists and fight vital press freedom battles in what is shaping up to be the worst year in a decade for press freedom.


Blog

Social Media Transparency is Key for Our Democracy

August 11, 2020

According to the Pew Research Center, one in five Americans rely primarily on social media for their political news and information. This means a small handful of companies have enormous control over what a broad swath of America sees, reads, and hears. Now that the coronavirus has moved even more of our lives online, companies like Facebook, Google, and Twitter have more influence than ever before. And yet, we know remarkably little about how these social media platforms operate. We don’t know the answers to questions like: 

  • How does information flow across these networks? 
  • Who sees what and when? 
  • How do algorithms drive media consumption? 
  • How are political ads targeted? 
  • Why do hate and abuse proliferate? 

Without answers to questions like these, we can’t guard against digital voter suppression, coronavirus misinformation, and the rampant harassment of Black, Indigenous, and people of color (BIPOC) online. That means we won’t be able to move closer to the open and just democracy we need. 

A pattern of resisting oversight 

The platforms have strong incentives to remain opaque to public scrutiny. Platforms profit from running ads, some of which are deeply offensive. By keeping their algorithms secret and hiding data on where ads run, they sidestep advertiser complaints, user protests, and congressional inquiries. Without reliable information on how these massive platforms operate and how their technologies function, there can be no real accountability.

When complaints are raised, the companies frequently deny them or make changes behind the scenes. Even when platforms admit something has gone wrong, they claim to fix problems without explaining how, which makes it impossible to verify the effectiveness of the “fix.” Moreover, these fixes are often small changes that paper over fundamental problems while leaving the larger structural flaws intact. This trend has been particularly harmful for BIPOC, who already face significant barriers to participation in the public square.

Another way platforms avoid accountability is via legal mechanisms like non-disclosure agreements (NDAs) and intellectual property law, including trade secrets, patents, and copyright protections. This allows platforms to keep their algorithms secret, even when those algorithms dictate social outcomes protected under civil rights law.

Platforms have responded to pressure to release data in the past — but the results have fallen far short of what they promised. Following the 2016 election, both Twitter and Facebook announced projects intended to release vast amounts of new data about their operations to researchers. The idea was to provide a higher level of transparency and understanding about the role of these platforms in that election. However, in nearly every case, those transparency efforts languished because the platforms did not release the data they had committed to provide. Facebook’s reluctance to divulge data almost a year after announcing its partnership with the Social Science Research Council is just one example of this foot-dragging.

The platforms’ paltry transparency track record demonstrates their failure to self-regulate in the public interest and reinforces the need for active and engaged external watchdogs who can provide oversight. 

How watchdog researchers and journalists have persisted despite the obstacles

Without meaningful access to data from the platforms, researchers and journalists have had to design experiments that reverse engineer how platforms operate and mount elaborate efforts merely to collect their own data about them.

Tools like those developed by NYU’s Online Political Ads Transparency Project have become essential. Facebook created a clearinghouse promoted as a compendium of all the political ads posted to its platform, but it is NYU’s tool that has helped researchers independently verify the accuracy and comprehensiveness of Facebook’s archive and spot issues and gaps. As we head into the 2020 election, researchers continue to push for data as they raise the alarm about significant amounts of mis/disinformation spread through manipulative political groups, advertisers, and media websites.

Watchdog journalists are also hard at work. In 2016, the Wall Street Journal built a side-by-side Facebook feed to examine how liberals and conservatives experience news and information on the platform differently. Journalists with The Markup have been probing Google’s search and email algorithms. ProPublica has been tracking discriminatory advertising practices on Facebook.

Because of efforts like these, we have seen some movement. The House Judiciary Committee antitrust subcommittee’s recent hearing with the CEOs of Apple, Facebook, Google, and Amazon was evidence of a bipartisan desire to better understand how the human choices and technological code that shape these platforms also shape society. However, the harms these companies and others have caused are not limited to economics and market power.

How we’re taking action

At Democracy Fund, we are currently pushing for greater platform transparency and working to protect against the harms of digital voter suppression, coronavirus misinformation, and harassment of BIPOC by: 

  • Funding independent efforts to generate data and research that provide insight into the platforms’ algorithms and decision making; 
  • Supporting efforts to protect journalists and researchers in their work to uncover platform harms;
  • Demanding that platforms provide increased transparency on how their algorithms work and the processes they have in place to prevent human rights and civil rights abuses; and
  • Supporting advocates involved in campaigns that highlight harms and pressure the companies to change, such as Change the Terms and Stop Hate for Profit.

Demanding transparency and oversight has a strong historical precedent in American media. Having this level of transparency makes a huge difference for Americans — and for our democracy. Political ad files from radio and television broadcasters (which have been available to the public since the 1920s) have been invaluable to journalists reporting on the role of money in elections. They have also fueled important research about how broadcasters work to meet community information needs.

Public interest policies in broadcasting have been key for communities of color, who have used them to challenge broadcaster licenses at the Federal Communications Commission when broadcasters aren’t living up to their commitments. None of these systems are perfect, as many community advocates will tell you, but even this limited combination of transparency and media oversight doesn’t exist on social media platforms.

Tech platforms should make all their ads available in a public archive. They should be required to make continually-updated, timely information available in machine-readable formats via an API or similar means. They should consult public interest experts on standards for the information they disclose, including standardized names and formats, unique IDs, and other elements that make the data accessible for researchers.
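
To make the idea of standardized names, formats, and unique IDs concrete, here is one way a machine-readable ad-archive record could be structured. Every field name and value below is an illustrative assumption for discussion, not any platform's actual schema.

```python
# Illustrative sketch of a standardized, machine-readable political-ad record.
# All field names here are hypothetical examples, not an existing platform schema.
from dataclasses import dataclass, asdict
from datetime import date
import json


@dataclass
class PoliticalAdRecord:
    ad_id: str               # unique, stable identifier for the ad
    platform: str            # e.g. "facebook", "google", "twitter"
    sponsor_name: str        # standardized name of the paying entity
    sponsor_id: str          # unique sponsor ID, so the same buyer links across ads
    creative_text: str       # the ad copy shown to users
    spend_usd_lower: int     # disclosed spend range, lower bound
    spend_usd_upper: int     # disclosed spend range, upper bound
    impressions_lower: int   # disclosed impressions range, lower bound
    impressions_upper: int   # disclosed impressions range, upper bound
    first_shown: date
    last_shown: date
    targeting_summary: str   # human-readable description of targeting criteria


record = PoliticalAdRecord(
    ad_id="2020-000001",
    platform="facebook",
    sponsor_name="Example Advocacy PAC",   # hypothetical sponsor
    sponsor_id="SPONSOR-12345",
    creative_text="Make a plan to vote on November 3rd.",
    spend_usd_lower=1_000,
    spend_usd_upper=5_000,
    impressions_lower=10_000,
    impressions_upper=50_000,
    first_shown=date(2020, 10, 1),
    last_shown=date(2020, 10, 15),
    targeting_summary="Adults 18+ in Pennsylvania",
)

# Publishing records in a consistent, machine-readable form (here, JSON) is what
# lets researchers compare ads across platforms and over time.
print(json.dumps(asdict(record), default=str, indent=2))
```

If records like this were exposed through a continually updated public API, the comparisons researchers currently reconstruct by hand would become straightforward queries.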

Bottom line: we need new policy frameworks to enforce transparency, to give teeth to oversight, and to ensure social media can enable and enhance our democracy. Without them, the open and just democracy we all deserve is at real risk.

Blog

How Political Ad Transparency Can Help Fix Democracy’s Cybersecurity Problem

August 7, 2020

Without sufficient transparency and accountability, online platforms have become hotbeds for disinformation that manipulates, maligns, and disenfranchises voters, especially people of color and women. The Online Political Ads Transparency Project is critical to Democracy Fund’s Digital Democracy Initiative’s goal of providing greater transparency and oversight to combat coordinated disinformation campaigns, minimize misinformation, and define and defend civil rights online. 

There is nothing new about misinformation, dirty tricks, and voter suppression in the history of democracy. But as political campaigns – like much of the rest of public life – have moved online, so have tactics to subvert election outcomes. Political ads and messaging are micro-targeted at voters who have no idea who is paying to influence them or what their motives might be. Or, as Laura Edelson and Damon McCoy, researchers for the Online Political Ads Transparency Project at New York University’s Center for Cybersecurity, would put it, democracy has a cybersecurity problem. 

In May 2018, Edelson and McCoy found a perfect opportunity to study this problem: they decided to look at Facebook’s newly public, searchable archive of political ads. Facebook had released this archive following criticism that it was profiting from political ads while not disclosing information about them to the public. Unlike TV and radio broadcasters, which are required to report political ad buys to the Federal Communications Commission, online platforms like Facebook — to this day — are not legally required to do so. But while Facebook’s lack of transparency was technically legal, that doesn’t mean it was right. The democratic process is harmed when Americans don’t know who is attempting to influence them via political ads.

Diving into Facebook’s archive of political ads, Edelson and McCoy scraped information and used the resulting data to publish an analysis showing that from May 2018 to July 2018, Donald Trump was the largest spender on the platform — a key insight into political influence on Facebook. Unfortunately, Facebook eventually shut down the NYU team’s ability to gather information by scraping — but this was only a temporary setback. Facing mounting pressure from the research community, Facebook soon after created a way for researchers to obtain these data programmatically, via an API. This made it possible to run an ongoing analysis of the ad library corpus, rather than a one-time scrape covering a limited time period.
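
For readers curious what that programmatic access looks like, here is a minimal sketch of pulling records from an ad-archive endpoint and paging through the results. It assumes Python with the requests library; the endpoint, parameter, and field names are based on Facebook’s publicly documented Ad Library API as of around 2020 and should be treated as assumptions that may have changed, not a verified integration.

```python
# Minimal sketch: querying a political-ad archive API and following pagination.
# Endpoint, parameter, and field names are assumptions based on Facebook's public
# Ad Library API documentation circa 2020; verify against current documentation.
import requests

AD_ARCHIVE_URL = "https://graph.facebook.com/v9.0/ads_archive"  # assumed API version


def fetch_political_ads(access_token, search_terms, country="US", page_size=100):
    """Yield ad records matching a search term, one page at a time."""
    params = {
        "access_token": access_token,
        "search_terms": search_terms,
        "ad_type": "POLITICAL_AND_ISSUE_ADS",
        "ad_reached_countries": f'["{country}"]',
        "fields": "page_name,funding_entity,ad_creative_body,spend,impressions",
        "limit": page_size,
    }
    url, query = AD_ARCHIVE_URL, params
    while url:
        response = requests.get(url, params=query)
        response.raise_for_status()
        payload = response.json()
        for ad in payload.get("data", []):
            yield ad
        # Later pages are reached through the fully formed "next" link, if any.
        url, query = payload.get("paging", {}).get("next"), None


# Example: print each sponsor and its disclosed spend range for ads about "election".
# for ad in fetch_political_ads("YOUR_ACCESS_TOKEN", "election"):
#     print(ad.get("funding_entity"), ad.get("spend"))
```

The advantage over one-off scraping is exactly what the researchers describe: a query like this can be re-run on a schedule, turning analysis of the ad corpus into an ongoing process rather than a single snapshot.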

In all of this work, the researchers’ goal was to push Facebook to adopt better transparency policies by presenting the company with evidence of how inadequate its current policies were. But Edelson and McCoy were learning that this was an even more difficult task than they had expected. 

“When you are battling a traditional cybersecurity problem like spam,” explains Edelson, “the honest actors – whether it’s a bank, an insurance company, or something else – have incentives to change their behavior, because their customers will reward them with increased profits. But in this case, online platforms may have a long-term interest in being good citizens, but their short-term interest is in making money off of ads and targeted content, precisely the tools the bad actors are gaming. So it’s hard to get them to change.” In other words: social media platforms have competing motivations.

But the team did have one advantage: the power of public pressure. And they uncovered plenty of things that would worry the public. When they conducted a thorough cybersecurity analysis of how well Facebook was adhering to its own policies on political ad disclosure, they found numerous problems. More than half of the advertising pages they studied – representing $37 million of ad spending – lacked proper disclosure of which candidate or organization paid for the ads. Even when names of sponsors were disclosed, the information was sloppy and inconsistent.

They also identified “inauthentic communities” — clusters of pages that appeared to cater to different racial or geographic identity groups but did not adequately disclose how they were connected to each other.

Rather than going straight to the public with this information, Edelson and McCoy reached out to Facebook to share their findings, letting the company know that they planned to present their research publicly in May 2020 at the IEEE Symposium on Security and Privacy. And it did have an impact: in response, Facebook made internal changes that addressed some of these issues. 

This was a victory for the researchers, but the work continues, and many obstacles and mysteries remain. Sometimes the Facebook API stops working. Sometimes researchers find ads that are clearly political but are not included in the official ad library. And sometimes the aggregate ad-data reports that Facebook releases don’t match the raw data the researchers have collected.

But despite the difficulties, Edelson and McCoy persist. “I’m proud of the fact we’ve moved Facebook on transparency,” says Edelson, “but there is always more work to do. Voters need to know who is targeting them and how — and how much they are spending — to help them make informed decisions when they fill out their ballots.”

In 2020, the researchers are continuing to work on projects aimed at making Facebook and other platforms safer for our democracy. They have launched AdObserver, a browser plugin that gives Facebook users a way to volunteer data on the ads they are seeing. This will yield valuable information on whether ads are missing from the Facebook Ad Library, as well as information on targeting that the social media platform does not make available. And they are creating a new tool that will help civil society organizations – which represent people who are often targeted by such ads – quickly identify problematic ad campaigns. While there’s no doubt democracy still has a cybersecurity problem, the NYU researchers are working hard to protect it from threats.

Cover Photo: Laura Edelson and Damon McCoy of The Online Political Ads Transparency Project at New York University’s Center for Cybersecurity. Photo Credit: New York University. 

Report
Toolkit

Advancing Diversity, Equity, and Inclusion in Journalism: What Funders Can Do

Michelle Polyak and Katie Donnelly, Dot Connector Studio
October 16, 2019

Diversity, equity, and inclusion are fundamental to fostering robust American journalism that supports a healthy democracy. The failure of newsrooms to fully reflect their communities, to build a culture of inclusion that supports and retains diverse staff, and to foster equitable models of reporting that reflect the truth of people’s lived experiences is undermining trust in media and risking the sustainability of the press.

Foundations can play a role in addressing these concerns, but too often funders have exacerbated these problems through grantmaking that reinforces inequalities. Funders must therefore urgently refocus their efforts on diversity, equity, and inclusion (DEI) as the right thing to do, both morally and strategically.

Report
Toolkit

Guide to Assessing Your Local News Ecosystem

November 5, 2019

A step-by-step toolkit to help you gather the information you need to fund local news and information in your community.

Video

Dissatisfaction with American Democracy and Increasing Openness to Authoritarianism

July 6, 2020

While most Americans express belief in democratic values and a preference for a democratic political system, a new report published by the Voter Study Group, “Democracy Maybe,” suggests that our democracy is increasingly vulnerable. Democracy Fund President Joe Goldman joins Hill TV to discuss shifting American attitudes on authoritarianism and democracy amid an economic recession, a global health crisis, social unrest, and a polarizing election year.

Watch the video.

Blog

NewsMatch 2019 Campaign for Nonprofit News Was Best Year Yet

NewsMatch
June 3, 2020

Results from the 2019 NewsMatch cycle, published for the first time today, show that it was the most successful to date, with an initial pool of $3.37 million in philanthropic funds leveraged into a $43.5 million payout — a nearly 1,200% return on philanthropic investments that infused much-needed cash into independent newsrooms just as the coronavirus disrupted business as usual.

Cover photo by Elizabeth Hambuchen for Mississippi Today.
