Blog

2014 for 2016: Supporting Innovations in Voter Information

October 16, 2014

This post is by Tom Glaisyer, Kelly Born, and Jonathan Kartt. Tom Glaisyer is the Program Director of the Informed Participation Initiative at the Democracy Fund. Kelly Born is a Program Officer at the Hewlett Foundation, where she works on both Special Projects and the Madison Initiative, and Jonathan Kartt works in Programs & Evaluation for the Rita Allen Foundation.

Last week, we shared our early research on voter information platforms and the breadth of exciting new organizations that our research unearthed. The impetus: the Hewlett Foundation, the Rita Allen Foundation, and the Democracy Fund all share an interest in better equipping voters with the information they need to participate in elections, vote in ways that reflect their interests, understand candidate positions and ballot issues, and keep track of their representatives.

We partnered to explore dozens of these platforms, and quickly realized that we weren’t sure how best to support the field, or which groups to partner with. So the Hewlett Foundation and the Rita Allen Foundation crafted an RFP to solicit proposals from a handful of potential nonprofit partners, with the goal of funding them in a rapid-cycle innovation project. We were open to all kinds of ideas, and suggested a few possibilities:

  • Consulting Support: Because the ultimate success of any voter information platform depends on the quality of its design and resultant resonance with users, we suggested potential projects aimed at supporting design iteration and experimentation.
  • Implementation Support: These needed to be projects that were essentially shovel-ready, capable of being fielded before (and tested during) the 2014 election cycle.
  • Learning Support: There is much to be learned during this election cycle that might help inform later work in 2016. So we were open to jointly establishing a learning agenda for 2014 and then pairing nonprofit partners with researchers to test the effectiveness of different innovations.

Ultimately the proposals we received included some combination of all of these options.

Independently, the three foundations reviewed and assessed the pros and cons of all of the proposals, and between us we are now funding three public charities that responded to the RFP:

  • The Healthy Democracy Fund, to pilot its deliberative ballot decision-making approaches in Arizona and Colorado, and to conduct communications research around those efforts to understand what kinds of messaging work with voters.
  • MapLight, to further develop its Voter’s Edge tool so that it can be more easily embedded in other platforms (e.g., the sites of news outlets and civic organizations).
  • Seattle City Club’s Living Voters Guide, to further develop the site and to expand it to encompass not just ballot information but candidate data, including information from Voter’s Edge.

All of these projects include a research component to help understand what nonpartisan information resonates with voters, in hopes that we can learn and improve in future election cycles.

We are optimistic about the possibilities of these charitable projects, and about innovations in the sector more broadly – both for-profit and non-profit. These efforts offer hope that in future cycles citizens will have access to—and use—a wealth of information for even down-ticket races.

But we also have (lots of) questions:

  • When do people search for this information? How do they find it?
  • How do you expand the audience beyond political junkies to reach a broader population?
  • How useful do voters find this information? When and how does it actually influence decision-making?
  • What formats do voters prefer?
  • Do the platforms increase public trust in the political process, or might some, particularly those that offer candidate matching, increase polarization?
  • How can the platforms be sustained?
  • Are the approaches scalable? What level of data standardization is desirable or feasible? For example, it is currently easy to get information on Congressional candidates, but much harder to digitally aggregate even the names of candidates for down-ballot races, let alone any meaningful information about them.

We are wrestling with these questions, supporting some research with these partners to test aspects of them, and exploring more broadly how we can aid the emerging community of practice that exists around this next generation of nonpartisan voter information tools. As always, we welcome your comments.

Blog

The Rapidly Expanding Field of “Voter Information Platforms”

October 8, 2014

This post is by Tom Glaisyer, Kelly Born, and Jonathan Kartt. Tom Glaisyer is the Program Director of the Informed Participation Initiative at the Democracy Fund. Kelly Born is a Program Officer at the Hewlett Foundation, where she works on both Special Projects and the Madison Initiative, and Jonathan Kartt works in Programs & Evaluation for the Rita Allen Foundation.

How will voters find information in 2014?

For those who care about US democracy, this question is front and center in a world where both the structure of the news media and the channels through which voters get information are in flux. In the not-too-distant past, voters received most of their information about candidates and ballot measures through mass-market dailies, TV, or radio—places where the message was mediated by gatekeepers. The only opportunity to communicate directly with voters was through paid advertising or in-person contact. Nowadays, candidates have limitless options for reaching voters directly – even television, when delivered via satellite, permits hyper-targeting of political advertising messages.

But it’s not just campaigns that are exploiting these new digital opportunities—a host of (mostly new) organizations, non-profit and for-profit, are seeking not to win a vote, but to inform voters about their options.

It’s an exciting time for the field. Abroad, websites that match voters to policy positions held by parties, so-called voting advice applications, have seen significant adoption. In Germany, for example, Wahl-o-Mat was queried 13.2M times in 2013—not bad when you consider there are only 80M people in the country. In the US, we have encountered dozens of similar sites such as Vote411, ethePeople and Project VoteSmart.

The digitization of data permits an increasing amount of contextual information to be added to what was previously just a thumbnail sketch of a candidate or issue. For example, information on candidates or ballot initiatives can now be combined with “rules of the road” on where and when to vote, and what materials to bring. This digital “plumbing” is often under-appreciated—Google’s Civic Information API provides a way to look up polling places in 2014 and lists the candidates on the ballot. It builds on data from the Pew Charitable Trusts’ Voting Information Project and augments a recently developed iOS app.
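
To make that plumbing concrete, here is a minimal sketch of the kind of lookup the Civic Information API enables. It assumes you have a Google API key and that an election is active for the address queried (an electionId from the API’s /elections endpoint can also be supplied); the key and address below are placeholders, and the field names follow the v2 voterinfo endpoint as documented.

```python
# Minimal sketch: look up a polling place and ballot contests via
# Google's Civic Information API (v2). API_KEY and the address are
# placeholders; a real key comes from the Google Developers Console.
import requests

API_KEY = "YOUR_API_KEY"
VOTER_INFO_URL = "https://www.googleapis.com/civicinfo/v2/voterinfo"

def lookup_ballot(address):
    """Fetch polling places and ballot contests for a registered address."""
    resp = requests.get(
        VOTER_INFO_URL,
        params={"key": API_KEY, "address": address},
        timeout=10,
    )
    resp.raise_for_status()
    data = resp.json()

    # Polling locations come back as a list of places with postal addresses.
    for place in data.get("pollingLocations", []):
        addr = place.get("address", {})
        print("Polling place:", addr.get("locationName", ""), addr.get("line1", ""))

    # Contests list the offices and candidates (or referenda) on the ballot.
    for contest in data.get("contests", []):
        print("Contest:", contest.get("office") or contest.get("referendumTitle"))
        for candidate in contest.get("candidates", []):
            print("  ", candidate.get("name"), "-", candidate.get("party", "n/a"))

if __name__ == "__main__":
    lookup_ballot("1263 Pacific Ave, Kansas City, KS")
```

A voter information site can layer candidate profiles on top of exactly this kind of response, which is why the underlying data standardization matters so much.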

Recognizing the possibilities in this emerging ecosystem of voter information, the Hewlett Foundation, the Rita Allen Foundation and the Democracy Fund partnered to explore the dozens of voter information websites that have developed in the last few years. We examined a number of dimensions:

  • Candidates vs. Ballot Initiatives (or both): Many of the sites focus on candidates, while others, like Healthy Democracy in Oregon and Washington State’s Living Voters Guide, have (until recently) focused exclusively on ballot measures. Others, like ethePeople, Project VoteSmart, and PollVault, cover both.
  • Geographic Scope: Many provide national coverage, whereas others, like ethePeople, partner with media and civics groups in specific states or localities. MapLight’s Voter’s Edge covers national races while also offering some down-ballot coverage in particular states (in this case, California).
  • Audience: Some, like Ballotpedia, provide detailed information that might appeal more to policy wonks like ourselves, whereas Voter’s Edge or Who’s On The Ballot seek to serve those who prefer a less detailed view.
  • Approach: Sites like Voter’s Edge provide “just the facts” (on a lot of dimensions, including candidates’ prior jobs, campaign funding sources, etc.). Others, like the newly launched Crowdpac, use campaign funding sources to predict candidates’ positions, in an attempt to address the challenge of comparing a 30-year incumbent’s record to that of a first-time challenger who has never held office. ISideWith uses matching algorithms – it has now paired more than 11 million users with their “ideal” candidates based on answers to basic philosophical and political questions (e.g., “what is your stance on taxation?”); a minimal sketch of this style of matching appears after this list. Still others actually involve citizens in the deliberative process: Healthy Democracy in Oregon convenes a representative panel of dozens of citizens for a week to evaluate the pros and cons of a particular ballot initiative. The information is then shared with voters in the official voting guide. Research has shown how valued that information has been – a majority of Oregonians were aware of the tool, and roughly two-thirds of those who read the Citizens’ Initiative Review (CIR) statements found them helpful when deciding how to vote. In Washington State, the Living Voters Guide has utilized a deliberative platform to allow voters to share why they favor or oppose a particular initiative.
  • Business Models: Half of what we found are for-profit operations like Crowdpac and PollVault. The other half (most of what we’ve discussed herein) are nonprofit. So we spoke with venture capitalists who had invested in several of the for-profit competitors to understand their reasons for doing so, and to ensure that we felt there was a good rationale for philanthropic investment in this space.
  • Operating and Partnership Approaches: Some, like Project VoteSmart, rely on teams of dedicated interns, while others are striving toward more automated systems. We also looked at organizations’ partnerships – many, like ethePeople, collaborate extensively as part of their model, while others are closer to independent operators.
  • Use: Finally, we looked at use. Not much is known about the end-users of these types of voting information services beyond broad demographic statistics. In terms of numbers, some platforms have received a fair amount of uptake, whereas others are so new that no usage data is even available yet – however, no site appears to have come close to Wahl-o-Mat’s success in Germany.
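
To make the matching approach concrete, here is an illustrative sketch (hypothetical data and scoring, not ISideWith’s actual algorithm, which is not public): candidates are ranked by the share of overlapping issues on which their recorded positions agree with a voter’s answers.

```python
# Illustrative sketch of a voting-advice matching algorithm; all candidate
# names, issues, and positions are hypothetical, and this is not any
# platform's actual implementation.

CANDIDATE_POSITIONS = {
    "Candidate A": {"raise_min_wage": "yes", "carbon_tax": "yes", "term_limits": "no"},
    "Candidate B": {"raise_min_wage": "no", "carbon_tax": "no", "term_limits": "yes"},
}

def match_candidates(voter_answers, positions=CANDIDATE_POSITIONS):
    """Rank candidates by their rate of agreement with the voter's answers."""
    scores = {}
    for candidate, stances in positions.items():
        shared = [issue for issue in voter_answers if issue in stances]
        if not shared:
            continue  # no overlapping issues recorded; skip rather than guess
        agreements = sum(voter_answers[issue] == stances[issue] for issue in shared)
        scores[candidate] = agreements / len(shared)
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

for name, score in match_candidates(
    {"raise_min_wage": "yes", "carbon_tax": "no", "term_limits": "no"}
):
    print(f"{name}: {score:.0%} agreement")
```

Real voting advice applications typically add issue weights and graded (“somewhat agree”) scales, but the core ranking step looks much like this.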

This wide variety of activity left us with lots of questions: whether and how to support this field, whom to partner with, and on what kinds of projects. We have begun to explore these questions, and will discuss our early work on this topic in a follow-up post next week.

Blog

Guest Post: API Seeks Best Practices and New Tools for Fact-Checking

Jane Elizabeth
July 22, 2014

The practice of fact-checking is a core function of journalism in the 21st century. As American Press Institute (API) executive director Tom Rosenstiel argues: “Nothing comes closer to journalism’s essential purpose than helping citizens know what is true and what is not. And in an age when information is in greater supply, it is more important, not less, that there are trusted and skilled sources to help citizens sort through misinformation.”

During the 2014 U.S. election cycle, fact-checking projects emerged in print, television, radio and online newsrooms around the country in greater numbers than ever before. Dr. Michelle Amazeen, a Rider University professor and API researcher, found that mentions of “fact-checking” in media increased more than 75 percent between 2011 and 2012 alone.

From this year’s primaries to the general election in 2016, API’s fact-checking project, launched earlier this year with support from the Democracy Fund, is working to improve and develop fact-checking best practices and trainings for newsrooms that want to provide deeper coverage for their audiences. In fact, API’s new emphasis on fact-checking excellence is already reaching beyond U.S. borders. Kirsten Smith, a journalist in Ottawa, Canada, contacted API in May with a request for assistance for her “small newsroom with limited resources” to prepare for the 2014 municipal elections and the 2015 national election. “Have you a tip sheet or primer for a small scale fact check program?” she asked. API has developed a big-picture tip sheet precisely for requests like this, and we will be developing in-depth training programs based on upcoming research.

API also has developed blog features, convened a meeting of its researchers and media, participated in the world’s first international fact-checking conference, and is discussing additional funding sources with organizations interested in promoting better fact-checking. An important initial function of the project is to compile, curate, and examine the latest news from the fact-checking front.

A major part of the initiative brings together six experienced scholars from around the U.S. and the U.K. to work on projects designed to examine and improve the practice of fact-checking. Their topics include: the impact of fact-checking on political rhetoric; the effectiveness of rating systems like the Washington Post’s “Pinocchios”; readers’ changing perceptions of fact-checking; and a survey of journalists on the prevalence of fact-checking. The group joined API’s fact-checking project, announced in February, with plenty of experience in the study of information, misinformation, and how facts are processed. Here are the scholars, with a brief description of their work for API:

Michelle Amazeen, Rider University. Amazeen, a Temple University graduate who holds a Ph.D. in mass media and communication, will study the effectiveness of political fact-checking rating systems (such as the Washington Post’s “Pinocchios”). On Twitter @commscholar.

Lucas Graves, University of Wisconsin. Graves, who holds a Ph.D. in journalism from Columbia University, has written about fact-checking topics for CJR and other publications. He will study the effects of fact-checking on journalistic practice and is part of the team working on the study of rating systems. On Twitter @gravesmatter.

Ashley Muddiman, University of Wyoming. Muddiman, who holds a Ph.D. from the University of Texas, is part of the team that will study the effectiveness of rating systems. She also is involved in the Engaging News Project. On Twitter @ashleymuddiman.

Brendan Nyhan, Dartmouth College. Nyhan, who holds a Ph.D. from Duke University, will assist with the project on the effects of fact-checking and with a project examining how attitudes toward political fact-checking change over the course of a campaign. On Twitter @BrendanNyhan.

Jason Reifler, University of Exeter, UK. Reifler also holds a Ph.D. from Duke University. He will work with the teams studying the effects of fact-checking and changing attitudes during the course of a campaign. On Twitter @jasonreifler.

Emily Thorson, George Washington University. Thorson holds a Ph.D. from the University of Pennsylvania. She will examine how contextual information in news coverage can minimize misperceptions, and will work with the team studying the effectiveness of ratings systems. On Twitter @emilythorson.

The American Press Institute will combine the researchers’ work with the work of other scholars and API’s own research to identify what kinds of fact-checking are most effective at stopping misleading rhetoric and are most informative to citizens. In the second year of the program, API will conduct workshops and meetings and develop other resources aimed at supporting news organizations interested in fact-checking on the eve of the 2016 election cycle.

Have questions? Topics you’d like to see tackled? A good fact-checked story of your own? Contact me at jane.elizabeth (at) pressinstitute.org, 571-366-1116, @JaneEliz.

Blog

Local Journalism – What will the new ecosystem look like?

January 16, 2014

Last year, the Democracy Fund convened a cross section of journalists, editors, and media experts to begin a dialogue about the major issues facing the field. It was a productive discussion that has greatly informed our approach to ensuring that the public has the information it needs to make informed choices. Perhaps the clearest priority voiced at the forum (and one that has the greatest impact on our thinking) is the need to support and improve the quality of journalism at the local level.

The challenges for reporters and publishers at the local level are legion—audience size is limited, online advertising rates are nothing like the rates print publications obtained in the past, staff numbers at such outlets are small, and there are few opportunities for reporters to develop distinct capabilities or expertise. In recent months, the downsizing at Patch (AOL’s hyper-local network) and in Gannett’s community publishing division has just reinforced how tough this space is for all.

Since our spring 2013 meeting, I have been exploring how we can best understand the needs in this space, and I have been heartened by Michelle Ferrier’s research into news deserts and by the development of MediaCloud and MediaMeter, which map the level at which outlets such as the Boston Globe cover news stories. Thanks to these and other projects, we may soon be able to understand both coverage and consumption in a much more granular manner than before.

What I have become most interested in are three themes that appear to be emerging as local news ecosystems transition:

1. Collaboration and sharing at a regional level.

One solution to the challenge facing local journalism is higher efficiency in the production of stories, or broader distribution through regional collaborations. As Jan Schaffer’s very useful recent research shows, collaborative efforts are emerging across the country. In Colorado, a local collaboration led by the I-News Network started off as an independent organization, has since become part of a local PBS television station, and has built partnerships with 21 other outlets. In New Jersey, Montclair State University’s School of Communication and Media is hosting NJ News Commons, an effort to build collaboration between outlets within the state; this includes a story exchange as well as training for partner organizations. Other partnerships, such as IdeaStream in Northeast Ohio, which combines public television, radio, public access cable, and an online engagement platform, show how collaboration can grow within public media.

Regional and topic-focused collaborations have also emerged. In radio, there is the StateImpact project across public radio and partnering outlets. In public television, multiple local journalism centers have been set up. How much of this infrastructure will survive in the long term is unclear, but collaboration, often in a non-traditional manner, seems to be central to the provision and distribution of public interest journalism.

2. Specialization of outlets around news beats.

All too often, reporters at local papers simply do not have the bandwidth to develop the specialized knowledge they need to cover complex stories. Outlets that focus on a single beat can address this challenge by enabling local media to build on top of their reporting and add a local flavor. InsideClimate News, winner of a Pulitzer for National Reporting in 2013, is perhaps one of the best-known examples of a successful vertical outlet, and it actively encourages republishing of its stories. ProPublica goes a step further and provocatively asks people to steal its stories. The presence of non-profits such as the Food and Environment Reporting Network suggests that there is momentum in the provision of specialized beat news. In particular, Homicide Watch has been lauded for its coverage in D.C. and has expanded to Chicago via a partnership with the Sun-Times.

3. Provision of services by a central organization.

Another solution to improve local coverage is for small outlets to rely on a central entity to provide them with shared resources. The Shorenstein Center publishes Journalist’s Resource with the objective of giving reporters easy access to relevant academic scholarship. The Investigative Reporters and Editors organization has long provided datasets and operates DocumentCloud. For its part, our new grantee, the Investigative News Network, provides a customized WordPress configuration that it is willing to tailor further and host for organizations. The Public Insight Network, operated by American Public Media, serves as a source-development platform for a number of outlets. The soon-to-be-launched FOIA Machine platform is another great example. Nearly all of these are solutions narrowly tailored to a particular challenge, but all seem to represent a promising trajectory.

There are many reasons to be cautious about the future of local news and journalism, but the potential impact of these three threads coming together in the right way appears considerable. We don’t know all the answers, or how this field will develop, but we will continue to explore the space and welcome input on Twitter: @tglaisyer.

Blog

Guest Post: Journalism educators — Have you a project that will energize your local media ecosystem?

Jane McDonnell
December 2, 2013

Today ONA launched the application process for a contest for journalism educators to experiment with new ways of providing news and information.

We know that you and your fellow j-school colleagues have been talking about an innovative experiment that will shake up your curriculum. There’s a talented student who just needs the right mix of collaboration and inspiration to fulfill her promise. You have a local media partner willing to work with you and a cool engagement platform in mind. Researcher? Check. Designer? Could be. Developer? In the wings.

You’ve got the right ingredients to apply for the 2014 Challenge Fund for Innovation in Journalism Education, and to secure up to $35,000 in the form of a micro-grant that can push your idea to launch and—we hope—make both your curriculum and your local news landscape stronger. The competition, run by ONA and funded by a collaborative that includes the Excellence and Ethics in Journalism Foundation, the Robert R. McCormick Foundation, the John S. and James L. Knight Foundation, and the Democracy Fund, will support live news experiments that further the development of “teaching hospital” models in journalism education, in which innovative projects are created by teams of educators, students, professionals, technicians, and researchers.

Micro-grants will be awarded to 15 to 25 projects to be completed during the 2014-2015 academic year. Irving Washington, ONA’s Director of Operations and Challenge Fund administrator, advises applicants: “Your project should stretch the limits of what you think you can do. Don’t be afraid to fail. The goal is to empower journalism schools to lead professional innovation and thought leadership. The size of your school or program shouldn’t limit the project’s ambition.”

Teams will be selected based on ideas that show the most potential for:

  • encouraging collaborative, student-produced local news coverage
  • bridging the professor-professional gap
  • using innovative techniques and technologies
  • and producing shared learnings from their digital-age news experiments

The competition will culminate in at least one substantial grand prize for the project most likely to change local newsgathering, journalism education, or both. An overall prize will be given for the best project evaluation, regardless of the experiment’s outcome. The winners and their projects, chosen in consultation with academic advisers and ONA leaders, will be featured at upcoming ONA conferences and other news media education events.

For inspiration, FAQs and resources, visit journalists.org and follow the conversation on #hackcurriculum.

Have questions? Email challengefund@journalists.org.

The deadline is Feb. 13, 2014, and winners will be announced in April 2014.

Good luck—we can’t wait to see what you come up with.

Blog

Guest Post: Engaging News Project Results

Talia Stroud
September 18, 2013

It has been an exciting debut year for the Engaging News Project. The aim of the project is to research democratically beneficial and commercially viable tools and strategies for engaging online news audiences. Below, we detail four takeaways from our work.

Use a “Respect” Button in Comment Sections.

“Like” is a common button on news websites. You can “like” news organizations, articles, and others’ thoughts in comment sections. The use of the word “like,” however, may exacerbate partisan reactions to news and comments. The word asks people to think in terms of agreeing or disagreeing, approving or disapproving. But not all words inspire the same reaction. Indeed, several organizations have incorporated other buttons onto their sites: the Tampa Bay Times has “Important,” Civic Commons “Informative,” and Huffington Post “Bored,” to name but a few. We tested a new word: “Respect.” Perhaps asking people to “Respect” others’ comments would lead to different behaviors in a comment section compared to “Like,” or another frequently used word, “Recommend.” The results were encouraging. From a democratic angle, “Respect” led people to click on more comments expressing different political viewpoints. From a business angle, “Respect” resulted in more clicks overall, particularly for some topics. We encourage news organizations to consider using this new button.

Use Fact-Based Interactive Slider Quizzes.

Online polls are regular features on news sites. They solicit site visitors’ opinions on everything from proposed legislation to how a sports team will fare during an upcoming season. Truth be told, however, the poll results are of limited value. They offer no insight into the sentiments of a broader public, and they should not be used to form an opinion or to inform policy. The main purpose of these features is to keep people on a news page longer. We hoped that we could find a way to increase the democratic value of these interactive tools. Instead of polls that ask people to report their opinions, we tested quizzes that ask people about a fact and then provide reliable information. For example, a question could be posed about what percentage of the federal budget is spent on Social Security. Quiz questions also could ask people to predict public opinion about a topic, giving results based on reliably gathered public opinion data. We tested two types of polls, a multiple-choice poll and a slider poll, using facts drawn from the Democracy Fund’s work with the Oregon Citizens Initiative Review (CIR). As Peter Levine explained in his earlier blog post, that project involves sending Oregonians pamphlets, created by a representative group of 24 citizens, that explain ballot measures.

We first conducted a laboratory test on polls like these. We learned that people spend more time with slider polls and that these interactive tools improve learning compared to just telling people a fact (e.g., “nearly half of Oregon voters were aware of the CIR’s explanations”). Next, we partnered with a news organization that let us include polls on its site. The code randomized whether people saw a multiple-choice or slider poll; a sketch of this kind of random assignment appears at the end of this post. Results showed that including more than one poll, and at least one slider poll, can increase the amount of time people spend on a webpage.

Get Involved in the Comment Section.

In visiting with news staff across the country as part of this project, we were struck by the tremendous variability in comment section practices. Some news organizations actively cultivate a vibrant online community. Others essentially ignore the section, writing it off as a wasteland that is included on a site because having people argue, the theory goes, could increase their time on site. Calls by political leaders from President Obama to Texas Governor Perry to improve the civility of public discourse inspired us to examine strategies for combatting incivility in comment sections. We worked with a local television station to randomize what took place following 70 different political posts to its active Facebook page. For some posts, the station’s popular political reporter interacted with commenters. For other posts, the station interacted with commenters. And for yet other posts, no one interacted with commenters. Results showed that having a political reporter interact with commenters can improve civility in the comment section.

Use Caution in Labeling Hyperlinks.

Our final project analyzed how hyperlinks are labeled on news sites. Right now, they are commonly labeled as “Top Stories” or as “Most Popular Stories.” We wondered whether other phrases, developed based on research in psychology, communication, and political science, could affect people’s on-site behavior. For example, we analyzed whether including the phrase “Follow the issues that worry you” would influence people’s on-site behavior and attitudes about politics. We conducted a lab experiment where we compared how people behaved on news sites that were identical in all ways except one: a single phrase included on the site. Results were decidedly mixed. All six of the phrases that we evaluated had effects, but none of them had uniformly positive business and democratic outcomes. For example, the phrase “Follow the issues that worry you” resulted in some respondents displaying less polarized political views, but it had no discernible business effect. As a result of our study, we can’t recommend any of the phrases that we tested. But we can report with confidence that news organizations should tread cautiously in adding new phrases to their sites. A single phrase can affect people’s attitudes and behaviors.

For the Engaging News Project, these results are the beginning of our work to help newsrooms. Many new tools and strategies remain to be discovered and evaluated, and we look forward to continuing our work in the coming years. If you’re interested in getting involved, you can follow us on Twitter, like us on Facebook, sign up for our newsletter, and email us with any suggestions you might have!
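
As a rough illustration of the random assignment described above (hypothetical code, not the Engaging News Project’s actual implementation), a site might deterministically bucket each visitor so that a returning visitor always sees the same poll variant:

```python
# Hypothetical sketch of randomized variant assignment for an on-site
# experiment; not the Engaging News Project's actual code. Hashing a
# stable visitor ID yields a consistent variant for each visitor.
import hashlib

VARIANTS = ["multiple_choice", "slider"]

def assign_variant(visitor_id: str, experiment: str = "poll_format") -> str:
    """Deterministically map a visitor to one of the poll variants."""
    digest = hashlib.sha256(f"{experiment}:{visitor_id}".encode()).hexdigest()
    return VARIANTS[int(digest, 16) % len(VARIANTS)]

# The same visitor always lands in the same bucket across page loads:
print(assign_variant("cookie-1234"))
print(assign_variant("cookie-5678"))
```

Deterministic hashing, rather than a fresh coin flip on every page load, keeps the experience stable for each visitor while still splitting traffic roughly evenly between variants.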

Blog

Guest Post: Supporting a Beleaguered News Industry

Peter Levine
June 17, 2013

(This is the second in a series of blog posts by CIRCLE, which evaluated several initiatives funded by the Democracy Fund to inform and engage voters during the 2012 election. These posts discuss issues of general interest that emerged from specific evaluations. Join CIRCLE for an ongoing discussion of the posts using the hashtag #ChangeTheDialogue, as well as a live chat on Tuesday, June 25th at 2pm ET/1pm CT/11am PT.)

Two Democracy Fund grantees—the Center for Public Integrity (CPI) and the Columbia Journalism Review—worked to support reporters and editors in order to improve their election coverage and better inform the public on key issues of national concern. We evaluated these initiatives by interviewing some of the potentially affected journalists, 97 in all. One theme that emerged very clearly was the challenging situation that confronts the news industry.

This context has been well documented in other research. For example, according to a study of the changing news environment in Baltimore, conducted by the Pew Research Center, the number of news outlets in the city has proliferated to 53 “radio talk shows, . . . blogs, specialized new outlets, new media sites, TV stations, radio news programs, newspapers and their various legacy media websites.” But the number of reporters has fallen. That means there is more written and spoken about the news than ever, but it is highly repetitive. A search of six major news topics found that 83 percent of the articles and blog posts repeated the same material—sometimes with commentary—and more than half the original text came from paid print media such as the Baltimore Sun. In turn, Baltimore’s remaining professional journalists are so overstretched that they cannot provide what is called “enterprise reporting” (digging to find new information not already in the public domain). The city government and other official institutions now have more, rather than less, control over the news. The report notes, “As news is posted faster, often with little enterprise reporting added, the official version of events is becoming more important. We found official press releases often appear word for word in first accounts of events, though often not noted as such.”

Our interviews found ample evidence of similar conditions. One reporter said, “the political reporting in our state has shrunk to the point where a lot of the major reporters are ones that have been doing it for decades and, quite frankly, their reporting (and lack of digging) reflects how tired they are.” On the whole, our interviewees were very pleased to be provided with support in the form of CPI’s in-depth reporting and the Columbia Journalism Review’s coverage of their work. For example:

  • “Without that kind of work I don’t know how one could sort themselves through what’s happened, unless they’ve been following for the past 5 years.”
  • “Without Open Secrets and CPI I don’t know how a journalist who is new could figure this stuff out.”

They noted various ways in which these interventions had affected them. They mentioned learning about good practices used at other newspapers, getting ideas for stories, and encouragement of high-quality work. Commenting on the CJR’s effort, one reporter said, “It sort of serves as a watchdog to remind people to do a good job, to do a thorough job, to look for fresh angles, to dig beneath the surface, and, ah, hopefully those are things that I’m doing already.”

Local coverage emerged as an area that needs special attention and support. As a reporter told us, “One of the faults with journalism coverage and journalism criticism, in general, is that it tends to focus on the big national players and the big national issues. And as we’ve seen a number of major publications pull back on local coverage …, it’s become all the more important that we have some sort of press criticism function taking care of local media and engaging with local media. And I think that a lot of reporters working locally and regionally would benefit from that sort of attention and that sort of engagement as well.”

There were, however, a few concerns that also related to the limited capacity and fragile financial condition of the news industry. CPI’s model is to provide in-depth reporting that news outlets can use in writing their own articles and broadcasts, and a few respondents were worried that CPI might become a competitor for readers. The Columbia Journalism Review wrote appreciative as well as critical articles about political news coverage, but a few respondents felt that these articles did not demonstrate adequate sensitivity to the limited capacity of local newsrooms. Although most interviewees were pleased with the CJR’s coverage, the relatively few respondents who felt it was unfair were likely to think that the CJR’s correspondent had overlooked their limited capacity to accomplish what was being suggested.

CIRCLE’s interviews suggest the following conclusions:

  • Because of staffing cuts and turnover in the profession, the news media struggle to cover politics. Journalists are aware of their difficult situation and generally grateful for assistance.
  • Providing high-quality information and constructive criticism does change reporters’ behavior.
  • Professionals in the news media are understandably somewhat sensitive about being given advice unless the person offering it recognizes the practical limitations they face. They are also concerned about being manipulated by ostensibly nonpartisan organizations that they fear may have partisan objectives. (See our previous blog post on the problem of distrust.)
  • Interventions designed to support the news media should not inadvertently compete with the news media by taking away readers or viewers.

The previous entries in the series can be accessed below:

1 – Educating Voters in a Time of Political Polarization

Blog

Guest Post: Educating Voters in a Time of Political Polarization

Peter Levine
June 13, 2013

(This is the first in a series of blog posts by CIRCLE, which evaluated several initiatives funded by the Democracy Fund to inform and engage voters during the 2012 election. Our posts discuss issues of general interest that emerged from the specific evaluations.)

During the 2012 campaign season, the Democracy Fund’s grantees experimented with a wide range of strategies to educate and engage the public. Some produced videos and other educational content to directly inform the views of voters. Others worked with journalists to improve the information that the public receives through local and national media. In all cases, CIRCLE’s evaluations found that the public’s polarization made it significantly more difficult for these efforts to achieve their goals; polarized individuals often resisted the messages and opportunities offered to them.

Americans perceive the nation as deeply divided along political lines. In February 2013, according to the Bipartisan Policy Center, 76 percent of registered voters said that American politics had become more divisive lately and 74 percent believed that this trend was harmful. Academics disagree somewhat about the degree of polarization and whether it has become worse over time, but few doubt that political polarization can exacerbate fear and distrust, prevent people from understanding alternative perspectives and considering challenges to their own views, and reduce the chances of finding common ground.

The challenges of engaging polarized citizens emerged clearly in CIRCLE’s evaluations. For example, FlackCheck.org produced parody videos that taught viewers to reject deceptive campaign advertisements. In testing whether these videos were effective, we showed representative samples of Americans real campaign advertisements that we considered misleading. One example, “Obamaville,” produced by Rick Santorum’s campaign, displayed President Obama’s face alternating with that of Iranian President Mahmoud Ahmadinejad on a television screen in a post-apocalyptic setting.

More than 80% of Democrats but fewer than 20% of Republicans considered this video “invalid and very unfair.” Among the Republican viewers, some made comments like this:

  • “It does make him look like a threat…He is a threat to the United States and the well being of the people and welfare of our country…”
  • “Tells the truth about Obama”
  • “TO SHOW VERY CLEARLY WHAT OBAMA IS DOING AND TAKING THIS BEAUTIFUL COUNTRY! BELIEVE IN OBAMAVILLE”

We showed a different sample of respondents a MoveOn advertisement entitled “Tricky Mitt,” in which Mitt Romney’s image faded into Richard Nixon’s.

More than 70% of Republicans and fewer than 10% of Democrats considered that video “invalid and very unfair.” Some Democrats made critical comments about “Tricky Mitt” (e.g., “Accusatory, urges the viewer to associate guilt with Romney, not reflective of what I expect from politicians”), but many were positive about the video, saying things like this:

  • “excellent”
  • “entertaining and points out the crookedness of Romney”
  • “Giving us information that we didn’t know about. All true”
  • “I think it exposed the truth about Romney of what kind of person he really is.”

Essentially, people approved of ads that supported their own partisan position and criticized or invalidated ads that threatened their preexisting beliefs, although both ads we tested were deceptive.

We also evaluated Bloggingheads.tv videos, which showed pundits of opposing political persuasions taking part in civil discussions about controversial issues. We asked people who watched various videos a set of scale questions that measured their openness to the other side; an example of a question in this scale was “I have revised my thinking on the issue.” Regardless of which video they watched, the strong partisans were always less open to deliberation.

Strongly polarized statements also emerged in many of the open-ended questions that CIRCLE asked in its evaluations of Democracy Fund grantees. For example, we asked a representative sample whether they ever shared political videos. Out of 195 respondents who chose to explain why they did so, 24% mentioned anti-Obama goals, often adding very strongly worded comments against the president. (“Obama confessing to being a Muslim”; “A black heavy set lady going on about Obama care, and that we should go ahead and work to pay for her insurance”; “Michelle Obama whispering to B.O., ‘all this over a flag!’”; “I come from a military family and I am extremely offended by the both of them. I have never seen a more un-American couple in the White House!”). Another 17 percent mentioned anti-Romney videos, often the Mother Jones video about the “47%.”

Some of the Democracy Fund grantees did not directly influence average citizens, but rather worked to support professionals at newspapers or broadcast stations. In general, these journalists, editors, and station managers seemed less prone to partisanship than average citizens. However, some reporters expressed skepticism about the neutrality of FlackCheck.org and wondered whether it had a partisan agenda. “I am suspicious of so-called non-partisan fact checkers,” one said. A broadcast station manager, asked how he or she would react to being told that a given ad was misleading, said, “It would be difficult to determine the true nature of the intent [behind the criticism] or that the third party was indeed unbiased.” These responses suggest that an atmosphere of polarization and distrust may create challenges even for organizations that work with nonpartisan professionals.

Going forward, the Democracy Fund and its grantees may consider a range of possible strategies, such as:

  1. Focusing at least some attention on youth and young adults, since young people tend to be less committed to partisan and ideological views and remain open to and interested in alternatives.
  2. Finding ways to get people of different ideological persuasions into sustained contact with each other, since simply knowing fellow citizens with different views makes it more difficult to stereotype and demonize them. Actually collaborating with a diverse group of people on some kind of shared goal can be especially helpful.
  3. Experimenting with new messages and formats that educate polarized adults more effectively.

Blog

Improving Local Coverage

Greg Marx
May 17, 2013

When my colleagues and I at the Columbia Journalism Review began the Swing States Project—critiquing and seeking to improve the quality of coverage in nine key states during the 2012 campaign—we weren’t sure quite how we would be received. Nobody likes a backseat driver, after all, and morale in many newsrooms—especially those owned by “legacy” media companies—is not necessarily high at the moment.

To be sure, we ended up with our share of angry emails, tweets, and phone calls from journalists around the country who felt our critiques hadn’t quite found the right line. But we were pleased to discover that, far more often than not, reporters and editors were open to what our team of correspondents had to say—even when it was critical. They were keen to employ suggestions about how local TV station records can reveal who’s spending big money to swing election results, and eager to learn best practices for beating back political misinformation. When local reporters came across outstanding journalism, they would often share it with our writers, and of course, they appreciated it when we praised their good work. Most gratifying of all, we encountered journalists who engaged with our critique of their work—who pushed us to be better critics, and who were ready to be pushed to better serve their communities.

Much has been said and written—including, fairly recently, at CJR.org—about the diminution of public-affairs coverage at the state and local level. The numbers showing a decline in reporters and in story counts are indeed grim, and, as we observed firsthand during 2012, coverage in many markets is patchy. But we also saw plenty of examples of “laurel”-worthy coverage, and an appetite for resources, tools, and know-how that will allow journalists to cover politics and policy better.

As our initiative has evolved in 2013 into the United States Project, we have tried to meet that appetite. Our correspondents in the Mountain West, the Great Lakes, the Midwest, the mid-Atlantic, California, Florida, and Texas monitor coverage of federal, state, and even city issues in their regions, highlighting stellar work and identifying missed opportunities. They cover the experimentation in editorial and business-side models to support this sort of journalism in a challenging economic environment. And they are building networks of reporters with which they share resources, reporting strategies, and story ideas.

Along with our regional roster, we have five “national” contributors—writers on the healthcare, tax and budget, money-in-politics, and factchecking beats, plus a roving reporter. Their subject-area expertise is a resource for our entire team, and they regularly produce primers on coverage of complicated subjects—like the rollout of the new health insurance “exchanges,” or how to tell when your congressman is skirting ethics laws to enjoy a lobbyist-sponsored junket—designed to be of use to state and local political reporters.

Going forward, we expect to find new harmonies both among the regional roster and between the regional and national teams. As we look ahead to the 2014 elections and the many policy battles to be fought (and covered) before then, our goal is that the project will serve as a second layer of editorial support—providing practical guidance and constructive criticism, and exhorting journalists around the country to set ambitious standards for their work.
For many years, CJR’s motto was “Strong press, strong democracy.” It’s not just the “press” anymore—but the old aphorism still applies.

Blog

Recognizing and Rejecting Patterns of Deception

Kathleen Hall Jamieson
April 18, 2013

During the 2012 election, FlackCheck.org flagged two different kinds of recurrent deceptions to put candidates on notice and increase public understanding of the substance of presidential campaigns. The first featured fabulations, such as “Romney opposed abortion even in cases of rape and incest” and “Obama ‘gutted’ the work requirement in welfare reform,” that persisted in the face of debunking by the major fact checkers. The second drew on campaign rhetoric to illustrate “patterns” – including false logic and misleading uses of language – that campaigns use to invite false inferences or propel audiences toward unjustified conclusions.

Two statements made by Bill Clinton and Newt Gingrich exemplify what we mean by a pattern of deception. In late 2011, Gingrich claimed that “I balanced the budget for four straight years…” and last summer Clinton said, “I gave you four surplus budgets for the first time in more than 70 years…”

Instead of crying “false” (because their level of self-congratulation is unwarranted) or “partially true” (because each did play a role in balancing budgets), the Detecting Patterns of Deception page identifies the misleading move that Clinton and Gingrich share as “Overestimating an Individual’s Power.” Each is claiming full credit for balanced budgets when the plaudits should be shared with many others, ranging from the Congressmen who supported the deficit reduction packages of two administrations to the Federal Reserve’s monetary policy and those who created the tech boom of the 1990s.

Efforts to reject misleading moves and deception have been around for a long time. Since Aristotle defined thirteen fallacies, theorists have fashioned primers to protect audiences from seductive errors in reasoning and machinations that cloud judgment. FlackCheck’s Detecting Patterns of Deception page has followed this tradition, defining and illustrating 28 deceptive patterns clustered into six categories: Overestimating Power, Misleading Language, Misleading Audio-Visual Cues, Not Telling the Whole Story, False Logic, and Hypocritical Attack.

With this work, we are targeting those too young to have developed the strong partisan reflexes that produce confirmation bias. We expect that regular exposure to the Detecting Patterns of Deception page will teach even those who rationalize their own side’s excesses to spot the sorts of recurrent moves that would have made Machiavelli proud. In the “more difficult but doable” category of goals, we expect that our explanations will increase our audience’s understanding of how these inference-forging moves mislead. A tougher objective aspires, over time, to translate recognition and understanding into disapproval. In the “maybe under some circumstances” box, we hope (but with longer odds) that among at least some of our audience, our process of labeling, defining, explaining, and illustrating will lead them to reject the deceptive pattern regardless of the ideology of the candidate or cause employing it.

Put more technically, the Detecting Patterns of Deception pilot project assumes that IF:

a) we craft clear definitions that schematize the relationships among our Patterns of Deception,
b) we identify cogent exemplars from both left and right to populate those schemas, and
c) we familiarize, over time, those who have not yet formed strong partisan attachments (i.e., high school and first-year college students) with the categories embodied in the labels, the explanations of why each is problematic, and illustrations of the misleading moves from both left and right,

THEN WE WILL:

d) enhance audience political acuity by increasing recognition of misleading moves in ongoing campaigns and issue debates, understanding of their misleading nature, and disapproval and rejection of them regardless of their source, and do so without activating cynicism.

To see how well the categories illumine the gun control debate, take a look at the rhetoric we’ve labeled “out of context,” “overgeneralization,” “ad hominem,” “slippery slope,” “red herring,” “false categorical,” and “guilt by association.”
