Update June 2022: This project has been renamed from FACT CHAMP to Co·Insights. Updated information is available on the project page.

Misinformation is proliferating across the globe, but research into its nature and origins is often disconnected from the tools and techniques used to combat it.

Now, a team of nonprofit, academic, industry, and fact-checking organizations has received a 12-month, $750k grant from the National Science Foundation’s Convergence Accelerator to create a new platform to narrow the gap between research into misinformation and responses designed to curb it. The project’s initial focus will address misinformation in partnership with Asian American and Pacific Islander (AAPI) communities.

Partners in this project, called FACT CHAMP, include Meedan, a technology nonprofit focused on fact-checking tools; the Annenberg Public Policy Center of the University of Pennsylvania and its FactCheck.org project; the University of Massachusetts Amherst, University of Connecticut, and Rutgers University; and AuCoDe, a start-up that uses artificial intelligence to detect and analyze disinformation.

FACT CHAMP stands for Fact-checker, Academic, and Community Collaboration Tools: Combating Hate, Abuse, and Misinformation with Minority-led Partnerships. The project is designed to advance scientific understanding of how trust, misinformation, abuse, and hateful content affect underrepresented groups. The project is one of the 2021 cohort of Phase I projects on Trust & Authenticity in Communications Systems (Track F) within the NSF Convergence Accelerator.

"Problems faced day-to-day by fact-checkers are inspiring new research challenges and can open up new datasets for research," said Scott Hale, the project’s principal investigator and Meedan’s director of research. "At the same time, developments in computer science and social science can create new tools, approaches, and understandings to better counter misinformation in practice."

Co-PI Kathleen Hall Jamieson, director of the Annenberg Public Policy Center and a cofounder of FactCheck.org, said, "Identifying the misinformation and patterns of deception to which different societal groups are exposed is a prerequisite of effective fact-checking."

The first phase of this project, conducted over the next year, will involve working closely with leaders in AAPI communities to prototype and design a platform using tiplines, claim-matching, and state-of-the-art controversy detection models to help identify and triage potential misinformation within and about their communities.
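To make the claim-matching step concrete, here is a minimal, illustrative sketch of the general idea: an incoming tipline message is compared against a database of previously fact-checked claims and routed to an existing fact-check when the texts are similar enough. The example claims, the similarity threshold, and the use of TF-IDF are hypothetical stand-ins, not the project's actual implementation; a production tipline would more likely rely on multilingual sentence-embedding models and human review.

```python
# Minimal claim-matching sketch: compare an incoming tipline message against
# previously fact-checked claims by text similarity. TF-IDF cosine similarity
# stands in for the multilingual embedding models a real system would use.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Hypothetical database of already fact-checked claims.
fact_checked_claims = [
    "Drinking hot water cures COVID-19.",
    "The COVID-19 vaccine alters human DNA.",
    "5G towers spread the coronavirus.",
]

def match_claim(tip_text, claims, threshold=0.35):
    """Return (best_matching_claim, similarity); claim is None below the threshold."""
    vectorizer = TfidfVectorizer().fit(claims + [tip_text])
    claim_vectors = vectorizer.transform(claims)
    tip_vector = vectorizer.transform([tip_text])
    similarities = cosine_similarity(tip_vector, claim_vectors)[0]
    best = similarities.argmax()
    if similarities[best] >= threshold:
        return claims[best], float(similarities[best])
    return None, float(similarities[best])

tip = "Is it true that drinking hot water can cure covid?"
claim, score = match_claim(tip, fact_checked_claims)
# If a match clears the threshold, the tip can be answered with the existing fact-check.
print(claim, round(score, 2))
```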

"This project feels urgent, as it is really an intervention that engages with the tragedies that have spurred the #StopAAPIHate movement," said Jonathan Corpus Ong, a co-PI and associate professor in global digital media at UMass Amherst. "It’s essential we conduct more multidisciplinary research into how AAPI community members navigate the diverse threats of the contemporary digital environment, from racist conspiracy theories to an extremist right-wing ideology, that prey on the AAPI community’s current state of fear and anxiety."

Subsequent work in this project will include developing smartphone-based self-help resources for AAPI community leaders and building infrastructure to securely share data and challenges with researchers who are investigating and addressing the ways misinformation propagates. The infrastructure will also allow academic solutions to be more easily used in practice.

Kiran Garimella, a co-PI and assistant professor at Rutgers University, said, "Working with fact-checking organizations and communities is an exciting opportunity but entails significant coordination costs. I am confident our team can reduce these burdens, making such collaborations easier and paving the way for amazing interdisciplinary projects in this space."

To ensure the project’s success, the multidisciplinary team will conduct three proof-of-concept activities over the next nine months to determine the best approaches for enabling meaningful collaboration between researchers, fact-checkers, and community leaders to combat hate, abuse, and misinformation. Through these activities, the team will advance research on the detection of controversial and hateful content, improve its understanding of hate speech and misinformation, and develop new tools and adapt existing ones to create collaboration infrastructure aligned with the needs of researchers, practitioners, and communities. At the end of Phase I, the team will take part in a formal proposal and pitch evaluation; if selected for Phase II, it may receive up to $5 million to advance the solution toward real-world application.

"The scope and scale of information ecosystem threats online, including disinformation, weaponized controversy, and hate speech, are growing rapidly and now far beyond the capacity of any individual organization to make a dent in," said Shiri Dori-Hacohen, co-PI and director of the Reducing Information Ecosystem Threats (RIET) Lab at the University of Connecticut.

Keen Sung, co-PI and vice president of research and development at AuCoDe, added, "We are excited and honored to confront the pernicious issue of rising misinformation and controversy in social communities, both online and off. The strong, complementary skill sets of the FACT CHAMP team allow us to work closely with leaders within these communities. That familiarity will help us to reduce the harm that misinformation causes to specific communities and simultaneously promote empathy."

About the FACT CHAMP team

Meedan is a technology not-for-profit that builds software and initiatives to strengthen journalism, digital literacy, and accessibility of information for global communities. The mission of Meedan’s research team is to increase and deepen our understanding of how information divides affect people differently and to develop the tools and methods to widen access. We envision a world in which all people, regardless of their languages, locations, or other factors, have the ability to effectively locate the most pertinent information, evaluate its quality and credibility, and make the decisions they want.

The Annenberg Public Policy Center (APPC) of the University of Pennsylvania is a nonpartisan, nonprofit communication policy center focusing on research into political, science, and health communication. Its project FactCheck.org was founded in 2003 by APPC Director Kathleen Hall Jamieson and Brooks Jackson, a veteran investigative reporter who pioneered on-air fact-checking at CNN.

The University of Massachusetts Amherst team includes scholars from Communication and Computer Science with expertise in global disinformation cultures, race and racism, and community-engaged research. Prof. Jonathan Corpus Ong has published ethnographic and policy-relevant research on the intersections of disinformation and hate speech in the wake of COVID-19, with a record of public engagement with Asian and Asian American community leaders. Prof. Ethan Zuckerman directs the UMass Initiative for Digital Public Infrastructure and is a leader in global conversations about reimagining the Internet to be more responsive to communities’ needs, values, and interests. Their bottom-up research approach will inform the digital tools developed by computational social science researchers Prof. Brendan O’Connor (CS) and Prof. Weiai (Wayne) Xu (Communication).

The newly launched Reducing Information Ecosystem Threats (RIET) Lab will lead the University of Connecticut’s efforts from the Computer Science and Engineering (CSE) Department. Prof. Shiri Dori-Hacohen, director of the RIET Lab, is an expert in online information ecosystem threats, including controversy and its impact on mis- and disinformation; her research integrates insights from social science into novel computational models. She is joined by Prof. Amir Herzberg, the Comcast Endowed Professor for Security Innovation, whose research includes novel community detection approaches for identifying malicious actors, as well as work on privacy, anonymity, and cryptographic protocols.

The School of Communication and Information at Rutgers University-New Brunswick brings expertise in data science, library science, and communication, positioning it to tackle key interdisciplinary problems and develop innovative solutions. Kiran Garimella, assistant professor of library and information science and a computer scientist working in computational social science, builds interdisciplinary approaches that bridge computer science and the social sciences to address societal issues such as political polarization, misinformation, and migrant assimilation. His research on misinformation and hate speech on WhatsApp is one of the few bodies of work to study encrypted platforms at scale.

AuCoDe is an AI-based startup that detects controversies and misinformation online and turns them into actionable intelligence. It has received NSF I-Corps and NSF SBIR Phase I and II grants for online controversy detection. Dr. Keen Sung, the company’s vice president of research and development, is an expert in privacy, cybersecurity, and disinformation.

About the NSF Convergence Accelerator

Research is often driven by a compelling societal or scientific challenge; however, it may take the research community years to develop a solution. To deliver tangible solutions with nationwide societal impact at a faster pace, the National Science Foundation (NSF) launched the Convergence Accelerator program in 2019. Designed to transition basic research and discovery into practice, the Convergence Accelerator applies innovation processes such as human-centered design, user discovery, and team science, along with the integration of multidisciplinary research and partnerships, to make timely investments in solving high-risk societal challenges through use-inspired convergence research.
