NSF Grant Could Lead to Better Computational Tools for Human Fact Checkers

The spread of misinformation on social media platforms has presented immediate challenges with potentially dire consequences, particularly around topics like public health and election integrity.

A $750,000 grant from the National Science Foundation could help give human fact checkers the computational tools needed to push back accurately and efficiently against this growing tide.

As part of a larger $21 million investment made by the NSF, Georgia Tech researchers and collaborators at the University of Wisconsin and Washington State University will examine how misinformation spreads, how it is fact-checked, the challenges that prevent accurate and efficient correction, and the messaging within those corrections that may affect public trust. The team will comprise computer scientists, journalists, mass communications experts, psychologists, and political scientists.

“Right now, fact checkers look at trending articles, do a deep dive, and spend hours to determine whether or not it is misinformation,” said School of Computational Science and Engineering Assistant Professor Srijan Kumar, a co-principal investigator on the project. “Within this are a number of biases: popularity biases in that only popular articles are examined, fact checking is slow, and it is language and region restricted. That’s where we want to bring a computational approach to improve the process.”

The Georgia Tech team, which also includes co-PI Munmun De Choudhury of the School of Interactive Computing, will focus on the computational element of the project by developing machine learning- and artificial intelligence-based detectors. They will create algorithms that look not just at a piece of information but also at patterns such as who is sharing it, how it spreads through the network, and other attributes that can be fused to create more efficient detection models.
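To make the idea of "fusing" signals concrete, the following is a minimal, illustrative sketch and not the project's actual system: it combines simple text features with hypothetical propagation statistics (number of sharers, average account age) in a single classifier. All feature names, data, and the model choice are assumptions made for illustration.

```python
# Illustrative sketch only: fusing content features with propagation features
# for misinformation detection. Data, feature names, and model are hypothetical.
import numpy as np
from scipy.sparse import csr_matrix, hstack
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

# Toy labeled examples: article text plus simple propagation statistics
texts = [
    "miracle cure stops the virus overnight, doctors shocked",
    "health agency releases updated vaccination guidance",
]
# Hypothetical propagation features: [number of sharers, average sharer account age in days]
propagation = np.array([[5000.0, 12.0], [300.0, 900.0]])
labels = [1, 0]  # 1 = misinformation, 0 = reliable (toy labels)

# Content features derived from the text itself
vectorizer = TfidfVectorizer()
text_features = vectorizer.fit_transform(texts)

# Fuse content and propagation signals into one feature matrix
fused = hstack([text_features, csr_matrix(propagation)])

# A simple classifier trained over the fused features
model = LogisticRegression().fit(fused, labels)
print(model.predict(fused))
```

In practice, a system of this kind would draw on far richer network and account-level signals than this toy example, but the basic pattern of concatenating heterogeneous features before classification is the same.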

“We want to empower fact checkers to be able to do their jobs more efficiently,” De Choudhury said. “How can they detect misinformation, provide corrections, and do so in a way that encourages public trust? That involves people of many different fields as you navigate complex ecosystems of values, ideologies, and beliefs.”

The project will look at misinformation in two of the most pressing areas: public health, including COVID-19 and vaccination misinformation, and election integrity. The researchers say the findings and corresponding solutions could be applied to other areas of misinformation in the future.

The project is titled "How Large-Scale Identification and Intervention Can Empower Professional Fact-Checkers to Improve Democracy and Public Health." It grows out of a previous NSF Rapid Response Research (RAPID) grant, "Tackling the Psychological Impact of the COVID-19 Crisis," which was also led by De Choudhury and Kumar.

Related Media

  • NSF project Co-PIs

For More Information Contact

David Mitchell
Communications Officer