Misinformation about COVID-19 has caused severe harm in multiple instances: in one example, a rumor that drinking methanol would cure the virus resulted in hundreds of deaths. While end-to-end encryption is an important privacy safeguard, it prevents platforms such as WhatsApp and Signal from applying centralized interventions and misinformation warnings. Several options offer hope, however, from user-interface changes to tip lines to moving more intelligence onto client devices.
In this presentation, Dr Scott A. Hale will discuss how text similarity algorithms help fact-checkers locate misinformation, cluster similar claims, and identify existing fact-checks in the context of tip lines on platforms with end-to-end encryption. The presentation will detail research at the Oxford Internet Institute and Meedan, a global technology not-for-profit developing open-source tools for fact-checking and translation; this research is actively used by fact-checkers to improve the information available online.
The session will be a 30-minute presentation followed by 30 minutes of questions. Dr Chico Camargo, Postdoctoral Researcher in Data Science at the Oxford Internet Institute, will moderate the session.
Dr Scott A. Hale is a Senior Research Fellow at the Oxford Internet Institute, University of Oxford; Director of Research at Meedan; and a Fellow at the Alan Turing Institute. His cross-disciplinary research develops and applies new techniques in the computational sciences to social science questions and puts the results into practice with industry and policy partners. He is particularly interested in mobilization/collective action, agenda setting, and antisocial behaviour (e.g., hate speech), and has a strong track record in building tools and teaching programmes that enable wider access to new methods and forms of data.