The collaboration is part of Meedan’s Check Global program, a network of journalists, human rights investigators and fact-checkers working to improve information access and equity in key global regions.
Meedan partner VFRAME develops computer vision technology that allows human rights researchers and investigative journalists to scrutinize objects in war zones. Together with human rights and war crimes investigators at Mnemonic, the collaborators are achieving breakthroughs in conflict zone investigative techniques.
In one project the partners were able to detect evidence of cluster munitions in around 1,000 videos. Cluster munitions are air-dropped bombs that release smaller bomblets and pose risks to civilians both during attacks and afterwards. They are prohibited under international humanitarian law.
Now, with about 3.5 million videos processed by VFRAME's tools, the collaboration is adding offline, on-the-ground evidence gathering to complement the work of its artificial intelligence models.
“Moving forward it’s a new possibility that many of the conflict zone objects can be directly 3D scanned for full accuracy and resolution. This has the ultimate impact in creating more accurate object detection algorithms for analyzing large archives of visual media from partners,” said Adam Harvey, founder of VFRAME.
“Meedan’s Check Global program collaborates with a network of journalists and human rights researchers to build capacity for their projects and achieve impact through collaboration. We’re thrilled to see how these partners have been able to work together to solve critical issues in crisis settings,” said Dima Saber, director of Meedan’s Check Global program.
In June 2022, VFRAME's work was presented at the United Nations Headquarters in New York City during the Biennial Meeting on Small Arms and Light Weapons. The session focused on new technologies for arms control and how they can be leveraged for real world applications in conflict zone monitoring and arms control.
The work offers a model for region-wide collaborations between archivists, journalists, and technologists to collect and annotate data and make sense of it using new technologies. VFRAME is also leveraging recent developments in AI and computer vision to improve the capabilities of small, specialized research groups, allowing them to seamlessly analyze large quantities of media collected from conflict zones.
VFRAME's image processing software is open-source (MIT licensed) and available at github.com/vframeio.
Videos via Mnemonic.org with RBK-250 detections by VFRAME.io
- Online conversations are heavily influenced by news coverage, such as coverage of the 2022 Supreme Court decision on abortion. The relationship between big breaking news and specific increases in online misinformation is less clear.
- The tweets analyzed were a random sample coded as “misinformation” or “not misinformation” by two qualitative coders trained in public health and internet studies.
- This method used Twitter’s historical search API.
- The peak was a significant outlier compared to the days before it, as measured by Grubbs' test for outliers, for Chemical Abortion (p<0.2 for the decision; p<0.003 for the leak) and Herbal Abortion (p<0.001 for both the decision and the leak).
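To illustrate the outlier test mentioned above, here is a minimal sketch of the Grubbs statistic computed over hypothetical daily tweet counts (the counts and the threshold shown are illustrative assumptions, not the study's data; a full Grubbs test would compare the statistic against a t-distribution-based critical value):

```python
import statistics

def grubbs_statistic(values):
    """One-sided Grubbs statistic for the maximum value:
    G = (max - mean) / s, where s is the sample standard deviation.
    Larger G means the maximum is further from the rest of the data."""
    mean = statistics.mean(values)
    s = statistics.stdev(values)
    return (max(values) - mean) / s

# Hypothetical daily tweet counts: a quiet baseline week,
# then a spike on the day of a major news event.
baseline = [12, 15, 11, 14, 13, 16, 12]
with_spike = baseline + [94]

print(round(grubbs_statistic(with_spike), 2))
print(round(grubbs_statistic(baseline + [17]), 2))
```

In practice the statistic is compared against a critical value derived from the t-distribution (as in the study's significance tests); the sketch only shows how a news-day spike dominates the statistic relative to a normal day.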
- All our searches were case-insensitive and could match substrings; so “revers” matches “reverse”, “reversal”, etc.
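The matching behavior described above can be sketched in a few lines (the keyword list is a hypothetical example; the study itself used Twitter's search API rather than local matching):

```python
# Hypothetical keyword stems; "revers" matches "reverse",
# "reversal", "reversed", and so on.
KEYWORDS = ["revers", "herbal abortion"]

def matches(text, keywords=KEYWORDS):
    """Case-insensitive substring match: lowercase the text,
    then check whether any keyword appears anywhere in it."""
    lowered = text.lower()
    return any(k in lowered for k in keywords)

print(matches("Abortion pill REVERSAL claims"))  # True
print(matches("Unrelated tweet text"))           # False
```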