On Tuesday, Meta announced the launch of its open-source tool, HMA, which helps platforms identify copies of images or videos promoting terrorism. It also announced that, starting in January 2023, it will collaborate with GIFCT (Global Internet Forum to Counter Terrorism), an NGO that brings organizations together to detect terrorist content online through research, technical collaboration, and knowledge sharing.
The new tool, called Hasher-Matcher-Actioner (HMA), can be used by companies to stop the spread of terrorist content on their platforms. HMA builds on Meta's earlier open-source image- and video-matching software and can be applied to violating content.
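The name Hasher-Matcher-Actioner describes the pipeline itself: hash incoming media, match the hash against a shared list of known violating content, then act on matches. The sketch below is a toy illustration of that flow, not Meta's implementation; real deployments use robust perceptual hashes such as PDQ, while this example uses a simple average hash over an 8x8 grayscale grid so it runs with no dependencies.

```python
# Toy sketch of a hasher -> matcher -> actioner pipeline.
# Assumption: images are 8x8 grayscale pixel grids (lists of lists of ints).
# Real systems like HMA use perceptual hashes (e.g. PDQ) on full images.

def average_hash(pixels):
    """Hasher: 1 bit per pixel, set if the pixel is above the mean."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming_distance(a, b):
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

def match_and_action(image_pixels, banned_hashes, threshold=5):
    """Matcher: compare against a shared banned-hash list.
    Actioner: return the enforcement decision (here, a simple label)."""
    h = average_hash(image_pixels)
    for banned in banned_hashes:
        if hamming_distance(h, banned) <= threshold:
            return "remove"  # near-duplicate of known violating content
    return "allow"

# Usage: a known violating image and a slightly altered copy of it.
original = [[(r * 8 + c) % 256 for c in range(8)] for r in range(8)]
altered = [row[:] for row in original]
altered[0][0] += 3  # a small edit should not defeat a perceptual hash

banned = {average_hash(original)}
print(match_and_action(altered, banned))    # near-duplicate -> "remove"
print(match_and_action([[0] * 8] * 8, banned))  # unrelated -> "allow"
```

Because matching is done on hashes rather than the media itself, platforms can share lists of known violating content (as GIFCT members do) without sharing the underlying images or videos.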
Meta is one of the founding members of GIFCT, established in 2017. GIFCT brings together companies, technical experts, and government and civil society organizations to address terrorism online.
The launch of the HMA tool and Meta's collaboration with GIFCT are part of Meta's commitment to detecting terrorist content and protecting users from harmful content online. Meta has been using AI to remove harmful online content generated at scale. It also blocks millions of fake accounts daily to curb the spread of misleading information. Since 2017, Meta has taken down 150 networks of malicious accounts worldwide.