Lessons learned on artificial intelligence and countering violent extremism conducive to terrorism

15 July 2025

The proliferation of artificial intelligence (AI), especially in the form of generative models like ChatGPT, poses new challenges and opportunities. The evolving use of AI, particularly by terrorist and violent extremist actors who are often early adopters of emerging technologies, necessitates further examination to better understand trends and anticipate threats. It also provides an opportunity to consider how AI may be used as a positive tool to support efforts to counter terrorism and violent extremism conducive to terrorism.

To increase awareness and understanding of these challenges and opportunities, the Countering Violent Extremism (CVE) Working Group – co-chaired by Australia and Indonesia – and the International Center of Excellence for Countering Extremism & Violent Extremism (Hedayah) convened a virtual event. Bringing together governments, policymakers, and experts, the meeting served as a platform for knowledge sharing and interregional cooperation.

Following preliminary research undertaken earlier this year, Hedayah presented key findings and recommendations from its Research Brief on Artificial Intelligence for Counter Extremism. Through consultation with local stakeholders, experts, and practitioners, the meeting identified promising opportunities and practical, actionable steps for ethical and effective use of AI to counter extremism and violent extremism conducive to terrorism.

Under its Work Plan, the CVE Working Group remains focused on diminishing online recruitment and radicalization in order to prevent and counter terrorism and violent extremism conducive to terrorism. Hedayah will continue to lead engagement in this critical subject area and ultimately develop a product capturing lessons learned and good practices to be shared with GCTF Members.