
Registration now open - DIVERSIFAIR Civil Society Workshop

As part of the DIVERSIFAIR Erasmus+ project, we are delighted to invite civil society organisations, advocacy groups, and professionals working with communities affected by AI systems to our CSO Workshop: “Advocating against algorithmic discrimination: tools and strategies for civil society”.

Join the DIVERSIFAIR CSO Workshop on 24 November 2025 for an interactive session designed to help civil society organisations tackle algorithmic discrimination. The workshop will explore how AI systems can affect marginalised communities, how to understand these impacts using an intersectional lens, and how to translate knowledge into effective advocacy strategies. 

 

WORKSHOP DETAILS 

📅 Date: Monday 24 November 2025
🕒 Time: 3:00 – 5:00 pm CET
📍 Format: Online, interactive workshop with presentations, breakout sessions, and group reflections

Register now on Eventbrite

⚠️ Limited number of spaces available.

 

HOSTS 

Gemma Galdon-Clavell – Founder & CEO, Eticas.ai (Spain)
Gemma is a pioneer in AI auditing and safety, helping organisations identify and correct bias in predictive and large language models. She leads Eticas.ai, a venture-backed organisation that ensures AI tools serve society responsibly. Gemma advises international institutions including the UN, OECD, EIT, and the European Commission, and is a frequent speaker at high-level fora such as the US-EU Trade and Technology Council, re:publica, and TEDx. Her work has been featured in Wired, Forbes, Business Insider, and Computer Weekly. 

Steven Vethman – Researcher, Sciences Po (France)
Steven contributes to the European DIVERSIFAIR project and researches AI technologies, their societal impact, and governance. His work focuses on understanding how AI systems affect communities, how power structures influence AI deployment, and how civil society can advocate for more equitable and accountable AI practices. 

Raphaële Xenidis – Assistant Professor in European Law, Sciences Po (France)
Raphaële specialises in European discrimination and equality law, with a focus on intersectionality and algorithmic discrimination. Her research examines bias in automated decision-making systems and explores how legal frameworks can address inequalities created or amplified by AI. She has conducted projects on algorithmic fairness, data-driven discrimination, and the broader social implications of AI governance.

AIM 

To support civil society organisations in developing a critical and practical understanding of AI fairness, grounded in lived experience and power structures, enabling them to advocate effectively and take meaningful action. 

WHO SHOULD ATTEND 

This workshop is designed for civil society organisations, advocacy groups, and professionals working with communities affected by AI systems. 

 

WHY ATTEND

  • Understand algorithmic discrimination and its impact on marginalised groups
  • Analyse the problem through an intersectional lens to sharpen advocacy focus
  • Apply legal frameworks to give your actions a solid foundation
  • Learn from successful advocacy strategies and campaigns
  • Explore how AI auditing and participatory research can drive systemic change

THIS WORKSHOP WILL HELP YOU

  • Identify the invisible harms of AI systems on marginalised groups
  • Work through a practical case study to analyse these harms and explore solutions
  • Connect AI fairness to broader campaigns for justice, rights, and equity

AGENDA (2h)

  • Welcome & Introduction
  • Algorithmic Discrimination: Why It Matters – Examples and Discussion
  • Legal Framework and Intersectional Discrimination
  • Learning from Success: Advocacy in Action
  • Impactful Strategy: How AI Auditing is Standing Up for People
  • Wrap-up & Reflections 

Follow DIVERSIFAIR on LinkedIn @DIVERSIFAIR Project to stay updated and join the movement!

#FairerAI



Stay in the Loop!

Don’t miss out on the latest updates: subscribe to our newsletter and be the first to know what’s happening.

This project has received funding from the European Education and Culture Executive Agency (EACEA) in the framework of Erasmus+, EU Solidarity Corps A.2 – Skills and Innovation, under grant agreement 101107969.

Funded by the European Union. Views and opinions expressed are however those of the author(s) only and do not necessarily reflect those of the European Union or the European Education and Culture Executive Agency (EACEA). Neither the European Union nor the granting authority can be held responsible for them.