When the Machine Meets the Survivor: Reclaiming Humanity in the Age of AI
Artificial Intelligence (AI) is beginning to shape how we respond to gender-based violence (GBV). It can collect data, connect survivors to information, and offer round-the-clock chat support. For some, that can be a lifeline, especially when fear, distance, or stigma makes it hard to reach out in person.
But while technology can help, it cannot heal. GBV is a deeply human issue. It demands empathy, understanding, and genuine connection: things no algorithm or machine can ever truly give.
AI-powered tools can make help easier to find. They can guide someone to counselling services, list emergency contacts, or explain what to do after an assault. In rural areas, or after hours when crisis centres are closed, that can make all the difference. But there’s a catch: AI relies on information from the internet, and much of it is outdated or wrong. Some organisations have shut down; others have moved or changed numbers. For a survivor in crisis, that gap isn’t small; it can be devastating. The information these tools provide must be verified, kept up to date, and backed by real people who care.
AI can also help organisations make sense of vast amounts of information, from police reports to social media posts, to spot patterns and identify high-risk areas. That insight can guide prevention and response. Yet while AI can process data in seconds, it cannot judge accuracy or understand emotion. Numbers and patterns are not people and stories. Human oversight keeps the data meaningful and its use compassionate.
Then there’s the issue of fairness and privacy. AI learns from human data, and with it, human bias. Systems can reproduce sexism, racism, xenophobia, or class prejudice. In GBV cases, that bias can mean survivors are dismissed or disbelieved. And when someone shares their story online, that story must be protected; mishandled data can put lives at risk. Survivors’ stories are not statistics; they are acts of courage that demand dignity, safety, and consent.
Healing from trauma takes more than information. It takes empathy. It takes trust. It takes being believed. No chatbot can offer the warmth of a counsellor, the reassurance of a kind voice, or the comfort of human presence. Technology can assist healing, but it can never be the healing. The best GBV responses blend innovation with humanity: let machines manage the data and let people hold the hearts.
If AI is to have a place in this work, it must be designed with care and used with conscience. It should amplify survivors’ voices, not replace them; simplify access, not complicate it. Above all, it must remain survivor-centred: technology in service of people, never the other way around.
AI can help us see more, but it cannot feel more. It can process the “what,” but not understand the “why.” And in GBV work, the “why” is everything. Behind every data point is a life forever changed, someone who deserves not just efficiency, but empathy.
The challenge is balance: to use technology not as a substitute for care but as an extension of it; to let AI handle what it does best (sorting, connecting, detecting) while keeping human beings at the centre of listening, decision-making, and healing.
Because in the end, our progress won’t be measured by how smart our machines become, but by how wisely, ethically, and compassionately we choose to use them.
As Graça Machel so powerfully reminds us:
“There are precious lives behind these cold numbers; these are the beautiful faces, brilliant minds, and vibrant voices of our daughters, nieces, sisters, whose childhood and innocence we have left unprotected.”
