University researchers just launched a new set of tools to help protect young women and girls from violence and abuse on the dark web.
The project comes from Aston University in Birmingham, following the largest study ever done on online harms that women and girls face.
The tools include an emoji decoder, and they’re part of a bigger push to arm users (and parents, too) with the confidence to fight back. The team behind it says online problems don’t stay online; they spill into people’s real lives.
What the numbers from the study actually show
The CyberDIVA project started after researchers dug into how technology enables abuse. The findings were shocking: almost 70,000 cases of tech-driven abuse between 2021 and 2024, plus more than 50,000 official harassment reports.
Men and boys face abuse online, no doubt. But the study found that women aged 16 to 34 took the hardest hit. So did a significant number of girls under 16. That data set the direction for everything CyberDIVA set out to do.
Dr. Anitha Chinnaswamy leads the project. She explained that online harm doesn’t stay behind a screen. “It pours over into your physical world, into emotional and sentimental spaces,” she said. “Into how you look at yourself and how you are in society.”
Tools designed by listening to victims’ experiences
The team didn’t build the tools from the numbers alone. They listened to women who lived through this. Researcher Nina Jane Patel shared a chilling experience. She was in virtual reality when “three to four male-sounding and male-representing avatars” surrounded her. They started sexually harassing her verbally, then sexually assaulted her avatar.
Harkirat Kaur Assi had a different story, but no less painful. A man catfished and abused her online. According to her, an experience like that makes you slowly start doubting yourself until you become a shadow of who you were.
Both women were pointing to the same thing: people need to be far more aware of what girls and women face online. Patel put it simply: “Be aware it’s not just a game.” Assi added a practical warning about oversharing. “Often grandparents think nothing of sharing details about their grandchildren with strangers,” she said.
Those experiences shaped how the CyberDIVA team built the tools. There’s the emoji decoder, which helps adults understand what messages kids are really receiving, and in particular how humor can hide online harm. The goal? To give parents something concrete to rely on when helping their children navigate the online world.
The project brings in experts from various sectors
CyberDIVA isn’t working alone. The project brought together academics, telecoms companies, and West Midlands Police. That mix matters because tackling dark web abuse requires expertise from all angles.
Alongside the tools, Aston University is launching GRIIT (the Gender Equity Research and Inclusive Innovation Technology Group). It’s a new hub where academics and industry can keep working on these problems together.
Dr. Chinnaswamy said the goal is simple. Arm people to fight back. “We wanted to develop resources that would help not only young women and girls, but parents with confidence to help their child navigate this world.”
The emoji decoder and other tools are accessible via the CyberDIVA initiative. The whole point? To help folks, especially people who have felt powerless in dealing with online abuse, recover and feel like they’re in control again.