
Ofcom to name and shame tech firms over women’s abuse online
BBC Radio 5 Live

Looking through the comments on her social media posts can be an emotional rollercoaster for Miah Carter.
The 21-year-old influencer posts makeup, body positivity and lip-sync content for her 3.3m followers on TikTok, but says her success online comes with constant abuse.
Speaking to Radio 5 Live, Miah says she receives abusive comments “every second, every day… the trolling I get is disgusting”.
Miah is sharing her experience as Ofcom releases new draft guidance aiming to improve the experience of women and girls online.
Messages left under Miah’s posts include comments encouraging her to take her own life, and personal attacks on her appearance.
“When I first started social media, my following came really quickly,” she said. “With that, the hate (comments) came flooding in.
“Back then, I didn’t understand it. I didn’t know how to deal with my emotions. It really affected my mental health and I had suicidal thoughts.
“Now I’ve learned to ignore the comments and if I can be bothered, I delete them.”
Ofcom’s chief executive, Dame Melanie Dawes, has said the draft guidelines issued on Tuesday will, if tech firms adopt them, be a “proper blueprint” for protecting women and girls online.
The broadcasting watchdog’s draft guidance ranges from measures to tackle online misogyny and domestic abuse, to pile-ons and intimate image abuse.
Ofcom has previously issued guidance to tech firms around protections for children online and on dealing with illegal online content.
Speaking to Radio 5 Live, Dame Melanie said the organisation would “absolutely” name and shame companies that did not comply with its guidance, so the public would know which companies were “not taking (users’ safety) seriously”.
The regulator wants sites and apps to adopt these measures voluntarily in what they call “a safety by design approach”. For example, they could adopt “abusability” testing to identify how a service or feature could be exploited by a malicious user.
Rules in the Online Safety Act, due to come into force this year, will compel social media firms to show that they are removing illegal content – such as child sexual abuse, material inciting violence and posts promoting or facilitating suicide. The law also says companies have to protect children from harmful material including pornography.
Content creator Harriet Maynard has also experienced abusive comments, which sometimes escalate into pile-ons – where a large number of people harass a person online.
Her Instagram posts are aimed at a female audience and relate to issues around parenthood and lifestyle content.
Despite having mainly female followers, when a video of Harriet’s goes viral, she says she receives “an influx of negative messages, primarily from men”.
“I normally don’t let it bother me, but when you get a wave of online abuse, it can get you down.
“In a ‘normal’ workplace, if you were being bullied or harassed, then there’d be an HR department to deal with it accordingly. But for us making content online for a living, there’s nothing like that.”
‘Tech platforms do the absolute minimum’
Nicole Jacobs, domestic abuse commissioner for England and Wales, welcomes the draft guidance.
“I’m pleased that Ofcom are stepping up to start the process of providing guidance to tech companies to tackle this,” she said. “It’s now on these firms to implement these recommendations and ensure that perpetrators can no longer weaponise online platforms for harm.
“By taking meaningful practical action, not only will people be safer online, but it will demonstrate that tech companies are ready to play their part in tackling domestic abuse.”
Prof Clare McGlynn, an expert in sexual violence, online abuse and the legal regulation of pornography, says she feels the guidance, which has no legal force, will struggle to make meaningful change.
“Experience shows that tech platforms do the absolute minimum necessary to comply with the law and little more. In the current climate, this is unlikely to change,” she said.
“We urgently need to do more by strengthening regulation, making many of these recommendations legally binding. A dedicated Online Safety Commission which prioritises online harms would be a positive next step.”
Some of the suggestions to tech companies from Ofcom include:
- “Abusability” testing to identify how a service or feature could be exploited.
- User prompts asking people to reconsider before posting harmful or abusive material.
- Simpler account controls, such as bundling default settings to make it easier for women experiencing pile-ons to protect their accounts.
- Removing geolocation by default.
- Training moderation teams to deal with online domestic abuse.
While a report from Ofcom shows women are five times more likely to suffer intimate image abuse and are more likely to report being negatively affected by harms experienced online than men, Dame Melanie said the guidance was not about “women versus men or demonising men”.
“I think many men are really concerned about this as well. And that wider culture that’s going on online, I just don’t think it’s healthy for anybody.
“The misogyny that’s becoming normalised in some parts of the internet, that’s not great for boys. It’s not going to help them to form proper, strong, healthy relationships as they grow up. So I really hope men will get involved in this too.”
‘I’d like to see tech companies do more’
Harriet believes some of Ofcom’s suggestions would be ineffectual, such as on-screen pop-ups asking users to consider whether they really want to post abusive comments.
“I don’t think these kind of people worry if they’re going to offend someone by doing what they’re doing. They hide behind their keypads. Complete cowards,” she says.
However, she would welcome more protections for women experiencing pile-ons if they were built into social media platforms, adding users should be able to protect themselves from “pure abuse”.
Miah feels the guidance may make some difference, if companies choose to follow it.
“I’d like to see tech companies do more,” she says. “(Ofcom has) a huge challenge, but real change is possible if they hold platforms accountable.
“Right now, reporting hate often leads nowhere – there needs to be stricter enforcement and actual consequences for harmful behaviour.”
In a statement given to the BBC, Meta, which owns Instagram and Facebook, said: “We remove any language that incites or facilitates serious violence, disable accounts, and work with law enforcement when we believe that there is a genuine risk of physical harm or direct threats to public safety.
“We continue to work with women’s safety groups to understand the different ways harassment towards women can show up, while improving our technology to find and remove abuse more quickly.”
The BBC has also contacted other social media companies, including TikTok and X, for comment.
If you have been affected by issues raised in this article, help and support can be found on the BBC action line website.
Radio 5 Live’s Nicky Campbell will be joined by a panel of experts and a studio audience to discuss women and girls’ safety online, in public spaces and at home. Listen on BBC Sounds from 09:00 GMT.