Social media companies to be 'named and shamed' for not protecting women and girls online, Ofcom chief tells LBC
By Kit Heren
The head of Ofcom has told LBC that the regulator will "name and shame" social media companies that fail to adequately protect women and girls in the UK.
Melanie Dawes, who runs the communications regulator, told LBC's Nick Ferrari at Breakfast that women face "a real culture of misogyny, harassment [and] intimate image abuse".
She was describing Ofcom's plans to "shine a light" on tech companies that fail to be proactive in making their sites safer and easier to use, especially for women and girls.
The regulator is consulting on new guidance on measures tech firms should use to help better protect women and girls on their sites.
These will include the better use of technology to prevent intimate image abuse, as well as a request for sites to consider introducing tools which would help spot and fix ways they can be exploited by abusers.
Dawes said the problem was even more widespread than some might think.
"When I talk to women, you know, what's so striking is this isn't a niche issue, this isn't about just vulnerable women, it's actually about ordinary women going about their daily lives online," she said.
She claimed that women are "five times more likely than men to have intimate image abuse against them, much more likely to have harassments or pile-ons just for expressing an opinion.
"So it's fundamentally about people's freedom of speech not actually being a real thing online, if you're a woman in the UK, in the same way as it is for men."
The guidance joins the legally binding codes of practice on illegal content and protecting children online that Ofcom has already put in place. These begin to take effect next month under the Online Safety Act and carry large financial penalties if breached.
Dawes said it was unclear to what extent the tech firms themselves would take action when the Online Safety Act comes in, and how much Ofcom would need to enforce.
"It's a six million dollar question how much action is going to come and how much are we going to need to drive through with enforcement action," she said.
"We're ready for that. But this is a big cultural change for the industry. There's a lot to fix. So all eyes now on the tech industry as far as I'm concerned."
Dawes said in an earlier statement that the new guidance was a "call to action for online services".
"There's not only a moral imperative for tech firms to protect the interests of female users, but it also makes sound commercial sense - fostering greater trust and engagement with a significant proportion of their customer base," she said.
Dawes' colleague Jessica Smith, who led the development of the guidance, said the regulator was also prepared to use its powers under the Online Safety Act to highlight, through new online safety transparency reports, platforms not doing enough to protect women and girls on their sites.
"Effectively, what we are going to do is use our information-gathering and transparency powers," she told the PA news agency.
"So one of the things we're committing to do, once the guidance is finalised and a sufficient period of time has passed, is to publish a transparency report that shines a light on what platforms are doing and not doing to keep their users safe.
"It's about putting information out there, so users can be informed and make a choice about where they spend their time online."
She added: "What we're saying to platforms today is that you have a commercial choice.
"We know that women spend longer online than men, for example, on a daily basis, and so we think it makes good commercial sense to take their safety seriously.
"For some platforms, they may not choose to do that, and that is their decision. But as I said, then we will make sure people know what kind of space they are entering into when they go on that kind of platform."
Under the Online Safety Act, platforms will be legally required to follow a new set of duties around protecting users from harm online, with fines of up to 10% of global turnover for those who fail to do so - which could run into billions of pounds for the largest services.
New regulation of the online world has been broadly welcomed, but some charities and campaigners have warned that the current plans have taken too long to implement, and do not go far enough to protect users from harm.
In response, Ms Smith said: "We're at the fairly early days when it comes to implementation of the Act.
"I think we're still at the stage of testing and implementing the powers that we have now, and I know that this has taken a while, so I really understand people's frustration.
"I think when it comes to this particular guidance, we are balancing quite difficult issues. There are issues around free expression."
As well as women and girls, Ms Smith said it was vital the regulator also heard from men and boys on the issues raised.
"Obviously, this guidance is focused on women and girls, and we've spoken to a lot of survivors and women's advocacy organisations as part of the process, but it's also for men and boys - these tools can be for everybody," she said.
"We know boys are more exposed to online misogyny than girls are, so we think that this will have broader benefits and we're keen to speak to men and boys as part of our consultation process."