Ofcom calls on tech firms to step up action against ‘revenge porn’


Tech platforms should combat the rapid rise in explicit deepfakes and “revenge porn” by using a database of images to protect women and girls online, according to new guidance being drawn up by the UK communications watchdog.

The move is part of a raft of measures proposed by Ofcom to tackle online misogyny, harassment and the sharing of intimate images without the subject’s consent – often referred to as “revenge porn”.

The regulator is recommending the use of so-called hash-matching technology, which allows platforms to automatically detect and take down an image that has been the subject of a complaint.

Under the system, an image or video flagged by a user is cross-referenced against a database of illicit images converted into a “hash”, or digital fingerprint. This allows harmful images to be detected and removed from circulation.

Examples include StopNCII.org, which allows survivors and victims to generate hashes from non-consensual intimate images, details of which are then shared across participating platforms.
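The mechanism described above can be illustrated with a minimal sketch. The class and function names here are hypothetical, and for simplicity it uses an exact cryptographic hash; production systems such as StopNCII.org typically use perceptual hashes, which also match resized or re-encoded copies of an image.

```python
import hashlib

def compute_fingerprint(image_bytes: bytes) -> str:
    # Hypothetical exact-match fingerprint. Real hash-matching systems
    # use perceptual hashing so that edited copies still match.
    return hashlib.sha256(image_bytes).hexdigest()

class HashDatabase:
    """Illustrative shared store of fingerprints of reported images."""

    def __init__(self):
        self._hashes = set()

    def report(self, image_bytes: bytes) -> str:
        # A survivor reports an image once; its fingerprint is stored
        # and shared across participating platforms.
        fingerprint = compute_fingerprint(image_bytes)
        self._hashes.add(fingerprint)
        return fingerprint

    def is_blocked(self, image_bytes: bytes) -> bool:
        # On every upload, the platform checks the fingerprint against
        # the database rather than waiting for a fresh complaint.
        return compute_fingerprint(image_bytes) in self._hashes

db = HashDatabase()
db.report(b"reported-image-bytes")
print(db.is_blocked(b"reported-image-bytes"))  # True: blocked on re-upload
print(db.is_blocked(b"some-other-image"))      # False: unreported content passes
```

Note that only the fingerprint, not the image itself, needs to be shared between platforms, which is why this approach scales to a report-once, block-everywhere model.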

Ofcom research found that the proliferation of generative AI tools, which can create highly realistic images and audio from simple hand-typed prompts, had led to more deepfake intimate image abuse being posted online in 2023 than in all previous years combined.

Ofcom’s Jessica Smith, who led the development of the new guidance, said: “For intimate image abuse, we are saying that tech firms should sign up to a technology called hash-matching, which is basically a database of images which enables any image to be identified at scale wherever it is shared on a platform.

“It is really innovative technology. What that means is it does not have to be reported every time it is uploaded. It means it is reported once and wherever it exists it is identified.”

The draft guidance goes beyond what is required under the Online Safety Act 2023, covering not just illegal harms but also "legal" harms that fall below the criminal threshold – many misogynistic posts are not technically illegal.

It sets out how service providers – including dating apps, social media, gaming, pornography sites and search services – can address content and activity that disproportionately affects women and girls in recognition of the specific risks they face.

Ofcom also said platforms should consider introducing “nudges” to ask if someone is sure they wish to post a potentially harmful comment – a feature that has been shown to lead to significant behaviour change.

Ofcom also recommends that platforms bundle default settings so that women and girls can easily choose the most private profile without having to work through every setting to find those that protect them most.

In its consultation paper, Ofcom highlights that:

  • Women are five times more likely than men to be victims of intimate image abuse.

  • Nearly 70% of boys aged 11-14 have been exposed to online content that promotes misogyny and other harmful views.

  • Almost a quarter of teenage girls (23%) regularly see content that objectifies or demeans women.

  • Online domestic abuse is under-reported – half of survivors (49%) told no one about it.

  • Nearly three-quarters of respondents in a survey (73%) had experienced online threats and abuse.

A key message behind the Online Safety Act is safety by design, which means platforms are expected to build in safety measures from the design stage. Ofcom says it will use its transparency powers to highlight which platforms are compliant and which are not, to inform users.

The draft guidance now goes out for feedback, with final guidance to be published later this year.

Melanie Dawes, Ofcom’s chief executive, said: “Our practical guidance is a call to action for online services, setting a new and ambitious standard for women and girls’ online safety.

“There’s not only a moral imperative for tech firms to protect the interests of female users but it also makes sound commercial sense, fostering greater trust and engagement with a significant proportion of their customer base.”
