The dark world of deepfake pornography has been brought to light by a Channel 4 News investigation, which found that more than 250 British celebrities have fallen victim to the disturbing trend. In total, almost 4,000 famous individuals, including female actors, TV stars, musicians, and YouTubers, have had their faces superimposed onto pornographic material using artificial intelligence.
The five most visited deepfake websites received a staggering 100 million views in just three months, highlighting the widespread reach of this harmful content. Among the victims is Channel 4 News presenter Cathy Newman, who expressed feeling violated by the creation of fake pornographic videos featuring her likeness.
In response to the proliferation of deepfake pornography, a provision of the UK's Online Safety Act came into force on 31 January, making it illegal to share such imagery without consent. However, creating the content itself remains legal, leaving many vulnerable to exploitation.
One victim, Sophie Parrish, bravely shared her experience of discovering fabricated nude images of herself online, describing the content as violent and degrading. A consultation is currently under way on how the Online Safety Act will be enforced, with communications regulator Ofcom playing a key role in ensuring compliance.
Companies such as Google and Meta, which owns Facebook and Instagram, have pledged to strengthen their protections against deepfake pornography. Google is working on tools to help individuals protect themselves and remove harmful content from search results, while Meta has taken steps to remove ads promoting deepfake creation apps from its platforms.
The disturbing reality of deepfake pornography serves as a stark reminder of the dangers posed by AI technology in the wrong hands. As the government and tech companies work to combat this harmful content, it is crucial to prioritize the protection of individuals from exploitation and violation.