Helen Mort had never sent intimate photos of herself in her life. But someone, an acquaintance or a stranger, fished harmless pictures of her from social media and uploaded them to a porn platform so that other users could put her face on the bodies of porn actresses. The result, the uploader urged, should be as hardcore as possible. "My ordeal has left me scared, embarrassed, paranoid and crushed," says the British writer, who came across the fake porn of herself last year.
Most people don’t want to appear in porn that they didn’t strip themselves for. Half of humanity can assume that this is unlikely to happen. The other half are women. Women like Helen Mort.
A deepfake is what results when a well-trained artificial intelligence manipulates a video, working through it shot by shot until it genuinely looks as if the head belongs to someone else's body, as if a person is talking and moving even though they know nothing about it. The more convincing the result is supposed to be, the more computing power and the more technical understanding are required. Alternatively, you can simply order a tailor-made video from one of the many deepfakers-for-hire on the internet. And if you don't insist on outsmarting even the last few experts, a cheapfake will do. There are apps for that. Last year, a bot on the chat app Telegram became notorious for automatically creating fake nudes from hundreds of thousands of submitted photos. Cost: $1.25 per image.
The number of fake videos on the internet has been growing exponentially since 2018, doubling roughly every six months, according to an analysis by Sensity, a company whose software detects deepfakes. More than 90 percent of the fake videos are pornography, and almost all of those feature a protagonist who knew nothing about it. Often the victims are women in public life, actresses like Emma Watson or singers like Lena Meyer-Landrut. But it also hits the neighbor, the colleague, the ex-girlfriend. An overambitious mother in the US recently made deepfakes to bully her daughter's rivals on the cheerleading team. In India, investigative journalist Rana Ayyub was attacked with a deepfake porn video that reached millions of cell phones. Ayyub ended up in the hospital with anxiety and felt more inhibited in her work afterwards.
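The scale of that doubling rate is easy to underestimate. A minimal Python sketch makes the compounding explicit (the function and the 2018-to-2021 window are illustrative; only the six-month doubling period comes from Sensity's analysis):

```python
def growth_factor(years: float, doubling_months: float = 6.0) -> float:
    """Multiplicative growth after `years`, given a doubling period in months."""
    # Number of doublings = elapsed months / doubling period.
    return 2 ** (years * 12 / doubling_months)

# Three years at a six-month doubling rate means six doublings,
# i.e. roughly a 64-fold increase in the number of fake videos.
print(growth_factor(3))
```

In other words, if the trend Sensity describes held from 2018 to 2021, the volume of fake videos would have grown by a factor of about 64 in that span alone.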
The videos are a threat to democracy
Virtual violence also hurts. It can silence women, push them out of public life – and often aims to do just that. But the world can no longer afford silent women. We need more women in politics, business, culture and journalism. So we have to protect them better.
Deepfakes are a threat to democracy. Deliberately scattered misinformation on the internet is a powerful weapon, and deepfakes make it even more convincing and more powerful. Just imagine the damage done by a video that shows a pharmaceutical executive handing Angela Merkel a thick stack of 100-euro bills. Or one in which Annalena Baerbock holds hands with Armin Laschet.
People have to learn that, fundamentally, they cannot rely on anything they see. That contradicts a conviction we have been shaped by all along. Seeing is believing, as the English saying goes: I trust what I have seen with my own eyes. It will be a long process before a healthy distrust spreads. And then the question arises of how much mistrust is healthy for society as a whole. After all, pictures document events; they are supposed to show what is true. Anyone who has seen a white police officer pressing his knee on the neck of the Black man George Floyd for almost nine minutes has a harder time denying racism and police violence. It is important for a functioning society that facts can be proven. But what if you can no longer trust your own eyes?
For a long time the answer was: people have to acquire media literacy. They have to learn to distinguish truth from fakery. But even experts have difficulty identifying the particularly well-made fake videos as such. And the technology just keeps getting better. The solution cannot lie with the individual; it has to be systemic. Laws are required. But, as always, regulation lags behind technical progress, and writing good laws is complex. A complete ban would be wrong, because the technology has positive uses. In a film about activists fighting for gay and transgender people in Chechnya, for example, digitally faked faces shielded the vulnerable whistleblowers last year. And deepfakes are a means of satire. Just as an idea: there is nothing against letting the mask-corruption politicians Nüßlein, Sauter and Löbel dance a little to ABBA's "Money, Money, Money".
Technology can only be fought with technology, and with laws that incorporate technology instead of banishing it. Conceivable, for example, would be a mandatory deepfake scanner that searches websites, above all porn and social media platforms, and finds and reports manipulated media. Fake porn must be banned, and fakers who do business with other people's faces and use them for violence must be punished. Poet Helen Mort has started a petition for a deepfake law in the UK. She will not be silenced, she says. "I want deepfakes to be seen as a form of hate crime."