Let’s Not Create Thoughtcrimes via the Backdoor


CHARLES AMOS

Charles Amos studied Political Theory at the University of Oxford and writes The Musing Individualist Substack.

Earlier this month Liz Kendall announced that those creating nonconsensual intimate deepfakes of women and children will now face full criminal penalties. She also announced an outright ban on nudification apps.

Her justification for both measures is the need to protect the dignity of women and children and to prevent their harassment, torment and abuse. These measures rest on a questionable morality; and regardless, if preserving dignity warrants banning nudifications, i.e. computer imaginations, then the state must be implausibly committed to mind control to stop men from imagining women naked too. Accepting reputational damage as properly tortious, deepfakes misrepresented as the truth should be dealt with under the existing law of defamation. In short, liberals must oppose this legal moralism outright.

Liz Kendall’s argument for banning nudification apps largely hinges on her idea of dignity. Since Kendall appears to take no issue per se with the consensual creation of intimate deepfakes, the dignity she refers to must amount to women having some right to their sexual image and likeness which only they can waive. This would, very implausibly, commit her to the claim that men imagining their crushes and colleagues naked are committing a wrong, because by creating their likeness in the mind’s eye they are violating women’s rights.

If Kendall were enforcing women’s rights fully – were mind control possible – she would have to stop men from imagining real women naked; they could sexually imagine only porn stars, and their crushes only after getting a written note from them. This is all very implausible; hence the said right probably does not exist, meaning there is no moral basis for the intimate deepfake ban, which is founded on that right needing legal protection. Put simply: if imagining naked women is fine, then AI imagining naked women is fine too.

Kendall might dispute my analogy because the image in the mind’s eye is less vivid than Grok’s creations, but this hardly gets her to the total ban she wants anyway. Grok can probably put Julia Hartley-Brewer’s head on the scantily clad body of Ava Gardner – a body which we all know is not Julia’s – yet Kendall would still want that banned, meaning that whether the image created is vivid or realistic is irrelevant to her case. Ultimately, women have no more right to their sexual likeness than a landowner does to the likeness of his landscape, i.e. none at all.

At this point, I suspect the focus will move onto harassment, torment and abuse. Now, if tormenting a person is to impose severe mental suffering upon them, torment clearly does not warrant banning; indeed, it is probably not even immoral. Getting into an argument you lose and watching a disturbing film at the cinema can both cause severe mental suffering, but we are not banning those. And should it be claimed that people choose to enter into them, I say people choose to go onto X with its deepfakes too.

The real concern about intimate deepfakes is that they will be used to defame and blackmail people. Here is not the place for me to dispute, along with Murray Rothbard, whether defamation and blackmail should be illegal. Nevertheless, even accepting the unjustness of defamation and blackmail, this hardly warrants banning nudification apps and the sharing of intimate deepfakes, because they are obviously not real most of the time. And where deepfakes are difficult to tell apart from reality, a minor watermark in the corner of the image should do. Let us disregard all of the aforementioned reasoning, however, and assume women do have a right to their sexual likeness: would this warrant possibly banning X, or fining it 10 per cent of its eligible global turnover, in order to stop the mass erosion of the dignity of women and children?

No. Schools very often have 15-year-old girls in short skirts getting hot playing tennis in the summer, with all the accompanying noise that goes with it. Some 19-year-old boys from a neighbouring university might walk past this scene, perhaps even stop to observe the girls, and then imagine them later. Let us, perhaps rightly, assume such sexual imagining is wrong. Would it warrant forcing the school to grow hedges all around the tennis courts and playing fields? No, because the immorality has nothing to do with the school but with the young men on the footpath. Neither, then, can X, which already bans intimate deepfakes of minors, be forced to police the whole of its site, i.e. the equivalent of growing hedges everywhere, and be threatened with closure in Britain for failing to do so.

Intimate AI deepfakes are simply vivid imaginings, and should these be banned to protect Kendall’s understanding of dignity, then nothing in principle stands in the way of her banning our everyday intimate imaginings too. Make no mistake: the wretched legislation that is the Online Safety Act is simply the thin end of the wedge towards creating thoughtcrimes. After properly considering defamation and blackmail, liberals of all parties must see Kendall’s poorly argued sentimentality for what it really is: an unacceptable attack on the individual’s freedom to imagine.


This article (Charles Amos: Let’s not create thoughtcrimes via the backdoor) was created and published by Conservative Home and is republished here under “Fair Use” with attribution to the author Charles Amos.

Featured image: Reuters
