The UK regulator wants to ban apps that can create nude deepfake images of children.

The Children's Commissioner in the UK is calling for a ban on AI deepfake apps that create nude or sexual images of children, according to a new report. These "nudification" apps have become so widespread that many girls have stopped posting photos on social media. And while it is illegal to create or upload CSAM images, the apps used to make deepfake nudes remain legal.

"Children have told me they are frightened by the very idea of this technology even being available, let alone used. They fear that anyone — a stranger, a classmate, or even a friend — could use a smartphone as a way of manipulating them by creating a naked image using these bespoke apps," said Children's Commissioner Dame Rachel de Souza. "There is no positive reason for these [apps] to exist."

De Souza noted that nudification AI apps are widely available on mainstream platforms, including major search engines and app stores. She added that these apps disproportionately target girls and young women, with many tools appearing to work only on female bodies, and that young people are demanding action against their misuse.

In response, de Souza is urging the government to impose a total ban on apps that use artificial intelligence to generate sexually explicit deepfakes. She is also calling for legal obligations on GenAI app developers to assess the risks their products pose to children, for effective systems to remove CSAM from the internet, and for deepfake sexual abuse to be recognized as a form of violence against women and girls.

The UK has already taken steps toward banning such technology, introducing new criminal offenses for creating or sharing sexually explicit deepfakes and announcing plans to criminalize the taking of intimate photos or videos without consent. The Children's Commissioner, however, is focused specifically on the harm the technology does to young people, noting that deepfake abuse has been linked to suicidal ideation and PTSD, as The Guardian reported.

"I could already tell what it was going to be used for, and it was not going to be good things. I could already tell it was gonna be a technological wonder that's going to be abused," said a 16-year-old girl surveyed by the Commissioner.

For those in the US who need support, the National Suicide Prevention Lifeline can be reached at 1-800-273-8255 or by dialing 988. Crisis Text Line is available by texting HOME to 741741 (US), 686868 (Canada), or 85258 (UK). Wikipedia maintains a list of crisis lines for people outside these countries.


