
Human Rights Watch: ‘Photos of children being used to train AI tools’


According to Human Rights Watch, personal photos of Australian children are being used to create powerful artificial intelligence (AI) tools. Even worse: parents aren't aware of it and haven't consented.

The human rights organization claims that the images are scraped off the internet and compiled into a large data set that companies use to train and build AI tools. The data set is called LAION-5B.

The data set contains links to over 5.85 billion images, paired with their captions, scraped from personal blogs, photo- and video-sharing sites, and other pages on the web.
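To make that concrete: LAION-style data sets store each entry as an image URL paired with the alt text found next to the image on the source page, and it is these captions that can end up carrying personal details. The short Python sketch below is a purely hypothetical illustration of such a record; the names, URL and values are invented, and it is not an actual LAION-5B entry.

# Hypothetical sketch of a LAION-style record: an image URL paired with
# the caption (alt text) scraped from the hosting page. All names, URLs
# and values here are invented for illustration only.
record = {
    "url": "https://family-blog.example/photos/first-day.jpg",
    "caption": "Mia, age 4, first day of school, Melbourne",  # invented
    "width": 1024,
    "height": 768,
}

# AI tools are trained on billions of such (image, caption) pairs, so any
# personal detail left in a caption becomes part of the training data.
print(f"{record['url']} -> {record['caption']}")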

Once these images are collected, the true privacy nightmare begins.

Sometimes the photos come with captions that reveal intimate details about children and even newborn babies, such as names, locations and medical records. Safeguards meant to prevent the leakage of personal information have failed more than once.

Submitting a removal request does little good: according to Human Rights Watch, current AI models cannot forget what they have learned.

Even more harm is done when people manipulate these AI tools, which are often free and easy to use, to create deepfakes and realistic child sexual abuse material (CSAM), or exploit them for grooming. And once these fake images or videos hit the web, it's practically impossible to have them removed.

“Children should not have to live in fear that their photos might be stolen and weaponized against them. The Australian government should urgently adopt laws to protect children’s data from AI-fueled misuse,” says Hye Jung Han, children’s rights and technology researcher and advocate at Human Rights Watch.

Han calls generative AI 'a nascent technology' whose harm to children can still be prevented. "Protecting children's data privacy now will help to shape the development of this technology into one that promotes, rather than violates, children's rights," Han says.

