Computer-generated child sexual abuse images made with artificial intelligence tools like Stable Diffusion are starting to proliferate on the Internet and are so realistic that they can be indistinguishable from photographs depicting actual children, according to a new report.
In some cases, children are using these tools on each other. At a school in southwestern Spain, police have been investigating teens' alleged use of a phone app to make their fully dressed schoolmates appear nude in photos. "They're taking existing real content and using that to create new content of these victims," he said. "That is just incredibly shocking."
The report says technology providers could do more to make it harder for the products they've built to be misused in this way, though that effort is complicated by the fact that some of the tools, once released, cannot be recalled. Stability AI, the maker of Stable Diffusion, later rolled out new filters that block unsafe and inappropriate content, and the license to use its software bans illegal uses.