The bureau said it had recently observed an uptick in extortion victims reporting that they had been targeted with doctored versions of innocent images taken from online posts, private messages or video chats.
"The photos are then sent directly to the victims by malicious actors for sextortion or harassment," the alert said. "Once circulated, victims can face significant challenges in preventing the continual sharing of the manipulated content or removal from the internet." The bureau said the images appeared "true-to-life" and that, in some cases, children had been targeted.
The FBI did not go into detail about the program or programs being used to generate the sexual imagery, but it did note that technological advancements were "continuously improving the quality, customizability, and accessibility of artificial intelligence-enabled content creation." The manipulation of innocent pictures to make sexually explicit images is almost as old as photography itself, but the release of open-source AI tools has made the process easier than ever.
Reporter covering cybersecurity, surveillance, and disinformation for Reuters. Work has included investigations into state-sponsored espionage, deepfake-driven propaganda, and mercenary hacking.
Source: Tech Daily Report (techdailyreport.net)