Federal authorities are warning of a recent uptick in sexualized deepfake images being used in a new wave of so-called sextortion campaigns. On Monday, the Federal Bureau of Investigation said it has seen an increase in such attacks since April.

"The FBI has observed an uptick in sextortion victims reporting the use of fake images or videos created from content posted on their social media sites or web postings, provided to the malicious actor upon request, or captured during video chats," the warning states.

Targets are then extorted for money under the threat of having the deepfake images or videos shared with family members or posted to social media feeds for friends to see, the FBI warns.
Sinister new twist on extortion
Sextortion – the threat of leaking sexually compromising content featuring a victim unless they pay up – is a well-thumbed chapter in the cybercrime playbook. But threat actors have begun taking sextortion to the next level: using deepfake technology to generate explicit images or videos that appear to be of the target. The fact that the content is fake may not diminish the threat if the victim fears exposure could nonetheless embarrass or harm them.

According to the FBI, attacks can start with sextortionists scraping content their victims have posted on social media and using it to create the deepfakes. Other victims are duped into handing over images and videos of themselves, or discover that content captured during a video call has been repurposed.

“The FBI continues to receive reports from victims, including minor children and non-consenting adults, whose photos or videos were altered into explicit content,” the bureau said in a new alert from its Internet Crime Complaint Center (IC3). “The photos or videos are then publicly circulated on social media or pornographic websites, for the purpose of harassing victims or sextortion schemes.”

Victims typically reported that the malicious actors threatened to share the deepfakes with family members or social media friends if they didn’t pay or, in some cases, refused to send real sexually explicit content.

Even before the deepfake element was added into the mix, sextortion was a growing industry for cybercriminals.

Problem getting worse
In 2021, the bureau said $8 million was known to have been extorted from Americans over a seven-month period. And in January of this year, the FBI and partner agencies issued an alert about an “explosion” of cases involving children and teens being sextorted.

“Over the past year (2022), law enforcement agencies have received over 7,000 reports related to the online sextortion of minors, resulting in at least 3,000 victims, primarily boys. More than a dozen sextortion victims were reported to have died by suicide,” the alert said. “The FBI, U.S. Attorney’s Office, and our law enforcement partners implore parents and caregivers to engage with their kids about sextortion schemes so we can prevent them in the first place.”

The growing problem of malicious deepfakes – not just as a sextortion tool, but also for other nefarious activities, including spreading misinformation – has attracted the attention of lawmakers. Some states, including California and Virginia, have already banned deepfake porn.

One deepfake victim, identified in an Insider report as QTCinderella, tweeted about her exploitation on January 31, 2023: "The amount of body dysmorphia I’ve experienced since seeing those photos has ruined me. It’s not as simple as ‘just’ being violated. It’s so much more than that."