WAVY.com

Fake images, real crimes: Content creators without conscience use AI-generated photos

HAMPTON ROADS, Va. (WAVY) — A local case involving child sexual abuse material illustrates a new battleground for investigators, as some of the images didn’t involve actual children, but broke the law just the same.

It’s what happens when technology becomes treacherous. An innocent snap of your child could end up in the hands of content creators without a conscience.

As part of a plea deal earlier this year, Dustin Harrell, 39, of Virginia Beach, is serving seven years in prison. Federal undercover investigators found that Harrell had distributed child sexual abuse material. His devices held hundreds of images and videos, and while some showed real children, others were animations of children.

“He had both [computer-generated imagery] and videos as well as actual victims and actual child pornography,” said Brian Dugan, special agent in charge of the FBI Norfolk office. “Unfortunately, it did not surprise me.”

But an image doesn’t have to show a real child to be illegal child pornography.

“People are fooling themselves if they think all I have is cartoons of kids doing things or images that I’ve created — I’m not guilty of child pornography. No, actually, you’re producing child pornography,” Dugan said. “The law allows us to arrest people that have representations of obscene material. There’s no gray area. Obscene material is obscene material, whether it’s live and there’s an actual victim or it’s representations.”

With its CyberTipline, the National Center for Missing and Exploited Children is a vital partner for investigators when it comes to child sexual abuse material.

Fallon McNulty, director of the tipline, said it’s “incredibly busy, so we have seen report volume grow exponentially over the years.”

McNulty said the tipline received 36 million reports of child sexual abuse material last year alone, the equivalent of nearly 100,000 reports every day.

Technology is driving the deluge.

“The emergence of generative AI is, of course, of huge concern for us because of the implications that it can have on child safety,” McNulty said.

Andrew Rice, an assistant commonwealth’s attorney in Virginia Beach who handles child sexual abuse material prosecutions, said “[The abuse of technology] is the forefront of what we’re fighting right now.”

He said many cases begin with something innocent.

“These people are taking random photographs of clothed children,” he said. “They could go to the beach, you know, at the Oceanfront, take a picture of a child, upload it into artificial intelligence, and then their child’s clothes are gone.”

From January 2023 through June 2024, the CyberTipline took in more than 7,000 reports involving content that appears to have been made with generative AI, a term for algorithms, such as ChatGPT, that can create new content, including audio, code, images, simulations, text and videos.

ChatGPT, for example, is an artificial intelligence chatbot trained on large volumes of text data, which allows it to understand prompts and produce natural-sounding responses similar to human conversation.

Both state and federal authorities say the role that the National Center for Missing and Exploited Children, or NCMEC, plays in prosecutions can’t be overstated.

“We’re working with them hand-in-glove because they’re able to get complaints from folks and then turn [them] over to law enforcement,” Dugan said.

Said Rice: “NCMEC is invaluable to what we do.”

“The concern is just how easy it is to utilize these tools,” McNulty said. “There are a number of different sites and apps that are easily found online where even children can be using these to manipulate imagery, and that imagery has severe consequences.”

“It’s very alarming,” Rice warned. “There are multiple websites, some that you have to pay for and some that are free, that you can go to and upload photographs and create these pictures. And it’s devastating to the victims when they find out.”

10 On Your Side tested a readily available AI generator, prompting it to create an innocuous image: “child on a swing set.” It refused, instead returning a warning that the request could produce unsafe material.

The combination of powerful AI tools and easy access has authorities emphasizing vigilance on the part of parents.

“Please check your kids’ cell phones,” Rice said. “There are so many social media apps out there right now. There are so many people that are pretending to be people that they’re not.”

Both the FBI and the Virginia Beach Commonwealth’s Attorney’s Office guard against burnout when it comes to child sexual abuse material. They either rotate staff or ask for volunteers to handle the cases, limiting the toll of exposure to these images.

“It’s horrific,” Dugan said, “some of the stuff that our folks, the task force officers, the victim specialists and agents come across on a daily basis as they do their investigations.”

He told 10 On Your Side that earlier in his FBI career, he had the chance to transfer to an office closer to home, but the position would have involved investigations of child sexual abuse material. He and his wife had just had a child. Once Dugan saw some of the imagery from previous cases, he said, “I turned it down.”

When Rice began as a prosecutor in Virginia Beach, he handled child sexual abuse material cases, but after a couple of years he had to step away from them. He returned to those prosecutions a few years later and discovered “it’s why I actually became a prosecutor.”

Rice said child sexual abuse material cases can be difficult to take to trial because they can involve young children having to testify. He said a seven-year-old is about the youngest he would put on the stand.