WAND Investigates continues to dig into artificial intelligence. As AI gets smarter, faster and more powerful each day, it's posing challenges for law enforcement. Criminals are using AI as investigators race to keep up.

With a few clicks and less than $50 for software, criminals are using generative AI to commit cyber attacks, identity theft, child abuse and sextortion.

"It makes it more challenging for law enforcement to determine origin and to be able to detect that. The systems are developing faster, you know, the technology is developing faster than the ability to detect it," said Justin Harris, a computer scientist with the FBI's Springfield Office.

The FBI's Springfield Office said few cases have come across its desk, but many have surfaced at agency branches across the country, including investigations where cyber criminals use AI.

"They can use artificial intelligence to kind of increase their skill set. So maybe they don't know how to develop certain pieces of code to attack certain digital systems. So they'll use artificial intelligence then to take them to the next level," Harris explained.

He said AI can write the malicious code, turning an amateur coder into an expert hacker.

"Is technology amazing? And are we going to find positive ways to use AI? Most definitely. But when it comes to this issue, when it comes to child sexual abuse material, the attorney general is going to put the safety of children into that conversation," Christine Feller told WAND News.

Feller is the Program Manager for the Internet Crimes Against Children Taskforce (ICAC) in the Illinois Attorney General's Office.

She said her investigators are trained to look for AI material.

"So as far as our digital forensic examiners that we have, they have extensive, extensive training background that they have to go through. And so really they're using different tools that are made available to them — as far as reviewing content, reviewing content to help determine, you know, are these identified or not?" Feller said.

She too said cases involving AI haven't come into the office yet, but her team is preparing.

"When it comes to working child sexual exploitation investigations, you know, we're seeing our investigators and our digital forensic examiners needing to spend more time in analyzing the content that they review," Feller explained.

She said criminals are using AI to either take an image of a real child or teen, and then create child sexual abuse material — or produce photos or videos that are artificially generated.

"When you're looking to try to identify victims, you know, we're in here. Yes, we want to catch the bad guys — but at the same time, we want to remove kids from the hands of these offenders. And these offenders aren't just, you know, some of them are hands on. But what we know is that they're collecting images from people all around the globe. And these children could be here locally. They could be in another state. They could be in Australia. They could be absolutely anywhere," Feller added.

Feller said this adds a challenge for investigators to determine if a real child has been harmed, or if the image was artificially generated.

"You're looking at those images. Has this child been identified in the past? Is this a child that we have seen numerous times, maybe over the past decade? And when they're looking to see if it's a new victim now, what they might have to start considering is, is this AI?" Feller said.

She said criminals are also using AI after luring teens online.

"Oftentimes in these sextortion conversations, it'll start out where someone's probably posing as a hot girl and you've got a teenage boy chatting back. Maybe images do get sent, because again, they're visual, they're not thinking long term. They're in the moment — they send that picture. But what we see, though, in those sextortion cases, is that the individual then flips on them. They demand money to be sent to them. They will make collage images with the child's face, possibly the image that they sent," Feller explained.

Others may not even ask for pictures, but instead create deepfake nude images and use those to blackmail victims.

"I can just go online and find something — and who's going to know if it's real or not. But I'm going to have your face there. I'll have your Instagram, I'll have your Snap, your Facebook, any social media you're on. And you know what I'm going to send to your friends," Feller said.

Feller said there are steps you can take to protect yourself and your kids. She recommends parents be cautious online and talk with teens about what they're posting as well.

"Kids are going to make mistakes. And in these sextortion cases, unfortunately, we know of at least a dozen suicides in the past year. And that's because kids have that moment where they're thinking short term. They're like, I have just ruined my life," Feller said.

Justin Harris said while bad actors are using AI to commit crimes, investigators can also use it to find perpetrators faster.

"So as cases progress, we accumulate a lot of data that we need to triage and go through fast and efficiently. Artificial intelligence allows us to do that. What that does for us is generate leads, which leads to things that, you know, are relevant to the investigation. And then we have human experts verify and validate those leads after artificial intelligence sifts through that data," Harris said.

He said the FBI is working to ensure agents and investigators understand the threat and how the technology works.

"So with the speed of the development, we've integrated a lot with private sector companies trying to stay ahead of the cutting edge of the artificial intelligence development," Harris added.

They are working to stay one step ahead of the criminals.

Copyright 2024. WAND TV. All rights reserved.