UC-developed chatbot to help supervision officers, parolees
Professors in the University of Cincinnati's School of Information Technology are developing a chatbot for the Ohio Department of Rehabilitation and Correction (ODRC) to help supervision officers and people on parole.
Chatbots, such as ChatGPT, have exploded in popularity for their ability to mimic human conversation and generate responses to requests and questions posed by their human users.
Murat Ozer, Ph.D., an associate professor in UC's School of Information Technology, part of the university's College of Education, Criminal Justice, and Human Services, is leading the development of the corrections chatbot that will use curated information to offer a new resource for the criminal justice system.
"I think there's a big future for chatbots, especially for specialized chatbots," Ozer said.
Publicly available chatbots like ChatGPT and search engines such as Google use indexed data, scraping large swaths of the internet to collect information that informs their responses to queries.
The ODRC chatbot will be more selective in the information it uses to ensure clients receive appropriate responses. Its sources may include private material, such as password-protected corrections resources, along with indexed data that has been screened for accuracy.
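The curated approach described above can be illustrated with a minimal sketch: instead of drawing on open web data, the bot answers only from an allowlisted, vetted corpus and declines otherwise. The document names, scoring, and refusal message here are hypothetical; the actual ODRC system's design has not been published.

```python
# Illustrative sketch only: restricting a chatbot's answers to a curated,
# pre-screened corpus rather than scraped internet data. All identifiers
# and sample texts are invented for demonstration.

CURATED_DOCS = {
    "checkin-policy": "Supervision check-ins can be scheduled weekly or monthly.",
    "housing-resources": "Approved transitional housing programs accept referrals.",
}

def retrieve(query: str) -> list[str]:
    """Return curated passages that share words with the query, best match first."""
    words = set(query.lower().split())
    hits = []
    for doc_id, text in CURATED_DOCS.items():
        overlap = words & set(text.lower().split())
        if overlap:
            hits.append((len(overlap), text))
    return [text for _, text in sorted(hits, reverse=True)]

def answer(query: str) -> str:
    passages = retrieve(query)
    if not passages:
        # Refuse rather than guess from unvetted sources.
        return "I don't have vetted information on that; please ask your officer."
    return passages[0]
```

A production system would replace the word-overlap scoring with proper retrieval, but the key property is the same: no vetted source, no answer.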
With funding from a federal grant, Ozer is collaborating with two UC assistant professors, Nelly Elsayed, Ph.D., and Zag ElSayed, Ph.D., along with the Corrections Institute (UCCI) at UC to select the information fed into the chatbot and to train it to respond as supervision officers would.
"They provided us with videos based on the different scenarios of how supervision officers should respond," Ozer said of working with the UCCI.
While it won't replace the interactions between supervision officers and parolees, the chatbot will supplement them. The technology will be available at any time of the day, for both scheduled check-ins and unscheduled interactions, allowing clients to access resources whenever they're needed.
"If the client doesn't want to meet the supervision officers, they would be able to go with the chatbot," Ozer said. "Sometimes they are shy or they don't want to share something with the supervision officer—but they can talk to the chatbot."
When users begin their interactions with the chatbot, they'll converse freely, like small talk between two people.
"A problem with clients for their first visit is often they are anxious," Ozer said. "The chatbot will share information and will motivate them; motivation is very important at the first stage."
The chatbot will then move into screening questions, analyzing clients' responses to identify their needs.
"We are going to [design] the chatbot conversation [to mimic human interaction] as much as possible," Ozer said. "It's going to be very smooth and appealing to the client to continue to speak to the chatbot. It's going to be like their friend."
Elsayed and ElSayed are focusing on developing artificial intelligence that could detect deception from the chatbot's users.
The chatbot also will be able to detect risky behaviors and assess users' mental well-being. If it determines a client needs immediate attention, such as when the person is suicidal, it will alert people who can respond.
"If there is an urgent situation that we need to inform or alert supervision officers, we could send that to a center that would serve 24 hours a day," Ozer said. "The chatbot will guide them [and] will try to help them with their needs."
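The escalation flow described here can be sketched in a few lines: a risk screen runs on each message, and anything urgent is routed to a 24-hour center rather than handled by the bot alone. The keyword list and routing tags below are stand-in assumptions; a real deployment would use a validated risk-assessment model, not simple string matching.

```python
# Hypothetical sketch of the escalation step: flag urgent messages and
# route them to a 24-hour response center. The term list and tags are
# invented for illustration, not taken from the ODRC system.

URGENT_TERMS = {"suicide", "suicidal", "hurt myself", "overdose"}

def needs_escalation(message: str) -> bool:
    """Crude stand-in for a real risk-assessment model."""
    text = message.lower()
    return any(term in text for term in URGENT_TERMS)

def route(message: str) -> str:
    if needs_escalation(message):
        # In production this would page the 24-hour center, not return a tag.
        return "ALERT_24H_CENTER"
    return "CONTINUE_CHAT"
```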
The team hopes to release its full version of the chatbot to the ODRC next year for statewide implementation.
In the future, Ozer would like to expand the chatbot's capabilities and release a version that would be available publicly for victims of crime.
"I don't think it will be too long after this because we will have the template, we'll know how it works," Ozer said of expanding the chatbot to victim services.
For crime victims who are reluctant to file a police report, the chatbot would allow them to anonymously access curated resources.
"We know there are more crimes than are officially reported. Victims do not know their rights or maybe they're afraid of going to the police and reporting the crime," Ozer said. "They may not have access to an attorney, but the chatbot could provide legal resources for them."
If victims of crime are able to anonymously seek out resources, Ozer expects it to help authorities gain a more complete picture of crime and allow them to better understand where crimes occur and which ones are underreported. "We want to help them and reduce victimization," Ozer said.
"We want them to know their rights and how they can get help."
For the chatbot development, Ozer also is collaborating with the Information Technology Solution Center, an initiative of UC's School of Information Technology that aims to explore innovative and affordable solutions combining the power of computing technology and data to serve the needs of individuals and organizations.
Provided by University of Cincinnati