God knows dealing with kids isn’t easy. No one knows that better than parents… and teachers. “When I was a high school teacher, I can remember high school girls coming and saying, ‘I’m pregnant, I think I might need an abortion,’” said Penn State Associate Professor Deborah Schussler.
It’s tough to know what to say. What happens when a student tells a teacher they’re being bullied or abused? Or that they’re gay? “Anybody that’s worked in one of those roles can, in the course of time, always identify a moment that sent them reeling. That they replay in their head year after year, wishing they could go back and redo,” said Penn State Assistant Professor Jennifer Frank.
A team of researchers at Penn State that includes Frank and Schussler is trying to spare teachers those regrets, and to better serve their students, with the help of artificial intelligence. They’ve created a computer program, called Chatbot, that features a virtual student named Eli. “He’s in fourth grade and he looped with the same teacher from a year before, so he’s used to having that teacher. He’s known to have some outbursts, but generally he’s a quiet student,” said Emily Chukusky, a research assistant on the team.
It took about a year to create Eli, and the work was paid for with a grant from Penn State. Because the program uses artificial intelligence, the team can keep adding to it, giving Eli the ability to have elaborate interactions with someone training to become a teacher. “If they say something and the child is emotionally charged, the child may say, ‘I’m really pissed off. Leave me alone.’ We haven’t put any expletives in it, but it’s not unusual that a child may say to a teacher, f*!k you,” said Schussler.
Being able to react to a computer simulation of a kid saying the F-word gives teacher candidates the chance to practice, so that they’re prepared if it happens in real life. Twenty-five teacher candidates used the simulation for the first study, and each of them got to interact with Eli four times. Some handled him pretty well, but not everyone. “They felt that they’re inadequate. That if it happened in real life they would have felt really disappointed in themselves,” said Julia Mahfouz, another research assistant on the team.
However, that’s kind of the point: learning from your mistakes. For now, you can only talk to Eli by typing, but the team is hoping to add animation to give him a more realistic look. In the future, the team sees potential for the technology in any field that deals with volatile situations, such as nursing or law enforcement. “All kinds of potentially volatile situations, if you could identify what those exchanges are like and you’re able to identify how optimally to respond, artificial intelligence could be used to simulate that,” said Frank. These researchers are showing how machines can train people.