This high schooler’s science project could one day save lives

Adolescent mental health as diagnosed by artificial intelligence.

If you or someone you know may be considering suicide, contact the 988 Suicide & Crisis Lifeline by calling or texting 988, or the Crisis Text Line by texting HOME to 741741.

Text messages, Instagram posts and TikTok profiles. Parents often warn their kids against sharing too much information online, wary of how all that data is being used. But a Texas high schooler wants to use that digital footprint to save lives.

Siddhu Pachipala is a senior at Woodlands College Park High School, in a suburb outside of Houston. He’s been thinking about psychology since seventh grade, when he read Thinking, Fast and Slow by psychologist Daniel Kahneman.

Concerned about teen suicide, Pachipala saw a role for artificial intelligence in detecting risk before it’s too late. In his view, it takes too long to help children when they are in pain.

The early warning signs of suicide, such as persistent feelings of hopelessness, mood changes, and disrupted sleep patterns, often go unnoticed by loved ones. “So it’s hard to get people to spot them,” says Pachipala.

For a local science fair, he designed an app that uses AI to scan text for signs of suicide risk. He thinks it could one day help replace outdated diagnostic methods.

“Our writing patterns may reflect what we’re thinking, but it hasn’t really been extended up to this point,” he said.

The app earned him national recognition, a trip to Washington, and a speech on behalf of his peers. It’s one of many ongoing efforts to use AI to help young people with their mental health and to better identify when they’re at risk.

Experts point out that this type of AI, called natural language processing, has been around since the mid-1990s. And it’s not a panacea. “Machine learning is helping us get better. As we get more and more data, we’re able to improve the system,” says Matt Nock, a psychology professor at Harvard University who studies self-harm in young people. “But chatbots won’t be the silver bullet.”
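To make the idea concrete, here is a toy illustration of the kind of natural language processing the experts are describing: scoring a piece of text against a word list. This is only a sketch of the general concept; real systems, including Pachipala’s (whose internals aren’t public), rely on trained statistical models rather than a hand-picked keyword list, and the `RISK_WORDS` set below is entirely hypothetical.

```python
# Hypothetical watch list -- real models learn signals from data,
# they don't use a fixed keyword set like this.
RISK_WORDS = {"hopeless", "worthless", "alone", "goodbye"}

def risk_score(text: str) -> float:
    """Return the fraction of words in the text that appear on the watch list."""
    words = text.lower().split()
    if not words:
        return 0.0
    hits = sum(1 for w in words if w.strip(".,!?") in RISK_WORDS)
    return hits / len(words)

print(risk_score("i feel hopeless and alone"))  # 0.4 (2 of 5 words flagged)
```

Even this crude version hints at why Nock cautions against treating such tools as a silver bullet: a score is only as good as the patterns behind it.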

Colorado psychologist Nathaan Demers, who oversees mental health websites and apps, says personalized tools like Pachipala’s could help fill a gap. “When you walk into CVS, there’s that blood pressure cuff,” Demers said. “And maybe it’s the first time anyone realizes, ‘Oh, I have high blood pressure. I had no idea.’”

He hasn’t seen Pachipala’s app, but theorizes that innovations like his increase self-awareness about underlying mental health issues that might otherwise go unrecognized.

Building SuiSensor

Pachipala set about designing an app that someone could download to self-assess their suicide risk. Users could then use the results to advocate for their care needs and connect with providers. After many nights of planning, he had built SuiSensor.

Siddhu Pachipala (Chris Ayers Photography/Society for Science)

Using sample data from a medical study, based on journal entries from adults, Pachipala said SuiSensor predicted suicide risk with 98% accuracy. While it was only a prototype, the app could also generate a contact list of local doctors.

In the fall of his senior year of high school, Pachipala entered his project in the Regeneron Science Talent Search, an 81-year-old national science and math competition.

There, panels of judges critiqued him on his knowledge of psychology and general science with questions like, “Explain how pasta boils. … OK, now let’s say we took it into space. What happens now?” Pachipala recalled. “You came off those panels and you were beat up and bruised, but, like, better for that.”

He finished ninth overall in the competition and took home a $50,000 prize.

The judges found that “his work suggests that the semantics in an individual’s writing could be related to his or her psychological health and suicide risk.” While the app isn’t currently downloadable, Pachipala hopes that, as an undergraduate at MIT, he can continue working on it.

“I think we don’t do it enough: try to approach [suicide intervention] from an innovation perspective,” he said. “I think we’ve stuck with the status quo for a long time.”

Current applications of AI in mental health

How does his invention fit into larger efforts to use AI in mental health? Experts note that many such efforts are underway, and that they carry risks. Matt Nock, for example, is concerned about false alarms. He applies machine learning to electronic health records to identify people at risk of suicide.

“Most of our predictions are false positives,” he said. “Is there a cost there? Does it hurt to tell someone they’re at risk of suicide when they really aren’t?”
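Nock’s point about false positives follows from basic probability: when the outcome being screened for is rare, even an accurate screen flags mostly people who are not at risk. The short calculation below illustrates this with Bayes’ rule; the sensitivity, specificity, and prevalence figures are hypothetical, not drawn from Nock’s research or from SuiSensor.

```python
def positive_predictive_value(sensitivity: float,
                              specificity: float,
                              prevalence: float) -> float:
    """Probability that a positive flag is a true positive (Bayes' rule)."""
    true_pos = sensitivity * prevalence
    false_pos = (1 - specificity) * (1 - prevalence)
    return true_pos / (true_pos + false_pos)

# Hypothetical screen: catches 90% of at-risk students (sensitivity) and
# correctly clears 90% of everyone else (specificity), with 1% of those
# screened actually at risk (prevalence).
ppv = positive_predictive_value(0.90, 0.90, 0.01)
print(f"{ppv:.1%}")  # prints 8.3% -- most flags are false positives
```

Under these assumed numbers, more than 9 in 10 flags would be false alarms, which is exactly the cost Nock is asking about.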

And data privacy expert Elizabeth Laird is concerned about the implementation of such approaches particularly in schools, given the lack of research. She directs the Equity in Civic Technology Project at the Center for Democracy & Technology (CDT).

While acknowledging that “we have a mental health crisis and should do everything we can to stop students from harming themselves,” she remains skeptical given the lack of “independent evidence that these tools do” what they claim.

All of this focus on AI comes as youth suicide rates (and risk) are on the rise. Although there is a lag in the data, the Centers for Disease Control and Prevention (CDC) reports that suicide is the second leading cause of death for youth and young adults ages 10 to 24 in the U.S.

Efforts like Pachipala’s tap into a wide range of AI-powered tools available to monitor the mental health of young people, accessible to both clinicians and non-professionals. Some schools use activity tracking software that scans devices for warning signs of a student harming themselves or others. One concern, however, is that once these red flags emerge, that information can be used to discipline students rather than support them, “and that such discipline falls along racial lines,” Laird said.

According to a survey shared by Laird, 70% of teachers whose schools use data-tracking software said it was used to discipline students. Schools may stay within the bounds of student record privacy laws, but fail to implement safeguards that protect them from unintended consequences, Laird said.

“The privacy conversation has moved from just legal compliance to what is actually ethical and right,” she said. She points to survey data showing that nearly 1 in 3 LGBTQ+ students report being outed, or knowing someone who has been outed, as a result of activity-tracking software.

Matt Nock, the Harvard researcher, acknowledges artificial intelligence’s place in crunching numbers. He uses machine learning technology similar to Pachipala’s to analyze medical records. But he stresses that much more experimentation is needed to vet computational assessments.

“Much of this work is really well-intentioned, trying to use machine learning and artificial intelligence to improve people’s mental health … but unless we do the research, we won’t know if this is the right fit,” he said.

More and more students and families are turning to schools for mental health support. Software that scans young people’s words, and by extension thoughts, is one approach to taking the pulse of young people’s mental health. But it can’t take the place of human interaction, Nock said.

“Technology will help us, hopefully, get better at knowing who is at risk and knowing when,” he said. “But people want to see humans; they want to talk to humans.”

