Events Calendar

Raleigh Health IT Summit
2017-10-19 - 2017-10-20 · All Day · Raleigh
About Health IT Summits: Renowned leaders in U.S. and North American healthcare gather throughout the year to present important information and share insights at the Healthcare [...]

Connected Health Conference 2017
2017-10-25 - 2017-10-27 · All Day
The Connected Life Journey: Shaping health and wellness for every generation. Top-rated content, valued perspectives from providers, payers, pharma and patients, and unmatched networking with key [...]

TEDMED 2017
2017-11-01 - 2017-11-03 · All Day · La Quinta
A healthy society is everyone’s business. That’s why TEDMED speakers are thought leaders and accomplished individuals from every sector of society, both inside and outside [...]

AMIA 2017 Annual Symposium
2017-11-04 - 2017-11-08 · All Day · Washington
Call for Participation: We invite you to contribute your best work for presentation at the AMIA Annual Symposium – the foremost symposium for the science [...]

A study shows that AI can detect suicide risk early.

EMR Industry

As artificial intelligence helps doctors detect diseases such as cancer at an early stage, it is also demonstrating its potential in tackling mental health crises. According to one study, artificial intelligence can identify patients who are at risk of suicide, providing a tool for prevention in everyday medical settings.

The study, published in the journal JAMA Network Open, examined two approaches to notifying doctors about suicide risk: an active “pop-up” alert demanding immediate attention, and a less urgent passive system that displays risk information in a patient’s electronic chart.
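As a rough illustration only (the class and function names below are invented for this sketch, not taken from the study or from any EHR vendor’s API), the two notification styles might be contrasted like this:

```python
from dataclasses import dataclass
from enum import Enum


class AlertMode(Enum):
    ACTIVE = "active"    # interruptive pop-up requiring acknowledgement
    PASSIVE = "passive"  # note stored in the chart, no interruption


@dataclass
class RiskFlag:
    patient_id: str
    risk_score: float  # model-estimated 30-day risk, 0.0-1.0
    mode: AlertMode


def deliver_alert(flag: RiskFlag, chart: dict) -> None:
    """Route a risk flag as an interruptive pop-up or a passive chart note."""
    if flag.mode is AlertMode.ACTIVE:
        # Active: interrupt the clinician and ask for an immediate decision.
        print(f"[POP-UP] Patient {flag.patient_id}: elevated suicide risk "
              f"({flag.risk_score:.0%}). Please acknowledge and consider screening.")
    else:
        # Passive: append the information to the chart for later review.
        chart.setdefault(flag.patient_id, []).append(
            {"note": "elevated suicide risk", "score": flag.risk_score}
        )
```

The design trade-off is visible even in this toy version: the active path forces a decision at the point of care, while the passive path leaves it to the clinician to notice the entry later.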

The study found that active alerts outperformed the passive approach, prompting doctors to assess suicide risk in 42% of cases, compared with only 4% under the passive system. It also highlighted the importance of well-designed prompts for starting a conversation about suicide risk.

This breakthrough, which combines automated risk identification with deliberately tailored alerts, offers hope for identifying and supporting more people in need of suicide prevention services.

Colin Walsh, an associate professor of Biomedical Informatics, Medicine, and Psychiatry at Vanderbilt University Medical Center, emphasized the significance of this work. “Most people who die by suicide have seen a healthcare provider in the year before their death, often for reasons unrelated to mental health,” Walsh said.

Previous research indicates that 77% of people who died by suicide had contact with a primary care provider in the year before their death. These findings highlight the essential role AI can play in bridging the gap between conventional medical care and mental health interventions.

The Vanderbilt Suicide Attempt and Ideation Likelihood (VSAIL) model, an AI-driven system developed at Vanderbilt, was tested in three neurology clinics. The model uses routine data from electronic health records to estimate a patient’s 30-day probability of attempting suicide. When high-risk patients were identified, practitioners were prompted to start focused conversations about mental health.
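To make that pipeline concrete, here is a minimal sketch assuming a simple logistic score over a few invented EHR features and an arbitrary risk cutoff; VSAIL’s actual features, model form, and threshold are not described in this article.

```python
import math

# Illustrative sketch only: the features, weights, and 5% cutoff below are
# invented assumptions, not published details of the VSAIL model.


def predict_30_day_risk(ehr_features: dict) -> float:
    """Toy logistic model standing in for an EHR-based risk score."""
    weights = {"prior_attempt": 2.1, "recent_ed_visit": 0.8, "chronic_pain": 0.4}
    bias = -4.0
    z = bias + sum(w * ehr_features.get(name, 0.0) for name, w in weights.items())
    return 1.0 / (1.0 + math.exp(-z))


def maybe_prompt_screening(patient_id: str, ehr_features: dict,
                           threshold: float = 0.05) -> bool:
    """Prompt a screening conversation when estimated 30-day risk crosses the cutoff."""
    risk = predict_30_day_risk(ehr_features)
    if risk >= threshold:
        print(f"Prompt clinician: discuss mental health with patient {patient_id} "
              f"(estimated 30-day risk {risk:.1%}).")
        return True
    return False


# Example: a patient with a prior attempt and a recent emergency department visit.
maybe_prompt_screening("patient-001", {"prior_attempt": 1.0, "recent_ed_visit": 1.0})
```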

Walsh clarified: “Universal screening isn’t practical everywhere, but VSAIL helps us focus on high-risk patients and spark meaningful screening conversations.”

While the findings were promising, the researchers emphasized the need to balance the benefits of active alerts against their possible drawbacks, such as workflow disruption. The authors suggested that comparable approaches could be implemented in other medical specialties to broaden their reach and impact.

Earlier, in 2022, Cambridge University published research that used PRISMA criteria to assess individuals at risk of attempting suicide.