2022
DOI: 10.1111/sltb.12853
Validating a predictive algorithm for suicide risk with Alaska Native populations

Abstract: Introduction: The American Indian/Alaska Native (AI/AN) suicide rate in Alaska is twice the state rate and four times the U.S. rate. Healthcare systems need innovative methods of suicide risk detection. The Mental Health Research Network (MHRN) developed suicide risk prediction algorithms in a general U.S. patient population. Methods: We applied MHRN predictors and regression coefficients to electronic health records of AI/AN patients aged ≥13 years with behavioral health diagnoses and primary care visits betwee…
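The validation approach described in the abstract, applying a model's previously estimated regression coefficients to a new population's records and measuring discrimination, can be illustrated with a minimal Python sketch. The predictor names and coefficient values below are placeholders, not the published MHRN model; only the general pattern (fixed logistic regression coefficients scored against new EHR data, evaluated with AUC) is taken from the text.

```python
import numpy as np
from sklearn.metrics import roc_auc_score

# Placeholder predictor names and coefficients; the actual MHRN model uses a
# much larger set of EHR-derived predictors with coefficients published elsewhere.
COEFFICIENTS = {
    "intercept": -6.2,
    "prior_suicide_attempt": 1.8,
    "phq9_item9_positive": 1.1,
    "substance_use_disorder": 0.6,
}

def predict_risk(patient_features: dict) -> float:
    """Apply fixed (pre-estimated) logistic regression coefficients to one visit's features."""
    z = COEFFICIENTS["intercept"]
    for name, beta in COEFFICIENTS.items():
        if name != "intercept":
            z += beta * patient_features.get(name, 0.0)
    return 1.0 / (1.0 + np.exp(-z))  # logistic link -> predicted probability

def external_validation_auc(visits, outcomes) -> float:
    """Score every visit in the new population and compare predictions with observed outcomes."""
    scores = np.array([predict_risk(v) for v in visits])
    return roc_auc_score(outcomes, scores)  # AUC, the discrimination metric cited below
```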

Cited by 8 publications (4 citation statements) | References 40 publications
“…Walsh and colleagues evaluated the performance of a suicide attempt prognostic model in an urban clinical care setting. Shaw et al applied the Mental Health Research Network (MHRN) model to electronic health records of Alaska Native primary care patients and found good performance, with an AUC of 0.83. However, a recent analysis of the MHRN model performance disaggregated by race suggested the model underperformed among Black and Native American patients compared with White, Hispanic, and Asian patients.…”
Section: Discussion (mentioning, confidence: 99%)
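The disaggregated analysis mentioned in this excerpt amounts to computing the same discrimination metric separately within each racial or ethnic group. A minimal sketch under that reading, using hypothetical column names (race, outcome, risk_score); it is not the cited authors' code.

```python
import pandas as pd
from sklearn.metrics import roc_auc_score

def auc_by_group(df: pd.DataFrame, group_col="race",
                 outcome_col="outcome", score_col="risk_score") -> dict:
    """Compute AUC separately within each subgroup to check for performance gaps."""
    results = {}
    for group, subset in df.groupby(group_col):
        # AUC is undefined when a subgroup contains only one outcome class; skip those.
        if subset[outcome_col].nunique() < 2:
            continue
        results[group] = roc_auc_score(subset[outcome_col], subset[score_col])
    return results
```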
“…Suicide risk prediction algorithms using social data can also be prone to algorithmic bias, as shown by studies that have found that algorithms perform poorly when utilizing social media data, facial imagery, and speech samples collected from racial and ethnic minority groups compared with data collected from White groups (Blodgett and O'Connor, 2017; Hitczenko et al, 2022). However, algorithms for suicide risk prediction that have been tested and developed specifically for use within at-risk subgroups, such as one algorithm tested among Alaskan Native populations (Shaw et al, 2022) and another developed for use among Native American communities (Haroz et al, 2020), have had more promising results.…”
Section: Suicide Risk Identification or Prediction Algorithms Can Be … (mentioning, confidence: 99%)
“…Machine learning (ML) models developed to predict suicide attempts from electronic health record (EHR) data have shown acceptable to good accuracy using a wide range of algorithms, predictors, and outcome windows [3-19]. Although the practical utility of suicide prediction models has been debated [20-24], recent studies suggest that several models achieve good enough levels of accuracy at thresholds that take clinician and health care system burden into account [25,26]. Several health care delivery systems have already begun implementing suicide prediction algorithms into clinical practice, providing augmented risk information during mental health visits to enhance treatment planning and collaboration between patients and clinicians.…”
Section: Introduction (mentioning, confidence: 99%)
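One way to read "thresholds that take clinician and health care system burden into account" is that the score cutoff is chosen to cap how many visits can trigger an alert. A minimal sketch under that assumption, with hypothetical inputs; the cited studies' exact procedures are not specified here.

```python
import numpy as np

def threshold_for_alert_budget(scores, outcomes, alert_fraction=0.05):
    """Pick the cutoff that flags roughly the top `alert_fraction` of visits,
    then report sensitivity and positive predictive value at that cutoff."""
    scores = np.asarray(scores, dtype=float)
    outcomes = np.asarray(outcomes, dtype=int)
    cutoff = np.quantile(scores, 1.0 - alert_fraction)  # e.g., top 5% of risk scores
    flagged = scores >= cutoff
    sensitivity = outcomes[flagged].sum() / max(outcomes.sum(), 1)
    ppv = outcomes[flagged].mean() if flagged.any() else 0.0
    return cutoff, sensitivity, ppv
```

The alert fraction directly encodes workload: a smaller fraction means fewer flagged visits for clinicians to review, usually at the cost of lower sensitivity.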