2020
DOI: 10.1007/978-3-030-49576-3_7

Understanding Moderation in Online Mental Health Communities

Cited by 30 publications (26 citation statements)
References 66 publications
“…In light of this research, our work could inform the design of tools to support the work of substance use related online community moderators. For instance, our results could inform the design of tools to help moderators target users posting risky content for interventional outreach, as discussed in recent studies on marginalized populations appropriating social media for health needs ( Andalibi et al, 2016 ; Saha et al, 2020 ; Wadden et al, 2021 ). Facebook similarly uses artificial intelligence (AI) to provide resources to those identified as being at risk for suicide based on a partnership with the National Suicide Prevention Lifeline ( Constine, 2017 ), and in March 2020 Reddit announced a partnership with Crisis Text Line to allow users to flag other users who may be in crisis ( Perez, 2020 ).…”
Section: Discussion
confidence: 79%
“…Although we cannot speculate on the exact reason why Twitter presented more misinformation in the case of OUD-related misinformation as that requires causal inference analysis, which is beyond the scope of this paper, prior literature has pointed out the lack of active expert or clinical-based moderation on Twitter [51]. Although web-based health communities are also not immune to bad behavior and antisocial activities such as trolling, spamming, and harassment, these communities are often guided by strict norms against such behavior and moderated to ensure the quality and credibility of the content being shared [52]. Prior studies on different types of web-based health communities have demonstrated that adequate active moderation increases the engagement of members and consequently also increases the beneficial outcomes for members in a web-based community [53].…”
Section: Principal Findings
confidence: 99%
“…Due to the lack of access to mental health services, individuals with mental health concerns are turning to online mental health communities such as Reddit to share their emotional challenges and seek social support. The goal of such communities is to provide a "safe haven" for mental health disclosure and peer-to-peer support for stigmatized concerns (Saha et al 2020). Previous research suggests that self-disclosure through anonymous communications may promote better mental health outcomes.…”
Section: Self-disclosure in Online Mental Health Communities
confidence: 99%