Rigorous Examination of Anonymous Reporting System Data to Prevent Youth Suicide and Firearm Violence: An Applied Natural Language Approach
This study will analyze school-based anonymous and confidential reporting system submissions to characterize the types of tips reported, identify the factors that influence student tip submissions and content, and determine whether exposure to training influences tip behavior and content.
Anonymous and confidential reporting systems (ARS) are widely implemented in the U.S., with more than 50% of schools having access to at least one such system. ARS and related school-based tip lines are designed to facilitate student sharing of concerning or suspicious behaviors by eliminating barriers to reporting, such as the social cost of ‘tattling’, supporting students’ self-efficacy to correctly identify and report a threat, and creating norms around school safety. Originally developed in response to school shooting threats, ARS now allow students to report myriad concerns, including those related to mental health, bullying, substance use, weapons, and intimate partner/domestic violence, among others. Currently, more than 20 states mandate student access to some form of anonymous or confidential reporting system.

Despite their widespread adoption and availability to students, limited empirical data exist on how these systems function and on their ability to prevent firearm and other violent injury in school contexts. Anecdotal and summary reports by ARS providers highlight the potential for preventing youth injury, but few in-depth analyses of anonymous reporting systems currently exist. The lack of available data inhibits standardization of how tips are cataloged, how credibility is determined, and how tips are managed and resolved across ARS and related systems. Other common challenges to ARS implementation include raising student awareness and community buy-in, insufficient information to act on a tip, and coordinating information among stakeholders in real time. These problems are compounded by the volume of tips received and by the nature of tip data, which are typically narrative in form. Examining variation in tip data – including tip content and the factors preceding tips – can inform processes for triaging and responding to tips.
Further, understanding what students report and about whom can uncover both potential bias and inefficiencies in reporting systems. Finally, although most reporting systems are accompanied by school training in the use of the system, no study has examined whether training exposure affects the number of tips submitted or how responders resolve tip concerns. Using natural language processing and mixed-effects regression modeling, our study will leverage a large, semi-structured data set of more than 17,000 tips submitted to a statewide ARS over five years to characterize the types of tips reported, identify the factors that influence student tip submissions and content, and determine whether exposure to training influences tip behavior and content. We will present our findings to school- and reporting-system stakeholders for guided translation to identify actionable strategies and best practices for improving and standardizing reporting systems.
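To illustrate the kind of automated tip characterization the study envisions, the sketch below shows a minimal keyword-based categorizer in Python. This is purely illustrative: the category names and keywords are hypothetical, and the actual study would derive its taxonomy and classifiers from the real tip corpus using trained NLP models rather than keyword matching.

```python
import re
from collections import Counter

# Hypothetical tip categories and indicator keywords (illustrative only;
# the study's taxonomy would come from the real ARS tip data).
CATEGORY_KEYWORDS = {
    "mental_health": {"suicide", "suicidal", "depressed", "self-harm", "hopeless"},
    "bullying": {"bully", "bullying", "harass", "harassment", "teasing"},
    "weapons": {"gun", "knife", "firearm", "weapon", "shoot"},
    "substance_use": {"vape", "vaping", "drugs", "alcohol", "drinking"},
}

def categorize_tip(text: str) -> list[str]:
    """Return every category whose keywords appear in a tip narrative."""
    tokens = set(re.findall(r"[a-z\-]+", text.lower()))
    return sorted(cat for cat, kws in CATEGORY_KEYWORDS.items() if tokens & kws)

def tally_categories(tips: list[str]) -> Counter:
    """Count category labels across a batch of tips (a tip may carry several)."""
    counts = Counter()
    for tip in tips:
        counts.update(categorize_tip(tip) or ["uncategorized"])
    return counts
```

For example, `categorize_tip("She is depressed and has a knife")` returns both `mental_health` and `weapons`, reflecting that a single narrative tip can touch multiple concern types — one reason tip volumes and narrative data complicate triage.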