How The Trevor Project is using AI to support at-risk LGBTQ+ youth

By Derek du Preez, October 25, 2021
Summary:
The Trevor Project is the world's largest suicide prevention and crisis intervention organization for LGBTQ+ young people. The nonprofit is using AI to train counsellors and identify high-risk individuals.

An image of a young LGBTQ+ person
(Image by Sharon McCutcheon from Pixabay )

LGBTQ+ people, particularly young LGBTQ+ people, are one of the world's most marginalized and vulnerable populations. And as a group, young members of the LGBTQ+ community are at a higher risk of suicide compared to their heterosexual, cisgender peers. In fact, LGBTQ youth are four times more likely to attempt suicide, and it is estimated that 1.8 million LGBTQ+ youth in the US seriously consider taking their own lives each year.

Reaching these young people to offer them support in times of crisis is not easy, but it's a challenge that has been tackled by nonprofit organization The Trevor Project since it was founded in 1998. The Trevor Project is the world's largest suicide prevention and crisis intervention organization for lesbian, gay, bisexual, transgender, queer and questioning young people.

And whilst The Trevor Project has been praised for over two decades of community support work, the challenge of scaling to reach all young LGBTQ+ people in need persists. Over the past year or so, the organization has been exploring how it can adopt AI to tackle this issue and scale its services further.

But the teams at Trevor are all too aware that when it comes to applying AI to serve marginalized populations, the stakes are incredibly high. As such, it believes that it is taking a unique human-centric approach to applying AI to further its life-saving work.

John Callery, Vice President of Technology at The Trevor Project, was speaking at Twilio's recent Signal event, where he explained how the organization is using Twilio's APIs and messaging services, alongside building AI models, to help prioritize high-risk individuals. Callery said:

When it comes to applying AI to serve marginalized populations, such as LGBTQ youth, the stakes are incredibly high. Unlike many organizations that look to replace human input with AI, Trevor seeks to optimize its services using AI as an enhancement to the human touch. 

We recently implemented new AI and natural language processing models to scale our counsellor training programme and to connect youth at highest risk with counsellors as quickly as possible, with the ultimate goal of reaching more LGBTQ young people.

One of the biggest challenges has been scaling to reach all the LGBTQ youth who need our support from the tech side of things. We recognized an opportunity to use AI to help Trevor expand its reach, without sacrificing our quality of care, starting with a natural language processing model that connects youth at the highest risk of suicide with counsellors as quickly as possible.

Using AI to prioritize those most at risk

One of the first problems that The Trevor Project sought to address with AI was connecting those at the highest risk of suicide with counsellors as quickly as possible. To achieve this, the tech team at Trevor built a bifurcation model that spans Trevor Text (a confidential text messaging service) and Trevor Chat (a confidential browser-based chat).

Before every conversation via either Trevor Text or Trevor Chat, the young LGBTQ+ user is given the opportunity to answer a series of structured questions, as well as one open-ended question: "What's going on?" Young LGBTQ+ people reach out to Trevor for a multitude of reasons - anything from bullying, to grief, to coming out - and these answers help counsellors to understand what issues the person in question may be facing.

Dan Fichter, Head of AI & Engineering at The Trevor Project, explained how this information is being used to analyze those users most at risk. He said: 

Our machine learning team's goal was to use this information to predict a risk level, and then connect youth who are likely at highest risk to the next available counsellor, using natural language processing. The combination of open-ended and structured responses is not only incredibly helpful for Trevor counsellors to better understand the youth reaching out - counsellors can read these responses right before starting a conversation - but it also turned out to be sufficient to power a highly accurate queue bifurcation model.
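Trevor's actual bifurcation model is a fine-tuned neural NLP classifier, as described later in the piece. Purely to illustrate the routing idea, a toy sketch might score each incoming contact from its structured pre-chat answers and free-text message, and split the waiting queue in two - the keyword list, field names and threshold below are all invented for illustration:

```python
from collections import deque

# Illustrative only - the real system uses a trained language model, not keywords
HIGH_RISK_KEYWORDS = {"suicide", "hurt", "hopeless", "pills"}

def risk_score(structured_answers, free_text):
    """Toy heuristic standing in for Trevor's NLP risk model.

    Combines structured pre-chat answers with the open-ended
    'What's going on?' response, as the article describes."""
    score = sum(structured_answers.values())  # e.g. 0/1 flags from pre-chat questions
    words = set(free_text.lower().split())
    score += 2 * len(words & HIGH_RISK_KEYWORDS)
    return score

def route(contact, high_risk_queue, standard_queue, threshold=3):
    """Bifurcate the waiting queue: highest-risk contacts go to the
    priority queue so the next available counsellor sees them first."""
    if risk_score(contact["answers"], contact["message"]) >= threshold:
        high_risk_queue.append(contact)
    else:
        standard_queue.append(contact)

high, standard = deque(), deque()
route({"answers": {"felt_hopeless": 1}, "message": "i feel hopeless and alone"}, high, standard)
route({"answers": {"felt_hopeless": 0}, "message": "nervous about coming out"}, high, standard)
```

The first contact scores above the threshold and lands in the priority queue; the second waits in the standard queue.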

Trevor Text runs on Twilio Programmable Messaging and it switched its SMS line over to the system in Spring 2020. The move itself has helped The Trevor Project to trim about 10 seconds off each message exchange, which has meant an overall latency improvement of approximately 10 minutes per conversation. As you can imagine, when supporting highly vulnerable individuals, every second counts. 

But it's the bifurcation model that's proving to be incredibly useful to Trevor. The image below shows how the bifurcation model is being used, where the youth highlighted in magenta are those identified as the highest risk.

An image of Trevor Project’s bifurcation AI model
(Image sourced via Trevor Project)

Fichter explained: 

We ultimately fine-tuned an ALBERT model and achieved excellent precision and recall in making that prediction. Before deploying our ALBERT-based model, the best that we could do was to make a prediction of suicide risk by using young folks' answers to our structured, pre-chat questions. But our goal with this project was to produce a more accurate prediction by combining those responses with young folks' free-form responses to "What's going on?"

There's no crisis too big or too small to reach out to a service like Trevor about. And often folks who are not feeling intensely suicidal in the moment they reach out to Trevor have felt suicidal before, and they're trying to work through something intense that they know could lead them back into crisis if they don't talk about it. 

Recognizing and understanding how someone's emotions could escalate if they're left without support can help us frame our conversations and get folks who are at the highest risk of suicide to the front of our queue.

Again, every second counts when we're serving youth in crisis. At Trevor it's also vitally important that our model be equally accurate for all the youth who might reach out to us. For some organizations AI model fairness is an after-the-fact consideration, but for Trevor, equity is absolutely core to what we do.
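Fichter's mention of precision and recall is worth unpacking: for a binary "high risk" label, precision is the share of contacts the model flagged that truly were high risk, and recall is the share of truly high-risk contacts the model caught. Recall matters especially here, since a missed high-risk youth waits in the standard queue. A minimal sketch, with invented labels:

```python
def precision_recall(y_true, y_pred):
    """Precision and recall for a binary 'high risk' classifier.

    tp: flagged and truly high risk; fp: flagged but not high risk;
    fn: high risk but missed by the model."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    return precision, recall

# Hypothetical labels for eight contacts (1 = high risk)
y_true = [1, 1, 1, 0, 0, 0, 1, 0]
y_pred = [1, 1, 0, 0, 1, 0, 1, 0]
p, r = precision_recall(y_true, y_pred)  # p = 0.75, r = 0.75
```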

Ensuring intersectionality exists in AI models

Intersectionality is inherent in the LGBTQ+ community - and it's something that Trevor has to be very conscious of when building out AI features to support individuals. Fichter said: 

We exist because the kind of support that young LGBTQ folks need is not always the same as the kind of support that young folks who aren't queer, or LGBTQ adults over 25, might need. Young LGBTQ folks of colour, young trans folks, and young trans folks of colour, are often especially underserved by school, healthcare, and other social systems that weren't designed with their needs in mind. 

With this in mind, The Trevor Project is also undertaking work to pick up on any biases that may unintentionally make their way into the work being carried out by teams. Callery explained: 

We recognize the complicated ways different identities can intersect and put someone at higher risk for suicide. In order to ensure our AI models serve all groups equally, we pulled in stakeholders from various teams and developed a framework that allows us to evaluate how our models are treating various groups over time. 

This helps us catch unintended biases in their tracks and re-evaluate our models on an ongoing, as-needed basis. This intersectional approach to AI was crucial for the bifurcation model, but it remains core to our AI and machine learning work and has been key to the success of our more recent projects beyond using AI to enhance our services.
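The Trevor Project has not published the details of its evaluation framework, but the kind of per-group check Callery describes - tracking how a model treats different groups over time - can be sketched as comparing a metric such as recall across demographic groups and flagging the gap (the group labels and data below are entirely hypothetical):

```python
from collections import defaultdict

def recall_by_group(records):
    """records: (group, y_true, y_pred) triples.

    Returns recall per group - the share of truly high-risk contacts
    the model caught within each group. Groups and data are invented
    placeholders, not Trevor's real categories."""
    tp, fn = defaultdict(int), defaultdict(int)
    for group, t, p in records:
        if t == 1 and p == 1:
            tp[group] += 1
        elif t == 1 and p == 0:
            fn[group] += 1
    return {g: tp[g] / (tp[g] + fn[g]) for g in set(tp) | set(fn)}

def fairness_gap(recalls):
    """Largest recall gap between any two groups; a large gap means
    the model serves some groups worse than others."""
    vals = list(recalls.values())
    return max(vals) - min(vals)

records = [
    ("group_a", 1, 1), ("group_a", 1, 1), ("group_a", 1, 0), ("group_a", 0, 0),
    ("group_b", 1, 1), ("group_b", 1, 0), ("group_b", 1, 0), ("group_b", 0, 1),
]
recalls = recall_by_group(records)  # group_a: 2/3, group_b: 1/3
gap = fairness_gap(recalls)         # 1/3 - a gap worth investigating
```

Running such a report on every model version over time is one way to make fairness an ongoing check rather than an after-the-fact audit.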

Growing Trevor's counsellor base

Whilst AI is being used to prioritize young individuals most at risk, The Trevor Project is also thinking about how AI can be used to train more volunteers for the organization - again, with the aim of reaching those hundreds of thousands of young people that need support. Callery said: 

We knew that in order to meet the growing demand of youth reaching out, we needed to figure out a way to grow our counsellor base from hundreds to thousands.

As part of The Trevor Project's training programme for new volunteers and staff counsellors, people have to participate in realistic roleplay conversations. Aspiring counsellors go through four roleplay conversations before going on a shift for the first time to support real young people. Each roleplay gives trainees a chance to interact with a different youth persona with a particular identity background and suicide risk level. Each roleplay can take upwards of an hour, meaning it can be time-consuming to get one counsellor to the point where they are ready to speak to a real individual.

As such, The Trevor Project is looking at how AI could be used to train volunteers at scale. Fichter explained: 

We asked ourselves, what if we could build a conversational simulator that can play the part of the youth, and can say all of the things that a young person would say in that scenario for trainees to train with? So we set out to build our Crisis Contact Simulator and our first simulated youth persona - Riley.

The simulator stages text-based conversations with a simulated LGBTQ young person in crisis, letting aspiring counsellors experience realistic practice conversations before conducting instructor-led roleplays and, eventually, monitored counsellor shifts under staff supervision. Just as new EMTs practice their skills on physical models of people before interacting with real patients, we wanted our trainees to be able to practice their new skills in active listening, safety planning and de-escalation in a safe, simulated environment before graduating training.

To our knowledge, there are very few organizations using AI for crisis services, and none using AI to simulate a person in crisis. The models we used here have not existed for very long, and we simply would not have been able to build as realistic a Riley even three years ago. And of course, evaluating numerous candidate base models was a major part of the undertaking for us. We set out to build a model that could get to the emotional core of what a young person wants to share with a counsellor, and can communicate in the ways that a young person really would.
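Riley itself is powered by a generative language model, and its internals are not public. The toy below only illustrates the shape of the trainee-to-simulator turn loop with an invented keyword-and-template persona - every trigger and canned response here is made up, and a real simulator would generate free text instead:

```python
import random

class ToyPersona:
    """Minimal stand-in for a crisis-contact simulator persona.

    The real Riley is driven by a generative language model; this
    keyword-and-template toy only shows the turn loop a trainee
    would go through. All triggers and responses are invented."""

    def __init__(self, seed=0):
        self.rng = random.Random(seed)
        self.responses = {
            "how are you": ["honestly not great today.", "idk. things are a lot right now."],
            "safe": ["i think so, for now.", "i'm at home, so yeah."],
        }
        self.fallback = ["it's hard to put into words.", "can i just vent for a bit?"]

    def reply(self, trainee_message):
        """Return the persona's next turn given the trainee's message."""
        text = trainee_message.lower()
        for trigger, options in self.responses.items():
            if trigger in text:
                return self.rng.choice(options)
        return self.rng.choice(self.fallback)

riley = ToyPersona()
first = riley.reply("Hi, I'm glad you reached out. How are you feeling?")
```

In the real system, instructors review the transcript of each practice conversation afterwards, so the simulator's replies need to stay plausibly in character for an hour-long exchange - the hard part the two hundred iterations went into.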

The Trevor Project went through over two hundred iterations before the simulator could faithfully portray Riley, a genderqueer young person in crisis, for a realistic hour-long conversation with counsellor trainees. However, the time invested has been worthwhile and The Trevor Project is now building further personas. Callery said:

The launch of Riley was incredibly successful for our team, and has already led to impactful results. This year alone, we expect to triple our number of digital volunteer crisis counsellors and eventually grow that pool by 10 times, significantly increasing the number of LGBTQ youth we can serve each year. 

The tool also enables counsellors to fit training into their own schedules. Nearly 70% of our digital crisis counsellors volunteer on nights and weekends, and now they can complete part of their training during that time too. Looking ahead, we're excited to continue developing additional personas and adding other AI models to our toolbox.