How can AI be used to support workforce development in the harm reduction sector?
In the harm reduction sector, there are significant problems with staff training and retention. Those who work in it are overworked and often go without adequate training or resources, which in turn means that the people who need their help often go without it.
We believe that AI technologies can be used to speed up the training of staff in this sector, provide expertise around the clock, and, in turn, help those who use drugs.
A quick look at the contemporary harm reduction sector
The harm reduction sector encompasses a huge array of different services, including the NHS, charities, private sector clinics, companies such as our own, and services in the community, schools, universities and beyond. Depending on how narrowly you define it, the sector could refer exclusively to organisations, services and professionals whose main purpose is to reduce harm related to drug use, or it could extend to any person or service on whom this responsibility falls.
Suffice to say, however you define it, the harm reduction sector is huge. It encompasses many different areas of life, most of which provide some level of care to people who could in one way or another be considered vulnerable; be they unwell, young, homeless, or something else.
Despite the breadth of the sector, it remains underfunded and understaffed, and its staff tend to be poorly trained and to leave at very high rates. This is not to criticise those working within the sector, who tend to work extremely hard for little reward, but to highlight that this is a huge burden of care, one that affects a large part of the population yet remains inadequately provided for.
Given this, and given that change needs to happen rapidly, innovative solutions must be found that support both the staff working in the harm reduction sector and the people it serves.
What role could AI play in assisting with harm reduction?
We believe that artificial intelligence, in the form of large language models (LLMs), has the potential to genuinely improve this sector at the necessary rate. If designed with care and implemented properly, this technology could provide novel, effective and—importantly—affordable solutions for everyone, from professionals in mental health services to counsellors at schools and universities, and even for those who use drugs themselves.
Below, we’ll outline how we think AI can help.
How could this directly benefit harm reduction workers?
Those who work in the harm reduction sector, as with much of the health and social care sector, work under extreme pressure and with severely limited resources, all of which limits their capacity to do their jobs properly and, crucially, to deliver care in the way they'd like to. Not only do they lack the time and resources to care for their service users and for other people who use drugs in problematic ways, but they also have little time to engage with ongoing, in-depth training, leading sometimes to a gaping chasm between intent and ability.
AI products can help to solve this. By building specialist LLMs—highly specialised chatbots—it is possible to give every member of staff in any service access to a highly skilled professional 24/7. Tailored to the needs of each service, these AI products can answer any drug-related query, from simple questions to complex ones covering anything from drug interactions to how to communicate with people who are currently on drugs. What's more, these services are available around the clock, in many different languages, and their responses adapt to the audience they are speaking to, making them highly flexible.
Not only can they answer questions, but they can proactively ask questions that providers may not think to ask, and connect the dots even in high-pressure situations.
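To make the idea concrete, here is a minimal sketch of how a general LLM might be specialised for a particular service. Everything in it—the `ServiceProfile` structure, the `build_system_prompt` function, and the example service—is a hypothetical illustration of the approach, not an existing product or API:

```python
# A minimal sketch of tailoring a harm reduction assistant to one service.
# All names here (ServiceProfile, build_system_prompt, the example service)
# are invented for illustration.
from dataclasses import dataclass, field

@dataclass
class ServiceProfile:
    """Describes the deploying service so responses can be tailored."""
    name: str
    audience: str              # e.g. "frontline outreach workers"
    language: str = "English"
    local_services: list = field(default_factory=list)

def build_system_prompt(profile: ServiceProfile) -> str:
    """Assemble the system prompt that specialises a general-purpose LLM."""
    referrals = "; ".join(profile.local_services) or "none on file"
    return (
        f"You are a harm reduction assistant for {profile.name}. "
        f"Your audience is {profile.audience}; answer in {profile.language}. "
        "Give evidence-based advice on drug interactions, safer use, and "
        "communicating with people who are currently on drugs. "
        f"Local referral options: {referrals}. "
        "If a question is outside your scope or suggests a medical "
        "emergency, direct the user to emergency services."
    )

# Hypothetical deployment: one profile per service, reused for every query.
profile = ServiceProfile(
    name="Example Community Drug Service",
    audience="frontline support staff",
    local_services=["needle exchange, High St", "drop-in clinic, Mill Rd"],
)
prompt = build_system_prompt(profile)
```

The same prompt-building step could feed any LLM provider; the point is that the tailoring—audience, language, local referral options—lives in one small, auditable configuration per service.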
In different contexts, AI models can also provide affordable, high-quality training solutions. Taking on the role of people who use drugs, social services, police or other health and social care providers, AI can offer ongoing, ever-evolving training that simulates the real-life scenarios carers and staff might face. This means that service providers, rather than booking costly consultants and clearing adequate space for them, can integrate training into their staff's schedules as an ongoing practice, saving time and money while also improving the quality of training thanks to its continuous nature.
How could this directly benefit people who use drugs?
Clearly, by helping those who work in harm reduction, these tools would have huge benefits for those who use drugs and require support as a consequence.
But these tools can also be designed to be deployed directly to people who use drugs. For instance, people who use drugs and who are signed up with a particular service, or who are enrolled in a certain institution (say, a university), could have round-the-clock access to a custom-built, highly specialised chatbot able to immediately give meaningful and insightful harm reduction advice, and to point them towards local services that might be able to help.
In this way, these products could also remove some of the burden from many of the services that practise harm reduction by providing people who use drugs with a first port of call that, in some cases, may avert problems before they even arise.
What problems could using AI for harm reduction cause?
Of course, the introduction of AI into this sector is not free from potential problems. Though it is difficult to predict every issue, here are some likely drawbacks associated with this technology:
• Providers may become over-reliant on AI: less emphasis might be placed on ensuring that staff have a solid foundation of training, given that certain AI products let them call up information as and when necessary.
• Staff might not think for themselves: AI can work as a tool to assist those working in the harm reduction sector, but it will not replace the fluidity and empathy of the human mind. Staff should not defer blindly to AI-based software. What’s more, AI products will make mistakes, and staff must be capable of spotting these.
• Staff/users will need to learn to use AI: to get the most from these products, users must know how to use them well, and/or the products must be very easy to use.
How can we overcome these problems?
With appropriate care and thought, all of these problems can be overcome:
• Providers must understand that the AI is a complementary tool and not a replacement for expertise. It can absolutely fill gaps and provide training, but it cannot replace the need for skilled staff.
• Likewise, all users of AI products have to understand the nature of these products and still retain the ability to think for themselves. Any caring profession necessarily requires staff to be flexible and imaginative; it's imperative that this doesn't change because of AI.
• Users should be trained to use AI products effectively but, perhaps more importantly, AI products must be designed to be as user friendly as possible.
We believe that AI tools have the potential to radically change the harm reduction sector. What's more, we believe it's a sector that radically needs to change in order to support those it seeks to serve. AI can train staff in all manner of different areas, provide ad hoc advice on a range of subjects, and even be used by those who use drugs themselves.
In all these ways, AI can be a huge support to the sector and to those who need it, provided it is implemented with care.