
Given rising demand for mental healthcare and ongoing capacity challenges, many purpose-built AI solutions are being trialled by NHS Trusts and other providers. Examples of AI tools are listed in the POSTnote. To date, deployment has largely supplemented the delivery of therapy, for example by alleviating administrative burdens. However, there is some debate over whether more autonomous solutions could work for some service users. 

Research suggests that purpose-built AI solutions can be effective in reducing specific symptoms of some mental health conditions such as anxiety or depression, tracking relapse risks (such as for psychosis), and encouraging preventative behaviour change. However, contributors and systematic reviews emphasised that longer-term and larger-scale studies are needed to better identify what works for whom. They also highlighted the need for more evaluation of claimed cost and efficiency savings. 

An area of particular interest to many stakeholders is precision psychiatry. These techniques harness multiple data sources, such as brain imaging, DNA or blood samples, and passive data collection from mobile phones, among others. They aim to make diagnosis, treatment and the prediction of risks more precise, and large-scale trials are underway. Supporting implementation will require investment, strategy, upskilling, co-design and public trust building. 

There are also numerous ethical and regulatory considerations, with responses from government agencies underway. See more details in PN738. 

 

Key points

  • Demand for mental healthcare is rising, and NHS capacity to meet it is insufficient. Digitalisation, including the deployment of AI tools, is proposed as part of strategies to address capacity issues and improve care. 

 

  • Opportunities offered by AI tools include supporting the delivery of therapy and supporting administrative or clinical decision-making tasks. There is debate over how autonomous such services should be, with arguments made on both sides. 

 

  • Many purpose-built AI tools are already being deployed or trialled by NHS Trusts. Notably, such tools are distinct from less regulated ‘wellbeing’ apps and from general-purpose AI tools (such as ChatGPT or Copilot). 

 

  • Another opportunity area is ‘precision psychiatry’. Innovations in this area aim to harness multiple data sources and AI’s data processing capabilities, with the aspiration of making diagnosis, treatment and prevention more personalised and precise. Large-scale trials of such techniques are underway, but they are at an earlier stage than some other AI tools. 

 

  • Key delivery considerations include: investment in infrastructure (including data infrastructure), creating strategies to ensure benefits are felt across all NHS Trusts, designing AI tools to ensure workflow integration, revisiting service design to harness AI benefits, evaluating cost effectiveness, and building public trust through engagement, education, and coproduction. 

 

Acknowledgements 

POSTnotes are based on literature reviews and interviews with a range of stakeholders and are externally peer reviewed. POST would like to thank interviewees and peer reviewers for kindly giving up their time during the preparation of this briefing, including:  

Members of the POST board*  

Aynsley Bernard, Kooth 
 
Dr Graham Blackman, University of Oxford 
 
Professor Adriane Chapman, The Governance in AI Research Group (GAIRG)* 
 
Claudia Corradi, The Nuffield Council on Bioethics 
 
Dr David Crepaz-Keay, the Mental Health Foundation* 
 
Fiona Dawson, Mayden 
 
Zoe Devereux, University of Birmingham* 
 
Dr Piers Gooding, La Trobe University 
 
Dr Caroline Green, University of Oxford 
 
Lara Groves, Ada Lovelace Institute 
 
Rachel Hastings-Caplan, Rethink Mental Illness and Mental Health UK* 
 
Dr Gareth Hopkin, Science Policy and Research Programme Team, National Institute for Health and Care Excellence (NICE)* 
 
Dr Becky Inkster, University of Cambridge 
 
Dr Grace Jacobs, King’s College London* 
 
Lauren Jerome, Queen Mary University of London 
 
Dr Caroline Jones, The Governance in AI Research Group (GAIRG)* 
 
Dr Indra Joshi, Trustee for Lift Schools 
 
Dr Andrey Kormilitzin, University of Oxford 
 
Associate Professor Akshi Kumar, Goldsmiths, University of London 
 
Professor Agata Lapedriza, Northeastern University; Universitat Oberta de Catalunya 
 
Dr Paris Alexandros Lalousis, King’s College London* 
 
Dr Sophia McCully, The Nuffield Council on Bioethics 
 
Dr Rafael Mestre, University of Southampton* 
 
Associate Professor Stuart Middleton, University of Southampton 
 
Dr Max Rollwage, Limbic* 
 
Dr Annika Marie Schoene, Northeastern University 
 
Julia Smakman, Ada Lovelace Institute* 
 
Alli Smith, Office for Life Sciences 
 
John Tench, Wysa 
 
Dr James Thornton, The Governance in AI Research Group (GAIRG)* 
 
Mona Stylianou, Everyturn Mental Health* 
 
Dr Pauline Whelan, CareLoop* 
 
Dr Gwydion Williams, Wellcome Trust* 
 
Dr James Woollard, Oxleas NHS Foundation Trust, NHS England* 
 
Andy Wright, Everyturn Mental Health 
 
Emeritus Professor Jeremy Wyatt, The Governance in AI Research Group (GAIRG)* 
 
Information Commissioner’s Office* 
 

*Denotes people and organisations who acted as external reviewers of the briefing; some also contributed as interviewees. Contributors are listed in alphabetical order by surname. 

