AI: The future of pain assessment

Professor Jeff Hughes, chief scientific officer at PainChek, explains how artificial intelligence (AI) and smart automation have the potential to revolutionise pain assessment in patients with dementia who struggle to communicate.

The need for accurate pain assessment 

Dementia currently affects around 850,000 people in the UK. Around 70% of residents in the UK’s 18,000 care homes have some form of dementia; of these, an estimated 80% are in pain at any one time and 50% experience persistent pain.

It is a daily challenge for carers and healthcare professionals to assess pain in people living with dementia or other cognitive impairments. Pain frequently goes undetected or untreated, which can lead to unnecessary prescribing, behavioural and psychological problems, and a reduced quality of life for residents.

PainChek was developed as an effective solution to this problem. Its combination of automated facial-analysis technology and smart automation enables carers and healthcare professionals to identify pain when it is not obvious, quantify its severity, and monitor the impact of treatment, helping to optimise and evidence the overall quality of care.

Research and development

PainChek emerged from research carried out at Curtin University in Western Australia. Initially, we looked at automating the ‘Face of Pain Scale’, but our research led us to identify the merits of a multidimensional pain tool incorporating AI and smart automation. A 42-item observational pain assessment tool was conceptualised, based on the American Geriatrics Society’s list of the most commonly observed pain-related behaviours in people with cognitive impairment, and on the Facial Action Coding System (FACS).

Research funding was secured in 2012, and a prototype app was delivered in 2013. The app first underwent validation testing in communicative people with chronic pain, and then in residents of aged care facilities who had moderate to severe dementia and could not self-report their pain. Its performance was compared with the Abbey Pain Scale, a validated paper-based pain assessment tool. These initial studies demonstrated the validity and reliability of the tool, and the data was used to support regulatory clearance of PainChek as a Class 1 medical device in Australia and Europe, granted in 2017.

Behind the technology

PainChek uses facial analysis to detect facial micro-expressions (action units, or AUs) indicative of the presence of pain. Utilising the computational power and built-in cameras of smart devices, the app detects the presence of a face, maps the facial features, and applies a series of algorithms to identify pain-related AUs in real time from a short three-second video. This data is then combined with non-facial features observed by the app user and entered via a series of digital checklists; together these allow automatic calculation of a total pain score and the assignment of a pain intensity level.
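To make the scoring step concrete, the sketch below shows how automated AU detections and observer checklist entries might be combined into a total score and intensity band. It is illustrative only: the specific action units, checklist items and cut-off scores are assumptions for the example, not PainChek’s actual 42-item model or thresholds.

```python
from dataclasses import dataclass

# Hypothetical subset of pain-related action units flagged by the facial-analysis step.
PAIN_RELATED_AUS = {"AU4", "AU6", "AU7", "AU9", "AU10", "AU43"}


@dataclass
class Assessment:
    detected_aus: set[str]            # AUs detected automatically from the video
    checklist_items: dict[str, bool]  # non-facial indicators entered by the observer


def total_pain_score(a: Assessment) -> int:
    """Sum the automated facial indicators and the observer checklist items."""
    facial = sum(1 for au in a.detected_aus if au in PAIN_RELATED_AUS)
    observed = sum(1 for present in a.checklist_items.values() if present)
    return facial + observed


def intensity_band(score: int) -> str:
    """Map a total score to a pain-intensity label (hypothetical cut-offs)."""
    if score <= 6:
        return "no pain"
    if score <= 11:
        return "mild"
    if score <= 15:
        return "moderate"
    return "severe"


example = Assessment(
    detected_aus={"AU4", "AU7", "AU43"},
    checklist_items={"guarding": True, "moaning": True, "restlessness": False},
)
score = total_pain_score(example)
print(score, intensity_band(score))  # e.g. 5 -> "no pain" under these assumed cut-offs
```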

The system allows point-of-care assessment of pain using the app on either iOS or Android mobile devices, without the need for an internet connection, meaning it can be used across a broad range of clinical settings. Data synchronisation to a cloud-based repository takes place when the device is connected to the internet, providing data security and data sharing across multiple users linked to the same software licence.
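A minimal offline-first pattern of this kind is sketched below: assessments are stored locally at the point of care and pushed to a cloud repository only when connectivity returns. The endpoint URL, table layout and payload shape are placeholders for illustration, not PainChek’s actual API or storage design.

```python
import json
import sqlite3
import urllib.request

SYNC_URL = "https://example.invalid/api/assessments"  # placeholder endpoint


def init_store(path: str = "assessments.db") -> sqlite3.Connection:
    """Open (or create) a local queue of assessments awaiting upload."""
    conn = sqlite3.connect(path)
    conn.execute(
        "CREATE TABLE IF NOT EXISTS pending (id INTEGER PRIMARY KEY, payload TEXT)"
    )
    return conn


def record_assessment(conn: sqlite3.Connection, assessment: dict) -> None:
    """Save an assessment locally; no network connection is needed at this point."""
    conn.execute("INSERT INTO pending (payload) VALUES (?)", (json.dumps(assessment),))
    conn.commit()


def sync_pending(conn: sqlite3.Connection) -> None:
    """Push queued assessments to the cloud repository, removing each on success."""
    for row_id, payload in conn.execute("SELECT id, payload FROM pending").fetchall():
        req = urllib.request.Request(
            SYNC_URL,
            data=payload.encode(),
            headers={"Content-Type": "application/json"},
        )
        try:
            with urllib.request.urlopen(req, timeout=10):
                conn.execute("DELETE FROM pending WHERE id = ?", (row_id,))
                conn.commit()
        except OSError:
            break  # still offline or server unreachable; retry on the next sync
```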

Opportunities for healthcare

Tools that increase the accuracy of assessments and reduce the time needed to evaluate pain effectively are crucial. Technology that closes the gap in pain documentation equips care providers to plan and treat pain according to evidence-based pain management practices, which ultimately improves the quality of care.

The opportunities AI offers the global healthcare sector, from hospitals and primary care to aged care facilities and home care, are vast. AI can automate patient assessment and remove assessor bias. It can evaluate patient risk, such as the likelihood of developing a particular disease or suffering a particular adverse event; diagnose disease, for example by interpreting ECG results and X-ray images; select the optimal treatment based on a patient’s clinical history and the results of clinical trials; and monitor disease to detect early warning signs of deterioration.

The use of AI in healthcare will be driven by the availability of big data on which to train predictive algorithms. These algorithms assist (rather than replace) human decision-making, facilitate curiosity-based thinking, enable collaboration and remove mundane tasks, enhancing patient care as a result.
