Having the X-factor – how human factors engineering can save lives

Neil Ballinger, head of EMEA at manufacturing parts supplier EU Automation, explains the role of human factors engineering in optimising medical devices.

How many times have you tried to plug in a USB connector the wrong way round? Since both sides usually look similar on the outside, chances are you have tried to force one in upside down at least once. The consequences of that mistake are usually negligible, but in healthcare, poor design can have serious repercussions.

We all know how to plug in a flash drive, so why do we still slip up? Usually, it’s because we are concentrating on something more important. Now imagine the same lapse in a busy emergency unit, where healthcare providers are carrying out life-saving procedures, short on time and working under pressure. One of the paramedics is trying to resuscitate a patient who has just gone into cardiac arrest. He charges the defibrillator and presses “on”, but instead of delivering a shock, the device switches off, costing the team precious time. What went wrong? And how could the same issue be prevented in the future? Human factors engineering can help answer these questions.

To err is human

When the heart stops beating, every minute that passes reduces the chance of resuscitating the patient by 10%. In the example above, the paramedic should have pressed “shock” rather than “on”, because in several defibrillator models the on and off functions share the same button, so pressing it while the device is running powers it down. When a defibrillator is accidentally switched off, it takes an average of two to three minutes to restart, meaning the patient’s chance of being resuscitated drops by a further 20-30%.

Blaming the paramedic does not prevent other healthcare professionals from making the same mistake in the future. Chances are that improving the defibrillator’s design is a much more effective step.

Human factors engineering (HFE) is the discipline that tries to identify and address problems with equipment and systems design. Its goal is to optimise the interaction of workers with their technical tools and work environment to improve safety, effectiveness and ease of use. 

HFE considers how products are used in a real work environment, by fallible humans. It uses the principles of cognitive behaviour to assess how workers will respond to a particular scenario, considering variables such as physical demands, mental workload and team dynamics. It also studies how environmental conditions such as PPE, noise, inadequate lighting or distractions can impact a worker’s ability to perform a task.

The underlying principle of HFE is that we cannot redesign humans: slips and lapses are common even in the most scrupulous, best-trained worker. We can, however, redesign processes and tools to prevent mistakes or, when they do occur, make corrective action easier.

To achieve this, HFE uses a combination of equipment usability tests, standardisation policies, checklists and environment redesign.

No more blame games

Compared with other high-risk industries, the application of HFE principles in healthcare is quite recent. One of the first publications to tackle the question of equipment and systems design in the sector was the 2000 IOM report To Err Is Human: Building a Safer Health System.

The report was revolutionary in its approach to medical error: instead of blaming healthcare providers, it focussed on the inadequacy of tools and systems. The report also highlighted the need to create a culture of safety, where workers can report potential hazards without fearing criticism or retaliation. Yet 20 years later, the standard approach is still to try to avoid all human error, and when it inevitably occurs, the first reaction is to blame the provider.

Compare this with the aviation sector, where the principles of HFE started to be applied as early as the 1970s. Most major airlines have non-reprisal and non-retaliation policies, meaning that pilots and air traffic controllers can safely report when they are about to make a mistake, without having to fear for their jobs and licences. This has allowed the industry to create systems that prevent or mitigate human error. The result? Today there are more than 30,000 flights a day, but plane crashes are rare.

Expectations vs reality

One of the biggest problems in medical device manufacturing is that there is often a huge gap between how design engineers think a device will be used and how it is actually used in the field.

A 2007 usability study of two common models of defibrillator found that when the scenario required two consecutive synchronised cardioversions, 50% of users had the device in the wrong mode for the second shock, even though all of them were familiar with the model they were using. This happened because the device changed mode during the first shock without notifying the user.

Researchers notified both manufacturers, and one of them replied that the appropriate corrective action for that case was explained in the device labelling. But as anyone who has ever pulled a door that needed to be pushed knows, labels are not always effective. When every second counts, paramedics don’t have time to calmly read a device label.

The manufacturer’s response signals a huge gap between how work is imagined and how it is actually performed. This issue is by no means exclusive to defibrillators; the same can happen with a variety of medical devices. For example, transcribing the wrong values from a glucometer into the electronic medical record is still a very common mistake in intensive care units. Systems designers did not consider that nurses have to transcribe these values manually, multiple times per shift, which inevitably leads to errors. The problem could easily be fixed by implementing connectivity between the glucometer and the central electronic medical record.
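
To make the idea concrete, the sketch below shows roughly what such connectivity could look like in practice: a reading captured by the glucometer is packaged and sent straight to the electronic medical record, so the nurse never has to retype it. It is only an illustration under assumed details — the endpoint URL, patient identifier and LOINC code are placeholders — and a real integration would follow whatever interface standard (for example HL7 FHIR) the hospital’s record system actually exposes.

```python
import datetime

import requests  # third-party HTTP library

# Hypothetical FHIR endpoint of the hospital's electronic medical record.
EMR_BASE_URL = "https://emr.example-hospital.org/fhir"


def push_glucose_reading(patient_id: str, value_mg_dl: float) -> None:
    """Send one glucometer reading to the EMR as a FHIR Observation,
    removing the need for manual transcription."""
    observation = {
        "resourceType": "Observation",
        "status": "final",
        "code": {
            "coding": [{
                "system": "http://loinc.org",
                "code": "2339-0",  # illustrative LOINC code for blood glucose
                "display": "Glucose [Mass/volume] in Blood",
            }]
        },
        "subject": {"reference": f"Patient/{patient_id}"},
        "effectiveDateTime": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        "valueQuantity": {"value": value_mg_dl, "unit": "mg/dL"},
    }
    response = requests.post(
        f"{EMR_BASE_URL}/Observation",
        json=observation,
        headers={"Content-Type": "application/fhir+json"},
        timeout=5,
    )
    # Surface transmission failures instead of losing the reading silently.
    response.raise_for_status()


# Example: the device (or its docking station) calls this after each measurement.
push_glucose_reading(patient_id="12345", value_mg_dl=110.0)
```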

The good news is that systems fixes are sometimes easier to implement than you might think. For example, older devices that lack connectivity can be retrofitted with smart sensors so they can communicate with the central system. When technology is designed to be used by real humans, in real work environments, it becomes an ally rather than a source of potential hazards.
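
As a rough illustration of such a retrofit, the sketch below assumes a legacy glucometer that simply prints each reading on a serial port, with a small gateway script reading that output and forwarding it to the central system. The serial port name, message format and endpoint are all hypothetical, not taken from any particular device.

```python
import serial    # pyserial, for reading the legacy device's serial output
import requests  # third-party HTTP library

# Hypothetical connection details for the retrofit gateway.
SERIAL_PORT = "/dev/ttyUSB0"
CENTRAL_SYSTEM_URL = "https://emr.example-hospital.org/api/device-readings"


def run_gateway() -> None:
    """Read lines like 'PATIENT=12345;GLUCOSE_MG_DL=110' from the legacy device
    and forward each reading to the central system."""
    with serial.Serial(SERIAL_PORT, baudrate=9600, timeout=10) as port:
        while True:
            line = port.readline().decode("ascii", errors="ignore").strip()
            if not line:
                continue  # timeout with no reading; keep listening
            fields = dict(item.split("=", 1) for item in line.split(";"))
            payload = {
                "patient_id": fields.get("PATIENT"),
                "glucose_mg_dl": float(fields.get("GLUCOSE_MG_DL", "nan")),
            }
            # Forward the reading; raise on failure so it is never dropped silently.
            requests.post(CENTRAL_SYSTEM_URL, json=payload, timeout=5).raise_for_status()


if __name__ == "__main__":
    run_gateway()
```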

Though we’re still using trusty old USB devices, newer connectors, such as the reversible ones used to charge most mobile phones, have been designed with human factors engineering in mind so that they work either way up. We’re starting to see the same approach in medtech.
