Bridging the chasm between technology and clinicians

A man studies a scan on a screen. PHOTO: Mart Production


While the use of artificial intelligence (AI) for medical diagnosis is growing, new research by the University of Adelaide has found there are still major hurdles to overcome before AI can be compared to a clinician.

In a paper published in , Australian Institute for Machine Learning PhD student Lana Tikhomirov, Professor Carolyn Semmler and their team from the University of Adelaide have drawn on external research to investigate what's known as the 'AI chasm'.

The AI chasm has arisen because the development and commercialisation of AI decision-making systems have outpaced our understanding of their value for clinicians and their impact on human decision-making.

"This can have consequences such as automation bias (being blind to AI errors) or misapplication," said Ms Tikhomirov.

"Misconceptions about AI also restrict our ability to maximise this new technology and properly augment human decision makers.

"Although technology implementation in other high-risk settings, such as increased automation in aeroplane cockpits, has previously been investigated to understand and improve how it is used, evaluating AI implementation for clinicians remains a neglected area.

"We should be using AI more like a clinical drug rather than a device."
Lana Tikhomirov, Australian Institute for Machine Learning PhD student

The research found clinicians are contextually motivated, mentally resourceful decision makers, whereas AI models make decisions without context and without understanding the correlations between their data and patients.

"The clinical environment is rich with sensory cues used to carry out diagnoses, even if they are unnoticeable to the novice observer," said Ms Tikhomirov.

"For example, nodule brightness on a mammogram could indicate the presence of a specific type of tumour, or specific symptoms listed on the imaging request form could affect how sensitive a radiologist will be to finding features.

"With experience, clinicians learn which cues guide their attention towards the most clinically relevant information in their environment.

"This ability to use domain-relevant information is known as cue utilisation. It is a hallmark of expertise that enables clinicians to rapidly extract the essential features of a clinical scene while remaining highly accurate, guiding subsequent processing and analysis of specific clinical features.

"An AI model cannot question its dataset in the same way clinicians are encouraged to question the validity of what they have been taught: a practice in the clinical setting called epistemic humility."

Tagged in featured story, artificial intelligence, diagnosis