Artificial Intelligence Embedded in the Imaging Modality

In the third blog of her series on AI and the radiographer, Shamie Kumar explores the impact on the radiographer when AI is integrated within an imaging modality.

In previous BIR blog posts, I have explored how AI is integrated into PACS, with the AI outputs viewed on radiology systems, and whether non-reporting radiographers could learn from and benefit from AI. The question to explore in this blog is what happens when AI is integrated within the imaging modality itself, and how that may impact the radiographer.

AI embedded into a portable digital X-ray machine

Radiographic images are acquired on multiple modalities across different patient pathways. I will explore how AI embedded into a portable digital X-ray machine might change how the radiographer works and learns.

Every radiographer is trained to take X-rays on portable machines; it is a core skill, and the technique is adapted compared with dedicated static X-ray rooms. It is unique in the sense that patient positioning can vary depending on the environment and situation, whether on a ward or in A&E resus. Patients' conscious levels and mobility can vary; they are often supine and not always cooperative. There can also be situations where other healthcare professionals (HCPs) are in close proximity to the patient being imaged, because the image is acquired outside the main radiology department.

AI output

Some hospitals have adopted digital portable X-ray machines to provide an instant image: the radiographer can see the chest X-ray immediately after exposure and decide whether the image quality is optimal. As AI becomes integrated within the modality, in this instance a portable digital X-ray machine, the radiographer will also see the AI output and findings alongside the original X-ray. Not only does the radiographer see the AI output, but other HCPs who are present can also view it in that environment. As we all know, X-rays need to be reported by a radiologist or reporting radiographer, but clinicians often make clinical decisions before these inpatient portable X-ray reports are finalised and available on the hospital system, especially if quick intervention is required.

When AI integration is done in such a way that the radiographer need not log into PACS to view the AI output, and the output is shown on the modality as soon as the image is acquired, all radiographers can utilise AI to its full potential. The focus quickly shifts to whether the radiographer has the relevant education and training to understand the AI's intended use and outputs, its functions and features, how to clinically interpret these images, how the AI works and what its limitations are. All these questions become important when AI is implemented; radiographers need to be trained in how to use it, become familiar with the outputs, and educate others around them. If this is approached robustly, it will empower radiographers to learn and upskill themselves with AI as part of their daily clinical workflow, giving them the confidence to support and guide other HCPs who are also looking at the X-ray when it is acquired.

AI is an assistive tool

It is important to recognise that AI findings are never the final diagnosis. Ultimately, AI is an assistive tool embedded within portable machines. Doctors and other HCPs will also view the AI output and, with time, it will be the role of the radiographer to appropriately manage and guide other healthcare professionals.

About Shamie Kumar

Shamie Kumar is a practising HCPC-registered Diagnostic Radiographer. She graduated from City University London with a BSc (Hons) in Diagnostic Radiography in 2009 and is a member of the Society of Radiographers, with over 12 years of clinical knowledge and skills across all aspects of radiography.

She has studied further in leadership, management and counselling, and has a keen interest in artificial intelligence in radiology.
