The future of Artificial Intelligence and radiology


How would you feel if an algorithm could tell you that you had cancer based on your mammography exam or CT scan? It is highly likely that in the future, the creative work of radiologists will still be necessary to solve challenging problems and to oversee diagnostic procedures, while AI becomes part of their routine, diagnosing basic cases and assisting with repetitive tasks. So, instead of feeling threatened by AI, radiologists need to become familiar with how it could change their daily work for the better.

Radiologists who use AI will replace those who don’t

There is a lot of fear surrounding AI and its future impact on medicine, and many indicators suggest that AI will completely revolutionize healthcare. With the advancements in deep learning algorithms and narrow AI, there is a buzz around medical imaging in particular, something that has set many radiologists into a panic. Curtis Langlotz, Professor of Radiology at Stanford, recently presented at the GPU Technology Conference in San Jose. He mentioned that one of his students had sent an email saying they were considering going into radiology but weren’t sure it was a viable career any longer. This worry is misplaced: radiology isn’t a dying profession. In fact, it’s far from it.

There is a lot of hype suggesting that deep learning, machine learning, and AI in general are going to replace radiologists, and that reading images will be handed over entirely to machines. However, it’s simply not true. Consider the comparison of a plane’s autopilot. That innovation certainly didn’t replace pilots; it assisted with their tasks. When a plane is flying a very long route, it’s great to be able to switch on the autopilot, but autopilot is not very useful when rapid judgment is required. The combination of technology and human expertise is a winning one, and it will be the same in healthcare.

While it may be true that AI will not replace radiologists, it must be said that radiologists who use AI will certainly replace those who don’t. Here’s why:

What do X-ray lamps, cat intestines, and the history of medical imaging have in common? The clinical radiology field began with the discovery of the X-ray in 1895 by the German physicist Wilhelm Conrad Röntgen. Within two months of the discovery, X-ray mania had taken over the world. With headlines such as “new light seeing through flesh to bones” and “soon every house will have a cathode-ray machine,” it really was considered a revolution. Does that remind you of other hyped technologies?

Thomas Edison was so excited by the discovery that he tried to create a commercial “X-ray lamp” – sadly, his efforts failed, as did his attempts to get an X-ray of the human brain. Story-driven journalists reportedly waited outside his lab for weeks for the latest innovation, and some created fake images of the human brain – one of which was actually a pan of cat intestines radiographed by H. A. Falk in 1896.

Even though some of these early projects turned out to be impossible, the X-ray soon found its groove in medicine. It is expected that the same will happen with AI and medicine soon – hopefully with no cat intestines this time!

Radiology has driven technological development since it was first introduced. In The Knick, a TV series that depicts the first era of modern healthcare, an inventor contacts the hospital manager to present his new innovation: the X-ray machine. It took around one hour for that machine to take a picture. Nowadays, if you go to the hospital for a check-up on your lungs, the X-ray procedure takes only a couple of minutes, and the results only a few minutes more.

A lot has changed since the very first experiments with the “X-ray lamp”; one thing, however, has remained constant – rapid technological advancement in radiology.

Larger range of tools and better precision

Around half a century after the X-ray was discovered, another innovation joined the medical imaging field: ultrasound. Commercially available systems allowed for wider dissemination from the mid-sixties onwards. With advances in piezoelectric materials and electronics, images improved from bistable to grayscale, and from still pictures to real-time moving images. Over roughly another half-century, ultrasound machines shrank from room-sized behemoths to portable devices. With Clarius Mobile Health introducing the first pocket-sized handheld ultrasound scanner complete with a smartphone app, physicians can carry it around to perform fast exams and to guide quick procedures like targeted injections and nerve blocks.

Let’s talk about body scanners. In 1971, the very first CT scanner was developed: it had a single detector for brain studies and was created under the leadership of Godfrey Hounsfield, an electrical engineer at EMI (Electric and Musical Industries, Ltd). In the 1970s, Raymond Damadian built the very first MRI scanner by hand, with the help of students at New York’s Downstate Medical Center. The first MRI scan of the human body was completed in 1977, and of a human body with cancer in 1978. By the early 2000s, advanced medical imaging was routine in many centers, including fetal imaging, body MRI, cardiac MRI, and functional MR imaging.

With precision comes automation

With the expanding toolset of radiology comes an increase in precision too. While precision remains the main focus, there is also a shift towards automation, which aims to make radiologists’ lives easier. Since radiologists have to look through many images each day, it is inevitable that this part of their job will be automated. Algorithms can be trained to detect various types of abnormalities in the images, so why not let them do this job so that radiologists can spend their time on the harder problems at hand?

With deep learning, algorithms are able to improve themselves while radiologists oversee their effectiveness. The longer such a system is used, the more effective it becomes – an opportunity too good to miss. Radiology could fast become one of the most creative specialties, where problem-solving and a holistic approach are key.

With all this said, it definitely doesn’t mean that AI will take over all of a radiologist’s tasks. There will always be common findings and diagnoses on medical images that AI can help with; however, there are also very uncommon problems that we simply cannot miss, and those could be hard for deep learning to identify. So what stage is this technology at right now?

Is it possible for AI to predict when you might die?

Scientists at the University of Adelaide have carried out experiments in which AI systems predict when a patient might die. Deep learning algorithms analyzed the CT scans of 48 patients to predict whether they might die within the next five years; so far, the study has been 69% accurate. This is similar to the results of human diagnosticians, which is an impressive achievement. The deep learning system was trained on a series of 16,000 images to detect signs of disease in the organs. The aim of the research is to measure overall health, rather than to identify a single disease.

This is just the tip of the iceberg, however, as a lot of research is being carried out to teach algorithms about different diseases and how to detect them. An IBM algorithm called Medical Sieve has been able to assist in clinical decision-making in cardiology and radiology; the system can look at radiology images and detect problems faster and more reliably. Watson, another IBM AI analytics platform, is also used in radiology. Following the purchase of Merge Healthcare in 2015, Watson gained access to millions of radiology studies and a large body of existing medical data, which enabled excellent training of its AI functionality and therefore better readings of imaging exams.

Other giants such as Agfa, Siemens, and Philips are already integrating AI into their medical imaging software systems. GE is working on predictive analytics software that utilizes AI; it helps imaging departments when staff fail to show up for work due to sickness, or when the volume of patients increases. The tech company Vital has similar predictive analytics software for imaging in progress. There are also many start-up companies, smaller and larger, harnessing the power of AI for radiology.

All of this research doesn’t necessarily mean that we are currently ready to have patients face their life expectancy based on their medical images, however.

What are the challenges in introducing AI to the radiology department?

To get some idea of when machine learning may be introduced on a wider scale, we first need to see how machine learning currently works in radiology. The process normally looks like this: the algorithm is fed many labeled images and data points that allow it to learn to detect differences in tissue, much like computers learn to recognize images of cats and dogs. If the algorithm makes an error, the researchers spot it and adjust the model. It is therefore a lengthy process, and tons of data are needed. The end result is expected to look like this: radiologists will conduct the high-level exam, and the algorithm will create a minable, structured, preliminary report. The algorithm will therefore do the quantification that most humans don’t enjoy doing, and it will do it very well.
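The loop described above – feed the algorithm labeled images, measure its error, adjust the model – can be sketched in a few lines. The sketch below is a toy illustration only, assuming synthetic 8×8 “scans” with a bright patch standing in for a lesion and a simple logistic-regression detector in place of a real deep learning system; every name and parameter here is invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_image(has_lesion):
    """Synthetic 8x8 'scan': background noise, plus a bright patch if lesioned."""
    img = rng.normal(0.2, 0.05, (8, 8))
    if has_lesion:
        r, c = rng.integers(0, 6, 2)
        img[r:r + 3, c:c + 3] += 0.5   # the 'abnormality' to detect
    return img.ravel()

# Labeled training set: the algorithm is "fed" many example images.
X = np.array([make_image(i % 2) for i in range(400)])
y = np.array([i % 2 for i in range(400)])

# A logistic-regression "detector" trained by gradient descent;
# each pass measures the prediction error and corrects the weights.
w, b = np.zeros(64), 0.0
for _ in range(500):
    p = 1 / (1 + np.exp(-(X @ w + b)))   # predicted lesion probability
    err = p - y                           # the error signal drives the update
    w -= 0.1 * X.T @ err / len(y)
    b -= 0.1 * err.mean()

acc = ((p > 0.5) == y).mean()
print(f"training accuracy: {acc:.2f}")
```

A real system differs mainly in scale: millions of expert-annotated studies instead of 400 synthetic patches, and a deep network instead of a single linear layer – which is exactly why the data-gathering and annotation steps take so long.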

Other experts in precision medicine believe that building these analytical platforms involves many challenges: acquiring and inputting the data, ensuring the data is annotated effectively, choosing a storage strategy, handling regulation, policy, and governance throughout the process, and deciding which types of analysis the platform will enable. The largest challenge is data annotation – allowing various views of the data and enabling its discovery across the many connected data sets in the platform.

Furthermore, hospitals still need convincing that AI algorithms actually work. Experts suggest that there will be a process that takes advantage of external and internal “crowdsourcing” of appropriately anonymized data.

For instance, a user could have established data science algorithms based on anonymized datasets from their hospital network. A new hospital could then use the algorithm and further refine it on anonymized local datasets to customize it for their needs. Once hospitals see a “win” scenario, they may be encouraged to contribute further datasets to the shared solution. It’s a bit like easing into cool water on a hot summer day: first you see other people doing it, then you see that it’s safe, and so you get involved too, perhaps dipping your toes before fully committing.

When will we get to have AI analyzing our CT scans?

Every day, we move closer to clinical use. The 2017 Data Science Bowl aimed at detecting lung cancer by running smart algorithms on more than 1,000 anonymized lung scans provided by the US National Cancer Institute. More than 18,000 unique algorithms were created during the challenge. The main goal was to find a path for delivering these algorithms into systems used in clinical care, so that stakeholders like the FDA and the American College of Radiology can connect them to imaging system vendors and the radiologists who would use them.

In 2017, the FDA approved the first cloud-based deep learning algorithm, developed by Arterys for cardiac imaging. So, as we can see, we are slowly getting there. Experts suggest that within the next three years, we should see many machine learning algorithms used in clinical pilot schemes as well as in approved use. Within the same time frame, low-dose CT lung cancer deep learning algorithms may also join the radiologist’s toolkit, able to assess an individual’s risk of lung cancer.

There are, however, no concrete estimates, and it is likely to be a step-by-step process in which some sub-fields develop more quickly than others. Mammography, for example, is likely to see AI adoption sooner than CT scanning. There is real potential for a faster path that could deliver AI-generated preliminary reports within the next 10 years; in some fields, this is a distinct possibility.

The future of radiology is with AI

At the end of the day, expert opinion and research trends show just how much AI will change radiology in the future. So, rather than feeling threatened by it or neglecting it, the medical world should welcome it with open arms.

Rather than feeling pushed out by machine intelligence, radiologists should engage with it, learn it, and promote it. After all, it is something that will help patients. We expect huge changes in the radiology field in the coming years. It is a field that needs to stay at the forefront, and what matters most is taking care of patients. Let us all nurture that thought and make the future of radiology with AI a good one.

HUMANITAS GROUP

Humanitas is a highly specialized Hospital, Research and Teaching Center. Built around centers for the prevention and treatment of cancer, cardiovascular, neurological and orthopedic disease – together with an Ophthalmic Center and a Fertility Center – Humanitas also operates a highly specialized Emergency Department.