Review Article

Int J Mol Immuno Oncol. 2023;8(1):9-14
doi: 10.25259/IJMIO_32_2022

Artificial intelligence in radiation oncology: How far have we reached?

Department of Radiation Oncology, Rajiv Gandhi Cancer Institute and Research Centre, New Delhi, India
Corresponding author: Kundan Singh Chufal, Department of Radiation Oncology, Rajiv Gandhi Cancer Institute and Research Centre, New Delhi, India. kundan25@gmail.com
Licence
This is an open-access article distributed under the terms of the Creative Commons Attribution-Non Commercial-Share Alike 4.0 License, which allows others to remix, transform, and build upon the work non-commercially, as long as the author is credited and the new creations are licensed under the identical terms.

How to cite this article: Chufal KS, Ahmad I, Chowdhary RL. Artificial intelligence in radiation oncology: How far have we reached? Int J Mol Immuno Oncol 2023;8:9-14.

Abstract

Technological advances have revolutionized the field of radiation oncology (RO) as more and more departments are now equipped with modern linear accelerators and planning systems, resulting in the generation of a considerable amount of clinical, imaging, and dosimetric data. Artificial intelligence (AI) can utilize all these data points to create models which can expedite decision-making, treatment planning, and response assessment. However, various roadblocks impede the speed of development in this field. While data quality and security are the top priorities, legal and ethical issues are equally important. This scoping review provides an overview of the emerging possibilities resulting from an integration of modern RO workflow and AI-based technologies.

Keywords

Artificial intelligence
Radiomics
Deep learning
Machine learning
Radiotherapy planning

INTRODUCTION

Radiation oncology (RO) is a rapidly evolving branch of oncology, and up to half of all cancer patients will require radiotherapy (RT) at some point in the course of their disease.[1] With immense technological development, radiation planning and delivery have become precise and accurate, and, consequently, the processes involved have become complex. This complexity has added time-consuming workflows to the already existing problem of a shortage of adequately trained staff. In addition, predictive analyses of RT plans and delivered doses will soon be required to improve plan quality.

Artificial intelligence (AI) has shown promising applications in healthcare. In contrast to simple automation [Figure 1], it involves learning complex rules and patterns from historical data, which are then used to predict outcomes or simplify complex tasks. In this scoping review, we will discuss the application of AI to increase the efficiency, accuracy, and quality of the RT workflow, which may improve value-based cancer care delivery in resource-constrained settings.

Figure 1:
The difference between automation and artificial intelligence.

ARTIFICIAL INTELLIGENCE IN RADIATION ONCOLOGY

The RO cancer care continuum includes treatment decisions, planning, delivery, and follow-up. In addition, treatment planning comprises several sub-processes: target and normal tissue segmentation, inverse planning, dose optimization, decision support, quality assurance (QA), and outcome prediction. The efficiency of these procedures can be enhanced by integrating AI, and progress has been made in tasks like contouring and planning.

The broad application of AI in RO can be divided into two parts:

  1. Process-driven AI

    1. Decision tools

    2. Segmentation

    3. RT planning

    4. Dose optimization

    5. QA

    6. Treatment delivery

  2. Outcome-driven AI (Predictive Modeling)

    1. Prognostication

    2. Response assessment

    3. Toxicity prediction.

PROCESS-DRIVEN AI – DECISION TOOLS

The magnitude and rate of data accumulation have challenged our ability to analyze, interpret, and apply multiple data points simultaneously. For example, in a study[2] among specialized thoracic radiation oncologists (ROs) (experience ranging from 3 to 20 years), their predictions of treatment outcome for lung cancer patients were no better than a random guess (area under the curve [AUC] ranging from 0.52 to 0.59), while AI-based models performed significantly better (AUC of 0.61 to 0.77). Another example is the prediction of complications and emergency visits before starting radiation treatment, which improves the existing clinical workflow.[3] Synthesizing accumulating evidence concerning patients’ clinical/imaging data and objectively determining appropriate therapy is becoming more taxing. AI has the potential to speed up this process, presumably without introducing subjective biases. Finally, while every RO applies their judgment to personalize RT plans based on each patient’s unique clinical/imaging characteristics, AI can take this a step further by suggesting a specific treatment plan to achieve the optimal dose[4] based on predicted radiation sensitivity.[5]
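
To make the idea concrete, the following is a minimal sketch, not the cited study's method, of how a pretreatment outcome model could be built and scored with the AUC metric quoted above. The CSV file, feature names, and outcome column are hypothetical placeholders.

```python
# Minimal sketch of an outcome-prediction model evaluated with cross-validated AUC.
# The file name and column names are hypothetical placeholders, not from the cited study.
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import StratifiedKFold, cross_val_score

df = pd.read_csv("lung_cohort.csv")                                   # assumed file
features = ["age", "stage", "gtv_volume_cc", "mean_lung_dose_gy"]     # assumed columns
X, y = df[features], df["two_year_survival"]                          # assumed binary outcome

model = GradientBoostingClassifier(random_state=0)
cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
auc = cross_val_score(model, X, y, cv=cv, scoring="roc_auc")
print(f"Cross-validated AUC: {auc.mean():.2f} +/- {auc.std():.2f}")
```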

PROCESS-DRIVEN AI – SEGMENTATION

Delineation of the gross tumor volume (GTV) and the clinical target volume (CTV) covering regions at risk is a cornerstone of modern RO, yet it is the most time-consuming step. It is well known that inter-observer variation in target delineation affects treatment outcomes,[6] and while auto-contouring solutions provided by commercial vendors of treatment planning systems have attempted to accelerate this process,[7,8] their acceptance remains low. Although built on knowledge-based frameworks, auto-contouring algorithms remain variable in performance and frequently generate incorrect contours due to inherent limitations[9] (especially near soft tissues[10]), which then require manual correction. This defeats the purpose of “auto-contouring” as it exists today.

Deep learning techniques such as convolutional neural networks and adversarial networks hold promise in this field, as their performance approaches human levels for both tumor[11] and normal tissue segmentation.[12] Nevertheless, further research is required to generate high-quality prospective data with robust external validation and broader generalizability before these algorithms are widely adopted in the clinical workflow.
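
As an illustration of the underlying technique, here is a minimal PyTorch sketch of convolutional segmentation trained with a soft Dice loss. The tiny network, 2D slices, and random stand-in data are toy assumptions; clinical systems use far larger 3D U-Net-style architectures and curated datasets.

```python
# Toy sketch of CNN-based organ segmentation with a soft Dice loss (illustrative only).
import torch
import torch.nn as nn

class TinySegNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 1, 1),              # one logit per pixel
        )

    def forward(self, x):
        return self.net(x)

def soft_dice_loss(logits, target, eps=1e-6):
    """1 - Dice overlap between predicted probabilities and the binary mask."""
    probs = torch.sigmoid(logits)
    inter = (probs * target).sum()
    return 1 - (2 * inter + eps) / (probs.sum() + target.sum() + eps)

model = TinySegNet()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

# Stand-in for a loader of (CT slice, organ mask) pairs.
ct = torch.randn(4, 1, 128, 128)
mask = (torch.rand(4, 1, 128, 128) > 0.9).float()

for step in range(100):
    opt.zero_grad()
    loss = soft_dice_loss(model(ct), mask)
    loss.backward()
    opt.step()
```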

PROCESS-DRIVEN AI – RT PLANNING

RT planning involves patient positioning and immobilization before performing the simulation computed tomography (CT) scan. Depending on the disease site, this process can be very involved and requires coordination between ROs, physicists, and technologists. AI solutions can predict probable dose distributions from diagnostic images;[4] similarly, the optimal treatment position and immobilization can be predicted so that the whole simulation process is streamlined.

Patients requiring specialized treatment techniques such as Deep Inspiration Breath Hold (DIBH) for left-sided breast cancer usually undergo 3 days of assessment to determine eligibility for this technique. Deep learning-based algorithms applied to routine chest X-ray images[13] can identify patients’ eligibility for DIBH, helping improve resource utilization and patient care. In addition, generating synthetic CT scans from magnetic resonance imaging (MRI) using generative adversarial networks[14] can further streamline the simulation workflow, as a patient who has already undergone an RT planning MRI does not need a separate RT planning CT scan.
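
The paired MR-to-synthetic-CT idea can be sketched as a pix2pix-style adversarial setup, shown below under toy assumptions (tiny networks, random stand-in slices). This illustrates the general approach rather than the cited implementation.

```python
# Toy sketch of paired MRI-to-synthetic-CT training: the generator maps an MR slice to
# a CT slice and is trained with an adversarial term plus an L1 fidelity term.
import torch
import torch.nn as nn

G = nn.Sequential(nn.Conv2d(1, 32, 3, padding=1), nn.ReLU(),
                  nn.Conv2d(32, 1, 3, padding=1))              # MR -> synthetic CT
D = nn.Sequential(nn.Conv2d(2, 32, 3, stride=2, padding=1), nn.ReLU(),
                  nn.Conv2d(32, 1, 3, padding=1))              # (MR, CT) pair -> patch-wise real/fake logits

bce, l1 = nn.BCEWithLogitsLoss(), nn.L1Loss()
opt_g = torch.optim.Adam(G.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(D.parameters(), lr=2e-4)

mr, ct = torch.randn(4, 1, 64, 64), torch.randn(4, 1, 64, 64)  # stand-in paired slices

for step in range(200):
    # Discriminator: real (MR, CT) pairs vs. generated pairs.
    with torch.no_grad():
        fake_ct = G(mr)
    real_logits = D(torch.cat([mr, ct], 1))
    fake_logits = D(torch.cat([mr, fake_ct], 1))
    d_loss = bce(real_logits, torch.ones_like(real_logits)) + \
             bce(fake_logits, torch.zeros_like(fake_logits))
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()

    # Generator: fool the discriminator while staying close to the true CT.
    fake_ct = G(mr)
    fake_logits = D(torch.cat([mr, fake_ct], 1))
    g_loss = bce(fake_logits, torch.ones_like(fake_logits)) + 100 * l1(fake_ct, ct)
    opt_g.zero_grad(); g_loss.backward(); opt_g.step()
```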

Image co-registration (between simulation CT and MRI or PET-CT) plays a key role in determining the true extent of the tumor, as combining information from different imaging modalities overcomes the limitations of the simulation CT alone. However, commercially available registration methods lack generalizability, while deep learning-based approaches perform better, are more robust, and are generalizable across multiple imaging modalities.[15] These AI applications can enhance the modern RT simulation workflow.

PROCESS-DRIVEN AI – DOSE OPTIMIZATION

The generation of a high-quality deliverable RT plan is a multi-step process. Several studies[4] have shown that optimal dose distribution can be predicted (along with identifying machine parameters to achieve this dose distribution), and dose calculation can be accelerated using a knowledge-based approach.[16] Despite these advances, planning and dose optimization are not fully automated and frequently require human intervention, without which they can result in suboptimal dose distributions.[17]

Moreover, this approach is unsuitable for complex RT plans requiring both photons and electrons. Recently, AI-based methods have generated RT plans comparable or superior to those produced by humans.[18,19] The ideal solution would be to predict the best dose distribution and then generate a treatment plan that closely matches it, making the whole process fully automated.
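
A minimal sketch of the dose-prediction step is shown below, assuming a toy 2D setup: a small CNN maps a CT slice plus target and organ-at-risk masks to a voxel-wise dose map and is trained against clinically planned doses. Published models use 3D U-Net-like architectures; the data here are random placeholders.

```python
# Toy sketch of voxel-wise dose prediction trained with a mean-squared-error objective.
import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),   # input channels: CT, PTV mask, OAR mask
    nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(),
    nn.Conv2d(32, 1, 1),                         # predicted dose per voxel (Gy)
)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
mse = nn.MSELoss()

inputs = torch.randn(2, 3, 128, 128)             # stand-in (CT, PTV, OAR) stacks
planned_dose = torch.rand(2, 1, 128, 128) * 70   # stand-in clinical dose maps

for step in range(100):
    opt.zero_grad()
    loss = mse(model(inputs), planned_dose)
    loss.backward()
    opt.step()
```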

PROCESS-DRIVEN AI – QA

QA in RO comprises patient-specific QA, which aims to detect human errors in treatment plans and anomalies in planning software, and machine-related QA, which involves testing isolated components of the treatment machines. Both involve many repetitive, time-consuming tasks. Patient-specific QA passing rates can be predicted using an AI algorithm that flags the possible sources of error, avoiding the need to measure physical doses.[20] In addition, data acquired during the daily use of radiation machines can be used to predict future trends and potential errors, improving machine-related QA efficiency.[21]
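
A minimal sketch of this idea, assuming a hypothetical table of plan-complexity metrics and measured gamma passing rates: a regression model predicts the passing rate so that low-scoring plans can be flagged before physical measurement. The file and feature names are illustrative assumptions.

```python
# Sketch of patient-specific QA prediction from plan-complexity metrics (hypothetical data).
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split

df = pd.read_csv("imrt_qa_plans.csv")                                           # assumed file
features = ["mu_per_gy", "mean_leaf_gap_mm", "modulation_index", "n_segments"]  # assumed columns
X_train, X_test, y_train, y_test = train_test_split(
    df[features], df["gamma_pass_rate"], test_size=0.2, random_state=0)

model = RandomForestRegressor(n_estimators=300, random_state=0).fit(X_train, y_train)
pred = model.predict(X_test)
print("MAE (percentage points):", mean_absolute_error(y_test, pred))

# Flag plans predicted to fall below a tolerance threshold (e.g., 95%).
flagged = X_test[pred < 95.0]
```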

PROCESS-DRIVEN AI – TREATMENT DELIVERY

Patient scheduling for radiation treatment and on-treatment assessment can be made more efficient by using AI to identify the most important factors contributing to long waiting times.[22] Accurate treatment setup is one of the most crucial steps in the overall radiation workflow and depends heavily on integrated cone beam CT (CBCT) devices. Although CBCT has revolutionized radiation treatment delivery by facilitating image-guided radiation therapy, poor image quality is a significant issue affecting setup verification and treatment delivery time. AI has been used to generate higher-resolution versions of these images, making them easier to match with the simulation CT scan and thus shortening setup verification time.[23] In addition, tumors in moving organs such as the lung and liver require real-time tracking, and AI has shown great potential to track tumor motion accurately by predicting the anticipated trajectory of the tumor within milliseconds.[24]
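
A minimal sketch of latency-compensating motion prediction, using a synthetic breathing trace as stand-in data: a small network learns to predict the tumor position a fixed number of samples ahead from a short history of past positions. The window length, horizon, and trace are illustrative assumptions.

```python
# Toy sketch of respiratory tumor-motion prediction on a synthetic breathing trace.
import torch
import torch.nn as nn

history, horizon = 20, 5                         # 20 past samples -> position 5 samples ahead
t = torch.arange(0, 200, 0.1)
trace = torch.sin(2 * torch.pi * t / 4.0)        # synthetic ~4 s breathing cycle

# Build (history window, future position) training pairs.
X = torch.stack([trace[i:i + history] for i in range(len(trace) - history - horizon)])
y = trace[history + horizon - 1:-1].unsqueeze(1)

model = nn.Sequential(nn.Linear(history, 32), nn.ReLU(), nn.Linear(32, 1))
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
mse = nn.MSELoss()

for epoch in range(200):
    opt.zero_grad()
    loss = mse(model(X), y)
    loss.backward()
    opt.step()
```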

OUTCOME-DRIVEN AI – PROGNOSIS, RESPONSE, AND FOLLOW-UP

In RO, as in any other field of medicine, understanding the prognosis of a particular condition and predicting response are of utmost importance, as is the toxicity associated with the proposed treatment. Researchers have made enormous efforts to model relevant clinical factors to predict treatment outcomes in terms of response and toxicity. Many machine learning and deep learning techniques have recently demonstrated their potential for better prediction of overall survival, response, and toxicity.[25-29] These AI-based prediction models can provide precise point-of-care recommendations, thus enhancing clinical decision support.

Radiation planning dosimetric data can be integrated with orthogonal data such as genomics, medical imaging, and electronic medical records to build robust tumor control probability and normal tissue complication probability models.[2,30] Radiomics is a field of medical imaging analytics in which features are extracted from images based on complex interrelationships of pixels and voxels.[31] The amount of data extracted through the radiomics approach is enormous and requires AI-based techniques to make sense of it. Radiomics was initially reported for the prognostication of lung cancer patients who underwent definitive RT,[32] after which many studies were initiated to investigate other applications such as response prediction,[33] toxicity prediction,[34] characterizing lung nodules, and imaging genomics,[35,36] to name just a few. Although much research is being done in these fields, no model has yet been validated for routine clinical use.
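
A minimal sketch of a radiomics pipeline using the open-source pyradiomics package (feature extraction from image/mask pairs followed by a simple classifier) is shown below. The cohort manifest, file paths, and outcome label are hypothetical, and a real study would add feature selection and external validation.

```python
# Sketch of a radiomics workflow: handcrafted features extracted with pyradiomics,
# then fed to a classifier. File names and the outcome column are hypothetical.
import pandas as pd
from radiomics import featureextractor          # pip install pyradiomics
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

extractor = featureextractor.RadiomicsFeatureExtractor()   # default feature classes

rows = []
cohort = pd.read_csv("cohort_manifest.csv")                 # assumed: image/mask paths + outcome
for _, row in cohort.iterrows():
    features = extractor.execute(row["ct_path"], row["gtv_mask_path"])
    # Keep numeric radiomic features; drop diagnostic metadata entries.
    rows.append({k: v for k, v in features.items() if k.startswith("original_")})

X = pd.DataFrame(rows).astype(float)
y = cohort["responder"]                                     # assumed binary outcome
auc = cross_val_score(LogisticRegression(max_iter=1000), X, y,
                      cv=5, scoring="roc_auc").mean()
print("Cross-validated AUC:", round(auc, 2))
```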

ISSUES AND CHALLENGES

The radiomics approach mentioned above has been studied extensively but often fails or lacks external validation[37] because of a bias toward tumor volume.[38,39] In addition, deep learning techniques have been criticized for their “black-box” nature (despite their excellent accuracy), as we are not fully aware of the reasons behind their predictions. To tackle this, there has recently been a focus on explainable AI,[40] which, if widely implemented, will eventually drive adoption by AI skeptics.
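
One widely used post-hoc explainability approach is SHAP, which attributes each prediction to per-feature contributions. A minimal sketch on synthetic tabular data is shown below; the feature names and the toy model are illustrative assumptions, not a specific published method.

```python
# Sketch of post-hoc explanation of a tabular outcome model with SHAP values.
import numpy as np
import shap
from sklearn.ensemble import GradientBoostingClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 4))                          # stand-in clinical features
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)          # synthetic outcome
model = GradientBoostingClassifier().fit(X, y)

explainer = shap.TreeExplainer(model)                  # works for tree ensembles
shap_values = explainer.shap_values(X)                 # per-feature contribution per case
shap.summary_plot(shap_values, X,
                  feature_names=["age", "stage", "dose", "volume"])  # illustrative names
```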

Another significant challenge is the availability of high-quality datasets for AI model training and validation. Therefore, our primary emphasis must be on high-quality data collection and curation, as a lack of consistency in standardizing these medical data impedes progress in this field.[41-43]

Once medical data in institutional databases are standardized, the next challenge is to form multi-institutional collaborations across all possible geographic locations. At present, models are trained on locally available, limited-size datasets, producing models that perform well only on that local population and lack generalizability. This is a consequence of the legal and ethical issues surrounding the sharing of medical data between institutions, as the patient’s right to privacy and data protection is an essential prerequisite that must be fulfilled at any cost.[44,45]

An important step toward overcoming this hurdle is federated model training, in which medical data never leave the institution’s database. Instead, the model is trained locally at each institution, and the parameters learned by each local model are combined, thus incorporating all geographic heterogeneities and resulting in a generalized model that may work globally.[46] In addition, this can overcome racial biases in AI algorithms and make them ethically sound.[47]
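
A minimal sketch of federated averaging (FedAvg) under toy assumptions: each "institution" trains a copy of a shared model on its own random stand-in data, and only the learned parameters (never patient data) are averaged back into the global model.

```python
# Toy sketch of federated averaging: local training at each institution, then
# element-wise averaging of the learned parameters into a shared global model.
import copy
import torch
import torch.nn as nn

def local_update(global_model, local_data, epochs=1, lr=1e-3):
    """Train a copy of the global model on one institution's data."""
    model = copy.deepcopy(global_model)
    opt = torch.optim.SGD(model.parameters(), lr=lr)
    loss_fn = nn.BCEWithLogitsLoss()
    X, y = local_data
    for _ in range(epochs):
        opt.zero_grad()
        loss_fn(model(X), y).backward()
        opt.step()
    return model.state_dict()

def federated_average(state_dicts):
    """Element-wise average of parameter tensors from all institutions."""
    avg = copy.deepcopy(state_dicts[0])
    for key in avg:
        avg[key] = torch.stack([sd[key] for sd in state_dicts]).mean(dim=0)
    return avg

# Stand-in: three institutions, each with its own (features, label) dataset.
global_model = nn.Linear(10, 1)
institutions = [(torch.randn(50, 10), torch.rand(50, 1).round()) for _ in range(3)]

for communication_round in range(5):
    local_states = [local_update(global_model, data) for data in institutions]
    global_model.load_state_dict(federated_average(local_states))
```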

It is also anticipated that the dynamics of the patient-doctor relationship will change with the utilization of AI, and the focus will be on the patient-healthcare establishment relationship. However, we must also be forewarned that unethical AI practices pose a unique challenge,[48] and a strong technical and legal infrastructure needs to be created to protect patients and organizations.

CONCLUSION

AI and RO are a natural fit, and their combination can become a model of an integrated, AI-guided workflow. AI-based approaches can be applied to every aspect of the RT workflow continuum. As the modern RO department adopts more of these approaches, the efficient utilization of resources will allow all stakeholders (ROs, medical physicists, and radiation technologists) to spend less time on technical processes and more time optimizing outcomes for our patients. The steady march of technological advancement raises fears of redundancy, yet history shows that each significant step forward in our specialty has shifted our responsibilities rather than eliminated them. The approaching integration of AI is no different.

Declaration of patient consent

Patient consent was not required as there are no patients in this study.

Conflicts of interest

There are no conflicts of interest.

Financial support and sponsorship

Nil.

References

  1. , , , . The role of radiotherapy in cancer treatment: Estimating optimal utilization from a review of evidence-based clinical guidelines. Cancer. 2005;104:1129-37.
    [CrossRef] [PubMed] [Google Scholar]
  2. , , , , , , et al. A prospective study comparing the predictions of doctors versus models for treatment outcome of lung cancer patients: A step toward individualized care and shared decision making. Radiother Oncol. 2014;112:37-43.
    [CrossRef] [PubMed] [Google Scholar]
  3. , , , . Predicting emergency visits and hospital admissions during radiation and chemoradiation: An internally validated pretreatment machine learning algorithm. JCO Clin Cancer Inform. 2018;2:1-11.
    [CrossRef] [Google Scholar]
  4. , , , , , , et al. A feasibility study for predicting optimal radiation therapy dose distributions of prostate cancer patients from patient anatomy using deep learning. Sci Rep. 2019;9:1076.
    [CrossRef] [PubMed] [Google Scholar]
  5. , , , , , , et al. An image-based deep learning framework for individualizing radiotherapy dose. Lancet Digit Health. 2019;1:e136-47.
    [CrossRef] [PubMed] [Google Scholar]
  6. , , , , , , et al. Contouring variations and the role of atlas in non-small cell lung cancer radiation therapy: Analysis of a multi-institutional preclinical trial planning study. Pract Radiat Oncol. 2015;5:e67-75.
    [CrossRef] [PubMed] [Google Scholar]
  7. , , , , , , et al. Institutional clinical trial accrual volume and survival of patients with head and neck cancer. J Clin Oncol. 2015;33:156-64.
    [CrossRef] [PubMed] [Google Scholar]
  8. , , , , , , et al. Radiation Therapy Quality Assurance (RTQA) of concurrent chemoradiation therapy for locally advanced non-small cell lung cancer in the PROCLAIM Phase 3 trial. Int J Radiat Oncol Biol Phys. 2018;101:927-34.
    [CrossRef] [PubMed] [Google Scholar]
  9. , . Segmentation of organs-at-risks in head and neck CT images using convolutional neural networks. Med Phys. 2017;44:547-57.
    [CrossRef] [PubMed] [Google Scholar]
  10. , , , , , , et al. Comparison of automated atlas-based segmentation software for postoperative prostate cancer radiotherapy. Front Oncol. 2016;6:178.
    [CrossRef] [PubMed] [Google Scholar]
  11. , , , , , , et al. Deep learning algorithm for auto-delineation of high-risk oropharyngeal clinical target volumes with built-in dice similarity coefficient parameter optimization function. Int J Radiat Oncol Biol Phys. 2018;101:468-78.
    [CrossRef] [PubMed] [Google Scholar]
  12. , , , , , , et al. Deep learning to achieve clinically applicable segmentation of head and neck anatomy for radiotherapy. arXiv. 2021;23:e26151.
    [CrossRef] [PubMed] [Google Scholar]
  13. , , , , , , et al. Convolutional neural network to predict deep inspiration breath hold eligibility using chest X-ray. Radiother Oncol. 2021;161:s560-1.
    [CrossRef] [Google Scholar]
  14. , , , , , , et al. Dose evaluation of fast synthetic-CT generation using a generative adversarial network for general pelvis MR-only radiotherapy. Phys Med Biol. 2018;63:185001.
    [CrossRef] [PubMed] [Google Scholar]
  15. , , , , . Scalable high-performance image registration framework by unsupervised deep feature representations learning. IEEE Trans Biomed Eng. 2016;63:1505-16.
    [CrossRef] [PubMed] [Google Scholar]
  16. , , , , . Technical note: A feasibility study on deep learning-based radiotherapy dose calculation. Med Phys. 2020;47:753-8.
    [CrossRef] [PubMed] [Google Scholar]
  17. , , , , , , et al. Comparison of planning quality and efficiency between conventional and knowledge-based algorithms in nasopharyngeal cancer patients using intensity modulated radiation therapy. Int J Radiat Oncol Biol Phys. 2016;95:981-90.
    [CrossRef] [PubMed] [Google Scholar]
  18. , , , , , , et al. Intelligent inverse treatment planning via deep reinforcement learning, a proof-of-principle study in high dose-rate brachytherapy for cervical cancer. Phys Med Biol. 2019;64:115013.
    [CrossRef] [PubMed] [Google Scholar]
  19. , , , , , . Deep reinforcement learning for automated radiation adaptation in lung cancer. Med Phys. 2017;44:6690-705.
    [CrossRef] [PubMed] [Google Scholar]
  20. , , , , , . IMRT QA using machine learning: A multi-institutional validation. J Appl Clin Med Phys. 2017;18:279-84.
    [CrossRef] [PubMed] [Google Scholar]
  21. , . Predictive time-series modeling using artificial neural networks for Linac beam symmetry: An empirical study. Ann N Y Acad Sci. 2017;1387:84-94.
    [CrossRef] [PubMed] [Google Scholar]
  22. , , . Predicting Waiting Times in Radiation Oncology Using Machine Learning. Montreal, Quebec: McGill University.
    [CrossRef] [PubMed] [Google Scholar]
  23. , , , , , , et al. Cone beam computed tomography image quality improvement using a deep convolutional neural network. Cureus. 2018;10:e2548.
    [CrossRef] [PubMed] [Google Scholar]
  24. , , . On using an adaptive neural network to predict lung tumor motion during respiration for radiotherapy applications. Med Phys. 2005;32:3801-9.
    [CrossRef] [PubMed] [Google Scholar]
  25. , , , , , , et al. Deep learning for lung cancer prognostication: A retrospective multi-cohort radiomics study. PLoS Med. 2018;15:e1002711.
    [CrossRef] [PubMed] [Google Scholar]
  26. , , , , , , et al. Deep learning for predicting major pathological response to neoadjuvant chemoimmunotherapy in non-small cell lung cancer: A multicentre study. EBioMedicine. 2022;86:104364.
    [CrossRef] [PubMed] [Google Scholar]
  27. , , , , , . Pathological response prediction to neo-adjuvant chemoradiation in esophageal carcinoma using artificial intelligence and radiomics: An exploratory analysis. Int J Radiat Oncol Biol Phys. 2020;108:e612-3.
    [CrossRef] [Google Scholar]
  28. , , , , , , et al. Artificial intelligence enabled prognostic modelling for thymomas. Int J Radiat Oncol Biol Phys. 2020;108:e787.
    [CrossRef] [Google Scholar]
  29. , , , , , , et al. Radiation pneumonitis prediction after stereotactic body radiation therapy based on 3D dose distribution: Dosiomics and/or deep learning-based radiomics features. Radiat Oncol. 2022;17:188.
    [CrossRef] [PubMed] [Google Scholar]
  30. , , , , , , et al. Clinical decision support of radiotherapy treatment planning: A data-driven machine learning strategy for patient-specific dosimetric decision making. Radiother Oncol. 2017;125:392-7.
    [CrossRef] [PubMed] [Google Scholar]
  31. . Data science in radiology: A path forward. Clin Cancer Res. 2018;24:532-4.
    [CrossRef] [PubMed] [Google Scholar]
  32. , , , , , , et al. Decoding tumour phenotype by noninvasive imaging using a quantitative radiomics approach. Nat Commun. 2014;5:4006.
    [CrossRef] [PubMed] [Google Scholar]
  33. , , , , , , et al. Deep Learning and Radiomics predict complete response after neo-adjuvant chemoradiation for locally advanced rectal cancer. Sci Rep. 2018;8:12611.
    [CrossRef] [PubMed] [Google Scholar]
  34. , , , , , . Lung texture in serial thoracic computed tomography scans: Correlation of radiomics-based features with radiation therapy dose and radiation pneumonitis development. Int J Radiat Oncol Biol Phys. 2015;91:1048-56.
    [CrossRef] [PubMed] [Google Scholar]
  35. , , , , , , et al. Perinodular and intranodular radiomic features on lung CT images distinguish adenocarcinomas from granulomas. Radiology. 2019;290:783-92.
    [CrossRef] [PubMed] [Google Scholar]
  36. , , , , , , et al. Radiomics and radiogenomics in lung cancer: A review for the clinician. Lung Cancer. 2018;115:34-41.
    [CrossRef] [PubMed] [Google Scholar]
  37. , , , , . Design characteristics of studies reporting the performance of artificial intelligence algorithms for diagnostic analysis of medical images: Results from recently published papers. Korean J Radiol. 2019;20:405-10.
    [CrossRef] [PubMed] [Google Scholar]
  38. , , , , , , et al. Vulnerabilities of radiomic signature development: The need for safeguards. Radiother Oncol. 2019;130:2-9.
    [CrossRef] [PubMed] [Google Scholar]
  39. , . Vulnerabilities of radiomics: Why the most popular radiomics signature accidentally measured tumor volume. Strahlenther Onkol. 2021;197:361-4.
    [CrossRef] [PubMed] [Google Scholar]
  40. , , , . Explainable medical imaging AI needs human-centered design: Guidelines and evidence from a systematic review. NPJ Digit Med. 2022;5:156.
    [CrossRef] [PubMed] [Google Scholar]
  41. , , , , , , et al. Standardizing normal tissue contouring for radiation therapy treatment planning: An ASTRO consensus paper. Pract Radiat Oncol. 2019;9:65-72.
    [CrossRef] [PubMed] [Google Scholar]
  42. , , , , , , et al. Improving treatment plan evaluation with automation. J Appl Clin Med Phys. 2016;17:16-31.
    [CrossRef] [PubMed] [Google Scholar]
  43. , , , , , , et al. Standardizing dose prescriptions: An ASTRO white paper. Pract Radiat Oncol. 2016;6:e369-81.
    [CrossRef] [PubMed] [Google Scholar]
  44. , . European union regulations on algorithmic decision-making and a “right to explanation”. AI Mag. 2016;38:50-7.
    [CrossRef] [Google Scholar]
  45. . The right to explanation, explained. Berkeley Technol Law J. 2019;34:18-24.
    [CrossRef] [Google Scholar]
  46. , , , , . A review of medical federated learning: Applications in oncology and cancer research. In: Brainlesion: Glioma, Multiple Sclerosis, Stroke and Traumatic Brain Injuries. Cham: Springer International Publishing; .
    [CrossRef] [Google Scholar]
  47. , . Gender shades: Intersectional accuracy disparities in commercial gender classification. In: , , eds. Proceedings of the 1st Conference on Fairness, Accountability and Transparency. Proceedings of Machine Learning Research. p. 77-91.
    [Google Scholar]
  48. , , . Implementing machine learning in health care addressing ethical challenges. N Engl J Med. 2018;378:981-3.
    [CrossRef] [PubMed] [Google Scholar]