Diagnostic modalities for CAD involving ionizing radiation have rapidly increased in recent years despite direct epidemiological evidence from atomic bomb survivors of the stochastic risks of radiation exposure. Although the individual patient's risk of cancer is small, with a favorable risk-benefit ratio even for high-dose radiological procedures, concerns arise when imaging is used without proven clinical rationale, when alternative modalities could be employed with equal efficacy, or when imaging is repeated unnecessarily because of inadequate communication within the medical care community.1 Research demonstrates that both physicians and patients undergoing diagnostic imaging involving significant radiation exposure have insufficient awareness of the environmental impact, biorisk, and dose of that radiation.2-4 The most widely used studies include nuclear medicine myocardial perfusion imaging; computed tomography coronary, aortic, and pulmonary angiography; and cardiac catheterization. This review addresses the methodology used to quantify radiation exposure, the doses associated with typical protocols, and the techniques employed to reduce exposure. Greater radiation awareness among providers may prompt use of alternative imaging, more careful selection of current imaging and patients, and changes in imaging technique or equipment to decrease cancer risk from future diagnostic imaging.
Coronary artery disease (CAD) is the leading cause of mortality in the United States, responsible for 1 in 5 deaths, and is associated with $142 billion in annual health care expenditures.5 Diagnostic imaging modalities have rapidly advanced such that medical exposures now represent the majority of the effective radiation dose individuals receive; this has increased seven-fold over the past 25 years.16 It is estimated that >10% of overall radiation exposure to a U.S. inhabitant is attributable to nuclear and radiologic tests.2
As the diagnostic gold standard for CAD, coronary angiography procedures have increased from 2.45 to 3.85 million annually over the course of almost a decade, from 1993 to 2002.7 Noninvasive modalities with ionizing radiation have also significantly expanded. In 1990, fewer than 3 million nuclear medicine studies were performed in the United States; however, by 2002, this number more than tripled to 9.9 million.8 Specifically, high-dose, dual-isotope myocardial perfusion imaging (MPI) imparts among the highest doses of all medical diagnostic tests; nonetheless, its use nearly doubled from 19 to 30% between 1997 and 2002, and it currently comprises 36% of all outpatient MPI tests.9 Given that 72% of the approximately 6 million patients who present with chest pain annually in U.S. emergency rooms will be hospitalized,5 computerized tomography coronary angiography (CTCA) provides a rapid diagnostic tool for triage. However, up to one third of CT scans in the U.S. are ordered without proven clinical rationale, when alternative modalities are equally efficacious, or are repeated unnecessarily.1
Despite radiation exposure and clinical indication concerns, the volume of computed tomography (CT) scans has increased over 20-fold over the past 25 years and now exceeds 62 million scans annually.1,10
Although imaging procedures are ideally performed in accordance with the As Low As Reasonably Achievable (ALARA) principle,9 there is direct epidemiologic evidence of the stochastic implications of even low-dose radiation, such as that from a few CT scans or from other high-dose radiologic procedures.1,10-12 The updated stochastic effects of low-dose ionizing radiation are addressed in the BEIR VII report.13 The risk depends on the absorbed dose, type of radiation, and the specific organ or tissue irradiated. Many radiation organizations support the linear no-threshold (LNT) model, such that cancer risk proceeds in linear fashion without a lower threshold. In general, stochastic risks have been shown to decrease with age, are higher in women, and depend on underlying predisposing factors unique to the patient.
The effective dose (E) is a common index value used to compare patient radiation exposure from different diagnostic procedures, regardless of the medical facility or type of imaging performed. It is the weighted average of the mean absorbed radiation dose to various body organs, each multiplied by a tissue weighting factor published by the International Commission on Radiological Protection (ICRP). Equivalent dose is a radiation protection quantity related to the stochastic risk from absorbed doses in a population exposed to radiation.14 Christner et al demonstrated that effective dose approximations may vary substantially depending on which tissue weighting coefficients, published across various ICRP publications, are used.15 Specifically, the most significant variance occurs when deriving coronary E from chest CT examinations, depending on which breast tissue weighting factor is applied.15 There are various methods for determining E for nuclear medicine, computed tomography, and fluoroscopy. Current dosimetry models allow only estimates of doses that vary with the patient's weight and organ size, assuming standardized biokinetic data and uniform radiopharmaceutical activity within organs.9 Dose reference levels (DRL) have been proposed by the ICRP as protection quantities used to establish limits of exposure to both workers and the general public.
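The weighted-average definition above can be written as E = Σ_T (w_T × H_T), summing each organ's equivalent dose H_T times its tissue weighting factor w_T. A minimal sketch follows; the weighting factors shown are from ICRP Publication 103, but the organ doses are hypothetical values chosen purely for illustration.

```python
# Sketch of effective dose as a weighted sum of organ equivalent doses:
#   E = sum over tissues T of (w_T * H_T)
# Tissue weighting factors below are from ICRP Publication 103 (subset);
# the organ equivalent doses are hypothetical, for illustration only.

ICRP_103_WT = {
    "breast": 0.12,
    "lung": 0.12,
    "stomach": 0.12,
    "liver": 0.04,
    "thyroid": 0.04,
}

def effective_dose(organ_doses_msv, weights=ICRP_103_WT):
    """Weighted sum of organ equivalent doses (mSv).

    Organs absent from the weights table are not handled in this sketch.
    """
    return sum(weights[organ] * h_t for organ, h_t in organ_doses_msv.items())

# Hypothetical equivalent doses (mSv) for a chest-region exposure:
doses = {"breast": 20.0, "lung": 22.0, "liver": 10.0}
e = effective_dose(doses)  # 0.12*20 + 0.12*22 + 0.04*10 = 5.44 mSv
```

This also illustrates Christner et al's point: substituting a different breast weighting factor (e.g., the lower value from older ICRP publications) changes E noticeably when breast dose dominates.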
The ICRP has compiled a series of models and dosimetry tables for a variety of radiopharmaceuticals, the most recent in 2007, Publication 103.16 MPI at the San Antonio Military Medical Center is largely performed with technetium (99mTc) sestamibi, or Cardiolite. Patients receive two doses of technetium-99m sestamibi in total: 8 mCi for rest and 24 mCi for stress testing in a single-day imaging protocol. Although manufacturers provide package inserts (PI), the total body dose reported there is lower than the effective dose and fails to account for nonuniform dose distribution.9 The effective dose to a typical patient during a standard cardiac study may be estimated using dose coefficients, tissue weighting factors, radionuclide activities, and mathematical equations. Although studies have shown the effective dose for MPI may range from 2.2-31.5 mSv, the means for a 99mTc-sestamibi rest-stress 1-day protocol have been cited as 9.4 and 11.1 mSv,6,9 which is approximately 512 times more radiation than a posterior-anterior chest radiograph (0.02 mSv)2 and 3 times more than the annual background radiation in the United States (3 mSv).17 This corresponds to an estimated lifetime attributable risk (LAR) of 1 cancer in 1,000 exposed subjects, per the BEIR VII Committee (2005),25 with an LAR of fatal cancer of >1 in 10,000 cases.2
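The activity-based estimate described above reduces to multiplying each administered activity by a per-radiopharmaceutical dose coefficient and summing. The sketch below uses the 8 mCi rest / 24 mCi stress protocol from the text; the dose coefficients are illustrative values in the spirit of ICRP dosimetry tables, not authoritative figures.

```python
# Sketch: effective dose for a 1-day rest-stress 99mTc-sestamibi protocol,
#   E = A_rest * c_rest + A_stress * c_stress
# where A is administered activity (MBq) and c is a dose coefficient
# (mSv/MBq). Coefficients below are illustrative, not authoritative.

MBQ_PER_MCI = 37.0  # unit conversion: 1 mCi = 37 MBq

DOSE_COEFF_MSV_PER_MBQ = {
    "rest": 0.0090,    # illustrative; rest imparts more dose per unit activity
    "stress": 0.0079,  # illustrative
}

def mpi_effective_dose(rest_mci, stress_mci, coeff=DOSE_COEFF_MSV_PER_MBQ):
    """Estimate effective dose (mSv) from administered activities in mCi."""
    rest_mbq = rest_mci * MBQ_PER_MCI
    stress_mbq = stress_mci * MBQ_PER_MCI
    return rest_mbq * coeff["rest"] + stress_mbq * coeff["stress"]

# Protocol from the text: 8 mCi rest + 24 mCi stress in a single day
e = mpi_effective_dose(8, 24)
```

With these assumed coefficients the estimate lands near 9.7 mSv, in the neighborhood of the 9.4-11.1 mSv means cited above.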
With CT, radiation exposure is expressed as the volume CT dose index (CTDIvol), the average radiation dose over the scanned volume, expressed in mGy.9,10 Integrated over the scan length, this yields the dose-length product (DLP), in mGy·cm, for the complete CT exam. In practice, the actual dose may be measured using any of three approaches: 1) calculations based on physical phantom measurements; 2) CTDIvol or DLP values modified by a conversion coefficient (k); or 3) Monte Carlo simulations that model photon transport through a simulated, mathematical patient phantom, taking into account tissue-weighting factors.9,15 Huda et al have introduced E/DLP conversion factors specific to cardiac CT examinations that depend on the proximity of the radiographic (X-ray) beam to the heart, increase as X-ray tube voltage increases, and are modified by patient weight.18
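The second approach above is a one-line calculation, E = k × DLP. The sketch below applies it with a hypothetical DLP; the conversion coefficient used is the commonly quoted generic adult chest value, whereas the cardiac-specific coefficients of Huda et al are higher and vary with tube voltage and patient weight.

```python
# Sketch: effective dose from a scanner-reported dose-length product,
#   E (mSv) = k * DLP,
# with k in mSv/(mGy*cm). The k value and DLP below are illustrative;
# cardiac-specific k factors (per Huda et al) exceed the generic chest value.

def effective_dose_from_dlp(dlp_mgy_cm, k_msv_per_mgy_cm):
    """Convert DLP (mGy*cm) to an effective dose estimate (mSv)."""
    return dlp_mgy_cm * k_msv_per_mgy_cm

K_CHEST_GENERIC = 0.014   # commonly quoted generic adult chest coefficient
dlp = 800.0               # hypothetical CTCA dose-length product, mGy*cm

e = effective_dose_from_dlp(dlp, K_CHEST_GENERIC)  # 11.2 mSv for this example
```

The resulting 11.2 mSv for this hypothetical scan falls within the 5-32 mSv CTCA range cited later in the text.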
Until recently, conventional chest CT scans had been used to approximate E from cardiac CT exams; however, the cardiac region has been shown to be more radiosensitive than previously assumed. Organ dose for CT coronary angiography (CTCA) may range from 40-100 mGy, compared with 0.01-0.15 mGy for a chest radiograph.1 Prior studies have shown that the mean effective dose for CTCA ranges from 5-32 mSv;6 the female breasts, lungs, liver, and esophagus receive the highest equivalent doses.9 Average mean effective doses have been cited as 12 and 16 mSv,5,6 though estimates are scanner-dependent. Philips 16-slice scanners average 4.9 and 8.1 mSv with and without electrocardiography-controlled tube current modulation (ECTCM), respectively. Per Einstein et al, the LAR of cancer from CTCA without ECTCM, using Monte Carlo simulations and the BEIR VII approach to determine cancer risk, for women aged 20, 40, 60, and 80 years was 1 in 143, 1 in 284, 1 in 466, and 1 in 1338, respectively.5 The risk decreased as a function of age, with the greatest proportion of cancers derived from breast and lung tissues.5 Comparatively, for men of the same age groups, the LAR of cancer was 1 in 686, 1 in 1007, 1 in 1241, and 1 in 3261.5 Although reducing the tube current by 35% resulted in an estimated 35% cancer risk reduction, the greater stochastic risk to females across all age groups was attributed to their increased intrinsic radiosensitivity and degree of breast tissue exposure.5
In a retrospective review of 1119 consecutive adult patients by Smith-Bindman et al, the range of effective doses for computed tomography pulmonary angiography (CTPA) was 2-30 (median 8) mSv,19 similar to that cited by Mettler (13-40 mSv, with a mean of 15 mSv).6 The effective dose for an abdominal CT to evaluate for dissection ranged from 4-69 (median 24) mSv.19 The LAR of cancer for a CTPA and an abdominopelvic aortic dissection CT scan in a 20-year-old woman was approximately 1 in 330 exposed female patients.19 This was approximately 1.3 times the risk in males of the same age, though it decreased for both genders by an approximate factor of 1.5 with every 20-year increase in age. The gender disparity in risk with increasing age was comparatively less with aortic dissection CT scans.19
In general, radiation doses for computed tomography scans may vary significantly between and within study types, as well as within and across medical institutions. Effective dose also depends on the number of scans obtained, patient size, the product of tube current and scan time (mAs), and various scanner parameters, including design, pitch, axial range, and maximum tube voltage (kVp).1,10
Radiation from coronary angiography is reported as the dose-area product (DAP), or kerma-area product (KAP), which is the energy released per unit mass of irradiated air multiplied by the cross-sectional area of the radiographic beam; it is measured in units of Gy·cm².9 Similar to CT imaging, dose measurements can be performed in one of three ways, which yield similar E estimates: 1) use of physical anthropomorphic phantoms; 2) DAP values modified by a conversion factor; or 3) Monte Carlo simulations.9 Conversion factors may range from 0.12 to 0.26 mSv·(Gy·cm²)⁻¹, although the most widely used are those proposed by the National Radiological Protection Board (NRPB) and may vary depending on the radiographic view. Skin dose may be measured with thermoluminescence dosimeters; however, DAP provides a more accurate estimation of E since it accounts for body surface area and the secondary extent of body organ irradiation.
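The conversion-factor approach above is again a simple multiplication, E = DAP × f, with f taken from the 0.12-0.26 mSv·(Gy·cm²)⁻¹ range quoted in the text. The DAP value in the sketch below is hypothetical.

```python
# Sketch: effective dose estimate from a fluoroscopy dose-area product,
#   E (mSv) = DAP (Gy*cm^2) * f (mSv per Gy*cm^2),
# using the 0.12-0.26 conversion-factor range quoted in the text.
# The DAP reading is hypothetical, for illustration only.

def effective_dose_from_dap(dap_gy_cm2, f_msv_per_gy_cm2):
    """Convert a DAP reading to an effective dose estimate (mSv)."""
    return dap_gy_cm2 * f_msv_per_gy_cm2

dap = 40.0  # hypothetical diagnostic coronary angiogram DAP, Gy*cm^2

low = effective_dose_from_dap(dap, 0.12)   # lower bound of quoted range
high = effective_dose_from_dap(dap, 0.26)  # upper bound of quoted range
# yields an E estimate of 4.8-10.4 mSv for this hypothetical study
```

The roughly two-fold spread between the bounds shows why the choice of conversion factor (and the radiographic view it corresponds to) matters when comparing reported doses.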
Mean effective doses from conventional coronary angiography may vary widely, from 2.3-22.7 mSv, secondary to procedural complexity, protocol, equipment, and operator skill.6,9,20,21 Coronary percutaneous transluminal angioplasty procedures may impart from 6.9-57 mSv, with a mean of 15 mSv,6 translating to approximately 5-fold greater radiation exposure than annual background radiation (3 mSv) and a 700-fold increase in radiation compared with a posteroanterior chest radiograph (0.02 mSv).6,22 Typically, fluorography contributes the majority and fluoroscopy less than half of the radiation dosage for diagnostic cardiac catheterizations.9 Leung and Martin's study of six cardiologists showed an average E of 1.1 mSv from fluoroscopy during left heart catheterizations and an average total E of 3.1 mSv.21 In comparison, Mettler et al estimate 7 mSv as the average effective dose for diagnostic coronary angiography.6 For right heart catheterizations and coronary bypass grafts, 50% of the DAP originated from fluoroscopy.21 The third quartile of actual measurements has been accepted in most publications as the diagnostic reference level (DRL), specifically 105 Gy·cm² per ICRP Publication 60.8 This level provides a radiation dose reference at which acceptable image quality and diagnostic information should be achievable, commensurate with the medical imaging task.
Increased effective doses have been associated with radial approach catheterizations.23 Similarly, specific projections, such as the right anterior oblique caudal and left anterior oblique cranial, account for a disproportionately high percentage of KAP.21,22 A prior study determined the total risk of developing fatal cancer over forty years following percutaneous coronary intervention was 84 and 68 per 100,000 patients for men and women, respectively.22 Overall, the effective dose received depends on multiple factors including operator experience, technique, and lab equipment.9,22
As Hall et al propose, epidemiological data from atomic bomb survivors have provided the "gold standard" in the quantitative assessment of carcinogenic risk from low-dose radiation, such as that from cardiac diagnostic imaging, for several reasons: 1) the study involves a large population of approximately 100,000 persons, not selected for disease, of both genders and spanning all ages; 2) approximately 30,000 survivors were exposed to low-dose radiation between 5-100 mSv (mean dose 40 mSv), equivalent to exposure from single and multiple CT scans; 3) cancer incidence and mortality data have accumulated since the study's inception over sixty years ago; and 4) mortality follow-up is nearly complete for those exposed as adults and >50% complete for those exposed as children.1,22 The current, unanimous consensus among national and international organizations for exposures less than 100 mSv is the "linear no-threshold" (LNT) model, which proposes that the risk of stochastic effects decreases linearly, without threshold, with decreasing radiation dose.5,24-26 Though unproven, this hypothesis may still be applied to many cardiac imaging procedures, since they impart doses in ranges where credible, direct evidence has shown increased cancer risk.
Several analyses of atomic bomb survivors have concluded that the lowest mean dose associated with a statistically significant estimate of excess relative risk (ERR) is about 35 mSv, equivalent to the maximum organ doses imparted by two or three CT scans.11,12 Furthermore, a large study piloted by the International Agency for Research on Cancer (IARC) of 400,000 nuclear industry workers exposed to a mean dose of 20 mSv showed a statistically significant ERR estimate of 0.97 per Sv, attributing 1-2% of cancer deaths to radiation; this is statistically consistent with atomic bomb survivor data.27,28 These findings have been further supported by preliminary results from the Techa River cohort in Russia, a large-scale, low-dose study.29 The Biological Effects of Ionizing Radiation (BEIR VII, Phase 2) committee has since developed LAR cancer risk models based on atomic bomb survivor data, in response to the US Environmental Protection Agency's request to evaluate the health effects of low-level ionizing radiation.5
Prior studies have shown lung and breast malignancy are primary contributors to lifetime attributable risk (LAR) of cancer, a finding reflected by the increased breast tissue weighting factor in the most recent ICRP Publication 103.5 Incidence is highest in women and in children and young adults since the radiosensitivity of many organs decreases with age. A long lag time is typical from acute radiation exposure to the development of malignancy, which may not ultimately manifest in the elderly or those with decreased life expectancy.
Despite extensive epidemiology-based research, extrapolated data provide only estimates of stochastic risk from radiation exposure. Limitations in the BEIR VII risk models include the methods used to transpose Japanese atomic bomb survivor data to U.S. populations given their disparate baseline cancer rates, sampling variability, the dose and dose-rate effectiveness factor (DDREF), biological aspects of different sources of ionizing radiation, and assumptions derived from the theoretical LNT model.8 In Report 126, the US National Council on Radiological Protection and Measurements (NCRP) quantified various sources of uncertainty in total fatal cancer risk derived from epidemiological (±25%) and dosimetric (0-30%) data, the transfer of risk between populations (-30 to +65%), and projections to lifetime risk (-50 to +10%).30 The overall uncertainty was a factor of approximately 3 above and below the estimated value, with the greatest contribution from extrapolation of risk to doses lower than those considered in the atomic bomb study.30
Studies show that patients undergoing diagnostic imaging involving significant radiation exposure have little, if any, awareness of the nuclear medicine dose and the associated risk of cancer.2-4 Furthermore, there is a general lack of awareness among physicians, regardless of specialty, gender or age, regarding the dose exposure, environmental impact, and biorisk of radiation. As awareness of stochastic implications grows within the context of unrestricted access to technology, it is incumbent on the medical community to identify and execute radiation-sparing techniques.
Myocardial perfusion imaging dose exposure can be reduced by using a stress first/stress only protocol with 99mTc sestamibi for patients with a low CAD pretest probability.9 However, diagnostic performance and prognostic value have not been extensively evaluated for this strategy compared to combined stress and rest imaging. Although a second visit may be warranted for some patients, appropriate communication between physicians and nuclear medicine personnel to ensure accurate pretest risk stratification may decrease this probability.9 Adequate hydration and early micturition are also encouraged during post-exposure care.
For computed tomography, effective dose may be decreased by minimizing the number of scans received, limiting the exposed scan region, and optimizing tube current and voltage with respect to the scanner type and patient habitus.9 Using multiple x-ray sources and longer detector arrays can also increase the pitch of the scan and reduce overlap between gantry rotations.9 Additionally, prospective gating combined with "step-and-shoot" non-spiral scanning can result in significant exposure reductions.31 For CTCA specifically, electrocardiogram-controlled tube current modulation (ECTCM) can decrease the dose by nearly 50% by employing reduced tube current during the cardiac cycle intervals in which coronary motion precludes reliable data collection. Similarly, adjuvant use of beta-blockers can decrease coronary artery velocity to improve image quality and decrease exposure times.5,9 Although CTCA bypasses the complication risks of invasive coronary angiography, such imaging should be avoided in young women, despite radiation-reducing methods, given their particular radiosensitivity.
Strategies to minimize ERR from fluoroscopy and fluorography during cardiac catheterization include employing the lowest frame rates, the least image magnification, and the fewest number of views without compromising image quality or diagnostic accuracy. Other methods include minimizing the distance between the image detector and the patient, maximizing the distance from the x-ray tube to the patient, optimizing beam collimation, and shielding all radiosensitive organs. Additionally, left ventriculography may be omitted if the information can be ascertained via other tests. Fluorography (cineography) time should also be limited as much as possible during coronary procedures.22
In this expanding field of ionizing radiation, technological advancements must be cautiously weighed in light of various unknowns. For example, further research is needed regarding the effects of age, gender, and body habitus on dosimetry, as the degree of estimation uncertainty increases the more a patient's anatomy varies from the phantom models used to derive tissue weighting factors. Clarification of the radiation levels leading to stochastic versus deterministic effects is also important; in other words, determining the maximum exposure thresholds that may induce cancer compared with organ toxicity and tumor cell kill.14 Furthermore, despite various radiation reduction strategies, the resulting sensitivity, specificity, and dosimetry of each requires further exploration. At this point, there are no large-scale epidemiologic studies of the cancer risk specifically associated with medical imaging.
In summary, diagnostic modalities for CAD involving ionizing radiation have rapidly increased in recent years despite direct epidemiological evidence of the stochastic risks of exposure. This poses significant ethical concerns, demanding greater radiation awareness within the medical community and potential broadening of informed consent to include the stochastic risks of radiation.3,17,32 For the majority of radiologic procedures, the clinical benefit exceeds the potential risk; however, alternative imaging modalities of equal efficacy should be considered as appropriate. Particular caution should be employed in cases of inadequate clinical rationale, when inadequate communication may result in unnecessary repetition of studies, or when imaging is driven by defensive medicine.1,11 Careful selection of patients and optimization of scanning protocols may help to limit cancer risk from radiation.
The views expressed in this material are those of the author, and do not reflect the official policy or position of the U.S. Government, the Department of Defense, or the Department of the Army or Air Force.
Nayar AK, White BM, Stone KE, Slim AM. Radiation Exposure and Associated Cancer Risk With Cardiac Diagnostic Imaging. J Am Osteopath Coll Radiol. 2013;2(2):14-20.