Ok, I haven’t finished writing up my thoughts on the last paper, but that’s just because I’ve been lazy. It was an interesting paper, and I learned a few things. Still skeptical, though. But more on that later.
This week’s article comes from Medical Physics and is titled “The impact of increased Al filtration on x-ray tube loading and image quality in diagnostic radiology” by R. H. Behrman (Med Phys 30, 69–78 (2003)).
When I was an undergrad (way back in 1991), one of the projects I did for my 4th-year physics lab course (and the first one I did in Medical Physics) was to study dose reduction to pediatric patients undergoing cardiac catheterization procedures. My lab partner and I looked at reducing radiation dose by adding a copper filter at the x-ray tube. So this paper was of particular interest to me. Added filtration significantly reduces the low-energy x-rays that add to patient dose without contributing to image formation, but then you need to compensate by boosting the x-ray technique (mainly the mAs).
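Just to put some rough numbers on that, here’s a back-of-the-envelope sketch (mine, not anything from the paper): a toy Kramers bremsstrahlung spectrum plus approximate aluminum attenuation coefficients shows how an extra 1.5 mm of Al preferentially strips out the soft photons, and how much extra mAs you’d need to get the same fluence out of the tube. Everything in it (the 80 kVp beam, the Al coefficients) is a stand-in; real work would use measured spectra and NIST XCOM data.

```python
# Back-of-the-envelope: beam hardening from extra Al filtration and the mAs
# penalty to restore tube-exit fluence. Crude Kramers spectrum, approximate
# Al attenuation coefficients -- illustration only, not the paper's method.
import numpy as np

kvp = 80.0
E = np.arange(20.0, kvp + 1.0, 1.0)            # photon energies in keV

spectrum = kvp - E                              # Kramers' rule: intensity ~ (kVp - E)

# Approximate linear attenuation coefficients for Al (1/cm), interpolated between
# a few reference energies; use NIST XCOM tables for anything serious.
ref_E  = np.array([20.0, 30.0, 40.0, 60.0, 80.0])
ref_mu = np.array([9.3,  3.0,  1.5,  0.75, 0.55])
mu_al = np.interp(E, ref_E, ref_mu)

def filtered(mm_al):
    """Spectrum after mm_al millimetres of aluminum (Beer-Lambert)."""
    return spectrum * np.exp(-mu_al * mm_al / 10.0)   # mm -> cm

base, more = filtered(2.5), filtered(4.0)
mean = lambda s: (E * s).sum() / s.sum()

print(f"mean energy: {mean(base):.1f} keV -> {mean(more):.1f} keV")
print(f"mAs boost to restore tube-exit fluence: x{base.sum() / more.sum():.2f}")
```

Note that this toy version compensates at the tube exit and ignores the patient entirely, so it overstates the mAs penalty; the paper keeps film density constant behind the patient, where most of those soft photons would never have arrived anyway, and finds an average increase of about 15%.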
Most fluoroscopy systems now automatically add or remove filters of various types depending on the amount of radiation reaching the receptor. Good for the patient, but harder on the generator and x-ray tube. I also suspect that the added filtration leads to increased scatter exposure to the doctors performing the procedure. Since the added filter increases the effective energy of the x-ray beam, photoelectric absorption in the patient falls off and Compton scattering accounts for a larger share of the interactions, and the scattered photons are more penetrating, so there should be more scattered radiation reaching the staff. This is one of the things I’ve been wanting to study for a while (one of the many research project ideas gathering dust in the back of my brain). Maybe I should find a way to get a summer student or something to work on this with me.
Abstract:
Previous work has shown that for nine common radiographic projections (AP abdomen, AP cervical spine, LAT cervical spine, PA chest, LAT chest, AP hip, AP lumbar spine, LAT lumbar spine, and AP pelvis) increasing the total x-ray tube filtration from 2.5 mm Al equivalent (the regulatory minimum for general diagnostic radiology) to 4.0 mm Al equivalent reduces the average effective dose and average skin entrance dose by 9% and 16%, respectively, using a 400 speed screen-film system [1]. In this study, the effects of this filtration increase on x-ray tube loading and image quality were assessed. For the above projections and filtration increase, mean absolute and percentage increases in tube loading were 2.9 mAs and 15%, respectively, for a constant film density and fixed kVp. Tube current (mA) increases of 25% (a worst case) resulted in no statistically significant loss in focal spot resolution due to blooming for both large (1.2 mm) and small (0.6 mm) focal spot sizes, except at high mA low kVp techniques. The latter losses were below 10%, and when the image receptor blur was incorporated, the total system spatial resolution losses were on the order of one-quarter to one-half these values for typical clinical geometries. Radiographs of a contrast phantom taken with 2.5 and 4.0 mm total Al equivalent x-ray tube filtration were compared at 60, 70, 81, 90, 102, and 121 kVp. No statistically significant changes were observed with regard to (1) test object conspicuity as reported by three observers, (2) image contrast, as measured using a densitometer with a 3 mm aperture (±0.0017 OD, 95% confidence level), and (3) pixel value image noise, image contrast-to-noise ratios, and image signal-to-noise ratios, as measured using a scanning densitometer with a 12-bit acquisition depth and 85 µm pixel size (±2.5%, ±3.1%, and ±2.5%, 95% confidence levels, respectively). These results, combined with the linear no-threshold model for radiation risk and the ALARA principle, suggest that general radiography should be carried out using a minimum of 4.0 mm total Al equivalent filtration. ©2003 American Association of Physicists in Medicine.