The British Journal of Radiology, vol. 79, issue 948, pp. 981-990
An experimental method for determining the optimal beam quality of digital mammography systems was applied to two systems (Fuji Profect and GE Senographe 2000D). The mean glandular dose (MGD) and contrast-to-noise ratio (CNR) were measured using Perspex breast phantoms simulating breasts from 20 mm to 90 mm in thickness. For each thickness, four combinations of tube voltage and target/filter were tested. Optimal beam quality was defined as that which achieved a target CNR at the lowest MGD, and was similar for the two systems. For breasts 21 mm or 32 mm thick, a tube voltage of either 25 kV or 28 kV with a Mo/Mo target/filter combination was optimal. For breast thicknesses of 45 mm and above, the combination with the highest X-ray energy (34 kV Rh/Rh) was optimal. Optimization using the higher-energy beam quality required a greater detector dose to compensate for the lower contrast: for a 75 mm thick breast, the 34 kV Rh/Rh combination required about a 90% greater detector dose than 28 kV Mo/Mo to achieve the same CNR, because of the 25% reduction in contrast. Nonetheless, choosing the higher-energy spectrum reduced the MGD by 32% while achieving the same CNR. Current automatic exposure control (AEC) designs that aim for a fixed detector dose are therefore not optimal; greater use of higher-energy spectra should be accompanied by higher detector doses for all breasts of average thickness or greater. This may result in slightly higher detector doses, but better image quality for these breasts.
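The dose/contrast trade-off quoted above can be sketched numerically. A minimal illustration, assuming the standard quantum-limited noise model CNR ∝ contrast × √(detector dose), is given below; the only figure taken from the abstract is the ~25% contrast reduction, and under this simplified model the predicted dose increase (~78%) falls somewhat below the measured ~90%, as real systems include additional noise sources not captured here.

```python
def dose_ratio_for_equal_cnr(contrast_ratio):
    """Detector-dose factor needed to hold CNR constant when contrast
    changes by contrast_ratio, assuming CNR is proportional to
    contrast * sqrt(detector dose) (quantum-limited model)."""
    return 1.0 / contrast_ratio ** 2

# Assumption from the abstract: 34 kV Rh/Rh yields ~25% lower contrast
# than 28 kV Mo/Mo for a 75 mm breast, i.e. a contrast ratio of 0.75.
ratio = dose_ratio_for_equal_cnr(0.75)
print(f"Predicted detector-dose increase: {ratio - 1:.0%}")  # prints "Predicted detector-dose increase: 78%"
```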