Radiography using film has been an established method for imaging the internal organs of the body for over 100 years. Surveys carried out during the 1980s identified a wide range in patient doses, showing that there was scope for dose reduction in many hospitals. This paper discusses factors that need to be considered in optimising the performance of radiographic equipment. The most important factor is the choice of screen/film combination, together with the set-up of automatic exposure control devices to suit its characteristics. Tube potential determines the photon energies in the X-ray beam, and its selection involves a compromise between image contrast and the dose to the patient. Allied to this is the choice of anti-scatter grid, as a high grid ratio effectively removes the larger scatter component present when higher tube potentials are used; however, a high grid ratio also attenuates the X-ray beam more heavily. Decisions about grids and the use of low-attenuation components are particularly important for paediatric radiography, which uses lower-energy X-ray beams. Another factor that can reduce patient dose is the use of copper filtration to remove more of the low-energy X-rays. Regular surveys of patient dose, compared against diagnostic reference levels that represent good practice, enable units with higher doses to be identified; causes can then be investigated and changes implemented to address any shortfalls. Application of these methods has led to a gradual reduction in doses in many countries.
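
As a rough illustration of the dose-audit step described above, the sketch below compares median doses from a local survey against diagnostic reference levels (DRLs) and flags examination rooms whose doses exceed them. All DRL values, room names, and survey figures in the sketch are invented placeholders for illustration, not data from this paper.

```python
# Minimal sketch of a dose audit against diagnostic reference levels (DRLs).
# All DRL values and survey figures below are illustrative placeholders.

# Illustrative DRLs (median entrance-surface dose per radiograph, in mGy).
DRLS_MGY = {
    "chest_PA": 0.3,
    "abdomen_AP": 10.0,
    "pelvis_AP": 10.0,
}

# Hypothetical survey results: median dose per examination type for each X-ray room.
survey_medians_mgy = {
    "room_1": {"chest_PA": 0.25, "abdomen_AP": 12.5},
    "room_2": {"chest_PA": 0.45, "pelvis_AP": 8.0},
}

def rooms_exceeding_drl(survey, drls):
    """Return (room, exam, measured_dose, drl) tuples where the survey median exceeds the DRL."""
    exceedances = []
    for room, exams in survey.items():
        for exam, dose in exams.items():
            drl = drls.get(exam)
            if drl is not None and dose > drl:
                exceedances.append((room, exam, dose, drl))
    return exceedances

for room, exam, dose, drl in rooms_exceeding_drl(survey_medians_mgy, DRLS_MGY):
    print(f"{room}: {exam} median {dose:.2f} mGy exceeds DRL of {drl:.2f} mGy - investigate causes")
```

In practice the flagged rooms would then be examined for the factors discussed above, such as screen/film speed, automatic exposure control settings, tube potential, grid selection, and added filtration.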