Thirty years into the human immunodeficiency virus (HIV) epidemic in the United States, an estimated 50,000 persons become infected each year; rates are highest among black and Hispanic populations and among men who have sex with men. HIV testing has become more widespread over time, with the highest testing rates in the populations most affected by HIV. Nevertheless, approximately 55% of adults in the United States have never received an HIV test. Because of the individual and community benefits of HIV treatment, in 2006 the Centers for Disease Control and Prevention recommended routine screening for HIV infection in clinical settings. Adoption of this recommendation has been gradual owing to a variety of issues: lack of awareness of, and misconceptions about, HIV screening among physicians and patients; barriers at the facility and legislative levels; costs associated with testing; and conflicting recommendations concerning the value of routine screening. Reducing or eliminating these barriers is needed to expand routine screening in clinical settings so that more people with unrecognized infection can be identified, linked to care, and provided treatment, both to improve their health and to prevent new HIV infections in the United States.