Led in its earliest decades by a few pioneers and supported by a small number of professional organizations and universities, medical informatics was funded primarily by federal grants and contracts until 1980, when industry began to enter the marketplace. Despite technological advances, diffusion across health care was slow, and computers were used predominantly for business functions. In the 1980s, specialized subsystems were developed for the clinical laboratory, radiology, and pharmacy, but by 1989 only a few medical information systems were operational, most of them in academic health centers that had received federal funding. In the 1990s, distributed information systems allowed physicians to enter orders and retrieve test results at clinical workstations, and hospital networks integrated data from the distributed clinical specialty databases into an electronic patient record. By the end of the 1990s, systems were up and running in the Department of Defense and the Veterans Administration. In the 2000s, more clinicians in the United States were using electronic health records, due in part to steps taken to adapt the computer to its professional users. Diffusion advanced further in 2010, when direct federal funding was extended to health care providers using systems that met "Meaningful Use" requirements in caring for Medicare and Medicaid patients. Advances expected in the next decade include precision medicine and patient genotyping; telehealth care; cloud computing; support for the care of elders with multiple chronic diseases and polypharmacy; advanced clinical decision support; patient data security; big data analytics; improved population health, public health, and disaster management; and interoperability and integration of care across venues.