The electron microscope (EM) provides exquisitely detailed information about structural arrangements of matter through its high native resolution, contrast, and wide variety of available signals. This enables broad application across numerous fields, including physics, materials science, medicine, and biology. In some fields, especially biology, there is an increasing need for quantification at smaller length scales and a simultaneous demand for structural data from larger volumes. While new digital automated tools make it possible in some cases to investigate most of a 3 mm transmission EM sample, they remain ineffective for significantly larger volumes, for example whole-genome patterned DNA [1], neural circuits [2], silicon wafers, and histological arrays. This is due in large part to operating cost, measured in both dollars and hours, arising from extensive sample preparation and handling and from dependence on skilled operators [3]. For this reason, the applicability of high-resolution EM beyond laboratory research is mostly limited to niche areas (for example, nephrology and ciliary dyskinesia in clinical medicine). Further, researchers and clinicians increasingly turn to methods that take ensemble measurements, such as low-cost genetic sequencing and mass spectrometry, despite the richness and spatial precision of information available from EM. Published EM results are generally cherry-picked from dozens to hundreds of images of painstakingly prepared samples taken over weeks to months. What if EMs were optimized such that every image coming from the machine were scientifically significant and "publication-ready"?

High throughput requires new ways of looking at EM imaging. The EM imaging process is a packet system, and throughput can be defined as the amount of useful data retrieved from the system during the packet divided by the time taken to do so. For an honest throughput figure, the time taken should include all aspects of the experiment: sample preparation and loading, machine setup time (and any downtime experienced), as well as the time spent in the microscope itself. Generally, the time spent actually examining the sample in the microscope is tiny compared to these other steps, and the time spent acquiring scientifically significant data is an even smaller fraction.

Microscopy needs to change significantly to meet the throughput needs of modern biology: high-resolution imaging may transition from a seldom-used, relatively small part of a lengthy process to an always-on, always-available part of a long-term, continuous acquisition chain lasting weeks, months, or even years. In this paradigm, collecting scientifically significant images becomes the majority of the process, rather than a small part of an arduous march dominated by sample preparation, sample handling, and experiment design. For a microscope to operate efficiently in this paradigm, it needs to offer robust and reliable performance over very long timescales. Such automation has begun with scanning EM [4] and work in EM-based gene sequencing [5,6]. T...
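One minimal way to formalize the packet-throughput definition above (the notation here is illustrative, not taken from the cited works) is

\[
\Theta \;=\; \frac{D_{\text{useful}}}{t_{\text{prep}} + t_{\text{load}} + t_{\text{setup}} + t_{\text{down}} + t_{\text{scope}}},
\]

where \(D_{\text{useful}}\) is the amount of scientifically useful data retrieved during the packet and the denominator sums sample preparation, loading, machine setup, downtime, and time in the microscope. The argument above is that \(t_{\text{scope}}\), and in particular the portion of \(t_{\text{scope}}\) spent acquiring scientifically significant data, is typically a small fraction of this denominator, so raising \(\Theta\) requires attacking the other terms, not only the imaging step.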