Abstract: The advent of modern, high-speed electron detectors has made the collection of multidimensional hyperspectral transmission electron microscopy datasets, such as 4D-STEM, routine. However, many microscopists find such experiments daunting, since the collection, analysis, long-term storage, and networking of these datasets remain challenging. Common issues include their large and unwieldy size, often several gigabytes per dataset; non-standardized data analysis routines; and a lack of clarity about the computing and network resources needed to utilize the electron microscope. Existing computing and networking bottlenecks introduce significant penalties at each step of these experiments, making real-time, analysis-driven automated experimentation for multidimensional TEM difficult. One solution is to integrate microscopy with edge computing, in which moderately powerful computational hardware performs preliminary analysis before handing off heavier computation to high-performance computing (HPC) systems. Here we trace the roots of computation in modern electron microscopy, demonstrate deep learning experiments running on an edge system, and discuss the networking requirements for tying together microscopes, edge computers, and HPC systems.