The alignment of a stack of 2D microscopic images into a 3D image volume is an indispensable step in serial section electron microscopy (EM), as it restores the original 3D integrity of biological tissue destroyed by chemical fixation and physical sectioning. However, because neural images exhibit similar textures within a section and complex variations between sections, previous registration methods often fail to yield reliable correspondences, leading to misalignment and impeding restoration of the z-axis anatomical continuity of the neuron volume. In this article, inspired by how humans find correspondences by exploiting the topological relationships of image contents, we developed a spatial attention-based registration method for serial EM images that improves registration accuracy. Our approach combines the U-Net framework with spatial transformer networks (STN) to regress the corresponding transformation maps in an unsupervised training fashion. A spatial attention (SA) module is incorporated into the U-Net architecture to increase the distinctiveness of image features by modeling their topological relationships. Experiments were conducted on both simulated and real data sets (MAS and Reg-Cremi). Quantitative and qualitative comparisons demonstrate that our approach achieves state-of-the-art accuracy (evaluated by NCC, SSIM, Dice, and landmark error) and provides smooth, reliable transformations with less texture blur and fewer unclear boundaries than existing techniques. Our method can restore image stacks for the visualization and quantitative analysis of EM image sequences.
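The abstract names the main ingredients (a U-Net-style field regressor, a spatial attention module, an STN warp, and unsupervised similarity-plus-smoothness training) without giving code. Below is a minimal PyTorch sketch of that general recipe, not the authors' implementation: all layer sizes and names are illustrative, the attention gate follows the common CBAM-style design, and MSE stands in for the NCC similarity used in the paper's evaluation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SpatialAttention(nn.Module):
    """CBAM-style spatial attention: reweight features with a per-pixel
    mask computed from channel-wise average and max pooling."""
    def __init__(self):
        super().__init__()
        self.conv = nn.Conv2d(2, 1, kernel_size=7, padding=3)

    def forward(self, x):
        avg = x.mean(dim=1, keepdim=True)
        mx = x.amax(dim=1, keepdim=True)
        mask = torch.sigmoid(self.conv(torch.cat([avg, mx], dim=1)))
        return x * mask

class RegNet(nn.Module):
    """Tiny U-Net-like regressor: a (moving, fixed) section pair in,
    a dense 2-channel displacement field out, attention at the bottleneck."""
    def __init__(self, ch=16):
        super().__init__()
        self.enc = nn.Sequential(nn.Conv2d(2, ch, 3, padding=1), nn.ReLU())
        self.down = nn.Sequential(
            nn.Conv2d(ch, 2 * ch, 3, stride=2, padding=1), nn.ReLU())
        self.sa = SpatialAttention()
        self.up = nn.Sequential(
            nn.ConvTranspose2d(2 * ch, ch, 2, stride=2), nn.ReLU())
        self.head = nn.Conv2d(2 * ch, 2, 3, padding=1)

    def forward(self, moving, fixed):
        e = self.enc(torch.cat([moving, fixed], dim=1))
        b = self.sa(self.down(e))                    # attended bottleneck
        d = self.up(b)
        return self.head(torch.cat([d, e], dim=1))  # U-Net skip connection

def warp(img, flow):
    """STN-style differentiable warp: add the predicted displacement
    (in normalized [-1, 1] coordinates, channel 0 = x, channel 1 = y)
    to an identity grid and bilinearly resample the moving image."""
    n, _, h, w = img.shape
    ys, xs = torch.meshgrid(
        torch.linspace(-1, 1, h, device=img.device),
        torch.linspace(-1, 1, w, device=img.device), indexing="ij")
    grid = torch.stack([xs, ys], dim=-1).expand(n, h, w, 2)
    return F.grid_sample(img, grid + flow.permute(0, 2, 3, 1),
                         align_corners=True)

def unsupervised_loss(warped, fixed, flow, lam=0.1):
    """Photometric similarity plus first-order smoothness on the field;
    MSE is a stand-in here for an NCC similarity term."""
    sim = F.mse_loss(warped, fixed)
    dy = flow[:, :, 1:, :] - flow[:, :, :-1, :]
    dx = flow[:, :, :, 1:] - flow[:, :, :, :-1]
    return sim + lam * ((dx ** 2).mean() + (dy ** 2).mean())

# One training step on a random section pair.
net = RegNet()
moving, fixed = torch.rand(1, 1, 64, 64), torch.rand(1, 1, 64, 64)
flow = net(moving, fixed)
loss = unsupervised_loss(warp(moving, flow), fixed, flow)
loss.backward()
```

The warp is what makes training unsupervised: because `grid_sample` is differentiable, the photometric loss on the warped moving section backpropagates through the predicted displacement field without any ground-truth correspondences.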
How neurons connect and thereby enable us to think remains a mystery, and volume reconstruction from series of brain microscopy sections is a vital technique for determining this connectivity. Image registration is a key component: its aim is to estimate the deformation field between two images. Current methods regress the deformation field directly; however, this task is very challenging, and it is common to trade computational complexity for precision by designing complex models for deformation field estimation. This approach is inefficient and leads to long inference times. In this paper, we argue that complex models are not necessary and resolve this dilemma by proposing a dual-network architecture. We divide the deformation field prediction problem into two relatively simple subproblems and solve each on one branch of the proposed dual network. The two subproblems have completely opposite properties, which we fully exploit to simplify the design of each branch. These simple architectures enable high-speed image registration. The two branches work together and compensate for each other's drawbacks, so no accuracy is lost even though the architectures are simple. Furthermore, we introduce a series of loss functions that enable joint training of the two networks in an unsupervised manner, without costly manual annotations. Experimental results show that our method outperforms state-of-the-art methods on fly brain electron microscopy image registration tasks, and ablation studies provide a comprehensive understanding of each component of our network.
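The abstract does not state what the two "opposite" subproblems are, so any concrete split is a guess. As one hedged illustration only, the sketch below assumes a common coarse-plus-residual decomposition: a heavily downsampled branch captures the smooth global deformation while a shallow full-resolution branch adds local detail, and differently weighted smoothness priors let the two branches be trained jointly without supervision (the warp and training step would mirror the previous sketch).

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class DualRegNet(nn.Module):
    """Hypothetical reading of the dual-branch idea: branch A predicts a
    smooth low-resolution field (global trend) and branch B predicts a
    full-resolution residual. The abstract does not specify the actual
    subproblem split, so this additive composition is only illustrative."""
    def __init__(self, ch=16):
        super().__init__()
        self.coarse = nn.Sequential(               # branch A: cheap, global
            nn.Conv2d(2, ch, 3, stride=4, padding=1), nn.ReLU(),
            nn.Conv2d(ch, 2, 3, padding=1))
        self.fine = nn.Sequential(                 # branch B: shallow, local
            nn.Conv2d(2, ch, 3, padding=1), nn.ReLU(),
            nn.Conv2d(ch, 2, 3, padding=1))

    def forward(self, moving, fixed):
        x = torch.cat([moving, fixed], dim=1)
        coarse = F.interpolate(self.coarse(x), size=x.shape[-2:],
                               mode="bilinear", align_corners=True)
        fine = self.fine(x)
        return coarse + fine, coarse, fine         # composed + per-branch fields

def grad_penalty(f):
    """Mean squared spatial gradient of a 2-channel field."""
    dy = f[:, :, 1:, :] - f[:, :, :-1, :]
    dx = f[:, :, :, 1:] - f[:, :, :, :-1]
    return (dx ** 2).mean() + (dy ** 2).mean()

def joint_loss(warped, fixed, coarse, fine, lam=1.0, mu=0.1):
    """Illustrative joint objective: one similarity term on the composed
    warp, a heavy smoothness prior on the coarse branch and a lighter one
    on the residual, so both branches train together unsupervised."""
    return (F.mse_loss(warped, fixed)
            + lam * grad_penalty(coarse) + mu * grad_penalty(fine))
```

The design point this illustrates is that each branch can stay shallow because it only has to solve its own easy subproblem; the asymmetric regularization weights are what push the global/local division of labor during joint training.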