2016
DOI: 10.1103/PhysRevE.93.053302

Expansion-maximization-compression algorithm with spherical harmonics for single particle imaging with x-ray lasers

Abstract: In 3D single particle imaging with X-ray free-electron lasers, particle orientation is not recorded during measurement but is instead recovered as a necessary step in the reconstruction of a 3D image from the diffraction data. Here we use harmonic analysis on the sphere to cleanly separate the angular and radial degrees of freedom of this problem, providing new opportunities to efficiently use data and computational resources. We develop the Expansion-Maximization-Compression algorithm into a shell-by-shell ap…
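The separation of angular and radial degrees of freedom that the abstract describes can be illustrated with a toy expansion of a single intensity shell in spherical harmonics. The sketch below is purely illustrative and is not the paper's implementation: it hand-codes only the zonal (azimuth-independent) harmonics Y_l0 up to l = 2, builds a band-limited test "shell," and recovers its expansion coefficients by Gauss-Legendre quadrature. The band limit, quadrature size, and coefficient values are arbitrary choices for the demonstration.

```python
import numpy as np

# Real zonal spherical harmonics Y_l0 (azimuth-independent), hand-coded
# for l = 0..2; ct is cos(polar angle). Normalized so that the integral
# of |Y_l0|^2 over the sphere equals 1.
def Y(l, ct):
    if l == 0:
        return np.full_like(ct, np.sqrt(1.0 / (4.0 * np.pi)))
    if l == 1:
        return np.sqrt(3.0 / (4.0 * np.pi)) * ct
    if l == 2:
        return np.sqrt(5.0 / (16.0 * np.pi)) * (3.0 * ct**2 - 1.0)

# Gauss-Legendre nodes/weights in cos(theta); for azimuth-independent
# integrands the azimuthal integral contributes a factor of 2*pi.
ct, w = np.polynomial.legendre.leggauss(8)

# A band-limited "shell": known combination of harmonics (coefficients
# 1.5 and 0.25 are arbitrary for this sketch).
f = 1.5 * Y(0, ct) + 0.25 * Y(2, ct)

# Project back onto Y_l0: c_l = integral of f * Y_l0 over the sphere.
coeff = lambda l: 2.0 * np.pi * np.sum(w * f * Y(l, ct))

print(coeff(0))  # ≈ 1.5
print(coeff(1))  # ≈ 0.0
print(coeff(2))  # ≈ 0.25
```

Because the quadrature is exact for the polynomial degrees involved, the input coefficients are recovered to machine precision; on each shell, the angular content reduces to a small coefficient vector, which is the payoff of the shell-by-shell formulation.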

Cited by 11 publications (11 citation statements)
References 32 publications
“…Classification work includes manifold mapping (6), spectral clustering (7), principal component analysis, and support vector machines (8). Orientation methods include common curve approaches (9-12), expectation maximization (13-15), and manifold embedding (16-19). Once images are classified, oriented, and assembled into a 3D intensity function, iterative phasing (20) is often used to determine molecular structure.…”
mentioning
confidence: 99%
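The expectation-maximization route to orientation recovery mentioned in the statement above can be illustrated on a 1D toy problem: Poisson-noisy "patterns" are observed at unknown cyclic shifts, which stand in for unknown particle orientations, and an expand-maximize-compress loop recovers the underlying signal up to a global shift. This is a hedged sketch of the general EM idea, not any cited paper's code; the grid size, pattern count, photon levels, and iteration count are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)

# Ground truth "intensity" on a ring of n pixels (toy stand-in for a shell).
n = 16
truth = 5.0 + 5.0 * np.sin(2.0 * np.pi * np.arange(n) / n) ** 2

# Patterns: Poisson counts of the truth at unknown cyclic shifts.
shifts = rng.integers(0, n, size=400)
data = rng.poisson(np.stack([np.roll(truth, s) for s in shifts])).astype(float)

# Slightly perturbed flat start (an exactly flat start is a fixed point).
model = data.mean() * (1.0 + 0.01 * rng.random(n))

for _ in range(50):
    # Expansion: the model viewed in every candidate orientation.
    views = np.stack([np.roll(model, s) for s in range(n)])        # (n, n)
    # Maximization: posterior over orientations from Poisson log-likelihood.
    loglik = data @ np.log(views).T - views.sum(axis=1)            # (400, n)
    post = np.exp(loglik - loglik.max(axis=1, keepdims=True))
    post /= post.sum(axis=1, keepdims=True)
    # Compression: posterior-weighted average of the data, rotated back.
    new_views = (post.T @ data) / post.sum(axis=0)[:, None]        # (n, n)
    model = np.mean([np.roll(new_views[s], -s) for s in range(n)], axis=0)

# The model is recovered only up to a global shift; align before comparing.
err = min(np.abs(np.roll(model, s) - truth).mean() for s in range(n))
```

The three steps map directly onto the expansion-maximization-compression structure: real algorithms replace the cyclic shifts with 3D rotations and the ring with an intensity volume, but the probabilistic bookkeeping is the same.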
“…One group of methods to orient the diffraction patterns in SPI relies on the information in the common intersection curves of the patterns (Huldt et al., 2003; Shneerson et al., 2008; Bortel & Tegze, 2011; Yefanov & Vartanyants, 2013; Zhou et al., 2014). Other methods find the possible orientations of the patterns by comparing them with a 3D intensity model updated at every iteration (Loh & Elser, 2009; Tegze & Bortel, 2012; Flamant et al., 2016; Nakano et al., 2017, 2018). Another group of methods uses the manifold embedding technique to find the similarities between diffraction patterns and order them in the orientation space (Fung et al., 2009; Giannakis et al., 2012; Kassemeyer et al., 2013; Winter et al., 2016).…”
Section: Introduction
mentioning
confidence: 99%
“…Moreover, the ability to analyze such data sets is sought after by the community in general [22]. For example, for the successful determination of 3D structures from a CDI data set using the expansion-maximization-compression algorithm [22-24], it is necessary to sample the 3D Fourier space up to the Nyquist rate for the desired resolution, and this for all sub-species contained in the target under study. The achievable resolution, as well as the chance for successful convergence of the algorithm, correlates directly with the number of diffraction patterns with a high signal-to-noise ratio [23].…”
Section: Introduction
mentioning
confidence: 99%
“…For example, for the successful determination of 3D structures from a CDI data set using the expansion-maximization-compression algorithm [22-24], it is necessary to sample the 3D Fourier space up to the Nyquist rate for the desired resolution, and this for all sub-species contained in the target under study. The achievable resolution, as well as the chance for successful convergence of the algorithm, correlates directly with the number of diffraction patterns with a high signal-to-noise ratio [23]. Thus, huge data sets are taken and, as a consequence of the sheer amount of data, it is getting increasingly complicated to distill the high-quality data subsets that are suitable for subsequent analysis steps.…”
Section: Introduction
mentioning
confidence: 99%
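The Nyquist-rate requirement quoted in these statements translates into a simple back-of-envelope count of independent (Shannon) voxels that the diffraction data must fill. The sketch below is an illustrative estimate, not a formula from the cited works: it assumes a particle of diameter D sampled at the two-fold-oversampled Fourier interval dq = 1/(2D) out to the resolution edge q_max = 1/d, and counts voxels in the resulting sphere. The example numbers (a 20 nm particle at 1 nm resolution) are arbitrary.

```python
import math

def shannon_voxels(D_nm: float, d_nm: float) -> int:
    """Rough count of Shannon voxels in the resolution sphere.

    Assumptions (illustrative): Fourier sampling at the oversampled
    Nyquist interval dq = 1/(2*D) for a particle of diameter D, data
    extending to q_max = 1/d for target resolution d.
    """
    q_max = 1.0 / d_nm            # resolution-sphere radius, nm^-1
    dq = 1.0 / (2.0 * D_nm)       # Nyquist sampling interval, nm^-1
    r = q_max / dq                # sphere radius in voxel units = 2*D/d
    return round(4.0 / 3.0 * math.pi * r**3)

# 20 nm particle at 1 nm resolution: r = 40 voxel units
print(shannon_voxels(20.0, 1.0))  # → 268083
```

Every one of these voxels must be covered (for every sub-species present) by patterns of adequate signal-to-noise, which is why the quoted statements tie both resolution and convergence directly to the size of the high-quality data set.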