In this study we report a convergence of behavioural and neuroanatomical evidence in support of an amygdala hypothesis of autism. We find that people with high-functioning autism (HFA) show neuropsychological profiles characteristic of the effects of amygdala damage, in particular selective impairment in the recognition of facial expressions of fear, perception of eye-gaze direction, and recognition memory for faces. Using quantitative magnetic resonance (MR) image analysis techniques, we find that the same individuals also show abnormalities of medial temporal lobe (MTL) brain structure, notably bilaterally enlarged amygdala volumes. These results combine to suggest that developmental malformation of the amygdala may underlie the social-cognitive impairments characteristic of HFA. This malformation may reflect incomplete neuronal pruning in early development.
Three-dimensional (3D) reconstruction and examination of tissue at microscopic resolution have significant potential to enhance the study of both normal and disease processes, particularly those involving structural changes or those in which the spatial relationship of disease features is important. Although other methods exist for studying tissue in 3D, an approach based on routinely prepared histological sections has significant advantages because it allows conventional histopathological staining and interpretation techniques to be used. Until now, 3D histological reconstruction has not been routine in research because of the technical difficulty of constructing 3D tissue models. We describe a novel system for 3D histological reconstruction, integrating whole-slide imaging (virtual slides), image serving, registration, and visualization into one user-friendly package. It produces high-resolution 3D reconstructions with minimal user interaction and can be used in a histopathological laboratory without input from computing specialists. Slice-to-slice alignment is performed with automatic registration algorithms custom designed for virtual slides and histopathological images. The system has been applied to >300 separate 3D volumes from eight different tissue types, using a total of 5500 virtual slides comprising 1.45 TB of primary image data. Qualitative and quantitative metrics for the accuracy of 3D reconstruction are provided, with measured registration accuracy approaching 120 μm for a 1-cm piece of tissue. Both 3D tissue volumes and generated 3D models are presented for four demonstrator cases.
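The pairwise alignment step at the heart of such a pipeline can be illustrated with a minimal sketch. This is a generic example using phase correlation on a synthetic image pair, not the custom block-matching algorithms the system itself uses:

```python
import numpy as np

def phase_correlate(fixed: np.ndarray, moving: np.ndarray):
    """Estimate the (dy, dx) integer shift that maps `moving` back onto `fixed`."""
    F = np.fft.fft2(fixed)
    M = np.fft.fft2(moving)
    cross = F * np.conj(M)
    cross /= np.abs(cross) + 1e-12           # keep only phase information
    corr = np.fft.ifft2(cross).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # Wrap peaks in the upper half of the array to negative shifts.
    if dy > fixed.shape[0] // 2:
        dy -= fixed.shape[0]
    if dx > fixed.shape[1] // 2:
        dx -= fixed.shape[1]
    return int(dy), int(dx)

# Synthetic "serial sections": slice1 is slice0 circularly shifted by (5, -3).
rng = np.random.default_rng(0)
slice0 = rng.random((128, 128))
slice1 = np.roll(slice0, shift=(5, -3), axis=(0, 1))
print(phase_correlate(slice0, slice1))       # -> (-5, 3): the shift that undoes the offset
```

A real pipeline operates on heavily downsampled virtual slides, chains these pairwise transforms through the whole stack, and refines the result with nonrigid stages.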
Various algorithms for comparing 2D nuclear magnetic resonance (NMR) spectra have been explored for their ability to dereplicate natural products and to determine molecular structures. However, spectroscopic artefacts, solvent effects, and the interactive effects of functional groups on chemical shifts combine to hinder their effectiveness. Here, we leveraged Non-Uniform Sampling (NUS) 2D NMR techniques and deep Convolutional Neural Networks (CNNs) to create a tool, SMART, that can assist natural products discovery efforts. First, an NUS heteronuclear single quantum coherence (HSQC) pulse sequence was adapted to a state-of-the-art NMR instrument and its data reconstruction methods were optimized; second, a deep CNN with contrastive loss was trained on a database of over 2,054 HSQC spectra. To demonstrate the utility of SMART, several newly isolated compounds were automatically located alongside their known analogues in the embedded clustering space, thereby streamlining the discovery pipeline for new natural products.
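A contrastive loss of the kind used to train such an embedding network pulls spectra of the same compound together and pushes different compounds at least a margin apart. The following is a generic Hadsell-style sketch on toy embedding vectors, not the SMART training code, and the margin value is our choice:

```python
import numpy as np

def contrastive_loss(e1: np.ndarray, e2: np.ndarray, same: bool,
                     margin: float = 1.0) -> float:
    """Contrastive loss on one pair of embeddings: penalize distance for
    same-class pairs, and penalize closeness (within `margin`) otherwise."""
    d = np.linalg.norm(e1 - e2)
    if same:
        return float(d ** 2)                 # pull matching pairs together
    return float(max(0.0, margin - d) ** 2)  # push mismatches past the margin

# Toy embeddings at Euclidean distance 5.
a = np.array([0.0, 0.0])
b = np.array([3.0, 4.0])
print(contrastive_loss(a, b, same=True))     # -> 25.0 (large: matched pair far apart)
print(contrastive_loss(a, b, same=False))    # -> 0.0  (already beyond the margin)
```

Trained with this objective, the network maps each HSQC spectrum to a point in an embedding space where nearest neighbours tend to be structural analogues, which is what makes the clustering-based lookup possible.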
Light microscopy applied to the domain of histopathology has traditionally been a two-dimensional imaging modality. Several authors, including the authors of this work, have extended the use of digital microscopy to three dimensions by stacking digital images of serial sections using image-based registration. In this paper, we give an overview of our approach, and of extensions to the approach to register multi-modal data sets such as sets of interleaved histopathology sections with different stains, and sets of histopathology images to radiology volumes with very different appearance. Our approach involves transforming dissimilar images into a multi-channel representation derived from co-occurrence statistics between roughly aligned images.
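As a rough sketch of the co-occurrence idea, assuming toy images and a simple intensity quantization (the representation used in this work may differ), a multi-channel encoding can be derived from the joint histogram of a roughly aligned pair:

```python
import numpy as np

def cooccurrence_channels(a: np.ndarray, b: np.ndarray, bins: int = 4) -> np.ndarray:
    """Re-express image `a` as `bins` channels: channel j at a pixel holds the
    estimated probability that `b` falls in bin j given `a`'s bin at that pixel,
    learned from the roughly aligned pair (a, b)."""
    qa = np.clip((a * bins).astype(int), 0, bins - 1)   # quantize intensities
    qb = np.clip((b * bins).astype(int), 0, bins - 1)
    H = np.zeros((bins, bins))
    np.add.at(H, (qa.ravel(), qb.ravel()), 1)           # joint co-occurrence counts
    H /= H.sum(axis=1, keepdims=True) + 1e-12           # rows -> conditional probs
    return H[qa]                                         # shape (H, W, bins)

# Two toy "stains": b is a monotone recoloring of a, so structure matches even
# though raw intensities disagree everywhere.
rng = np.random.default_rng(0)
a = rng.random((32, 32))
b = 1.0 - a
ch = cooccurrence_channels(a, b)
print(ch.shape)   # -> (32, 32, 4)
```

Because the channels are built from the statistics of the pair itself, two images with very different raw appearance end up in a representation where standard similarity measures become meaningful.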
Registration of histopathology images of consecutive tissue sections stained with different histochemical or immunohistochemical stains is an important step in a number of application areas, such as the investigation of the pathology of a disease, validation of MRI sequences against tissue images, and multiscale physical modeling. In each case, information from each stain needs to be spatially aligned and combined to ascertain physical or functional properties of the tissue. However, in addition to the gigabyte-sized images and nonrigid distortions present in the tissue, a major challenge in registering differently stained histology image pairs is their dissimilar structural appearance, because different stains highlight different substances in the tissue. In this paper, we address this challenge by developing an unsupervised content classification method that generates multichannel probability images from a roughly aligned image pair. Each channel corresponds to one automatically identified content class. The probability images enhance the structural similarity between image pairs. By integrating the classification method into a multiresolution block-matching-based nonrigid registration scheme (N. Roberts, D. Magee, Y. Song, K. Brabazon, M. Shires, D. Crellin, N. Orsi, P. Quirke, and D. Treanor, "Toward routine use of 3D histopathology as a research tool," Amer. J. Pathology, vol. 180, no. 5, 2012), we improve the performance of registering multistained histology images. Evaluation was conducted on 77 histological image pairs taken from three liver specimens and one intervertebral disc specimen. In total, six types of histochemical stains were tested. We evaluated our method against the same registration scheme without the classification algorithm (intensity-based registration) and against state-of-the-art mutual-information-based registration. Superior results are obtained with the proposed method.
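To illustrate the content-classification step, here is a minimal sketch that clusters pixel intensities with plain k-means and emits soft class-membership probabilities as channels. The classifier, features, and class count in the paper itself differ; this only shows the shape of the output representation:

```python
import numpy as np

def probability_channels(img: np.ndarray, K: int = 3, iters: int = 20,
                         beta: float = 50.0) -> np.ndarray:
    """Return an (H, W, K) stack of soft class-membership probabilities,
    one channel per automatically identified intensity class."""
    x = img.reshape(-1, 1).astype(float)
    rng = np.random.default_rng(0)
    centers = rng.choice(x.ravel(), size=K, replace=False)
    for _ in range(iters):                           # plain k-means on intensity
        labels = np.argmin(np.abs(x - centers), axis=1)
        centers = np.array([x[labels == k].mean() if np.any(labels == k)
                            else centers[k] for k in range(K)])
    d2 = (x - centers) ** 2                          # soft assignment: softmax of
    p = np.exp(-beta * d2)                           # negative squared distance
    p /= p.sum(axis=1, keepdims=True)
    return p.reshape(img.shape + (K,))

# Toy "section" with three intensity populations standing in for tissue classes.
rng = np.random.default_rng(1)
img = np.concatenate([rng.normal(m, 0.02, 300)
                      for m in (0.2, 0.5, 0.8)]).reshape(30, 30)
P = probability_channels(img, K=3)
print(P.shape)                            # -> (30, 30, 3)
print(bool(np.allclose(P.sum(axis=2), 1.0)))    # -> True: probabilities per pixel
```

Running the same classification on each differently stained section yields probability images that look alike wherever the underlying tissue content matches, which is what lets an intensity-style block-matching registration succeed across stains.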