In this paper, we introduce a method to automatically reconstruct the 3D motion of a person interacting with an object from a single RGB video. Our method estimates the 3D poses of the person and the object, contact positions, and the forces and torques exerted by the human limbs. The main contributions of this work are three-fold. First, we introduce an approach to jointly estimate the motion and the actuation forces of the person on the manipulated object by modeling contacts and the dynamics of the interaction. This is cast as a large-scale trajectory optimization problem. Second, we develop a method to automatically recognize, from the input video, the position and timing of contacts between the person and the object or the ground, which significantly reduces the complexity of the optimization. Third, we validate our approach on a recent MoCap dataset with ground-truth contact forces and demonstrate its performance on a new dataset of Internet videos showing people manipulating a variety of tools in unconstrained environments. Please see our project webpage [2] for trained models, data, and code.
Visual localization in large and complex indoor scenes, dominated by weakly textured rooms and repeating geometric patterns, is a challenging problem with high practical relevance for applications such as Augmented Reality and robotics. To handle the ambiguities arising in this scenario, a common strategy is first to generate multiple estimates for the camera pose from which a given query image was taken. The pose with the largest geometric consistency with the query image, e.g., measured by an inlier count, is then selected in a second stage. While a significant amount of research has concentrated on the first stage, there is considerably less work on the second stage. In this paper, we thus focus on pose verification. We show that combining different modalities, namely appearance, geometry, and semantics, considerably boosts pose verification and consequently pose accuracy. We develop multiple hand-crafted approaches as well as a trainable approach to geometric-semantic verification and show significant improvements over the state of the art on a very challenging indoor dataset.
Alzheimer's disease (AD) is characterized by the deposition of misfolded tau and amyloid-beta (Aβ). Tramiprosate (TMP) and its metabolite 3-sulfopropanoic acid (SPA) are phase 3 therapeutics believed to target Aβ oligomers. It is of paramount importance to understand how TMP and SPA modulate the conformations of Aβ. Here, we studied Aβ42 alone and in the presence of TMP or SPA by adaptive-sampling molecular dynamics. Next, to quantify the effects of the drug candidates on Aβ42, we developed a novel Comparative Markov State Analysis (CoVAMPnet) approach: ensembles of learned Markov state models were aligned across different systems based on a solution to an optimal transport problem, and the directional importance of inter-residue distances for assignment to the Markov states was assessed by a discriminative analysis of aggregated neural network gradients. TMP and SPA shifted Aβ42 towards more structured conformations by interacting non-specifically with charged residues and destabilizing salt bridges involved in oligomerization. SPA affected Aβ42 the most, preserving α-helices and suppressing aggregation-prone β-strands. Experimental biophysical analyses showed mild effects of TMP and SPA on Aβ42, with activity enhanced by the endogenous metabolization of TMP into SPA. The CoVAMPnet method is broadly applicable to studying the effects of drug candidates on the conformational behavior of intrinsically disordered biomolecules.
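The cross-system alignment of Markov states described above can be illustrated with a toy sketch. When each state in one system is matched one-to-one with a state in another, the optimal transport problem reduces to an assignment problem over a pairwise cost matrix. The function and variable names below are illustrative, not the authors' actual CoVAMPnet API, and the brute-force search is only feasible for the small numbers of macrostates typical of Markov state models.

```python
# Illustrative sketch (not the authors' code): align Markov states of
# two systems by minimizing the total matching cost between state
# representations, the degenerate one-to-one case of optimal transport.
import itertools
import numpy as np

def align_states(centers_a, centers_b):
    """Match each state in system A to one state in system B by
    brute-force search over permutations of state indices."""
    n = len(centers_a)
    # pairwise Euclidean cost between state representations
    cost = np.linalg.norm(centers_a[:, None, :] - centers_b[None, :, :], axis=-1)
    best_perm, best_cost = None, np.inf
    for perm in itertools.permutations(range(n)):
        c = cost[range(n), perm].sum()
        if c < best_cost:
            best_perm, best_cost = list(perm), c
    return best_perm, best_cost

# toy example: system B's states are a permuted copy of system A's,
# so the optimal matching recovers the permutation at zero cost
a = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
b = np.array([[1.0, 0.0], [0.0, 1.0], [0.0, 0.0]])
perm, total = align_states(a, b)
print(perm)   # state i of A maps to state perm[i] of B -> [2, 0, 1]
print(total)  # -> 0.0
```

In practice, soft optimal-transport formulations allow fractional matchings between states, but the hard-assignment case above conveys the core idea of aligning state labels across independently trained models.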