We describe an approach to 2D-to-3D video conversion for stereoscopic displays. Targeting the problem of synthesizing the frames of a virtual "right view" from the original monocular 2D video, we generate the stereoscopic video in the following steps. (1) A 2.5D depth map is first estimated in a multi-cue fusion manner by leveraging motion cues and photometric cues in the video frames, together with a depth prior of spatial and temporal smoothness. (2) The depth map is converted to a disparity map, taking into account both the size of the display device and the constraints of human stereoscopic visual perception. (3) We fix the original 2D frames as the "left view" and warp them to virtually viewed right ones according to the predicted disparity values. The main contribution of this method is combining motion and photometric cues to estimate the depth map. In the experiments, we apply our method to converting several clips of well-known films into stereoscopic 3D video and obtain good results.
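The depth-to-disparity conversion and disparity-based warping described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the normalization, the `max_disp` cap standing in for the display/perception constraints, and the nearest-pixel forward warp (which leaves unfilled holes) are all simplifying assumptions.

```python
import numpy as np

def depth_to_disparity(depth, max_disp=20.0):
    """Map a depth map to disparity, capped at max_disp pixels.

    Assumes larger depth values mean farther from the camera, so far
    pixels receive small disparity and near pixels large disparity.
    max_disp is a hypothetical stand-in for the display-size and
    perceptual-comfort limits discussed in the abstract.
    """
    d = (depth - depth.min()) / (depth.max() - depth.min() + 1e-8)
    return max_disp * (1.0 - d)

def warp_right_view(left, disparity):
    """Forward-warp the left view to a virtual right view.

    Each pixel is shifted left by its disparity (nearest-pixel
    rounding); occluded/unseen regions remain zero (holes).
    """
    h, w = left.shape[:2]
    xs = np.arange(w)
    right = np.zeros_like(left)
    for y in range(h):
        tx = np.clip((xs - disparity[y]).round().astype(int), 0, w - 1)
        right[y, tx] = left[y, xs]
    return right
```

A real system would follow the warp with hole filling (inpainting) around depth discontinuities.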
With the recent boom of the 3DTV industry, more and more stereoscopic videos are demanded by the market. This paper presents a system for converting conventional monocular videos into stereoscopic ones. In this system, an input video is first segmented into shots to reduce operations on similar frames. Then, automatic depth estimation and interactive image segmentation are integrated to obtain depth maps and foreground/background segments on selected key frames. Within each video shot, these results are propagated from key frames to non-key frames. Combined with a depth-to-disparity conversion method, the system synthesizes the counterpart (either left or right) view for stereoscopic display by warping the original frame according to the disparity maps. For evaluation, we use human-labeled depth maps as the reference and compute both the mean opinion score (MOS) and the peak signal-to-noise ratio (PSNR) to assess the converted video quality. Experimental results demonstrate that the proposed conversion system and methods achieve encouraging performance.
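Of the two quality measures named above, MOS is a subjective viewer rating, while PSNR can be computed directly from the reference and converted depth maps. A minimal sketch of the PSNR computation, assuming 8-bit maps (peak value 255):

```python
import numpy as np

def psnr(reference, test, peak=255.0):
    """Peak signal-to-noise ratio between a reference map and a test map.

    PSNR = 10 * log10(peak^2 / MSE); identical inputs give infinity.
    """
    mse = np.mean((reference.astype(np.float64) - test.astype(np.float64)) ** 2)
    if mse == 0:
        return float("inf")
    return 10.0 * np.log10(peak ** 2 / mse)
```

Higher PSNR indicates the converted result is closer to the human-labeled reference.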
Although inhibiting EGFR-mediated signaling has proved effective in treating certain types of cancers, resistance mechanisms evolve quickly, either restoring EGFR signaling or activating an alternative pathway that drives the proliferation and survival of malignant cells; this limits the efficacy and utility of approaches that suppress EGFR functionality. Given that overexpression of EGFR is commonly seen in many cancers, an EGFR-targeting antibody-drug conjugate (ADC) can selectively kill cancer cells independently of blocking EGFR-mediated signaling. Herein, we describe SHR-A1307, a novel anti-EGFR ADC, generated from an anti-EGFR antibody with prolonged half-life and conjugated with a proprietary toxin payload that has an increased index of EGFR targeting-dependent versus EGFR targeting-independent cytotoxicity. SHR-A1307 demonstrated strong and sustained antitumor activities in EGFR-positive tumors harboring different oncogenic mutations in EGFR, KRAS, or PIK3CA. The antitumor efficacy of SHR-A1307 correlated with EGFR expression levels in vitro and in vivo, regardless of the mutation status of EGFR signaling mediators and the resulting resistance to EGFR signaling inhibitors. A cynomolgus monkey toxicology study showed that SHR-A1307 is well tolerated, with a wide therapeutic index. SHR-A1307 is a promising therapeutic option for EGFR-expressing cancers, including those resistant or refractory to EGFR pathway inhibitors.