Recently, there has been a lot of interest in building compact models for video classification which have a small memory footprint (< 1 GB) [15]. While these models are compact, they typically operate by repeatedly applying a small weight matrix to all the frames in a video. For example, recurrent neural network based methods compute a hidden state for every frame of the video using a recurrent weight matrix. Similarly, cluster-and-aggregate based methods such as NetVLAD have a learnable clustering matrix which is used to assign soft clusters to every frame in the video. Since these models look at every frame in the video, the number of floating point operations (FLOPs) is still large even though the memory footprint is small. In this work, we focus on building compute-efficient video classification models which process fewer frames and hence require fewer FLOPs. Similar to memory-efficient models, we use the idea of distillation, albeit in a different setting. Specifically, in our case, a compute-heavy teacher which looks at all the frames in the video is used to train a compute-efficient student which looks at only a small fraction of frames in the video. This is in contrast to a typical memory-efficient teacher-student setting, wherein both the teacher and the student look at all the frames in the video but the student has fewer parameters. Our work thus complements the research on memory-efficient video classification. We do an extensive evaluation with three types of models for video classification, viz., (i) recurrent models, (ii) cluster-and-aggregate models, and (iii) memory-efficient cluster-and-aggregate models, and show that in each of these cases a see-it-all teacher can be used to train a compute-efficient see-very-little student. Overall, we show that the proposed student network can reduce the inference time by 30% and the number of FLOPs by approximately 90% with a negligible drop in performance.
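To make the teacher-student setup concrete, below is a minimal PyTorch sketch of the idea described in this abstract for the recurrent-model case: the teacher processes all frames while the student processes only a subsampled fraction, and the student is trained against both the true labels and the teacher's soft outputs. The class and function names, feature dimensions, and hyperparameters are illustrative assumptions, not taken from the paper's implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class GRUClassifier(nn.Module):
    """Recurrent video classifier: per-frame features -> GRU -> class logits."""
    def __init__(self, feat_dim=1024, hidden=512, num_classes=400):
        super().__init__()
        self.gru = nn.GRU(feat_dim, hidden, batch_first=True)
        self.head = nn.Linear(hidden, num_classes)

    def forward(self, frame_feats):            # frame_feats: (B, T, feat_dim)
        _, h = self.gru(frame_feats)           # h: (1, B, hidden)
        return self.head(h.squeeze(0))         # (B, num_classes)

def distillation_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
    """Blend cross-entropy on true labels with KL divergence to the teacher's soft targets."""
    soft = F.kl_div(F.log_softmax(student_logits / T, dim=-1),
                    F.softmax(teacher_logits / T, dim=-1),
                    reduction="batchmean") * (T * T)
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1 - alpha) * hard

# Training-step sketch: the teacher sees all frames, the student sees every k-th frame.
teacher, student = GRUClassifier(), GRUClassifier()
feats = torch.randn(8, 300, 1024)              # 8 videos, 300 precomputed frame features each
labels = torch.randint(0, 400, (8,))
k = 10                                         # student processes ~10% of the frames

with torch.no_grad():
    teacher_logits = teacher(feats)            # compute-heavy: all frames
student_logits = student(feats[:, ::k])        # compute-efficient: subsampled frames
loss = distillation_loss(student_logits, teacher_logits, labels)
loss.backward()
```

The FLOP savings in this sketch come entirely from the student's shorter input sequence; uniform subsampling is used here only for illustration of the see-it-all versus see-very-little contrast.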
A large library of diversified compounds (pyrano[2,3-d]pyrimidines, pyrido[2,3-d]pyrimidines and a variety of spirooxindoles) was synthesized through a highly efficient, economical and environmentally benign process utilizing magnetic nanoparticles. Ease of recovery using an external magnetic field is an additional eco-friendly attribute of this catalytic system. Most of the compounds are new and could therefore be further explored for pharmaceutical applications. Moreover, column chromatography and recrystallisation of the products are not required, as the crude products are already highly pure and can hence be used for target-oriented synthesis on a wide scale.
A new protocol for the N-arylation of aryl halides with anilines using Cu nanoparticles in polyethylene glycol (PEG) as an efficient and reusable catalytic system has been developed. The reaction did not require any cocatalyst. Various solvents were screened, and PEG400 provided the best results. The studies showed that the mechanism of catalytic action depends on the size of the nanoparticles. The Cu nanoparticles and PEG were recyclable and retained their activity. The newly developed protocol was also found to be suitable for the cross-coupling of N-H heterocycles with iodobenzene. The present methodology offers several advantages, such as excellent yields, short reaction times, and mild reaction conditions.