Big Data technologies and their analytical methods can help improve the quality of education. They can be used to process and analyze classroom video streams to predict student attention, which would greatly improve the learning and teaching experience. With the increasing number of students and the expansion of educational institutions, processing and analyzing video streams in real time becomes a challenging task. In this paper, we review the existing systems for student attention detection, open-source real-time data stream processing technologies, and the two major data stream processing architectures. We also propose a new Big Data architecture for real-time student attention detection.
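To make the real-time processing requirement concrete, the following is a minimal, single-machine Python sketch of the producer/consumer pattern that underlies such a pipeline. The frame format, the `predict_attention` scorer, and the bounded queue are illustrative assumptions only; the distributed stream-processing architecture proposed in the paper is not reproduced here.

```python
import queue
import threading

def predict_attention(frame):
    # Hypothetical placeholder for a model that scores student attention
    # from a single video frame.
    return 0.0

def capture(frames: queue.Queue):
    # In a real deployment this would read frames from classroom cameras
    # (e.g. via a message broker); here we just emit dummy frames.
    for i in range(100):
        frames.put({"frame_id": i, "pixels": None})
    frames.put(None)  # sentinel: end of stream

def process(frames: queue.Queue):
    # Consume frames as they arrive and emit an attention score per frame.
    while (frame := frames.get()) is not None:
        score = predict_attention(frame)
        print(f"frame {frame['frame_id']}: attention={score:.2f}")

if __name__ == "__main__":
    frames = queue.Queue(maxsize=16)  # bounded buffer: back-pressure on the producer
    threading.Thread(target=capture, args=(frames,)).start()
    process(frames)
```

A bounded queue is used here only to illustrate back-pressure between ingestion and analysis, which a production stream-processing engine would handle at cluster scale.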
In the area of object detection, the most important step is the extraction of object features. One of the most widely used approaches relies on Haar-like features together with the Integral Image technique to compute them. The Integral Image technique, used by Viola and Jones, calculates the integral of a rectangular filter over an input picture; this filter is an upright (axis-aligned) rectangle. We propose a method to integrate a rectangle rotated by an arbitrary angle inside an image, based on Bresenham's segment-drawing algorithm. We use certain pixels, called key points, that form the four segments of a rotated rectangle to calculate its Integral Image. Our method focuses on three essential tasks: the first is to determine the rule for drawing a segment (SDR), the second is to identify all the key points of the rectangle r, and the third is to calculate the integral image. The speed of this method depends on the size and the angle of rotation of the rectangle. To demonstrate the efficiency of our idea, we applied it to the rotated Haar-like features that we proposed in a later work [12], whose objective was to improve the Viola and Jones algorithm to detect rotated faces in a given image. We performed tests on widely used image databases, which showed that the application of this technique to rotated Haar-like features improves the performance of detectors for objects in general, and for faces in particular.
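For reference, here is a minimal Python sketch of the standard upright Integral Image technique of Viola and Jones that the rotated-rectangle method builds on: the summed-area table is computed once, and the sum of any axis-aligned rectangle is then obtained with four lookups. Function names are illustrative, and the Bresenham-based key-point computation for rotated rectangles described above is not reproduced here.

```python
import numpy as np

def integral_image(img):
    # Summed-area table: ii[y, x] = sum of img[0..y, 0..x].
    return img.cumsum(axis=0).cumsum(axis=1)

def rect_sum(ii, top, left, height, width):
    # Sum of pixels inside an upright rectangle, in O(1) using four lookups.
    bottom, right = top + height - 1, left + width - 1
    total = ii[bottom, right]
    if top > 0:
        total -= ii[top - 1, right]
    if left > 0:
        total -= ii[bottom, left - 1]
    if top > 0 and left > 0:
        total += ii[top - 1, left - 1]
    return total

if __name__ == "__main__":
    img = np.arange(25, dtype=np.int64).reshape(5, 5)
    ii = integral_image(img)
    # Check against a direct (brute-force) sum of the same 3x2 window.
    assert rect_sum(ii, 1, 2, 3, 2) == img[1:4, 2:4].sum()
```

The constant-time rectangle sum is what makes evaluating many Haar-like features per detection window affordable; the contribution described above extends this idea to rectangles rotated by an arbitrary angle.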
Avoiding detection is vital for the survival of many animals. Factors extrinsic to animals, such as the visual complexity of the background, have been shown to impede their detection. Studies using artificial and natural backgrounds have attributed background complexity to various visual features. One feature that has received less attention is the diversity of color (hue) in the background. We conducted experiments with chickens and artificial backgrounds containing perceptually distinct elements to test whether color and luminance diversity affect detection time. We found that color diversity in the background impeded detection, whereas color diversity in the prey and luminance diversity in the background did not. We also did not find an effect of luminance contrast on detection time. Our study suggests that a prey animal can benefit, in terms of increased detection times by predators, from resting on backgrounds with enhanced color diversity.