Retailers have long sought ways to better understand their customers' behavior in order to deliver a smooth and enjoyable shopping experience that attracts more customers and, as a result, maximizes revenue. Humans can grasp the behavior of others by combining visual cues such as activities, gestures, and facial expressions. Enabling computer vision systems to do the same, however, remains an open problem, owing both to intrinsic difficulties and to extrinsic ones such as the scarcity of publicly available data and the variability of in-the-wild environments. In this paper, the authors focus on human activity recognition, which is the first and most important cue in behavior analysis. To this end, they present an approach that integrates human pose and object motion to detect and classify actions in both the spatial and temporal domains. The authors achieve state-of-the-art results on the MERL Shopping dataset and demonstrate the capabilities of the proposed technique.
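The abstract does not spell out how the pose and object-motion streams are combined, but a common way to realize such a fusion is to embed per-frame pose keypoints and motion descriptors separately, concatenate them, and run a temporal model over the frame sequence to localize and classify actions. The sketch below illustrates that general idea only; the layer sizes, feature dimensions, class count, and concatenation-based fusion are assumptions for illustration, not the authors' architecture.

```python
# Minimal sketch (not the paper's implementation): per-frame fusion of assumed
# human-pose keypoints and object-motion (e.g. optical-flow) features, followed
# by a BiLSTM that scores every frame, enabling temporal action localization.
import torch
import torch.nn as nn

NUM_CLASSES = 5    # assumed number of shopping actions
POSE_DIM = 34      # assumed: 17 keypoints x (x, y)
MOTION_DIM = 128   # assumed: pooled object-motion descriptor per frame

class PoseMotionActionNet(nn.Module):
    def __init__(self, hidden=256):
        super().__init__()
        # spatial streams: embed pose and motion features independently
        self.pose_fc = nn.Linear(POSE_DIM, hidden)
        self.motion_fc = nn.Linear(MOTION_DIM, hidden)
        # temporal stream: model the fused frame sequence
        self.temporal = nn.LSTM(2 * hidden, hidden,
                                batch_first=True, bidirectional=True)
        self.classifier = nn.Linear(2 * hidden, NUM_CLASSES)

    def forward(self, pose_seq, motion_seq):
        # pose_seq:   (batch, frames, POSE_DIM)
        # motion_seq: (batch, frames, MOTION_DIM)
        fused = torch.cat([self.pose_fc(pose_seq),
                           self.motion_fc(motion_seq)], dim=-1)
        out, _ = self.temporal(fused)
        # per-frame class scores: spatial cues fused, temporal context modeled
        return self.classifier(out)

# usage with random tensors standing in for real extracted features
model = PoseMotionActionNet()
scores = model(torch.randn(2, 64, POSE_DIM), torch.randn(2, 64, MOTION_DIM))
print(scores.shape)  # (2, 64, NUM_CLASSES)
```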