Proceedings of the SIGCHI Conference on Human Factors in Computing Systems 2014
DOI: 10.1145/2556288.2557116
RecoFit

Abstract: Although numerous devices exist to track and share exercise routines based on running and walking, these devices offer limited functionality for strength-training exercises. We introduce RecoFit, a system for automatically tracking repetitive exercises, such as weight training and calisthenics, via an arm-worn inertial sensor. Our goal is to provide real-time and post-workout feedback, with no user-specific training and no intervention during a workout. Toward this end, we address three challenges: (1) segment…


Cited by 160 publications (28 citation statements)
References 35 publications
“…As the reader might have perceived, the system bears some resemblance to human activity recognition (HAR) systems (Kumari et al., 2021; Morris et al., 2014; Muaaz et al., 2021) (mainly in the real-time signal processing phase), since the information extracted from a microphone could also be extracted from an HAR device such as an accelerometer. The main difference between this kind of system and the one described in this section is that HAR systems, as their name implies, gravitate around a person doing an activity, while in our case, the central reference point is a robot.…”
Section: The Acoustic Touch Recognition System
Mentioning; confidence: 99%
“…New technologies such as 3D printing and other fabrication options enable embedding of sensors in a variety of materials and objects [63], which could have implications for accessibility (e.g., smart prosthetics [33]). Wearable technologies embed sensors directly into clothing or accessories, often with the goal of augmenting cognition (e.g., Google Glass's heads-up display) [58,62] or health tracking (e.g., many smart watch systems [17,42], some of which researchers have explored making more accessible [13,16]); Carrington et al have also explored expanding the concept of wearables to augment mobility aids [12]. Improved voice-based sensors are increasingly available as smart-speakers or phone-based virtual assistants (a category of sensor that may be particularly of interest to people with disabilities [10,46,59]).…”
Section: Ubiquitous Computing
Mentioning; confidence: 99%
“…Human activity recognition (HAR) is an important sub-field of human-computer interaction (HCI), with a rich body of work. HAR covers a broad range of tasks, including distinguishing activity from non-activity [5,6], activity classification [7] and repetition counting [5,6,8,9,10]. These tasks are interesting by themselves from a research perspective, but also have a wide range of potential real world applications, especially in the fields of healthcare and personal fitness.…”
Section: Related Work
Mentioning; confidence: 99%
“…Similar to [9], they also used auto-correlation for repetition counting. Morris et al. [6] built on and improved the aforementioned works. They used an arm-mounted inertial measurement unit and achieved up to 99% recognition accuracy across 14 gym exercises (plus walking and running).…”
Section: Related Work
Mentioning; confidence: 99%
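The auto-correlation approach to repetition counting mentioned in the citation statements above can be sketched as follows. This is an illustrative toy example, not RecoFit's actual pipeline: the function name, lag bounds, and synthetic signal are all assumptions for demonstration.

```python
import numpy as np

def count_repetitions(signal, min_lag, max_lag):
    """Estimate the number of repetitions in a 1-D motion signal.

    Illustrative autocorrelation sketch: find the dominant period
    (the lag of the highest autocorrelation peak within
    [min_lag, max_lag)) and divide the signal length by it.
    """
    x = np.asarray(signal, dtype=float)
    x = x - x.mean()                                   # remove DC offset
    ac = np.correlate(x, x, mode="full")[x.size - 1:]  # keep non-negative lags
    ac = ac / ac[0]                                    # normalize lag 0 to 1
    period = min_lag + int(np.argmax(ac[min_lag:max_lag]))
    return int(round(x.size / period))

# Hypothetical demo: 10 repetitions at 1 Hz, sampled at 50 Hz, plus noise.
fs = 50
t = np.arange(0, 10.0, 1 / fs)
sig = np.sin(2 * np.pi * 1.0 * t) \
    + 0.1 * np.random.default_rng(0).normal(size=t.size)
print(count_repetitions(sig, min_lag=fs // 2, max_lag=2 * fs))
```

Because repetitive exercise motion is roughly periodic, the autocorrelation shows peaks at multiples of the repetition period; real systems, per the cited work, would first segment exercise from non-exercise and classify the exercise before counting.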