2018
DOI: 10.1007/978-3-319-93846-2_14

Where Is the Nurse? Towards Automatically Visualising Meaningful Team Movement in Healthcare Education

Cited by 34 publications (38 citation statements)
References 8 publications
“…In the context of project-based learning, Cukurova, Luckin, Millán, and Mavrikis (2018), and Spikol, Ruffaldi, Dabisias, and Cukurova (2018) collected data from learners' hand movements and head direction to predict their success in open-ended design tasks. In the area of professional development, Echeverria, Martinez-Maldonado, Power, Hayes, and Shum (2018) used sensor data to capture trainee nurses' interactions during healthcare training and created visualisations of their interaction and movements for effective reflections. From the educators' point of view, Prieto, Sharma, Kidzinski, Rodríguez-Triana, and Dillenbourg (2018) recently collected eye-tracking, audiovisual and accelerometer data of educators to help them support the management of classroom activities.…”
mentioning
confidence: 99%
“…For example, authors have used automated video analysis to model students' posture [45] and gestures [1], teacher's walking [10], interactions between teachers and students [1,53] during a lecture, and characterising the types of social interactions of students in makerspaces [15]. Wearable sensors have also been used to track teachers' orchestration tasks by using a combination of sensors (eye tracker, accelerometer, and a camera) [44] and students' mobility strategies while working in teams in the contexts of primary education [48], healthcare simulation [18] and firefighting training [51]. Some work has attempted to close the feedback loop by displaying some positioning traces back to teachers.…”
Section: Spatial Analysis and Positioning Technology In The Classroom
mentioning
confidence: 99%
“…Tracking systems have emerged recently, enabling the automated capture of positioning and proximity traces from authentic classrooms using wearables attached to students' shoes [48], computer-vision [1] and positioning trackers [18]. Some systems even summarise the time a teacher has spent in close proximity to a student or group of students, to raise an alarm if a threshold is reached (e.g.…”
Section: Introduction
mentioning
confidence: 99%
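The proximity-alarm behaviour described in the excerpt above reduces to a simple accumulation over synchronised position traces. Below is a minimal Python sketch of that idea; it is not code from any of the cited systems, and the radius, threshold, sampling rate, and function names are illustrative assumptions.

```python
import math

# Hypothetical sketch: accumulate the time a teacher spends within a
# fixed radius of a student and raise an alert once a threshold is hit.
# Radius, threshold, and sampling period are illustrative assumptions,
# not parameters from the cited systems.
PROXIMITY_RADIUS_M = 1.5    # "close proximity" cut-off in metres
ALERT_THRESHOLD_S = 60.0    # alarm after one cumulative minute
SAMPLE_PERIOD_S = 0.5       # assumed tracker sampling period

def euclidean(a, b):
    """Planar distance between two (x, y) positions in metres."""
    return math.hypot(a[0] - b[0], a[1] - b[1])

def proximity_time(teacher_trace, student_trace):
    """Cumulative seconds the teacher spent near the student.

    Both traces are equal-length lists of (x, y) samples taken at
    SAMPLE_PERIOD_S intervals (synchronised clocks are assumed).
    """
    total = 0.0
    for t_pos, s_pos in zip(teacher_trace, student_trace):
        if euclidean(t_pos, s_pos) <= PROXIMITY_RADIUS_M:
            total += SAMPLE_PERIOD_S
    return total

teacher = [(1.0, 1.0)] * 200   # toy data: teacher standing still for 100 s
student = [(1.5, 1.5)] * 200   # student ~0.7 m away throughout
elapsed = proximity_time(teacher, student)
if elapsed >= ALERT_THRESHOLD_S:
    print(f"alert: {elapsed:.0f}s of close proximity")
```

A real deployment would consume streaming tracker data and handle dropped samples and clock skew, which this toy version ignores.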
“…These approaches have utilised various forms of technologies and data sources, including WiFi data [35], computer vision algorithms [1,6], thermal sensors [7], and wearable technologies [30,42]. They have also focused on a range of learning spaces, such as small fabrication rooms [9], healthcare simulation rooms [12], the library [40], laboratories [30], regular classrooms [1,42], and lecture rooms [6]. Out of these approaches, physical positioning tracking is one of the promising methods in providing fine-grained data for capturing in-class social interactions because of its high spatial-temporal precision [30].…”
Section: Introduction
mentioning
confidence: 99%
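As a concrete illustration of why high spatial-temporal precision matters for capturing social interactions, the following hedged sketch (again not drawn from the cited systems) converts two students' timestamped (x, y) traces into sustained co-location episodes; the 1 m radius and 10 s minimum duration are assumed values.

```python
import math

# Hypothetical sketch: derive social-interaction episodes from two
# students' timestamped (x, y) traces. The radius and minimum-duration
# parameters are illustrative assumptions, not values from the cited work.
CO_LOCATION_RADIUS_M = 1.0   # within 1 m counts as co-located
MIN_EPISODE_S = 10.0         # ignore brushes shorter than 10 s

def episodes(trace_a, trace_b):
    """Yield (start, end) times of sustained co-location.

    Each trace is a list of (t, x, y) tuples sampled on a shared clock.
    """
    start = None
    for (t, ax, ay), (_, bx, by) in zip(trace_a, trace_b):
        close = math.hypot(ax - bx, ay - by) <= CO_LOCATION_RADIUS_M
        if close and start is None:
            start = t                      # episode opens
        elif not close and start is not None:
            if t - start >= MIN_EPISODE_S:
                yield (start, t)           # sustained episode closes
            start = None
    if start is not None and trace_a and trace_a[-1][0] - start >= MIN_EPISODE_S:
        yield (start, trace_a[-1][0])      # episode ran to end of trace

# Toy traces: the students converge for 20 s in a 60 s window.
a = [(t, 0.0, 0.0) for t in range(60)]
b = [(t, 0.0, 0.5 if 20 <= t < 40 else 5.0) for t in range(60)]
print(list(episodes(a, b)))   # -> [(20, 40)]
```

Filtering on a minimum duration is what separates incidental passes from plausible social interaction, a distinction that coarser positioning sources struggle to resolve.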