One of the main shortcomings of event data in football, which has been used extensively for analytics in recent years, is that it still requires manual collection, limiting its availability to a small number of tournaments. In this work, we propose a deterministic decision tree-based algorithm to automatically extract football events using tracking data, which consists of two steps: (1) a possession step that evaluates which player was in possession of the ball at each frame in the tracking data, as well as the distinct player configurations during the time intervals where the ball is not in play, to inform set piece detection; (2) an event detection step that combines the changes in ball possession computed in the first step with the laws of football to determine in-game events and set pieces. The automatically generated events are benchmarked against manually annotated events, and we show that in most event categories the proposed methodology achieves a +90% detection rate across different tournaments and tracking data providers. Finally, we demonstrate how the contextual information offered by tracking data can be leveraged to increase the granularity of auto-detected events, and we exhibit how the proposed framework may be used to conduct a myriad of data analyses in football.
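As a concrete illustration of the first step, the minimal sketch below assigns ball possession per frame to the nearest player within a distance threshold. The data layout (`frame`, `players`, `ball`) and the 1.5 m threshold are illustrative assumptions, not the paper's actual decision-tree rules.

```python
# Minimal per-frame possession sketch, assuming tracking data supplies (x, y)
# coordinates for the ball and every player. Threshold and layout are
# illustrative assumptions only.
import math

def nearest_possessor(frame, max_dist=1.5):
    """Return the id of the player closest to the ball, or None if no
    player is within max_dist metres (ball treated as not possessed)."""
    bx, by = frame["ball"]
    best_id, best_dist = None, float("inf")
    for player_id, (px, py) in frame["players"].items():
        d = math.hypot(px - bx, py - by)
        if d < best_dist:
            best_id, best_dist = player_id, d
    return best_id if best_dist <= max_dist else None

# Example frame: ball at (50.0, 34.0), two players nearby.
frame = {
    "ball": (50.0, 34.0),
    "players": {"home_7": (50.5, 34.2), "away_4": (53.0, 30.0)},
}
print(nearest_possessor(frame))  # -> "home_7"
```

A full possession model would also need to handle frames where the ball is in flight or out of play, which this sketch sidesteps by returning None when no player is close enough.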
Three-dimensional motion capture systems such as Vicon have been used to validate commercial electronic performance and tracking systems. However, three-dimensional motion capture cannot be used for large capture areas such as a full football pitch, owing to the need for many fragile cameras to be placed around the capture volume and the lack of suitable depth of field of those cameras. There is therefore a need for a hybrid testing solution for commercial electronic performance and tracking systems, using highly precise three-dimensional motion capture in a small test area and a computer vision system in other areas to test for full-pitch coverage by the commercial systems. This study aimed to establish the validity of the VisionKit computer vision system against three-dimensional motion capture in a stadium environment. Ten participants undertook a series of football-specific movement tasks, including a circuit, small-sided games and a 20 m sprint. There was strong agreement between VisionKit and three-dimensional motion capture across each activity undertaken. The root mean square difference was 0.04 m·s⁻¹ for speed and 0.18 m for position. VisionKit showed strong agreement with the criterion three-dimensional motion capture system for the football-related movements tested in a stadium environment. VisionKit can thus be used to establish the concurrent validity of other electronic performance and tracking systems in circumstances where three-dimensional motion capture cannot be used.
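The agreement figures quoted above are root mean square differences (RMSD). The sketch below shows how such RMSD values for speed and 2D position could be computed from time-aligned samples of two systems; the array names and sample values are invented for illustration and are not the study's data.

```python
# Hedged sketch of RMSD between a candidate tracking system and a
# motion-capture criterion, for speed (scalar series) and 2D position.
import numpy as np

def rmsd_scalar(a, b):
    """RMSD between two aligned 1D series (e.g., speed in m/s)."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    return float(np.sqrt(np.mean((a - b) ** 2)))

def rmsd_position(p, q):
    """RMSD of Euclidean position error between two aligned (N, 2) arrays."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    err = np.linalg.norm(p - q, axis=1)
    return float(np.sqrt(np.mean(err ** 2)))

# Hypothetical aligned samples for illustration only.
speed_vision = [3.10, 4.20, 5.00]
speed_mocap = [3.14, 4.18, 5.05]
print(rmsd_scalar(speed_vision, speed_mocap))   # speed RMSD in m/s

pos_vision = [[10.0, 5.0], [10.5, 5.2]]
pos_mocap = [[10.1, 5.1], [10.4, 5.3]]
print(rmsd_position(pos_vision, pos_mocap))     # position RMSD in m
```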
Perceptions of synthetic surfaces used in football can vary considerably between players, and obtaining reliable feedback is challenging. The aim of this study was to develop a suitable process and evaluate the merits of establishing a sensory panel to assess the subjective attributes of third generation synthetic turf surfaces (3G turf) used in football. Focus groups with 12 male and 13 female footballers were conducted on an outdoor 3G turf pitch to develop a common language to describe sensory feedback related to player–surface interactions. Post-session analysis revealed two main themes related to player–surface interactions: hardness and grip. These themes were broken down further into five sensory attributes (Movement Speed, Slip, Movement Confidence, Leg Shock and Give), which were investigated further in an indoor test area containing ten 3G turf surfaces with controlled surface properties. A panel consisting of 18 university footballers (11 male and 7 female) undertook a screening and training session to refine the language associated with the sensory attributes and become familiar with the testing protocol. During a final evaluation session, players were asked to discriminate between surfaces using the paired comparison method for each of the sensory attributes. Player consistency remained similar between the screening and evaluation sessions, whilst the panel's ability to discriminate between surfaces improved during the evaluation session. Sensory training can therefore be a useful approach to aid players in differentiating between surfaces and to build a greater understanding of athlete perceptions of surface attributes.
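For readers unfamiliar with the paired comparison method mentioned above, the short sketch below tallies how often each surface is preferred for a given attribute across pairwise judgements. The surface labels and judgements are hypothetical and are not taken from the study.

```python
# Illustrative scoring of paired comparisons: each panellist picks which of
# two surfaces feels higher on an attribute (e.g., "Slip"); wins are tallied
# per surface. All data here are invented for illustration.
from collections import Counter

def tally_paired_comparisons(judgements):
    """judgements: list of (surface_a, surface_b, winner) tuples.
    Returns a Counter of wins per surface."""
    wins = Counter()
    for a, b, winner in judgements:
        assert winner in (a, b), "winner must be one of the compared surfaces"
        wins[winner] += 1
    return wins

# Hypothetical judgements for the attribute "Slip" across three surfaces.
judgements = [("S1", "S2", "S2"), ("S1", "S3", "S1"), ("S2", "S3", "S2")]
print(tally_paired_comparisons(judgements))  # Counter({'S2': 2, 'S1': 1})
```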
One of the main shortcomings of event data in football, which has been used extensively for analytics in recent years, is that it still requires manual collection, limiting its availability to a small number of tournaments. In this work, we propose a computational framework to automatically extract football events using tracking data, namely the coordinates of all players and the ball. Our approach consists of two models: (1) the possession model evaluates which player was in possession of the ball at each time, as well as the distinct player configurations in the time intervals where the ball is not in play; (2) the event detection model relies on the changes in ball possession to determine in-game events, namely passes, shots, crosses, saves, receptions and interceptions, as well as set pieces. First, we analyze the accuracy of tracking data for determining ball possession, as well as the accuracy of the time annotations for the manually collected events. Then, we benchmark the auto-detected events against a dataset of manually annotated events to show that in most categories the proposed method achieves a +90% detection rate. Lastly, we demonstrate how the contextual information offered by tracking data can be leveraged to increase the granularity of auto-detected events, and we exhibit how the proposed framework may be used to conduct a myriad of data analyses in football.
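To make the second model more tangible, the sketch below derives coarse events from a per-frame possession sequence, labelling same-team possession changes as completed passes and opposite-team changes as interceptions. This simplified rule is a stand-in for, not a reproduction of, the paper's full event detection model, which also draws on the laws of football and set-piece configurations.

```python
# Sketch of event detection from possession changes. The possession sequence
# format and the two-rule classification are illustrative assumptions.

def possession_changes(possession):
    """possession: list of (frame, player_id, team) for frames where a player
    holds the ball. Returns (frame, event, from_player, to_player) tuples."""
    events = []
    for (f0, p0, t0), (f1, p1, t1) in zip(possession, possession[1:]):
        if p0 == p1:
            continue  # same player retains the ball: no event
        if t0 == t1:
            events.append((f1, "pass_completed", p0, p1))
        else:
            events.append((f1, "interception", p0, p1))
    return events

# Hypothetical possession sequence: two home players, then an away player.
seq = [(10, "home_7", "home"), (35, "home_9", "home"), (60, "away_4", "away")]
print(possession_changes(seq))
# -> [(35, 'pass_completed', 'home_7', 'home_9'),
#     (60, 'interception', 'home_9', 'away_4')]
```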