ICASSP 2019 - 2019 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
DOI: 10.1109/icassp.2019.8682917

Sound-based Transportation Mode Recognition with Smartphones

Abstract: Smartphone-based identification of the mode of transportation of the user is important for context-aware services. We investigate the feasibility of recognizing the 8 most common modes of locomotion and transportation from the sound recorded by a smartphone carried by the user. We propose a convolutional neural network based recognition pipeline, which operates on the short-time Fourier transform (STFT) spectrogram of the sound in the log domain. Experiments with the Sussex-Huawei locomotion-transportation (SHL) …
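The pipeline operates on the log-domain STFT spectrogram of the recorded sound. A minimal Python sketch of such a feature extraction step is given below; the sampling rate, frame length, and hop size are illustrative assumptions, not the parameters used in the paper.

```python
# Minimal sketch: log-domain STFT spectrogram of a mono audio clip.
# Sampling rate, frame length, and hop size are illustrative assumptions,
# not the parameters reported in the paper.
import numpy as np
from scipy.signal import stft

def log_stft_spectrogram(audio, sample_rate=16000, frame_len=1024, hop_len=512):
    """Return a (freq_bins, time_frames) log-magnitude spectrogram."""
    _, _, Zxx = stft(audio, fs=sample_rate,
                     nperseg=frame_len, noverlap=frame_len - hop_len)
    magnitude = np.abs(Zxx)
    return np.log(magnitude + 1e-10)  # small epsilon avoids log(0) on silent frames

if __name__ == "__main__":
    # One second of synthetic noise stands in for a smartphone recording.
    dummy_audio = np.random.randn(16000).astype(np.float32)
    spec = log_stft_spectrogram(dummy_audio)
    print(spec.shape)  # (frame_len // 2 + 1, number of frames)
```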

Citations: cited by 20 publications (10 citation statements)
References: 24 publications
“…The mono-modal classifiers employed in this paper are adapted directly from our previous work [18], [23], [24], which are comparable to the state of the art. The sound and vision classifiers are among the first works that are applied to transportation mode recognition.…”
Section: Discussion (mentioning)
confidence: 99%
“…Fig. 5(b) illustrates the deep architecture of the sound-based classifier (T_sound), which we initially developed in [24]. The convolutional neural network consists of an input layer, two CNN and two FCNN blocks, and an output decision block.…”
Section: B Sound Classifier (mentioning)
confidence: 99%
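The quoted passage gives the block structure of the sound classifier: an input layer, two convolutional (CNN) blocks, two fully connected (FCNN) blocks, and an output decision block. A minimal PyTorch sketch with that structure is shown below; the channel counts, kernel sizes, and input spectrogram size are illustrative assumptions, not the configuration of the network developed in [24].

```python
# Minimal sketch of a CNN with the block structure described above:
# input -> two convolutional blocks -> two fully connected blocks -> output decision layer.
# Channel counts, kernel sizes, and the input spectrogram size are
# illustrative assumptions, not the configuration from the cited work.
import torch
import torch.nn as nn

NUM_CLASSES = 8  # Still, Walk, Run, Bike, Car, Bus, Train, Subway

class SoundCNN(nn.Module):
    def __init__(self, num_classes=NUM_CLASSES):
        super().__init__()
        self.conv_blocks = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1),   # CNN block 1
            nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1),  # CNN block 2
            nn.ReLU(),
            nn.MaxPool2d(2),
        )
        self.fc_blocks = nn.Sequential(
            nn.Flatten(),
            nn.LazyLinear(128),   # FCNN block 1 (input size inferred on first call)
            nn.ReLU(),
            nn.Linear(128, 64),   # FCNN block 2
            nn.ReLU(),
        )
        self.output = nn.Linear(64, num_classes)  # output decision block (class logits)

    def forward(self, log_spectrogram):
        # log_spectrogram: (batch, 1, freq_bins, time_frames)
        x = self.conv_blocks(log_spectrogram)
        x = self.fc_blocks(x)
        return self.output(x)

if __name__ == "__main__":
    model = SoundCNN()
    dummy = torch.randn(2, 1, 64, 64)  # two dummy log-STFT patches
    print(model(dummy).shape)          # torch.Size([2, 8])
```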
“…A user often carries a wearable device (e.g. smartphone, smartwatch) during travel, which is embedded with multimodal sensors including motion sensors, GPS (global positioning system), microphone and camera. While most work on locomotion and transportation recognition has used motion sensors [1, 2, 8–12, 14] or sound [4, 5, 13], our recent work has shown that vision is also an important modality to recognise modes of locomotion and transportation [6]. We showed that 8 activities (Still, Walk, Run, Bike, Car, Bus, Train, Subway) can be recognised with an F1 score of 82.1% for the best classifier.…”
Section: Introduction (mentioning)
confidence: 85%
“…The mode of transportation or locomotion is an important contextual cue that enables applications such as activity and health monitoring, individual environmental impact monitoring, and intelligent service adaptation [16–24]. Several prior works looked at recognizing modes of transportation from smartphone sensors, such as motion [25, 26], GPS [27–32], sound [33], image [34] and the fusion of multiple sensors [35]. The potential of exploiting GSM and WiFi signals for transportation mode recognition has also been reported [36–38].…”
Section: Introduction (mentioning)
confidence: 99%