2020
DOI: 10.1007/978-981-15-5554-1_13
Experiments in Lifelog Organisation and Retrieval at NTCIR

Abstract: Lifelogging can be described as the process by which individuals use various software and hardware devices to gather large archives of multimodal personal data from multiple sources and store them in a personal data archive, called a lifelog. The Lifelog task at NTCIR was a comparative benchmarking exercise with the aim of encouraging research into the organisation and retrieval of data from multimodal lifelogs. The Lifelog task ran for over 4 years from NTCIR-12 until NTCIR-14 (2015.02-2019.06); it supported …

Cited by 3 publications (2 citation statements)
References 29 publications
“…This has also resulted in the creation of many large multimodal personal datasets [11] comprising different data types (e.g., passive visual capture, mobile device context, physiological data) [8], enabling the research community to develop intelligent systems that track an individual's health and gain deeper insights from personal data, such as daily-life event segmentation [9] and identification of activities of daily living as an indicator in health-tracking systems [10]. Although multiple data sources are recorded in multimodal personal datasets [12], only the combination of visual data and related metadata, including semantic locations, daily-life activities, and date and time, is employed extensively in research [10,20], while other sources have not yet been exploited. Physiological signals in particular are usually ignored, owing to the limited amount of research conducted with this type of data and to the limitations of recording devices in terms of the granularity of signal measurement.…”
Section: Introduction
confidence: 99%
“…This has also resulted in the creation of many large multimodal personal datasets [11] comprising different data types (e.g., passive visual capture, mobile device context, physiological data) [8], enabling the research community to develop intelligent systems that track an individual's health and gain deeper insights from personal data, such as daily-life event segmentation [9] and identification of activities of daily living as an indicator in health-tracking systems [10]. Although multiple data sources are recorded in multimodal personal datasets [12], only the combination of visual data and related metadata, including semantic locations, daily-life activities, and date and time, is employed extensively in research [10,19], while other sources have not yet been exploited. Physiological signals in particular are usually ignored, owing to the limited amount of research conducted with this type of data and to the limitations of recording devices in terms of the granularity of signal measurement.…”
Section: Introduction
confidence: 99%