Proceedings of the Third Annual Workshop on Lifelog Search Challenge 2020
DOI: 10.1145/3379172.3391719
Myscéal: An Experimental Interactive Lifelog Retrieval System for LSC'20

Abstract: The Lifelog Search Challenge (LSC) is an annual benchmarking activity for comparing approaches to interactive retrieval from multi-modal lifelogs. Being an interactive search challenge, issues such as retrieval accuracy, search speed and usability of interfaces are key challenges that must be addressed by every participant. In this paper, we introduce Myscéal, an interactive lifelog retrieval engine designed to support novice users to retrieve items of interest from a large multimodal lifelog. Add…

Cited by 20 publications (18 citation statements)
References 25 publications (23 reference statements)
“…Moreover, the increased participation of fourteen teams [1-3, 8-11, 14, 15, 18, 19, 21, 24, 25] marks an unprecedented rise in interested research parties. This is also reflected in the overall performance of the proposed systems: while the winning teams Myscéal [24] and SOMHunter [19] solved all 21 given tasks, the other systems solved more than 14 tasks on average. After performing very poorly at LSC2019, we decided to refactor our continuously developed participating system, lifeXplore [13,14,20], which was previously based on diveXplore [12,20,23], an interactive video browsing and retrieval tool designed for the VBS challenge.…”
Section: Introduction
confidence: 89%
“…The Exquisitor system [16] used relevance feedback to build a model of the user's information needs without any explicit query mechanism, while SOMHunter [20] and THUIR [19] employed user feedback to iteratively refine the retrieved results. Myscéal [24] proposed a temporal query mechanism that allowed searching for up to three consecutive events simultaneously, and also introduced a concept-weighting methodology to determine the importance of visual annotations in the data.…”
Section: Related Work
confidence: 99%
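The temporal query mechanism described above can be sketched as follows. This is an illustrative assumption, not Myscéal's actual implementation: each sub-query yields a ranked list of candidate events, and chronologically ordered combinations within a time window are scored. The data layout, function name, and scoring scheme are all hypothetical.

```python
from itertools import product

def temporal_search(results_per_subquery, max_gap_hours=6):
    """Combine up to three ranked sub-query result lists into
    chronologically ordered event sequences (a sketch of the
    temporal-querying idea; names and scoring are illustrative)."""
    best = []
    for combo in product(*results_per_subquery):
        times = [e["time"] for e in combo]
        # Events must occur in order, each within max_gap_hours of the previous.
        if all(0 < times[i + 1] - times[i] <= max_gap_hours
               for i in range(len(times) - 1)):
            score = sum(e["score"] for e in combo)
            best.append((score, [e["id"] for e in combo]))
    best.sort(reverse=True)
    return best
```

A real system would replace the brute-force product with an index over event timestamps, but the window constraint captures the "consecutive events" notion.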
“…The number of matching concepts between a query and the annotations of images can be used as a ranking criterion in retrieval systems [8]. Taking a different view, Myscéal [28] treated this as a document retrieval problem by indexing textual annotations and matching them against textual queries. Embedding techniques are likewise based on the idea of encoding concepts from both queries and image tags into the same vector space to compute the similarity between them [19,20].…”
Section: Related Work
confidence: 99%
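The concept-matching criterion mentioned above amounts to scoring each image by the size of the overlap between its concept annotations and the query's concepts. A minimal sketch, assuming a simple dict of image IDs to tag lists (the data layout and function name are hypothetical):

```python
def rank_by_concept_overlap(query_concepts, annotated_images):
    """Rank images by the number of concepts they share with the query.
    annotated_images maps image id -> list of concept tags."""
    q = set(query_concepts)
    # Score each image by its tag overlap with the query concepts.
    scored = [(len(q & set(tags)), img)
              for img, tags in annotated_images.items()]
    scored.sort(key=lambda t: t[0], reverse=True)
    return scored
```

Production systems weight concepts (e.g. by TF-IDF, as a document-retrieval formulation would) rather than counting them uniformly.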
“…We applied the rule-based approach proposed in [26] to generate a semantic graph representing the query items and their interactions, as described in the query text. Before this, any location and time information in the query can be extracted and archived for the later filtering mechanism by analyzing part-of-speech tags from the topic, as utilized in Myscéal [28]. All words from the query are pre-processed to exclude stopwords and then lemmatized.…”
Section: Query To Graph
confidence: 99%
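The stopword-removal and lemmatization step described above can be sketched in a few lines. This is a toy stand-in: real pipelines would use a full stopword list and an NLP lemmatizer (e.g. NLTK's WordNetLemmatizer or spaCy), and the plural-stripping rule below is only illustrative.

```python
# Tiny illustrative stopword list; real systems use a much larger one.
STOPWORDS = {"a", "an", "the", "in", "on", "at", "of", "and", "to",
             "is", "was", "with"}

def preprocess_query(text, lemmatize=None):
    """Tokenize, drop stopwords, and lemmatize a query string.
    The default lemmatizer is a naive plural-stripping rule,
    standing in for a proper NLP lemmatizer."""
    lemmatize = lemmatize or (lambda w: w[:-1] if w.endswith("s") else w)
    tokens = [w.lower().strip(".,!?") for w in text.split()]
    return [lemmatize(w) for w in tokens if w and w not in STOPWORDS]
```

The `lemmatize` parameter lets a caller plug in a real lemmatizer without changing the pipeline's shape.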