Proceedings of the 6th ACM Conference on Embedded Network Sensor Systems 2008
DOI: 10.1145/1460412.1460428

Distributed image search in camera sensor networks

Abstract: Recent advances in sensor networks permit the use of a large number of relatively inexpensive distributed computational nodes with camera sensors linked in a network and possibly linked to one or more central servers. We argue that the full potential of such a distributed system can be realized if it is designed as a distributed search engine where images from different sensors can be captured, stored, searched and queried. However, unlike traditional image search engines that are focused on resource-rich situ…

Cited by 50 publications (34 citation statements)
References 37 publications
“…A query can be any form of keyword search using a ranking function (e.g., tf-idf) identifying the top-k most relevant files. The proposed search engine can be used in sensors to search for relevant objects in their surroundings [17,23], in cameras to search pictures by tags [22], in personal smart dongles to secure the querying of documents and files hosted in an untrusted cloud [4,12], or in smart meters to perform analytic tasks (i.e., top-k queries) over sets of events (i.e., terms) captured during time windows (i.e., files) [3]. Hence, this engine can be thought of as a generalized Google Desktop or Spotlight embedded in smart objects.…”
Section: Figure 1 Smart Objects Endowed With MCUs and NAND Flash
confidence: 99%
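The tf-idf top-k ranking this citation statement refers to can be sketched in a few lines. This is a minimal illustration, not the cited engine's implementation; the toy corpus, the `tfidf_topk` function, and the raw tf × idf scoring are all assumptions chosen for brevity.

```python
import math
from collections import Counter

# Hypothetical toy corpus: each "file" is a bag of terms
# (image tags, event names, keywords, ...).
docs = {
    "img1": ["car", "road", "tree"],
    "img2": ["car", "car", "person"],
    "img3": ["tree", "river"],
}

def tfidf_topk(query_terms, docs, k=2):
    """Score each file by summing tf * idf over the query terms
    it contains, then return the k highest-scoring files."""
    n = len(docs)
    # document frequency: in how many files does each term occur?
    df = Counter(t for terms in docs.values() for t in set(terms))
    scores = {}
    for doc_id, terms in docs.items():
        tf = Counter(terms)
        score = sum(tf[t] * math.log(n / df[t])
                    for t in query_terms if t in tf)
        if score > 0:
            scores[doc_id] = score
    return sorted(scores.items(), key=lambda kv: -kv[1])[:k]

print(tfidf_topk(["car", "tree"], docs))
```

In a resource-constrained setting the document-frequency table and postings would live in flash rather than RAM, which is exactly the constraint the quoted works grapple with.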
“…Unfortunately, state-of-the-art indexing techniques either consume a lot of RAM or produce a large quantity of random fine-grain updates. A few pioneering works have already considered the problem of embedding a search engine in sensors equipped with Flash storage [17,20,22,23], but they target small data collections (i.e., hundreds to thousands of files), with a query execution time that remains proportional to the number of indexed documents. By construction, these search engines cannot achieve insertion performance and query scalability at the same time.…”
Section: Figure 1 Smart Objects Endowed With MCUs and NAND Flash
confidence: 99%
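The tension described above, between flash-friendly sequential insertion and query time independent of collection size, is the classic argument for an inverted index. The sketch below is illustrative only; the `TinyInvertedIndex` class is a hypothetical in-memory stand-in, not the data structure of any work quoted here.

```python
from collections import defaultdict

class TinyInvertedIndex:
    """Append-only inverted index sketch. Insertions only append to
    per-term postings lists (sequential, NAND-flash-friendly writes),
    and a query touches only the postings of its own terms instead of
    scanning every indexed document."""

    def __init__(self):
        self.postings = defaultdict(list)  # term -> list of doc ids

    def insert(self, doc_id, terms):
        for t in set(terms):
            self.postings[t].append(doc_id)  # append, never update in place

    def query(self, terms):
        """Return the ids of documents containing all query terms."""
        sets = [set(self.postings[t]) for t in terms]
        return set.intersection(*sets) if sets else set()

idx = TinyInvertedIndex()
idx.insert("img1", ["car", "tree"])
idx.insert("img2", ["tree", "river"])
print(sorted(idx.query(["tree"])))
```

On a microcontroller the postings lists would be persisted to flash, where the append-only write pattern matters; the RAM-versus-random-writes trade-off the quote mentions arises precisely when those lists must be buffered or rewritten.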
“…The work in [10] presents a distributed image search system on a network of iMote2 sensor nodes equipped with extended flash storage. The system is based on SIFT [11] local features, which are extremely slow to compute on such hardware.…”
Section: Introduction
confidence: 99%
“…As in [10], the hardware building blocks we adopt are, to the best of our knowledge, much less powerful than the reference hardware in the literature. The reference testbed is built on commercial hardware and covers the complete pipeline of visual object-recognition tasks: fast feature detection and description implemented on the sensor nodes, feature delivery to a sink through a multi-hop communication protocol, and feature matching implemented at a central controller.…”
Section: Introduction
confidence: 99%