Please use this identifier to cite or link to this item:
dc.title: Answering similarity queries in peer-to-peer networks
dc.identifier.citation: Kalnis, P., Ng, W.S., Ooi, B.C., Tan, K.-L. (2006). Answering similarity queries in peer-to-peer networks. Information Systems 31 (1): 57-72. ScholarBank@NUS Repository. https://doi.org/10.1016/j.is.2004.09.003
dc.description.abstract: A variety of peer-to-peer (P2P) systems for sharing digital information are currently available, and most of them perform searching by exact key matching. In this paper we focus on similarity searching and describe FuzzyPeer, a generic broadcast-based P2P system which supports a wide range of fuzzy queries. As a case study we present an image retrieval application implemented on top of FuzzyPeer. Users provide sample images whose sets of features are propagated through the peers. The answer consists of the top-k most similar images within the query horizon. In our system the participation of peers is ad hoc and dynamic, their functionality is symmetric, and there is no centralized index. In order to avoid flooding the network with messages, we develop a technique that takes advantage of the fuzzy nature of the queries. Specifically, some queries are "frozen" inside the network and are satisfied by the streaming results of similar queries that are already running. We describe several optimization techniques for single- and multiple-attribute queries, and study their tradeoffs. We evaluate the performance of our algorithms through a prototype implementation on our P2P platform and a simulated large-scale network. Our results suggest that by reusing the existing streams, the scalability of the system improves both in terms of number of nodes and query throughput. © 2004 Elsevier Ltd. All rights reserved.
Appears in Collections: Staff Publications
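The query-"freezing" idea described in the abstract can be illustrated with a minimal sketch. All names, the cosine-similarity measure, and the freeze threshold below are illustrative assumptions, not details taken from the paper: a peer that receives a new fuzzy query first checks whether a sufficiently similar query is already running, and if so attaches ("freezes") the newcomer onto that query's result stream instead of re-broadcasting it.

```python
import math

def cosine_similarity(a, b):
    """Similarity between two feature vectors (one plausible fuzzy measure)."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

class Peer:
    """Hypothetical peer sketching FuzzyPeer-style query freezing."""

    def __init__(self, freeze_threshold=0.9):
        self.freeze_threshold = freeze_threshold
        self.active_queries = {}  # query id -> feature vector of a running query
        self.frozen = {}          # frozen query id -> id of the query it reuses

    def receive_query(self, qid, features):
        # If a similar query is already streaming results, piggyback on it.
        for running_qid, running_features in self.active_queries.items():
            if cosine_similarity(features, running_features) >= self.freeze_threshold:
                self.frozen[qid] = running_qid
                return "frozen"
        # Otherwise run the query normally (i.e. broadcast it to neighbours).
        self.active_queries[qid] = features
        return "broadcast"
```

In this sketch, a near-duplicate query is never forwarded, which is the abstract's stated route to avoiding network flooding; the real system additionally bounds propagation by the query horizon and merges streamed top-k results.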
Files in This Item:
There are no files associated with this item.
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.