Please use this identifier to cite or link to this item: https://scholarbank.nus.edu.sg/handle/10635/31626
dc.title: Efficient location-based spatial keyword query processing
dc.contributor.author: ZHANG DONGXIANG
dc.date.accessioned: 2012-03-31T18:01:45Z
dc.date.available: 2012-03-31T18:01:45Z
dc.date.issued: 2011-08-19
dc.identifier.citation: ZHANG DONGXIANG (2011-08-19). Efficient location-based spatial keyword query processing. ScholarBank@NUS Repository.
dc.identifier.uri: http://scholarbank.nus.edu.sg/handle/10635/31626
dc.description.abstract: The emergence of Web 2.0 applications, including social networking sites, Wikipedia, and multimedia sharing sites, has changed how information is generated and shared. Among these applications, the map mashup is a popular and convenient means of data integration and visualization. In recent years, users have contributed a huge number of spatial objects in various media formats and displayed them on maps, annotating these objects with tags to give them semantic meaning. To leverage such large-scale spatial-textual databases, we propose efficient location-based spatial keyword query processing strategies in this thesis.

First, we address a novel query named $m$CK ($m$ Closest Keywords). The query accepts a set of query keywords and finds a set of spatial tuples that match the keywords and lie closest to each other. A useful application is finding the $m$ closest local service providers, using keywords such as ``cinema'', ``seafood restaurant'', and ``shopping mall'', to save transportation time. To answer an $m$CK query efficiently, we introduce a new index named the bR$^*$-tree, an extension of the R$^*$-tree. On top of the bR$^*$-tree, we exploit an a priori-based top-down search strategy and propose pruning rules that significantly reduce the search space.

Second, we apply the $m$CK query to detect the geographical context of web resources. More specifically, we build a uniform model that represents online resources as sets of tags and propose a detection method based on tag matching. Since there can be hundreds of thousands of tags, we improve the bR$^*$-tree and design an efficient and scalable search algorithm. Furthermore, we propose a new \emph{geo-tf-idf} ranking method to improve matching precision.

Third, we address the problem of locating web images efficiently when tags are not available. We treat each high-dimensional image feature as a ``keyword'', so a geo-image can be considered a set of spatial keywords at the same location. Given a query image, our goal is to find the geo-image in the spatial image database that is most similar to the query image and use its location as the detection result. To answer this nearest-neighbor (NN) query, we propose a new index named HashFile, which supports approximate NN search in Euclidean space and exact NN search under the $L_1$ norm. Our experimental results show that it offers better efficiency for both types of NN queries.

Finally, we design and develop a new travel mashup system named \textbf{LANGG} that applies the above spatial keyword query processing techniques to provide location-based services. The main objective of the system is to recommend a travel destination to users based on their personal interests. Users can submit a set of travel services they would like to enjoy, an interesting travel blog, or even a travel photo of beautiful scenery. User feedback shows that the system provides satisfactory search results.
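To make the $m$CK semantics concrete, the sketch below enumerates every combination of matching tuples and returns the one with the smallest diameter (largest pairwise distance). The toy dataset and function names are illustrative placeholders only; the thesis answers the query with the bR$^*$-tree and pruning rules, not by exhaustive enumeration.

    from itertools import product
    from math import hypot

    # Hypothetical toy dataset: each spatial tuple is (x, y, keyword).
    objects = [
        (1.0, 2.0, "cinema"),
        (1.5, 2.2, "seafood restaurant"),
        (9.0, 9.0, "cinema"),
        (1.2, 1.8, "shopping mall"),
        (8.5, 9.5, "shopping mall"),
    ]

    def diameter(tuples):
        # Largest pairwise Euclidean distance within a candidate set.
        if len(tuples) < 2:
            return 0.0
        return max(hypot(a[0] - b[0], a[1] - b[1])
                   for i, a in enumerate(tuples) for b in tuples[i + 1:])

    def mck_brute_force(objects, keywords):
        # One candidate list per query keyword, then try every combination.
        # Exponential in m; it only fixes the query semantics.
        groups = [[o for o in objects if o[2] == kw] for kw in keywords]
        best, best_d = None, float("inf")
        for combo in product(*groups):
            d = diameter(combo)
            if d < best_d:
                best, best_d = combo, d
        return best, best_d

    print(mck_brute_force(objects,
                          ["cinema", "seafood restaurant", "shopping mall"]))

An index such as the bR$^*$-tree avoids this exponential enumeration by pruning combinations of nodes whose best possible diameter already exceeds the best answer found so far.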
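The abstract names the \emph{geo-tf-idf} ranking method but does not define it. Purely as an assumption, one natural reading scopes the usual tf-idf statistics to a spatial region $R$ rather than the whole corpus:

    % Assumption only: a region-scoped tf-idf, where D_R is the set of
    % tag documents attached to objects inside region R. The thesis's
    % exact definition may differ.
    \mathrm{geo\mbox{-}tf\mbox{-}idf}(t, R)
        = \mathrm{tf}(t, R) \cdot
          \log \frac{|D_R|}{|\{\, d \in D_R : t \in d \,\}|}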
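HashFile's internals are likewise unspecified here, but the query it accelerates is easy to state. Below is a minimal linear-scan baseline for exact nearest-neighbor search under the $L_1$ norm; the 128-dimensional features and the dataset are synthetic placeholders, not the thesis's image descriptors.

    import numpy as np

    # Synthetic stand-ins for geo-image features (assumption: 128-d
    # real-valued descriptors); HashFile would replace this full scan.
    rng = np.random.default_rng(0)
    database = rng.random((1000, 128))   # one feature vector per geo-image
    locations = rng.random((1000, 2))    # (lat, lon) per geo-image

    def l1_nearest(query, database):
        # Exact NN under the L1 (Manhattan) norm via brute-force scan.
        dists = np.abs(database - query).sum(axis=1)
        return int(np.argmin(dists))

    query = rng.random(128)
    idx = l1_nearest(query, database)
    print("predicted location:", locations[idx])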
dc.language.iso: en
dc.subject: keyword search, spatial database, spatial index, image retrieval, mashup, multimedia
dc.type: Thesis
dc.contributor.department: COMPUTER SCIENCE
dc.contributor.supervisor: TUNG KUM HOE, ANTHONY
dc.description.degree: Ph.D
dc.description.degreeconferred: DOCTOR OF PHILOSOPHY
dc.identifier.isiut: NOT_IN_WOS
Appears in Collections: Ph.D Theses (Open)

Files in This Item:

File         Description  Size    Format     Access Settings  Version
ZhangDX.pdf               2.5 MB  Adobe PDF  OPEN             None


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.