Publication Type

Conference Proceeding Article

Version

Submitted version

Publication Date

9-2010

Abstract

Users of text search engines are increasingly wary that their activities may disclose confidential information about their business or personal profiles. It would be desirable for a search engine to perform document retrieval for users while protecting their intent. In this paper, we identify the privacy risks arising from semantically related search terms within a query, and from recurring high-specificity query terms in a search session. To counter the risks, we propose a solution for a similarity text retrieval system to offer anonymity and plausible deniability for the query terms, and hence the user intent, without degrading the system’s precision-recall performance. The solution comprises a mechanism that embellishes each user query with decoy terms that exhibit a similar specificity spread to the genuine terms, but point to plausible alternative topics. We also provide an accompanying retrieval scheme that enables the search engine to compute the encrypted document relevance scores from only the genuine search terms, yet remain oblivious to their distinction from the decoys. Empirical evaluation results are presented to substantiate the effectiveness of our solution.
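To illustrate the query-embellishment idea described in the abstract, the following is a minimal sketch (not the authors' implementation): it mixes genuine query terms with decoy terms whose specificity, approximated here by hypothetical IDF scores, falls within a tolerance of the genuine terms, so the decoys exhibit a similar specificity spread while pointing to an unrelated topic. The IDF values and decoy vocabulary are assumptions made up for the example.

```python
import random

def embellish_query(genuine_terms, decoy_vocab, idf, tolerance=0.5):
    """Return genuine terms mixed with decoys of comparable specificity.

    For each genuine term, pick a decoy from `decoy_vocab` whose IDF is
    within `tolerance` of the genuine term's IDF, then shuffle so the
    genuine terms are not distinguishable by position.
    """
    decoys = []
    for term in genuine_terms:
        candidates = [d for d in decoy_vocab
                      if abs(idf.get(d, 0.0) - idf.get(term, 0.0)) <= tolerance]
        if candidates:
            decoys.append(random.choice(candidates))
    mixed = list(genuine_terms) + decoys
    random.shuffle(mixed)  # hide which terms are genuine
    return mixed

# Hypothetical IDF scores and a decoy vocabulary on an unrelated topic.
idf = {"melanoma": 6.1, "chemotherapy": 5.8,
       "volcanology": 6.0, "seismograph": 5.9}
print(embellish_query(["melanoma", "chemotherapy"],
                      ["volcanology", "seismograph"], idf))
```

In the paper's scheme, unlike this sketch, the search engine additionally computes encrypted relevance scores from only the genuine terms without learning which terms are decoys.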

Keywords

Search engines, privacy, confidential information, similarity text retrieval system

Discipline

Databases and Information Systems | Information Security

Publication

Proceedings of the VLDB Endowment: 36th International Conference on Very Large Data Bases: Singapore, 13-17 September 2010

Volume

3

Issue

1/2

First Page

598

Last Page

607

ISSN

2150-8097

Identifier

10.14778/1920841.1920918

Publisher

ACM

City or Country

New York

Copyright Owner and License

Authors

Additional URL

http://dx.doi.org/10.14778/1920841.1920918
