Publication Type

Conference Proceeding Article

Version

publishedVersion

Publication Date

7-2005

Abstract

For agents deployed in real-world settings such as businesses, universities, and research laboratories, it is critical that agents protect their individual users’ privacy when interacting with other entities. Indeed, privacy is recognized as a key motivating factor in the design of several multiagent algorithms, such as distributed constraint optimization (DCOP) algorithms. Unfortunately, rigorous and general quantitative metrics for analyzing and comparing such multiagent algorithms with respect to privacy loss are lacking. This paper takes a key step towards developing a general quantitative model from which one can analyze and generate metrics of privacy loss by introducing the VPS (Valuations of Possible States) framework. VPS is shown to capture various existing measures of privacy created for specific domains of distributed constraint satisfaction problems (DCSPs). The utility of VPS is further illustrated via analysis of DCOP algorithms when such algorithms are used by personal assistant agents to schedule meetings among users. In addition, VPS allows us to quantitatively evaluate the properties of several privacy metrics generated through qualitative notions. We obtain the unexpected result that decentralization does not automatically guarantee superior protection of privacy.

Discipline

Artificial Intelligence and Robotics | Information Security

Research Areas

Intelligent Systems and Optimization

Publication

Proceedings of the 4th International Joint Conference on Autonomous Agents and Multiagent Systems, AAMAS 2005, July 25-29, Utrecht, Netherlands

First Page

1030

Last Page

1037

ISBN

9781595930934

Identifier

10.1145/1082473.1082629

Publisher

ACM

City or Country

New York, NY

Copyright Owner and License

Publisher

Additional URL

https://doi.org/10.1145/1082473.1082629
