Publication Type
Conference Proceeding Article
Version
publishedVersion
Publication Date
7-2020
Abstract
Network embedding transforms complex network data into a low-dimensional vector space and has shown strong performance in many real-world scenarios, such as link prediction, node classification, and similarity search. A plethora of methods have been proposed to learn node representations and have achieved encouraging results. Nevertheless, little attention has been paid to embedding techniques for bipartite attributed networks, a typical data structure for modeling nodes from two distinct partitions. In this paper, we propose a novel model called BiANE, short for Bipartite Attributed Network Embedding. In particular, BiANE models not only the inter-partition proximity but also the intra-partition proximity. To effectively preserve the intra-partition proximity, we jointly model the attribute proximity and the structure proximity through a novel latent correlation training approach. Furthermore, we propose a dynamic positive sampling technique to overcome the efficiency drawbacks of existing dynamic negative sampling techniques. Extensive experiments on several real-world networks demonstrate that our proposed approach significantly outperforms state-of-the-art methods.
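As a minimal illustration of the data structure the abstract refers to (this is not the authors' code, and the node names, attribute vectors, and the dot-product proximity score are assumptions for the sketch only), a bipartite attributed network pairs two disjoint node partitions, each with per-node attribute vectors, and allows edges only across the partitions:

```python
# Toy bipartite attributed network: partition U (e.g. users) and
# partition V (e.g. items), each node carrying an attribute vector.
users = {"u1": [0.9, 0.1], "u2": [0.2, 0.8]}
items = {"i1": [1.0, 0.0], "i2": [0.0, 1.0]}
# Edges connect nodes across partitions only (the bipartite constraint).
edges = [("u1", "i1"), ("u2", "i2")]

def is_bipartite(edges, users, items):
    """True if every edge links a U-node to a V-node."""
    return all(u in users and v in items for u, v in edges)

def score(u, v):
    """Toy inter-partition proximity: dot product of attribute vectors."""
    return sum(a * b for a, b in zip(users[u], items[v]))
```

Here `score` stands in for the learned inter-partition proximity an embedding model would produce; intra-partition proximity (user-user or item-item) is not captured by the edge list at all, which is why the paper models it separately from attributes and structure.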
Keywords
Network embedding, Bipartite attributed network, Link prediction
Discipline
Databases and Information Systems | OS and Networks
Research Areas
Data Science and Engineering
Publication
Proceedings of the 43rd International ACM SIGIR Conference on Research and Development in Information Retrieval, Virtual Event, China, July 25-30, 2020
First Page
149
Last Page
158
ISBN
978-1-4503-8016-4
Identifier
10.1145/3397271.3401068
Publisher
ACM
City or Country
Virtual Event, China
Citation
1
Creative Commons License
This work is licensed under a Creative Commons Attribution-NonCommercial-No Derivative Works 4.0 International License.
Additional URL
https://doi.org/10.1145/3397271.3401068