Dynamic Bayesian Contrastive Predictive Coding Model for Personalized Product Search

Document Type

Article

Publication Title

ACM Transactions on the Web

Abstract

In this article, we study the problem of dynamic personalized product search. Because search records in the real world are sparse, existing methods suffer from data inefficiency. We address this challenge by proposing a Dynamic Bayesian Contrastive Predictive Coding model (DBCPC), which aims to capture the rich structured information behind search records to improve data efficiency. Our proposed DBCPC utilizes contrastive predictive learning to jointly learn dynamic embeddings together with structural information of entities (i.e., users, products, and words). Specifically, DBCPC employs structured prediction to tackle the intractability caused by the non-linear output space and uses a time-embedding technique to avoid designing a separate encoder for each time step in the dynamic Bayesian model. In this way, our model jointly learns the underlying embeddings of entities (i.e., users, products, and words) via prediction tasks, which enables the embeddings to focus on the general attributes of entities and to capture how preferences evolve over time. To infer the dynamic embeddings, we propose an inference algorithm that combines a variational objective with contrastive objectives. Experiments on an Amazon dataset show that our proposed DBCPC learns higher-quality embeddings and outperforms state-of-the-art non-dynamic and dynamic models for product search.
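To illustrate the kind of training objective the abstract describes (a variational term combined with a contrastive predictive term over time-conditioned embeddings), the following is a minimal PyTorch sketch. It is not the authors' DBCPC implementation: the class and function names (TimeEmbedding, ToyDynamicEncoder, info_nce, toy_objective) and all hyperparameters are hypothetical, and the article itself specifies the actual model and inference algorithm.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TimeEmbedding(nn.Module):
    """Sinusoidal time embedding, letting one encoder be shared across time
    steps instead of training a separate encoder per step (an assumed stand-in
    for the time-embedding idea mentioned in the abstract)."""
    def __init__(self, dim: int):
        super().__init__()
        self.dim = dim

    def forward(self, t: torch.Tensor) -> torch.Tensor:
        half = self.dim // 2
        freqs = torch.exp(-torch.arange(half, dtype=torch.float32) * (4.0 / half))
        angles = t.float().unsqueeze(-1) * freqs
        return torch.cat([angles.sin(), angles.cos()], dim=-1)

class ToyDynamicEncoder(nn.Module):
    """Maps (entity id, time) to a Gaussian posterior over a dynamic embedding."""
    def __init__(self, n_entities: int, dim: int):
        super().__init__()
        self.base = nn.Embedding(n_entities, dim)
        self.time_emb = TimeEmbedding(dim)
        self.mu = nn.Linear(2 * dim, dim)
        self.logvar = nn.Linear(2 * dim, dim)

    def forward(self, ids: torch.Tensor, t: torch.Tensor):
        h = torch.cat([self.base(ids), self.time_emb(t)], dim=-1)
        return self.mu(h), self.logvar(h)

def info_nce(queries: torch.Tensor, keys: torch.Tensor, temperature: float = 0.1):
    """Contrastive predictive term: each query should score its own key highest,
    with the other keys in the batch serving as negatives."""
    logits = queries @ keys.t() / temperature
    targets = torch.arange(queries.size(0))
    return F.cross_entropy(logits, targets)

def toy_objective(user_enc, item_enc, users, items, t, beta: float = 0.1):
    """Variational regularization (KL to a standard normal prior) plus a
    contrastive user-item prediction term."""
    mu_u, logvar_u = user_enc(users, t)
    mu_i, logvar_i = item_enc(items, t)
    # Reparameterized samples of the dynamic embeddings.
    z_u = mu_u + torch.randn_like(mu_u) * (0.5 * logvar_u).exp()
    z_i = mu_i + torch.randn_like(mu_i) * (0.5 * logvar_i).exp()
    # KL(q || N(0, I)) for both approximate posteriors.
    kl = -0.5 * (1 + logvar_u - mu_u.pow(2) - logvar_u.exp()).sum(-1).mean()
    kl += -0.5 * (1 + logvar_i - mu_i.pow(2) - logvar_i.exp()).sum(-1).mean()
    return info_nce(z_u, z_i) + beta * kl

# Usage: predict which product a user interacted with at time step t.
user_enc = ToyDynamicEncoder(n_entities=1000, dim=32)
item_enc = ToyDynamicEncoder(n_entities=5000, dim=32)
users = torch.randint(0, 1000, (16,))
items = torch.randint(0, 5000, (16,))
t = torch.randint(0, 12, (16,))
loss = toy_objective(user_enc, item_enc, users, items, t)
loss.backward()
```

In this sketch the time embedding is simply concatenated with a static entity embedding before the posterior heads, which is one common way to condition a single shared encoder on time; the article's actual architecture may differ.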

DOI

10.1145/3609225

Publication Date

10-10-2023

Keywords

Bayesian networks, Data mining, Embeddings, Inference engines, Information retrieval

Comments

IR conditions: non-described
