

Poster

Probabilistic Modeling for Sequences of Sets in Continuous-Time

Yuxin Chang · Alex Boyd · Padhraic Smyth

MR1 & MR2 - Number 53
Fri 3 May 8 a.m. PDT — 8:30 a.m. PDT
 
Oral presentation: Oral: General Machine Learning
Fri 3 May 7 a.m. PDT — 8 a.m. PDT

Abstract: Neural marked temporal point processes have been a valuable addition to the existing toolbox of statistical parametric models for continuous-time event data. These models are useful for sequences where each event is associated with a single item (a single type of event, or "mark"), but they are not suited to the practical situation where each event is associated with a set of items. In this work, we develop a general framework for modeling set-valued data in continuous time, building on recurrent neural point process models. In addition, we develop inference methods that can use such models to answer probabilistic queries such as "the probability of item $A$ being observed before item $B$," conditioned on sequence history. Computing exact answers for such queries is generally intractable for neural models due to both the continuous-time nature of the problem setting and the combinatorially large space of event types for each event. To address this, we propose a class of importance sampling methods and demonstrate orders-of-magnitude improvements in efficiency over direct sampling via systematic experiments with four real-world datasets. We also illustrate how to use this framework to perform model selection using likelihoods that do not involve one-step-ahead prediction.
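
To give a rough sense of the kind of query answering the abstract describes, the sketch below contrasts naive Monte Carlo with importance sampling for a "does item A appear before item B?" query on a deliberately simple toy model: homogeneous Poisson arrivals with each item included in an event's set independently with a fixed probability. Everything here (the names ITEMS, P_INCLUDE, RATE, HORIZON, the toy generative model, and the choice of proposal) is an illustrative assumption; it is not the paper's recurrent neural point process model or the authors' proposed importance sampling scheme.

```python
# Illustrative sketch only -- a toy marked point process, NOT the paper's model.
# Compares naive Monte Carlo with importance sampling for the query
# "item A is observed strictly before item B" within a fixed time horizon.
import numpy as np

rng = np.random.default_rng(0)

ITEMS = ["A", "B", "C"]
P_INCLUDE = {"A": 0.02, "B": 0.05, "C": 0.50}   # true per-event inclusion probabilities
RATE, HORIZON = 2.0, 5.0                        # Poisson arrival rate, observation window


def sample_sequence(include_probs):
    """Sample a sequence of (time, set-of-items) events on [0, HORIZON]."""
    n = rng.poisson(RATE * HORIZON)
    times = np.sort(rng.uniform(0.0, HORIZON, size=n))
    marks = [{i for i in ITEMS if rng.random() < include_probs[i]} for _ in times]
    return list(zip(times, marks))


def a_before_b(seq):
    """1.0 if item A first appears strictly before item B, else 0.0."""
    t_a = next((t for t, s in seq if "A" in s), np.inf)
    t_b = next((t for t, s in seq if "B" in s), np.inf)
    return float(t_a < t_b)


def naive_estimate(n_samples=5000):
    # Direct sampling from the true model; high variance when A and B are rare.
    return np.mean([a_before_b(sample_sequence(P_INCLUDE)) for _ in range(n_samples)])


def importance_estimate(n_samples=5000):
    # Proposal: inflate the inclusion probabilities of the queried items so the
    # event of interest occurs more often, then reweight by the likelihood ratio.
    q = dict(P_INCLUDE, A=0.3, B=0.3)
    total = 0.0
    for _ in range(n_samples):
        seq = sample_sequence(q)
        log_w = 0.0
        for _, mark in seq:
            for item in ITEMS:
                p, pq = P_INCLUDE[item], q[item]
                log_w += np.log(p if item in mark else 1.0 - p)
                log_w -= np.log(pq if item in mark else 1.0 - pq)
        total += np.exp(log_w) * a_before_b(seq)
    return total / n_samples


print("naive MC estimate   :", naive_estimate())
print("importance estimate :", importance_estimate())
```

Because the proposal changes only the mark distribution and leaves the arrival process untouched, the importance weight reduces to a product of per-item Bernoulli ratios over the sampled events. In the neural setting targeted by the paper, both the temporal intensity and the set-valued mark distribution depend on history, so the proposal and weights are correspondingly more involved.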
