Differentiable Change-point Detection With Temporal Point Processes
Abstract
In this paper, we consider the problem of global change-point detection in event sequence data, where both the event distributions and the change-points are unknown. For this problem, we propose DCPD, a differentiable, log-likelihood-ratio-based global change-point detector that identifies change-points after observing the entire sequence. Building on the Transformer Hawkes Process (THP), a well-known neural TPP framework, DCPD maintains a distinct intensity and mark predictor for each partition of the sequence. Further, we propose DCPD-W, a sliding-window-based extension of DCPD that improves scalability in the number of events and change-points at a minor cost in performance. Experiments on synthetic datasets explore how the relative complexity and other aspects of the underlying distributions affect properties of our detectors, namely robustness, detection accuracy, scalability, and runtime, under controlled conditions. Finally, we perform experiments on six real-world temporal event sequences collected from diverse domains such as health and geography, and show that our methods either outperform or perform comparably to the baselines.
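To make the log-likelihood-ratio idea behind our detectors concrete, the following is a minimal, self-contained sketch of a likelihood-ratio scan for a single change-point. It assumes, unlike DCPD, a piecewise-constant Poisson intensity rather than a learned THP model, and all function names are illustrative rather than part of our implementation.

```python
import numpy as np

def poisson_loglik(n_events, duration):
    """Log-likelihood of a homogeneous Poisson process at its MLE rate n/T."""
    if duration <= 0 or n_events == 0:
        return 0.0
    rate = n_events / duration
    return n_events * np.log(rate) - rate * duration

def scan_single_changepoint(event_times, T):
    """Return the candidate change-point maximising the log-likelihood ratio
    between a two-segment fit and a single-segment fit over [0, T]."""
    times = np.sort(np.asarray(event_times, dtype=float))
    base = poisson_loglik(len(times), T)          # single-segment fit
    best_tau, best_lr = None, -np.inf
    for tau in times[1:-1]:                        # candidate split at each event
        n_left = np.searchsorted(times, tau)       # events strictly before tau
        lr = (poisson_loglik(n_left, tau)
              + poisson_loglik(len(times) - n_left, T - tau)
              - base)
        if lr > best_lr:
            best_tau, best_lr = tau, lr
    return best_tau, best_lr

# Toy sequence: the rate jumps from about 1 to about 5 events per unit time at t = 50.
rng = np.random.default_rng(0)
seq = np.concatenate([np.sort(rng.uniform(0, 50, 50)),
                      50 + np.sort(rng.uniform(0, 50, 250))])
print(scan_single_changepoint(seq, T=100.0))       # detected tau lands near 50
```

DCPD replaces the closed-form Poisson fits above with THP-based intensity and mark predictors for each partition and makes the partitioning itself differentiable, so the change-points can be learned jointly with the model rather than found by an exhaustive scan.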