
Bayesian Convolutional Deep Sets with Task-Dependent Stationary Prior

Yohan Jung · Jinkyoo Park

Auditorium 1 Foyer 138


Convolutional deep sets are a deep neural network (DNN) architecture that can model stationary stochastic processes. The architecture combines a kernel smoother with a DNN to construct translation-equivariant functional representations, thereby building the inductive bias of stationarity into the DNN. However, because the kernel smoother is a non-parametric model, it can produce ambiguous representations when too few data points are given. To remedy this issue, we introduce Bayesian convolutional deep sets, which construct random translation-equivariant functional representations with a stationary prior. Furthermore, we present how to impose a task-dependent prior for each dataset, since a wrongly imposed prior yields representations even worse than those of the kernel smoother. We validate the proposed architecture and its training on various experiments with time-series and image datasets.
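The translation-equivariant functional representation mentioned above can be illustrated with a minimal NumPy sketch of the kernel-smoother ("SetConv") encoder used in convolutional deep sets. This is not the authors' Bayesian variant; it is a hedged, simplified sketch of the deterministic baseline encoder, with an assumed RBF kernel and an assumed two-channel (density + signal) representation:

```python
import numpy as np

def set_conv(x_ctx, y_ctx, t_grid, length_scale=0.2):
    """Kernel-smoother encoder of convolutional deep sets (sketch).

    Maps an off-the-grid context set {(x_i, y_i)} to a two-channel
    functional representation evaluated on a uniform grid t_grid:
      channel 0: density,  sum_i k(t - x_i)
      channel 1: smoothed signal,  sum_i k(t - x_i) * y_i
    Because k depends only on the difference t - x_i, the map is
    translation equivariant: shifting the context shifts the output.
    The RBF kernel and length scale here are illustrative choices.
    """
    diff = t_grid[:, None] - x_ctx[None, :]           # (grid, ctx) pairwise differences
    k = np.exp(-0.5 * (diff / length_scale) ** 2)     # RBF kernel weights
    density = k.sum(axis=1)                           # channel 0
    signal = k @ y_ctx                                # channel 1
    return np.stack([density, signal], axis=-1)       # (grid, 2)

# Translation equivariance check: shifting the context inputs by exactly
# one grid step shifts the representation by one grid index.
t = np.linspace(-3.0, 3.0, 121)                       # uniform grid, spacing d
x = np.array([-0.7, 0.1, 0.9])
y = np.array([1.0, -0.5, 2.0])
d = t[1] - t[0]
h = set_conv(x, y, t)
h_shift = set_conv(x + d, y, t)
assert np.allclose(h_shift[1:], h[:-1])
```

In the full architecture, a CNN is applied on top of this gridded representation; since convolutions are themselves translation equivariant, the stationarity bias is preserved end to end.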
