Marzieh Ajirak, Immanuel Elbau, Nili Solomonov, Logan Grosenick
This paper focuses on discrete representation learning for multivariate time series with Gaussian processes. To overcome the challenges inherent in incorporating discrete latent variables into deep learning models, our approach uses the Gumbel-softmax reparameterization trick to address non-differentiability, enabling joint clustering and embedding through learnable discretization of the latent space. The proposed architecture thus enhances interpretability both by estimating a low-dimensional embedding for high-dimensional time series and by simultaneously discovering discrete latent states. Empirical assessments on synthetic and real-world fMRI data validate the model's efficacy, showing improved classification results using our representation.
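The Gumbel-softmax reparameterization mentioned in the abstract can be sketched as follows. This is a minimal NumPy illustration of the general trick, not the authors' implementation; the function name and temperature value are assumptions for the example.

```python
import numpy as np

def gumbel_softmax(logits, tau=0.5, rng=None):
    """Draw a 'soft' one-hot sample from a categorical distribution
    parameterized by `logits`, using the Gumbel-softmax trick.

    The sample is a differentiable relaxation of a discrete draw,
    which is what lets discrete latent states be trained end-to-end.
    """
    rng = np.random.default_rng() if rng is None else rng
    # Gumbel(0, 1) noise: g = -log(-log(u)), with u ~ Uniform(0, 1)
    u = rng.uniform(size=logits.shape)
    g = -np.log(-np.log(u))
    # Temperature tau controls sharpness: tau -> 0 approaches a hard one-hot
    y = (logits + g) / tau
    y = np.exp(y - y.max())  # numerically stable softmax
    return y / y.sum()

# Example: 4 latent states; a low temperature yields a near-one-hot sample
sample = gumbel_softmax(np.log(np.array([0.1, 0.2, 0.3, 0.4])), tau=0.1)
```

In a deep model, the soft sample replaces the non-differentiable argmax over latent states, so gradients flow through the sampling step during training.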
Ching Chang, Chiao-Tung Chan, Wei-Yao Wang, Wen-Chih Peng, Tien-Fu Chen
Sangho Lee, Wonjoon Kim, Youngdoo Son
Xiongjun Zhao, Ling Zou, Long Ye, Zhengyu Liu, Shaoliang Peng
Mustafa Gökçe Baydoğan, George C. Runger
Michael Potter, Ilkay Yildiz Potter, Octavia Camps, Mario Sznaier