Dynamical Gaussian Process Latent Variable Model for Representation Learning from Longitudinal Data

Thanh Le, Vasant Honavar

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

4 Scopus citations

Abstract

Many real-world applications involve longitudinal data, consisting of observations of several variables, where different subsets of variables are sampled at irregularly spaced time points. We introduce the Longitudinal Gaussian Process Latent Variable Model (L-GPLVM), a variant of the Gaussian Process Latent Variable Model, for learning compact representations of such data. L-GPLVM overcomes a key limitation of the Dynamic Gaussian Process Latent Variable Model and its variants, which rely on the assumption that the data are fully observed over all of the sampled time points. We describe an effective approach to learning the parameters of L-GPLVM from sparse observations, by coupling the dynamical model with a Multitask Gaussian Process model that samples the missing observations at each step of the gradient-based optimization of the variational lower bound. We further show the advantage of the Sparse Process Convolution framework for learning the latent representation of sparsely and irregularly sampled longitudinal data with minimal computational overhead relative to a standard Latent Variable Model. Experiments with synthetic data, as well as variants of MOCAP data with varying degrees of sparsity of observations, show that L-GPLVM substantially and consistently outperforms the state-of-the-art alternatives in recovering the missing observations, even when the available data exhibit a high degree of sparsity. The resulting compact representations of irregularly sampled and sparse longitudinal data can be used for a variety of machine learning tasks, including clustering, classification, and regression.
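The imputation step the abstract describes — drawing the missing observations from a Gaussian Process posterior conditioned on the irregularly sampled values — can be illustrated with a minimal single-output sketch. This is not the paper's L-GPLVM implementation: the function name `gp_impute`, the RBF kernel choice, and the single-task simplification (the paper uses a Multitask GP across variables) are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(t1, t2, lengthscale=1.0, variance=1.0):
    # Squared-exponential kernel over scalar time points.
    d = t1[:, None] - t2[None, :]
    return variance * np.exp(-0.5 * (d / lengthscale) ** 2)

def gp_impute(t_obs, y_obs, t_miss, noise=1e-2, seed=0):
    """Sample values at missing time points from the GP posterior
    conditioned on the irregularly observed (t_obs, y_obs) pairs.
    Single-output stand-in for the paper's Multitask GP sampler."""
    K = rbf_kernel(t_obs, t_obs) + noise * np.eye(len(t_obs))
    Ks = rbf_kernel(t_miss, t_obs)
    Kss = rbf_kernel(t_miss, t_miss)
    # Stable posterior mean/covariance via Cholesky factorization.
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_obs))
    mean = Ks @ alpha
    v = np.linalg.solve(L, Ks.T)
    cov = Kss - v.T @ v + 1e-8 * np.eye(len(t_miss))
    rng = np.random.default_rng(seed)
    return rng.multivariate_normal(mean, cov)

# Toy usage: recover a sine signal at unobserved time points.
t_obs = np.linspace(0.0, 6.0, 25)
y_obs = np.sin(t_obs)
t_miss = np.array([1.5, 3.3])
y_hat = gp_impute(t_obs, y_obs, t_miss)
```

In the full model, a draw like `y_hat` would stand in for the missing entries at each gradient step of the variational-lower-bound optimization, so the dynamical latent model always sees a complete pseudo-observation.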

Original language: English (US)
Title of host publication: FODS 2020 - Proceedings of the 2020 ACM-IMS Foundations of Data Science Conference
Publisher: Association for Computing Machinery, Inc
Pages: 183-188
Number of pages: 6
ISBN (Electronic): 9781450381031
DOIs
State: Published - Oct 19 2020
Event: 2020 ACM-IMS Foundations of Data Science Conference, FODS 2020 - Virtual, Online, United States
Duration: Oct 19 2020 - Oct 20 2020

Publication series

Name: FODS 2020 - Proceedings of the 2020 ACM-IMS Foundations of Data Science Conference

Conference

Conference: 2020 ACM-IMS Foundations of Data Science Conference, FODS 2020
Country/Territory: United States
City: Virtual, Online
Period: 10/19/20 - 10/20/20

All Science Journal Classification (ASJC) codes

  • Computer Networks and Communications
  • Computer Science Applications
  • Information Systems
