A time-varying group sparse additive model for genome-wide association studies of dynamic complex traits

Micol Marchetti-Bowick, Junming Yin, Judie A. Howrylak, Eric P. Xing

Research output: Contribution to journal › Article › peer-review

13 Scopus citations

Abstract

Motivation: Despite the widespread popularity of genome-wide association studies (GWAS) for genetic mapping of complex traits, most existing GWAS methodologies are still limited to the use of static phenotypes measured at a single time point. In this work, we propose a new method for association mapping that considers dynamic phenotypes measured at a sequence of time points. Our approach relies on the use of Time-Varying Group Sparse Additive Models (TV-GroupSpAM) for high-dimensional, functional regression.

Results: This new model detects a sparse set of genomic loci that are associated with trait dynamics, and demonstrates increased statistical power over existing methods. We evaluate our method via experiments on synthetic data and perform a proof-of-concept analysis for detecting single nucleotide polymorphisms associated with two phenotypes used to assess asthma severity: forced vital capacity, a sensitive measure of airway obstruction, and bronchodilator response, which measures lung response to bronchodilator drugs.

Availability and Implementation: Source code for TV-GroupSpAM, implemented in MATLAB, is freely available for download at http://www.cs.cmu.edu/~mmarchet/projects/tv-group-spam.
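The core idea behind detecting "a sparse set of genomic loci associated with trait dynamics" can be illustrated with a much simpler stand-in model than the paper's TV-GroupSpAM: a group-lasso multi-task regression in which each SNP's coefficients across all time points form one group, so a SNP is either selected for the whole trajectory or excluded entirely. The sketch below is an illustrative assumption, not the authors' additive-model method (which uses nonparametric component functions and is released in MATLAB); all function names here are hypothetical.

```python
import numpy as np

# Illustrative only: group-lasso multi-task regression, NOT the authors'
# TV-GroupSpAM. Each SNP's coefficient vector across the T time points is
# one group, mimicking "a locus affects the trait dynamics or it doesn't".

def group_soft_threshold(B, thresh):
    """Row-wise group soft-thresholding: shrink each SNP's coefficient
    vector (one row, spanning all time points) toward zero as a group."""
    norms = np.linalg.norm(B, axis=1, keepdims=True)
    scale = np.maximum(0.0, 1.0 - thresh / np.maximum(norms, 1e-12))
    return B * scale

def group_lasso_multitask(X, Y, lam, n_iter=500):
    """Proximal gradient descent on
        0.5 * ||Y - X B||_F^2 + lam * sum_j ||B_j||_2,
    where X is (n, p) genotypes and Y is (n, T) phenotype measurements
    at T time points."""
    p, T = X.shape[1], Y.shape[1]
    step = 1.0 / np.linalg.norm(X, 2) ** 2  # 1 / Lipschitz constant
    B = np.zeros((p, T))
    for _ in range(n_iter):
        grad = X.T @ (X @ B - Y)
        B = group_soft_threshold(B - step * grad, step * lam)
    return B

# Toy example: 3 of 20 SNPs truly influence the trait at all 5 time points.
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 20))
B_true = np.zeros((20, 5))
B_true[:3] = 2.0 * rng.standard_normal((3, 5))
Y = X @ B_true + 0.1 * rng.standard_normal((100, 5))

B_hat = group_lasso_multitask(X, Y, lam=5.0)
selected = np.where(np.linalg.norm(B_hat, axis=1) > 1e-6)[0]
```

Because the penalty is applied to whole rows of the coefficient matrix, a causal SNP contributes evidence from every time point at once, which is one intuition for why dynamic phenotypes can yield more statistical power than a single static measurement.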

Original language: English (US)
Pages (from-to): 2903-2910
Number of pages: 8
Journal: Bioinformatics
Volume: 32
Issue number: 19
DOIs
State: Published - Oct 1 2016

All Science Journal Classification (ASJC) codes

  • Statistics and Probability
  • Biochemistry
  • Molecular Biology
  • Computer Science Applications
  • Computational Theory and Mathematics
  • Computational Mathematics
