Improving posture recognition among construction workers through data augmentation with generative adversarial network

J. Zhao, E. Obonyo, Q. Yin

Research output: Contribution to journal › Conference article › peer-review

Abstract

Deep Neural Network (DNN) models have shown high potential for recognizing workers' risky postures from data captured by wearable Inertial Measurement Units (IMUs). However, there is a data paucity challenge: DNN models require a large annotated dataset to achieve desirable performance. The research discussed in this paper addresses this problem through a data generation framework that leverages a Generative Adversarial Network (GAN) to i) synthesize motion data, ii) augment the training data, and iii) improve recognition performance. Its potential was validated using naturalistic posture data collected from workers. Three GAN models were developed for data generation. A Train on Real and Test on Hybrid approach was used to quantitatively assess the synthesized data and select sufficiently trained GAN models. The performance of three commonly used DNN models was compared after data augmentation. Results showed that augmentation with GAN-synthesized data improved recognition accuracy by 1.2%–3% across postures. These findings suggest the feasibility of applying motion data augmentation with GAN models to advance automated construction safety monitoring.
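The augmentation idea described in the abstract can be illustrated with a minimal sketch: real labeled motion windows are combined with class-conditional synthesized windows to form a hybrid training set, and a classifier trained on each set is evaluated on held-out real data. Everything below is hypothetical, not from the paper: `gan_generate` is a placeholder for samples drawn from a trained GAN (here it simply draws from the same class-conditional distribution), and a nearest-centroid classifier stands in for the paper's DNN models.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for the paper's IMU data: each motion "window"
# is flattened to a feature vector, labeled with one of three postures.
def make_windows(n, cls, dim=16):
    # Class-dependent mean simulates distinct posture signatures.
    return rng.normal(loc=cls, scale=1.0, size=(n, dim))

real_train = [(make_windows(20, c), c) for c in range(3)]

# Placeholder for a trained GAN generator; in the paper these samples
# would come from one of the three trained GAN models.
def gan_generate(n, cls, dim=16):
    return rng.normal(loc=cls, scale=1.0, size=(n, dim))

# Augmentation: hybrid training set = real + synthesized windows per class.
hybrid_train = [(np.vstack([x, gan_generate(40, c)]), c) for x, c in real_train]

# Nearest-centroid classifier as a toy stand-in for the DNN models.
def fit_centroids(train):
    return {c: x.mean(axis=0) for x, c in train}

def predict(centroids, X):
    keys = sorted(centroids)
    dists = np.stack([np.linalg.norm(X - centroids[k], axis=1) for k in keys])
    return np.array(keys)[dists.argmin(axis=0)]

# Test on real data only, comparing real-only vs. hybrid training sets.
test_X = np.vstack([make_windows(50, c) for c in range(3)])
test_y = np.repeat(np.arange(3), 50)

for name, train in [("real only", real_train), ("hybrid", hybrid_train)]:
    acc = (predict(fit_centroids(train), test_X) == test_y).mean()
    print(f"{name}: accuracy {acc:.2f}")
```

The paper's Train on Real and Test on Hybrid assessment inverts the final step (training on real data, testing on mixed real/synthetic data) to judge whether a GAN is sufficiently trained before its output is used for augmentation.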

Original language: English (US)
Article number: 092005
Journal: IOP Conference Series: Earth and Environmental Science
Volume: 1101
Issue number: 9
DOIs
State: Published - 2022
Event: International Council for Research and Innovation in Building and Construction World Building Congress 2022, WBC 2022 - Melbourne, Australia
Duration: Jun 27, 2022 – Jun 30, 2022

All Science Journal Classification (ASJC) codes

  • General Environmental Science
  • General Earth and Planetary Sciences
