TY - JOUR
T1 - A computer algorithm to impute interrupted heart rate data for the spectral analysis of heart rate variability - The ARIC study
AU - Liao, Duanping
AU - Barnes, Ralph W.
AU - Chambless, Lloyd E.
AU - Heiss, Gerardo
N1 - Funding Information:
Support provided by National Heart, Lung, and Blood Institute Contracts N01-HC-55015, N01-HC-55016, N01-HC-55018, N01-HC-55019, N01-HC-55020, N01-HC-55021, and N01-HC-55022. The work was completed while the lead author (D.L.) was a postdoctoral fellow in the Cardiovascular Disease Epidemiology Training Program supported by NIH, NHLBI NRSA Grant 5T32HL07055.
PY - 1996/4
Y1 - 1996/4
AB - Short-term beat-to-beat heart rate data collected from the general population are often interrupted by artifacts, and the arbitrary exclusion of affected individuals from analysis may significantly reduce the sample size and/or introduce selection bias. A computer algorithm was developed to label as artifacts any data points outside the upper and lower limits generated by a 5-beat moving average ±25% (or set manually by an operator using a mouse) and to impute beat-to-beat heart rate throughout an artifact period so as to preserve the timing relationships of the adjacent, uncorrupted heart rate data. The algorithm applies the fast Fourier transform to the smoothed data to estimate low-frequency (LF; 0.025-0.15 Hz) and high-frequency (HF; 0.16-0.35 Hz) spectral powers and the HF/LF ratio as conventional indices of the sympathetic, vagal, and vagal-sympathetic balance components, respectively. We applied this algorithm to resting, supine, 2-min beat-to-beat heart rate data collected in the population-based Atherosclerosis Risk in Communities study to assess the performance (success rate) of the algorithm (N = 526) and the inter- and intra-data-operator repeatability of its use (N = 108). Eighty-eight percent of the records could be smoothed using the computer-generated limits, an additional 4.8% using manually set limits, and 7.4% could not be processed because of a large number of artifacts at the beginning or end of the records. For the repeatability study, 108 records were selected at random, and two trained data operators each applied the algorithm to the same records twice, with a 6-month interval between passes (blinded to each other's results and to their own prior results). The inter-data-operator reliability coefficients were 0.86, 0.92, and 0.90 for the HF, LF, and HF/LF components, respectively. The average intra-data-operator reliability coefficients were 0.99, 0.99, and 0.98 for the HF, LF, and HF/LF components, respectively. These results indicate that this computer algorithm is efficient and highly repeatable in processing short-term beat-to-beat heart rate data collected from the general population, provided the data operators are trained according to a standardized protocol.
UR - http://www.scopus.com/inward/record.url?scp=0029863785&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=0029863785&partnerID=8YFLogxK
U2 - 10.1006/cbmr.1996.0012
DO - 10.1006/cbmr.1996.0012
M3 - Article
C2 - 8785911
AN - SCOPUS:0029863785
SN - 0010-4809
VL - 29
SP - 140
EP - 151
JO - Computers and Biomedical Research
JF - Computers and Biomedical Research
IS - 2
ER -
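
Note: the abstract describes the processing pipeline in enough detail to sketch it. The Python sketch below is illustrative only, not the authors' implementation. The 5-beat moving-average ±25% limits and the LF (0.025-0.15 Hz) and HF (0.16-0.35 Hz) bands come from the abstract; the centered window with edge normalization, linear interpolation for imputation, the even resampling rate fs, and the periodogram scaling are all assumptions.

import numpy as np

def label_artifacts(hr, window=5, tol=0.25):
    # Flag beats outside a 5-beat moving average +/- 25% (per the abstract).
    # A centered window with edge normalization is an assumption.
    hr = np.asarray(hr, dtype=float)
    ones = np.ones(window)
    baseline = (np.convolve(hr, ones, mode="same")
                / np.convolve(np.ones_like(hr), ones, mode="same"))
    return (hr > baseline * (1 + tol)) | (hr < baseline * (1 - tol))

def impute_artifacts(hr, mask):
    # Replace flagged beats so the adjacent, uncorrupted data keep their
    # timing relationships. Linear interpolation between the nearest good
    # beats is an assumption; the abstract does not name the method.
    hr = np.asarray(hr, dtype=float).copy()
    good = ~mask
    idx = np.arange(hr.size)
    hr[mask] = np.interp(idx[mask], idx[good], hr[good])
    return hr

def band_powers(hr, fs, lf=(0.025, 0.15), hf=(0.16, 0.35)):
    # FFT periodogram of the smoothed series; `fs` is the effective
    # sampling rate of an evenly resampled heart rate series (assumed;
    # the abstract does not detail the resampling step).
    hr = np.asarray(hr, dtype=float)
    hr = hr - hr.mean()                          # remove the DC component
    spec = np.abs(np.fft.rfft(hr)) ** 2 / hr.size
    freqs = np.fft.rfftfreq(hr.size, d=1.0 / fs)
    lf_power = spec[(freqs >= lf[0]) & (freqs <= lf[1])].sum()
    hf_power = spec[(freqs >= hf[0]) & (freqs <= hf[1])].sum()
    return lf_power, hf_power, hf_power / lf_power   # LF, HF, HF/LF

if __name__ == "__main__":
    # Synthetic 2-min record resampled at an assumed 2 Hz, with one
    # simulated artifact burst.
    t = np.arange(0, 120, 0.5)
    hr = 70 + 3 * np.sin(2 * np.pi * 0.1 * t) + 2 * np.sin(2 * np.pi * 0.25 * t)
    hr[40:44] = 150
    mask = label_artifacts(hr)
    clean = impute_artifacts(hr, mask)
    lf, hf, ratio = band_powers(clean, fs=2.0)
    print(f"LF={lf:.2f}  HF={hf:.2f}  HF/LF={ratio:.2f}")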