Abstract
Many papers in the medical literature analyze the cost-effectiveness of disease screening by comparing a limited number of a priori testing policies under estimated problem parameters. However, this may be insufficient to determine the best timing of the tests or to incorporate changes over time. In this paper, we develop and solve a Markov Decision Process (MDP) model for a simple class of asymptomatic diseases in order to provide the building blocks for the analysis of a more general class of diseases. We provide a computationally efficient method for determining a cost-effective dynamic intervention strategy that takes into account (i) each individual's previous test result and (ii) the change in the individual's behavior based on awareness of the disease. We demonstrate the usefulness of the approach by applying the results to screening decisions for Hepatitis C (HCV) using medical data, and we compare our findings to current HCV screening recommendations.
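To make the abstract's approach concrete, the sketch below shows a toy finite-horizon MDP for a screening decision, solved by backward induction. It is only an illustration of the general technique, not the paper's model: the states (healthy, infected-unaware, infected-aware), transition probabilities, costs, and discount factor are all hypothetical placeholders, and none of the HCV data or results from the paper are used.

```python
"""
Minimal sketch of a finite-horizon screening MDP solved by backward induction.
All states, probabilities, and costs below are hypothetical, for illustration only.
"""
import numpy as np

# Hypothetical states: 0 = healthy, 1 = infected & unaware, 2 = infected & aware
# Hypothetical actions: 0 = wait, 1 = screen (administer a test)
N_STATES, N_ACTIONS, HORIZON = 3, 2, 20

# Transition matrices P[a][s, s'] (assumed values).
P = np.zeros((N_ACTIONS, N_STATES, N_STATES))
# Wait: a healthy person may become infected (and unaware); awareness is unchanged.
P[0] = [[0.97, 0.03, 0.00],
        [0.00, 1.00, 0.00],
        [0.00, 0.00, 1.00]]
# Screen: an unaware infected person is detected and becomes aware.
P[1] = [[0.97, 0.03, 0.00],
        [0.00, 0.00, 1.00],
        [0.00, 0.00, 1.00]]

# Per-period costs c[a, s]: screening adds a test cost, and awareness lowers the
# expected progression cost (the "behavior change" effect). Hypothetical numbers.
TEST_COST = 50.0
c = np.array([[0.0,        400.0,             150.0],                # wait
              [TEST_COST,  400.0 + TEST_COST, 150.0 + TEST_COST]])   # screen

discount = 0.97
V = np.zeros(N_STATES)                       # terminal cost-to-go
policy = np.zeros((HORIZON, N_STATES), dtype=int)

# Backward induction: at each decision epoch choose the cost-minimizing action.
for t in reversed(range(HORIZON)):
    Q = c + discount * (P @ V)               # Q[a, s] = immediate + expected future cost
    policy[t] = Q.argmin(axis=0)
    V = Q.min(axis=0)

print("Cost-to-go from each state:", np.round(V, 1))
print("Action at epoch 0 (0=wait, 1=screen):", policy[0])
```

In this toy setup, screening is chosen exactly when the expected benefit of moving an undetected infection into the "aware" (lower-cost) state outweighs the test cost, which is the kind of trade-off a dynamic screening policy resolves at each epoch.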
| Original language | English (US) |
|---|---|
| Pages (from-to) | 28-37 |
| Number of pages | 10 |
| Journal | Mathematical Biosciences |
| Volume | 226 |
| Issue number | 1 |
| DOIs | |
| State | Published - Jul 2010 |
All Science Journal Classification (ASJC) codes
- Statistics and Probability
- Modeling and Simulation
- General Biochemistry, Genetics and Molecular Biology
- General Immunology and Microbiology
- General Agricultural and Biological Sciences
- Applied Mathematics