TY - JOUR
T1 - Energy-efficient heating control for smart buildings with deep reinforcement learning
AU - Gupta, Anchal
AU - Badr, Youakim
AU - Negahban, Ashkan
AU - Qiu, Robin G.
N1 - Publisher Copyright:
© 2020 Elsevier Ltd
PY - 2021/2
Y1 - 2021/2
N2 - Buildings account for roughly 40% of the total energy consumption in the world, with heating, ventilation, and air conditioning being the major contributors. Traditional heating controllers are inefficient due to their lack of adaptability to dynamic conditions such as changing user preferences and outside temperature patterns. It is therefore necessary to design energy-efficient controllers that improve occupant thermal comfort (i.e., reduce deviation from the setpoint temperature) while reducing energy consumption. This research presents a Deep Reinforcement Learning (DRL)-based heating controller to improve thermal comfort and minimize energy costs in smart buildings. We perform extensive simulation experiments using real-world outside temperature data. The results show that the DRL-based smart controller outperforms a traditional thermostat controller, improving thermal comfort by 15% to 30% and reducing energy costs by 5% to 12% in the simulated environment. A second set of experiments is then performed for the case of multiple buildings, each with its own heating equipment. Performance is compared when the buildings are controlled centrally (using a single DRL-based controller) versus under decentralized control, where each heater is controlled independently by its own DRL-based controller. We observe that as the number of buildings and the differences in their setpoint temperatures increase, decentralized control outperforms the centralized controller. The results have practical implications for heating control, especially in areas with multiple buildings, such as residential complexes with multiple houses.
UR - http://www.scopus.com/inward/record.url?scp=85096546025&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85096546025&partnerID=8YFLogxK
DO - 10.1016/j.jobe.2020.101739
M3 - Article
AN - SCOPUS:85096546025
SN - 2352-7102
VL - 34
JO - Journal of Building Engineering
JF - Journal of Building Engineering
M1 - 101739
ER -