Markov Decision Processes for Control of a Sensor Network-Based Health Monitoring System

Abstract

Optimal use of energy is a primary concern in field-deployable sensor networks. Artificial intelligence algorithms offer the capability to improve the performance of sensor networks in dynamic environments by minimizing energy utilization without compromising overall performance. However, they have been used only to a limited extent in sensor networks, primarily due to their expensive computing requirements. We describe the use of Markov decision processes (MDPs) for the adaptive control of sensor sampling rates in a sensor network used for human health monitoring. The MDP controller is designed to gather optimal information about the patient's health while guaranteeing a minimum lifetime for the system. At every control step, the MDP controller varies the frequency at which data is collected according to the criticality of the patient's health at that time. We present a stochastic model that is used to generate the optimal policy offline. In cases where a model of the observed process is not available a priori, we describe a Q-learning technique to learn the control policy by using a pre-existing master controller. Simulation results that illustrate the performance of the controller are presented.
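The control scheme the abstract describes — choosing a sampling rate per control step, trading information gain against energy cost — can be sketched with a standard tabular Q-learning loop. This is a minimal illustrative sketch, not the authors' implementation: the criticality levels, sampling rates, reward weights, and the random-walk patient model are all hypothetical placeholders.

```python
import random

random.seed(0)

# Hypothetical state/action spaces (not from the paper):
# states are coarse patient-criticality levels, actions are sampling rates (Hz).
STATES = ["stable", "elevated", "critical"]
ACTIONS = [0.1, 1.0, 10.0]

ALPHA, GAMMA, EPSILON = 0.1, 0.9, 0.1

# Q-table: expected return of choosing a sampling rate in a given state.
Q = {(s, a): 0.0 for s in STATES for a in ACTIONS}

def reward(state, rate):
    """Hypothetical reward: information gain weighted by criticality,
    minus an energy cost proportional to the sampling rate."""
    info_weight = {"stable": 0.02, "elevated": 0.5, "critical": 1.0}[state]
    return info_weight * rate - 0.05 * rate

def choose_action(state):
    """Epsilon-greedy action selection."""
    if random.random() < EPSILON:
        return random.choice(ACTIONS)
    return max(ACTIONS, key=lambda a: Q[(state, a)])

def update(state, action, r, next_state):
    """Standard Q-learning update rule."""
    best_next = max(Q[(next_state, a)] for a in ACTIONS)
    Q[(state, action)] += ALPHA * (r + GAMMA * best_next - Q[(state, action)])

# Train against a toy patient model with random criticality transitions.
state = "stable"
for _ in range(5000):
    action = choose_action(state)
    r = reward(state, action)
    next_state = random.choice(STATES)  # toy transition model
    update(state, action, r, next_state)
    state = next_state

# Extract the greedy policy: higher criticality should favor higher rates.
policy = {s: max(ACTIONS, key=lambda a: Q[(s, a)]) for s in STATES}
```

With these toy weights, the learned policy samples slowly in the "stable" state (where energy cost outweighs information gain) and quickly in the "critical" state, illustrating the criticality-dependent sampling behavior the abstract describes.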

Cite

Text

Panangadan et al. "Markov Decision Processes for Control of a Sensor Network-Based Health Monitoring System." AAAI Conference on Artificial Intelligence, 2005.

Markdown

[Panangadan et al. "Markov Decision Processes for Control of a Sensor Network-Based Health Monitoring System." AAAI Conference on Artificial Intelligence, 2005.](https://mlanthology.org/aaai/2005/panangadan2005aaai-markov/)

BibTeX

@inproceedings{panangadan2005aaai-markov,
  title     = {{Markov Decision Processes for Control of a Sensor Network-Based Health Monitoring System}},
  author    = {Panangadan, Anand V. and Ali, Syed Muhammad and Talukder, Ashit},
  booktitle = {AAAI Conference on Artificial Intelligence},
  year      = {2005},
  pages     = {1529--1534},
  url       = {https://mlanthology.org/aaai/2005/panangadan2005aaai-markov/}
}