Special Seminar in CDS/CMS
There is a surge of interest in efficient resource control for future wireless communication and control systems. Applications such as video streaming, energy harvesting, and plant stabilization and optimization pose distinct challenges for stochastic resource control. The existing literature on improving physical-layer performance does not extend to these problems, which are driven by application-level performance metrics in different application scenarios. Although the Markov decision process (MDP) is a useful tool for formulating such stochastic optimization problems, conventional solutions such as the value iteration algorithm (VIA) are purely numerical and suffer from slow convergence and a lack of design insight.

In this talk, we focus on obtaining low-complexity stochastic control solutions to the following specific problems: i) multi-antenna beamforming for supporting multimedia streaming, and ii) power management for networked control systems over correlated wireless fading channels. We formulate the associated stochastic optimization problems as infinite-horizon average-cost MDPs. Instead of applying the VIA to these MDPs, we propose a continuous-time perturbation approach and a diffusion approximation approach, and we apply them to each of the above problems to obtain low-complexity and insightful solutions. Note that even though the proposed approaches offer a framework for obtaining low-complexity approximate solutions to the original discrete-time MDP, actually solving the associated optimality equations (e.g., a multi-dimensional partial differential equation) remains a case-by-case problem. For each problem, the proposed solution is compared with state-of-the-art baselines, and simulations show that significant performance gains can be achieved.
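To make the contrast with the numerical baseline concrete, the sketch below shows relative value iteration (the standard VIA variant for the infinite-horizon average-cost criterion) on a toy two-state MDP. All numbers here are hypothetical illustrations, not from the talk; the point is that each sweep must visit every state, which is the source of the slow convergence and curse of dimensionality that the proposed closed-form approximations aim to avoid.

```python
import numpy as np

# Toy average-cost MDP (hypothetical numbers, purely illustrative):
# P[a, s, s'] = transition probability from s to s' under action a,
# C[s, a]     = per-stage cost in state s under action a.
P = np.array([
    [[0.9, 0.1], [0.2, 0.8]],   # action 0
    [[0.5, 0.5], [0.6, 0.4]],   # action 1
])
C = np.array([
    [1.0, 2.0],   # state 0: cost of actions 0, 1
    [4.0, 3.0],   # state 1: cost of actions 0, 1
])

def relative_value_iteration(P, C, tol=1e-9, max_iter=10_000):
    """Relative VI for the infinite-horizon average-cost criterion.

    Iterates h <- min_a [C(s,a) + sum_s' P(s'|s,a) h(s')] - offset,
    where the offset (the Bellman update at a fixed reference state)
    converges to the optimal average cost (the "gain").
    """
    n_states = C.shape[0]
    h = np.zeros(n_states)          # relative value function
    gain = 0.0
    for _ in range(max_iter):
        # Q[s, a] = C[s, a] + sum_s' P[a, s, s'] * h[s']
        Q = C + (P @ h).T
        h_new = Q.min(axis=1)
        new_gain = h_new[0]         # normalize at reference state 0
        h_new = h_new - new_gain    # keep h bounded
        if np.max(np.abs(h_new - h)) < tol and abs(new_gain - gain) < tol:
            h, gain = h_new, new_gain
            break
        h, gain = h_new, new_gain
    policy = Q.argmin(axis=1)       # greedy policy w.r.t. final Q
    return gain, h, policy
```

For this toy instance the algorithm recovers the optimal stationary policy and its average cost; in the high-dimensional state spaces of the talk's applications, the same sweep over all states becomes the computational bottleneck.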