Deep Reinforcement Learning Framework for Energy Management of Energy Hubs
Abstract:
A model-free, data-driven deep reinforcement learning (DRL) framework can produce an intelligent controller that exploits available information to optimally schedule an energy hub with the aim of minimizing energy costs and emissions. By formulating the energy hub scheduling problem over multi-dimensional continuous state and action spaces, the method enables more efficient operation while accounting for the nonlinear physical characteristics of energy hub components, such as the nonconvex feasible operating regions of combined heat and power (CHP) units, the valve-point effects of power-only units, and the dynamic efficiency of fuel cells. Moreover, to help the deep deterministic policy gradient (DDPG) agent learn an optimal policy efficiently, a hybrid forecasting model based on convolutional neural networks (CNNs) and bidirectional long short-term memory (BLSTM) networks is developed to mitigate the risk associated with photovoltaic (PV) power generation, which can be highly intermittent, particularly on cloudy days. The effectiveness and applicability of the proposed scheduling framework in reducing energy costs and emissions while coping with uncertainties are demonstrated by comparing it against conventional robust optimization and stochastic programming approaches, as well as state-of-the-art DRL methods, in different case studies.
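As a rough illustration of the hybrid forecasting component described in the abstract, the sketch below builds a CNN + BLSTM network in Keras that maps a window of recent PV and weather measurements to a day-ahead PV power forecast. This is a minimal sketch, not the speaker's implementation: the lookback window, feature set, layer sizes, and forecast horizon are hypothetical placeholders.

```python
# Minimal, illustrative CNN + bidirectional LSTM forecaster for PV power.
# All hyperparameters below are assumptions for illustration only.
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, models

LOOKBACK = 48      # hours of history fed to the model (assumed)
N_FEATURES = 4     # e.g., PV output, irradiance, temperature, cloud cover (assumed)
HORIZON = 24       # hours of PV power to predict (assumed)

def build_cnn_bilstm(lookback=LOOKBACK, n_features=N_FEATURES, horizon=HORIZON):
    """1-D convolutions extract local temporal patterns; a bidirectional LSTM
    models longer-range dependencies in both time directions; a dense head
    maps the learned representation to the forecast horizon."""
    inputs = layers.Input(shape=(lookback, n_features))
    x = layers.Conv1D(32, kernel_size=3, padding="same", activation="relu")(inputs)
    x = layers.MaxPooling1D(pool_size=2)(x)
    x = layers.Bidirectional(layers.LSTM(64))(x)
    x = layers.Dense(64, activation="relu")(x)
    outputs = layers.Dense(horizon)(x)  # predicted PV power for each hour ahead
    model = models.Model(inputs, outputs)
    model.compile(optimizer="adam", loss="mse")
    return model

if __name__ == "__main__":
    model = build_cnn_bilstm()
    # Synthetic data only, to show the expected tensor shapes.
    X = np.random.rand(256, LOOKBACK, N_FEATURES).astype("float32")
    y = np.random.rand(256, HORIZON).astype("float32")
    model.fit(X, y, epochs=2, batch_size=32, verbose=0)
    print(model.predict(X[:1]).shape)  # -> (1, 24)
```

In the framework described above, forecasts of this kind are intended to support the DDPG agent in coping with intermittent PV output when scheduling the energy hub.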
Date and Time
- Date: 15 Jul 2024
- Time: 03:00 PM to 04:00 PM
- All times are (UTC-05:00) Central Time (US & Canada)

Registration
- Starts 15 June 2024 12:00 AM
- Ends 15 July 2024 03:00 PM
- All times are (UTC-05:00) Central Time (US & Canada)
- No Admission Charge
Speakers
Dr. Hussein Abdeltawab
Biography:
Dr. Abdeltawab received his Ph.D. in electrical and computer engineering from the University of Alberta, Canada, in 2017. He worked as a design engineer with several Canadian energy consulting companies, where he participated in more than 50 projects, and he is a licensed Professional Engineer in Canada with four years of industrial experience. He joined Penn State as an Assistant Professor in 2020, and he is now an Assistant Professor of Electrical Engineering at Wake Forest University (WFU).
Dr. Abdeltawab is the PI of the Intelligent Power and Energy Management Lab (IPEM) at WFU. He is also the Chair of the IEEE Winston-Salem Chapter. Further, he is a Senior Member of the IEEE and an Associate Editor of the IET Generation, Transmission & Distribution (GTD) Journal. His research interests include artificial intelligence, deep learning, and optimization, with applications in transportation electrification, multi-carrier energy systems, energy storage, renewable energy integration, and power system planning.
Research Experience:
1) Energy management of energy storage resources, electric vehicles (EVs), and other distributed generation resources.
2) Optimal planning and allocation of energy resources in distribution power systems.
3) Power system studies, including load flow, stability, and optimal dispatch.
4) Control and operation of renewable energy resources in the power system.
5) Applying artificial intelligence to forecasting and optimal decision making.
6) Deep learning applications in wind and PV power forecasting.
7) Deep reinforcement learning applications in multi-carrier energy systems.