2019 IEEE PES Innovative Smart Grid Technologies Europe (ISGT-Europe)
DOI: 10.1109/isgteurope.2019.8905731
Realistic Peer-to-Peer Energy Trading Model for Microgrids using Deep Reinforcement Learning

Cited by 29 publications (27 citation statements)
References 12 publications
“…Our intelligent agent strategies managed to achieve an annual average saving of 7.56 c€/kWh (25.3%) with only 10% DR. As for prosumers, they made a profit with an average annual MCP of 25.4 c€/kWh through trading, instead of feeding the energy into the grid at a feed-in tariff of 16.83 c€/kWh, i.e. a 51% annual increase in revenue, which is comparable to that of Chen and Bu (2019). However, it must be noted that the results from Chen and Bu (2019) correspond to the regulations of the United States of America, whereas ours correspond to Germany.…”
Section: Evaluation of the Results of the Model
confidence: 62%
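As a quick plausibility check of the figures quoted above, the short sketch below recomputes the percentages; the tariff and MCP values are taken from the quote, everything else is illustrative.

```python
# Back-of-the-envelope check of the percentages quoted in the citation above.
feed_in_tariff = 16.83   # c€/kWh, revenue when feeding energy into the grid
avg_mcp = 25.4           # c€/kWh, average annual market clearing price with P2P trading

revenue_increase = (avg_mcp - feed_in_tariff) / feed_in_tariff
print(f"Revenue increase from trading vs. feed-in tariff: {revenue_increase:.1%}")  # ~50.9%, i.e. the quoted 51%

implied_baseline = 7.56 / 0.253  # consumer price implied by a 7.56 c€/kWh (25.3%) saving
print(f"Implied baseline consumer price: {implied_baseline:.2f} c€/kWh")  # ~29.88 c€/kWh
```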
“…An annual decrease of 30% in the cost of electricity was reported for the community through peer-to-peer trading. Chen and Bu (2019) explored self-learning prosumer behaviour, developing intelligent agent strategies through a deep reinforcement learning method in a LEM of 200 households. The average annual saving reported through this method was 33% with trading in the LEM alone and 54% with trading and storage.…”
Section: Evaluation of the Results of the Model
confidence: 99%
“…In [52], Dominguez-Barbero et al. proposed a DQN-based energy management algorithm for an isolated residential microgrid to minimize the operating cost, which is the sum of the DG generation cost and the penalty for non-served power demand. In [53], Chen et al. proposed a DQN-based energy trading strategy for a microgrid to maximize a utility function related to trading profit, retail profit, battery wear cost, demand penalty and virtual penalty. In [54], Ji et al. proposed a DQN-based energy management algorithm for a microgrid to minimize the daily operating cost.…”
Section: Applications of DRL in Building Microgrids
confidence: 99%
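To make the utility components named in [53] concrete, the sketch below combines them into a single per-time-step reward for a DQN agent; the weights, units and functional form are assumptions for illustration, not the formulation actually used by Chen et al.

```python
# Illustrative per-step utility for a DQN-based trading agent, combining the components
# named in [53]: trading profit, retail profit, battery wear cost, demand penalty and
# virtual penalty. All coefficients are assumed for this sketch, not taken from the paper.

def step_utility(trade_kwh, trade_price, retail_kwh, retail_price,
                 battery_throughput_kwh, unmet_demand_kwh, constraint_violated,
                 wear_cost_per_kwh=0.02, demand_penalty_per_kwh=0.5, virtual_penalty=1.0):
    """Return one time-step utility (currency units) that the agent would maximise."""
    trading_profit = trade_kwh * trade_price                        # P2P / wholesale trades
    retail_profit = retail_kwh * retail_price                       # sales to local consumers
    battery_wear = battery_throughput_kwh * wear_cost_per_kwh       # degradation cost
    demand_cost = unmet_demand_kwh * demand_penalty_per_kwh         # penalise non-served demand
    virtual_cost = virtual_penalty if constraint_violated else 0.0  # soft constraint term
    return trading_profit + retail_profit - battery_wear - demand_cost - virtual_cost
```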
“…Deep reinforcement learning (DRL) [6] has been applied to MG energy management in several prior works. DRL is used for peer-to-peer energy trading in [7], and an improvement is shown over a rule-based strategy. Furthermore, the energy management problem is formulated as a Markov decision process (MDP) in [8], and an actor-critic based DRL method is proposed to minimize the energy cost.…”
Section: Introduction
confidence: 99%
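As a rough illustration of the MDP formulation mentioned here, the sketch below defines a minimal microgrid environment with a state, action and cost-based reward; the load and price profiles and the single-battery action are simplifying assumptions, not the formulation from [7] or [8]. A discrete-action DQN would discretise the battery power, while an actor-critic agent as in [8] could keep it continuous.

```python
# Minimal microgrid energy-management MDP sketch (illustrative assumptions only).
# State: (battery SoC in kWh, net load in kW, tariff in €/kWh); action: battery power in kW.
import numpy as np

class MicrogridEnv:
    def __init__(self, horizon=24, capacity_kwh=10.0, seed=0):
        self.horizon, self.capacity = horizon, capacity_kwh
        self.rng = np.random.default_rng(seed)

    def reset(self):
        self.t, self.soc = 0, 0.5 * self.capacity
        self.load, self.price = self._exogenous()
        return np.array([self.soc, self.load, self.price])

    def _exogenous(self):
        # Toy household load profile and day/night tariff standing in for real data.
        load = 2.0 + np.sin(2 * np.pi * self.t / 24) + self.rng.normal(0, 0.2)
        price = 0.30 if 6 <= self.t % 24 <= 20 else 0.20
        return load, price

    def step(self, action_kw):
        new_soc = float(np.clip(self.soc + action_kw, 0.0, self.capacity))  # 1-hour step
        grid_import = max(self.load + (new_soc - self.soc), 0.0)            # charging adds to demand
        reward = -grid_import * self.price                                  # agent minimises energy cost
        self.soc, self.t = new_soc, self.t + 1
        self.load, self.price = self._exogenous()
        done = self.t >= self.horizon
        return np.array([self.soc, self.load, self.price]), reward, done, {}
```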