2017
DOI: 10.1109/tnet.2016.2591018

Packet-Scale Congestion Control Paradigm

Cited by 5 publications (2 citation statements)
References 50 publications
“…Taking the widely used random early detection (RED) algorithm as an example, an exponentially weighted moving average (EWMA) of the queue length is adopted as the congestion signal, in which the forgetting factor plays an important role. 6 This algorithm, however, faces the same challenge raised as the second disadvantage, i.e., how to determine the optimal forgetting factor. First and foremost, if the inflow rate itself is quite high while the queue length is small, average-queue-length-based AQM algorithms will drop a much smaller fraction of packets than they should, making them inept at detecting incipient congestion.…”
Section: Introduction (mentioning)
confidence: 99%
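The EWMA congestion signal and the queue-average-based dropping that the excerpt describes can be sketched as follows. This is a minimal illustration of the classic RED scheme, not code from the cited paper; the function names, thresholds, and the weight value are illustrative.

```python
def ewma_update(avg_q, q_sample, w):
    """One EWMA step: avg <- (1 - w) * avg + w * sample.
    w is the forgetting (weighting) factor; a larger w tracks the
    instantaneous queue length more closely."""
    return (1.0 - w) * avg_q + w * q_sample

def red_drop_prob(avg_q, min_th, max_th, max_p):
    """RED-style drop probability: zero below min_th, linear up to
    max_p between min_th and max_th, and 1.0 above max_th."""
    if avg_q < min_th:
        return 0.0
    if avg_q >= max_th:
        return 1.0
    return max_p * (avg_q - min_th) / (max_th - min_th)
```

Note how a small weight illustrates the excerpt's complaint: with `w = 0.002`, a sudden queue sample of 100 packets moves the average from 0 to only 0.2, so the drop probability stays at zero even though the inflow rate is already high.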
“…12 In order to filter out transient fluctuations and smooth the congestion signal, an EWMA of the sending rate is then chosen in the work of Lovewell. 6 This algorithm, however, faces the same challenge raised as the second disadvantage, i.e., how to determine the optimal forgetting factor. If the forgetting factor is set large, the algorithm will fail to smooth abnormal data caused by bursty traffic or network noise, whereas if it is set small, it will over-rely on historical data.…”
Section: Introduction (mentioning)
confidence: 99%
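The forgetting-factor trade-off in this excerpt can be made concrete with a small numerical sketch (the trace and factor values below are assumptions for illustration only): a large factor tracks a one-sample burst almost verbatim, while a small factor filters the burst but lags behind a genuine, sustained rate change.

```python
def ewma_series(samples, w):
    """Run an EWMA (avg <- (1 - w) * avg + w * x) over a sample trace."""
    avg = samples[0]
    out = [avg]
    for x in samples[1:]:
        avg = (1.0 - w) * avg + w * x
        out.append(avg)
    return out

# A steady rate with one transient burst, then a sustained step-up.
trace = [10] * 5 + [100] + [10] * 5 + [50] * 10

smooth = ewma_series(trace, 0.1)  # small factor: filters the burst, lags the step
jumpy  = ewma_series(trace, 0.9)  # large factor: tracks the burst almost verbatim
```

With these values, `jumpy` jumps past 80 at the burst sample while `smooth` stays below 25; conversely, ten samples after the sustained step to 50, `jumpy` has essentially converged while `smooth` is still well short of it, over-relying on history exactly as the excerpt warns.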