2014 International Conference on Advances in Computing, Communications and Informatics (ICACCI)
DOI: 10.1109/icacci.2014.6968240

An optimized RFC algorithm with incremental update

Abstract: RFC (Recursive Flow Classification) is one of the best packet classification algorithms. However, RFC has moderate to prohibitively high preprocessing time for rule sets with more than 10K rules, and it does not provide incremental update. Because these essential features are missing, RFC is used only in limited scenarios. This paper attempts to add these features to RFC. Our algorithm uses various memory and processing optimizations to speed up the RFC preprocessing phase. We provide an algorithm to compute only thos…

Cited by 3 publications (3 citation statements) | References 11 publications

“…The algorithm based on dimension decomposition splits each rule into multiple dimensions of a certain number of bytes or bits. Each dimension is searched separately, and the per-dimension results are then combined to obtain the final match; representative classical algorithms include BV (Bit Vector), ABV (Aggregated Bit Vector), and RFC (Recursive Flow Classification) [15][16][17]. These methods are fast, but as the rule set grows, their space consumption can increase exponentially in the worst case.…”
Section: B. Dimension-Decomposition-Based Methods (mentioning)
confidence: 99%
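To make the dimension-decomposition idea above concrete, here is a minimal bit-vector (BV-style) sketch in Python. It is an illustration of the general technique, not code from the cited papers: each dimension precomputes, for every field value, a bitmap of the rules matching that value, and classification intersects one bitmap per dimension and returns the lowest-index (highest-priority) set bit. The Rule layout, field widths, and helper names are assumptions made for the example.

```python
# Hedged sketch of dimension-decomposition (bit-vector) packet classification.
# Rules, field widths, and names are illustrative, not from the cited papers.
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class Rule:
    ranges: List[Tuple[int, int]]   # (low, high) per dimension; index = priority

def build_bitvectors(rules: List[Rule], dim: int, field_max: int) -> List[int]:
    """For one dimension, precompute a bitmap per field value:
    bit i is set iff rules[i] matches that value in this dimension."""
    vectors = []
    for value in range(field_max + 1):
        bv = 0
        for i, r in enumerate(rules):
            lo, hi = r.ranges[dim]
            if lo <= value <= hi:
                bv |= 1 << i
        vectors.append(bv)
    return vectors

def classify(packet: List[int], per_dim_vectors: List[List[int]]) -> int:
    """Intersect the per-dimension bitmaps and return the index of the
    highest-priority matching rule, or -1 if nothing matches."""
    result = ~0
    for dim, value in enumerate(packet):
        result &= per_dim_vectors[dim][value]
    if result == 0:
        return -1
    return (result & -result).bit_length() - 1   # lowest set bit wins

# Two toy rules over two 4-bit dimensions.
rules = [Rule([(0, 7), (0, 15)]), Rule([(0, 15), (8, 15)])]
vectors = [build_bitvectors(rules, d, 15) for d in range(2)]
print(classify([3, 9], vectors))   # -> 0 (both rules match; rule 0 has priority)
```

Lookups are a handful of AND operations, but every per-value bitmap is as wide as the rule set, which is the space blow-up the excerpt above refers to.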
“…Finally, all bitmaps are intersected to get the matching result. As the number of rules increases, memory consumption grows drastically, since the bitmap length depends on the number of rules; at the same time, because an 8-bit lookup is used, the total number of memory accesses required for BV intersection is very high, which in turn degrades search performance [20]. The aggregated bit vector (ABV) [21] was proposed to reduce memory accesses through bit aggregation, which improves classification speed; however, memory consumption becomes more severe because extra information, such as the aggregated bit vectors, must be stored.…”
Section: Related Work (mentioning)
confidence: 99%
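The aggregation trick can be sketched as follows, again as a hedged illustration rather than the implementation from [21]: an aggregate vector keeps one summary bit per block of rule bits, the short aggregates are intersected first, and only blocks whose summary bit survives are fetched and intersected at full width. The block size and toy bitmaps are assumptions for the example.

```python
# Hedged ABV-style sketch: intersect short aggregate vectors first, then only
# the surviving blocks of the full bit vectors. Block size is illustrative.
BLOCK = 4   # rule bits summarized per aggregate bit (32 or 64 in practice)

def aggregate(bv: int, n_rules: int) -> int:
    """Aggregate bit j is set iff any rule bit inside block j is set."""
    agg = 0
    for j in range(0, n_rules, BLOCK):
        if (bv >> j) & ((1 << BLOCK) - 1):
            agg |= 1 << (j // BLOCK)
    return agg

def classify_abv(dim_bvs, dim_aggs, n_rules):
    """dim_bvs / dim_aggs: the full and aggregate bitmaps already looked up
    for the packet's field value in each dimension."""
    mask = (1 << BLOCK) - 1
    agg = (1 << ((n_rules + BLOCK - 1) // BLOCK)) - 1
    for a in dim_aggs:
        agg &= a                      # cheap intersection of the summaries
    j = 0
    while agg >> j:
        if (agg >> j) & 1:            # only surviving blocks are fetched
            word = mask
            for bv in dim_bvs:
                word &= (bv >> (j * BLOCK)) & mask
            if word:
                return j * BLOCK + (word & -word).bit_length() - 1
        j += 1
    return -1

# 8 toy rules, two dimensions; only rule 6 matches in both dimensions.
n_rules = 8
bv_dim0 = 0b01000001      # rules 0 and 6 match the packet in dimension 0
bv_dim1 = 0b01100000      # rules 5 and 6 match in dimension 1
aggs = [aggregate(bv, n_rules) for bv in (bv_dim0, bv_dim1)]
print(classify_abv([bv_dim0, bv_dim1], aggs, n_rules))   # -> 6 (block 0 skipped)
```

The trade-off matches the excerpt: the summary pass skips most full-width words and so reduces memory accesses, at the cost of storing the aggregate vectors themselves, which is the extra memory ABV pays for.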
“…Meanwhile, memory consumption is also very high because extra ECTs need to be stored. Many optimizations have been made in the last few years to reduce not only the memory consumption but also the preprocessing time, such as those in [4,20]. Due to the inherent complexity of RFC, it is still difficult to satisfy the various requirements.…”
Section: Related Work (mentioning)
confidence: 99%
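For context, below is a minimal sketch of the general RFC scheme being discussed: phase-0 tables map every value of a header chunk to an equivalence-class ID, and a cross-product table combines the IDs into the final matching rule. It is an illustration under simplified assumptions (two chunks, one combining phase, toy rules), not the optimized structures proposed in this paper, but it shows where the preprocessing time and memory go.

```python
# Minimal RFC-style sketch: two phase-0 chunk tables reduced by one
# cross-product table. Rules, widths, and names are illustrative.

# Toy rules: (chunk0 range, chunk1 range), listed in priority order.
rules = [((0, 3), (8, 15)), ((0, 15), (0, 15))]
CHUNK_MAX = 15

def phase0_table(rules, dim, chunk_max):
    """Map every chunk value to an equivalence-class ID: values that match
    exactly the same subset of rules share one ID, keeping later tables small."""
    table, classes = [], {}
    for v in range(chunk_max + 1):
        key = tuple(lo <= v <= hi for (lo, hi) in (r[dim] for r in rules))
        table.append(classes.setdefault(key, len(classes)))
    return table, list(classes)       # classes[i] = match bitmap of class i

def phase1_table(classes0, classes1, n_rules):
    """Cross-product table: for every pair of phase-0 class IDs, precompute the
    highest-priority rule matched by both. Its size is |classes0| * |classes1|,
    which is where RFC's preprocessing time and memory are spent."""
    table = {}
    for i, m0 in enumerate(classes0):
        for j, m1 in enumerate(classes1):
            both = [r for r in range(n_rules) if m0[r] and m1[r]]
            table[(i, j)] = both[0] if both else -1
    return table

# Preprocessing (the expensive part as rule sets grow).
t0, c0 = phase0_table(rules, 0, CHUNK_MAX)
t1, c1 = phase0_table(rules, 1, CHUNK_MAX)
t2 = phase1_table(c0, c1, len(rules))

# Lookup: a fixed number of table indexings per packet, independent of rule count.
def classify(chunk0, chunk1):
    return t2[(t0[chunk0], t1[chunk1])]

print(classify(2, 9))    # -> 0 (rule 0 wins by priority)
print(classify(7, 9))    # -> 1 (only the catch-all rule 1 matches)
```

These precomputed cross-product tables are also what an update has to repair when a rule is added or removed, which is why adding incremental update to RFC, as this paper sets out to do, is non-trivial.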