Proceedings of the 10th Aerospace Technology Congress, October 8-9, 2019, Stockholm, Sweden
DOI: 10.3384/ecp19162017

Lempel-Ziv-Markov Chain Algorithm Modeling using Models of Computation and ForSyDe

Abstract: The data link is considered a critical function of modern aircraft, responsible for exchanging information with the ground and communicating with other aircraft. The increasing amount of exchanged data creates a need to optimize network usage, and data compression is a key approach to reducing the size of data packages. Because avionics systems are safety-critical, it is essential that neither data nor performance is lost during compression.…
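
The abstract's requirement that compression lose neither data nor performance rests on LZMA being lossless. As a minimal sketch of that property (using Python's standard `lzma` module as a stand-in, not the paper's ForSyDe model; the payload content is hypothetical):

```python
import lzma

# Hypothetical, repetitive data-link payload (toy stand-in for real traffic).
payload = b"telemetry frame 0042;" * 64

blob = lzma.compress(payload)             # LZMA-compressed .xz container
assert lzma.decompress(blob) == payload   # lossless: the roundtrip is bit-exact

print(f"compressed {len(payload)} bytes down to {len(blob)} bytes")
```

Repetitive payloads compress well because LZMA's dictionary matcher replaces repeated runs with short back-references, which is what allows smaller data packages without losing information.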

Cited by 5 publications (3 citation statements) · References 6 publications

“…A comprehensive case study was based on Reference [42], published by the authors of this work. That paper modeled the LZMA compression using the ForSyDe SDF MoC library, but did not represent part of the system's dynamic behavior.…”
Section: Results (mentioning, confidence: 99%)

“…A first simplified LZMA model was introduced by the author of the present research in Reference [42] using the SDF MoC. That simplified LZMA modeling required a set of assumptions to hold.…”
Section: MoC Definition Step (mentioning, confidence: 99%)
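
To make the SDF MoC mentioned in these statements concrete: an SDF (synchronous dataflow) actor consumes and produces a fixed number of tokens per firing, which is what allows schedules and buffer sizes to be derived statically. The sketch below is a hypothetical plain-Python rendering of that firing rule; ForSyDe itself is a Haskell/SystemC framework, so the class and names here are illustrative, not ForSyDe's API:

```python
from collections import deque

class SDFActor:
    """Hypothetical SDF actor: fixed consumption/production rates per firing."""

    def __init__(self, consume: int, produce: int, func):
        self.consume = consume  # tokens read from the input channel per firing
        self.produce = produce  # tokens written to the output channel per firing
        self.func = func        # pure function: `consume` tokens -> `produce` tokens

    def fire(self, inbox: deque, outbox: deque) -> bool:
        """Fire once if enough input tokens are buffered; report whether we fired."""
        if len(inbox) < self.consume:
            return False
        tokens = [inbox.popleft() for _ in range(self.consume)]
        result = self.func(tokens)
        assert len(result) == self.produce, "SDF contract: fixed output rate"
        outbox.extend(result)
        return True

# Example: an actor that packs 4 byte-tokens into one word-token (rates 4 -> 1).
pack = SDFActor(4, 1, lambda ts: [bytes(ts)])
inbox, outbox = deque([0x4C, 0x5A, 0x4D, 0x41]), deque()
while pack.fire(inbox, outbox):
    pass
print(outbox)  # deque([b'LZMA'])
```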

“…A compression algorithm is defined as either lossless (i.e., when a compressed file is decompressed, the output matches the original file) or lossy (i.e., when a compressed file is decompressed, the output is epsilon-close to the original data, but not identical). Standard lossless compression algorithms used in many domains [5,4,1,37,15] are based on deriving a token-based mapping to reduce the compressed file size [16,18,7,35,10,3,13]. Yet, there are no published studies attempting to combine these concepts from information theory, explicitly leveraging the various lossless data compression algorithms as feature extractors.…”
Section: Introduction (mentioning, confidence: 99%)
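
The "token-based mapping" this statement refers to can be illustrated with a toy LZ78-style dictionary coder. This is a deliberately simplified sketch, not LZMA itself (LZMA adds sliding-window matching, range coding, and Markov-chain context modeling on top of the same idea):

```python
def lz78_encode(data: str):
    """Toy LZ78 encoder: emit (phrase_index, next_char) tokens."""
    dictionary = {"": 0}                 # known phrase -> token index
    tokens, phrase = [], ""
    for ch in data:
        if phrase + ch in dictionary:
            phrase += ch                 # keep extending the longest known phrase
        else:
            tokens.append((dictionary[phrase], ch))
            dictionary[phrase + ch] = len(dictionary)
            phrase = ""
    if phrase:                           # flush a trailing, already-known phrase
        tokens.append((dictionary[phrase[:-1]], phrase[-1]))
    return tokens

def lz78_decode(tokens):
    """Rebuild the text by replaying the dictionary construction."""
    phrases, out = [""], []
    for index, ch in tokens:
        phrases.append(phrases[index] + ch)
        out.append(phrases[-1])
    return "".join(out)

text = "ababababab"
tokens = lz78_encode(text)
assert lz78_decode(tokens) == text       # lossless by construction
print(tokens)                            # token count grows slower than the input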