2018
DOI: 10.3390/e20060401

Recognizing Information Feature Variation: Message Importance Transfer Measure and Its Applications in Big Data

Abstract: Information transfer that characterizes information feature variation can have a crucial impact on big data analytics and processing. In fact, a measure of information transfer can reflect the statistical change of a system through the variable distributions, in the same spirit as Kullback-Leibler (KL) divergence and Rényi divergence. Furthermore, to some degree, small-probability events may carry the most important part of the total message in an information transfer of big data. Therefore, it is significant…
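As a rough illustration of the quantities the abstract names, the following sketch computes the KL divergence between two distributions and a message importance measure of the form L(P, ϖ) = log Σ_i p_i e^{ϖ(1−p_i)}, the standard MIM definition in this line of work (an assumption here, since the abstract is truncated); the distributions and the parameter value are invented for the example.

import numpy as np

def kl_divergence(p, q):
    # Kullback-Leibler divergence D(P || Q) in nats.
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def mim(p, w):
    # Message importance measure, assumed form from the MIM literature:
    # L(P, w) = log( sum_i p_i * exp(w * (1 - p_i)) )
    p = np.asarray(p, dtype=float)
    return float(np.log(np.sum(p * np.exp(w * (1.0 - p)))))

# Invented distributions; with w large enough, the rare third element
# contributes the dominant term of the MIM sum.
p = [0.90, 0.09, 0.01]
q = [0.80, 0.15, 0.05]
print("KL(P||Q) =", kl_divergence(p, q))
print("MIM(P, w=30) =", mim(p, 30.0))

The exponential weight e^{ϖ(1−p_i)} grows as p_i shrinks, which is the sense in which small-probability events can carry the most important part of the message.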

Cited by 10 publications (11 citation statements). References: 28 publications.
“…are MIM and CMIM defined in Equations (1) and (4), as well as 0 < ≤ . Actually, the bitrate transmission with a message importance loss constraint has a special solution in a certain scenario. In order to give a specific example, we investigate the optimization problem for a Bernoulli(p) source with a symmetric or erasure transfer matrix as follows.…”
Section: Bitrate Transmission Constrained by Message Importance (mentioning)
confidence: 99%
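A minimal sketch of the scenario quoted above, assuming the MIM form L(P, ϖ) = log Σ_i p_i e^{ϖ(1−p_i)} and taking the "message importance loss" simply as the MIM difference between the source and the channel output; the Bernoulli parameter, crossover probability, and ϖ are illustrative values, not the paper's.

import numpy as np

def mim(p, w):
    # Assumed MIM form: L(P, w) = log( sum_i p_i * exp(w * (1 - p_i)) )
    p = np.asarray(p, dtype=float)
    return float(np.log(np.sum(p * np.exp(w * (1.0 - p)))))

# Bernoulli(p) source pushed through a binary symmetric channel with
# crossover probability eps -- one concrete symmetric transfer matrix.
p, eps, w = 0.1, 0.05, 5.0
source = np.array([1.0 - p, p])
T = np.array([[1.0 - eps, eps],
              [eps, 1.0 - eps]])   # row = input symbol, column = output symbol
output = source @ T                # distribution at the channel output

# Illustrative message importance loss: MIM change from source to output.
print("source:", source, "-> output:", output)
print("MIM loss:", mim(source, w) - mim(output, w))

The erasure case would replace T with a 2x3 matrix whose extra column carries the erasure probability; the computation is otherwise identical.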
“…In recent years, massive data have attracted much attention in various realistic scenarios. Indeed, data processing faces many challenges, such as distributed data acquisition, huge-scale data storage and transmission, and the representation of correlation or causality [1][2][3][4][5]. Facing these obstacles, a promising approach is to make good use of information theory and statistics to deal with such massive information.…”
Section: Introduction (mentioning)
confidence: 99%
“…It is the same with the recommendation value, since $\lim_{\varpi \to +\infty} \frac{u_{\min} e^{\varpi(1-u_{\min})}}{\sum_{i=1}^{N} p_i e^{\varpi(1-u_i)}} = 1$ (cf. (36)). Furthermore, if $\varpi$ is not a very large positive number, the form of $f(\varpi, x, P)$ ($1 < x < N$) is similar to Shannon entropy, as discussed in [39].…”
Section: B. Optimal Recommendation and MIM (mentioning)
confidence: 99%
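The quoted limit is easy to check numerically. The toy check below takes the importance weights u_i equal to the probabilities p_i with a unique minimum; that identification is an assumption of this sketch, not necessarily the cited paper's setting.

import numpy as np

# Check: u_min * exp(w*(1 - u_min)) / sum_i p_i * exp(w*(1 - u_i)) -> 1
# as w -> +infinity, with u_i = p_i assumed and the minimum unique.
p = np.array([0.5, 0.3, 0.15, 0.05])
u = p.copy()
u_min = float(u.min())
for w in (1.0, 10.0, 50.0, 200.0):
    ratio = u_min * np.exp(w * (1.0 - u_min)) / np.sum(p * np.exp(w * (1.0 - u)))
    print(f"w = {w:6.1f}: ratio = {ratio:.9f}")

The term with the smallest u_i dominates the denominator exponentially, so the ratio approaches 1 as ϖ grows, consistent with the quoted equation (36).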
“…In order to depict the message importance in small-probability event scenarios, the message importance measure (MIM) was proposed in Reference [30]. Furthermore, MIM has proved fairly effective in many applications of big data, such as the IoT [31] and mobile edge computing [32]. In addition, Reference [33] extended MIM to the general case and showed that MIM can be adopted as a special weight in designing recommendation systems.…”
Section: Introduction (mentioning)
confidence: 99%
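For reference, the MIM of Reference [30] is usually stated in this literature as $L(P, \varpi) = \log\big(\sum_{i=1}^{n} p_i\, e^{\varpi(1-p_i)}\big)$ with importance coefficient $\varpi \ge 0$ (notation assumed from the related MIM papers). Larger $\varpi$ shifts the exponential weight toward small-probability elements, which is why MIM can serve as a special weight in recommendation, as the quote notes.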