2021
DOI: 10.3390/e23020212

The Role of Entropy in Construct Specification Equations (CSE) to Improve the Validity of Memory Tests

Abstract: Commonly used rating scales and tests have been found lacking reliability and validity, for example in neurodegenerative disease studies, owing to not making recourse to the inherent ordinality of human responses, nor acknowledging the separability of person ability and item difficulty parameters according to the well-known Rasch model. Here, we adopt an information theory approach, particularly extending deployment of the classic Brillouin entropy expression when explaining the difficulty of recalling non-ve…

Cited by 23 publications (80 citation statements). References 36 publications.

“…Apart from this local communication, it is, of course, in many cases, also required to communicate measurement information as globally as needed, across distances and amongst different persons, according to what is meaningful for the quality assurance in each field of application [ 37 ]. The present work shares a background common with our previous paper [ 11 ], as can be briefly summarised as follows: The amount of “useful information” in a measurement system, analogous to a certain extent with the original entropy concept as a measure of “useful energy” in steam engines [ 40 ], can be described with the well-known conditional entropy expression: Expression (A1) states how the amount of information changes during transmission in a measurement system in terms of the entropy in the response ( Y ) of the system when observing a quantity ( Z ) attributed to the measurement object. At the start of the measurement process, there is an initial “deficit” in the entropy (i.e., “surplus” information) coming from prior knowledge (prior distribution, P ) of the measurand (attribute, Z , of the object entity, (A)).…”
Section: Appendix A1, Analysing Categorical Responses (mentioning)
confidence: 85%
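The conditional entropy expression referred to as (A1) did not survive extraction from the cited text. As a hedged reconstruction, the standard form of the conditional entropy of the response Y given the measured quantity Z (the cited paper may use an equivalent but differently arranged notation) is

$$ H(Y \mid Z) = -\sum_{z} p(z) \sum_{y} p(y \mid z)\,\log_{2} p(y \mid z), $$

i.e., the entropy remaining in the instrument response Y once Z, the attribute of the measured object, is known; prior knowledge of the measurand (the prior distribution P) supplies the initial entropy "deficit" (surplus information) that the statement describes.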
“…Here, we complement these analyses with ab initio theoretical and quasi-theoretical causal estimates of the memory task difficulty, thus providing evidence for construct validity and item equivalence, and enabling metrological references based on causality and best understanding. Our previous work on recalling the simplest non-verbal items—taps and numbers [ 11 , 12 ]—gave first-principles, ab initio explanations of the task difficulty in terms of entropy: less entropy means a more ordered task, which is thus easier to perform. This entropy-based explanatory theory, summarized in Appendix B , is extended here to include causal explanations of recall in the word-learning list test RAVLT IR, including SPE such as primacy and recency.…”
Section: Case Study II, Explaining Serial Position Effects (mentioning)
confidence: 99%
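As an illustrative sketch of the entropy-based explanation summarised above (an assumption about the general form, not the authors' exact expression): if the difficulty of recalling an ordered sequence of N distinct items is driven by the number of possible orderings W = N!, a Brillouin-style entropy gives

$$ S = k \ln W = k \ln N! \approx k\,(N \ln N - N), $$

so entropy, and hence predicted task difficulty, grows with sequence length; a shorter or more ordered sequence has lower entropy and is easier to recall, matching "less entropy means a more ordered task, which is thus easier to perform".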
“…Specifically, activation and interaction enthalpy parameters were associated with characteristic configuration variable values of the resulting pattern from free energy minimization of 2D CVM. The versatility of measures from information theory is also evident in the study carried out by Melin et al [ 6 ], in which entropy-based construct specification equations are used to improve the validity of memory tests and to design novel combined tests. This new methodology can indeed contribute to obtaining reliable diagnoses of neurological diseases, such as dementia, and to properly characterizing cognitive processes generated by cerebral networks by providing accurate scoring.…”
(mentioning)
confidence: 99%
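For context, a construct specification equation typically takes the linear form below (a general sketch drawn from the CSE literature, not a formula quoted in this report):

$$ \hat{\delta}_j = \sum_{k} \beta_k\, x_{j,k}, $$

where \hat{\delta}_j is the predicted difficulty of item j, the x_{j,k} are theoretically motivated item properties (here including entropy-based terms), and the \beta_k are fitted coefficients; agreement between these predictions and Rasch-estimated item difficulties is what supports the construct validity claims referred to in the citing statements.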