Entropy is a concept that emerged in the 19th century, when it was associated with the heat harnessed by a thermal machine to perform work during the Industrial Revolution. The 20th century, however, brought an unprecedented scientific revolution, and one of its most essential innovations, information theory, also encompasses a concept of entropy. A question therefore arises naturally: what is the difference, if any, between the concepts of entropy in each field of knowledge? Misconceptions abound, and there have been multiple attempts to reconcile the entropy of thermodynamics with that of information theory. Entropy is most commonly described as "disorder", but this is a poor analogy: "order" is a subjective human notion, and "disorder" cannot always be inferred from entropy. This paper therefore presents a historical background on the evolution of the term "entropy" and provides mathematical evidence and logical arguments regarding its interconnection across scientific areas, with the objective of offering a theoretical review and reference material for a broad audience.
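For concreteness, the two definitions being compared can be stated side by side; these are the standard textbook formulas, not expressions reproduced from the paper itself:

    H(X) = -\sum_i p_i \log_2 p_i        (Shannon entropy, information theory)
    S    = -k_B \sum_i p_i \ln p_i       (Gibbs entropy, statistical mechanics)

For equiprobable microstates, p_i = 1/W, the second expression reduces to Boltzmann's S = k_B \ln W, so the two quantities differ only by the constant k_B and the base of the logarithm.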
Audio recordings have proliferated enormously in recent years, and some of them are of interest to audio forensics, which has been applied in many different ways to recognising firearm calibres. In this work, we use Partial Directed Coherence (PDC) on recordings of cartridge casings hitting the ground to recognise the firearm's calibre. Six different calibres were recorded in a controlled environment, the recordings were cut to the same length, and each was analysed using PDC. The results show that the recordings of the different calibres are mutually orthogonal; for this experiment, therefore, a false positive is impossible under the proposed methodology.
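The abstract does not include code, so the sketch below illustrates one plausible way to compute PDC in Python, following the standard Baccalá and Sameshima (2001) definition: fit a multivariate autoregressive (VAR) model to the equal-length channels, evaluate the coefficient matrix in the frequency domain, and normalise its columns. The statsmodels-based model fit, the variable names, and the placeholder data are assumptions for illustration, not the authors' actual pipeline.

    import numpy as np
    from statsmodels.tsa.api import VAR

    def pdc(coefs, n_freqs=128):
        """Partial Directed Coherence from VAR lag matrices A_1..A_p.
        coefs: (p, k, k) array; returns (n_freqs, k, k), where pdc[f, i, j]
        is the coupling from channel j to channel i at normalised frequency f."""
        p, k, _ = coefs.shape
        out = np.empty((n_freqs, k, k))
        for fi, f in enumerate(np.linspace(0.0, 0.5, n_freqs)):
            # A(f) = I - sum_r A_r * exp(-i 2 pi f r)
            A_f = np.eye(k, dtype=complex)
            for r in range(1, p + 1):
                A_f -= coefs[r - 1] * np.exp(-2j * np.pi * f * r)
            # Column-wise normalisation (Baccala & Sameshima, 2001)
            out[fi] = np.abs(A_f) / np.sqrt((np.abs(A_f) ** 2).sum(axis=0))
        return out

    # Hypothetical usage: one equal-length recording per calibre, stacked as
    # columns of an (n_samples, n_channels) array; random data stands in here.
    signals = np.random.randn(2048, 6)
    res = VAR(signals).fit(maxlags=8, ic="aic")
    pdc_vals = pdc(res.coefs)

Under the orthogonality claim in the abstract, the off-diagonal PDC values between channels of different calibres would be expected to be near zero across frequencies.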
Entropy is a concept that dates back to the 19th century, when it was associated with the work performed by a thermal machine in the context of the Industrial Revolution. The 20th century saw an unprecedented scientific revolution, and one of the most important innovations of this period was information theory, which also has a concept of entropy. It can be argued that this is one of the most misused scientific terms, often applied incorrectly by researchers across different areas. In this paper, a historical background for the evolution of the concept of "entropy" is presented, along with mathematical proofs and logical arguments for the interconnection of the concept across different areas of science and its relation to complexity.