Information is probably one of the most difficult physical quantities to comprehend. This applies not only to the very definition of information, but also to its physical nature, that is, how it can be quantified and measured. In recent years, information theory and its role in physical systems has become an intense field of study, driven by the rapid growth of available information technology, in which the notion of the bit dominates. Information theory has also expanded from the "simple" bit to the quantum qubit, which introduces additional variables for consideration. One of the main applications of information theory is the field of "autonomy", a defining characteristic of living organisms in nature, all of which exhibit self-sustainability, motion, and self-protection. These traits, together with awareness of one's own existence, make autonomous behavior difficult and complex to simulate in artificial constructs. There are many approaches to simulating autonomous behavior, yet no conclusive solution to this problem exists. Recent experimental results have shown that interaction between machines and neural cells is possible, and that it constitutes a significant tool for the study of complex systems. The present work reviews the question of the interaction between information and life. It attempts to build a connection between information and thermodynamics in terms of energy consumption and work production, and presents some possible applications of these physical quantities.
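
The connection between information and thermodynamics mentioned above is often made concrete through Landauer's principle, which states that erasing one bit of information dissipates at least k_B T ln 2 of heat. As a minimal illustrative sketch (the function name and the choice of 300 K are assumptions for illustration, not taken from the source):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant in J/K (exact value in the 2019 SI)

def landauer_limit(temperature_k: float, bits: int = 1) -> float:
    """Minimum heat (in joules) dissipated when erasing `bits` bits
    at temperature T, per Landauer's principle: E = bits * k_B * T * ln 2."""
    return bits * K_B * temperature_k * math.log(2)

# At room temperature (300 K), erasing a single bit costs roughly 2.87e-21 J.
print(f"{landauer_limit(300):.3e} J per bit at 300 K")
```

This bound is tiny compared to the energy budget of present-day electronics, which is one reason the thermodynamic cost of information processing is usually discussed as a fundamental limit rather than a practical constraint.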