2006
DOI: 10.1002/cplx.20125

Complexity, parallel computation and statistical physics

Abstract: The intuition that a long history is required for the emergence of complexity in natural systems is formalized using the notion of depth. The depth of a system is defined in terms of the number of parallel computational steps needed to simulate it. Depth provides an objective, irreducible measure of history applicable to systems of the kind studied in statistical physics. It is argued that physical complexity cannot occur in the absence of substantial depth and that depth is a useful proxy for physical complex…
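
The depth defined here can be illustrated with a toy reduction, offered as a minimal sketch under assumed conventions (the function names and the pairwise-combination model are illustrative, not from the paper): combining n values takes n - 1 steps sequentially, but only about log2(n) rounds when independent pairs are combined simultaneously.

import math

# Illustrative toy model of "depth" as parallel time; not code from the paper.
def sequential_steps(n: int) -> int:
    # Combine n values one at a time: n - 1 steps.
    return max(n - 1, 0)

def parallel_rounds(n: int) -> int:
    # Combine independent pairs each round: about ceil(log2 n) rounds.
    return math.ceil(math.log2(n)) if n > 1 else 0

for n in (2, 8, 1024):
    print(f"n={n}: sequential={sequential_steps(n)}, parallel={parallel_rounds(n)}")

In this reading, a deep system is one whose present state cannot be reached in few parallel rounds no matter how much total work is spent, which is what makes depth an irreducible measure of history.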

Cited by 13 publications (10 citation statements, published 2008–2017).
References: 61 publications.

“…This proposal, though consistent with the concept of depth introduced by Machta [4], has its own limitations, as we generally simulate approximate models of a system, rather than exact ones.…”
mentioning
confidence: 89%
“…Systems with long histories (i.e., mechanisms of long-term memory) allow for the "emergence", or accumulation, of physical properties in a growing space of otherwise highly unpredictable states. This idea has been captured intuitively through a complexity measure, algorithmic depth, which seeks to equate complexity with historical depth (Bennett, 1988; Machta, 2006). Hence complex systems are systems for which a full understanding requires a specification of a historical sequence.…”
Section: The Challenges and Character of Biological Theory
mentioning
confidence: 99%
“…On the other hand, note that there are a variety of other measures for what is known as algorithmic entropy, as, e.g., the description size of a minimal algorithm (or computer, or circuit) which is able to generate an instance of the problem under scrutiny [13][14][15]. However, such measures are often impractical when it comes to the analysis of large systems.…”
Section: Introduction
mentioning
confidence: 99%
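
The excerpt above notes that exact algorithmic-entropy measures (minimal program, computer, or circuit size) are impractical for large systems. A common workaround, offered here only as a hedged illustration and not as anything from the cited works, is to use compressed length as a computable upper bound on description size; the names below are hypothetical.

import os
import zlib

# Compressed length upper-bounds description size and stays cheap to
# compute even when exact minimal-program measures are intractable.
def compressed_size(data: bytes) -> int:
    return len(zlib.compress(data, 9))

regular = bytes(range(256)) * 64   # highly regular: short description
noise = os.urandom(len(regular))   # incompressible with high probability
print(compressed_size(regular), compressed_size(noise))

The regular string compresses to far fewer bytes than the noise, mirroring the intuition that low algorithmic entropy corresponds to a short generating description.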