One criticism of algorithmic complexity is that it does not conform to our intuition of complexity in everyday language. When we say that something is complex, we rarely mean that it is random; we usually mean that it is neither trivial nor random but rather quite sophisticated. A measure closer to this intuition can be arrived at by combining the notions of algorithmic information content and time. According to the concept of logical depth introduced by Charles Bennett, the complexity of a string is best defined by the time that an unfolding process takes to reproduce the string from its shortest description: the longer it takes, the more complex the string. Hence complex objects are those that can be seen as "containing internal evidence of a nontrivial causal history." Unlike algorithmic complexity, which assigns its highest values to randomness, logical depth assigns low complexity (low depth) to both random and trivial objects. It is thus more in keeping with our intuition about complex physical objects, because trivial and random objects are easy to produce, have no long history, and unfold very quickly. Logical depth was originally identified with what is usually believed to be the right measure for evaluating the complexity of real-world objects such as living beings; hence its alternative designation, physical complexity (used by Bennett himself). This is because logical depth takes into account the plausible history of an object: it combines the shortest possible description of the object with the time that this description takes to evolve to the object's current state. The addition of time yields a reasonable characterization of the organizational complexity of a physical object, something that the application of algorithmic complexity alone cannot provide.
A persuasive case for the advantages of logical depth as a measure of organized complexity, over and against algorithmic complexity used by itself, is made by Bennett himself. Bennett's main motivation was to provide a reasonable means of measuring the physical complexity of real-world objects. He develops the notion carefully, taking into account near-shortest programs as well as the shortest one (hence the significance level) to arrive at a reasonably robust measure. To understand the intuition behind logical depth, consider the following. Imagine you are given a random binary file of a thousand bits. If you try to compress it, you will not be able to reduce it by much, in contrast to, say, a file consisting simply of one thousand 1s. Now measure the decompression time of the compressed random file: you will find that it is not very different from the decompression time of the compressed trivial file. This is because the decompression instructions are very simple in both cases: in one case because the file is trivial, and in the other because there are almost no decompression instructions for the random file, given that we were unable to compress it by much. In fact there is a small difference between the two times, perhaps because the random file does require slightly more sophisticated decompression steps than the totally trivial case.
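This experiment is easy to reproduce with a general-purpose compressor standing in (very roughly) for the shortest description, and its decompression time standing in for the unfolding time. The sketch below assumes Python's zlib as the compressor and uses a seeded pseudo-random bit file; these choices are illustrative, not part of Bennett's definition.

```python
import random
import time
import zlib

random.seed(0)  # reproducible pseudo-randomness

# A "trivial" file of one thousand 1s and a pseudo-random file of the
# same length, both stored as ASCII bits for a fair comparison.
trivial = b"1" * 1000
random_bits = bytes(random.choice(b"01") for _ in range(1000))

for name, data in [("trivial", trivial), ("random", random_bits)]:
    compressed = zlib.compress(data, 9)      # best compression level
    start = time.perf_counter()
    zlib.decompress(compressed)              # "unfold" the description
    elapsed = time.perf_counter() - start
    print(f"{name}: {len(data)} bytes -> {len(compressed)} bytes, "
          f"decompressed in {elapsed:.2e} s")
```

Running this, the trivial file compresses to a handful of bytes while the random file barely shrinks below its information content, and both decompress almost instantaneously, with only a small difference between the two times.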
But let's now compress a file that is neither trivial nor random and see what happens. Just as we would expect, the Thue-Morse sequence, which is algorithmically generated and produces non-trivial statistical patterns, can be compressed by more bits than the pseudo-random sequence but by fewer than the trivial file: it falls right in the middle. Its decompression time is also longer, because there are now more decompression instructions than for the random file, and these instructions are less trivial than for the trivial file, so they require more computing time to reproduce the original Thue-Morse sequence. This is exactly the core idea of this beautiful measure of logical depth. Formally, for a finite string, Bennett's first definition of logical depth is as follows. Let s be a string and d a significance level. The depth of s at significance level d is given by D_d(s) = min { T(p) : |p| - |p'| <= d and U(p) = s }, which reads: the minimum time T(p) taken by a computer program p running on a universal Turing machine U that reproduces s, where the length of p exceeds that of the shortest program p' for s by no more than d. Algorithmic complexity and logical depth are therefore intimately related. The latter depends on the former, because one has first to find the shortest programs producing the string and then look for the shortest running times, and looking for the shortest programs is equivalent to approximating the algorithmic complexity of the string. Just as compressed versions of a string are approximations of its algorithmic complexity, decompression times are approximations of the logical depth of that string.
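The full three-way comparison can be sketched as follows, generating the Thue-Morse sequence from the parity of the number of 1s in the binary expansion of each index. As before, zlib is only a crude stand-in for the shortest program and decompression time a crude stand-in for depth; the file length of 100,000 bits is an arbitrary illustrative choice.

```python
import random
import time
import zlib

def thue_morse(n):
    # Bit i of the Thue-Morse sequence is the parity of the number of
    # 1s in the binary expansion of i, written here as ASCII "0"/"1".
    return bytes(b"01"[bin(i).count("1") % 2] for i in range(n))

random.seed(0)
n = 100_000
files = {
    "trivial": b"1" * n,
    "Thue-Morse": thue_morse(n),
    "random": bytes(random.choice(b"01") for _ in range(n)),
}

for name, data in files.items():
    compressed = zlib.compress(data, 9)
    start = time.perf_counter()
    zlib.decompress(compressed)
    elapsed = time.perf_counter() - start
    print(f"{name:>10}: {len(data)} -> {len(compressed)} bytes, "
          f"decompressed in {elapsed:.2e} s")
```

With this setup the compressed sizes come out in the expected order: trivial smallest, Thue-Morse in the middle, random largest.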
For algorithmic complexity, the choice of universal Turing machine matters only up to an additive constant, by the invariance theorem that we covered in an earlier lecture. For logical depth, the choices involved are bounded by a multiplicative factor, so the choice of machine matters more, yet it is still somewhat under control, at least theoretically.