Let us talk about the different types of sources that seem able to generate randomness. On the screen you have some examples, in each case a square with 40 thousand bits coming from a different source. One square comes from atmospheric noise, from static generated by a radio receiver with no noise filter, with a fraction of its bits coming from a chaotic feedback loop and another fraction possibly from the cosmic noise left over from the Big Bang; indeed, the Big Bang left background noise that can leak into some channels. On another square we have bits coming from a quantum source, obtained by timing radioactive decay detected by a Geiger counter. Both the atmospheric and the quantum bits were obtained from a website and service called QUANTUM.ORG. Finally, another window shows the digits of the mathematical constant pi in binary.

The set of atmospheric bits comes from a chaotic but deterministic system governed by the laws of classical mechanics, and is thus of a causal nature: it comes not from accident but from a long sequence of cause and effect, probably going all the way back to the Big Bang. The bits in this window are deterministic in the sense that there is a state previous to the one on the screen that can fully explain the future state of the system, no matter how complicated. Indeed, the only reason this window of bits looks random is that it is extremely difficult to reproduce the initial conditions that would set those atmospheric values in that precise way again, and reversing the process is equally difficult, so the causal origin, even if well defined, is not reachable.

In contrast, in the third window we have the binary digits of the mathematical constant pi. These are not only deterministic bits, we can also reproduce them any number of times from the same source without any loss of information, in this case from a formula for pi producing them digit by digit. In fact, we can reproduce the same bits from any formula for pi. Not only is there nothing unpredictable or even chaotic about the digits of pi, we also have access to their source and cause, yet the arrangement of bits appears random. The causal origin of these digits can be explained in various ways, including the ratio of a circle's circumference to its diameter.

Finally, in the second window, we have bits from a quantum source, whose digits are explained by mechanics completely different from the classical kind, described by quantum mechanics, and which may have been produced not only in an unpredictable fashion but also, allegedly, according to some theories, from a non-deterministic source.

Yet one can see that the images look very similar despite the very different nature of their sources. Statistical tests are sometimes useful but clearly fail to capture the nature of the sources. If we applied Shannon entropy to these windows without knowledge of the sources, the entropy values would not tell these cases apart; we will see what algorithmic randomness can tell us about them.
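To see why, here is a minimal Python sketch, with hypothetical bit strings standing in for the windows on screen (they are not the actual atmospheric, quantum or pi bits, and the function name is ours): single-symbol Shannon entropy only counts how evenly the 0s and 1s are distributed, so a pseudo-random string and a trivially periodic one both reach the 1-bit maximum.

```python
from collections import Counter
from math import log2
import random

def shannon_entropy(s, block=1):
    """Empirical Shannon entropy in bits per block, over non-overlapping blocks."""
    blocks = [s[i:i + block] for i in range(0, len(s) - block + 1, block)]
    n = len(blocks)
    return -sum(c / n * log2(c / n) for c in Counter(blocks).values())

# Hypothetical stand-ins for the 40-thousand-bit windows: any source that
# emits 0s and 1s in equal proportion scores near the 1-bit maximum,
# whatever its true nature.
noise_like = ''.join(random.choice('01') for _ in range(40000))
periodic = '01' * 20000
print(shannon_entropy(noise_like))  # close to 1.0
print(shannon_entropy(periodic))    # exactly 1.0, yet obviously not random
```

Taking larger blocks exposes the periodic string (its 2-bit block entropy collapses to zero), but for sequences with no such short-range repetition no fixed block size settles the question, which is the sense in which entropy fails here.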
One can emulate some types of randomness using computer programs such as cellular automata. Traditionally, one may think that feeding a computer program a trivial input will always produce a trivial output, as with Wolfram's Elementary Cellular Automaton rule 18 starting from a single black cell, which is the simplest possible initial condition. And it may be thought that only by feeding in random inputs can one get random-looking output back, such as from rule 22 starting from a random initial condition. However, this is not always the case: not every computer program, no matter how simple, produces random-looking output only from random-looking input. One small computer program, the Elementary Cellular Automaton with rule 30, was found by Stephen Wolfram to intrinsically generate random-looking behaviour from even the simplest input. Depicted here, for example, is one side of the evolution of the cellular automaton, which shows no apparent regularities; a small sketch of the automaton follows below. This random-looking behaviour is also called pseudo-randomness, because it is deterministic randomness and therefore simulated. We will see how algorithmic complexity provides a framework for distinguishing between these sources, cases that traditional approaches cannot easily tell apart. It will not be easy for algorithmic complexity either, and we will face some challenges, but we will see how it offers directions for improvement, whereas Shannon entropy is mostly a dead end.
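To make the rule 30 example concrete, here is a minimal Python sketch (our own rendering, not Wolfram's code; the function names are ours): the rule number's eight bits encode the new state of a cell for each possible (left, centre, right) neighbourhood, and we evolve from a single black cell.

```python
def eca_step(cells, rule):
    """One synchronous update: bit b of `rule` gives the new state for the
    neighbourhood whose (left, centre, right) cells spell b in binary."""
    n = len(cells)
    return [(rule >> (4 * cells[(i - 1) % n] + 2 * cells[i] + cells[(i + 1) % n])) & 1
            for i in range(n)]

def run_eca(rule, steps, width):
    """Evolve from a single black cell in the middle of a circular row."""
    row = [0] * width
    row[width // 2] = 1
    history = [row]
    for _ in range(steps):
        row = eca_step(row, rule)
        history.append(row)
    return history

# Rule 30 from the simplest possible initial condition.
rows = run_eca(30, steps=24, width=64)
for row in rows:
    print(''.join('#' if c else '.' for c in row))

# The centre column is the classic stream of rule 30 pseudo-random bits.
print(''.join(str(row[len(row) // 2]) for row in rows))
```

Swapping 30 for 18 or 22 in the call above reproduces the other two behaviours mentioned: nested regularity from the simple seed, and random-looking output only once the initial row itself is randomised.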
Let's look at some of the statistical properties of one of the objects used in our previous example, the mathematical constant pi, but this time in decimal, as it is most typically shown. The constant pi is believed to be Borel normal, a concept introduced by the mathematician Émile Borel that will be quite important to understand, because we will later use it to explain some examples in which the application of entropy can be proven to be deceiving. Borel normality means that, in the limit, every digit appears with the same frequency, so the digit 0 appears as often as 1, 2, 3 and so on up to 9, each with frequency 1/10; and not only every single digit, but every pair of digits (each with frequency 1/100), every triplet (1/1000) and so on. So if pi is Borel normal it has no statistical regularity in the limit: no subsequence is over-represented except temporarily, and such local regularities vanish as the sequence gets longer. pi is actually believed to be Borel normal in all bases, including binary, a property called absolute Borel normality. While it is not known for sure whether pi is normal, all the statistical evidence suggests that it is. In this plot, for example, we can run through a long segment of the digits of pi and watch all the slices of the pie chart, no pun intended, each representing one of the ten digits, approach the same size as more digits are produced; only at the very beginning are there small noticeable fluctuations, and this happens in every base.

However, normality does not capture the concept of randomness. For example, the Champernowne constant is produced by concatenating all the positive integers into a sequence, as if it were the expansion of a real number; the result is something like 0.123456789101112... and so on. As the number of digits grows, the slices cut the pie chart into parts that become more nearly equal, indicating the constant's statistical normality in base 10 for single digits. No proof of normality is known for this number in other bases, but it is normal in base 10 by design; here we can see the statistical evidence, and it is not too difficult to see why it is normal by design: running through all the positive integers, we asymptotically use every digit about equally often. Another constant, the Copeland-Erdős constant, is obtained by concatenating the digits of the primes and looks like 0.23571113171923... Copeland and Erdős proved that this constant is Borel normal in base 10, meaning that in the limit the frequency of each digit is 1/10, yet it is clearly not random (a quick sketch checking both constructions closes this section). So how can we characterise randomness if sound measures such as Borel normality and Shannon entropy, and even the whole body of traditional statistics, cannot properly characterise it in the way in which, intuitively, it should be characterised? Stay tuned, and later in this unit you will see.
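As promised, here is a quick Python sketch of the two constructions (a straightforward rendering of the definitions, not code from the lecture; the function names are ours). It builds initial segments of both constants and tallies digit frequencies: after a few tens of thousands of digits each frequency already sits near 1/10, even though both sequences are perfectly predictable.

```python
from collections import Counter

def champernowne_digits(n):
    """First n digits of the Champernowne constant 0.123456789101112..."""
    out, total, k = [], 0, 0
    while total < n:
        k += 1
        s = str(k)
        out.append(s)
        total += len(s)
    return ''.join(out)[:n]

def copeland_erdos_digits(n):
    """First n digits of the Copeland-Erdos constant 0.23571113171923...,
    using naive trial division, which is fine for a short sketch."""
    out, total, cand = [], 0, 1
    while total < n:
        cand += 1
        if all(cand % p for p in range(2, int(cand ** 0.5) + 1)):
            s = str(cand)
            out.append(s)
            total += len(s)
    return ''.join(out)[:n]

for name, fn in [('Champernowne', champernowne_digits),
                 ('Copeland-Erdos', copeland_erdos_digits)]:
    digits = fn(20000)
    freq = Counter(digits)
    # Relative frequencies approach 1/10 = 0.1 as more digits are taken,
    # despite both sequences being perfectly predictable by construction.
    print(name, {d: round(freq[d] / len(digits), 3) for d in sorted(freq)})
```

Both constants pass this single-digit frequency test, which underscores the question the unit takes up next.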