Algorithmic Information Theory - LeFreq/Singularity GitHub Wiki

Algorithmic Information Theory is a neat concept whereby you quantify the information in a stream of data by how small you can make a program that reproduces that stream. The longer that shortest program has to be, the higher the "algorithmic information" (also called Kolmogorov complexity) contained in the data.
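The shortest-program length is uncomputable in general, but an off-the-shelf compressor gives a rough, computable upper bound on it. A minimal sketch of that idea, using Python's `zlib` as the stand-in compressor (the proxy choice is my assumption, not part of the theory):

```python
import os
import zlib

def approx_complexity(data: bytes) -> int:
    # Length of the compressed form: a crude, computable upper bound on
    # the (uncomputable) length of the shortest program producing `data`.
    return len(zlib.compress(data, level=9))

patterned = b'ab' * 5000        # highly regular stream: compresses to almost nothing
random_ish = os.urandom(10000)  # incompressible with overwhelming probability
```

On these two inputs the bound behaves as the theory predicts: the patterned stream shrinks to a handful of bytes, while the random one stays essentially full size.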

The digit stream of pi is infinite, but the program that generates it is small and finite, which suggests there is very little real information in the number.
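To make "small and finite" concrete, here is one such program, a sketch using Gibbons' unbounded spigot algorithm (the particular algorithm is my choice; any short pi generator makes the same point):

```python
import itertools

def pi_digits():
    # Gibbons' unbounded spigot: a tiny program whose output is the
    # infinite decimal expansion of pi, one digit at a time.
    q, r, t, k, n, l = 1, 0, 1, 1, 3, 3
    while True:
        if 4 * q + r - t < n * t:
            yield n
            q, r, n = 10 * q, 10 * (r - n * t), 10 * (3 * q + r) // t - 10 * n
        else:
            q, r, t, k, n, l = (q * k, (2 * q + r) * l, t * l,
                                k + 1, (q * (7 * k + 2) + r * l) // (t * l), l + 2)

first_ten = list(itertools.islice(pi_digits(), 10))
```

A dozen lines of code stand in for infinitely many digits, so the algorithmic information in pi is bounded by the size of this program, not by the size of its output.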

The most complex data conceived so far is the Omega number, defined by Gregory Chaitin. Take all possible programs for a universal Turing machine (enumerated from the simplest upwards, using a prefix-free encoding), and for every program p that halts, add 2^-length(p) to a running total. The total is Omega: the probability that a randomly chosen program halts. (A simpler variant, writing a 1 for each machine that halts and a 0 for each that doesn't, encodes the same halting information but is compressible; Omega packs it incompressibly.)
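The summation itself is easy to sketch; what makes Omega uncomputable is deciding which programs halt. A toy illustration, with the undecidable halting test swapped for a step-bounded Collatz iteration (that substitution, the prefix-free encoding, and all names here are my assumptions for the sketch, not Chaitin's construction):

```python
from fractions import Fraction

def halts(n, max_steps=10_000):
    # Toy stand-in for the (undecidable) halting problem: run the Collatz
    # iteration from n and treat reaching 1 within the step bound as halting.
    while n != 1 and max_steps > 0:
        n = 3 * n + 1 if n % 2 else n // 2
        max_steps -= 1
    return n == 1

def omega_lower_bound(max_k):
    # "Programs" use a prefix-free encoding: k ones, a zero, then k payload
    # bits (length 2k+1), decoding to an integer n in [2**k, 2**(k+1)).
    # Summing 2**-(2k+1) over the programs seen to halt mirrors Chaitin's
    # definition; with a step bound this only ever gives a lower bound.
    total = Fraction(0)
    for k in range(max_k + 1):
        for payload in range(2 ** k):
            if halts(2 ** k + payload):
                total += Fraction(1, 2 ** (2 * k + 1))
    return total

omega_approx = omega_lower_bound(8)
```

The prefix-free encoding is what keeps the total below 1, so it can honestly be read as a probability; dropping it would let the sum diverge.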

This number is considered algorithmically random: no program substantially shorter than its first n bits can print those n bits. That makes it more unpredictable than any pseudo-random number stream, because a pseudo-random stream is, by construction, the output of a short program and so carries very little algorithmic information.
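That last point is easy to demonstrate: a seeded generator reproduces an arbitrarily long "random-looking" stream from a short program plus one seed. A minimal sketch with Python's `random` module (the helper name and sizes are illustrative):

```python
import random

def stream(seed, n):
    # The entire stream is determined by this short function plus one seed,
    # so its algorithmic information is tiny, however "random" it looks.
    rng = random.Random(seed)
    return bytes(rng.randrange(256) for _ in range(n))

# A megabyte of pseudo-random output is perfectly reproducible:
assert stream(42, 1_000_000) == stream(42, 1_000_000)
```

A few lines of code and one integer regenerate the whole megabyte, whereas the first n bits of Omega would need roughly n bits of program to reproduce.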

