James Gleick wrote Chaos, which inspired me to write my own fractals (iterating the function z → z² + c, where z and c are complex numbers and c is a constant) in a Java applet. Slow as hell, but it worked.
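That iteration can be sketched in a few lines of Python (rather than the original Java applet; the escape radius 2 is standard, the iteration cap of 50 is my own choice):

```python
# A minimal sketch of the iteration behind the Mandelbrot set: for each
# complex constant c, iterate z -> z^2 + c starting from z = 0 and count
# how long |z| stays within the escape radius 2. Points that never
# escape belong to the set.

def escape_time(c: complex, max_iter: int = 50) -> int:
    z = 0
    for n in range(max_iter):
        if abs(z) > 2:
            return n        # escaped after n iterations
        z = z * z + c
    return max_iter         # never escaped: treat as inside the set

print(escape_time(0j))      # 50 -> inside the set
print(escape_time(1 + 1j))  # escapes after only a few iterations
```

Coloring each point of the plane by its escape time is all it takes to draw the familiar pictures.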

Chaos had me hooked to the last word. No different with The Information.

Gleick guides the reader through the development of information and communication systems over the past centuries.

The book sets off with the drum-based messaging systems of African tribes. It then continues to writing, and how writing forms and changes the process of thinking.

“The written word – the persistent word – was a prerequisite for conscious thought as we understand it.”

The next step is logic and mathematics: the development of language and dictionaries, the formalization of spelling, and the story of the Oxford English Dictionary (OED). The first creators of the OED used Milton and Shakespeare as the foundational layer for this English dictionary.

Shakespeare stands out as a massive contributor to English. As the inventor, or at least the first recorder, of thousands of words, he is also the most quoted author in the OED, with no fewer than thirty thousand references.

As a sidenote (and not the last in this article): where Shakespeare is a central foundational reference for the English language, the Statenvertaling of the Bible holds a similar position for Dutch. I could fantasize about the cultural elaborations of that difference.

Gleick continues with the development of computation, from the creation of logarithmic tables to Charles Babbage, whom we could view as the prophet of the modern computer. He was thinking in terms of programming languages and memory, avant la lettre.

There is an interesting introduction to Babbage’s relationship with Ada Lovelace, Lord Byron’s daughter. Where Babbage seemed the inventor of the computing machine before its existence, Ada was the programmer of this non-existent machine, hitting programming problems that could only be exercised on real computers a hundred years later.

“How multifarious and how mutually complicated are the considerations which the working of such an engine involve. There are frequently several distinct sets of effects going on simultaneously; all in a manner independent of each other, and yet to a greater or lesser degree exercising a mutual influence.”

(As a sidestep: two recent books have been published on Lovelace and Babbage that I have not yet had time to read. The Thrilling Adventures of Lovelace and Babbage by Sydney Padua, a graphic novel I am really looking forward to, and Ada Byron Lovelace and the Thinking Machine by Laurie Wallmark.)

A real development was the telegraph, the first electric apparatus to speed communication over distance. Communications were coded, with Morse code becoming the standard at some point.
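One elegant aspect of Morse’s scheme, which Gleick highlights as an early economy of coding: common letters got the shortest codes. A small sketch (the table is only an excerpt, and the code is mine, not from the book):

```python
# A fragment of International Morse code: frequent letters like E and T
# get the shortest codes, rare ones get longer codes.
MORSE = {"E": ".", "T": "-", "A": ".-", "I": "..", "N": "-.",
         "S": "...", "O": "---", "H": "....", "R": ".-."}

def encode(word: str) -> str:
    return " ".join(MORSE[ch] for ch in word.upper())

print(encode("shannon"))  # ... .... .- -. -. --- -.
```

This frequency-aware shortening is exactly the kind of redundancy exploitation that Shannon would later formalize.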

The need for secrecy led to the development of cryptography.

Enter Claude Shannon, who introduced the science of information theory. Shannon worked on predictability and redundancy in streams of data representing encrypted texts.

Claude Shannon showed how logical operations could process information and how to build these operations in systems of relays. Shannon wanted to build such systems to prove theorems.
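Shannon’s mapping of relay circuits onto Boolean algebra can be sketched like this (Python standing in for hardware; the function names are my own):

```python
# Shannon's insight, as a sketch: relay circuits realize Boolean algebra.
# Switches in series behave like AND, switches in parallel like OR, and
# a normally-closed relay like NOT.

def series(a: bool, b: bool) -> bool:      # current flows only if both close
    return a and b

def parallel(a: bool, b: bool) -> bool:    # current flows if either closes
    return a or b

def normally_closed(a: bool) -> bool:      # inverting relay
    return not a

# Any logic function follows from these primitives, e.g. XOR:
def xor(a: bool, b: bool) -> bool:
    return parallel(series(a, normally_closed(b)),
                    series(normally_closed(a), b))

print(xor(True, False))  # True
print(xor(True, True))   # False
```

From wiring diagrams like these, any truth table can be realized, which is why the thesis is often called the foundation of digital circuit design.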

At about the same time, Kurt Gödel came around and proved that the ideal mathematical world of Russell and Whitehead’s Principia Mathematica, in which every mathematical theorem could be proved by logic, was an illusion. Gödel proved that any sufficiently powerful logical system is either inconsistent or incomplete. Hofstadter has explained this counter-intuitive conclusion extensively and illustratively in Gödel, Escher, Bach, and Gleick makes no attempt to improve on that.

Turing at the same time proved a similar notion. Hilbert’s Entscheidungsproblem asks whether every proposition can be mechanically decided to be true or false, and Turing’s answer was no. He did this through the invention of a theoretical computer, the Turing machine.
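The idea of the Turing machine can be sketched in a few lines (a toy machine of my own, not one of Turing’s constructions): a finite table of rules reading and writing symbols on a tape is enough to compute.

```python
# A minimal Turing machine simulator. This toy machine flips the bits
# of its input and halts at the first blank.

def run(tape, rules, state="start"):
    tape = dict(enumerate(tape))   # sparse tape: position -> symbol
    pos = 0
    while state != "halt":
        symbol = tape.get(pos, "_")                  # "_" means blank
        state, write, move = rules[(state, symbol)]  # transition lookup
        tape[pos] = write
        pos += 1 if move == "R" else -1
    return "".join(tape[i] for i in sorted(tape) if tape[i] != "_")

# Rules: (state, symbol read) -> (next state, symbol to write, head move)
flip = {
    ("start", "0"): ("start", "1", "R"),
    ("start", "1"): ("start", "0", "R"),
    ("start", "_"): ("halt", "_", "R"),
}

print(run("0110", flip))  # prints 1001
```

The remarkable result is that this austere model captures everything any computer can do; swapping in a different rule table gives a different program.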

Interestingly, the main protagonist of the book, Claude Shannon, was a secluded mathematician working for Bell Labs. Coincidentally or not, he and Alan Turing both worked on cryptanalysis during the war without knowing it of each other (classified: Turing in England, Shannon in the US), and they even worked at Bell Labs at the same time for a while and met up for lunch now and then. (The same Bell Labs that is the subject of Douglas Coupland’s Kitten Clone, and the company that still today provides the backbone of our information highway, the Internet.)

All this work eventually culminated in the creation of the information processing machine, nowadays known as the computer.

Shannon continued to develop his information theory, looking at the quantification of predictability and redundancy to measure information content.
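His measure can be sketched as follows (my own toy estimate from single-character frequencies, not Shannon’s worked example): predictable, redundant text carries little information per symbol, while a uniform spread over n symbols carries log₂(n) bits.

```python
import math
from collections import Counter

# Shannon entropy in bits per symbol, estimated from single-character
# frequencies: H = sum over symbols of p * log2(1/p).

def entropy(text: str) -> float:
    counts = Counter(text)
    total = len(text)
    return sum((n / total) * math.log2(total / n) for n in counts.values())

print(entropy("aaaaaaaa"))  # 0.0 -> fully predictable, zero information
print(entropy("abcdefgh"))  # 3.0 -> eight equiprobable symbols
```

Real English scores well below the maximum for its alphabet, which is exactly the redundancy Shannon set out to measure.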

He conducted some tests with his wife Betty, using a Raymond Chandler detective story, Pickup on Noon Street:

“… put his finger on a short passage at random, and asked Betty to start guessing the letter, then the next letter, then the next. The more text she saw, of course, the better her chances of guessing right.”

Shannon and Schrödinger bring physics and information theory together in the notion of entropy. Notions of information processing, thinking and artificial intelligence develop.

Information theory is found to apply to nature itself: DNA is discovered. Thinking of biology in terms of computability, algorithms and procedures gives more insight into the building blocks of life itself.

(And as an aside: if we can think of biological mechanisms in terms of algorithms, can we do the same for the societal mechanisms a human belongs to? And for intellectual development: can we also build a recipe for turning information into knowledge into intelligence? That would be logical in the context of life’s driver of negative entropy.)

Richard Dawkins develops his ideas about the Selfish Gene, which have much in common with Taleb’s antifragility thinking.

Chaitin and Kolmogorov develop a theory to measure how much information a given ‘object’ contains. Complexity is described in terms of computability. And complexity has its own computability problems, analogous to Gödel’s theorem: this is Chaitin’s version of incompleteness.
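Kolmogorov complexity itself is uncomputable, but the length of a compressed encoding is a common computable stand-in (a sketch of my own, using zlib, not anything from the book): patterned strings compress well, random ones barely at all.

```python
import random
import zlib

# Compressed length as a rough upper bound on Kolmogorov complexity:
# a string with a short description (a pattern) compresses far better
# than one without.

def approx_complexity(data: bytes) -> int:
    return len(zlib.compress(data, 9))

regular = b"01" * 500                # 1000 bytes with an obvious pattern
random.seed(0)
messy = bytes(random.randrange(256) for _ in range(1000))  # 1000 random bytes

print(approx_complexity(regular) < approx_complexity(messy))  # True
```

The uncomputability shows up in the gap: compression can prove a string is simple, but can never certify that a string is truly incompressible.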

The book then moves to quantum computing, making computations on an atomic scale.

The book closes with a view on the proliferation of information, describing the development of Wikipedia. The amount of information we have access to nowadays is becoming a challenge in itself. There is information in abundance, but the trick is finding the useful information in the overwhelming pile. Dissemination, filtering, ordering and search become essential tools. This is something we do not yet have under control.

Gleick leaves the reader with a challenge to self: learn to deal with the amount of information available. By that I mean not managing the information, but being psychologically able to handle information abundance. The FOMO and the threat of total information procrastination are real. We will need to learn to ignore. We will also need to find our own ways to store, record and share the information we find useful or interesting.

How to manage Borges’ library of Babel.

The book is an achievement in itself. It is admirable how much information (no pun intended) Gleick has been able to pack into one book.
