emergent complexity

Simplicity is the foundation upon which you want to build complexity. Building complexity on top of existing complexity is an unsustainable exercise in accumulating technical debt.

What you want is a minimal set of primitives with enough expressiveness to fashion ideas without excessive repetition.

Look at language as an analogy. Binary, the language of classical computers, consists of exactly two words. Given the enormous complexity of the world at large, you're going to need a fuckton of such words to differentiate concepts. Thirty-two bits, once the predominant word size, is barely enough for about 4 billion things. So computers couldn't even assign a unique name to every person on Earth without a new abstraction layer (grouping bits into bytes and wider words) and incurring an order-of-magnitude cost in computational speed.
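A back-of-the-envelope check of that arithmetic; the population figure below is a rough assumption for illustration, not a census number:

```python
# How many distinct things can a 32-bit word name,
# and is that enough for every person on Earth?
ADDRESS_BITS = 32
distinct_names = 2 ** ADDRESS_BITS    # 4,294,967,296 (~4.3 billion)
world_population = 8_000_000_000      # rough 2020s estimate (assumption)

print(f"32-bit namespace: {distinct_names:,}")
print(f"people on Earth:  {world_population:,}")
print("enough for everyone?", distinct_names >= world_population)  # False
```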

On the other extreme you have something like the CJK character set, which includes the tens of thousands of ideographs used in Chinese, Japanese, and Korean writing. Encoding information with these primitives is extremely compact; a single character carries on the order of sixteen or seventeen bits. But the downside to this complexity is the enormous memory required to learn and recognize the symbols, and the difficulty of constructing new meaning, since each symbol already comes heavily loaded.
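To make the comparison concrete, here's a rough sketch of how information per symbol scales with the size of the symbol set; the repertoire sizes are round, illustrative figures rather than exact Unicode counts:

```python
import math

# Information per symbol, purely as a function of how many symbols
# the set distinguishes (sizes are rough, illustrative numbers).
repertoires = {
    "binary digit": 2,
    "Latin letters": 26,
    "CJK ideographs": 90_000,
}

for name, size in repertoires.items():
    print(f"{name:>15}: ~{math.log2(size):.1f} bits per symbol")
```

Each doubling of the repertoire buys only one extra bit per symbol, which is why the per-character payoff grows slowly even as the memorization burden explodes.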

Where does the sweet spot of simplicity lie? Reduce too much and the material becomes practically unusable; reduce too little and progress stalls on a shaky foundation.

One good example of this search for near-optimal simplicity is the LEGO System brick.

The quest is still ongoing, and nowhere more actively than in computer science: what is the optimal structural density for expressing ideas to computers? Too little structure and you have to do everything yourself, which is tedious and error-prone. Too much structure and you suddenly can't form complicated concepts without working around that structure; at worst, it actively fights you.
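As a toy illustration of that trade-off (the example is made up, not taken from any particular language debate), compare doing the bookkeeping yourself with leaning on a built-in abstraction:

```python
prices = [3.50, 12.00, 7.25]

# Low structure: spell out every step yourself. Flexible, but tedious
# and easy to get subtly wrong (off-by-one errors, forgotten resets).
total = 0.0
i = 0
while i < len(prices):
    total = total + prices[i]
    i = i + 1

# High structure: lean on a built-in abstraction. Terse and safe,
# as long as "sum a flat list of numbers" is actually your problem.
total = sum(prices)
```

The second version is shorter and harder to get wrong, but only as long as the built-in abstraction happens to match the shape of your problem.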
