Understanding Complexity by Scott Page
Overall a dense but short listen. A potpourri of new ideas, and far better than Why Information Grows by Cesar Hidalgo. The lecture is well organized and engagingly narrated, with memorable examples. A great, rather intense introduction to Complexity.
One criticism: Page attempts to weave too many disparate threads together. The final chapter offers practical takeaways, but they feel trite and don't really integrate the most interesting parts from earlier in the book.
Here are some ideas from the book that I found interesting.
Wolfram’s four system types - https://www.wolframscience.com/nks/p231--four-classes-of-behavior/
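Wolfram defined these classes on elementary cellular automata. As a quick illustration (my own sketch, not from the book), here's a minimal elementary CA; rule 110 is Wolfram's canonical class 4 (complex) rule:

```python
def step(cells, rule):
    """Apply an elementary CA rule once, with wraparound boundaries."""
    n = len(cells)
    out = []
    for i in range(n):
        left, center, right = cells[(i - 1) % n], cells[i], cells[(i + 1) % n]
        index = (left << 2) | (center << 1) | right  # neighborhood as 0..7
        out.append((rule >> index) & 1)              # look up that bit of the rule number
    return out

def run(rule, width=63, steps=30):
    """Run from a single live cell and return the full history of rows."""
    cells = [0] * width
    cells[width // 2] = 1
    history = [cells]
    for _ in range(steps):
        cells = step(cells, rule)
        history.append(cells)
    return history

rows = run(110)  # print the rows as #/spaces to see the class-4 structure
```

Swapping in rule 250 (class 1) or rule 30 (class 3) shows how the same machinery settles, turns random, or stays interestingly structured.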
Complexity happens in a sweet spot - There are four conditions necessary for complexity, and all of them must be present in moderation. Too much or too little of any one leads to equilibrium, not complexity.
Measuring diversity generally speaking - https://en.wikipedia.org/wiki/Diversity_index
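For the curious, here's a minimal sketch (mine, not the book's) of two standard indices from that Wikipedia page, Shannon entropy and Simpson's index:

```python
import math
from collections import Counter

def shannon_index(observations):
    """Shannon entropy: -sum(p_i * ln p_i) over the type proportions.
    Higher means more diverse; maxed out when all types are equally common."""
    counts = Counter(observations)
    total = sum(counts.values())
    return -sum((c / total) * math.log(c / total) for c in counts.values())

def simpson_index(observations):
    """Simpson's index: probability that two random draws are the same type.
    Lower means more diverse; 1.0 means only one type exists."""
    counts = Counter(observations)
    total = sum(counts.values())
    return sum((c / total) ** 2 for c in counts.values())

uniform = list("AABBCCDD")  # four types, evenly represented
skewed = list("AAAAAAAB")   # dominated by one type
```

On these samples `shannon_index(uniform)` equals ln 4 (the maximum for four types) and exceeds the skewed sample's score, while Simpson's index ranks them the opposite way, as expected.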
Dancing landscape - rugged landscapes that change over time.
Creative vs. evolutionary processes - Evolution is relentless and infinitely persistent, like the tortoise. Creativity requires drive from a specific group of people, who are finite in energy and time, but can move quickly, like the hare.
Power law vs. normal distributions - Many things are normally distributed, but some are power law distributed. Why? Independent events lead to normal distributions; interdependent events lead to power law distributions.
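A toy simulation (my sketch, not from the book) makes the contrast concrete: summing many independent shocks gives a bell curve, while "rich get richer" interdependence (preferential attachment, à la Simon's model) gives a heavy, power-law-like tail:

```python
import random

random.seed(0)

def additive(n_events=1000):
    """Sum of many independent +1/-1 shocks -> approximately normal (CLT)."""
    return sum(random.choice((-1, 1)) for _ in range(n_events))

def preferential_attachment(n_units=5000):
    """Interdependent growth: each new unit joins an existing item with
    probability proportional to its current size. Sizes end up heavy-tailed."""
    sizes = [1]
    urn = [0]  # item index repeated once per unit it already holds
    for _ in range(n_units):
        if random.random() < 0.1:          # occasionally start a new item
            urn.append(len(sizes))
            sizes.append(1)
        else:                              # otherwise join proportionally to size
            winner = random.choice(urn)
            sizes[winner] += 1
            urn.append(winner)
    return sizes

normals = [additive() for _ in range(200)]          # symmetric, thin-tailed
sizes = sorted(preferential_attachment(), reverse=True)  # a few huge, most tiny
```

The additive samples cluster tightly around zero, while the largest item in the interdependent process dwarfs the median one.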
Explore vs. exploit trade-offs - A well-formulated treatment of this subject.
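The book doesn't give code, but a standard illustration of the trade-off is the epsilon-greedy multi-armed bandit, sketched here under assumed Gaussian rewards:

```python
import random

random.seed(1)

def epsilon_greedy(true_means, epsilon=0.1, steps=5000):
    """Pull a random arm with probability epsilon (explore); otherwise pull
    the arm with the best observed average (exploit)."""
    n = len(true_means)
    counts = [0] * n
    estimates = [0.0] * n
    for _ in range(steps):
        if random.random() < epsilon:
            arm = random.randrange(n)                        # explore
        else:
            arm = max(range(n), key=lambda a: estimates[a])  # exploit
        reward = random.gauss(true_means[arm], 1.0)
        counts[arm] += 1
        # running-average update of this arm's estimated payoff
        estimates[arm] += (reward - estimates[arm]) / counts[arm]
    return counts

counts = epsilon_greedy([0.1, 0.5, 0.9])  # arm 2 ends up pulled the most
```

With epsilon too low you can lock onto a mediocre arm; too high and you waste pulls on arms you already know are bad - the same moderation theme as the sweet-spot idea above.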
Self-organized criticality - the idea that many systems approach a critical state automatically.
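The canonical example is the Bak-Tang-Wiesenfeld sandpile; here's a minimal sketch of it (mine, not from the book). Grains are dropped one at a time, any cell holding four or more grains topples one grain to each neighbor, and the pile tunes itself to a critical state with avalanches of many sizes, with no external control:

```python
import random

def drop_and_relax(grid, row, col):
    """Add one grain at (row, col), topple until stable, return avalanche size."""
    n = len(grid)
    grid[row][col] += 1
    topples = 0
    unstable = [(row, col)] if grid[row][col] >= 4 else []
    while unstable:
        r, c = unstable.pop()
        if grid[r][c] < 4:
            continue  # may have been relaxed already via a duplicate entry
        grid[r][c] -= 4
        topples += 1
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < n and 0 <= nc < n:  # grains off the edge are lost
                grid[nr][nc] += 1
                if grid[nr][nc] >= 4:
                    unstable.append((nr, nc))
    return topples

random.seed(2)
n = 11
grid = [[0] * n for _ in range(n)]
avalanches = [drop_and_relax(grid, random.randrange(n), random.randrange(n))
              for _ in range(20000)]
```

Histogramming `avalanches` shows mostly tiny events punctuated by rare huge ones - a heavy-tailed size distribution the system reaches on its own, which also ties back to the power-law point above.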
Simulated annealing - an approach to finding global optima on rugged landscapes, by accepting occasional downhill moves early on so the search doesn't get stuck on a local peak.
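A minimal sketch of the idea (mine, written as minimization, which is equivalent to maximizing -f; the rugged test landscape below is a hypothetical Rastrigin-style function, not one from the book):

```python
import math
import random

random.seed(3)

def simulated_annealing(f, x0, step=0.5, t0=5.0, cooling=0.9995, iters=20000):
    """Minimize f: always accept improvements, accept worse moves with
    probability exp(-delta / T), and slowly cool T so the search shifts
    from exploration to exploitation."""
    x, fx = x0, f(x0)
    best_x, best_f = x, fx
    t = t0
    for _ in range(iters):
        candidate = x + random.uniform(-step, step)
        fc = f(candidate)
        if fc < fx or random.random() < math.exp((fx - fc) / t):
            x, fx = candidate, fc
            if fx < best_f:
                best_x, best_f = x, fx
        t *= cooling
    return best_x, best_f

# Rugged 1-D landscape: global minimum 0 at x = 0, local minima near each integer.
rugged = lambda x: x * x + 10 * (1 - math.cos(2 * math.pi * x))
x, fmin = simulated_annealing(rugged, x0=4.0)
```

The temperature schedule is exactly the explore/exploit dial from the earlier bullet: hot means hare-like jumps across valleys, cold means tortoise-like settling into the best basin found.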
Complexity is in the eyes of the beholder - if you're really smart, checkers is just tic-tac-toe, and no longer complex. Is that right?