A Tentative Roadmap
What topics and books live along the Road to Reality?
Soon, we begin. To set the tone, I wrote a short piece over at my weekly newsletter called “Computing the Universe” about why studying physics sets the functional-programming module in my brain tingling. Give it a read if you’re interested in why I’m starting this project.
I also wanted to offer a few examples of the sort of textbooks and topics that I’ll attempt to cover. This series is ostensibly aimed at Penrose’s Road to Reality, but I have a big appetite and I’m sure I’ll cover more ground than necessary.
There are roughly four areas that fascinate me as I think about plunging into a phase of study. These are:
The spell books, the knowledge itself. What do we know about how the universe is ticking? We live in the universe, so we can poke and prod it whenever we like. Humans have developed an outrageous stable of mathematical tools to investigate reality, and without understanding these tools, modern theories of how it all works won’t make sense.
The history of the knowledge. How have we come to learn these things? The tools, of course, aren’t handed down from on high. Humans have discovered and molded the tools, and some are very ancient indeed. I’ve found that tools invented before and after modern computing feel very, very different to learn and use, and knowing this makes it easier to learn how to use them.
Strategies for learning hard things and doing deep work. The tools of modern math and physics are primarily brain-upgrades. We have these incredible, big brains that we can inject with new software; but it doesn’t feel very nice, and it’s easy to stall out mid-download. I think strategies for learning and absorbing hard things are as important as the tools themselves.
How to share and communicate research. We have this wealth of human knowledge available, and most of it is locked up in PDFs and the brains of stressed-out professors and grad students. I know in my bones that the situation can be so much better. This course is an experiment for me in how to squeeze knowledge through my brain and eject it in a format that’s a little easier for others to absorb. If we want to get to the next level, it can’t be a lifelong slog to even understand the questions.
The Spell Books
I’ve just finished “Teach Yourself Physics: A Travel Companion” by Jakob Schwichtenberg, and he does a fantastic job of laying out the map of modern physics. Go get that book for specific recommendations on the best textbooks for self-learning.
Physics is the skeleton that gives structure to a lot of things I’m curious about. The rough path through modern physics seems to be:
Dynamics and Classical Mechanics: This is how the world works at scales far from the very big and very small. How do we tell the future, given the present?
Electromagnetism: Our first field theory. Incredibly important in its own right, and a warm-up for the bigger, badder field theories coming soon.
Quantum Mechanics: What breaks when you get very small?
Quantum Field Theory: Particles seem to cancel each other out or spring into existence. How can we make sense of this?
General Relativity: The gravitational field doesn’t seem to fit well into QFT, but it has its own, apparently quite difficult set of mathematical tools.
Quantum Gravity: Open research, with many folks trying to make the previous two bullets work with each other. (I don’t have a handle yet on why they don’t work together. I do know to gesture vaguely at black holes.)
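That first question, telling the future given the present, already fits in a few lines of code. Here’s a minimal sketch of my own (not from any of the books above) of a mass on a spring, stepped forward in time:

```python
# Given the present state (position, velocity) and a force law,
# step the system forward in small increments of time.
# Hypothetical example: a mass on a spring, Hooke's law F = -kx,
# integrated with the semi-implicit Euler method.

def simulate_spring(x0, v0, k=1.0, m=1.0, dt=0.001, steps=1000):
    """Return position and velocity after `steps` small time steps."""
    x, v = x0, v0
    for _ in range(steps):
        a = -k * x / m   # acceleration from Hooke's law
        v += a * dt      # update velocity from acceleration
        x += v * dt      # update position from the new velocity
    return x, v

# Start at rest, stretched to x = 1; the mass swings back toward center.
x, v = simulate_spring(x0=1.0, v0=0.0)
```

Everything in classical mechanics is a dressed-up version of this move: write down the forces, then march the state forward.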
It’s tempting to stick to phenomena you can understand with pen and paper alone, but that rules out almost everything, so we have to move to computers.
There are three textbooks I’ve found that have implemented large areas of modern physics in Lisp, one of the first programming languages. The books are:
Structure and Interpretation of Classical Mechanics, or “SICM”: a tour of Lagrangian and Hamiltonian mechanics.
Functional Differential Geometry: a Lisp-y tour of the math required for general relativity.
Turtle Geometry, from the same authors. Mathematical explorations in code, written back in the 80s and inspired by Seymour Papert’s Mindstorms.
These books are alien-level teaching technology, but the code is clunky and hard to use. I’ve already put a good amount of work into a setup for sharing interactive, beautiful solutions to SICM’s exercises over at https://github.com/sritchie/sicm, so expect invitations to play with ideas from this book yourself.
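To give a taste of what SICM is about: its central object is the action, a single number assigned to every path a particle might take, and nature picks the path of stationary action. SICM does this in Scheme with its own library; here’s a rough plain-Python approximation of mine, for a free particle whose Lagrangian is just the kinetic energy:

```python
# Approximate the action S = integral of (1/2) m v^2 dt along a path,
# by chopping time into small steps and estimating velocity numerically.

def action(path, mass=1.0, t0=0.0, t1=1.0, n=1000):
    """Numerically approximate the free-particle action along `path`."""
    dt = (t1 - t0) / n
    total = 0.0
    for i in range(n):
        t = t0 + i * dt
        v = (path(t + dt) - path(t)) / dt   # numerical velocity
        total += 0.5 * mass * v * v * dt
    return total

straight = lambda t: t                       # uniform motion from 0 to 1
wiggly   = lambda t: t + 0.1 * t * (1 - t)   # same endpoints, with a detour

# The straight path has the smaller action, as the variational
# principle demands: any detour between the same endpoints costs more.
```

This tiny experiment, comparing the action of different paths between the same endpoints, is the seed of the whole Lagrangian picture of mechanics.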
Textbooks as Literature
Some textbooks are famous almost as literature in their own right. Here are two examples:
Visual Complex Analysis, by Tristan Needham
The Variational Principles of Mechanics, by Cornelius Lanczos
Textbook authors really are the most outrageously generous humans in civilization. The amount of work and love it must take to elevate a technical work to the level of inspiring, soaring literature is insane. Here’s an example from Lanczos:
“We have done considerable mountain climbing. Now we are in the rarefied atmosphere of theories of excessive beauty and we are nearing a high plateau on which geometry, optics, mechanics, and wave mechanics meet on common ground. Only concentrated thinking, and a considerable amount of re-creation, will reveal the beauty of our subject in which the last word has not been spoken.” (The Variational Principles of Mechanics, p. 229)
The goal of the project is to absorb the core of each of the main areas of modern physics; that’s not going to be possible without being comfortable with the math.
I read a quote somewhere that math is like a dull whetstone on which you can sharpen your mind. It’s more fun than that, and full of dark corners where no one’s poked around. Some examples of textbooks:
Naive Set Theory, by Paul Halmos. Set Theory is the grammar of math. I spent last summer reading this book, and I’m excited to share some of the strangeness that bubbles out of the foundations of mathematics.
Information Theory: A Tutorial Introduction, by James Stone. “Information” is a technical idea that Claude Shannon invented at Bell Labs in 1948 to study how to send and recover messages over communication channels. Information has seeped into many areas of modern physics; it’s a simple, abstract idea, and this book does a great job of covering the basics.
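Shannon’s “information” really is a precise, computable quantity: the entropy of a source, measured in bits. A fair coin carries one bit per flip; a biased coin carries less, because its outcomes are more predictable. A small sketch of my own (not drawn from Stone’s book):

```python
import math

def entropy(probs):
    """H = -sum(p * log2(p)): the average information per symbol, in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

fair   = entropy([0.5, 0.5])   # a fair coin: 1.0 bit per flip
biased = entropy([0.9, 0.1])   # a biased coin: about 0.47 bits per flip
```

That one formula is the doorway into channel capacity, compression, and the information-flavored corners of physics.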
My ability to study math and physics revved up when I started learning more about the humans that have been guiding the search, and the order in which all of this was discovered. I’m not sure it’s helpful to learn the specific tools in historical order; but the pioneers of these fields are real, lovely, hassled people with families, doing the best they can, and it all just gets a little more relatable when you pause and remember that.
Some examples of books that cover the history:
A History of Vector Analysis, by Michael Crowe. I promise that this book is fascinating, and I’ll do my best to distill it down for you.
The Curve of Binding Energy, by John McPhee. McPhee is one of my favorite writers. This book is a profile of Theodore Taylor, a bomb builder from Los Alamos, and his growing alarm at how easy it would be to build a nuclear bomb outside of a lab.
It’s painful to upgrade your brain, but the periods where I’ve committed to doing it, pushed through and earned a new Mental Lens have been some of the most fulfilling of my life. (Working through The Little Schemer with pen and paper on a two-week backpacking trip to Nicaragua launched my career as a functional programmer.)
Here are a few examples of books that I’ve found to be inspiring and helpful, on the topic of how to study (and retain!) hard things:
Deep Work, by Cal Newport. This book lays out the case for why deep, focused work is a critical skill, then describes many wonderful tools for making it part of your life. I loved So Good They Can’t Ignore You, too. If you study hard things, the brain upgrades stick with you; do this over and over, and doors into new areas of work and play will seem to open magically for your new, enhanced mind.
Ultralearning, by Scott Young. I haven’t read this yet, but I’ve had a few people write me recently recommending it, and the first 20 pages or so have me excited.
Fluent Forever, by Gabriel Wyner. This is a brilliant little book about how to learn new languages. I spent 6 focused months studying Japanese and managed to absorb over 1000 Kanji and a ton of vocabulary using these tools. I’ve been porting the memorization tactics over to math and physics, and plan on writing about the differences between language learning and deep technical studying. (Surprise… memorization helps.)
Sharing and Communicating Knowledge
Finally, buried at the bottom, we come to my real motivation. I want to make it through Penrose’s book, yes. But I also want to use this series to explore much better, more interactive ways of communicating knowledge. So much modern research is shared through static PDFs. Scientists go to great lengths to hide the actual thought process that led to some new idea, and end up sharing soulless, static documents that put a huge burden on the learner to decode.
I’d like to test out different approaches and learn with you which methods are most interesting. Some examples of tools and ideas that I’ve been reading about:
My attempts over at the SICM repository to generate interactive simulations and math from code.
Roam Research, a piece of alien-level technology that I’ve been using for months now to sort and organize my writing and research.
How to Take Smart Notes, by Sönke Ahrens. This is an extraordinary book that describes how to actually store and collect the insights, thoughts and graph-like connections that pop into your head when you become curious. Roam is a perfect place to implement these ideas. My secret goal is to make research products appear almost automatically as a byproduct of exploring and feeding knowledge into a system like Roam. More on this later.
Michael Nielsen is way ahead of me on all of this. Check out the treasures over at http://cognitivemedium.com/. Start with Magic Paper if you need a boost.
You see what’s happening here. The Road to Reality is the excuse to go into the dark forest. The forest itself is fascinating, and there’s no rush.
Thanks for coming along with me. I’ll be back in a week or two with some notes on Lisp, which Michael Nielsen has described as the Maxwell’s equations of software. If we’re going to talk about computing the universe, we need to start with Lisp.