David Epstein mentioned Hillel Einhorn's epiphany about happiness - something triggered by a fortune cookie.
Posted at 10:40 AM in building insight, miniposts | Permalink | Comments (0)
About ten years ago I devoted a good deal of time to learning a bit of category theory - an area of abstract mathematics that's become important to physics. It deals with relationships rather than objects, abstractions and analogies, context, sameness, equivalence rather than equality, how you see patterns, and much more. It can influence thinking in many areas beyond mathematics, providing a rigorous framework that gives you choices in how problems are approached and takes you beyond putting things in boxes that don't always make sense.
It's very technical, but Eugenia Cheng has written an excellent freshman-level book for non-majors with the goal of getting art students to think about the world differently. I suspect it will work for many other people - particularly those who have to deal with complexity and with seeing curious patterns that are otherwise missed. The Joy of Abstraction is one of those rare books that can change how you see the world. I'd recommend getting a physical copy and doing the exercises. (I don't buy books from Amazon and recommend you support local bookstores.)
Steve Strogatz chatted with her on this episode of his podcast The Joy of Why. Listen even if the book doesn't strike you as interesting!
Posted at 10:31 AM in book recommendation, critical thinking, math, podcasts | Permalink | Comments (0)
It turns out we make it up as we go along and that's a foundational piece of why it's been so successful.
Apparently there's been some discussion about the connection between physics and some quest for an ultimate truth on a social media site I gave up. A few people reached out and asked my opinion. Starting down the path as a teenager, I thought science was after some deep and ultimate truth. Some grand and beautiful underpinning of the Universe. I was wrong. It took time, but by the time I was a grad student I knew better.
What physics does is build mathematical models that describe observations. (Why math works so well is another discussion - but it just does.) Hypothesized models that are robust enough to survive different experimental techniques and are predictive become accepted theory. It's very much an iterative process and the models get progressively better, with more predictive models coming into use as appropriate. The Standard Model is the best we have for particle physics. It has some warts, but it's far and away the most accurate theory devised. General Relativity is the benchmark for gravity. Both are under constant study and that study has led to radically different views of the Universe and how it has evolved and will evolve.
But to the question from the birdsite. What is real? Is a proton real, and what is it made of? What about the spin of a particle? Is dark matter or dark energy real? For the electron and the quarks and gluons that make up protons and neutrons, the most accurate way to think about them is as excitations in fields that spread throughout the universe. You, the computer screen you're looking at, your dog and your lunch are vast assemblages of field excitations. It's far from intuitive, and these models are rarely taught until junior year - and then usually at a high level, as the math is a bit intense. It may seem weird, but it's much more predictive than anything else. And gravity being spacetime that changes its curvature around mass, and mass that responds to the curvature of spacetime, isn't exactly Newton's falling apples.
Current accepted theory addresses serious flaws in earlier work. The earlier work was good enough given the sophistication of measurements at the time, but tools improve. It turns out most (but not all!) of applied physics and engineering doesn't have to worry about these deeper descriptions, as those fields usually apply to things roughly our size moving at low speeds. There's an interesting notion called the correspondence principle that tells you it's okay to use less precise, but easier to understand, models when you don't need extreme accuracy. We live in a world where models with a Newtonian underpinning are usually valid.
In a way it's how we perceive things. We only sense a tiny amount of the electromagnetic spectrum, our hearing misses quite a bit, our sense of time misses large and small intervals and our size is roughly between that of the Sun and a hydrogen atom. We don't perceive much of the reality around us - we have to use our imaginations to figure out how to ask the right questions and see more deeply.
So we build models. We don't work out the equations of motion, but we walk, throw balls, drive cars and so on. We have a fair amount of information for these simple actions. Physical models quickly get more difficult. How do you predict an earthquake and its severity? When and where a tornado will do damage? We build approximations, but that's all they are. Better models will come using the scientific process. Even if it's a messy undertaking, it's proven to be the best approach we have. And there's a foundational principle you have to deal with - one that Feynman famously articulated:
The first principle is that you must not fool yourself and you are the easiest person to fool.
The more complex and social sciences come with their own challenges. Models for predicting markets and economies don't have the twelve digit accuracy seen in the Standard Model .. they generally don't make it past the decimal point. They're also a stab at reality. Social models are often poorly constructed with bad data and analysis - a major "AI" issue these days. And then there's how we understand each other as people. We all build models. Hopefully they're good enough.
For a long time I believed you could change minds about the causes and threat of global warming through education. Silly me - it turns out to be a fool's errand. We all have different views of reality. We can witness the same event and see differing realities. These lenses can be very different and culturally driven. Quite a bit of work has been done trying to understand how this works. Recently a podcast chat with psychologist Jer Clifton was recommended by Corinne. I give it two thumbs up and recommend it to anyone who has to deal with other people in almost any capacity. It may even make you think twice about how you interact with others.
Posted at 03:52 PM in history of science, podcasts, science | Permalink | Comments (0)
a minipost
With all the talk about large language models and applications like ChatGPT, I recalled a post from about two and a half years ago: eliza the next generation. I'd say much the same now - some of the predictions have come about. Of course it's only been a short while and I played with prototypes, so there's that:-)
It was high pandemic and I was spending quite a bit of time talking with friends who were working on interesting things. Among other things I started to become aware of progress in large language models. While many who work in AI (a poor label, but you get the idea without me having to talk about a variety of fields and subfields as well as what's unrelated) believed them a diversion, there was progress in LLMs and some interesting applications were emerging. We chatted about where it could go and what techno-social issues might arise. (Disclaimer: I'm not a techno-solutionist.) About the time I posted, Emily Bender at the University of Washington wrote a terrific paper on issues with natural language understanding and whether we can believe what comes out of LLMs. Recently an article on her work appeared in The Intelligencer. It's a necessary read if only to understand her octopus analogy.
This morning Gregg Vesonder and I exchanged some text messages. He's a serious AI guy with decades of experience and is someone who thinks deeply about social technical issues. One of his comments:
We’ve seen this scene before, blinded by the hype. My concern this time is that explicit or inadvertent bias in the data can mask some nasty behavior that can subtly or not so subtly mask really tragic manipulations.
Aka speech acts.
I hadn't thought about speech acts much before. The definition and examples given in the link are excellent tools for thinking about these things.
I'm not against LLMs and there are some useful applications beyond writing horoscopes in white bro-speak, but the dimensionality of the space for abuse strikes me as large. As Suw Charman-Anderson notes, we're already seeing it in publishing.
'AI' is a very large and diffuse collection. Some areas are useful and potentially revolutionary. Others are potentially dangerous. We really need to think about these things rather than just trying them out and seeing what breaks. The things that break may be far too valuable.
Posted at 11:28 AM in miniposts, society and technology | Permalink | Comments (0)
[A minipost from my reply to Pip Coburn on his recent piece on student-mindedness]
The lesson came as I prepared for my thesis defense. The Stony Brook physics department has three examiners: two physicists - a theorist and an experimentalist (one is your advisor) - and someone external to the department who is outside your field. How I found mine is a long story, but the choice made a huge difference. He was a surgery professor and an accomplished violinist, which seemed reasonably remote from the semileptonic decay mode of charmed particles. The committee members get a draft of the thesis a few months before the defense. Ideally that's when major issues are uncovered. I thought the surgeon wouldn't have much to say, but he *really* wanted to understand what it was about. He was a serious and intensely curious student, with a range far beyond medicine. His particle physics was about on par with my surgical skills, so I had to describe my work to an extremely bright student who had a very different background. Spread over two months we spent at least twenty hours together. His "dumb" (the label he chose) questions gifted me with a deeper insight into what I'd done. There was so much I hadn't thought deeply enough about.
Over a good meal one night he said he used to be intimidated by experts outside his field until he discovered they were often intimidated by him. If you feel intimidated, it's a sign there's something to learn. Over those two months I learned a bit about surgery and Bach violin sonatas, but the real learning was to welcome and appreciate the imposter syndrome (I didn't know the term until much later). I also learned something about communication. What I didn't learn was how to effectively communicate the work to high school students - but that's another story.
Since then I’ve been on a number of defense committees. Only one was physics. The rest have been all over the place and have been fantastic opportunities to learn and get a bit of insight from a developing expert. Hopefully some of my dumb questions help them as much as my surgeon friend's dumb questions helped me.
Posted at 11:00 AM in building insight, education, miniposts | Permalink | Comments (0)
This morning I remembered a quote that would be perfect for a friend as well as many of you. I'll get to it later, but it made me think about something that I've spent time thinking about for at least a decade. As an introduction there's a question we like to ask others.
What's the most important human invention?
Agriculture, towns, fire, cooking, clothing, etc. tend to come up. All are important, but I like turning the question a bit. Consider a central invention we all practice - something that makes us very human.
We invent tomorrow.
Our minds create mental models - visions of the future - and give us a kind of time travel. We run through potential scenarios we might encounter and can plan for. They're usually in an episodic form and we imagine ourselves and others, often in some other place, experiencing a variety of options. This time travel can flip and reach into the past. We call it episodic memory. These memories aren't perfectly accurate. They're more like screenplays we use to recreate the past with whatever bits of memory and shards of content we can access. We can also create "what if" scenarios and use them to plan for the future. It's a fluid time machine that can operate almost simultaneously in the past and future creating past and future episodic memories. And critically, we separate past and future from each other and the present.
People are really good at copying. Compared to other primates we're supercopiers. But we also experiment, doing things like sticking our hands into fire or eating a plant we shouldn't have .. and we, or those who watched us tempt the Darwin Award, remember and let others know. We innovate and teach, moving this information into our culture - a cultural evolution of sorts. There's a lot to unpack here. It's a beautiful subject to think about and ask others about, but I want to keep this short and eventually get to the quote.
Keeping track of things quickly gets out of hand for our very limited memories. It appears the first mechanisms appeared over 20,000 years ago to deal with the information overload of keeping track of herd animals, seasons .. important things like that. It was the dawn of humanity creating an external memory system. Moving forward, the Sumerians are usually credited with creating a formal writing system. At first it recorded accounting, but eventually our stories became readable. And that takes us to a beautiful description by Carl Sagan.
What an astonishing thing a book is. It’s a flat object made from a tree with flexible parts on which are imprinted lots of funny dark squiggles. But one glance at it and you’re inside the mind of another person, maybe somebody dead for thousands of years. Across the millennia, an author is speaking clearly and silently inside your head, directly to you. Writing is perhaps the greatest of human inventions, binding together people who never knew each other, citizens of distant epochs. Books break the shackles of time.
Posted at 02:06 PM in building insight, change, critical thinking | Permalink | Comments (0)
Every now and again I try to post something on the history of science, technology, or math - at least in the areas where I've spent time with primary sources or history of science and technology classes. My first director at Bell Labs was big on the history of science and engineering, encouraging us to take a series of courses offered at Rutgers. Such courses are important, but far too rare. One of them counts among my favorite classes: 'Engineering disasters of the first part of the 20th century'.
My problem is keeping such pieces short and to the point. Nikola Tesla is someone I've started writing about at least three times before bailing, as there was too much for an hour or so of writing. Another issue is that much of the folklore is wrong. As an electrical engineer he was incredibly intuitive, but his understanding of the underlying physics was often somewhere between wrong and batsh*t crazy. It seemed like too much to deal with.
This short video on Tesla touches the high points and is consistent with a course I had, along with a lot of reading from the period. The course was great. Looking at the period from Michael Faraday through Philo Farnsworth, it touched on the physics, engineering and the business interests that drove the middle and later parts of the period. Describing the period as colorful would be an understatement.
Reality doesn't diminish my respect for Tesla - it's just that he wasn't what popular myth suggests. And a side note: I spent about three years at Brookhaven National Laboratory on Long Island. The remains of the old Tesla tower are nearby, protected by a too easily scaled fence:-)
To the video - it's a short must-watch piece.
Posted at 09:12 AM in history of technology | Permalink | Comments (0)
Ab his via sternitur ad maiora.
In about 1820 the local monarch ordered Carl Friedrich Gauss to carry out a survey of the Kingdom of Hanover. The great mathematician toiled away on the survey for about four years. While physically difficult and stressful, its nature gave Gauss something interesting to think about.
Halfway through he realized you could chop a flat map up into small triangles and get an approximation of the topography, but it could never be perfect. The Euclidean geometry we tend to think of, where parallel lines never meet and the sum of the angles of a triangle is always 180°, just doesn't work.1 He set his mind to the geometry of curved spaces. It undoubtedly seemed nutty to anyone but a mathematician at the time, but he pushed on and developed the Theorema Egregium - a masterpiece of thought. At a high level it shows you can determine the curvature of a surface with just local measurements. You don't need to consider the space the surface is embedded in. Among other things it tells you why we hold pizza the way we do, why the sum of the angles of a triangle on a sphere is greater than 180° and changes on an egg, and why the border of Wyoming isn't a rectangle. Here I link to a great explanation by the wonderful Cliff Stoll.
Gauss realized this was fundamental, but only a beginning. At the beginning of his paper describing the discovery was this phrase:
Ab his via sternitur ad maiora - 'From here the path to something more important is prepared'
Indeed it was. Thirty or so years later Bernhard Riemann generalized what Gauss had started. Its beauty was appreciated and enhanced by a few mathematicians until Albert Einstein needed a tool to describe the curvature of spacetime. And there it was - a beautiful, but fairly obscure, mathematical result from the mid 19th century revolutionized physics 70 years later. Our Universe is not Euclidean .. that's only an illusion, albeit a very good approximation most of the time, dictated by where we live and our senses. Among mathematicians Gauss is a great GOAT candidate.2
We didn't have Latin in high school and I probably wouldn't have taken it anyway. Since then I've learned a little on my own, but am very rusty. Still, there are phrases that seem more useful than their English equivalents. Encountering one, you pause and think a bit. This is one I use to encourage people working on long term goals, where much of the day to day grind toward a major accomplishment seems like small steps on the way to the goal. It's a reminder to keep pushing ahead.
__________
1 Think of two lines of longitude on the Earth. Each intersects the equator at a right angle, so those two angles alone sum to 180°. The angle they subtend at the pole is greater than 0°, so the sum of all three angles is greater than 180°.
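That triangle is easy to check numerically. A minimal sketch (the function name is mine, and I'm assuming the triangle formed by two meridians and the equator):

```python
def spherical_triangle_angle_sum(pole_angle_deg):
    """Angle sum of the triangle cut out by two meridians and the
    equator: each meridian meets the equator at 90 degrees, plus
    whatever angle the meridians make at the pole."""
    return 90.0 + 90.0 + pole_angle_deg

# Any positive angle at the pole pushes the sum past 180 degrees.
# (By Girard's theorem, that excess, in radians, times R^2 is the
# triangle's area.)
print(spherical_triangle_angle_sum(30.0))  # 210.0
```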
2 The story of a schoolmaster telling the seven year old Gauss to add the integers from 1 to 100 to occupy him for an hour appears in an account of his life written shortly after his death. Given the problem, he thought for a while and answered 5050. He had written the integers 1 through 100, folded the series and added .. so 1+100, 2+99 and so on. Each pair sums to 101 and there are 50 such pairs, so 50*101 = 5050. He had shown the sum of the integers from 1 to n is n(n + 1)/2.
I don't know if the story is apocryphal, but he was doing serious math a few years later.
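The folding trick translates directly into code - a quick sketch (the function name is mine):

```python
def gauss_sum(n):
    """Sum 1 + 2 + ... + n via Gauss's folding argument: pair the
    series with its reverse so each of the n/2 pairs adds to n + 1."""
    return n * (n + 1) // 2

# The schoolroom problem: the integers 1 through 100.
print(gauss_sum(100))  # 5050
```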
Posted at 07:50 PM in math, miniposts | Permalink | Comments (0)
Once a year I read BP's Energy Outlook. They have a record of serious greenwashing, so anything from them (or the other fossil fuel companies) requires a bit of skepticism, but this year's report took a turn from the normal path. They suggest global oil demand peaked in 2019. There have been any number of 'peak oil' predictions, but most refer to extractable supply - a bit more on that later. While the BP predictions are more scenarios than rigorous forecasts, it's significant that a large oil company believes oil demand is dropping and will continue to drop.
BP outlines three scenarios: Accelerated, Net Zero and New Momentum. I'm guessing they'd prefer the New Momentum scenario, and will probably lobby and use other tactics to move in that direction, but the bottom line is they see less oil coming from the ground.
I think their transportation modeling is too conservative. Electrification will probably take place much more rapidly than predicted. Granted there's a lot of inertia in fleet replacement given how long cars last, but the assumptions strike me as too slow. They don't seem to be taking into account the likely unbundling of transportation. Horace Dediu has been a big proponent of unbundling and points out most trips are local and short. [side note: you really need to follow Horace as well as David Levinson if you have any interest in the future of ground transportation] It makes no economic or environmental sense to use a two ton car, gas or electric, to make a three mile trip. While much of North America has saddled itself with legacy transportation infrastructure and little imagination of what is possible, change is coming to Europe and it seems likely the developing world will favor very inexpensive electric micromobility solutions.
Back to peak oil. M. King Hubbert was a geophysicist with the Shell Oil Company in the 50s when he started looking at long term oil production predictions. All tended to make use of variables that weren't well pinned down. Hubbert's model was simple: once a discovery is made, production increases exponentially as more resources and efficiencies are brought to bear. At some point a peak is reached and an exponential decline begins. The model can apply to any region, or group of regions, assuming all of the discoveries are taken into account. It also assumes the impact of new technologies occurs at a constant rate.
On the extraction side the so-called Hubbert curve has accurately modeled many oil fields and regions - notably the US through about 2005.1 When dramatic new technologies appear - like practical fracking, shale and tar sands production, and computer guided drilling - you effectively add new sources. It's an interesting resource mining curve that's been used in many areas outside of oil production. Hopefully we'll move to a point where most extractable oil is left in the ground and oil exploration becomes a thing of the past.
__________
1 It may look like a Gaussian or Bell curve, but it doesn't die off as quickly. It takes the form of the derivative of a logistic function, P(t) = 2P_peak/(1 + cosh(k(t - t_peak))), so the tails fall off exponentially rather than like the Gaussian's e^(-t²).
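A small sketch of the comparison, assuming the standard logistic-derivative form of the Hubbert curve (function names and parameters are mine, chosen purely for illustration):

```python
import math

def hubbert(t, p_peak=1.0, k=1.0, t_peak=0.0):
    """Hubbert curve: derivative of a logistic function, symmetric
    about t_peak with an exp(-k|t - t_peak|) tail."""
    return 2 * p_peak / (1 + math.cosh(k * (t - t_peak)))

def gaussian(t, p_peak=1.0, sigma=1.0, t_peak=0.0):
    """Bell curve with the same peak height, for comparison."""
    return p_peak * math.exp(-((t - t_peak) ** 2) / (2 * sigma**2))

# Both curves peak at t = 0 with value 1, but far from the peak the
# Hubbert curve is orders of magnitude fatter than the Gaussian.
for t in (2.0, 4.0, 8.0):
    print(t, hubbert(t), gaussian(t))
```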
Posted at 11:24 AM in energy, society and technology, technology | Permalink | Comments (0)
a celebration of homophones
For those who use the American mm/dd/yyyy notation, today's date suggests 3.14. With an error of about 0.05%, it's a good enough approximation for many uses. And those who use the more common dd/mm/yyyy notation can wait for the 22nd of July for 22/7. At 0.04% from the mark, that one's a slightly better approximation.
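Both error figures are easy to verify (a quick sketch):

```python
from math import pi

# Relative error of the two "calendar" approximations to pi.
errors = {name: abs(value - pi) / pi * 100
          for name, value in (("3.14", 3.14), ("22/7", 22 / 7))}
for name, err in errors.items():
    print(f"{name}: {err:.3f}% off")  # 3.14: 0.051%, 22/7: 0.040%
```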
But just how much pi do you really need? Eight digits is overkill for most engineering tasks. NASA uses 16 digits .. 15 to the right of the decimal .. to navigate around the solar system, going beyond Pluto. The most demanding use I can imagine would be constructing a high quality circle with the radius of the known universe, about 46 billion light years, accurate to the diameter of a hydrogen atom. You'd need 38 significant digits for the task.
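The 38-digit figure is a one-line estimate (a sketch using assumed round numbers for the radius and the atom):

```python
import math

LIGHT_YEAR_M = 9.461e15           # meters per light year
radius_m = 46e9 * LIGHT_YEAR_M    # observable universe, ~4.4e26 m
atom_m = 1e-10                    # rough hydrogen atom diameter

# An error of 10**-d in pi shifts the circumference C = pi * (2r)
# by 2r * 10**-d, so we need enough decimal places that
# 2r * 10**-d is smaller than an atom.
decimal_places = math.ceil(math.log10(2 * radius_m / atom_m))
sig_digits = decimal_places + 1   # plus the leading "3"
print(sig_digits)  # 38
```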
xkcd has a piece on approximations that’s painfully true. For some back of the envelope calculations in astrophysics and cosmology you can get by with crude approximations.
It’s also Einstein’s birthday. Pie, being a homophone of pi, seems a good bet for the day. I have it on good authority he loved cherry pie and ice cream, so there you go.
He’s one of a dozen or so people who have quotes improperly ascribed to them. He was a joker and much of what he did say is often taken out of context. But here's a real one I love. Shortly before his death he was asked to offer advice to young people. His reply was published in Life Magazine.
"The important thing is not to stop questioning. Curiosity has its own reason for existence. One cannot help but be in awe when he contemplates the mysteries of eternity, of life, of the marvelous structure of reality. It is enough if one tries merely to comprehend a little of this mystery each day." A. Einstein 1955
Posted at 10:12 AM in general comments, miniposts | Permalink | Comments (0)