Deluge, Drought, Fire and Earthquake
California's Four Seasons
- a plaque at Caltech
In the early 1970s two researchers began to sift through long-term weather data sets hunting for patterns other than seasonal variation. They had access to a late-60s supercomputer and a powerful processing technique called the Fast Fourier Transform.1
The data set came from an isolated weather station near the equator and the International Date Line. Every day a weather balloon was released and tracked, creating a time series. An interesting pattern emerged: about every month and a half, barometric pressure dropped and wind speeds picked up. It looked like the signature of a storm and was interesting enough that they analyzed data from weather stations throughout the region. A weather oscillation in the tropics of the Indian Ocean emerged and was named for them. Now it appears the Madden-Julian oscillation (MJO) is one of the most important weather patterns on the planet.
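Their actual analysis was far more careful, but here is a minimal sketch of the idea in modern terms, using a synthetic daily pressure series rather than the station's real record: run the series through an FFT and a cycle near 45 days stands out as a peak in the spectrum.

```python
import numpy as np

# Illustrative stand-in for the station record: ten years of daily
# surface pressure with a 45-day oscillation buried in noise.
rng = np.random.default_rng(0)
days = np.arange(3650)
pressure = 1010 + 2.0 * np.sin(2 * np.pi * days / 45) + rng.normal(0, 3, days.size)

# FFT of the de-meaned series; keep the positive frequencies.
spectrum = np.abs(np.fft.rfft(pressure - pressure.mean()))
freqs = np.fft.rfftfreq(days.size, d=1.0)  # cycles per day

# The strongest peak (skipping the zero-frequency bin) sits near 1/45.
peak = np.argmax(spectrum[1:]) + 1
print(f"dominant period: about {1 / freqs[peak]:.0f} days")
```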
Computers and computation have improved enormously in the forty years since the MJO's discovery, but only recently have they become powerful enough to let researchers ask questions deep enough to build predictive models. The mechanism is far from well understood, but the MJO appears to influence other systems. It is connected to the Indian and Australian monsoons, the Pineapple Express and, perhaps most dramatically for Californians, El Niño.
This year several huge warm, wet disturbances have amplified the developing El Niño, producing some intense hurricanes in the Pacific while possibly suppressing activity in the Atlantic. California may well be in the crosshairs of an exceptionally strong El Niño event. It is unlikely to bring much drought relief, since the snowpack probably won't increase much, but it could bring floods and mudslides. Ah, the seasons.
Outside the research community the MJO is not well known, but understanding it may well be central to understanding weather on the planet, and its behavior in a warming world could have dramatic impacts. But if it is only beginning to be understood now, why mention it? It turns out enough is known that the uncertainty for certain classes of events has been reduced to the point where predictions can be made. One organization has managed to model it well enough to make usefully accurate cold-snap predictions more than a month out - far enough that traders are using them to make bets on heating fuels.
Science is sedimentary. By the beginning of the 20th century pencil-and-paper calculations and a few simple measurements were enough to suggest that human-caused release of carbon dioxide into the atmosphere could raise the average temperature of the Earth. By the 1960s a clear signal was emerging and a few people began to make predictions. By 1980 alarm bells had gone off in the research division of Exxon and at a few university and government labs. The simple models had small enough error bars that it appeared we might be heading for some rather bad sailing. The 90s saw the error bars shrink further, along with enough insight that deeper questions arose about mechanisms, and each round of new insight brought new questions. By around 2000 climate scientists had a sound enough case that most scientists in similar fields were agreeing with them. The signs were strong enough that it was clear action was necessary, but money and politics got in the way.
We won't understand nature perfectly - but there are points along the way where our understanding is sufficient to bring invention and change. There is an art and a science to knowing when the error bars are small enough to be useful. But progress will continue, and maybe we'll get a handle on each of California's seasons.
___________
1 The CDC 6600 had a 10 MHz clock and 128 K of memory in the $10 million maxed-out configuration. Comparisons with current technology are difficult, but the S1 module in the Apple Watch has a clock (slowed for battery life) that is 52 times as fast - about 520 MHz - along with 256 K of on-chip memory cache and 8 GB of flash that it can access faster than the CDC could access its own memory.
The Fast Fourier Transform reduces the computation required for a Fourier transform from O(n²) to O(n log n), where n is the data size. For realistic data sets this was worth decades of Moore's Law. All of us are aware of the fantastic progress that has been made in silicon over the past six decades, but improvements in algorithms have also made an enormous contribution.
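To put rough numbers on that claim (my arithmetic, and the two-year doubling period for Moore's Law is an assumption), a short back-of-the-envelope calculation:

```python
import math

# Naive transform cost vs. FFT cost for a few data-set sizes, with the
# speedup expressed as years of Moore's Law (assuming a 2-year doubling).
for n in (10_000, 100_000, 1_000_000):
    direct = n ** 2                 # naive Fourier transform, O(n^2)
    fast = n * math.log2(n)         # FFT, O(n log n)
    speedup = direct / fast
    years = 2 * math.log2(speedup)  # doublings needed to match the speedup
    print(f"n={n:>9,}  speedup ~{speedup:,.0f}x  ~{years:.0f} years of Moore's Law")
```

For a million-point series the FFT is worth a factor of about fifty thousand - roughly three decades of doublings.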
__________
Recipe corner
No recipe this time other than noting that beans are a favorite. I like some of the dried varieties from Rancho Gordo and use their basic bean cooking technique. It will work for any reasonably fresh dried beans.
(video: RANCHO GORDO Cooking: Basic Beans, from Steve Sando on Vimeo)