From a paper by Francis Galton that appeared in the April 5, 1877 issue of Nature. He published a series of papers on heredity that is seen as a major step in the development of correlation and regression analysis.
Some years ago the Bulletin of the Atomic Scientists (the Doomsday Clock group) ran a science-and-history piece about hit men tasked with guarding nuclear secrets. One of the stories:
From 24 January to 4 February 1944, Heisenberg traveled to occupied Copenhagen after the German army confiscated Bohr's Institute of Theoretical Physics; he made a short return trip in April. In December 1944, Heisenberg lectured in neutral Switzerland. The United States Office of Strategic Services sent former major league baseball catcher and OSS agent Moe Berg to attend the lecture carrying a pistol, with orders to shoot Heisenberg if the lecture indicated that Germany was close to completing an atomic bomb. Heisenberg gave no such indication, so Berg decided not to shoot him - a decision Berg later described as his own "uncertainty principle."
Space is a really big place. Even the solar system is too large to explore using conventional rockets alone. But we manage to get there - the trick is gravity-assisted flight, a technique developed by Michael Minovitch in the early 60s. A spacecraft passing close to a planet can gain momentum from the planet, giving it a substantial boost; the planet, in return, loses a tiny amount. As a graduate student, Minovitch worked out about 200 planetary fly-bys.
Until then it had looked as if a brute-force form of space flight was necessary, and NASA was working on nuclear rockets and nuclear-powered ion propulsion systems. That technology didn't pan out at the time, while gravity assist has seen great use ever since.
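The momentum exchange can be sketched in the standard patched-conic approximation: in the planet's reference frame the spacecraft's speed is unchanged by the encounter, but its direction is rotated by the flyby's deflection angle; transforming back to the Sun's frame then yields a net change in heliocentric speed. The minimal sketch below uses made-up illustrative numbers, not figures from any actual mission:

```python
import math

def gravity_assist_speed(v_in, v_planet, deflection_deg):
    """Heliocentric speed after a 2D flyby. In the planet's frame the
    speed |v_rel| is conserved; the encounter only rotates the relative
    velocity vector by the deflection angle."""
    # relative (planet-frame) velocity before the encounter
    vrel = (v_in[0] - v_planet[0], v_in[1] - v_planet[1])
    d = math.radians(deflection_deg)
    # rotate the relative velocity by the deflection angle
    vrel_out = (vrel[0] * math.cos(d) - vrel[1] * math.sin(d),
                vrel[0] * math.sin(d) + vrel[1] * math.cos(d))
    # transform back to the Sun's frame
    v_out = (vrel_out[0] + v_planet[0], vrel_out[1] + v_planet[1])
    return math.hypot(v_out[0], v_out[1])

# Illustration: a spacecraft at 10 km/s trails a planet moving at
# 13 km/s along the same axis, and the flyby deflects the relative
# velocity by 90 degrees in the planet's frame.
before = math.hypot(10.0, 0.0)
after = gravity_assist_speed((10.0, 0.0), (13.0, 0.0), 90.0)
```

In this example the spacecraft leaves the encounter moving at about 13.3 km/s, a gain of more than 3 km/s at no propellant cost - which is exactly why the technique displaced the brute-force approaches.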
A 2012 Economist article on the future of manufacturing encapsulated this common conception. “Governments have always been lousy at picking winners, and they are likely to become more so, as legions of entrepreneurs and tinkerers swap designs online, turn them into products at home and market them globally from a garage,” the article stated. “As the revolution rages, governments should stick to the basics: better schools for a skilled workforce, clear rules and a level playing field for enterprises of all kinds. Leave the rest to the revolutionaries.”
That view is as wrong as it is widespread. In fact, in countries that owe their growth to innovation, the state has historically served not as a meddler in the private sector but as a key partner of it—and often a more daring one, willing to take the risks that businesses won’t. Across the entire innovation chain, from basic research to commercialization, governments have stepped up with needed investment that the private sector has been too scared to provide. This spending has proved transformative, creating entirely new markets and sectors, including the Internet, nanotechnology, biotechnology, and clean energy.
Today, however, it has become harder and harder for governments to think big. Increasingly, their role has been limited to simply facilitating the private sector and, perhaps, nudging it in the right direction. When governments step beyond that role, they immediately get accused of crowding out private investment and ineptly trying to pick winners. The notion of the state as a mere facilitator, administrator, and regulator started gaining wide currency in the 1970s, but it has taken on newfound popularity in the wake of the global financial crisis. Across the globe, policymakers have targeted public debt (never mind that it was private debt that led to the meltdown), arguing that cutting government spending will spur private investment. As a result, the very state agencies that have been responsible for the technological revolutions of the past have seen their budgets shrink. In the United States, the budget “sequestration” process has resulted in $95 billion worth of cuts to federal R & D spending from 2013 to 2021. In Europe, the EU’s “fiscal compact,” which requires states to keep their fiscal deficits below three percent of GDP, is squeezing educational and R & D spending.
State spending on innovation tends to be assessed in exactly the wrong way. Under the prevailing economic framework, market failures are identified and particular government investments are proposed. Their value is then appraised through a narrow calculation that involves heavy guesswork: Will the benefits of a particular intervention exceed the costs associated with both the offending market failure and the implementation of the fix? Such a method is far too static to evaluate something as dynamic as innovation. By failing to account for the possibility that the state can create economic landscapes that never existed before, it gives short shrift to governments’ efforts in this area. No wonder economists often characterize the public sector as nothing more than an inefficient version of the private sector.
This incomplete way of measuring public investment leads to accusations that by entering certain sectors, governments are crowding out private investment. That charge is often false, because government investment often has the effect of “crowding in,” meaning that it stimulates private investment and expands the overall pie of national output, which benefits both private and public investors. But more important, public investments should aim not only to kick-start the economy but also, as Keynes wrote, “to do those things which at present are not done at all.” No private companies were trying to put a man on the moon when NASA undertook the Apollo project.
Without the right tools for evaluating investments, governments have a hard time knowing when they are merely operating in existing spaces and when they are making things happen that would not have happened otherwise. The result: investments that are too narrow, constrained by the prevailing techno-economic paradigm. A better way of evaluating a given investment would be to consider whether it taught workers new skills and whether it led to the creation of new technologies, sectors, or markets. When it comes to government spending on pharmaceutical research, for example, it might make sense to move past the private sector’s fixation on drugs and fund more work on diagnostics, surgical treatments, and lifestyle changes.
It could be argued that Maxwell’s equations got their start 150 years ago this month, when Maxwell presented his theory uniting electricity and magnetism before the Royal Society of London, publishing a full report the next year, in 1865. It was this work that set the stage for all the great accomplishments in physics, telecommunications, and electrical engineering that were to follow.
But there was a long gap between the presentation and the utilization. The mathematical and conceptual underpinnings of Maxwell’s theory were so complicated and counterintuitive that his theory was largely neglected after it was first introduced.
It took nearly 25 years for a small group of physicists, themselves obsessed with the mysteries of electricity and magnetism, to put Maxwell’s theory on solid footing. They were the ones who gathered the experimental evidence needed to confirm that light is made up of electromagnetic waves. And they were the ones who gave his equations their present form. Without the Herculean efforts of this group of “Maxwellians,” so named by historian Bruce J. Hunt, of the University of Texas at Austin, it might have taken decades more before our modern conception of electricity and magnetism was widely adopted. And that would have delayed all the incredible science and technology that was to follow.
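For reference, the "present form" the Maxwellians arrived at - chiefly through Oliver Heaviside's vector notation, which condensed Maxwell's original long list of component equations into four - reads, in modern SI notation for fields with charge density ρ and current density J:

  ∇·E = ρ/ε₀
  ∇·B = 0
  ∇×E = −∂B/∂t
  ∇×B = μ₀J + μ₀ε₀ ∂E/∂t

The third and fourth equations together yield wave solutions traveling at 1/√(μ₀ε₀) - the speed of light - which is the result the Maxwellians confirmed experimentally.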