Why?
In the last post we looked at the Higgs candidate announcement at CERN with a bit of the historical context. Searches like this are expensive in time and money. Why do we do them?
The motivation for the scientist is a fundamental curiosity and a need to expand the amount of communal ignorance in the field - to ask new and deeper questions. Formulating and asking the right sort of questions is the trick, and it is how science progresses. Fortunately there are side benefits that can drive technology and benefit humanity along the way.
One of the beauties of studying Nature is that it is comprehensible - at least so far. The tactic is to formulate really good questions and the best tend to be clean and simple. There is usually some technique involved and that can be extremely difficult, but the basic questions are simple.
People often give me strange looks when I tell them I love physics because it is the simplest of all of the sciences. By that I mean that it tends to study the basics of matter, space and time. Particle physics is particularly attractive as it studies the smallest bits of matter. The problem is that the Earth lacks the conditions that allowed the exotic forms of matter that existed when the Universe was very young, so the trick has been to slam bits of ordinary matter together hard - really hard. The energies involved are so high that the temperatures of the very early Universe are recreated in tiny packets for very short amounts of time and, thanks to Einstein, the energy can convert into exotic short-lived particles. These particles decay into more stable particles that fly away from the collision site at very high speeds, and by detecting these and measuring their masses, charges and kinetic energies we can infer what was created in the flash.
Early particle accelerators could fit in a lab room, but higher energies required larger and larger machines. The LHC at CERN has a design energy of 7 TeV per proton. That is about 7,500 times the energy associated with the rest mass of a proton, and it is roughly the kinetic energy of a flying mosquito. There are a lot of protons in the storage ring, though. If we add up all of the energy in the protons circulating the ring - all of those mosquito level energies - we get a big number. The total kinetic energy of the protons in the ring is about the same as a battleship moving at several kph (left as an exercise for the reader - or see the sketch below).
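Since the exercise is left to the reader, here is one rough way to run the arithmetic. Only the 7 TeV beam energy and the proton rest energy come from the discussion above; the proton count per beam and the battleship mass are round-number assumptions for illustration.

```cpp
// Back-of-the-envelope numbers for the comparisons above.
// The proton count and battleship mass are assumed values, not official figures.
#include <cmath>
#include <cstdio>

int main() {
    const double eV_to_J        = 1.602e-19;   // joules per electron-volt
    const double beam_energy_eV = 7.0e12;      // 7 TeV per proton (design)
    const double proton_rest_eV = 0.938e9;     // proton rest energy, ~938 MeV

    // How many times its own rest energy does each proton carry?
    double ratio = beam_energy_eV / proton_rest_eV;          // ~7,500

    // Energy of a single proton in joules -- roughly a flying mosquito.
    double per_proton_J = beam_energy_eV * eV_to_J;          // ~1 microjoule

    // Assume ~3e14 protons stored per beam (order of magnitude).
    const double protons_per_beam = 3.0e14;
    double beam_J = per_proton_J * protons_per_beam;         // a few hundred MJ

    // Assume a ~50,000 tonne battleship; solve E = 1/2 m v^2 for v.
    const double ship_kg = 5.0e7;
    double v_kph = std::sqrt(2.0 * beam_J / ship_kg) * 3.6;

    std::printf("energy / rest energy : %.0f\n", ratio);
    std::printf("per proton           : %.2e J\n", per_proton_J);
    std::printf("per beam             : %.0f MJ\n", beam_J / 1.0e6);
    std::printf("battleship speed     : %.0f km/h\n", v_kph);
    return 0;
}
```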
The regions where the collisions take place are surrounded by onion-like layers of detectors to sort out all of the debris. The two big ones at CERN are ATLAS and CMS - for a sense of the scale of the machine, check out these images taken a few years ago during the construction phase.
But why do this? Is it worthwhile?
I'm in the camp that the measured pursuit of fundamental science is a good thing. In the process of getting there new ground in supporting fields is often broken and there are sometimes serendipitous discoveries that can change society. I'll mention a couple of examples.
Recently Om asked about the data rate of the experiments at CERN. That is a tricky number to get at, as an enormous amount of information is rejected. Each collision produces a large number of "tracks" - the paths of the product particles. The data rate at the intersection point of the CMS detector is on the order of 40 terabytes per second in its current configuration - far too high for any current computation system to handle.1 The first set of triggers is fashioned from hardware logic and demands that key pieces of important information be in place before an event is considered interesting. These make their decision in about a millionth of a second, and the number of candidates that survive this step is reduced by a factor of about a thousand.
The next stage of the trigger logic is software - mostly C++ and some assembler. It reduces the number of interesting events by another factor of about 1,000. Now the data rate is manageable at roughly a millionth of the original - on the order of 40 megabytes a second. The rate is being increased as the capabilities of each detector are improved. ATLAS, the other big detector, has a three layer trigger that reduces events by a factor of about 200,000 and produces about 320 megabytes per second. The total data coming from the current detectors on the LHC is about 1 gigabyte per second and rapidly increasing.2
Most of these collisions turn out to be uninteresting, and the job of these layered hardware and software "triggers" is to quickly figure out what is boring and get rid of it.
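To make the bookkeeping concrete, here is a toy sketch of that rate budget. This is not actual CMS or ATLAS trigger code - just the round numbers quoted above run through the two rejection stages.

```cpp
// A toy rate budget for the layered trigger described above:
// the arithmetic of throwing away almost everything, stage by stage.
#include <cstdio>

struct Stage {
    const char* name;
    double rejection;   // factor by which the event rate is reduced
};

int main() {
    double rate_bytes_per_s = 40.0e12;   // ~40 TB/s of prompt data at the detector

    const Stage stages[] = {
        { "level-1 hardware trigger", 1000.0 },   // decides in ~1 microsecond
        { "software trigger (C++)",   1000.0 },
    };

    std::printf("prompt data rate              : %.1e bytes/s\n", rate_bytes_per_s);
    for (const Stage& s : stages) {
        rate_bytes_per_s /= s.rejection;
        std::printf("after %-24s: %.1e bytes/s\n", s.name, rate_bytes_per_s);
    }
    // With two factor-of-1000 stages, only about one millionth of the original
    // stream -- tens of megabytes per second -- survives to be recorded.
    return 0;
}
```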
All of this data - this candidate information - needs to be shipped around the world to several thousand physicists and carefully processed. The LHC supports something north of a hundred investigations at any time - dozens within each major collaboration. Some people work on a few investigations simultaneously, and hundreds of Ph.D.s are one byproduct.3 High performance databases and networks are another. A friend who happens to be one of the "gods" of computer science likes to hang out with CERN and JPL types, as their needs are often beyond what is available and they drive interesting areas of computer science.4
Another bit of serendipity begun at CERN impacted the Internet deeply. Tim Berners-Lee created the World Wide Web to aid particle physics collaboration. When I was a grad student, collaborations usually involved at most perhaps a half dozen institutions and maybe two dozen people. (Check out a few photos of my Ph.D. thesis experiment.) A major apparatus might support a handful of investigations. As energies increased, detectors became much more complex and a great amount of sharing was necessary. Now we have hundreds of institutions and thousands of people working at single shared detectors, with hundreds of investigations - many of them interconnected - underway. Effective mechanisms of communication were necessary, and it isn't a big surprise that particle physics began to use precursors of the Internet in the late 70s, with most institutions on the network before the Internet became the real Internet.
Physicists were too uninformed and dumb to know that hierarchical networked information schemes wouldn't work, so out of necessity they ended up inventing a practical and nearly good enough framework - the human part of networked collaboration badly needed common repositories that were easy to navigate. Another physicist, Larry Smarr, managed a group that, among other things, made Tim's WWW practical - Mosaic was the first effective browser. The Internet was already growing rapidly at the time, thanks to some policy moves by the US government in the late 1980s, and the WWW, Mosaic and subsequent development created an explosion of use. It is important to note that the plodding phone companies had their own notion of what a massive data network should be and were working through an international standards body. Up until about 1995 the plan of record at AT&T, and probably at many other firms, was very different: it would have produced something under much greater centralized control, with a cost structure prohibitive for users like you, me and your elderly relatives. A revolution would not have taken place. The architecture and nature of a worldwide data network was up for grabs early on. The timing of the growth, combined with the slow motion of the telcos, gave the Internet a shot and it, quite simply and fortunately, won.5
I believe that serendipity like this, as well as the creation of a deep technical talent pool, is sufficient to justify the cost of the sport. Unfortunately the level of commitment in the US has fallen and continues to fall, and Europe is also faltering. China, on the other hand, is rapidly becoming a major powerhouse and the next generation of accelerator may well be Chinese. It is great that there is still strong curiosity, as we are only at the beginning of the process and hopefully will go much deeper. And along the way new technologies, and a stream of curious people who create them, will continue to flow.
This gives me great optimism for the future.
_________
1 A generous estimate of all Internet traffic is about 30,000 PB a month - about 12 TB a second - so this single detector at CERN is currently seeing "prompt" data at about three times the rate the entire Internet generates.
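A quick sanity check of that comparison, using the 30,000 PB/month estimate and the ~40 TB/s prompt rate quoted above (both of which are the rough figures from the text, not measured values):

```cpp
// Rough check of the Internet-traffic comparison in footnote 1.
#include <cstdio>

int main() {
    const double internet_bytes_per_month = 30000.0 * 1.0e15;   // ~30,000 PB
    const double seconds_per_month        = 30.0 * 24.0 * 3600.0;
    const double detector_bytes_per_s     = 40.0e12;             // ~40 TB/s prompt

    double internet_bytes_per_s = internet_bytes_per_month / seconds_per_month;
    std::printf("internet : %.1f TB/s\n", internet_bytes_per_s / 1.0e12);      // ~12 TB/s
    std::printf("ratio    : %.1f x\n", detector_bytes_per_s / internet_bytes_per_s);
    return 0;
}
```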
2 I'm comfortable with using the term "data" here as it means something to me in the context of experimental particle physics. It is some carefully massaged information, but how it is massaged is carefully thought out and procedures exist to detect problems.
3 Most of these Ph.D.s will not work in particle physics past a postdoc or two, but often find work in industry. Many of them do high performance and/or high efficiency software. Friends who manage software projects tell me these guys are often much better than regular computer science graduates in these specialized areas. I find there is an enormous ability to be interested in multiple fields and connect the dots. Physics may be terribly specialized at the discipline level, but it turns out to cover many fields in practice, and experimental physicists are often powerful technical generalists.
An interesting artifact is that C++ has some very efficient compilers and libraries for those who need the best performance they can get without resorting to assembler or needing a deep understanding of their computational platforms.
4 The JPL connection and the notion of super robust zero effective fault computing is also an interesting driver and worth writing about in a future post. It turns out there is a connection with a boat on a goose pond.
5 A big disconnect between Bell Labs and AT&T proper was knowing about and betting on the Internet. Few in Research doubted the Internet, but the company was on a very different internal course through the ITU. The company had to change all of a sudden and did so with a few clueful folks and a boatload of - well - flashy ballast. It couldn't keep up and ultimately failed, only to be acquired by one of the "Baby Bells" that had been created at divestiture. Its name had branding value, so at least that piece continues.
I'm convinced that without the rapid expansion of the Internet, fueled by the emergence of the Web, the carriers and the ITU would likely have won, and you would probably be using something closer to a commercial Minitel rather than what you have today - if you even had access.
_________
Recipe
A regular reader and good friend happens to be of Belarusian ancestry. One of her aunts makes a great cake that uses egg whites. I've been making many rich gelatos and ice creams lately and they tend to use a lot of yolks ... I was looking for something to use the egg whites and this is just the ticket.
I'm not sure if it really is Belarusian or if someone from Belarus just uses it, but it is wonderful. The recipe is somewhat modified with a bit less butter and adjusted to use cornflour rather than potato starch as Sukie is allergic to potatoes. It is also terrific with about a tablespoon of lemon zest mixed in.
° 160 g plain/all-purpose flour
° 1 Tbsp cornflour
° 1 tsp baking powder
° 100 g melted butter (just under a stick), slightly cooled
° Preheat oven to 350° F with a rack in the center
° Finally fold in the cooled melted butter.
° Pour the batter into a buttered or Pam'ed bundt pan and bake for 30-40 minutes. It is done when a wooden toothpick comes out clean.
° Cool slightly before turning out of the pan.