Expensing snow tires on a company trip to California raised a few flags. It seems Friedolf was trying to make it up 7,500-foot Table Mountain in California in the winter, and snow tires were the only option. He was a key player on the Bell Labs Telstar team, responsible for its solar cells and radiation hardening. The team was small by today's standards, and little was known about how to build the world's first successful active telecommunications satellite.
Table Mountain happened to be the location of a Smithsonian/JPL solar observatory. There was a question about how much power would be available to the satellite, so Friedolf decided to make some calibrated measurements near sea level and on a mountain that happened to have accurate instruments. He then used a simple model - almost back-of-the-envelope - and came up with a projection that turned out to be correct to within a few percent.
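The spirit of that calculation can be sketched as a Langley-style extrapolation: measure irradiance at two altitudes, assume simple exponential attenuation through the atmosphere, and solve for the irradiance above it. The numbers below are hypothetical - Friedolf's actual readings aren't recorded here - and the sun is assumed near zenith.

```python
import math

def irradiance_above_atmosphere(i_sea, i_mtn, alt_mtn_m=2286.0,
                                scale_height_m=8500.0):
    """Estimate exo-atmospheric irradiance from two readings (W/m^2).

    Fits I = I0 * exp(-tau * m) to the two measurements. Relative air
    mass m is approximated by the pressure ratio exp(-h/H) at each
    altitude, with the sun assumed near zenith.
    """
    m_sea = 1.0                                    # air mass at sea level
    m_mtn = math.exp(-alt_mtn_m / scale_height_m)  # ~0.76 at 7,500 ft
    # Two equations, two unknowns: ln(I) = ln(I0) - tau * m
    tau = math.log(i_mtn / i_sea) / (m_sea - m_mtn)  # optical depth
    i0 = i_sea * math.exp(tau * m_sea)               # extrapolate to m = 0
    return i0

# Hypothetical clear-day readings at sea level and on the mountain:
print(round(irradiance_above_atmosphere(1040.0, 1107.0)))  # ~1355 W/m^2
```

With plausible clear-sky numbers the two-point fit lands within a few percent of the accepted solar constant (about 1361 W/m²), which is the kind of agreement the anecdote describes.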
He had a flair for clever instrumentation, and in the late 50s he was involved with Sandia Labs testing some nuclear devices.1 The trick was to put the instruments close enough to the device that they could record what was happening, and to get the signal a safe distance away before the instruments and cables were vaporized by the blast. All of this was worked out to the nanosecond, and he built a reputation for getting very close to the explosions.
In science, observation is centrally important. Most of your time as an experimentalist or observer is spent designing the observation and making sure it is doing what you think it will do. You have to trust it to the point where anything that strikes you as odd really is odd - that is where the real pay dirt usually lies. Even if you don't see anything novel (which happens most of the time), you need to be certain you didn't miss anything. Then you run the experiment or your observation session and take your data. Finally, you sort through what you saw. The shortest phase is usually the experiment or observation itself. In Friedolf's nuclear bomb measurements the event might last a few milliseconds, but designing and building the apparatus - thinking through exactly what it should do and verifying it works - often took several person-years. The analysis phase could also run for person-years.
Sometimes an observation can run a very long time, but you need to be certain your observation is up to catching the important moments accurately. All of this takes a lot of careful planning and worrying about the edge cases. You need to fail several times to realize where the pitfalls are - people who are good at the game can tell you more than a few disaster stories from earlier in their careers.
I was thinking about this recently as I spent some time with a company that is probably the best mix of technology and art I've seen. They build most of their own tools, largely because they are at the leading edge in a changing world and partly because they consider understanding their tools a core competency. At the same time, their knowledge of their end user is qualitative rather than quantitative. Their understanding of how their customers relate to their product is based on emotions, and any modeling they could do wouldn't be as effective as their current technique. They currently see their process as good enough, but they continually test their assumptions.
I've complained about the use of the word "data" as it is often misleading. It is important to think about what measurements are to be made, how they will be made, how trustworthy they are, how they will be analyzed, and how they can be used. If you can't do this entire chain robustly, you are probably in over your head. So rather than terms like data-centric, data-driven, or data-whatever, I prefer something like carefully observational.
In the early 90s I found myself in Mesa, AZ at an AT&T data center doing strike duty. A few of us got curious about the process, which led to some discoveries about a rather messy system. Important corporate decisions were being made on what seemed to be straightforward call information - hundreds of millions of call detail records.2 Over the years some inaccurate assumptions had been made about how CDRs were generated and recorded. Sorting out what was really happening was a difficult slog - a few person-years of work by a few physicists and a mathematician - but we were able to show about a two percent error in the billing process. This is by no means an isolated case. I've seen it in numerous industries. "Data" taking and assumptions about the data require a lot of thought, and any changes to the flow need to be carefully thought through if these streams are important to your business.
Recently Pip put together a list of questions to ask companies to determine how data-driven they are. I'd be curious to see how dedicated they are to careful observation - are they observationally driven? Do they understand where they must be qualitative and where they can be quantitative? Can they build models from their observations and use these for business decisions? Can they use the information they gather, along with the models they create, to run experiments on potential scenarios? The possibilities are quite rich, but certainly not for all businesses.
Is the work sufficiently important that the organization maintains an internal competency, or does it outsource some or all of it? What is the background of the folks doing the work?3 Are they in a position to deeply understand what they are measuring, analyzing, modeling, and interpreting? I worry many organizations don't understand the process deeply, and there are a lot of people who can make pretty charts that are - well - pretty, but not much else.
I'm even more impressed by those who have thought the problem through deeply enough to realize there are parsimonious choices to be made. The company I visited comes to mind, as does Trader Joe's - a company that makes effective use of the human intelligence of its salespeople. And Friedolf... the question came up as to how many solar cells should be on Telstar. He noted the craft had to be spherical given the type of stabilization it had to employ, and that some surface area would be used for antennas and some for instrumentation. His answer was "cover everything else with solar cells - that will dictate everything else"... of course he was right, and no numerical measurements were required to come to the conclusion. I'm impressed by organizations that sometimes see the purchase of snow tires as an important contribution towards understanding something deeply enough.
__________
1 Bell Laboratories administered Sandia for the government.
2 CDRs have start and stop times on calls, toll rate information, some switch information, and not much else.
3 People from the natural sciences tend to excel - science is based on getting this right.
__________
Recipe Corner
A friend sent this recipe recently - I modified it a bit and made it with Cara Cara and blood oranges. An excellent seasonal dessert that is super easy to make. It is really good with a bit of crème fraîche.
Oranges with Rosemary and Honey
Ingredients
° 2 oranges at room temp
° 2 blood oranges at room temp
° 1/4 cup of honey
° 1 tbsp olive oil plus some for the pan
° 1 sprig fresh rosemary, leaves separated and minced
° a good finishing salt (Maldon)
Technique
° peel the oranges, removing as much pith as possible, and slice about 3/8" thick. Keep the blood oranges separate as they bleed color
° set the oven rack about 6" below the broiler and preheat. Cover a pan with aluminum foil and brush with oil. Put the blood oranges on one side to keep juice bleeding to a minimum. Drizzle the oil and honey over the slices. Sprinkle on the rosemary and salt
° broil until the oranges start to caramelize. About 4 minutes in my oven
° serve on a platter alternating orange slices. Pour the pan juices over the oranges.