In the past week at least a dozen people have asked if I will buy a WATCH. I tell them probably not - at least not this iteration. Watching reactions is interesting, so I try to remain passive and let them take the direction of any discussion that follows. The categories so far:
° evaporation - conversation quickly moves elsewhere
° the death-knell for Swiss watch makers and the meaning of luxury
° speculation that this is the inflection point where Apple is finally doomed
° new forms of computation (a single case)
I have no idea if the watch will succeed or not, but am drawn to the last conversation type.
During my life the notion of computation has taken a variety of forms across a range of devices. In high school my personal computer was a beat-up mechanical adding machine and a slide rule. Some organizations were using electronic analog computers to solve differential equations, and digital computing had made its move from universities to the military and a few industries. The public perception of a computer had shifted from a person, usually female, employed to do calculations by hand to a machine operated by men in white coats carrying boxes of punched cards. The digital computer was magical - unobtainable magic unless you were an organization.
Two millennia earlier there was the Antikythera mechanism - the earliest known mechanical analog computer. Earlier still were computers of place to mark celestial events - the alignments of celestial bodies with fixed monuments. Computation for religion, calendars and possibly a toy for the rich.
Navigation sparked the development of any number of analog mechanical computers - astrolabes, sectors and so on. The 19th century saw tide predicting machines built to solve complex differential equations, and the earliest naval fire control systems were fitted to long range guns to revolutionize warfare. Computation was driven by the needs of emerging technologies. Analog electromechanical systems were built to understand and run the power network in the 1930s, and WWII saw crash programs to build computational bomb sights and early code-cracking electronic machines.
People very rarely transitioned from one of these computation systems to something newer. These were largely purpose built and used by experts. By the 70s a change was underway - the emergence of general purpose digital computing.
My perception of computing went from my slide rule to an account on a UNIVAC something or other in the engineering building and a growing understanding of WATFIV - a dialect of FORTRAN. I would drop off my jobs when I got up at 4am to get the lowest rates. The huge change came in grad school when my department let students use its minicomputers. There was direct access through a terminal - for the first time you were touching the heart of the machine. I started to play with networks - it hadn't occurred to me until then that a terminal on a network and some storage was beyond useful.
Personal computers were for the very dedicated hobbyist. By the early 80s I had logins on several UNIX machines connected to the ARPANET (my bang path ended up on samwise) and root on a few. Computing was democratizing, and I helped start rec.pets.ferrets on USENET to connect with a larger community and learn how to support the furry family members.
At Bell Labs the day before Christmas was the best holiday of the year. Families would come in and people would mill about seeing what others were doing. Kids went wild with games they couldn't access at home. They had to be kicked off the machines to make way for others as they killed grues in Zork and fought off space invaders. We took this as a sign that our work must be cool. We missed the fact that the kids couldn't care less about our work. Those from families of means were steering their families towards Radio Shack TRS-80s and Apple ][s... the future of education you know :-)
1984 arrived. I had seen Xerox Stars and Smalltalk and later the Apple Lisa. Internally we had Blit terminals giving us a mouse-driven windowing graphical user interface on our Unix Vaxen. It was clear this was the future, so I spent a pile of money on a Mac (128k) on day one. It wasn't terribly useful and programming it was difficult, but something seriously important was in the air. The practical personal computer had nearly arrived. People were dreaming about mostly the wrong uses as they lacked experience with emerging use cases. IBM blessed PCs for business and the vast majority of action for a decade went in that direction. The Mac's user interface was dismissed. Real computer users (we had moved beyond programmers) used a text interface. A mouse-based windowing system for most people meant Windows 95, which arrived around the time home Internet access exploded. A multi-pronged fork in the road had arrived and computing took all of its paths.
Everyone is familiar with what happened next. The rapid acceptance of the Internet as a primary reason for owning a personal computer was followed by an evolution to laptops and smartphones. Each of these came with its own blue sky. Initially the transitions appeared as failures to those who had been used to earlier forms of interaction. Tasks from the last generation of devices were often more difficult on these smaller form factors. Advanced users were too involved in their own world to understand that a much larger group of new users would come in and define new paths.
In the late 1990s our Human Computer Interface department at AT&T Research spent a lot of time worrying about the intersection of handheld computers - ones with enough memory to run real operating systems - with wireless communication, geolocation and photography. It was clear these would come together into a single device. We built a prototype based on a Compaq PDA we had convinced to run Linux. It was connected to other modules, making it about the size of a brick, but it let the imagination run and gave us a view of what the world might be like in ten or fifteen years. We thought computing would go into the woodwork and into clothing - fashion as a matrix for intimate computation (the iCane, iBelt and iWatch concepts, smart shirts) and smartdust.
The iPhone isn't a phone or a Macintosh. The WATCH isn't a watch or an iPhone. I don't have much experience with the paths wearable computation might take. I think in terms of a path of types of computation driven first by specialized use and later by Moore's law and an exponentially increasing connected user base. Looking back I can see a clear path - a geodesic of computation - that runs from the Antikythera mechanism through the slide rule followed by a progression of mainframes, minicomputers, a series of Macs and the iPhone. Its smoothness is an illusion that only appears in retrospect. At each junction an increasing number of possible paths emerged. I found my thread of low resistance that met my use case and pocketbook.
Unlike the smartphone, which we had crudely prototyped in 1999, I don't have enough experience with wearable interfaces to have a sense of my use case. I have my doubts about Apple's vision, but admit those doubts are rooted in my own naivety. I don't know if they have found the right starting point, but suggest their new device represents a huge potential change that will roll out over the next five years. The wrist seems like a natural place for this amount of computation. There are so many other issues involved, but at least we have the first serious run at intimate computation.
A few final thoughts.
People who wear watches that cost more than a thousand dollars these days use them primarily as a social signal - as an element of fashion. That is an enormously large and complex subject. Smartphones flirted with the edges of fashion, but now it is front and center. The only things that seem certain are that devices will come back to our wrists for the first time since the mobile phone largely vanquished them, and that STEM education and thinking alone belong to an earlier time.
I still use my slide rule, but increasingly I'm interested in the science of the aesthetics of materials and the culture we call fashion...
__________
Recipe Corner
I haven't made anything interesting other than a bit of experimentation with cold-brewed chocolate. Chocoholics may want to give this a try.