During a Christmas holiday in the late 70s I lugged our group's Texas Instruments Silent 700 back to Montana for a bit of programming. The computers for our experiment were at Brookhaven National Laboratory and Stony Brook on Long Island - huge multimillion dollar machines that ran programs in batch mode. I would type in my Fortran code, submit the job and check to see whether it ran. The output had far too much information to display on my terminal - that would have to wait for my return.
It looked like an oversized typewriter printing semi-legibly on hard-to-find thermal paper. The link was a telephone. I'd call long distance after 11pm when the rates were low and push the phone handset into the acoustic coupler cups on the terminal. Three hundred baud - about 300 bits per second. Almost glacial, but spiffy for a remote terminal. When the phone line was noisy (often), you could change the baud rate to a leisurely 110.1
There was another machine I'd call - a Unix machine at Berkeley that could network with other machines on something called Theorynet, which connected to CSNET, which gatewayed to the ARPANET. There was this thing called email that was proving to be useful, as was being able to compile and run programs directly, and the engineering types were in full experimentation mode. Mailing lists were particularly useful and we had a few for particle physics. Shared and distributed services - imagine that. It seemed like it might eventually be useful.
I was using a form of what the folks at Bell Labs were calling cloud computing.
Computers beyond the hobbyist level were extremely expensive metered resources, but all of the excitement seemed to be at the low end. Microcomputers were popping up all over - Byte, the tech porn of the day, was printing ads from dozens of companies that blinked in and out of existence. Much of it still demanded skill on the part of the user. A reasonable machine - something with a full gallon of random access memory - started at about $5,000 in money of the day and quickly soared out of sight.2 Almost everything I was doing required much more computer.
But prices were coming down. Memory and processors were dropping in price with improved silicon production and growing in capability as the size of transistors shrank. Apple had a plug and play model and Radio Shack had an 'inexpensive' model for home use. Both were dramatically better than much more expensive machines from only a few years before. Then the IBM PC came along and business finally started buying microcomputers - the trend in price and power continued.
It is fun to pick something like a CPU, memory chip or disk drive and plot its price per unit of processing power or storage over time. Here I've done it for random access memory in dollars per megabyte of storage. The data points are single unit prices advertised in computer magazines and are chosen for the best value at the time.3 It is not adjusted for inflation - doing so would steepen the slope by a factor of about eight if you thought CPI inflation numbers applied.
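The arithmetic behind each point is simple: advertised price divided by capacity, with a CPI multiplier if you want constant dollars. Here is a minimal sketch of that calculation in Swift - the sample numbers are placeholders, not points from the actual data set.

```swift
// Sketch of the arithmetic behind each plotted point.
// The sample numbers below are placeholders, not values from the actual data.

func dollarsPerMegabyte(price: Double, megabytes: Double, cpiFactor: Double = 1.0) -> Double {
    // price is in dollars of the day; cpiFactor rescales to current dollars
    return (price * cpiFactor) / megabytes
}

// A hypothetical late-70s ad: a 16 KB board for $500, with a CPI multiplier near 4
let nominal = dollarsPerMegabyte(price: 500, megabytes: 0.016)
let adjusted = dollarsPerMegabyte(price: 500, megabytes: 0.016, cpiFactor: 4.0)
print("about \(Int(nominal)) $/MB nominal, \(Int(adjusted)) $/MB inflation adjusted")
```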
The bottom line is widely evident. Computing has become ridiculously inexpensive and is appearing everywhere. Some supporting pieces aren't improving as rapidly - things like batteries, radio links and security. It is important to dwell on these and how they might impact new devices and applications as we move beyond PCs and smartphones into wearables and the internet of things.
If you are interested in links longer than ten meters or so, radios use a lot of power. This is ok if you have a wired power supply or can easily change batteries on a regular schedule, but that just isn't practical or inexpensive in many cases. Very low power radios and computers that compute and communicate only when needed are desirable. Much of the stuff that will surround us will fall into this category. It would be useful for the communications to be secure and private in some cases - only between us and our own devices. There are cases where direct communication with the cloud doesn't make much sense.
We do carry smartphones much of the time and our proximity to our cloud of connected devices is often sufficient. The smartphone can communicate and process - some of the data might travel somewhere else, perhaps to a cloud service. We may go one better and have enough communication and computation in something like a watch. It would then talk to our phone directly or over an ad hoc mesh network we control.
Reading Tim Cook's speeches hints at a very end-user centric model. The customer of a service is the end user. Cloud based services are only used when they add to the platform or experience. There is a recognition that other services should be supported but don't make sense for Apple to monetize. Some are necessary. Apple doesn't have to be best in class - just good enough. They appear to have focused on five services that directly touch the user throughout the day.
° Siri - cloud based information and possibly a control interface
° HomeKit - home automation/IoT/home entertainment
° ApplePay - secure payments
° CarPlay - in car infotainment
° HealthKit - body/health monitoring (security and privacy are enormous issues)
Arguably four of these are very primitive at this point and all are seeing development. Some are mostly cloud based (Siri), others may be very local or, in the case of HealthKit, allow you to pick who you share information with and what you share. All involve working with other companies - ApplePay is a particularly interesting example. It isn't clear how Apple will execute or what their schedule is, but the focus is on the end user as customer, and the services tie together through the smartphone and possibly even a wearable rather than through a cluster of points in the cloud and third parties.
There are many subtle architecture points that Apple seems to be making when you go through their developer documentation and listen to some of their folks. The thumbprint security model they have is powerful and difficult to do right. They are not a fully integrated company as some think. Much of what they do is modular, but where user experience is involved they focus on integration. Part of it is the linkage of OS to hardware, but there is also the custom silicon that gives lower power consumption than their peers and allows for fingerprint security. I suspect this type of integration will become even more important to them.
Most interesting to me is how you get reliable, sensible and secure communications across billions of devices. There are some issues with programming tools - and early hints that Swift is more than just a modern general purpose language. Swift really only makes sense in the context of Apple's frameworks, and minimizing errors across devices may be a goal.
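To give a flavor of what minimizing errors at the language level can look like, here is a minimal sketch using Swift optionals - the Reading type and parseReading function are hypothetical, not anything from Apple's frameworks. The compiler simply refuses to let you use a value that might be missing until the missing case has been handled.

```swift
// Minimal sketch of Swift's compile-time error catching.
// Reading and parseReading are hypothetical - not an Apple API.

struct Reading {
    let deviceID: String
    let value: Double
}

// The optional return type (Reading?) forces every caller to deal with
// failure explicitly; ignoring it is a compile error, not a runtime surprise.
func parseReading(_ fields: [String: String]) -> Reading? {
    guard let id = fields["device"],
          let raw = fields["value"],
          let value = Double(raw) else {
        return nil   // malformed input is rejected rather than silently passed on
    }
    return Reading(deviceID: id, value: value)
}

let sample = ["device": "thermostat-1", "value": "21.5"]
if let reading = parseReading(sample) {
    print("\(reading.deviceID): \(reading.value)")
} else {
    print("could not parse reading")
}
```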
Much to consider - cloud based services are still incredibly important and will increase in power, but silicon is just too cheap, and other pieces too expensive, to ensure the cloud's ubiquity. Fundamental issues with the security model of the Internet will probably come to bite the cloud - getting around those issues will be non-trivial.4 I'm betting on explosive growth in the ground fogs five years out and Apple seems to be lining itself up for the opportunity. Apple has a record of rolling things out which are promptly declared dead and useless, but more than a few pan out down the road. This is a five year plan at least.
__________
1 No, I'm not going to get into baud vs bits per second other than noting if you have an interest in ancient history you can look it up.
2 Many of the early hobbyists were also amateur radio enthusiasts. When you tested a transmitter you would dump the power into a small load that absorbed up to a kilowatt of radio frequency power. A kilowatt is about 60% of a small stove burner, so these loads were placed in one gallon paint cans filled with motor oil. A full gallon meant you were running a kilowatt - the legal limit for some of the shortwave bands.
Some of the microprocessors of the day could address up to 64 kilobytes (2¹⁶ bytes) of random access memory. Being able to fill the address space with physical memory was running a full gallon. In the late 70s a quality fully populated S100 board was about $3,000.
3 Byte magazine for much of it, later NewEgg ads. Early points are a hodgepodge from old magazines. There was no attempt to find the lowest price or even an average price - just representative prices using the same vendor over multiple years where possible.
The graph is not a perfect line on the log plot - economics and bumps in manufacturing are mixed in.
4 The model of a trusted computing base (TCB in the literature) hasn't been relevant since we moved to server-in-the-cloud computing. Finding answers is not easy and the best researchers in the field find themselves asking very basic questions to find direction.
__________
Recipe Corner
A chocolate soup based on something I had in a restaurant. Stompingly unhealthy, but excellent :-) Many variations should be possible - raspberry would be nice.
You're Kidding, Right? Chocolate Soup
Ingredients
° 1/4 cup heavy cream
° 3/4 cup whole milk
° 1 navel orange, zested and segmented
° 6 ounces 70% dark chocolate, broken into pieces
° 1 tablespoon honey
° 1/4 cup orange juice
° 1 tablespoon orange liqueur
Technique
° In a small saucepan over medium heat, bring cream, milk and orange zest to a boil, then immediately remove pan from heat. Allow zest to steep in milk mixture for 15 minutes.
° Meanwhile, in a medium, nonreactive bowl, combine chocolate and honey.
° Place infused milk back over medium heat and bring to a boil. Pour milk through a strainer into chocolate-honey mixture, whisking together until chocolate is fully melted and texture is smooth. Gently stir in orange juice and orange liqueur until combined.
° To serve at room temperature, let sit 30 minutes. When ready to serve, stir until combined, pour into bowls and top with orange segments.
° Or you can refrigerate it for a few hours to let it thicken.