Goodbye, Mr. Bond…

When I teach the history of Silicon Valley, I open my lecture with a clip from the 1973 James Bond film Live and Let Die. My point is not to talk about Bond, but rather about his wristwatch — a Pulsar LED, one of the first of our mobile digital devices. Understanding the unique moment in history when digital watches still seemed like a pretty neat idea is key to understanding the strange and powerful economics of the semiconductor industry.

In recognition of Roger Moore's passing today, here is an excerpt that I wrote for Computer: A History of the Information Machine on the history of the digital watch:

In the 1973 film Live and Let Die, the stylish secret agent James Bond traded his signature wristwatch, a Rolex Submariner, for the latest in high-tech gadgetry, a Hamilton Pulsar digital watch. Unlike a traditional timepiece, the Pulsar did not represent time using the sweep of hour and minute hands; instead, it displayed time digitally, using a recently developed innovation in microelectronics called the light-emitting diode, or LED. In the early 1970s, the glowing red lights of an LED display represented the cutting edge of integrated circuit technology, and digital watches were such a luxury item that, at $2100 for the 18-karat gold edition, they cost even more than an equivalent Rolex. The original Pulsar had actually been developed a few years earlier for the Stanley Kubrick film 2001: A Space Odyssey, and, for a time, access to this technology was limited to the domain of science fiction and international espionage.

Within just a few years, however, the cost (and sex appeal) of a digital watch had diminished to almost nothing. By 1976 Texas Instruments was offering a digital watch for just $20 and, within a year, had reduced the price again by half. By 1979 Pulsar had lost $6 million, had been sold twice, and had reverted to producing more profitable analogue timepieces. By the end of the 1970s, the cost of the components required to construct a digital watch had fallen so low that it was almost impossible to sell the finished product for any significant profit. The formerly space-age technology had become a cheap commodity good — as well as something of a cliché.

The meteoric rise and fall of the digital watch illustrates a larger pattern in the unusual economics of microelectronics manufacturing. The so-called planar process for manufacturing integrated circuits, developed at Fairchild Semiconductor and perfected by companies like Intel and Advanced Micro Devices (AMD), required a substantial initial investment in expertise and equipment, but after that the cost of production dropped rapidly. In short, the cost of building the very first of these new integrated circuit technologies was enormous, but every unit manufactured after that became increasingly inexpensive.

The massive economies of scale inherent in semiconductor manufacture—combined with rapid improvements in the complexity and capabilities of integrated circuits, intense competition within the industry, and the widespread availability of new forms of venture capital—created the conditions in which rapid technological innovation was not only possible but essential. In order to continue to profit from their investment in chip design and fabrication, semiconductor firms had to create new and ever-increasing demand for their products. The personal computer, video game console, digital camera, and cellphone are all direct products of the revolution in miniature that occurred in the late 1960s and early 1970s. But while this revolution in miniature would ultimately also revolutionize the computer industry, it is important to recognize that it did not begin with the computer industry. The two key developments in computing associated with this revolution—the minicomputer and the microprocessor—were parallel strands unconnected with the established centers of electronic digital computing.
