On March 11, 1811, the first Luddite attack in which knitting frames were actually smashed occurred in the Nottinghamshire village of Arnold. Kevin Binfield in Writings of the Luddites: “The grievances consisted, first, of the use of wide stocking frames to produce large amounts of cheap, shoddy stocking material that was cut and sewn rather than completely fashioned and, second, of the employment of ‘colts,’ workers who had not completed the seven-year apprenticeship required by law.”
In 1589, William Lee, an English clergyman, invented the first stocking frame knitting machine, which, after many improvements by other inventors, drove the spread of automated lace-making at the end of the 18th century. Legend has it that Lee had invented his machine to get revenge on a lover who had preferred to concentrate on her knitting rather than attend to him (as depicted by Alfred Elmore in the 1846 painting The Invention of the Stocking Loom).
Lee demonstrated his machine to Queen Elizabeth I, hoping to obtain a patent, but she refused to grant one, fearing the impact on the work of English artisans: "Thou aimest high, Master Lee. Consider thou what the invention could do to my poor subjects. It would assuredly bring to them ruin by depriving them of employment, thus making them beggars" (quoted in Why Nations Fail by Daron Acemoglu and James Robinson).
Another accidental inventor was Alexander Graham Bell. His father, grandfather, and brother had all been associated with work on elocution and speech, and his mother and wife were deaf, which influenced Bell's research interests and inventions throughout his life. Bell’s research on hearing and speech led him to experiment with the transmission of sound using electricity, culminating on March 7, 1876, when he received a U.S. patent for “improvement in telegraphy,” his invention of what later would be called the telephone. Three days later, on March 10, 1876, Bell said into his device: “Mr. Watson, come here, I want you.” Thomas Watson, his assistant, sitting in an adjacent room at 5 Exeter Place, Boston, answered: “Mr. Bell, do you understand what I say?”
Later that day, Bell wrote to his father (as Edwin S. Grosvenor and Morgan Wesson recount in Alexander Graham Bell):
Articulate speech was transmitted intelligibly this afternoon. I have constructed a new apparatus operated by the human voice. It is not, of course, complete yet—but some sentences were understood this afternoon… I feel I have at last struck the solution of a great problem—and the day is coming when telegraph wires will be laid to houses just like water or gas—and friends converse with each other without leaving home.
The telephone was adopted enthusiastically in the U.S., but there were doubters elsewhere, questioning its potential to re-engineer how businesses communicated. In 1879, William Henry Preece, inventor and consulting engineer for the British Post Office, could not see the phone succeeding in Britain because he thought the new technology could not compete with cheap labor: “…there are conditions in America which necessitate the use of instruments of this kind more than here. Here we have a superabundance of messengers, errand boys, and things of that kind.”
The telephone not only ended the careers of numerous messenger boys worldwide but also contributed to the eventual demise of the telegraph operator. On January 25, 1915, Bell inaugurated the first transcontinental telephone service in the United States with a phone call from New York City to Thomas Watson in San Francisco. Bell repeated the words of his first-ever telephone call on March 10, 1876. In 1915, Watson replied, “It would take me a week to get to you this time.”
While the telephone destroyed some jobs, it created new occupations, such as the telephone operator. But this job, very popular among young girls, also eventually fell victim to yet another accidental inventor.
On March 10, 1891, Almon Brown Strowger, an American undertaker, was issued a patent for his electromechanical switch to automate telephone exchanges. Steven Lubar in InfoCulture: “…a Kansas City undertaker, Strowger had a good practical reason for inventing the automatic switchboard. Legend has it that his telephone operator was the wife of a business rival, and he was sure that she was diverting business from him to her husband. And so he devised what he called a ‘girl-less, cuss-less’ telephone exchange.”
The first automatic switchboard was installed in La Porte, Indiana, in 1892, but automated switchboards did not become widespread until the 1930s. In a preview of reactions to later inventions of the computer age, users were not enthusiastic about having work shifted onto them. But AT&T’s top-notch propaganda machine overcame that inconvenience by predicting that more operators would soon be needed than there were young girls suitable for the job.
AT&T and its users were both ambivalent about the move to automatic switching. While users were not happy about working for no pay for the phone company, they also valued the privacy afforded by the automatic switchboard. AT&T, for its part, was interested in preserving its vast investment in operator-assisted switching equipment. Richard John in Network Nation:
To rebut the presumption that Bell operating companies were wedded to obsolete technology, Bell publicists lauded the female telephone operator as a faithful servant… The telephone operator was the “most economical servant”—the only flesh-and-blood servant many telephone users could afford…. The idealization of the female telephone operator had a special allure for union organizers intent on protecting telephone operators from technological obsolescence. Electromechanical switching… testified a labor organizer in 1940… was “inanimate,” “unresponsive,” and “stupid,” and did “none of the things which machinery is supposed to do in industry”—making it a “perfect example of a wasteful, expensive, inefficient, clumsy, anti-social device.”
The transistor, invented in 1947 at AT&T’s Bell Labs to improve switching, led to the rise and spread of computerization and made the switching system essentially a computer. By 1982, almost half of all calls were switched electronically. The transistor also took computerization out of the confines of deep-pocketed corporations and put it in the hands of hobbyists.
On March 5, 1975, The Homebrew Computer Club met for the first time, with 32 “enthusiastic people” attending. Apple’s co-founder Steve Wozniak:
Without computer clubs there would probably be no Apple computers. Our club in the Silicon Valley, the Homebrew Computer Club, was among the first of its kind. It was in early 1975, and a lot of tech-type people would gather and trade integrated circuits back and forth. You could have called it Chips and Dips…
The Apple I and II were designed strictly on a hobby, for-fun basis, not to be a product for a company. They were meant to bring down to the club and put on the table during the random access period and demonstrate: Look at this, it uses very few chips. It’s got a video screen. You can type stuff on it. Personal computer keyboards and video screens were not well established then. There was a lot of showing off to other members of the club. Schematics of the Apple I were passed around freely, and I’d even go over to people’s houses and help them build their own.
The Apple I and Apple II computers were shown off every two weeks at the club meeting. “Here’s the latest little feature,” we’d say. We’d get some positive feedback going and turn people on. It’s very motivating for a creator to be able to show what’s being created as it goes on. It’s unusual for one of the most successful products of all time, like the Apple II, to be demonstrated throughout its development.
The first Apple computers were not designed as a “product for a company.” In the 1970s, business executives were not entirely clear about the potential impact of computers on work and workers. On March 1, 1971, Time magazine reviewed the state of the computer industry in “A Growth Industry Grows Up”:
Computer technology has raced ahead of the ability of many customers to make good use of it. Not long ago, the Research Institute of America found that only half of 2,500 companies questioned felt that their present machines were paying for themselves in increased efficiency… For all the change that it has already wrought, the computer has barely begun to transform the methods of business and very probably the character of civilization.
Apple and other PC makers greatly impacted workers and how work gets done, and later, how consumers live their lives (and, I would argue, computers certainly did transform “the character of civilization”). However, it was difficult for “experts” to predict exactly which workers would be affected and how. See January in the History of Information Technologies for more on failed predictions, specifically SRI’s 1976 report “Office of the Future,” authored by an expert working not far from the Homebrew Computer Club.
Recent advances in artificial intelligence have led many commentators to declare that we have entered “the second machine age,” in which automation will impact knowledge workers rather than the blue-collar workers previously displaced by computers—and knitting frames. This is not an entirely accurate description of the past, and possibly not of the future.
In 1953, John Diebold wrote in “Automation—The New Technology”:
The effect of automation on the information handling functions of business will probably be more spectacular and far-reaching [than its impact on factory workers]. Repetitive office work, when in sufficient bulk as in insurance companies, will be put on at least a partially automatic basis. Certain functions, such as filing and statistical analysis on the lower management levels, will be performed by machines. Yet very few offices will be entirely automatic. Much day-to-day work—answering correspondence and the like—will have to be done by human beings.
Two years after publishing the book that popularized the term “automation,” Diebold established one of the first consulting companies advising businesses on how to adopt this new technology. An entirely new industry and an entirely new breed of knowledge workers followed in his path. “Automation” has created many new jobs, and there is no reason why AI and robots will not give rise to new knowledge worker jobs—in consulting, servicing, help-desking, observing, counting, talking, analyzing, researching, marketing, selling, etc.
More importantly, as Diebold predicted, but in a much larger sense, computers have altered and augmented knowledge work since the 1950s. To begin with, early computers replaced the original “computers,” human calculators (mostly women) who had already been replaced to some extent by mechanical calculators. Their work was certainly “knowledge work,” and their skills and education served well those who transitioned into the new jobs of “computer operators” and “programmers.”
How about that quintessential knowledge work, that of a lawyer? Sure, we now have AI programs leading to “expensive lawyers replaced by cheaper software.” But how many expensive lawyers were “replaced” by the advent of LexisNexis in the 1970s?
Diebold predicted that the “day-to-day work” would not be automated, and what he probably had in mind was that secretaries would keep their jobs. The fact that they didn’t shows how difficult it is to predict what impact computers will have (or not have) on knowledge work. Secretaries were not replaced by computers but by managers who accepted the new social norm that it was not beneath them to “answer correspondence and the like,” as long as they did it with the new status symbol—the personal computer.
View the current projections of how many jobs will be destroyed by artificial intelligence with healthy skepticism. As in the past, many occupations will undoubtedly be affected by increased computerization and automation. However, many current occupations will thrive, and new ones will be created as the way work is done continues to change.