The first electronic digital computers were built in the early 1940s, and dozens more followed in the late 1940s and early 1950s. How was computing done before that?

The French Revolution introduced not only a new calendar and a new system of weights and measures, but also the division of the right angle into 100 grades instead of 90 degrees. New trigonometric tables had to be prepared for surveyors, who might lose their revolutionary fervor if they had to convert grades to degrees and consult the tables of the ancien régime. Gaspard de Prony, the official in charge, chanced upon the chapter in Adam Smith's Wealth of Nations about a pin factory and decided to organize a computing factory, hiring former wigmakers and servants, whose occupations had become superfluous in revolutionary society but who could do arithmetic. Decades later Charles Babbage heard of this and tried to build an automatic calculator of polynomials; he failed, having spent an enormous amount of money, but Georg Scheutz and his son Edvard in Sweden did build a simpler working version.

The nineteenth century saw many mechanical calculators and slide rules, which continued to be used well into the twentieth century; in the Soviet Union they remained in use into the 1970s. Large astronomical, ballistic, and nautical calculations demanded whole workshops full of human calculators; the New Deal's Works Progress Administration hired hundreds of unemployed men and women to compute mathematical tables. The meteorologist Lewis Fry Richardson imagined a weather forecasting service staffed by 64,000 human calculators; before Edward Lorenz's butterfly paper, nobody realized how hard the problem is. Electromechanical tabulators had been in use since the 1890 United States census; during the Manhattan Project, young Richard Feynman gave a plutonium bomb computation both to women with mechanical calculators and to tabulating machines. At first the women were ahead, but then they tired, and the tabulators did not.
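A key trick behind de Prony's factory, and behind Babbage's and the Scheutzes' machines, was the method of finite differences: once the first few values and differences of a polynomial are seeded, every further table entry needs only additions, which hired hands or gears can supply. Below is a minimal Python sketch of the idea; the function name, its arguments, and the example polynomial are my own illustration, not anything taken from the historical machines.

```python
# Tabulating a polynomial by the method of finite differences:
# after seeding the first degree+1 values, every new entry is
# produced by additions alone; no multiplication is required.

def difference_table(coeffs, start, step, count):
    """Tabulate a0 + a1*x + ... + an*x^n at start, start+step, ...

    coeffs -- [a0, a1, ..., an]
    Returns a list of (x, value) pairs of length `count`.
    """
    degree = len(coeffs) - 1

    def p(x):  # direct evaluation, used only to seed the differences
        return sum(c * x**k for k, c in enumerate(coeffs))

    # Seed the first degree+1 values and take forward differences.
    rows = [[p(start + i * step) for i in range(degree + 1)]]
    for _ in range(degree):
        prev = rows[-1]
        rows.append([b - a for a, b in zip(prev, prev[1:])])
    column = [row[0] for row in rows]  # value, delta, delta^2, ...

    table = []
    for i in range(count):
        table.append((start + i * step, column[0]))
        # Advance one step: each entry absorbs the difference above it;
        # the highest difference of a polynomial is constant.
        for k in range(degree):
            column[k] += column[k + 1]
    return table

# Example: x^2 + x + 41 at x = 0, 1, 2, 3, 4, using only additions after seeding.
for x, value in difference_table([41, 1, 1], start=0, step=1, count=5):
    print(x, value)
```

Roughly speaking, de Prony's mathematicians did the seeding and chose the formulas, while the large third tier of hired calculators ran the addition loop; Babbage's and the Scheutzes' difference engines mechanized that same addition loop, with the initial differences set by hand.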