
A History of Computing Technology: 2nd (Second) Edition

This revised edition of the popular reference and textbook outlines the historical development of computing technology. It describes the history of calculation, with an emphasis on the physical devices used at different times to aid people in their attempts to automate arithmetic.

Paperback

First published November 1, 1985

45 people want to read

About the author

Michael R. Williams (Ph.D., Oklahoma State University) is professor of marketing and director of the Academy of Customer Excellence and Sales at Oklahoma City University. Prior to his academic career, Williams established a successful 30-plus year career in industrial sales, market research, and sales management and continues to consult and work with a wide range of business organizations. He has coauthored The Professional Selling Skills Workbook, Sales Management: Analysis and Decision Making and a variety of executive monographs and white papers on sales performance topics. Williams's research has been published in many national and international journals, including the Journal of Personal Selling & Sales Management, International Journal of Purchasing and Materials Management, Journal of Business and Industrial Marketing, Quality Management Journal and Journal of Business-to-Business Marketing. His work has also received numerous honors, including Outstanding Article of the Year in the Journal of Business and Industrial Marketing, the AACSB's Leadership in Innovative Business Education Award, and the Marketing Science Institute's Alden G. Clayton Competition. In 2004, Williams was honored with the Mu Kappa Tau Marketing Society recognition award for Outstanding Scholarly Contributor to the Sales Discipline. He has also been honored with numerous university, college, and corporate teaching and research awards, including Old Republic Research Scholar, the presentation of a seminar at Oxford's Brasenose College, Who's Who in American Education, and Who's Who in America. Williams has served, and continues to serve, in leadership roles as an advisor and board member for sales and sales management associations and organizations, including the University Sales Center Alliance, National Conference in Sales and Sales Management, and Vector Marketing.

Ratings & Reviews



Community Reviews

5 stars: 1 (11%)
4 stars: 4 (44%)
3 stars: 4 (44%)
2 stars: 0 (0%)
1 star: 0 (0%)
Displaying 1 of 1 review
Michael
26 reviews · 7 followers
February 2, 2022
Recapitulations and contemplations after reading A History of Computing Technology by Michael R. Williams.

Computer science is inextricably bound up with mathematics. Indeed, the older departments of computer science in the world's universities were most often incubated within departments of mathematics, from which they later separated. Sometimes they emerged instead from departments of engineering, the connection to mathematics being indirect. The connection is balefully downplayed in university recruitment as unnecessary, with a view to attracting skittish students, but it is there, whatever the admission criteria may be.

Understanding the mathematics is rewarding—even if you don’t study mathematics or engineering or computer science—because so much of our world is founded upon it. So much of what we consider to be inherent, to be self-evident, is the product of hundreds, even thousands of years of trial and error, of blind experimentation, with no guide for what is good or correct. It is striking how great the progress has been even in the last twenty years, let alone the last twenty thousand.

Our very numerals 0–9 are a relatively recent invention of the Indians, inherited by the Arabs, then by the Latin-speaking world. For the innovators did not only provide us with a wieldier system of numeration: the very idea of nothingness as a firm, representable quantity is owed to them. In Arabic that quantity is to this day called sifr, written as a centred dot ٠, which was Latinised as zephirum and then syncopated to zero.

Multifarious experimentations with other systems preceded the Hindu-Arabic numerals' eventual domination, as Michael R. Williams describes in this book. By the time humans began settling as farmers, a means of accounting was certainly needed, especially where taxation was exacted on settled folk. The Mycenaeans of the third and second millennia B.C.E., for example, used a tally system attested on Linear A and B tablets¹.

Aside from secular concerns, however, was the movement of the heavens. Early humans would easily have observed that the seasons work in a cycle, and that familiar constellations appear and reappear in the night sky in accordance with the seasons. Cloud patterns or the smell of the air augur bad weather, which in extremis kills. No doubt a link was made between kinds of weather and the seasons: blizzards and exposure in winter, droughts and starvation in summer. When the weather is seen as a facet of divine retribution, inferring the motion of the heavens takes on a religious character. Where nature and gods and weather and the heavens are intermingled, it is a study in Divine Law and Justice. Hence the Babylonians' abiding interest in astronomy and astrology produced their division of the heavens (and hence the circle) into three hundred and sixty degrees, proceeding from their base-sixty system of counting.

In contemplating the foundations, one is transfixed by a sudden apprehension: these two earliest mathematical occupations have endured since the dawn of civilisation. Government and stargazing are two very human concerns. One is inevitably drawn to the conclusion that advances in governance, philosophy and science could not have proceeded without the development of mathematics. Mathematics is perhaps the oldest institution after prostitution.

But how to record numbers? Abstraction is not difficult: fingers to count the flock. But fingers are not permanent. What about solid objects of roughly equal size, like stones? But stones can be scattered and records tampered with; stones are heavy, too, impractical where large numbers are concerned, and cannot be subdivided. The problem became to develop a system that was manipulable and a means of recording that was persistent, and it grew more pressing once currency was invented.

When clay came to be used for pottery, its malleability when wet made it conducive to recording counts as lines. Records in clay could be fired and stored permanently. But clay is impractical, requiring 'some skill and a needle-sharp stylus' if the script is complex, which entailed either a simplified script, as with cuneiform, or another medium, as paper came to be used in Greece and Crete². The early experimentations with numerical systems were thus concerned with means of recording quantities and preserving records. Two types of system emerged: positional (like the Arabic numerals) and additive (like the Roman).

The additive systems seem to have posed a greater burden on the mathematician. One product was duplation and mediation, devised by the Egyptians, who counted them among the arithmetical operations alongside addition, subtraction, multiplication and division. Duplation is multiplication by repeatedly doubling the multiplicand and summing those doublings whose indices (powers of two) add up to the multiplier. Mediation, repeated halving, works similarly, but it also necessitated a system for representing fractions. In fact, mediation is still used today in optimisations of division on computers, as bit-shifting!
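
A minimal sketch of duplation in Python (my own illustration, not the book's notation): double the multiplicand and sum the doublings picked out by the binary decomposition of the multiplier. The final line shows the point about mediation surviving as a bit-shift.

    def duplation(multiplicand, multiplier):
        # Multiply by repeated doubling: keep the doublings that
        # correspond to the binary decomposition of the multiplier.
        total = 0
        power, doubling = 1, multiplicand
        while power <= multiplier:
            if multiplier & power:          # this doubling is needed
                total += doubling
            power, doubling = power * 2, doubling * 2
        return total

    assert duplation(238, 13) == 238 * 13   # 238 + 4*238 + 8*238 = 3094

    # Mediation (halving) survives as a right shift on non-negative integers:
    assert 3094 >> 1 == 3094 // 2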

The demands of navigation and government brought on great advances in tools, which were not greeted everywhere with enthusiasm. To the reader's amusement, Williams relates accounts of mathematicians scorning tools. As great a personage as Plato was scornful of 'the vulgar and childish art' of calculation. In his time, mathematicians were either logistikoi, doers of calculations, or arithmetikoi, the pure mathematicians, who reasoned in the theory of numbers³. William Oughtred disdained the newfangled sector, because 'the way of Art is not by Instruments', and teaching with them makes the teacher 'vulgar' and 'their scholars only doers of tricks'. It is with some gall that he said this, considering that in the seventeenth century he invented the slide rule, which remained in wide use until quite recently.

Tools eased computation. Speed was necessary for firing a cannon in pitched battle, accuracy for measuring one's position at sea. But however much faster calculation became, it was still tediously slow, particularly where large multiplications were concerned. The task was eased considerably by John Napier's invention of logarithms. Many tables of logarithms were published to that end, but all were riddled with errors, despite the repeated efforts of states to produce better tables. Revolutionary France made a prodigious effort in this regard, hiring hundreds of unemployed hairdressers to perform the calculations independently of one another, so that errors would not be shared. Those tables were never published, though no doubt they would still have contained significant errors: despite other efforts at error elimination, errors would always creep in away from the mathematicians, in the typesetting!
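
A toy illustration of why the tables mattered (my own sketch, with a four-figure 'table' standing in for the printed page): multiplication becomes the addition of logarithms, and the precision of the table bounds the precision of the answer.

    import math

    def table_log(x):
        # a four-figure table of common logarithms, as a function
        return round(math.log10(x), 4)

    def table_antilog(y):
        return 10 ** y

    # Multiply 3847 by 5216 the way a table user would:
    # look up the two logs, add them, take the antilog.
    approx = table_antilog(table_log(3847) + table_log(5216))
    print(round(approx))    # about 20,063,000 (four figures cost some accuracy)
    print(3847 * 5216)      # exactly 20,065,952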

Tables continued to be riddled with errors until the development of printing mechanical calculators in the mid-nineteenth century. Many attempts had been made at a workable mechanical calculator, but each was hampered by physical limitations in one way or another. Carrying ones strained the gears, and a carry rippling across multiple digits could make the mechanism slip or break under the force the user supplied through the crank. Until well into the twentieth century people would continue to rely on manual tools and printed tables to perform calculations.

Babbage's Difference Engine, a very large mechanical calculator designed to tabulate polynomials, was the first attempt at an automated solver, using the method of finite differences, the same process children are taught today for discovering the nth term of a series. It was to undertake the whole process from problem input to printing the result. It failed, but the concept did not: a Swedish engineer, Georg Scheutz, produced a working difference engine years later, informed by Babbage's designs.
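
Here is a small Python sketch of the finite-difference idea (my own, illustrating the method rather than Babbage's mechanism): given a polynomial's starting value and its differences, the whole table follows by addition alone.

    def difference_engine(initial, steps):
        # initial = [f(0), first difference, second difference, ...]
        diffs = list(initial)
        values = []
        for _ in range(steps):
            values.append(diffs[0])
            # fold each difference into the one above it
            for i in range(len(diffs) - 1):
                diffs[i] += diffs[i + 1]
        return values

    # f(x) = x^2 + x + 41: f(0) = 41, first difference 2, second difference 2
    print(difference_engine([41, 2, 2], 6))   # [41, 43, 47, 53, 61, 71]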

On the back of the many iterations of the Difference Engine's design, Babbage developed an early concept for a general computer: the Analytical Engine, a monstrously large machine implementing memory, computation and output units. Modern machines would follow a similar schema. Memory apparently proved the most troublesome part. Mechanical storage was conceptually the simplest, but being mechanical it was slow, and the whole machine was throttled by it: stored numbers had to be passed along columns of gears to the arithmetic unit on demand.
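
A toy sketch of that schema (the class and method names are my own invention, not Babbage's terminology): a store that holds numbers, a mill that fetches operands from it and writes results back, and a printed line standing in for the output unit.

    class Store:
        # holds numbered columns of figures; the fetch was the slow,
        # mechanical part in the original machine
        def __init__(self):
            self.columns = {}

        def put(self, address, value):
            self.columns[address] = value

        def get(self, address):
            return self.columns[address]

    class Mill:
        # the arithmetic unit: operates only on values fetched from the store
        def __init__(self, store):
            self.store = store

        def add(self, a, b, result):
            self.store.put(result, self.store.get(a) + self.store.get(b))

    store = Store()
    mill = Mill(store)
    store.put("V1", 2)
    store.put("V2", 3)
    mill.add("V1", "V2", "V3")
    print(store.get("V3"))   # 5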

The computers of the interbellum and war periods were made of combinations of mechanical and electrical components. A shortage of components during and after the war constrained the development of purely electronic computers, although they were theoretically possible with the technology available. The first successful electronic computers, built around vacuum tubes, appeared in Britain after the war, followed by machines in the U.S. (The U.S. military's interest in the computing projects helped the engineers to circumvent hardware restrictions.)

The boggling variety of configurations (different word sizes, for example, since the 8-bit byte and its multiples had not yet become standard) made these machines non-interoperable. Even IBM's own computers were not interoperable with one another until clients of its 700 series demanded more easily programmable systems, which gave rise to an industry of assemblers, and greater facility in sharing programmes between computing sites. Standards for word sizes thence developed, and gradually the available assembly-level instructions converged on a common set.

Computers are of course fully electronic now. Perhaps the only mechanical parts on most computers are the fans or the CD drive. That the standards fixing the fundamental architecture of computers have remained impressively consistent is owed to the computer market that developed in the decades after the War. There is still a chasm between Windows NT and UNIX systems in important ways: a programme compiled on one does not run on the other without a virtual machine. But by and large, with niggling exceptions, code written on one machine today can be compiled on any machine.

The book does not go further than 1985, when it was published. Williams could not have known about Turing’s machines in Bletchley Park, which are only speculated on. It would be interesting to compare his predictions with what we know now of the decipherment unit there and their machine. It is charming, too, to see the expressions of anticipation for the future of computing technology. Williams is retired now. I wonder what he thinks of these advances, of how central the computer has become to our existence.



1. John Chadwick, The Decipherment of Linear B, Cambridge University Press, p. 21.
2. Ibid.
3. Michael R. Williams, A History of Computing Technology, p. 57.
