History of computing
The history of computing is longer than the history of computing hardware and modern computing technology and includes the history of methods intended for pen and paper or for chalk and slate, with or without the aid of tables. The timeline of computing presents a summary list of major developments in computing by date.
Concrete devices
Computing is intimately tied to the representation of numbers. But long before abstractions like number arose, there were mathematical concepts to serve the purposes of civilization. These concepts are implicit in concrete practices such as:
- one-to-one correspondence, a rule for counting how many items there are, say on a tally stick, which was eventually abstracted into number;
- comparison to a standard, a method for assuring reproducibility in a measurement, for example in counting coins;
- the 3-4-5 right triangle, a device for assuring a right angle, using ropes with 12 evenly spaced knots, for example.
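The knotted-rope practice above relies on the converse of the Pythagorean theorem: if the three sides satisfy a² + b² = c², the angle opposite the longest side is a right angle. A minimal sketch of that check (the function name is my own, not from the source):

```python
import math

def is_right_triangle(a, b, c):
    """Check whether side lengths a, b, c form a right triangle,
    using the converse of the Pythagorean theorem."""
    a, b, c = sorted((a, b, c))          # put the hypotenuse last
    return math.isclose(a * a + b * b, c * c)

# A rope with 12 evenly spaced knots folds into a 3-4-5 triangle:
print(is_right_triangle(3, 4, 5))   # True: 9 + 16 = 25
print(is_right_triangle(3, 4, 6))   # False
```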
Numbers
Eventually, the concept of numbers became concrete and familiar enough for counting to arise, at times with sing-song mnemonics to teach sequences to others. All known languages have words for at least "one" and "two", and even some animals, such as the blackbird, can distinguish a surprising number of items.
Advances in the numeral system and mathematical notation eventually led to the discovery of mathematical operations such as addition, subtraction, multiplication, division, squaring, square root, and so forth. Eventually the operations were formalized, and concepts about the operations became understood well enough to be stated formally, and even proven. See, for example Euclid's algorithm for finding the greatest common divisor of two numbers.
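Euclid's algorithm, mentioned above, can be stated in a few lines: repeatedly replace the pair of numbers by the smaller number and the remainder of dividing the larger by the smaller, until the remainder is zero. A minimal sketch:

```python
def gcd(a, b):
    """Euclid's algorithm for the greatest common divisor:
    replace (a, b) by (b, a mod b) until the remainder is zero."""
    while b:
        a, b = b, a % b
    return a

print(gcd(48, 18))  # 6
print(gcd(17, 5))   # 1 (coprime)
```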
By medieval times, the positional Hindu-Arabic numeral system had reached Europe, allowing the systematic computation of numbers. During this period, representing a calculation on paper made it practical to evaluate mathematical expressions and to tabulate mathematical functions such as the square root, the common logarithm (for use in multiplication and division), and the trigonometric functions. By the time of Isaac Newton's research, paper or vellum was an important computing resource, and even in our present time researchers like Enrico Fermi would cover random scraps of paper with calculations, to satisfy their innate curiosity about an equation. Even into the period of programmable calculators, Richard Feynman would unhesitatingly compute by hand any steps that overflowed the memory of the calculators, just to learn the answer.
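The reason a table of common logarithms speeds up multiplication is the identity log₁₀(xy) = log₁₀(x) + log₁₀(y): a product becomes one addition plus a reverse table look-up (the antilogarithm). A sketch of the idea, with the library function standing in for the printed table:

```python
import math

def multiply_via_logs(x, y):
    """Multiply two positive numbers the way log tables were used:
    add their common logarithms, then take the antilogarithm."""
    return 10 ** (math.log10(x) + math.log10(y))

print(multiply_via_logs(37.0, 52.0))  # ~ 1924.0, i.e. 37 * 52
```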
Navigation and astronomy
Starting with known special cases, the calculation of logarithms and trigonometric functions can be performed by looking up numbers in a mathematical table, and interpolating between known cases. For small enough differences, this linear operation was accurate enough for use in navigation and astronomy in the Age of Exploration. The uses of interpolation have thrived in the past 500 years: by the twentieth century Leslie Comrie and W.J. Eckert systematized the use of interpolation in tables of numbers for punch card calculation.
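The table-plus-interpolation technique described above can be sketched directly: tabulate a function at coarse steps, then estimate values between entries by assuming the function is linear over each gap. Here `math.sin` stands in for a printed mathematical table; the step size and names are illustrative assumptions:

```python
import math

# A coarse table of sin(x), standing in for a printed mathematical table.
STEP = 0.1
xs = [i * STEP for i in range(16)]      # 0.0, 0.1, ..., 1.5 radians
ys = [math.sin(x) for x in xs]

def interp(x):
    """Linearly interpolate sin(x) from the table, for 0 <= x <= 1.5."""
    i = min(int(x / STEP), len(xs) - 2)        # table entry just below x
    t = (x - xs[i]) / (xs[i + 1] - xs[i])      # fractional position in the gap
    return ys[i] + t * (ys[i + 1] - ys[i])

print(abs(interp(0.37) - math.sin(0.37)))  # small: linear error is O(step^2)
```

For navigation-grade accuracy, historical tables simply used finer steps, shrinking the quadratic interpolation error accordingly.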
In our time, even a student can simulate the motion of the planets, an N-body differential equation, using the concepts of numerical approximation, a feat which even Isaac Newton could admire, given his struggles with the motion of the Moon.
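The student exercise alluded to above can be sketched in a few lines: step a planet's position and velocity forward under inverse-square gravity. This toy uses the explicit Euler method in normalized units where G·M = 1; it is an illustration of numerical approximation, not a production N-body integrator:

```python
def orbit(steps=1000, dt=0.001):
    """Integrate a circular orbit around a central mass (G*M = 1)
    with the explicit Euler method; returns the final position."""
    x, y = 1.0, 0.0          # start one unit from the sun
    vx, vy = 0.0, 1.0        # speed for a circular orbit at r = 1
    for _ in range(steps):
        r3 = (x * x + y * y) ** 1.5
        ax, ay = -x / r3, -y / r3    # inverse-square gravitational acceleration
        vx += ax * dt
        vy += ay * dt
        x += vx * dt
        y += vy * dt
    return x, y

x, y = orbit()
print((x * x + y * y) ** 0.5)  # radius stays near 1 for small dt
```

Euler's method slowly spirals outward; real orbital work uses higher-order or symplectic integrators, but the principle is the same.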
Weather prediction
The numerical solution of differential equations, notably the Navier-Stokes equations, was an important stimulus to computing, beginning with Lewis Fry Richardson's numerical approach to solving differential equations for weather prediction. To this day, some of the most powerful computer systems on Earth are used for weather forecasts.
Symbolic computations
By the late 1960s, computer systems could perform symbolic algebraic manipulations well enough to pass college-level calculus courses. Using programs like Maple, Macsyma (now Maxima), and Mathematica, including some open-source programs like Yacas, it is now possible to visualize concepts such as modular forms, which were previously accessible only to the mathematical imagination.
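The core idea behind such systems is manipulating expressions rather than numbers. A toy sketch of symbolic differentiation, restricted to polynomials (the representation and function name are my own, and hint only faintly at what Macsyma-class systems do in full generality):

```python
def differentiate(poly):
    """Symbolically differentiate a polynomial represented as a
    dict mapping exponent -> coefficient, via d/dx(c*x^e) = c*e*x^(e-1)."""
    return {e - 1: c * e for e, c in poly.items() if e != 0}

# d/dx (3x^2 + 5x + 7) = 6x + 5
print(differentiate({2: 3, 1: 5, 0: 7}))  # {1: 6, 0: 5}
```

A real computer algebra system extends the same rewrite-rule approach to arbitrary expression trees, with simplification, integration, and more.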
Books for further reading
See List of books on the history of computing.
Journals
- IEEE Annals of the History of Computing
Writing computer history
The professionalization of computer historiography is a fairly recent phenomenon. "Until the 1990s, this literature consisted almost entirely of memoirs by computer professionals, often tightly focused on the invention and design of particular machines or on the business history of early computer companies. A few important scholarly books, such as Williams’ A History of Computing Technology, attempted to cover the whole sweep of computer history. Even these, however, generally focused on the computer as a technological object, rather than on applications or (especially) on the evolving social role of computers and networks." [1]
Edwards urges his colleagues to think "outside the box" to understand the social dynamics of the development and use of computers.
Recent studies have begun to tell the story of the role of computers in the history of administration, business, communication and warfare.
Notes
- ^ Paul N. Edwards, "Making History: New Directions in Computer Historiography", IEEE Annals of the History of Computing, January-March 2001
See also
- History of computing hardware
- Timeline of quantum computing
- History of computer science
- Algorithm
- List of mathematicians
- Computing timelines category
- Virtual Museum of Computing
External links
- IEEE Annals of the History of Computing
- Richmond (UK) History of Computing Group
- The History of Computing by J.A.N. Lee
- The History of Computing Project
- SIG on Computers, Information and Society of the Society for the History of Technology
- A History of Computers
- Cringely's "Triumph of the Nerds"
- Computer History Museum
- Stanford Encyclopedia of Philosophy entry
- The history of computer
- Charles Babbage Institute: Center for the History of Information Technology
- Key Resources in the History of Computing
- History of the Computer
- A Chronology of Digital Computing Machines (to 1952) by Mark Brader
- Bitsavers, an effort to capture, salvage, and archive historical computer software and manuals from minicomputers and mainframes of the 50s, 60s, 70s, and 80s