Best Answer

A Brief History of Computing

Mechanical Computing Devices (© Copyright 1996-2005, Stephen White)

500 B.C. The abacus was first used by the Babylonians as an aid to simple arithmetic at some time around this date. The abacus in the form we are most familiar with was first used in China in around 1300 A.D.

1623 Wilhelm Schickard (1592-1635), of Tuebingen, Wuerttemberg (now in Germany), made a "Calculating Clock". This mechanical machine was capable of adding and subtracting up to 6-digit numbers, and warned of an overflow by ringing a bell. Operations were carried out by wheels, and a complete revolution of the units wheel incremented the tens wheel, in much the same way the counter on an old cassette deck worked. The machine and plans were lost and forgotten in the war that was going on, rediscovered in 1935, only to be lost in war again, and finally rediscovered in 1956 by the same man (Franz Hammer)! The machine was reconstructed in 1960 and found to be workable. Schickard had been a friend of the astronomer Johannes Kepler since they met in the winter of 1617.

1625 William Oughtred (1575-1660) invented the slide rule.

1642 French mathematician Blaise Pascal built a mechanical adding machine (the "Pascaline"). Despite being more limited than Schickard's Calculating Clock (see 1623), Pascal's machine became far better known. He was able to sell around a dozen of his machines in various forms, coping with up to 8 digits.

1668 Sir Samuel Morland (1625-1695), of England, produces a non-decimal adding machine, suitable for use with English money. Instead of a carry mechanism, it registers carries on auxiliary dials, from which the user must re-enter them as addends.

1671 German mathematician Gottfried Leibniz designed a machine to carry out multiplication, the "Stepped Reckoner". It could multiply numbers of up to 5 and 12 digits to give a 16-digit result. The machine was later lost in an attic until 1879. Leibniz was also the co-inventor of calculus.

1775 Charles, the third Earl Stanhope, of England, makes a successful multiplying calculator similar to Leibniz's.

1776 Mathieus Hahn, somewhere in what would later be Germany, also makes a successful multiplying calculator, which he had started in 1770.

1786 J. H. Mueller, of the Hessian army, conceives the idea of what came to be called a "difference engine": a special-purpose calculator for tabulating values of a polynomial, given the differences between certain values so that the polynomial is uniquely specified. It is useful for any function that can be approximated by a polynomial over suitable intervals (see the sketch after this timeline). Mueller's attempt to raise funds fails and the project is forgotten.

1801 Joseph-Marie Jacquard developed an automatic loom controlled by punched cards.

1820 Charles Xavier Thomas de Colmar (1785-1870), of France, makes his "Arithmometer", the first mass-produced calculator. It does multiplication using the same general approach as Leibniz's calculator; with assistance from the user it can also do division. It is also the most reliable calculator yet. Machines of this general design, large enough to occupy most of a desktop, continue to be sold for about 90 years.

1822 Charles Babbage (1792-1871) designed his first mechanical computer, the first prototype for the difference engine.
Babbage invented 2 machines, the Analytical Engine (a general-purpose mathematical device, see 1834) and the Difference Engine (a re-invention of Mueller's 1786 machine for solving polynomials). Both machines were too complicated to be built (although an attempt was made in 1832), but the theories worked. The Analytical Engine (outlined in 1833) involved many processes similar to the early electronic computers, notably the use of punched cards for input.

1832 Babbage and Joseph Clement produce a prototype segment of his difference engine, which operates on 6-digit numbers and 2nd-order differences (i.e. can tabulate quadratic polynomials). The complete engine, which would be room-sized, is planned to be able to operate both on 6th-order differences with numbers of about 20 digits, and on 3rd-order differences with numbers of 30 digits. Each addition would be done in two phases, the second one taking care of any carries generated in the first. The output digits would be punched into a soft metal plate, from which a plate for a printing press could be made. But there are various difficulties, and no more than this prototype piece is ever assembled.

1834 George Scheutz, of Stockholm, produces a small difference engine in wood, after reading a brief description of Babbage's project.

1834 Babbage conceives, and begins to design, his "Analytical Engine". The program was stored on read-only memory, specifically in the form of punched cards. Babbage continues to work on the design for years, though after about 1840 the changes are minor. The machine would operate on 40-digit numbers; the "mill" (CPU) would have 2 main accumulators and some auxiliary ones for specific purposes, while the "store" (memory) would hold perhaps 100 more numbers. There would be several punch card readers, for both programs and data; the cards would be chained and the motion of each chain could be reversed. The machine would be able to perform conditional jumps. There would also be a form of microcoding: the meaning of instructions would depend on the positioning of metal studs in a slotted barrel, called the "control barrel". The machine would do an addition in 3 seconds and a multiplication or division in 2-4 minutes.

1842 Babbage's difference engine project is officially cancelled. (The cost overruns have been considerable, and Babbage is spending too much time on redesigning the Analytical Engine.)

1843 Scheutz and his son Edvard Scheutz produce a 3rd-order difference engine with printer, and the Swedish government agrees to fund their next development.

1847 Babbage designs an improved, simpler difference engine, a project which took 2 years. The machine could operate on 7th-order differences and 31-digit numbers, but nobody is interested in paying to have it built. (In 1989-91, however, a team at London's Science Museum will do just that. They will use components of modern construction, but with tolerances no better than Clement could have provided... and, after a bit of tinkering and detail-debugging, they will find that the machine does indeed work.)

1853 To Babbage's delight, the Scheutzes complete the first full-scale difference engine, which they call a Tabulating Machine. It operates on 15-digit numbers and 4th-order differences, and produces printed output as Babbage's would have. A second machine is later built to the same design by the firm of Brian Donkin of London.

1858 The first Tabulating Machine (see 1853) is bought by the Dudley Observatory in Albany, New York, and the second one by the British government. The Albany machine is used to produce a set of astronomical tables; but the observatory's director is then fired for this extravagant purchase, and the machine is never seriously used again, eventually ending up in a museum. The second machine, however, has a long and useful life.

1871 Babbage produces a prototype section of the Analytical Engine's mill and printer.

1878 Ramon Verea, living in New York City, invents a calculator with an internal multiplication table; this is much faster than the shifting carriage or other digital methods. He isn't interested in putting it into production; he just wants to show that a Spaniard can invent as well as an American.

1879 A committee investigates the feasibility of completing the Analytical Engine and concludes that it is impossible now that Babbage is dead. The project is then largely forgotten, though Howard Aiken is a notable exception.

1885 A multiplying calculator more compact than the Arithmometer enters mass production. The design is the independent, and more or less simultaneous, invention of Frank S. Baldwin, of the United States, and T. Odhner, a Swede living in Russia. The fluted drums are replaced by a "variable-toothed gear" design: a disk with radial pegs that can be made to protrude or retract from it.

1886 Dorr E. Felt (1862-1930), of Chicago, makes his "Comptometer". This is the first calculator where the operands are entered merely by pressing keys rather than having to be, for example, dialled in. It is feasible because of Felt's invention of a carry mechanism fast enough to act while the keys return from being pressed.

1889 Felt invents the first printing desk calculator.

1890 U.S. census. The 1880 census had taken 7 years to complete, since all processing was done by hand from journal sheets. The increasing population suggested that by the 1890 census the data processing would take longer than the 10 years before the next census, so a competition was held to find a better method. It was won by a Census Department employee, Herman Hollerith, who went on to found the Tabulating Machine Company (see 1911), later to become IBM. Hollerith borrowed Babbage's idea of using punched cards (see 1801) from the textile industry for data storage. This method was used in the 1890 census and the result (62,622,250 people) was released in just 6 weeks! The storage allowed much more in-depth analysis of the data, and so, despite being more efficient, the 1890 census cost about double (actually 198%) that of the 1880 census.

1892 William S. Burroughs (1857-1898), of St. Louis, invents a machine similar to Felt's (see 1886) but more robust, and this is the one that really starts the mechanical office calculator industry.

1906 Henry Babbage, Charles's son, with the help of the firm of R. W. Munro, completes the mill of his father's Analytical Engine, just to show that it would have worked. It does. The complete machine is never produced.

1938 Konrad Zuse (1910-1995) of Berlin, with some assistance from Helmut Schreyer, completes a prototype mechanical binary programmable calculator, the first binary calculator; it is based on Boolean algebra (see 1848). It was originally called the "V1" but was retroactively renamed "Z1" after the war. It works with floating point numbers having a 7-bit exponent, 16-bit mantissa, and a sign bit (see the decoding sketch after this article). The memory uses sliding metal parts to store 16 such numbers, and works well; but the arithmetic unit is less successful. The program is read from punched tape -- not paper tape, but discarded 35 mm movie film. Data values can be entered from a numeric keyboard, and outputs are displayed on electric lamps.

1939 Zuse and Schreyer begin work on the "V2" (later "Z2"), which will marry the Z1's existing mechanical memory unit to a new arithmetic unit using relay logic. The project is interrupted for a year when Zuse is drafted, but he is then released. (Zuse is a friend of Wernher von Braun, who will later develop the *other* "V2", and after that play a key role in the US space program.)
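
The difference engines described above (Mueller's 1786 proposal, Babbage's 1822/1832 work, and the Scheutz machines) all mechanise the same arithmetic idea: the highest-order difference of a polynomial is constant, so once the starting differences are set up, every further table value can be produced by additions alone. Below is a minimal Python sketch of that method; the example polynomial and its starting differences are hypothetical and are not taken from the source.

```python
# Minimal sketch of the method of finite differences that a difference
# engine mechanises: after the starting differences are loaded, every new
# table value is produced using nothing but addition.

def tabulate(initial_differences, steps):
    """Tabulate a polynomial from its value and forward differences at x = 0.

    initial_differences[0] is p(0); initial_differences[k] is the k-th
    forward difference at 0 (unit step). For a degree-n polynomial the
    n-th difference is constant, so additions suffice from then on.
    """
    diffs = list(initial_differences)
    values = []
    for _ in range(steps):
        values.append(diffs[0])
        # Fold each difference into the one above it -- the only
        # operation the engine needs to perform.
        for i in range(len(diffs) - 1):
            diffs[i] += diffs[i + 1]
    return values

# Hypothetical example: p(x) = x^2 + 1 with unit steps.
# p(0) = 1, first difference = 1, second difference = 2 (constant).
print(tabulate([1, 1, 2], 6))   # [1, 2, 5, 10, 17, 26]
```

Babbage's 1832 prototype handled 2nd-order differences (quadratics, as in this example), while the planned full engine would have carried the same scheme to 6th-order differences and numbers of about 20 digits.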

---- © Copyright 1996-2004, Stephen White My homepage - email:swhite@ox.compsoc.net
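
The 1938 entry in the timeline above gives the Z1's floating-point word as a sign bit, a 7-bit exponent, and a 16-bit mantissa. The short sketch below decodes a word of that shape; the field order, the two's-complement exponent, and reading the mantissa as a pure binary fraction are illustrative assumptions, since the source specifies only the field widths and not the Z1's actual encoding.

```python
# Illustrative decoder for a Z1-style floating-point word:
# 1 sign bit + 7-bit exponent + 16-bit mantissa (widths from the timeline).
# Field order, two's-complement exponent, and fraction-style mantissa are
# assumptions made for this sketch, not the Z1's real format.

def decode_z1_style(word: int) -> float:
    sign = (word >> 23) & 0x1          # bit 23: sign
    exponent = (word >> 16) & 0x7F     # bits 16-22: 7-bit exponent
    mantissa = word & 0xFFFF           # bits 0-15: 16-bit mantissa

    # Treat the 7-bit exponent as a signed (two's-complement) value.
    if exponent >= 64:
        exponent -= 128

    # Treat the mantissa as a binary fraction in [0, 1).
    fraction = mantissa / 2**16
    return (-1) ** sign * fraction * 2**exponent

# Example: sign = 0, exponent = 3, mantissa = 0b1100000000000000 (0.75)
word = (0 << 23) | (3 << 16) | 0b1100000000000000
print(decode_z1_style(word))   # 0.75 * 2**3 = 6.0
```

Whatever the Z1's exact encoding was, splitting each number into a sign, an exponent, and a mantissa is what lets a fixed-width word cover a wide range of magnitudes, which mattered when the machine's memory held only 16 such numbers.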


Wiki User

βˆ™ 15y ago
More answers

Wiki User

βˆ™ 16y ago

It all depends on how early you want to go...

1930s: IBM punch card tabulators. This system automated the Social Security Administration when it first appeared, and it was also used by the Nazis to help carry out the Holocaust.

1940s: ENIAC. It required 1,800 square feet of floor space and was built to compute ballistic trajectory tables for artillery, although it was not completed until the end of 1945, as World War II ended.

1950s: The IBM 704, one of the first mass-produced computers. You couldn't buy one -- IBM leased them to end users. Also the UNIVAC, the first computer you could buy outright.

1960s: The IBM System/360. This was the first "system" computer. Let me explain. You know you can buy a Celeron PC with 512MB of RAM and 120GB of disk, and when you outgrow it you can buy a Core 2 Duo machine with 8GB of RAM and 4TB of disk and just move all your programs, data and peripherals to the new box. In the early days of computers you couldn't do that: if you had a Univac and needed a larger machine, you'd have to trade in your printer (no mean feat when it was the size of a Volkswagen and the room was built around it) and recompile, plus partially rewrite, all your software. This made early computers a hard sell, especially since their competition at the time was the ledger pad, which you used with a sharp pencil. The S/360 was different: if you outgrew the machine you bought, you could get a larger one and keep all your old stuff.

1970s: Two things happened. The first was the emergence of the minicomputer. Digital Equipment Corporation had the PDP-11, and later the VAX. IBM came out with a lot of machines -- the System/34, System/36, System/38 and Series/1. Everyone had minis. Then there were the original microcomputers, led by the Altair 8800. The plans for that machine were published in a magazine and you built it yourself. And when I say "built it," I'm not talking about this "screw premade cards together and have a computer in an hour" stuff you guys like: when you built an Altair, they sent you a board that didn't have any parts on it, and you had to buy every chip and resistor separately and then solder the whole thing together. It's the difference between building a plastic storage shed from Lowe's and building a house from scratch. There were also the Apple I and II models.

1980s: The IBM PC, PC/XT, PC/AT, 386 and 486 were big on the DOS side -- no Windows yet, so you had to remember commands and type them in. Apple had its Macintosh. There were still lots of big-iron machines such as large VAXes, the IBM System/370 and the Wang VS.

Wiki User

βˆ™ 16y ago

The earliest computing device was the abacus, an ancient calculating device made of beads strung on wires and mounted in a frame. It was used in China and other parts of the world to do arithmetic. I hope you find this information useful.
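
As a rough illustration of the place-value arithmetic an abacus carries out, here is a small Python sketch; the column layout and carry rule are generic and not tied to any particular abacus design.

```python
# Rough model of abacus-style addition: each wire (column) holds one decimal
# digit, and a column that overflows passes a carry to the next wire, much as
# beads are cleared and a single bead is moved on the wire above.

def abacus_add(columns, amount, position=0):
    """Add `amount` at `position` (0 = ones wire) and propagate carries.

    `columns` is a list of digits, least significant wire first.
    """
    columns = list(columns)
    columns[position] += amount
    i = position
    while columns[i] > 9:                 # this wire has overflowed
        carry, columns[i] = divmod(columns[i], 10)
        if i + 1 == len(columns):
            columns.append(0)             # add a new wire if needed
        columns[i + 1] += carry
        i += 1
    return columns

# Example: 47 + 8 = 55 (columns listed least-significant first).
print(abacus_add([7, 4], 8))   # [5, 5]
```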


Wiki User

βˆ™ 12y ago

Abacus, slide rule, and various forms of mechanical adding machines -- some still very much in use in cash registers and the like. That's a few. Stretching the term, an analog light meter (photometer) for photography can be considered a form of both a circular slide rule and an analog computer; the light input comes from a selenium or silicon photoelectric cell.


Wiki User

βˆ™ 15y ago

Fingers, Quipu, Pebble Computer, Abacus, Napier's Bones


Wiki User

βˆ™ 16y ago

Lost to human history, there may have been numerous devices that could be classified as computers. One that remains in use in close to its original form is the abacus.


Wiki User

βˆ™ 13y ago

Napier's bones


Q: What are the seven earliest computer devices?
Related questions

What was the erliest computing devices?

Probably an abacus, invented by the Chinese.


Earliest computer dates back to when?

The earliest known computing device, the Antikythera Mechanism, dates back to about 100 BC. It is mechanical and analog, and it is believed to have performed astronomical and navigational calculations.


What is the earliest computer device?

It mostly depends upon what you mean by "computing". A computing device can simply mean a device that is programmable and capable of automated calculation. In that case, the earliest known computing device is the "castle clock", invented by Al-Jazari and dating to 1206; it is considered the earliest programmable analog computer. The castle clock automatically opened doors each hour to mark the passing of time, and the length of day it kept track of could be reprogrammed to compensate for the changing lengths of day and night throughout the year.


What kind of a device is a computer?

A computer is an electronic device.


The earliest computing device?

Probably the counting board.


Is a computer an output device or a modem?

A computer is neither; output devices (such as monitors and printers) and modems are peripheral devices attached to a computer.


When was the first mouse invented?

A trackball, a closely related pointing device, was developed in 1946 for a British military project and kept secret. The mouse itself was invented by Douglas Engelbart in the 1960s and first demonstrated publicly in 1968. Microsoft released its first mouse for personal computers (PCs) in 1983.


Were computers around in 1750-1900?

No. The earliest known device using computer technology was the Hewlett-Packard 200A audio oscillator, developed in 1939. There were calculating machines...


What is the name of the earliest computer languages?

The earliest were Plankalkül, assembly language, and Autocode.


Main processing device in a computer?

The main processing device in a computer is the central processing unit (CPU).


Define the output device in computer?

An output device is hardware that presents the results of a computer's processing to the user, such as a monitor, printer, or speakers.


What are the seven earliest devices?

Napier's bones