History of Computers

ENG 121

Computers are now so widespread and so heavily used that they have become impossible to ignore. They appear in so many forms that we often fail to recognize them for what they are. People interact with a computer when they purchase their morning coffee at a vending machine. As they drive to work, the traffic lights that so often hamper them are controlled by computers in an attempt to speed the journey. Accept it or not, the computer has invaded our lives.

The origins of the computer are much like those of many other inventions
and technologies. It evolved from a relatively simple idea or plan designed to
help perform tasks more easily and quickly. The first basic computers were
designed to do just that: compute. They performed basic mathematical functions
such as multiplication and division and displayed the results in a variety of
ways. Some computers displayed results as a binary representation on electronic
lamps. Binary uses only ones and zeros; thus, lit lamps represented ones and
unlit lamps represented zeros. The irony is that people then needed to perform
another mathematical operation to translate the binary output into decimal
before it was readable.
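As an illustration of that translation step (done here in Python purely for demonstration, not something the original machines performed themselves), a row of lamps can be read as a binary number and converted to decimal:

```python
# A minimal sketch, assuming the lamps are read most significant first:
# each lit lamp stands for a 1, each unlit lamp for a 0, and the decimal
# value is built up by doubling and adding.

def lamps_to_decimal(lamps):
    """Convert a list of lamp states (True = lit = 1, False = unlit = 0),
    most significant lamp first, into an ordinary decimal number."""
    value = 0
    for lit in lamps:
        value = value * 2 + (1 if lit else 0)
    return value

# Example: lamps showing lit, unlit, lit, lit read as binary 1011,
# which is the decimal number 11.
print(lamps_to_decimal([True, False, True, True]))  # → 11
```

The doubling-and-adding loop is simply the positional rule for base two: each lamp shifts the running total one binary place to the left before its own value is added.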

One of the first computers was called ENIAC. It was a huge machine, nearly
the size of a standard railroad car. It contained electronic tubes, heavy-gauge
wiring, angle iron, and knife switches, to name just a few of its components. It
is difficult to believe that computers have since evolved into the
suitcase-sized microcomputers of the 1990s.

Computers eventually evolved into less archaic-looking devices near the
end of the 1960s. Their size had been reduced to that of a small automobile, and
they processed segments of information at faster rates than older models. Most
computers at this time were termed "mainframes" because many computers were
linked together to perform a given function. The primary users of these types of
computers were military agencies and large corporations such as Bell, AT&T,
General Electric, and Boeing. Organizations such as these had the funds to
afford such technologies. However, operating these computers required extensive
expertise and manpower. The average person could not have fathomed trying to
operate and use these million-dollar processors.

The United States is credited with pioneering the computer. It was not
until the early 1970s that nations such as Japan and the United Kingdom began
developing computer technology of their own. This resulted in newer components
and smaller computers. The use and operation of computers had developed into a
form that people of average intelligence could handle without too much ado. When
the economies of other nations started to compete with that of the United
States, the computer industry expanded at a great rate. Prices dropped
dramatically, and computers became more affordable to the average household.
Like the invention of the wheel, the computer is here to stay.

The operation and use of computers in our present era of the 1990s has
become so easy and simple that perhaps we have taken too much for granted.
Almost everything of use in society requires some form of training or education.
Many people say that the predecessor of the computer was the typewriter, which
definitely required training and experience to operate at a usable and efficient
level. Children are being taught basic computer skills in the classroom to
prepare them for the future evolution of the computer.

The history of computers begins about 2,000 years ago with the birth
of the abacus, a wooden rack holding two horizontal wires with beads strung on
them. When these beads are moved around according to rules memorized by the
user, all ordinary arithmetic problems can be done. Another important invention
of around the same time was the astrolabe, used for navigation.

Blaise Pascal is usually credited with building the first digital
computer, in 1642. It added numbers entered with dials and was made to help his
father, a tax collector. In 1671, Gottfried Wilhelm von Leibniz designed a
computer that was built in 1694. It could add and, after some rearrangement,
multiply. Leibniz invented a special stepped-gear mechanism for introducing the
addend digits, and this mechanism is still in use.

The prototypes made by Pascal and Leibniz were not used in many places,
and were regarded as curiosities until a little more than a century later, when
Thomas of Colmar (a.k.a. Charles Xavier Thomas) created the first successful
mechanical calculator that