Computers are electronic devices that can receive a set of instructions, or
program, and then carry out that program by performing calculations on
numerical data or by compiling and correlating other forms of information.
Earlier eras of technology could scarcely have imagined such machines. Different types
and sizes of computers find uses throughout society in the handling of data,
from secret governmental files and bank transactions to private household
accounts. Computers have opened up a new world in manufacturing through the
development of automation, and they have made modern communication systems
possible. They are essential tools in almost every field of research and
applied technology, from constructing models of the universe to producing
tomorrow's weather reports, and their use has in itself opened up new areas of
development. Database services and computer networks make available a great
variety of information sources. The same techniques, however, also make
possible invasions of privacy and of restricted information sources, and
computer crime has become a serious risk that society must face if it is to
enjoy the benefits of modern technology.

Two main types of computers are in use today, analog and
digital, although the term computer is often used to mean only the digital type.
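In modern terms, each two-state switch described below is a bit, and a group of n linked switches can stand for 2^n distinct patterns, which is why checking switches as a group multiplies what the machine can recognize in one cycle. A minimal Python sketch of the idea (the particular values are illustrative, not taken from the original text):

```python
# Each "switch" (bit) recognizes just two states: off (0) or on (1).
one_switch = 2 ** 1        # a single switch distinguishes only 2 patterns

# Linking switches into a group multiplies the distinguishable patterns:
# eight switches (one byte) give 2**8 = 256 distinct combinations.
one_byte = 2 ** 8

# One particular setting of four switches, on-off-off-on, is the binary
# numeral 1001, which stands for the number 9.
setting = 0b1001

print(one_switch, one_byte, setting)   # 2 256 9
```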

Everything that a digital computer does is based on one operation: the ability
to determine whether a switch, or gate, is open or closed. That is, the
computer can recognize only two states in any of its microscopic circuits: on
or off, high voltage or low voltage, or, in the case of numbers, 0 or 1. The
speed at which the
computer performs this simple act, however, is what makes it a marvel of modern
technology. Computer speeds are measured in megahertz (MHz), or millions of
cycles per second. A computer with a "clock speed" of 10 MHz (a fairly
representative speed for a microcomputer) is capable of executing 10 million
discrete operations each
second. Business microcomputers can perform 15 to 40 million operations per
second, and supercomputers used in research and defense applications attain
speeds of billions of cycles per second. Digital computer speed and calculating
power are further enhanced by the amount of data handled during each cycle. If a
computer checks only one switch at a time, that switch can represent only two
commands or numbers; thus ON would symbolize one operation or number, and OFF
would symbolize another. By checking groups of switches linked as a unit,
however, the computer increases the number of operations it can recognize at
each cycle.

The first adding machine, a precursor of the digital computer, was
devised in 1642 by the French philosopher Blaise Pascal. This device employed a
series of ten-toothed wheels, each tooth representing a digit from 0 to 9. The
wheels were connected so that numbers could be added to each other by advancing
the wheels by a correct number of teeth. In the 1670s the German philosopher and
mathematician Gottfried Wilhelm von Leibniz improved on this machine by devising
one that could also multiply. The French inventor Joseph Marie Jacquard, in
designing an automatic loom, used thin, perforated wooden boards to control the
weaving of complicated designs.

Analog computers began to be built at the start
of the 20th century. Early models calculated by means of rotating shafts and
gears. Numerical approximations of equations too difficult to solve in any other
way were evaluated with such machines. During both world wars, mechanical and,
later, electrical analog computing systems were used as torpedo course
predictors in submarines and as bombsight controllers in aircraft. Another
system was designed to predict spring floods in the Mississippi River Basin.

In
the 1940s, Howard Aiken, a Harvard University mathematician, created what is
usually considered the first digital computer. This machine was constructed from
mechanical adding machine parts. The instruction sequence to be used to solve a
problem was fed into the machine on a roll of punched paper tape, rather than
being stored in the computer. In 1945, however, a computer with program storage
was built, based on the concepts of the Hungarian-American mathematician John
von Neumann. The instructions were stored within a so-called memory, freeing the
computer from the speed limitations of the paper tape reader during execution
and permitting problems to be solved without rewiring the computer. The rapidly
advancing field of electronics led to construction of the first general-purpose
all-electronic computer in 1946 at the University of Pennsylvania by the
American engineer John Presper Eckert, Jr., and the American physicist John
William Mauchly. (Another American physicist, John Vincent Atanasoff, later
successfully claimed that certain basic techniques he had developed were used in
this computer.) Called ENIAC, for