
Computers deciphered military codes

More than a century after Charles Babbage designed the first computer, the British built one. Or was it the Americans? Or did the Germans do it first?

Until recently, the U.S.A.’s Mark I was thought to be the first true computer. It was completed in 1944.

In fact, Colossus I, a computer which deciphered German military codes, was completed in Britain the previous year. The Colossus machines were such well-kept war secrets that their existence was not revealed until the mid-seventies.

It now seems that the German Z3 and Z4 computers were in use even earlier. They were destroyed by Allied bombing and very little is known about them.

Babbage’s dream had been of a general-purpose machine which could automatically carry out a series of operations on a set of information, and could then be re-programmed to carry out different operations on another set of information. The American, British and German computers were all reprogrammable. They were electro-mechanical machines which used relays as on-off switches. These opened and closed like the old Morse code keys. IBM’s Mark I sounded “like a roomful of ladies knitting.”

The first completely electronic computer was ENIAC, the Electronic Numerical Integrator and Calculator. One of its designers, Dr John Mauchly, was a physics professor who wanted to find a better way of forecasting the weather. He knew that valves (vacuum tubes) could be used for counting very quickly. He and an electronics whiz, Presper Eckert, developed ENIAC during World War II at the University of Pennsylvania. It was funded not for weather prediction but for producing ballistics tables. ENIAC was the size of a house, and contained more than 18,000 valves. All the lights in West Philadelphia used to dim when it was turned on.

This is the second of our series of articles to help readers become more aware of computers. The series is based loosely on the fourth-form computer awareness syllabus. The first article appeared last Wednesday. The articles have been written by an Auckland teacher, MARY MATTHEW, with cartoons by another Auckland teacher, RITA PARKINSON.

While ENIAC was being built, many experts were sure it would never work. According to the manufacturers’ guarantees, valves would last for 2500 hours. You could expect a valve to fail every 7.5 minutes. If the designers had taken this at face value, there might not be any computers today. By careful design and quality control, Mauchly and Eckert managed to have ENIAC running one or two days between failures. It was used as a productive computer for 10 years although it was not completed until two months after the war was over.
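The failure estimate above can be reproduced with a little arithmetic. This is a rough sketch, assuming failures are spread evenly across the valves’ rated life (the article’s own figure of 7.5 minutes was presumably based on slightly different assumptions):

```python
# Rough check of the valve-failure arithmetic described above.
# Assumption (not from the article): failures are spread evenly,
# so mean time between failures = rated life / number of valves.
rated_life_hours = 2500      # manufacturers' guarantee per valve
valve_count = 18_000         # valves in ENIAC

mtbf_minutes = rated_life_hours * 60 / valve_count
print(f"Expected time between valve failures: {mtbf_minutes:.1f} minutes")
```

Either way, the conclusion is the same: with thousands of valves, even long-lived parts fail somewhere in the machine every few minutes.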

THE STORED-PROGRAM CONCEPT

To change ENIAC’s program it was necessary to rewire thousands of circuits. This could take several days. Computer design made its biggest step forward since the work of Charles Babbage when John von Neumann outlined a way of getting round this.

Computers already stored numbers in their memory. Von Neumann suggested storing the instructions there too.
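The idea can be illustrated with a toy machine. This sketch is entirely my own construction, not a description of any real computer: a single list serves as memory, holding instructions and numbers side by side, and the machine fetches its instructions from memory just as it fetches its data:

```python
# A toy stored-program machine: instructions and data share one memory.
# The instruction set is invented for illustration:
#   ("ADD", addr)  add the number at address addr to the accumulator
#   ("PRINT",)     print the accumulator
#   ("HALT",)      stop
memory = [
    ("ADD", 5),     # address 0
    ("ADD", 6),     # address 1
    ("PRINT",),     # address 2
    ("HALT",),      # address 3
    None,           # address 4: unused
    40,             # address 5: data
    2,              # address 6: data
]

accumulator = 0
pc = 0  # program counter: address of the next instruction
while True:
    op = memory[pc]          # fetch the instruction from memory
    pc += 1
    if op[0] == "ADD":
        accumulator += memory[op[1]]
    elif op[0] == "PRINT":
        print(accumulator)   # prints 42
    elif op[0] == "HALT":
        break
```

Because the program lives in ordinary memory, the machine could in principle overwrite its own instructions — the self-modification mentioned below.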

Storing programs in memory meant that instructions could be loaded quickly from outside the computer. The program could also be changed during use, as the computer could be programmed to change its own programs.

Von Neumann also suggested the use of binary numbers rather than our usual decimal ones. Binary numbers are built up from only two digits, 1 and 0, corresponding to “on” and “off” (or “true” and “false”). This is ideal for a system consisting of electrical circuits. Since 1946, computer designers have followed these recommendations.

It is sobering to consider that the first modern computer was developed by von Neumann’s group at Los Alamos to design the atomic bomb. Appropriately enough, it was called a Mathematical Analyser, Numerical Integrator and Computer — MANIAC I for short.

FIRST-GENERATION COMPUTERS: VALVES

The first-generation computers used valves for information processing. Valves are large, short-lived, give off a lot of heat, and use a great deal of electricity. The IBM 650 which Hawkins used to investigate Stonehenge was a first-generation computer. IBM originally intended to

make only fifty of the 650s but ended up marketing more than 1000.
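The binary numbers von Neumann recommended, described earlier, are easy to demonstrate. A minimal Python sketch (the helper name is my own, not anything from the article) converts a decimal number to binary by repeated division by two:

```python
# Binary counting: each digit is 1 or 0, corresponding to "on"/"off".
def to_binary(n):
    """Return the binary digits of a non-negative integer as a string."""
    if n == 0:
        return "0"
    digits = ""
    while n > 0:
        digits = str(n % 2) + digits   # the remainder gives the lowest bit
        n //= 2
    return digits

for n in range(6):
    print(n, "->", to_binary(n))
# 0 -> 0, 1 -> 1, 2 -> 10, 3 -> 11, 4 -> 100, 5 -> 101
```

Each “place” in a binary number is worth twice the place to its right, just as each decimal place is worth ten times its neighbour.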

SECOND-GENERATION COMPUTERS: TRANSISTORS

Second-generation computers used transistors instead of valves. These are smaller, more reliable, and use far less electricity than valves. Second-generation computers were smaller, faster, cheaper and more reliable than the first-generation machines. They needed little or no air-conditioning.

Software improved. Higher-level languages (meaning above the machine-language level) like FORTRAN and COBOL were developed to speed programming. Computers spread beyond university research laboratories into government departments and big business.

THIRD-GENERATION COMPUTERS: INTEGRATED CIRCUITS

The integrated circuit contains a number of transistors etched directly on a single piece of silicon, along with the connections between them. Integrated circuits are smaller, cheaper and more reliable than separate transistors. The first computers to use ICs were marketed in 1964; the third generation had arrived. Minicomputers became popular, as small firms were able to afford them.

FOURTH-GENERATION COMPUTERS: THE MICROPROCESSOR

By 1972, it became possible to fit several thousand transistors on to one silicon chip — the microprocessor. A piece of silicon one centimetre square contained the equivalent of a computer’s central processor. To quote “Time” magazine: “It was Babbage’s mighty mill in microcosm.” A single chip could be programmed to do an enormous variety of things, from running a washing machine to guiding a missile.

It formed the heart of a new kind of machine, the microcomputer. The small “personal computer” of school, home and business is a computer of the fourth generation.

Fifth-generation machines are under development. New technology includes the use of many processors working “in parallel.” New computer languages such as PROLOG (PROgramming in LOGic) allow human-like reasoning and “artificial intelligence.” Computer memory grows cheaper every year. More and more people refuse to be intimidated by thoughtlessly designed software. Human needs can now influence computer designers more than machine limitations.

The Japanese in particular plan to make fifth-generation machines of the 1990s “handy.” They hope to produce computers which can communicate with us in our own natural language — English, Japanese, Maori — either orally or through devices such as keyboards.

Many hopes are pinned on these “thinking” computers of the future.

Permanent link to this item

https://paperspast.natlib.govt.nz/newspapers/CHP19840110.2.102.1

Bibliographic details

Press, 10 January 1984, Page 19