Ah, the middle ages of computing development!
[Before going into further detail, you should read the first part of this trilogy, and then continue with this article.]
Vacuum tubes
If you’ve heard of computers, you’ve heard of vacuum tubes. And if you’ve heard of vacuum tubes, you should have heard of Lee de Forest, the man who invented the triode vacuum tube. His invention was the key to unlocking the door to the electronic era of computers. And here is a rather peculiar bit of history about the Audion (the vacuum tube that de Forest invented): earlier, Thomas Edison’s electric lamp had been modified by the Englishman Ambrose Fleming, who added a second element, called a plate, and called his device the Fleming Valve. By 1906, de Forest had modified Fleming’s valve by adding a grid to control and amplify signals, and called his device the Audion (survival of the fittest, I assume!).
The ability to control and amplify signals is important in the creation and manipulation of ones and zeroes. Without it, producing the zeroes and ones needed to perform digital logic would have been close to impossible. (Actually, it would have been impossible, but I don't want to admit that outright.)
Flip-Flops
Then came the key component of all digital electronics: the flip-flop. Technically, it is a bistable multivibrator, which means it has two stable states, so it can hold (remember) a single bit. This formed the basis of the building blocks that would eventually lead to enhanced computing. The British physicists W H Eccles and F W Jordan published the first flip-flop circuit design in 1919. It was called the Eccles-Jordan circuit before the fancier 'flip-flop' name was given to it.
Multivibrators come in more than one useful flavour, two of which are monostable and bistable. A monostable multivibrator has only one stable state: give it a trigger pulse and it switches over temporarily, then returns to its stable state on its own. The bistable multivibrator is the important one here, because it has two stable states. You give it a pulse and tell it to become a 1, and it obeys; pulse it again and tell it to become a 0, and it does so.
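To make the "remembers one bit" idea concrete, here is a minimal sketch in Python of an SR latch, the simplest flip-flop, modelled as two cross-coupled NOR gates. This is a toy simulation of the logic, not of the vacuum-tube hardware; the class and method names are my own invention for illustration.

```python
def nor(a, b):
    """NOR gate: outputs 1 only when both inputs are 0."""
    return 0 if (a or b) else 1

class SRLatch:
    """A bistable circuit: two NOR gates feeding each other's inputs."""
    def __init__(self):
        self.q, self.q_bar = 0, 1  # one of the two stable states

    def step(self, s, r):
        """Apply Set/Reset inputs and iterate the feedback loop
        until the outputs settle into a stable state."""
        for _ in range(4):
            q_new = nor(r, self.q_bar)
            q_bar_new = nor(s, self.q)
            if (q_new, q_bar_new) == (self.q, self.q_bar):
                break
            self.q, self.q_bar = q_new, q_bar_new
        return self.q

latch = SRLatch()
latch.step(s=1, r=0)  # "set": Q becomes 1
latch.step(s=0, r=0)  # inputs removed: Q stays 1 -- the latch remembers
latch.step(s=0, r=1)  # "reset": Q becomes 0
```

The point of the feedback loop is exactly the bistability described above: with both inputs released (S=0, R=0), the circuit holds whichever state it was last driven into.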
The ‘Eccles-Jordan binary counter’ was implemented using two vacuum tubes as the active (amplifying) elements for each bit of information storage. Later implementations using bipolar transistors could operate at up to 20 million state transitions per second as early as 1963! And there is an astonishing fact: “Frank Wilfred Jordan invented, together with William Henry Eccles, the so-called ‘flip-flop’ circuit in 1919” is practically the only fact known about Frank Wilfred Jordan; little else is known about him. (Don't believe me? Try Googling him!)
IBM
Several companies had merged with a view to computing, recording and tabulating data; in 1924 the resulting company was renamed International Business Machines Corporation, IBM, with Thomas J Watson Sr as president.
Since its founding, the company had grown vastly in terms of geographic coverage and economic sustenance. The company introduced the IBM 601, a punch-card machine with an arithmetic unit based on relays. It could do a multiplication in one second: it read two numbers of up to eight decimal places from a card and punched the result onto a blank space on the same card. (For your information, a relay is an electrical switch that opens and closes under the control of another electrical circuit. Note that it is still 'electrical', not 'electronic' yet.)
The machine became important in scientific as well as commercial computation, with several hundred units manufactured. It was a major breakthrough for IBM, one that helped propel the company into the ranks of the major players in the market.
The concept of computing
But until now, one of the key concepts involved in computing had not been invented, or, as some say, discovered: the idea of how computing could actually be made possible by machines (most people know it as the algorithm of computing). In 1937, Alan Mathison Turing published a paper on "computable numbers", which was, essentially, the mathematical theory of computation. That paper solved a mathematical problem, but with a difference: the solution was derived not by mathematical calculation, but by reasoning about the Turing machine.
The Turing Machine (TM) is simply a theoretical, simplified computer. (The field, some say, should correctly have been called 'computing science'.) Turing was interested in the question of what it means to be computable. Let me explain clearly: it means, simply, can something be computed, and if so, how? For example, the problem of factorising a 1000-digit number may be computable; the problem of determining x given x + y + z = 40 and z = 3 is not, since y remains unknown!
Intuitively, a task is computable if one can specify a sequence of instructions which, when followed, will result in the completion of the task. Such a set of instructions is called an effective procedure, or algorithm, for the task. This 'algorithm' must be made precise by considering the capabilities of the device that is to carry out the instructions. Devices with different capabilities may be able to complete different instruction sets, and may therefore give rise to different classes of computable tasks. This was the breakthrough that was waiting to happen. This idea of computing is what modern computers follow: a complex set of instructions, or algorithms.
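A Turing machine really is as simple as described: a tape of symbols, a read/write head, and a table of rules. Here is a minimal sketch in Python; the function name, rule format, and the bit-inverting example machine are all my own choices for illustration, not anything from Turing's paper.

```python
def run_turing_machine(tape, rules, state="start", accept="halt"):
    """A tape, a head, and a rule table.
    rules maps (state, symbol) -> (new_symbol, move, new_state),
    where move is "L" or "R" and "_" is the blank symbol."""
    tape = list(tape)
    head = 0
    while state != accept:
        symbol = tape[head]
        new_symbol, move, state = rules[(state, symbol)]
        tape[head] = new_symbol           # write
        head += 1 if move == "R" else -1  # move the head
        if head < 0:                      # grow the tape as needed
            tape.insert(0, "_"); head = 0
        elif head >= len(tape):
            tape.append("_")
    return "".join(tape).strip("_")

# A tiny example machine: invert every bit, halt at the first blank.
rules = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", "_"): ("_", "R", "halt"),
}
run_turing_machine("1011_", rules)  # -> "0100"
```

The rule table is the "effective procedure" of the paragraph above: a finite, precise list of instructions that any device with these capabilities can carry out mechanically.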
The Complex Number Calculator
In the same year that Turing published his paper, a man named George Stibitz of Bell Labs constructed a 1-bit binary adder using relays. This was, therefore, one of the first computers based on binary computation, although it was only a demonstration. In the 1940s, improvements to the device led to the Complex Number Calculator, a more capable machine that, as the name suggests, could work with complex numbers.
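The logic of a 1-bit binary adder is small enough to write down. Here is a sketch in Python of the standard half adder and full adder; Stibitz built the equivalent logic out of relays, but the function names and decomposition here are the textbook presentation, not a description of his actual circuit.

```python
def xor(a, b):
    """Exclusive OR: true when exactly one input is true."""
    return (a or b) and not (a and b)

def half_adder(a, b):
    """Adds two bits: the sum bit is XOR, the carry bit is AND."""
    return int(xor(a, b)), int(a and b)

def full_adder(a, b, carry_in):
    """Chains two half adders so a carry from a lower bit can be added too."""
    s1, c1 = half_adder(a, b)
    s2, c2 = half_adder(s1, carry_in)
    return s2, int(c1 or c2)

half_adder(1, 1)     # -> (0, 1): 1 + 1 = 10 in binary
full_adder(1, 1, 1)  # -> (1, 1): 1 + 1 + 1 = 11 in binary
```

Chain one full adder per bit, feeding each carry into the next, and you have a multi-bit binary adder: the same idea scaled up is still how arithmetic units add today.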
Symbolic Logic
Most of the key innovations for stepping into the electronic era of computing were under way, but one basic element was missing: the language of digital electronics. It was in 1938 that a graduate student at MIT, Claude Shannon, combined the binary system (first created by Leibniz) with Boolean algebra and published a paper on the implementation of symbolic logic using relays.
Shannon showed how Boole’s concepts of TRUE and FALSE could be used to represent the functions of switches in electronic circuits, with the binary system supplying the representation: TRUE and FALSE become 1s and 0s.
Binary logic and Boolean algebra separately had very limited uses, but when combined, they are of tremendous help.
Let's see an example to clear things up. Suppose you were to assign symbols to facts: say, you assign "A" to "It is a dog", and "B" to "It is a cat". Then, both A and B being true would be "A ^ B"; either A or B being true would be "A v B". This is Boolean algebra. In this particular case, "A ^ B" is false (since it can't be both a dog and a cat at the same time), and the latter is true; this true-or-false business is binary logic. (So if we combine binary logic with Boolean algebra, we have a "killer application", as we call it these days!) This means that machines can now act on information.
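Shannon's insight, stated in circuit terms, was that switches wired in series behave like AND and switches in parallel behave like OR. A small sketch in Python (the function names are mine, chosen to mirror the circuit picture):

```python
def series(switch1, switch2):
    """Two switches in series conduct only if both are closed: AND."""
    return switch1 & switch2

def parallel(switch1, switch2):
    """Two switches in parallel conduct if either is closed: OR."""
    return switch1 | switch2

# The dog/cat example above, with 1 for true and 0 for false:
A, B = 1, 0      # A: "it is a dog" (true), B: "it is a cat" (false)
series(A, B)     # -> 0 : "A ^ B" is false
parallel(A, B)   # -> 1 : "A v B" is true
```

That correspondence, logical operation on one side, physical circuit on the other, is exactly what let engineers design circuits by manipulating symbols.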
Thus Shannon had provided electronics engineers with the mathematical tool they needed to design digital electronic circuits, and these techniques remain the cornerstone of digital electronic design even today.
The ABC
Wait, we're missing something important here. Didn't we talk about vacuum tubes? What happened to those? Well, in 1939, Dr John Vincent Atanasoff and a graduate student, Clifford Berry, designed a prototype 16-bit adder. This was the first machine to calculate using vacuum tubes. By the summer of 1941, Atanasoff and Berry had completed a special-purpose calculator for solving simultaneous linear equations; this was later called the ABC (Atanasoff-Berry Computer). The clock speed was 60 Hz (whoa! compare that to the latest PC), and an addition took one second. For secondary memory, it used punch cards.
Then, amid the turmoil of the world war, came the creation of Dr Zuse's Z3 computer, designed and built from 1938 to 1941. It is said to have been the first automatic, program-controlled, fully functional, general-purpose digital computer. In simple words, the first computer.
Harvard Mark I and The Colossus
In 1943, the Harvard Mark I, the Harvard-IBM Automatic Sequence Controlled Calculator, was built at Harvard University by Howard Aiken and his team. The project was partly financed by IBM. (It weighed a whopping 5 tonnes!) Codes are used in war, and in this world war, it was about electronic ciphers. In the same year, Max Newman, C E Wynn-Williams, and their team, which included Alan Turing, completed the 'Heath Robinson'. It read data optically at 2,000 characters per second from two closed loops of paper tape, each about a thousand characters long. As an advancement, the Colossus, another, more powerful cipher-breaking computer, was built by Dr Thomas Flowers. It processed 5,000 characters per second (compare with the 2,000 above), and used punched tape for input.
The ENIAC
The all-famous ENIAC (Electronic Numerical Integrator and Computer) is claimed to be one of the first totally-electronic, valve-driven, digital computers. Development finished in 1946 at the Ballistic Research Laboratory. It was designed by J Presper Eckert and John Mauchly. Yet another first: this one was widely recognised as the first universal electronic computer. It weighed 30 tonnes, and could handle 100,000 calculations per second. It was not used for entirely favourable purposes: those numerous calculations were actually for bomb trajectories and for testing hydrogen bomb theories.
These were some of the important contributions (though there are many others) that formed the base ideas and developments for the modern computer. This middle age of computing development saw mostly concepts being developed that would pave the way to the era of electronic computers, or, as we call them today, the modern PCs.