Why we should thank the Victorians for our PCs

16th Jan 2010 | 10:00

Astonishing inventions that made the IT revolution possible

The Analytical Engine and Ada Lovelace

Think 'IT revolution' and you're thinking of the second half of the 20th century – right?

From the first stored-program computer in 1948 to the blossoming internet of the late '90s, it seems obvious that it's the most recent half-century that transformed computing from an expensive curiosity for the few to a life-changing experience for the many.

So you might be surprised to hear that much of the pioneering work that made all this possible was carried out during the reign of Queen Victoria.

Boolean logic, programming languages, data transmission, radio communication, universal computation, data compression – these building blocks of the IT revolution have their roots in Victorian society.

The Analytical Engine

Given that he had a track record of never completing any of his inventions, the British mathematician Charles Babbage provides an unlikely starting point for our investigation into Victorian computer pioneers.

Babbage's first foray into computation involved the design of the so-called Difference Engine, which was intended to calculate polynomial functions for navigation and artillery applications. It was never finished in Babbage's lifetime, but the successful creation of a machine built to his original plans by the London Science Museum in 1991 vindicated the design.

Impressive as it may be for a machine weighing almost five tons and comprising 8,000 parts to work without a glitch, it's Babbage's second contrivance, the Analytical Engine, that really makes things interesting.

While the Difference Engine was dedicated to one type of calculation, the Analytical Engine was designed to be universal, just like today's computers. Except for the fact that it relied on mechanics rather than electronics, the similarities are striking for something conceived of in 1837, 111 years before the first electronic stored-program computer.

Like today's PCs, the Analytical Engine used a sequence of instructions to process data. Both the program and the data were input using punch cards similar to those used at the time to control looms in textile mills (and used in mainframe computers until the 1970s). Results could be output to a printer, a graph plotter or more punch cards so that they could be fed back into the Engine.

The Analytical Engine

In a direct parallel with modern computers, it had a memory that Babbage called the 'store', which had a capacity of 1,000 50-digit decimal numbers. It also had an arithmetic unit that he called the 'mill', which was capable of addition, subtraction, multiplication, division and comparison.

It was also capable of looping and conditional branching, although it seems probable that the importance of this hadn't been fully appreciated by Babbage until he made the acquaintance of Ada Lovelace, as we'll see shortly. Fascinating as the similarities with today's technology are, the differences also make interesting reading.

The Analytical Engine was to have had a steam engine as its power source. It would have carried out additions and subtractions in about a second, but could have taken up to a minute to perform division and multiplication. Speedy it certainly wasn't.

Ada Lovelace's computer program

Augusta Ada, Countess of Lovelace and daughter of the poet Lord Byron, didn't fit into the mould of Victorian society. Instead of excelling in needlework, embroidery and entertaining on the pianoforte, Ada's skills were in the realm of science and mathematics.

She was introduced to Charles Babbage at a dinner party in 1833 and they corresponded for several years, discussing first the Difference and later the Analytical Engine.

In 1842, the Italian mathematician Luigi Menabrea, whom Babbage had met two years earlier, wrote a paper entitled A Sketch of the Analytical Engine Invented by Charles Babbage. Ada Lovelace translated the article into English and, at Babbage's request, expanded it with very extensive notes of her own. Much impressed with her understanding of his creation, Babbage referred to her as the 'Enchantress of Numbers'.

But Countess Lovelace's greatest contribution to the science of computation was an example that she provided in her notes to Menabrea's article of how the Analytical Engine could be used to calculate Bernoulli numbers.

Unless you're a mathematician you probably won't be too interested in exactly what they are, so let's just say that this sequence of numbers, discovered by the Swiss mathematician Jakob Bernoulli, is of significant interest in number theory. What was of particular interest to Ada Lovelace is that they're notoriously difficult to calculate.

Each successive number requires significantly more calculations than its predecessor, and in fact Bernoulli himself only managed to work out the first 10 of the numbers that bear his name.

Ada's instructions for the Analytical Engine, while not looking like a modern computer program, are considered to be just that – the world's very first example. They contain many of the elements of today's programs, including conditional branches and nested loops, or 'cycle of cycles' as she called them.
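Ada's table of operations was written for the Engine's mill and store rather than in any modern notation, but the flavour of her 'cycle of cycles' can be sketched in Python. The snippet below computes Bernoulli numbers exactly using the standard binomial recurrence – a modern method chosen for brevity, not necessarily the scheme of her Note G:

```python
from fractions import Fraction
from math import comb

def bernoulli(n):
    """Compute Bernoulli numbers B_0..B_n exactly, using the recurrence
    sum_{j=0}^{m} C(m+1, j) * B_j = 0 for m >= 1. Each new number is
    built from all of its predecessors, which is why the work grows
    with every successive term, just as the article describes."""
    B = [Fraction(1)]
    for m in range(1, n + 1):
        # Inner loop over all earlier numbers: a 'cycle of cycles'.
        acc = sum(comb(m + 1, j) * B[j] for j in range(m))
        B.append(Fraction(-acc, m + 1))
    return B

print(bernoulli(8))
```

(This uses the convention in which B_1 = -1/2; exact fractions avoid the rounding errors a mechanical engine would never have suffered from either.)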

Boolean logic, data transmission and radio communication

Boolean logic

Born in Lincoln in 1815, George Boole devised a form of logic that operates on just two values which can alternatively be thought of as true or false, or – more pertinently to our discussion of computing – 1 or 0.

Boolean logic defines various ways in which these values can be manipulated and combined. Examples include the AND function and the OR function, both of which take two inputs and then produce a single output. With the AND function, the output is a 1 only if both the inputs are 1s; whereas the OR function produces an output of 1 if either or both of the inputs are 1s.

The apparent simplicity of these functions doesn't do justice to their power. By combining the simple electronic building blocks that implement these functions (which are referred to as AND gates and OR gates), it's possible to create flip-flops, adders, shift registers and many more of the constituents of a computer that can work on binary numbers. As such, Boole had laid the theoretical foundations for today's computers.
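To make that concrete, here's a minimal sketch of how Boole's two-valued functions build up to arithmetic. Alongside AND and OR it uses an XOR gate (itself composable from AND, OR and NOT), and chains full adders into a simple ripple-carry adder:

```python
# Logic gates as Boolean functions on the values 0 and 1.
AND = lambda a, b: a & b
OR  = lambda a, b: a | b
XOR = lambda a, b: a ^ b

def full_adder(a, b, carry_in):
    """Add three bits using only AND, OR and XOR gates."""
    partial = XOR(a, b)
    total = XOR(partial, carry_in)
    carry_out = OR(AND(a, b), AND(partial, carry_in))
    return total, carry_out

def add_binary(x, y, width=8):
    """Ripple-carry adder: one full adder per bit, carry passed along."""
    carry, result = 0, 0
    for i in range(width):
        bit, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        result |= bit << i
    return result

print(add_binary(22, 20))  # 42
```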

Of course, it's a perfectly valid question to ask what would have been wrong with computing with decimal numbers, as Babbage's Analytical Engine was designed to do. A look at one of the earliest electronic computers, the University of Pennsylvania's ENIAC, provides just a glimpse of the advantages offered by moving to binary.

ENIAC handled decimal numbers, storing each digit in an electronic circuit called a ring counter that contained 36 valves. As 10 of these ring counters constituted a register, it took 360 valves to store a number in the range -9,999,999,999 to +9,999,999,999. By way of contrast, a 32-bit binary register can store numbers in the range -2,147,483,648 to +2,147,483,647.

Using similar electronic circuits to those used in ENIAC, a 32-bit register would have required just 64 valves (or 70 valves for the 35 bits needed to match ENIAC's ten-digit range) – in today's terms, 64 transistors.
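Taking the article's figures at face value – 36 valves per decimal ring counter, roughly two valves per binary flip-flop – a quick back-of-the-envelope calculation shows the economy of binary:

```python
# Hardware cost of decimal vs binary storage, using the figures above.
def decimal_register(digits, valves_per_digit=36):
    """Valve count and number of representable values for a signed
    decimal register of the given width."""
    values = 2 * (10 ** digits - 1) + 1   # -(10^d - 1) .. +(10^d - 1)
    return digits * valves_per_digit, values

def binary_register(bits, valves_per_bit=2):
    """Same figures for a binary register built from flip-flops."""
    return bits * valves_per_bit, 2 ** bits

for label, (valves, values) in [
    ("ENIAC-style, 10 digits", decimal_register(10)),
    ("binary, 32 bits", binary_register(32)),
]:
    print(f"{label}: {valves} valves, ~{values:.3g} distinct values")
```

The decimal register spans a slightly wider range, but at more than five times the valve count – and valves were bulky, hot and failure-prone.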

Data transmission

Today, data processing goes hand-in-hand with data communication, but to see the first developments in this technology we need to cross the Atlantic. Predating the telephone by more than 30 years, the telegraph is often considered the poor relation.

This undervalues the pioneering work of Samuel Morse, who first demonstrated the code that bears his name back in 1844. Morse Code uses short and long signals (known colloquially as dots and dashes) interspaced with gaps of varying lengths to represent letters, numbers and a range of symbols.

Morse code

It's really not too different from ASCII (American Standard Code for Information Interchange), the character code still used in present-day data transmission. Morse Code was designed to be sent by hand using a finely balanced switch called a Morse key and received by ear or as marks on a paper strip, but it's also possible to use it for automatic data transmission by computer.
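A toy illustration of machine-sent Morse – the table here is only a partial alphabet, chosen to cover the letters discussed in this article:

```python
# Partial Morse Code table (the full code covers letters, digits
# and punctuation).
MORSE = {
    'E': '.',    'T': '-',    'A': '.-',   'S': '...',
    'O': '---',  'H': '....', 'Q': '--.-', 'Z': '--..',
}

def encode(text):
    """Letters become dot/dash groups separated by spaces; words are
    separated by ' / ', mimicking the longer inter-word gap."""
    return ' / '.join(
        ' '.join(MORSE[c] for c in word if c in MORSE)
        for word in text.upper().split()
    )

print(encode('SOS'))  # ... --- ...
```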

In providing a means of automatic data transmission, it achieved something that voice communication has only managed in recent times – and then imperfectly. It's commonly assumed that Morse Code was designed arbitrarily, and that it was just by chance that the code for E, for example, is 'dot' whereas that for J is 'dot dash dash dash'. This does Samuel Morse a great disservice, as we'll see if we fast-forward over a hundred years to 1952.

One of the first methods of data compression – and now a widespread and essential technology in areas as diverse as data communications, photography and music reproduction – was Huffman Encoding. This analyses the data stream to determine how frequently each character occurs and then assigns codes of variable length, with shorter codes going to the most commonly encountered characters.

Although infrequently encountered letters end up being represented by longer codes than in non-compressed text, the short codes used for the common letters more than compensate, so the compressed text can be as little as half the size of the original.
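The idea can be sketched compactly using nothing beyond Python's standard library: count character frequencies, repeatedly merge the two rarest subtrees, then read each character's code off its path from the root:

```python
import heapq
from collections import Counter

def huffman_codes(text):
    """Build a Huffman code for the given text. Rare characters end up
    deeper in the tree (longer codes); common ones sit near the root."""
    freq = Counter(text)
    # Heap entries are (count, unique tiebreak, tree); a tree is either
    # a single character or a (left, right) pair of subtrees.
    heap = [(n, i, ch) for i, (ch, n) in enumerate(freq.items())]
    heapq.heapify(heap)
    i = len(heap)
    while len(heap) > 1:
        n1, _, t1 = heapq.heappop(heap)   # two least frequent subtrees
        n2, _, t2 = heapq.heappop(heap)
        heapq.heappush(heap, (n1 + n2, i, (t1, t2)))
        i += 1
    codes = {}
    def walk(tree, prefix):
        if isinstance(tree, tuple):
            walk(tree[0], prefix + '0')
            walk(tree[1], prefix + '1')
        else:
            codes[tree] = prefix or '0'
    walk(heap[0][2], '')
    return codes

text = "this is an example of huffman encoding"
codes = huffman_codes(text)
bits = sum(len(codes[c]) for c in text)
print(f"{bits} bits vs {8 * len(text)} for 8-bit ASCII")
```

Because no code is a prefix of any other, the compressed bitstream can be decoded unambiguously without separator characters.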

Going back to Morse Code, the two most common letters in the English language, E and T, are represented by 'dot' and 'dash' whereas the two least common ones, Q and Z, are represented by 'dash dash dot dash' and 'dash dash dot dot'.

So not only is Morse Code the world's first system for data transmission, it also has a built-in method of data compression.

Radio communication

Moving on from Morse's telegraph lines to the wireless data transmission that's so familiar to us today, we need to return to this side of the Atlantic.

Born in Bologna, Italy, Guglielmo Marconi moved to England when the Italian government failed to invest in his work, which involved experimenting with the electromagnetic waves whose existence had first been demonstrated by Heinrich Hertz.

But while the German physicist had a theoretical interest in what would eventually be called radio, Marconi's interest was much more practical in nature. The history of Marconi's development of the wireless telegraph was one of increasing the transmission range step by step.

In 1897, in a demonstration to the British government, he transmitted a signal over a distance of 6km on Salisbury Plain. Later in the same year he demonstrated that radio waves could travel over the sea, first at a range of 6km and then over 19km.

During 1899 Marconi first achieved communication between Britain and France, and later equipped three ships of the Royal Navy with radio equipment allowing them to communicate over a distance of 137km.

Marconi's equipment might have transmitted and received radio signals, but the hardware bore no resemblance to the equipment that does the same job today. With no electronic devices such as valves or transistors, Marconi resorted to the brute-force approach.

To generate the signal he used a spark transmitter that applied a high voltage across an air gap to produce a lightning-like spark. The result was broadband electromagnetic radiation from ultraviolet through visible and into the radio spectrum.

His receiver used a coherer, a glass envelope containing iron filings that coalesced under the influence of radio waves thereby allowing an electrical current to flow. The result of using such crude equipment was that the transmitter had to use very high power levels and the antennae were huge.

Cornwall antennae

The station in Cornwall that Marconi used to achieve the first ever trans-Atlantic transmission demonstrated this. The transmitter consumed 25kW – several thousand times more than a mobile phone – and the antenna was supported by four 66m masts. Crude it might have been, but it worked – and the world suddenly became a lot smaller.

The missing link

Many of the building blocks of the modern world might have come into place during the reign of Queen Victoria, but that doesn't necessarily mean that the information age could have been born back in the 19th century.

Babbage was a forward thinker, but the technology of his time meant the Analytical Engine could only ever have calculated at a snail's pace. His creation couldn't possibly have come close to what we expect of today's computers.

The missing link – the one that did indeed act as the catalyst for the digital age half a century later – only saw the light of day after Queen Victoria's death. In 1904, the English physicist John Ambrose Fleming discovered that a glass envelope from which the air had been extracted would permit the flow of electricity between a heated cathode and an anode in one direction only. Today we'd call his creation a diode valve.

Three years later, the American inventor Lee de Forest discovered that by placing a third electrode – a grid of wire – between the anode and the cathode, the current flowing between them could be controlled by the application of a small voltage. This contributed significantly to radio communication. In allowing a voltage in one circuit to control what was happening in another, it was also the first electronic switch.

The triode – as de Forest's invention was called – could form the basis of Boolean logic gates and hence of an electronic computer. So much of what was needed for a computing revolution was therefore in place, but for the lack of that final piece of the jigsaw.

As it happened, we had to wait until the late 1940s for progress to continue.


First published in PC Plus Issue 289
