1.15 Oh hai, computer revolution

Alan Turing, codebreaker

‘Necessity is the mother of invention’, it has been said, and the Second World War was the catalyst for something equally seismic: the age of data, together with our growing dependency on machines and artificial intelligence.

It was the high stakes of war, and the volatility of shifting power balances, that lay behind the invention of mechanical analogue computers, originally developed in Britain for specialised military applications.

Alan Turing, whom we now know as one of the founders of computer science, had set down the mathematical basis of universal computation, work he put to use in the pursuit of cracking encrypted enemy intelligence.

It’s karmic poetry, perhaps, that the biggest conflict in world history also contained the seeds of the computing revolution. Code propelled a new sophistication in information technology, alongside public information management and propaganda.

Code-breaking and public information management kick-started the computing era and modern marketing communications. The Second World War itself may have ended in 1945, but seventy years on its effects still ricochet around us and inform a great deal of what we do today.

Perhaps it’s worth reflecting on how conflict from another century continues to have such an immense impact on human destiny for the times to come.

Babbage’s Difference Engine No 2, 1847-1849

Charles Babbage had originally developed the concept of a programmable computer, and he set out his thinking on machinery in his 1832 book, ‘On the Economy of Machinery and Manufactures’. This, in turn, built upon the economic ideas about creating wealth, efficiency and the division of labour in Adam Smith’s ‘The Wealth of Nations’, published some fifty years before it.

In ‘On the Economy of Machinery and Manufactures’, Babbage described what is now called the ‘Babbage principle’. It concerned the commercial advantages made available through the division of labour using machinery.

Augusta Ada King, Countess of Lovelace, now known as Ada Lovelace, was an English mathematician and writer who worked with Babbage on his proposed ‘Analytical Engine’; her notes on it include what is recognised today as the first algorithm intended to be carried out by a machine.

Ada Lovelace

Megan Smith, the U.S. Chief Technology Officer, has said of Ada Lovelace: ‘She wrote some code, just on the paper itself, and she’s really somebody we point to in our heritage as our founder of computer science’.

The DNA of the web, which came from Charles Babbage and Ada Lovelace’s earliest ideas of computing, as well as the work of many others, in turn led to what is now a key characteristic of the web: its granular nature. The idea of the division of labour carried out by machines lives on in the unique identifiers, and the bits and bytes, that define all digital content today.

Getting the intel, inside


As consumer marketing and economic intelligence were taking off during the post-war years, technology was moving on towards ARPANET and email.

ARPANET logical map, March 1977

While the digital era might have been embryonic at this stage, with the development of server technology and networked data transfer it was nevertheless unequivocally under way.

Before ‘www.’ became the norm, networked technology was all about IP addresses: numeric series such as 145.56.78.91 through which individual terminals could be identified. As networked communication and data transfer grew, and the numbers of computers and people using them increased, a new approach to managing digital networks was needed that everyone could understand. This led to the introduction of the Domain Name System, or DNS, in 1983, and that system remains in place today.
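To make that shift concrete, here is a minimal sketch in Python (the domain name is just an illustrative placeholder) showing what DNS does in practice: it translates a human-readable name into the numeric IP address the network actually uses.

    import socket

    # A human-readable domain name (illustrative placeholder)
    hostname = "example.com"

    # DNS resolution: the system resolver translates the name into
    # the numeric IPv4 address that identifies the host on the network
    ip_address = socket.gethostbyname(hostname)

    print(hostname, "resolves to", ip_address)

The output is the same kind of numeric address that, before DNS, people would have had to keep track of by hand.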

The original companies of the personal computing revolution, Apple and Microsoft, had been founded in the mid-1970s. The breakthrough, though, and Apple’s first ‘hero’ product, came with the Macintosh personal computer in early 1984. It introduced the concept of ‘user-friendliness’ into everyday language, along with unparalleled new opportunities for human expression on screen.


Now, anyone with MacPaint on their Apple Macintosh desktop could pick up the ‘spray can’ tool and draw images digitally, at the flick of a wrist.

From cave paintings to screen paintings, humankind was now beginning its digital existence, and it hinted at how much previously unimaginable human creativity might be on offer.

Cave painting

Technology was dangling an enormous promise in front of humankind, a promise of personal freedom and individual agency that could come through interactivity with a screen.
