The History of Computers
From Behemoth Calculators to Embedded Systems
April 9, 2017
These days embedded systems and devices are an ever-growing part of the life around us: everything is getting “smarter”, more autonomous and, of course, connected. It’s amazing to think that just 50 years ago computers were built in dedicated halls spanning hundreds of square feet, yet could only process basic input to calculate simple equations.
Let’s dive into one of the most breathtaking technological leaps in human history – a true revolution that has defined the modern age more than anything else, shaping how we think and the society around us.
An Analog World
Long before a computer could fit in the palm of your hand, the first computers needed massive rooms to house all their components and had unimaginable power requirements, yet they could only calculate basic equations from simple data.
But let’s start from the beginning.
A computer is any device that can be programmed to carry out a set of calculations. In that sense the first computers were simple aids to mathematics and navigation, such as the Antikythera mechanism from around 200 BC – an astronomical calculator considered the first analog computer in history.
Tides and Torpedoes
The history of the modern computer starts with the development of the first analog computers during the 19th century. One such advanced device, simply named the Tide Predicting Machine, was developed by the British Sir William Thomson in 1872, designed to accurately predict tide heights and irregularities months in advance. So successful was the concept that later devices based on Thomson’s work assisted navies through the First and Second World Wars, and on well into the 60’s and 70’s.
One of the last important advances in analog computer development came with the design in 1928 of the Differential Analyzer by Vannevar Bush at MIT. This computer solved differential equations by integration, using a wheel-and-disc mechanism. By 1938 mechanical computers had evolved into electromechanical analog computers with the development of the Torpedo Data Computer, which calculated torpedo trajectory based on the submarine’s speed and course information, the enemy ship’s estimated course and speed, and the type of torpedo and its ballistic characteristics.
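The integration the Differential Analyzer performed mechanically can be sketched in software. This is only an illustrative analogue, not the machine's actual method: a simple Euler step accumulates small increments the way the rotating disc accumulated angle.

```python
# Software analogue of the Differential Analyzer's wheel-and-disc
# integrator: solve a differential equation by accumulating many
# small increments dy = f(t, y) * dt.

def integrate(dy_dt, y0, t_end, dt=0.001):
    """Step from t=0 to t_end, adding up tiny contributions of the
    derivative -- numerical integration in its simplest form."""
    t, y = 0.0, y0
    while t < t_end:
        y += dy_dt(t, y) * dt
        t += dt
    return y

# dy/dt = -y with y(0) = 1 has the exact solution e^-t,
# so at t = 1 the result should land close to 1/e.
approx = integrate(lambda t, y: -y, y0=1.0, t_end=1.0)
print(approx)  # close to 1/e ≈ 0.368
```

The mechanical version did the same accumulation continuously, with the disc's rotation standing in for the running sum.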
The TDC was extremely advanced for its time, but was still essentially an analog computer making calculations based on simple data. It would take a world war to rocket computer development into the digital age, where computers could truly be programmable and offer much more than simple calculation solutions.
The Computer Wars
If you’re imagining large computers fighting it out on a muddy battlefield, you’re not that far off. Let’s look at the role of computers in the world wars, and how they progressed to keep up with the arms race that dominated the world from the Second World War in the 40’s through to the early phase of the Cold War in the 70’s.
Cracking the Code
The outset of global conflict in 1939 put computer technology development into overdrive, as each nation tried to utilize computers for increasingly complex calculations – just to stay one step ahead of the enemy. Possibly the best and most famous example of wartime computer development was the race to crack the German “Enigma” code.
Through the groundbreaking work of Alan Turing, the first machine that could crack the German code was developed in 1939. This electromechanical device, named the “Bombe”, essentially worked through the possible rotor settings of the Enigma, discarding those inconsistent with a known fragment of plaintext.
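The Bombe's core strategy – exhaustively trying settings and keeping only those consistent with a known plaintext fragment (a "crib") – can be sketched with a toy cipher. The real Enigma was vastly more complex than the Caesar shift used here; this only illustrates the search idea.

```python
# Crib-based exhaustive key search on a toy Caesar cipher
# (illustrative only -- the Bombe attacked the far richer
# rotor-based Enigma, not a simple shift).

def caesar(text, shift):
    """Shift each uppercase letter by `shift` positions, wrapping A-Z."""
    return "".join(chr((ord(c) - 65 + shift) % 26 + 65) for c in text)

def crack(ciphertext, crib):
    """Try every possible key; keep keys whose decryption contains the crib."""
    return [s for s in range(26) if crib in caesar(ciphertext, -s)]

# "WETTERBERICHT" (weather report) was a classic real-world crib.
ciphertext = caesar("WETTERBERICHT", 7)
print(crack(ciphertext, "WETTER"))  # → [7]
```

Against 26 keys this is trivial; against the Enigma's astronomical setting space it demanded a purpose-built machine.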
In May 1941 the German engineer Konrad Zuse completed the Z3 computer, also electromechanical by design but the first fully automatic, programmable digital computer – earning Zuse the title of father of the modern computer. The Z3 was used in aircraft research and development, while during the same period the Germans made the Enigma even harder to crack, with some 1.8×10²⁰ possible settings, changed daily.
German advances in cipher technology called for the next step in the British deciphering program, this time against the even tougher Lorenz teleprinter cipher. It came with the development of “Colossus” in December 1943 by T. H. Flowers and his team: the world’s first programmable, fully electronic digital computer.
Fueled by wartime needs, computers had evolved in the space of under 20 years from mechanical analog machines to digital electric devices, fully programmable and much more flexible in their abilities. By the end of the war, the need to break the enemy’s code or calculate weapon ballistics was replaced by a new age of computers – built to better mankind.
Up until the early 1970’s, computers were still far from something we would recognize: they were enormous, highly complex, expensive, power-consuming monsters. Because of this, computers were confined to universities, research institutes and military installations – anywhere with the resources, space and staff to operate them.
Making the Computer Personal
The dawn of the personal computer was much more than a technological leap forward; it was the beginning of a massive revolution, changing how we think, work and communicate. All of this didn’t happen overnight though, so here’s how the desk-dominating plastic box came into our lives.
But how did we even get to the PC?
The Dream to Fit on Every Desk
The personal computer is a device that is relatively compact (fitting in an office or home), affordable to consumers, designed for simple use and powerful enough to meet everyday needs. The first PCs were introduced as far back as the 1950’s, but it would take a further 30 years for the PC to become a household name.
Arguably the first personal computer was “Simon”. Produced in 1950, it was priced for a wide audience at just $600 and was compact enough to fit on a desk, but offered only a very basic 2-bit architecture. Created for educational purposes, Simon failed to reach the general population due to its low processing abilities and lack of any real usability.
Introduced a few years later, in 1957, the IBM 610 boasted more powerful abilities but carried a much higher price tag of $55,000, which limited its success (only around 180 units were produced). The 610 gave us the first true keyboard control, which would soon become a staple of future computers.
A further development towards the personal computer was the Kenbak-1, released in 1971 for $750. The Kenbak-1 offered 8-bit processing with 256 bytes of memory, and was operated using a series of switches and buttons, which somewhat limited accessibility to the machine. Ultimately, even with its extremely compact size and low price tag, only some 40 devices were ever sold.
1977 was the year of the personal computer. Known as the year of the “trinity”, 1977 saw the introduction of three popular and widespread personal desktop computers: the Apple II, the TRS-80 and the Commodore PET. All three paired a screen with a keyboard, were based on a single board, and were designed to be user friendly and simple to operate. By 1982 both the PET and TRS-80 had been discontinued, having shipped some 1.5 million and 1 million units respectively. The Apple II continued in production right up until 1993, reaching some 4 million computers sold.
As the trinity models were becoming yesterday’s news, IBM launched the IBM Personal Computer in 1981. IBM’s successful model 5150 set the PC industry’s standard from that point on, defining all future machines in terms of IBM compatibility – with Apple the only true competitor to remain incompatible with IBM’s standard.
The model 5150 and its successors became an instant success, with demand quickly exceeding supply. The IBM PC had finally broken through to a widespread audience, offering a powerful, friendly computer at an accessible price – and so defining the PC revolution. IBM’s models were initially based on the Intel 8088 CPU running at 4.77 MHz, replaced by the 6 MHz Intel 286 in 1984.
Alongside IBM’s market lead, the Commodore 64 was introduced around the same time at a more affordable price, going on to sell some 17 million units over its production run (a world record for a single model). In that period it dominated the low-end computer market with a 1 MHz processor and 64 KB of memory.
Personal computers were a far cry from the mainframes before them in processing ability and especially in size: machines that once filled a room and were measured in square feet shrank to a box that fits on a single desk, measuring around 4”x15”x17” – depending on the model, and not including screen and input devices.
Personal computers steadily grew in popularity with ongoing advances in hardware and software, becoming more reliable, powerful and comfortable to use. The PC dominated the western world from the mid-80’s into the 2000’s, reaching 1 billion installed computers in 2008 – twice the number in 2002.
Embedded Systems – Connecting the World
The revolution created by the PC brought computers into our homes and offices. Just as we thought we had reached the zenith of computer technology, with internet access from wireless laptops, the concept of embedded systems set out to connect everything to everything!
So what is an embedded system?
Well, basically an embedded computer system is a computer that operates within a larger system or device. These computers combine considerable processing power with low resource consumption, and most importantly they are designed to be as small as possible. Early examples of embedded systems include missile guidance systems, various space program devices, pocket calculators, and so on. As microcontrollers have become smaller and more powerful, embedded systems have become increasingly widespread, and today can be found almost everywhere: telecommunication network components, portable electronic devices, vehicles, medical devices, even washing machines.
It’s All About the Size
In recent years embedded systems have become a significant part of our lives with the increasing popularity of smart consumer devices such as smartphones, media players, digital cameras, and so on. These devices are extremely easy to use and include all the functions a personal computer could offer just a few years ago – in the palm of your hand, anywhere and anytime.
Embedded systems based on microcontrollers are fast surpassing “regular” computers in usability and accessibility, making PCs and even laptops redundant for most users. In terms of size it’s amazing to see how far we’ve come: the CuBox-i is a mini-computer measuring just 2”x2”x2” – you could fit about 75 CuBoxes into the case of the original IBM PC, built some three decades earlier, while each packs around 200 times the processing power.
The second area where embedded systems are fast leaping forward has developed mostly behind the scenes. The smartphone is a revolution in usability, but designers have their eyes set on bigger things: the “smart home”, the “smart city” and ultimately a smart, connected world. To connect devices and operations, each device must include an embedded microcontroller with a connection to a network, so that devices can communicate with each other and run operations autonomously. This removes human error, cuts operating costs and greatly optimizes system functionality.
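The device-to-network pattern described above is usually built on publish/subscribe messaging, where devices post readings to topics and other devices react without human intervention. Here is a minimal sketch of that idea; the `Hub` and topic names are illustrative inventions, not any particular IoT framework or protocol.

```python
# Minimal publish/subscribe sketch of embedded devices talking over a
# shared network hub. Real deployments would use a protocol such as
# MQTT; this toy Hub only illustrates the autonomous-reaction pattern.

class Hub:
    """Central point collecting and fanning out device messages."""
    def __init__(self):
        self.subscribers = {}

    def subscribe(self, topic, callback):
        self.subscribers.setdefault(topic, []).append(callback)

    def publish(self, topic, payload):
        for callback in self.subscribers.get(topic, []):
            callback(payload)

hub = Hub()
log = []

# A thermostat reacts to sensor readings with no human input:
# below 18 °C it switches the heating on, otherwise off.
hub.subscribe("home/temperature",
              lambda c: log.append("heat on" if c < 18 else "heat off"))

hub.publish("home/temperature", 15)  # cold reading from a sensor
hub.publish("home/temperature", 21)  # warm reading
print(log)  # → ['heat on', 'heat off']
```

The point of the pattern is that the sensor and the thermostat never reference each other directly; adding another device is just another subscription.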
The idea of connected “things” is known as the Internet of Things (IoT), and in recent years it has become the main focus of the embedded systems industry. Advances in high-speed networks, edge computing, cloud technologies and embedded hardware components have allowed developers to widen the concept to almost every part of our lives.
A Connected Tomorrow
A public transport network can be fully connected and managed from a single point: road and weather conditions transmitted in real time across the whole network, vehicle speeds and directions observed, and commuters at each stop accessing this information as it unfolds. Home appliances can be connected to sensors, with information on their condition and operation processed at a central management point. A user on the way home could check the refrigerator for missing groceries or start the washing machine, while the central heating factors in the temperature and the number of people in the house to keep it perfectly warm in advance.
All these networks, and thousands more, were in the realm of science fiction only a few decades ago, but if there’s anything we’ve learned from the amazing history of computers, it’s that science fiction becomes reality within just a few years. Today embedded computers keep progressing, pushing the boundary between fiction and reality and turning ours into a connected, smart world.
SolidRun is proud to be part of the embedded systems revolution, designing embedded solutions including single board computers (SBCs), system-on-module (SOM) and computer-on-module (COM) products. Based on a variety of processor families, these are the building blocks of any network or embedded system, giving developers a great deal of flexibility and complete modularity – so really any dream can become a reality.
Embedded systems are, so far, the final step in this fantastic journey. They not only represent the pinnacle of computer evolution in size and ability, they embody the notion of the computer as an integral device that’s here to make our lives better, and to progress humanity one step at a time.
Computers have not replaced the need for people to think, as some had feared; in fact they have made our experience of life much richer, connecting each of us to the wealth of knowledge accumulated by humanity – so ultimately, computers have made us smarter.
Enjoyed the story of the computer? Why not give us your thoughts in the comments below.
Also, take a look at what SolidRun has to offer in embedded systems solutions.