We are so used to computers being present in all areas of life that we tend to take them for granted. This is especially true of the generation that has grown up with desktop and laptop technology. However, the history of computers goes back many decades, and the computers we see today are far more advanced than those of even a decade ago. There are five distinct generations of computers, each defined by a major technological development that changed the way computers operated. Each development, in turn, led to less expensive, more compact, more efficient, more powerful and more robust machines.
1940-1956 – First generation – vacuum tubes
The circuitry in these first-generation computers consisted of vacuum tubes, and magnetic drums were used for memory. They were huge, literally taking up the space of a large room, and running them cost a fortune. The vacuum tubes were inefficient: they generated a great deal of heat and consumed large amounts of electricity, which caused frequent breakdowns.
This generation of computers used machine language, the most basic programming language, and could solve only one problem at a time.
1956-1963 – Second generation – transistors
Transistors took the place of vacuum tubes in second-generation computers. The transistor was invented in 1947, but it was not widely used in computers until the late 1950s. It was a major improvement over the vacuum tube, though transistors still generated enough heat to damage the computers. Thanks to transistors, however, computers became smaller, faster, cheaper and consumed less electricity. The language used in these computers evolved from cryptic binary to symbolic languages, which allowed programmers to specify instructions in words.
1964-1971 – Third generation – integrated circuits
Transistors were miniaturized and placed on silicon chips, known as semiconductors, which dramatically increased the speed and efficiency of computers. For the first time, users interacted with these computers through keyboards and monitors that interfaced with an operating system. This allowed the machines to run many applications at the same time, with a central program monitoring memory. As a result of this evolution, computers became cheaper and reached the mass market for the first time in the 1960s.
1972-2010 – Fourth generation – microprocessors
This revolution was led by Intel, which developed the Intel 4004 chip in 1971. This advance made it possible to place all the components of a computer on a single chip. As a result, machines that had once occupied an entire room could now fit in the palm of the hand. In 1981 the first computer designed for home use was introduced, and in 1984 Apple introduced the Macintosh.
2010-present – Fifth generation – artificial intelligence
These devices are still being developed, but some technologies, such as voice recognition, are already in use. Artificial intelligence is made possible with the help of parallel processing and superconductors.
The computer is still evolving, and there is much more to be discovered.