Global IT Outlook

Information technology is a field of knowledge concerned with the processing and use of data. The definition is quite broad and covers both technical disciplines, such as computer engineering, and purely theoretical ones, such as information theory. IT is inextricably linked to computers: in English-speaking countries the term is often used interchangeably with computer science, the “science of computing”.

This may sound like a strange question in 2019, but does IT still have prospects today? The question is worth considering from several angles – from how the technology developed and spread in the recent past to the crowding of the labor market and the breakthrough projects expected in the future.

Emergence of IT

Like much of what was interesting in the twentieth century, computers appeared to solve purely practical, military problems. Creating supersonic aircraft, missiles and nuclear weapons required too much complex mathematics and calculation. One could say the first computer became a symbol of human genius and laziness at the same time – it was built because scientists were tired of doing enormous amounts of arithmetic by hand.

Although Babbage's difference engine, a working model of which he assembled in 1822, is considered the forerunner of computers, the first prototype of modern machines was created at Harvard by the mathematician and engineer Howard Aiken. The machine was electromechanical, consisted of about 765 thousand parts, weighed 4.5 tons and took up a considerable amount of space.

Then the race between the great powers began, for which computers were just another, though very important, tool. In 1947 the race was spurred on by the invention of the transistor, after which computers began to be built on semiconductors, eventually shrinking complex mechanics down to integrated circuits. Another breakthrough came in 1976, when Apple released the Apple I, one of the first personal computers.

In the early years of its existence, the PC was an amusing toy with no truly important tasks for the user. But the competition between Apple and IBM, as well as the work of enthusiastic scientists, made it possible to use the PC not only as a replacement for the typewriter, but also to play games, listen to music and write general-purpose programs.

There are two more important milestones in the history of computer technology. In 1969, ARPANET, the forerunner of the modern Internet, appeared. In the mid-1980s the technology moved out of direct military control: the U.S. National Science Foundation created the NSFNET network, which anyone could connect to. Over time the network was transferred to private hands and gradually turned into the Internet we know today.

The latest breakthrough was the popularization of smartphones and other mobile and wearable electronics, again driven largely by Apple. Today people access the Internet from mobile devices far more often than from desktop computers, and this trend keeps growing.

The future of information technology

The development of IT has moved from complex to simple:

  • first, huge machines built to calculate rocket trajectories;
  • then personal computers for scientists and engineers;
  • at the turn of the millennium, a global network for ordinary people;
  • finally, mobile devices with touch interfaces that even a child can use.

The trend towards simplification is also visible in programming: programs used to be written in low-level languages, or even literally in machine code, to save memory and processor resources. Writing programs is now much easier thanks to abstractions, and practically any schoolchild with a PC at home can try it.
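As a minimal sketch of that abstraction gap, here is the same trivial task written twice in Python: once in the one-line, high-level style modern languages make possible, and once spelled out step by step, closer to how low-level code has to express it. The example is illustrative only, not tied to any particular historical machine.

```python
# Summing a million numbers: the task that once required managing
# registers and memory by hand fits in a single high-level line today.

numbers = range(1_000_000)

# High-level: the language hides the loop, the accumulator and the memory layout.
total = sum(numbers)

# The same logic spelled out explicitly, closer to the low-level style:
# an explicit accumulator and an explicit loop over each element.
total_manual = 0
for n in numbers:
    total_manual += n

assert total == total_manual
print(total)
```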

The next step in the development of IT will be the so-called “Internet of Things”. This direction leads to genuinely smart devices working together: for example, a coffee maker and a fitness bracelet could agree on what kind of coffee to brew today based on your health readings, the audio system will pick suitable music on its own, and the smartphone will automatically warn (or not warn) your boss that you overslept. A toy example of such a rule follows below.
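The following sketch shows the kind of simple rule such device coordination could rest on. The device names, readings and thresholds are entirely hypothetical and invented for illustration; no real device API is implied.

```python
# Hypothetical illustration of "smart" devices agreeing on a decision:
# a fitness bracelet reports overnight readings, the coffee maker picks a drink.

def choose_coffee(sleep_hours: float, resting_heart_rate: int) -> str:
    """Pick a coffee strength from the bracelet's overnight readings."""
    if resting_heart_rate >= 90:
        return "decaf"             # elevated pulse: skip the caffeine
    if sleep_hours < 6:
        return "double espresso"   # short night, heart rate is fine
    return "regular americano"

# Example "message" a fitness bracelet might publish to the home hub.
bracelet_report = {"sleep_hours": 5.5, "resting_heart_rate": 72}

print(choose_coffee(**bracelet_report))  # -> double espresso
```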

At the heart of such a system will be virtual or artificial intelligence. The rudiments of virtual assistants and self-learning systems already exist: voice assistants such as Siri and Alice are available to everyone. Whether they will grow into real AI, time will tell.

Computer hardware also continues to develop, although the miniaturization of transistors is nearly exhausted. There are many interesting directions in this area, such as quantum computers and ternary computing.

Today special attention is paid to ASICs, application-specific integrated circuits tailored to a single task. They can speed up work by orders of magnitude: specialized chips, for example, run neural-network workloads much faster than general-purpose processors.

IT has become a kind of cornucopia of the 21st century.

Technology has made it possible to combine in one small machine almost everything humanity has to offer:

  • entertainment (games, video content, distribution of music and books);
  • social interaction (social networks, messengers, specialized websites);
  • learning (from simple videos on YouTube to hours of free lectures from major universities);
  • data processing (editing sound and video, building websites, performing complex calculations);
  • searching for and transferring information.

Working with information technology will help a child not only find a career path, but also get their bearings in the modern world much more quickly. Knowing how to search effectively will help them finish their homework faster, and skills in editing photos, video or music can help them become more popular among their peers.

Even the simplest IT skills make it possible to find part-time work online through one of the freelance marketplaces, or to build your own website and work for yourself. Today computer skills are just as important as being able to drive a car.
