When was the first computer invented, and what did it look like? The electronic computer.

The prototype of the calculator, the adding machine, existed more than 300 years ago. Nowadays, complex mathematical calculations can be done with ease by quietly pressing the keys of a calculator, computer, mobile phone, or smartphone (with the corresponding applications installed). Previously, this procedure took a lot of time and created much inconvenience. Still, the appearance of the first calculating device saved mental labor and pushed progress further. It is therefore interesting to know who invented the adding machine and when it happened.

The appearance of the adding machine

Who invented the adding machine first? It was the German scientist Gottfried Leibniz. The great philosopher and mathematician designed a device consisting of a movable carriage and a stepped roller, and introduced it to the world in 1673.

His ideas were taken up by the French engineer Charles Xavier Thomas de Colmar. He invented a calculating machine that performed all four arithmetic operations. Numbers were set by moving gears along an axis until the required digits appeared in a slot, with each stepped roller corresponding to one digit of a number. The device was driven by turning a hand crank, which in turn moved the gears and toothed rollers and produced the desired result. This was the first adding machine put into mass production.

Device Modifications

The Englishman J. Edmondson was the one who invented the adding machine with a circular mechanism (the carriage moves in a circle). The device was created in 1889 on the basis of Thomas's machine. However, the design saw no significant changes, and the device turned out to be just as bulky and inconvenient as its predecessors. Subsequent analogues of the device suffered from the same flaw.

It is also well known who invented the adding machine with a numeric keypad: the American F. Baldwin. In 1911 he introduced a counting device in which numbers were set in vertical columns, each containing nine digits.

Production of such counting devices in Europe was established by the engineer Carl Lindström, who created a device more compact in size and original in design: the stepped rollers were positioned vertically rather than horizontally and, in addition, were arranged in a checkerboard pattern.

In the Soviet Union, the first adding machine was created in 1935 at the Schetmash plant named after Dzerzhinsky in Moscow. It was a keyboard calculating machine (KSM). Production was later discontinued and resumed only in 1961, in the form of new semi-automatic models.

During these same years, automatic devices such as the VMM-2 and Zoemtron-214 were also created and used in various fields. Their operation was noisy and inconvenient, but at the time they were the only devices that helped cope with large volumes of calculations.

Now these devices are considered rarities; they can be found only as museum exhibits or in the collections of lovers of antique technology. We have examined the question of who invented the adding machine, provided an overview of the device's technical development, and hope this information will be useful to readers.

History of the development of computer technology

The development of computing technology can be broken down into the following periods:

• Manual (VI century BC – XVII century AD)

• Mechanical (XVII century – mid-XX century)

• Electronic (mid-XX century – present)

Although Prometheus in Aeschylus's tragedy states, "Think what I did to mortals: I invented number for them and taught them to combine letters," the concept of number arose long before the advent of writing. People learned to count over many centuries, passing on and enriching their experience from generation to generation.

Counting, or more broadly, calculation, can be carried out in various forms: there is oral, written, and instrumental counting. Instrumental counting aids at different times had different capabilities and were called by different names.

Manual stage (VI century BC - XVII century AD)

The emergence of counting in ancient times: "This was the beginning of beginnings..."

The estimated age of humanity is 3-4 million years. That long ago, man stood upright and picked up a tool he had made himself. However, the ability to count (that is, to break the concepts of "more" and "less" down into a specific number of units) developed in humans much later, about 40-50 thousand years ago (the Late Paleolithic). This stage corresponds to the appearance of modern man (the Cro-Magnon). Thus, one of the main (if not the main) characteristics distinguishing the Cro-Magnon from earlier stages of man is the presence of counting abilities.

It is not difficult to guess that man's first counting device was his fingers.

Fingers turned out to be a great computer. With their help one could count to 5, and with two hands, to 10. And in countries where people walked barefoot, it was easy to count to 20 on fingers and toes. For most people's needs at the time, this was practically enough.

Fingers turned out to be so closely connected with counting that in ancient Greek the concept of "counting" was expressed by a word meaning "to five." And in Russian the word "five" (pyat') resembles "pyast'," the metacarpus, a part of the hand (the word "metacarpus" is rarely used now, but its derivative "wrist" (zapyast'e) is still in common use). The hand, or metacarpus, is a synonym for, and in fact the basis of, the numeral "five" among many peoples. For example, the Malay "LIMA" means both "hand" and "five."

However, peoples are known whose units of counting were not the fingers but their joints.

Having learned to count on their fingers to ten, people took the next step forward and began to count in tens. While some Papuan tribes could count only to six, others could count up to several tens; for this it was necessary to invite many counters at once.

In many languages the words "two" and "ten" are consonant. Perhaps this is explained by the fact that the word "ten" once meant "two hands." Even now there are tribes that say "two hands" instead of "ten" and "hands and feet" instead of "twenty." And in England the first ten numbers were once called by a common name, "fingers." This suggests that the British once counted on their fingers.

Finger counting has been preserved in some places to this day. For example, the historian of mathematics L. Karpinski, in his book "The History of Arithmetic," reports that at the world's largest grain exchange, in Chicago, offers, requests, and prices were announced by brokers on their fingers without a single word being spoken.

Then came counting by moving pebbles, and counting with the help of bead strings... This was a significant breakthrough in human counting abilities: the beginning of the abstraction of number.

The ENIAC computer created by Mauchly and Eckert worked a thousand times faster than the Mark-1. But it turned out that most of the time this computer stood idle, because to set up the calculation method (the program) it was necessary to connect the wires in the required way, which took hours or even days, while the calculation itself might then take only minutes or even seconds.

To simplify and speed up the process of setting up programs, Mauchly and Eckert began designing a new computer that could store a program in its memory. In 1945 the famous mathematician John von Neumann was brought into the work, and he prepared a report on this computer. The report was sent to many scientists and became widely known, because in it von Neumann clearly and simply formulated the general principles of the functioning of computers, that is, of universal computing devices. To this day, the vast majority of computers are built according to the principles von Neumann set out in his 1945 report. The first computer to embody von Neumann's principles was built in 1949 by the English researcher Maurice Wilkes.
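As a rough illustration of the stored-program principle described in that report, here is a toy machine sketched in Python (an illustrative model of my own, not any historical instruction set): the program and the data it manipulates sit in the same memory, and the machine repeatedly fetches, decodes, and executes instructions.

```python
# A toy stored-program machine: program and data share one memory.
# Opcodes (invented for this sketch): 0 = HALT, 1 = PRINT cell, 2 = INC cell.
def run(memory: list[int], pc: int = 0) -> None:
    while True:
        opcode, operand = memory[pc], memory[pc + 1]  # fetch
        pc += 2                                       # advance program counter
        if opcode == 0:                               # HALT
            return
        elif opcode == 1:                             # PRINT memory[operand]
            print(memory[operand])
        elif opcode == 2:                             # INC memory[operand]
            memory[operand] += 1

# Program and data in one memory: increment cell 8, print it, halt.
memory = [2, 8,   # INC cell 8
          1, 8,   # PRINT cell 8
          0, 0,   # HALT
          0, 0,   # (unused)
          41]     # cell 8: data
run(memory)       # prints 42
```

Because the program itself is just numbers in memory, it can be loaded or replaced in seconds, which is exactly what made hours of rewiring unnecessary.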

The development of the first electronic production machine, UNIVAC (Universal Automatic Computer), was begun around 1947 by Eckert and Mauchly, who founded the ECKERT-MAUCHLY company in December of that year. The first model of the machine (UNIVAC-1) was built for the US Census Bureau and put into operation in the spring of 1951. The synchronous, sequential UNIVAC-1 was created on the basis of the ENIAC and EDVAC computers. It operated at a clock frequency of 2.25 MHz and contained about 5,000 vacuum tubes. Its internal storage, with a capacity of 1,000 12-digit decimal numbers, was implemented on 100 mercury delay lines.

Soon after the UNIVAC-1 was put into operation, its developers came up with the idea of automatic programming: the machine itself should be able to prepare the sequence of commands needed to solve a given problem.

A strong limiting factor for computer designers in the early 1950s was the lack of high-speed memory. According to one of the pioneers of computing, J. Eckert, "the architecture of a machine is determined by its memory." Researchers focused their efforts on the storage properties of ferrite rings strung on wire matrices.

In 1951, J. Forrester published an article on the use of magnetic cores for storing digital information. The Whirlwind-1 was the first machine to use magnetic-core memory. It consisted of two cubes of 32 x 32 x 17 cores, providing storage for 2,048 16-bit binary words, each with one parity bit.
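As a quick consistency check of these figures (my own arithmetic, not from the source): each stored word occupies 16 data bits plus one parity bit, so the total number of cores needed is

$$2048 \times (16 + 1) = 34\,816 = 2 \times (32 \times 32 \times 17),$$

exactly the capacity of the two 32 x 32 x 17 cubes described above.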

Soon IBM became involved in the development of electronic computers. In 1952 it released its first industrial electronic computer, the IBM 701, a synchronous parallel machine containing 4,000 vacuum tubes and 12,000 germanium diodes. The improved IBM 704 was distinguished by its high speed; it used index registers and represented data in floating-point form.

IBM 704
After the IBM 704, the IBM 709 was released, which in architectural terms was close to second- and third-generation machines. This machine was the first to use indirect addressing and the first to feature I/O channels.

In 1956, IBM developed floating magnetic heads on an air cushion. Their invention made it possible to create a new type of memory: disk storage devices, whose importance was fully appreciated in the following decades of computing. The first disk storage devices appeared in the IBM 305 RAMAC. This machine had a stack of 50 magnetically coated metal disks that rotated at 12,000 rpm. The surface of each disk contained 100 tracks for recording data, each holding 10,000 characters.

Following the first production computer UNIVAC-1, Remington Rand in 1952 released the UNIVAC-1103, which worked 50 times faster. Later, software interrupts were used for the first time in the UNIVAC-1103.

Remington Rand employees used an algebraic form of writing algorithms called "Short Code" (the first interpreter, created in 1949 by John Mauchly). In addition, it is necessary to mention Grace Hopper, a US Navy officer and head of a programming team, then a captain (later a rear admiral), who developed the first compiler program. The term "compiler" was first introduced by Hopper in 1951. This compiling program translated into machine language an entire program written in an algebraic form convenient for processing. Hopper is also the author of the term "bug" as applied to computers. One day a beetle (in English, a bug) flew into the laboratory through an open window, landed on contacts, and shorted them, causing a serious malfunction of the machine. The burnt beetle was glued into the log where malfunctions were recorded. This is how the first computer bug was documented.

IBM took the first steps in the automation of programming, creating the Speedcoding system for the IBM 701 in 1953. In the USSR, A. A. Lyapunov proposed one of the first programming languages. In 1957 a group led by J. Backus completed work on FORTRAN, the first high-level programming language to become popular. The language, first implemented on the IBM 704, helped expand the range of computer applications.

Alexey Andreevich Lyapunov
In Great Britain in July 1951, at a conference at the University of Manchester, M. Wilkes presented a report, "The Best Way to Design an Automatic Calculating Machine," which became a pioneering work on the fundamentals of microprogramming. The method he proposed for designing control devices found wide application.

M. Wilkes realized his idea of microprogramming in 1957 when creating the EDSAC-2. In 1951, together with D. Wheeler and S. Gill, he wrote the first programming textbook, "The Preparation of Programs for an Electronic Digital Computer."

In 1956, Ferranti released the Pegasus computer, which was the first to implement the concept of general-purpose registers (GPRs). With their advent, the distinction between index registers and accumulators disappeared, and the programmer had at his disposal not one but several accumulator registers.

The advent of personal computers

Microprocessors were first used in various specialized devices, such as calculators. But in 1974 several companies announced the creation of a personal computer based on the Intel-8008 microprocessor, that is, a device performing the same functions as a large computer but designed for a single user. At the beginning of 1975 the first commercially distributed personal computer, the Altair-8800, based on the Intel-8080 microprocessor, appeared. It sold for about $500. And although its capabilities were very limited (the RAM was only 256 bytes, and there was no keyboard or screen), its appearance was greeted with great enthusiasm: several thousand kits were sold in the first months. Buyers equipped the computer with additional devices: a monitor for displaying information, a keyboard, memory expansion units, and so on. Soon these devices began to be produced by other companies. At the end of 1975 Paul Allen and Bill Gates (the future founders of Microsoft) created a BASIC interpreter for the Altair, which allowed users to communicate with the computer easily and write programs for it. This also contributed to the rising popularity of personal computers.

The success of the Altair-8800 forced many companies to start producing personal computers as well. Personal computers began to be sold fully assembled, with a keyboard and monitor; demand for them reached tens and then hundreds of thousands of units per year. Several magazines dedicated to personal computers appeared. Sales growth was greatly helped by the many useful programs of practical significance. Commercially distributed programs also appeared, for example the word processor WordStar and the spreadsheet VisiCalc (1978 and 1979, respectively). These and many other programs made the purchase of personal computers very profitable for business: with their help it became possible to perform accounting calculations, prepare documents, and so on. Using large computers for these purposes was too expensive.

In the late 1970s, the spread of personal computers even led to a slight decline in demand for large computers and minicomputers. This became a matter of serious concern for IBM, the leading manufacturer of large computers, and in 1979 IBM decided to try its hand at the personal computer market. However, the company's management underestimated the future importance of this market and viewed the creation of a personal computer as a minor experiment, something like one of dozens of projects under way at the company to create new equipment. In order not to spend too much money on the experiment, management gave the department responsible for the project a freedom unprecedented in the company. In particular, it was allowed not to design the personal computer from scratch but to use blocks made by other companies. And the department took full advantage of this chance.

The then-latest 16-bit microprocessor, the Intel 8088, was chosen as the computer's main processor. Its use significantly increased the machine's potential, since the new microprocessor could work with 1 megabyte of memory, while all computers available at the time were limited to 64 kilobytes.
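The gap between these two limits follows directly from address width: the Intel 8088 formed 20-bit physical addresses, while the earlier 8-bit machines used 16-bit addresses, and

$$2^{20} = 1\,048\,576 \text{ bytes} = 1\ \text{MB}, \qquad 2^{16} = 65\,536 \text{ bytes} = 64\ \text{KB}.$$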

In August 1981 a new computer called the IBM PC was officially introduced to the public, and soon afterwards it gained great popularity among users. A couple of years later the IBM PC took the leading position in the market, displacing 8-bit computer models.

IBM PC
The secret of the IBM PC's popularity is that IBM did not make its computer a single one-piece device and did not protect its design with patents. Instead, it assembled the computer from independently manufactured parts and did not keep the specifications of those parts, or how they were connected, a secret; the design principles of the IBM PC were available to everyone. This approach, called the open-architecture principle, made the IBM PC a stunning success, although it prevented IBM from keeping the benefits of that success to itself. Here is how the openness of the IBM PC architecture influenced the development of personal computers.

The promise and popularity of the IBM PC made the production of various components and peripherals for it very attractive. Competition among manufacturers led to cheaper components and devices. Very soon, many companies ceased to be content with the role of component manufacturers and began to assemble their own computers compatible with the IBM PC. Since these companies did not have to bear IBM's huge costs for research and for maintaining the structure of a huge corporation, they could sell their computers much cheaper (sometimes 2-3 times) than comparable IBM machines.

Computers compatible with the IBM PC were at first contemptuously called "clones," but the nickname did not stick, as many manufacturers of IBM PC-compatible computers began to implement technical advances faster than IBM itself. Users could independently upgrade their computers and equip them with additional devices from hundreds of different manufacturers.

Personal computers of the future

The basis of the computers of the future will be not silicon transistors, in which information is carried by electrons, but optical systems. The information carrier will be photons, which are lighter and faster than electrons. As a result, the computer will become cheaper and more compact. Most importantly, optoelectronic computation is much faster than today's, so the computer will be far more powerful.

The PC will be small in size yet have the power of today's supercomputers. It will become a repository of information covering all aspects of our everyday life, and it will not be tied to electrical networks. It will be protected from thieves by a biometric scanner that recognizes its owner by fingerprint.

The main way of communicating with the computer will be voice. The desktop computer will turn into a "candy bar," or rather into a giant computer screen: an interactive photonic display. A keyboard will be unnecessary, since all actions can be performed with the touch of a finger. For those who prefer a keyboard, a virtual one can be created on the screen at any time and removed when it is no longer needed.

The computer will become the operating system of the house, and the house will respond to its owner's needs and know his preferences (make coffee at 7 o'clock, play his favorite music, record the desired TV show, adjust temperature and humidity, and so on).

Screen size will no longer play any role in the computers of the future: it can be as big as your desk or small. Larger versions of computer screens will be based on photonically excited liquid crystals with much lower power consumption than today's LCD monitors. Colors will be vibrant and images accurate (plasma displays are possible). In fact, today's concept of "resolution" will largely lose its meaning.

As soon as man discovered the concept of "quantity," he began to select tools that would optimize and facilitate counting. Today's super-powerful computers, based on the principles of mathematical calculation, process, store, and transmit information, the most important resource and engine of human progress. It is not difficult to get an idea of how computer technology developed by briefly considering the main stages of this process.

The main stages of the development of computer technology

The most popular classification identifies the main stages of the development of computer technology on a chronological basis:

  • The manual stage. It began at the dawn of the human era and continued until the middle of the 17th century. During this period the basics of counting emerged. Later, with the formation of positional number systems, devices appeared (the abacus in its various forms, and later the slide rule) that made digit-by-digit calculations possible.
  • The mechanical stage. It began in the middle of the 17th century and lasted almost until the end of the 19th century. The level of science in this period made it possible to create mechanical devices that performed the basic arithmetic operations and automatically carried over the higher digits.
  • The electromechanical stage, the shortest in the history of computing: it lasted only about 60 years, from the invention of the first tabulator in 1887 until 1946, when the first electronic computer (ENIAC) appeared. The new machines, based on electric drives and electric relays, performed calculations with much greater speed and accuracy, but the counting process still had to be controlled by a person.
  • The electronic stage began in the second half of the 20th century and continues today. This is the story of six generations of electronic computers: from the first giant machines built on vacuum tubes to today's ultra-powerful supercomputers with huge numbers of processors working in parallel, capable of executing many commands simultaneously.

The division of the stages of computer technology by chronology is rather arbitrary: while some types of computers were still in use, the prerequisites for the emergence of the next were already being actively created.

The very first counting devices

The earliest counting tool known to the history of computing is the ten fingers of the human hands. Counting results were initially recorded with fingers, with notches on wood and stone, with special sticks, and with knots.

With the advent of writing, various ways of recording numbers appeared, and positional number systems were invented (decimal in India, sexagesimal in Babylon).
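The defining property of a positional system is that a digit's contribution depends on its position. For example, the same value can be written in decimal or in the Babylonian sexagesimal system:

$$345 = 3 \cdot 10^2 + 4 \cdot 10^1 + 5 \cdot 10^0 = 5 \cdot 60^1 + 45 \cdot 60^0.$$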

Around the 4th century BC, the ancient Greeks began to count using the abacus. Initially it was a flat clay tablet with lines drawn on it with a sharp object. Counting was carried out by placing pebbles or other small objects on these lines in a certain order.

In China, the suanpan, a seven-bead abacus, appeared in the 4th century AD. Nine or more wires or cords were stretched across a rectangular wooden frame. Another wire (cord), stretched perpendicular to the others, divided the suanpan into two unequal parts. In the larger compartment, called "earth," five beads were strung on each wire; in the smaller one, called "sky," there were two. Each wire corresponded to one decimal place.
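A small sketch in Python (my own illustration, not from the source) shows how one suanpan rod encodes a decimal digit: each "sky" bead counts as five, each "earth" bead as one.

```python
# One suanpan rod: "sky" beads are worth 5 each, "earth" beads 1 each.
def suanpan_digit(d: int) -> tuple[int, int]:
    """Return (sky_beads, earth_beads) representing the digit d (0-9)."""
    if not 0 <= d <= 9:
        raise ValueError("a single rod encodes the digits 0-9")
    return divmod(d, 5)  # how many fives, then the remainder in ones

# Example: 7 = one sky bead (5) + two earth beads (1 + 1)
print(suanpan_digit(7))  # (1, 2)
```

A number with several digits simply uses one rod per decimal place, which is why the device maps so naturally onto the positional decimal system.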

The traditional soroban abacus became popular in Japan in the 16th century, having arrived there from China. At about the same time, the abacus (schoty) appeared in Russia.

In the 17th century, on the basis of the logarithms discovered by the Scottish mathematician John Napier, the Englishman Edmund Gunter invented the slide rule. Constantly improved, the device has survived to this day. It allows one to multiply and divide numbers, raise them to powers, and determine logarithms and trigonometric functions.
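The slide rule works because its scales are logarithmic: sliding one scale along another adds lengths, and adding logarithms multiplies numbers,

$$\log(ab) = \log a + \log b, \qquad \text{e.g.}\ \log_{10} 2 + \log_{10} 3 \approx 0.301 + 0.477 = 0.778 \approx \log_{10} 6,$$

so aligning the mark for 2 with the mark for 3 lands directly on 6.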

The slide rule became the device that completed the development of computing technology at the manual (pre-mechanical) stage.

The first mechanical calculating devices

In 1623 the German scientist Wilhelm Schickard created the first mechanical "calculator," which he called a calculating clock. The mechanism of the device resembled an ordinary clock, consisting of gears and sprockets. However, the invention became known only in the middle of the 20th century.

A quantum leap in computing was the invention of the "Pascaline" adding machine in 1642. Its creator, the French mathematician Blaise Pascal, began work on the device before he was even 20 years old. The "Pascaline" was a mechanical device in the form of a box with a large number of interconnected gears. The numbers to be added were entered by turning special wheels.

In 1673 the Saxon mathematician and philosopher Gottfried Wilhelm Leibniz invented a machine that performed the four basic arithmetic operations and could extract square roots. Leibniz also developed the binary number system, which much later became the foundation of digital computers.

In 1818 the Frenchman Charles Xavier Thomas de Colmar, taking Leibniz's ideas as a basis, invented an adding machine that could multiply and divide. Two years later the Englishman Charles Babbage began constructing a machine capable of performing calculations to 20 decimal places. That project remained unfinished, but in 1830 its author developed another: an analytical engine for performing precise scientific and technical calculations. The machine was to be controlled by a program, and punched cards with holes in different positions were to be used to input and output information. Babbage's project anticipated the development of electronic computing and the problems it could solve.

It is noteworthy that the fame of the world's first programmer belongs to a woman, Lady Ada Lovelace (nee Byron). It was she who created the first programs for Babbage's engine. One of the computer languages, Ada, was later named after her.

Development of the first computer analogues

In 1887 the history of computing entered a new stage. The American engineer Herman Hollerith managed to design the first electromechanical calculating machine, the tabulator. Its mechanism used relays, as well as counters and a special sorting box. The device read and sorted statistical records punched on cards. The company founded by Hollerith subsequently became the core of the world-famous computer giant IBM.

In 1930 the American Vannevar Bush created the differential analyzer. It was powered by electricity, and vacuum tubes were used to store data. The machine was capable of quickly finding solutions to complex mathematical problems.

Six years later the English scientist Alan Turing developed the concept of an abstract machine that became the theoretical basis of modern computers. It had all the main properties of modern computing technology: it could execute, step by step, operations programmed in its internal memory.

A year after that, George Stibitz, a scientist from the United States, invented the country's first electromechanical device capable of performing binary addition. Its operation was based on Boolean algebra, the mathematical logic created in the mid-19th century by George Boole using the logical operators AND, OR, and NOT. Later the binary adder became an integral part of the digital computer.
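Here is a minimal sketch in Python (an illustration of the principle, not Stibitz's actual circuit) of binary addition built only from the Boolean operators AND, OR, and NOT.

```python
# Binary addition from Boolean logic alone (AND, OR, NOT).
def xor(a: bool, b: bool) -> bool:
    # XOR expressed through AND / OR / NOT
    return (a or b) and not (a and b)

def full_adder(a: bool, b: bool, carry_in: bool) -> tuple[bool, bool]:
    """Add two bits plus an incoming carry; return (sum_bit, carry_out)."""
    sum_bit = xor(xor(a, b), carry_in)
    carry_out = (a and b) or (carry_in and xor(a, b))
    return sum_bit, carry_out

# Example: 1 + 1 = 10 in binary -> sum bit 0, carry 1
print(full_adder(True, True, False))  # (False, True)
```

Chaining one such adder per bit position, carry into carry, yields a multi-bit adder, precisely the building block that later went into every digital computer.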

In 1938 Claude Shannon, then at the Massachusetts Institute of Technology, outlined the principles of the logical design of a computer that used electrical circuits to solve Boolean algebra problems.

The beginning of the computer era

The governments of the countries fighting in World War II were aware of the strategic role of computing in military operations. This spurred the development, in these countries in parallel, of the first generation of computers.

A pioneer of computer engineering was the German engineer Konrad Zuse. In 1941 he created the first program-controlled computer. The machine, called the Z3, was built on telephone relays, and its programs were encoded on perforated tape. The device could work in the binary system and operate with floating-point numbers.

The next model of Zuse's machine, the Z4, is officially recognized as the first truly working programmable computer. Zuse also went down in history as the creator of the first high-level programming language, called Plankalkül.

In 1942 the American researchers John Atanasoff and Clifford Berry created a computing device that ran on vacuum tubes. The machine also used binary code and could perform a number of logical operations.

In 1943, in a British government laboratory, in an atmosphere of secrecy, the first computer, called "Colossus," was built. Instead of electromechanical relays, it used about 2,000 vacuum tubes for storing and processing information. It was intended for breaking and decrypting secret messages produced by the German Lorenz cipher machine, used for the German high command's communications. The existence of this device was kept in the strictest confidence for a long time. After the end of the war, the order for its destruction was signed personally by Winston Churchill.

The development of computer architecture

In 1945 the American mathematician of Hungarian origin John von Neumann (Janos Lajos Neumann) created the prototype of the architecture of modern computers. He proposed writing the program in the form of code directly into the machine's memory, implying the joint storage of programs and data in the computer's memory.

Von Neumann's ideas took shape in the universal electronic computers then being created in the United States, among them the famous ENIAC. This giant weighed about 30 tons and occupied 170 square meters. The machine used 18,000 vacuum tubes. It could perform 300 multiplications or 5,000 additions in one second.

Europe's first universal programmable computer was created in 1950 in the Soviet Union, in Kyiv. A group of scientists led by Sergei Alekseevich Lebedev designed the small electronic calculating machine (MESM). Its speed was 50 operations per second, and it contained about 6,000 vacuum tubes.

In 1952 domestic computing technology was augmented by the BESM, a large electronic calculating machine, also developed under Lebedev's leadership. This computer, which performed up to 10,000 operations per second, was at the time the fastest in Europe. Information was entered into the machine's memory from punched paper tape, and data was output by photo printing.

During the same period, a series of large computers under the common name "Strela" was produced in the USSR (developed under Yuri Yakovlevich Bazilevsky). From 1954, serial production of the universal computer "Ural" began in Penza under the leadership of Bashir Rameev. The later models were hardware- and software-compatible with one another, and a wide selection of peripheral devices made it possible to assemble machines of various configurations.

Transistors. The first production computers

However, vacuum tubes failed very quickly, making work with the machines very difficult. The transistor, invented in 1947, solved this problem. Using the electrical properties of semiconductors, it performed the same tasks as vacuum tubes but occupied much less space and consumed far less energy. Together with the advent of ferrite cores for organizing computer memory, the use of transistors made it possible to significantly reduce the size of machines and make them more reliable and faster.

In 1954 the American company Texas Instruments began mass-producing transistors, and two years later the first second-generation computer built on transistors, the TX-0, appeared in Massachusetts.

By the middle of the last century, a significant portion of government organizations and large companies were using computers for scientific, financial, and engineering calculations and for working with large amounts of data. Gradually, computers acquired features familiar to us today. In this period, plotters, printers, and storage devices on magnetic disks and tape appeared.

The active use of computing technology expanded its fields of application and required the creation of new software technologies. High-level programming languages appeared that made it possible to transfer programs from one machine to another and simplified the writing of code (Fortran, Cobol, and others), along with special translator programs that convert code in these languages into commands the machine can execute directly.

The emergence of integrated circuits

In 1958-1960, thanks to the US engineers Robert Noyce and Jack Kilby, the world learned of the existence of integrated circuits. Miniature transistors and other components, sometimes up to hundreds or thousands of them, were mounted on a base of silicon or germanium crystal. The chips, just over a centimeter in size, were much faster than discrete transistors and consumed much less power. The history of computing links their appearance with the emergence of the third generation of computers.

In 1964 IBM released the first computer of the System/360 family, based on integrated circuits. From this point, the mass production of computers can be counted. In total, more than 20,000 copies of this computer were produced.

In 1972 the ES (unified series) computers were developed in the USSR. These were standardized complexes for the operation of computer centers, with a common system of commands. The American IBM 360 system was taken as the basis.

Back in 1965, DEC had released the PDP-8 minicomputer, the first commercially successful project in this field. The relatively low cost of minicomputers made it possible for small organizations to use them.

During the same period, software was constantly improved. Operating systems were developed to support the greatest possible number of external devices, and new programs appeared. In 1964 BASIC was developed, a language intended specifically for teaching novice programmers. Five years later Pascal appeared, which proved very convenient for solving many applied problems.

Personal computers

After 1970 the production of fourth-generation computers began. The development of computing in this period is characterized by the introduction of large-scale integrated circuits into computer production. Such machines could now perform thousands of millions of computational operations per second, and their RAM capacity grew to 500 million bits. The significant reduction in the cost of microcomputers gradually made it possible for the average person to buy them.

Apple was one of the first manufacturers of personal computers. Its creators, Steve Jobs and Steve Wozniak, designed their first PC model in 1976, calling it the Apple I. It cost only $500. A year later the company's next model, the Apple II, was presented.

The computer of this era for the first time resembled a household appliance: besides its compact size, it had an elegant design and a user-friendly interface. The proliferation of personal computers at the end of the 1970s led to a marked fall in demand for mainframes. This seriously worried their leading manufacturer, IBM, and in 1979 the company decided to try its hand in the personal computer market.

Two years later the company's first microcomputer with an open architecture appeared, based on the 16-bit Intel 8088 microprocessor. The computer was equipped with a monochrome display, two drives for five-inch floppy disks, and 64 kilobytes of RAM. On behalf of the creator company, Microsoft specially developed an operating system for this machine. Numerous IBM PC clones appeared on the market, which stimulated the growth of industrial production of personal computers.

In 1984 Apple developed and released a new computer, the Macintosh. Its operating system was extremely user-friendly: it presented commands as graphic images and allowed them to be entered with a mouse. This made the computer even more accessible, since no special skills were now required of the user.

Some sources date the fifth generation of computing technology to 1992-2013. Briefly, its main concept is formulated as follows: these are computers created on the basis of highly complex microprocessors with a parallel-vector structure, which makes it possible to execute dozens of sequential program commands simultaneously. Machines with several hundred processors working in parallel make it possible to process data even more accurately and quickly and to create efficient networks.

The development of modern computing technology already allows us to speak of sixth-generation computers. These are electronic and optoelectronic computers running on tens of thousands of microprocessors, characterized by massive parallelism and modeled on the architecture of biological neural systems, which allows them to successfully recognize complex images.

Having consistently examined the stages of the development of computer technology, it is worth noting an interesting fact: inventions that proved themselves well at each stage have survived to this day and continue to be used successfully.

Classes of computers

There are various options for classifying computers.

Thus, according to their purpose, computers are divided into:

  • universal: those capable of solving a wide variety of mathematical, economic, engineering, scientific, and other problems;
  • problem-oriented: those solving problems of a narrower range, usually associated with the management of certain processes (data recording, the accumulation and processing of small amounts of information, calculations according to simple algorithms); they have more limited software and hardware resources than the first group of computers;
  • specialized: computers that usually solve strictly defined tasks. They have a highly specialized structure and, with relatively low device and control complexity, are quite reliable and productive in their field. These include, for example, controllers and adapters that manage a number of devices, as well as programmable microprocessors.

Based on size and computing power, modern electronic computing equipment is divided into:

  • ultra-large (supercomputers);
  • large computers;
  • small computers;
  • ultra-small (microcomputers).

Thus, we have seen that devices, first invented by man to record resources and values, and later for the fast and accurate performance of complex calculations and computational operations, have constantly developed and improved.

Appendix 4

Test on the topic:

"History of the development of computer technology"

Choose the correct answer

1. An electronic computer is:

a) a complex of hardware and software information processing tools;

b) a complex of technical means for automatic information processing;

c) a model that establishes the composition, order and principles of interaction of its components.

2. A personal computer is:

a) a computer for an individual buyer;

b) a computer that provides dialogue with the user;

c) a desktop or portable computer that meets the requirements of general accessibility and universality.

3. The inventor of a mechanical device for adding numbers:

a) P. Norton;

b) B. Pascal;

c) G. Leibniz;

d) J. Napier.

4. The scientist who combined the idea of a mechanical machine with the idea of program control:

a) C. Babbage (mid-19th century);

b) J. Atanasoff (1930s);

c) C. Berry (20th century);

d) B. Pascal (mid-17th century).

5. The world's first programmer is:

a) G. Leibniz;

b) C. Babbage;

c) J. von Neumann;

d) A. Lovelace.

6. The country in which the first computer implementing the principles of program control was created:

b) England;

c) Germany.

7. Founder of domestic computer technology:

8. The city in which the first domestic computer was created:

a) Kyiv;

b) Moscow;

c) Saint Petersburg;

d) Yekaterinburg.

9. The means of communication between the user and a second-generation computer:

a) punched cards;

b) magnetic tokens;

c) magnetic tapes;

d) magnetic disks.

10. The first tool for counting:

a) sticks;

b) pebbles;

c) human hand;

d) shells.

11. The number system of the Russian abacus:

a) binary;

b) quinary;

c) octal;

d) decimal.

12. The field of application of first-generation computers:

a) design;

b) engineering and scientific calculations;

c) banking;

d) architecture and construction.

13. The computer generation during which high-level programming languages began to appear:

a) first;

b) second;

c) third;

d) fourth.

14. The generation of computers whose element base was transistors:

a) first;

b) second;

c) third;

d) fourth.

15. The programming language of first-generation machines:

a) machine code;

b) Assembler;

c) BASIC;

d) Fortran.

Select all correct answers:

16. Elements of third-generation computers:

a) integrated circuits;

b) microprocessors;

c) CRT-based displays;

d) magnetic disks;

e) mouse.

17. Elements of Babbage's Analytical Engine:

a) input block;

b) microprocessor;

c) output block;

d) store;

e) mill;

f) result printing block;

g) arithmetic device;

h) memory.

18. Elements of a fourth-generation computer:

a) integrated circuits;

b) microprocessors;

c) color display;

d) transistors;

e) joystick;

f) plotters.

19. The very first counting devices:

