The history of the development of Informatics as a science

Perhaps no single noun represents modern life better than "computer." Whether the effect is negative or positive, computers control nearly every aspect of our everyday life. Computers have evolved from performing strictly numerical computation to completing once-unthinkable tasks. Supermarket scanners calculate our grocery bill while keeping store inventory; computerized telephone switching centers play traffic cop to millions of calls and keep lines of communication untangled; and automatic teller machines (ATMs) let us conduct banking transactions from virtually anywhere in the world. All of this amazing technology started over five thousand years ago and continues to grow toward an unknown culmination.

The history of computing began with mechanical calculating machines. In 1623 the German scientist Wilhelm Schickard invented a machine that used 11 complete and 6 incomplete sprocket wheels; it could add and, with the aid of logarithm tables, multiply and divide. In 1642 the French philosopher, mathematician, and physicist Blaise Pascal, then the eighteen-year-old son of a French tax collector, invented a machine that added and subtracted, automatically carrying and borrowing digits from column to column. To ease his father's duties, Pascal assembled a brass rectangular box, the Pascaline, which used eight movable dials to add sums up to eight figures long. Pascal's system was based on the number ten: as one dial passed nine, the next dial turned to represent one in the tens column while the original dial returned to zero. The Pascaline's only drawback was its limitation to addition. The German mathematician and philosopher Gottfried Wilhelm von Leibniz improved on the Pascaline in 1694 by inventing a machine that could not only add but also multiply. Leibniz's mechanical multiplier preserved Pascal's idea of using dials and gears, refining the original Pascaline through a study of Pascal's notes and drawings; the refined model used a stepped-drum gear design rather than Pascal's flat gears. Even so, mechanical calculators did not come into widespread use until about 1820.

In the early 19th century the French inventor Joseph-Marie Jacquard devised a loom controlled by punched cards: the cards programmed patterns that the loom then wove into fabric. Another early mechanical computer was the Difference Engine, designed in the early 1820s by the British mathematician and scientist Charles Babbage, whose work laid the groundwork for our present-day computer. It is said that the automation of calculation began with one simple remark: "I wish to God these calculations had been performed by steam." Babbage believed there existed a strong affinity between machines and mathematics: machines were best at flawlessly repeating tasks, while mathematics often required just such unpretentious repetition of steps, so he could apply the capabilities of machines to the demands of mathematics and refine the arithmometer into a more evolved and elaborate machine. Babbage named his first attempt, begun in 1822, the Difference Engine. It was to be powered by steam, was the size of a car, and was intended to compute mathematical tables and print the results automatically; although never completed, it was meant to be a machine with a 20-decimal capacity. After ten years of hard work Babbage finally called it quits and put his time into the design of the first general-purpose computer, the Analytical Engine, considered the mechanical precursor of the modern computer. The Analytical Engine was designed to perform all arithmetic operations efficiently; however, Babbage's lack of political skills kept him from obtaining the approval and funds to build it.
Augusta Ada Byron was a personal friend and student of Babbage. She was the daughter of the famous poet Lord Byron and one of only a few women mathematicians of her time. She prepared extensive notes concerning Babbage's ideas and the Analytical Engine. Ada's conceptual programs for the Engine led to the naming of a programming language in her honor. Although the Analytical Engine was never built, its key concepts, such as the capacity to store instructions, the use of punched cards as a primitive memory, and the ability to print, can be found in many modern computers.

Charles Xavier Thomas de Colmar was another great inventor whose efforts assisted in the evolution of the simple computer. Colmar, a Frenchman, invented the Arithmometer, a machine able to perform the four basic arithmetic functions; it offered a more practical approach to computing with its ability to add, subtract, multiply, and divide. The enhanced versatility of the Arithmometer sustained its popularity up until World War I. Building on the work of his predecessors, Pascal and Leibniz, Colmar helped define the age of mechanical computation.

Computers were looked at as a way to break large workloads into discrete tasks. The United States census of 1880 took seven years to tally, and fearing that later censuses would take an even more absurd amount of time to count, the Census Bureau turned to technology. The American inventor Herman Hollerith applied the Jacquard loom concept to computing. Rather than Babbage's perforated cards, Hollerith used punched cards to store data, which he fed into a machine that compiled the results automatically. Punched holes in the cards represented letters and numbers: a single hole depicted a number, while a combination of two holes portrayed a letter. This contrivance allowed the bureau to enumerate the census results in six weeks. Not only did Hollerith's machine remarkably decrease the time the census took, but the cards also served as stored records of the census and reduced computational errors. Hollerith's machine found its way into the business world when he founded the Tabulating Machine Company in 1896, which later became International Business Machines (IBM) in 1924. From this point in history, the evolution of the computer became an area of ever-increasing interest.

The first major interest came with the onset of World War II. Governments' desire to use computers to assist them in warfare led them to increase funding for computing projects, providing strong motivation for technical progress in computing. By 1941 the German engineer Konrad Zuse had developed a computer, the Z3, to design airplanes and missiles. The British were also pursuing enhanced computer technology: they completed a secret code-breaking computer called Colossus, used to decode German messages. The existence of the machine was not revealed until decades after the war.

In the 1930s American mathematician Howard Aiken developed the Mark I calculating machine, which was built by IBM. This calculating machine used relays and electromagnetic components to replace mechanical components. In later machines, Aiken used vacuum tubes and solid-state transistors to manipulate binary numbers. Aiken also introduced computers to universities by establishing the first computer science program at Harvard University. Aiken never trusted the concept of storing a program within the computer; instead, his computer had to read instructions from punched cards. At the Institute for Advanced Study in Princeton, Hungarian-American mathematician John von Neumann developed one of the first computers used to solve problems in mathematics, meteorology, economics, and hydrodynamics. Von Neumann's Electronic Discrete Variable Automatic Computer (EDVAC), described in 1945, was one of the first electronic computers to use a program stored entirely within its memory.

John Mauchly, an American physicist, proposed an electronic digital computer, called the Electronic Numerical Integrator And Computer (ENIAC), which was built at the Moore School of Engineering at the University of Pennsylvania in Philadelphia by Mauchly and J. Presper Eckert, an American engineer. ENIAC was completed in 1945 and is regarded as the first successful general-purpose digital computer. It weighed more than 27,000 kg and contained more than 18,000 vacuum tubes. Roughly 2,000 of the computer's vacuum tubes were replaced each month by a team of six technicians. Many of ENIAC's first tasks were for military purposes, such as calculating ballistic firing tables and designing atomic weapons. Since ENIAC was initially not a stored-program machine, it had to be reprogrammed for each task. Eckert and Mauchly eventually formed their own company, which was later bought by Remington Rand. They produced the Universal Automatic Computer (UNIVAC), which was used for a broader variety of commercial applications. By 1957, 46 UNIVACs were in use.

In 1948, at Bell Telephone Laboratories, American physicists Walter Houser Brattain, John Bardeen, and William Bradford Shockley developed the transistor, a device that can act as an electric switch. The transistor had a tremendous impact on computer design, replacing costly, energy-inefficient, and unreliable vacuum tubes. In the late 1960s integrated circuits, tiny transistors and other electrical components arranged on a single chip of silicon, replaced individual transistors in computers. Integrated circuits became miniaturized, enabling more components to be designed into a single computer circuit. In the 1970s refinements in integrated circuit technology led to the development of the modern microprocessor, an integrated circuit containing thousands of transistors. Modern microprocessors contain as many as 10 million transistors.

The Electronic Numerical Integrator and Computer (ENIAC) was produced in a partnership between the United States government and the University of Pennsylvania. The ENIAC contained 18,000 vacuum tubes, 70,000 resistors, and 5 million soldered joints, and the massive machine consumed 160 kilowatts of electrical power. It was developed by John Presper Eckert and John W. Mauchly. The ENIAC differed from the Colossus and the Mark I in that it was a general-purpose computer, able to compute at speeds up to 1,000 times faster than the Mark I.

Manufacturers used integrated circuit technology to build smaller and cheaper computers. The first of these so-called personal computers (PCs) was sold by Micro Instrumentation and Telemetry Systems (MITS). Its Altair 8800 appeared in 1975. It used an 8-bit Intel 8080 microprocessor, had 256 bytes of RAM, received input through switches on the front panel, and displayed output on rows of light-emitting diodes. Refinements in the PC continued with the inclusion of video displays, better storage devices, and CPUs with more computational abilities. Graphical user interfaces were first designed by the Xerox Corporation, and then later used successfully by the Apple Computer Corporation with its Macintosh computer. Today the development of sophisticated operating systems such as Windows 95 and Unix enables computer users to run programs and manipulate data in ways that were unimaginable 50 years ago.

Possibly the largest single calculation was accomplished by physicists at IBM in 1995: they solved one million trillion mathematical problems by continuously running 448 computers for two years to demonstrate the existence of a previously hypothetical subatomic particle called a glueball. Japan, Italy, and the United States are collaborating to develop new supercomputers that will run these calculations one hundred times faster.

In 1996 IBM challenged Garry Kasparov, the reigning world chess champion, to a chess match with a supercomputer called Deep Blue. The computer had the ability to compute more than 100 million chess positions per second. Kasparov won the match with three wins, two draws, and one loss; Deep Blue was nonetheless the first computer to win a game against a reigning world chess champion under regulation time controls. Many experts predict these types of parallel processing machines will soon surpass human chess playing ability, and some speculate that massive calculating power will one day replace intelligence. Deep Blue serves as a prototype for future computers that will be required to solve complex problems.

Computers can be either digital or analog. Digital refers to processes in computers that manipulate binary numbers (0s or 1s), which represent switches that are turned on or off by electrical current. Analog refers to numerical values that have a continuous range: 0 and 1 can occur as analog values, but so can 1.5 or a number like pi (π). As an example, consider a desk lamp. If it has a simple on/off switch, then it is digital, because the lamp either produces light at a given moment or it does not. If a dimmer replaces the on/off switch, then the lamp is analog, because the amount of light can vary continuously from on to off and all intensities in between.
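
To make the lamp analogy concrete, the short Python sketch below models the same lamp both ways. It is only an illustrative example: the function names and values are assumptions made for this sketch, not anything drawn from the sources.

    # Illustrative sketch: a digital lamp has two states, an analog lamp a continuous range.

    def digital_lamp(switch_on: bool) -> int:
        """Digital: exactly two possible values, 0 (off) or 1 (on)."""
        return 1 if switch_on else 0

    def analog_lamp(dimmer_position: float) -> float:
        """Analog: any brightness in the continuous range 0.0 to 1.0."""
        return max(0.0, min(1.0, dimmer_position))

    print(digital_lamp(True))   # 1  (only on or off is possible)
    print(analog_lamp(0.37))    # 0.37  (any intermediate intensity is allowed)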

Analog computer systems were the first type to be produced. A popular analog computer used in the 20th century was the slide rule. It performs calculations by sliding a narrow, gauged wooden strip inside a ruler-like holder. Because the sliding is continuous and there is no mechanism to stop at one exact value, the slide rule is analog. New interest has been shown recently in analog computers, particularly in areas such as neural networks that respond to continuous electrical signals. Most modern computers, however, are digital machines whose components have a finite number of states: for example, the 0 or 1 (on or off) of bits. These bits can be combined to denote information such as numbers, letters, graphics, and program instructions.
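
As a simple illustration of how bits combine to denote information, the Python snippet below reads the same eight bits first as a number and then as a letter; the particular bit pattern is just an assumed example, not taken from the sources.

    # The eight bits 01000001 can denote the number 65 or the letter 'A',
    # depending on how a program chooses to interpret them.
    bits = "01000001"

    value = int(bits, 2)   # read the pattern as a binary number -> 65
    letter = chr(value)    # read the same pattern as a character code -> 'A'

    print(value, letter)   # 65 A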

People use computers in a wide variety of ways. In business, computers track inventories with bar codes and scanners, check the credit status of customers, and transfer funds electronically. In homes, tiny computers embedded in the electronic circuitry of most appliances control the indoor temperature, operate home security systems, tell the time, and turn videocassette recorders on and off. Computers in automobiles regulate the flow of fuel, thereby increasing gas mileage. Computers also entertain, creating digitized sound on stereo systems or computer-animated features from a digitally encoded laser disc. Computer programs, or applications, exist to aid every level of education, from programs that teach simple addition or sentence construction to advanced calculus. Educators use computers to track grades and prepare notes; with computer-controlled projection units, they can add graphics, sound, and animation to their lectures. Computers are used extensively in scientific research to solve mathematical problems, display complicated data, or model systems that are too costly or impractical to build, such as testing the air flow around the next generation of space shuttles. The military employs computers in sophisticated communications to encode and unscramble messages, and to keep track of personnel and supplies.

Enormous changes have come about in the past 30 years as a result of the development of computers in general, and personal computers in particular. This creation ranks as one of the most important inventions of the twentieth century. The computer is used in government, law enforcement, banking, business, education, and commerce. It has become essential in fields of scientific, political, and social research as well as aspects of medicine and law. Everyone is affected by the manipulation and storage of data. There are negative consequences of these developments. There are those who engage in fraudulent acts, malicious mischief, and deception. These activities have spawned the need for computer security and a new category of technical crime fighters.

At first, the personal computer was defined as a machine usable and programmable by one person at a time and able to fit on a desk. It was inexpensive, accessible, simple enough for most people to use, and small enough to be transportable. The claims for the identity of the first personal computer are numerous and depend on definition. One of the first small computers was a desktop model built by Hewlett Packard in 1972. It had all the basics: a language, memory storage device, a keyboard, and a display terminal. However, because it was built for scientists and engineers, it was not available on the general market. The first personal computer available for purchase was the Altair 8800. It was introduced, described, and pictured in the January 1975 issue of Popular Electronics magazine. It came in kit form ready to be assembled and was aimed at hobbyists who liked to build their own radios and other electronic devices.

The first personal computer that was fully assembled and offered for sale on the general market was the Apple I. It was built by 25-year-old college dropout Steven Wozniak (1950- ) in his garage in Sunnyvale, California. With his friend Steven Jobs (1955- ), Wozniak showed the new machine at the first Computer Show in Atlantic City in 1976. It astonished viewers with its small, compact size and speed, but did not sell. Wozniak redesigned it. When the Apple II was unveiled, encased in a plastic cover, with color graphics, BASIC, and a spreadsheet program called VisiCalc, orders soared. No established company was willing to invest in a machine built in a garage, so Jobs and Wozniak created the Apple Computer Company in 1977. They moved out of the garage and hired people to manufacture the machine.

Soon many individuals and companies leapt into the personal computer market. Some computers were designed for the knowledgeable hobbyist, while others followed the lead of Apple. Those computers were made for those who wanted the computer to do something and didn't care how it worked. Tandy (called Radio Shack today); Texas Instruments, which had built the first electronic calculator; Commodore; and other companies began to build personal computers for sale. Some prospered, some failed.

When IBM finally got into the personal computer market in 1981, it had an immediate impact, even though it had serious limitations. Its computer had no hard disk drive and no software or graphics. But it did have the magic letters on the front—IBM. Many customers felt that if IBM, already called "Big Blue," built a computer, it had to be good. It even convinced many people that since IBM was building personal computers, then they were here to stay. IBM sold 20,000 machines in the first few months and could have sold 50,000, but they were not geared up to manufacture that many. Its design and refinements have been followed by many other manufacturers. IBM PCs, or clones, now dominate the computer market.

Just as few computer owners program their machines, few transport them. For that, a new type of personal computer has appeared: the laptop computer. It is popular among students, researchers, and business travelers, as is the new palm or hand-held computer.

Large mainframe computers changed the way businesses ran and kept records. Personal computers changed the way individuals did business, kept family records, did their taxes, entertained, and wrote letters. Even those who fear or shun computers use them or come into contact with them every day. When they use an ATM to deposit or draw out money, they are using a dedicated computer. When paying for groceries or gasoline with a credit card, a computer is involved. The internal systems of their automobiles are run by computers. Computer literacy has become a necessary skill for technical or scientific jobs and is becoming a requirement for many jobs, such as bank tellers, salesmen, librarians, and even waiters in restaurants who use computers as part of their daily work.

The importance and impact of the personal computer by the beginning of the twenty-first century rests in one part on the development of the computer and in another on the creation of a new system of communications—the Internet—that depends on personal computers and could not have become so widespread without them. Together, computers and the Internet—with its attendant World Wide Web and e-mail—have made a huge impact on society, and every day radical changes are made in the way educated people all over the world communicate, shop, do business, and play.

The Internet, the World Wide Web, and e-mail are actually three distinct entities, allied and interdependent. The Internet is a network of computers that stretches around the world and consists of phone lines, servers, browsers, and clients. It began during the Cold War as a communications network linking researchers at the United States Department of Defense (DOD) and military contractors. When the network was created in 1969, the ability to maintain contact in the event of a nuclear attack was considered vital. When those tensions eased, the network continued as a convenient way to communicate with research groups and companies all over the world. This network was developed at the Advanced Research Projects Agency and was initially called ARPAnet.

At first ARPAnet's primary use was for electronic mail, or e-mail, which emerged between 1965 and 1971. It took years of refinement and increased communication capabilities, such as fiber-optic telephone lines, before users could communicate with each other despite differing types of computers, operating languages, and speeds.

ARPAnet continued to grow, still used mostly by military contractors and the DOD. In the 1970s it was opened to non-military users, mainly universities. The first host was installed at UCLA, the second at Stanford, both in California. By 1971 software was being created to enable messages to be sent to and from any computer. E-mail then became accessible to all. International connections were available by 1973. In 1983 ARPAnet was split into military and civilian sections, and the civilian network was dubbed the Internet. It is now defined as the physical structure of the network—its phone lines, servers, and clients.

The World Wide Web enhances the Internet. It is a collection of sites and the information that can be accessed through those sites. Tim Berners-Lee, working at CERN in Switzerland, wrote software in 1989 to enable high-energy physicists to collaborate with physicists anywhere in the world. This was the beginning of the World Wide Web, which became an essential part of the Internet in 1991. The Web has multimedia capabilities: it provides pictures, sound, movement, and text. It is made up of a series of electronic addresses, or web sites. The Internet and the World Wide Web became easier and more useful when Web browsers were invented to locate, retrieve, and display this information in both text and pictures.

According to some estimates, there were approximately 40 million personal computers as the twenty-first century dawned, and most of them were connected to the Internet. No business hoping to sell products to a large audience in the new century will be able to ignore personal computers or the Internet. Any individual who wishes access to a wide range of information or to buy goods and services will need a personal computer wired to the Internet to do it.

Since the ENIAC, computers have become more complex, and what once was the size of a football field is now the size of a fingernail. The evolution and development of the computer has taken thousands of giant leaps since the start of the twentieth century and continues to grow. It took thousands of years for ancient scientists, mathematicians, and philosophers to improve even slightly on the abacus. Today the world relies on computers to take care of everything, and without the pioneers who dedicated their lives to the advancement of computing, the world would not be nearly the way it is today.

The computers manufactured between 1945 and 1955 are called first-generation computers. They were extremely large in size, with vacuum tubes in their circuitry that generated considerable heat. Hence, special air conditioning arrangements were required to dissipate this heat.

They were extremely slow, and their storage capacity was also very small compared to today's computers. In these computers, punched cards were used to enter data: cards with rectangular holes punched in them by punching devices. UNIVAC I was the first commercially available computer, built in 1951 by the Remington Rand Company. It had a storage capacity of about 2,000 words. These computers were used mostly for payroll, billing, and some mathematical computing.

The computers in which vacuum tubes were replaced by transistors made from semiconductors were called second-generation computers. The use of transistors reduced the heat generated during operation. It also decreased the size of the machines and increased their storage capacity. They required less power to operate and were much faster than first-generation computers. Magnetic media were used as auxiliary data storage. These computers used high-level languages for writing programs; FORTRAN and COBOL were the languages used.

Third-generation computers started in 1966 with the incorporation of integrated circuits (ICs) in the circuitry. An IC is a monolithic circuit comprising circuitry equivalent to tens of transistors on a single semiconductor chip of small area, with a number of pins for external circuit connections.

IBM 360 series computers in this generation also provided for time sharing and multiprogramming.

These were small, cost-effective computers compared to second-generation machines. Their storage capacity and speed increased many fold, and they included user-friendly package programs, word processing, and remote terminals. Remote terminals could use the central computer's facilities and get results instantaneously.

Fourth-generation computers were introduced after 1976. In these computers electronic components were further miniaturized through Large Scale Integration (LSI) techniques; microprocessors, which are programmable ICs fabricated using LSI techniques, are used in these computers. Microcomputers were developed by combining a microprocessor with other LSI chips, giving compact size, increased speed, and increased storage capacity. In recent years, ICs fabricated using VLSI (Very Large Scale Integration) techniques have been used in computers. Through these techniques, storage capacity has increased many fold. Not only that, the speed of these computers is also very high compared to earlier computers.

During the 1980s, computers called supercomputers were introduced in the market. These computers perform operations at exceptionally high speed (approximately 100 million operations per second). This speed is attained by employing a number of microprocessors; consequently, their cost is also very high. They are normally used in very complex applications such as artificial intelligence.

Rapid development of computer science and the formation of its scientific principles began in the 1940s. Electronics, and then microelectronics, became the technical basis of computer science. Achievements in the field of artificial intelligence also formed part of the basis of computer development.

In 1623 a 6-digit device for arithmetic operations was designed by the German scientist Wilhelm Schickard.

The first really existing mechanical calculating device was the Pascaline, designed by the prominent scientist Blaise Pascal. It was a 6- or 8-digit device for adding and subtracting numbers. In 1673 another, 12-digit, device was designed by Gottfried Wilhelm Leibniz. It could quickly multiply and divide large numbers.

At the end of the 18th century Joseph Jacquard split the calculation process into three stages: 1) design the method of calculation, 2) write a program as a sequence of arithmetic operations, 3) carry out the calculations according to the program.

The Englishman Charles Babbage used this innovation; he changed calculating devices from manual to automatic.

He designed the Analytical Engine, a mechanical, universal, digital, program-controlled device (1830–1846). The device consisted of five units: arithmetic, storage, control, input, and output, similar to the first computers, which appeared 100 years later. The arithmetic and storage units were designed for 1,000 50-digit numbers. The estimated calculation rate was 1 second for addition and subtraction and 1 minute for multiplication and division.

At the beginning of the electronic period of development, digital calculation was considered only as a matter of technology, while its scientific basis was only beginning to take shape. Scientific works began to appear. Leibniz, George Boole, Shannon, Shestakov, and Gavrilov made great contributions to the development of computer science.

There were two constructors of logic machines in Russia, Pavel Khrushchev and Alexander Shchukarev, who worked in Russian institutions of higher education.

The first logic machine was constructed by Khrushchev in Odessa. The device was later inherited by Shchukarev, a professor at the Kharkov Technology Institute.

At the end of the 20th century the development of computer technology was at its peak. In fact, much of the work today is done by computers, which have made people's lives easier. Humans created the computer, but we know that neither a computer nor any set of computers can be wiser than humanity as a whole.
