A computer is a device that processes data by means of programmable arithmetic instructions.
Through the Analytical Engine sketched by Babbage in 1837, Charles Babbage and Ada Lovelace are regarded as pioneers of the modern universally programmable computer, while Konrad Zuse (Z3, 1941, and Z4, 1945) and John Presper Eckert and John William Mauchly (ENIAC, 1946) built the first working machines of this kind. In classifying a device as a universally programmable computer, Turing completeness plays an essential role; it is named after the English mathematician Alan Turing, who introduced the logical model of the Turing machine in 1936.
Early computers were also called (mainframe) calculators; their input and output were at first limited to numbers. Modern computers, by contrast, cope with other kinds of data as well, such as letters and sounds. Internally, however, these data too are converted into numbers and processed as numbers, which is why a computer is still a calculating machine even today.
With increasing computing power, new fields of application opened up. Today computers are found in all areas of everyday life, mostly in specialised variants tailored to a particular purpose. Integrated miniature computers (embedded systems) control everyday devices such as washing machines and video recorders, or check coins in vending machines; in modern cars they display driving data and, as "driving assistants", carry out various manoeuvres themselves.
Universal computers are found in smartphones and game consoles. Personal computers serve information processing in business and government as well as for private individuals; supercomputers are used to simulate complex processes, e.g. in climatology or for medical calculations.
Origin of the name
The English term computer, derived from the verb (to) compute (from Latin computare, "to reckon"), originally designated people who carried out mostly lengthy calculations, for example for astronomers in the Middle Ages. The earliest known text in which the word is used dates from 1613. In the New York Times the word appeared for the first time on 2 May 1892, in a classified advertisement of the US Navy with the title "A Computer Wanted", seeking an arithmetic specialist with knowledge of algebra, geometry, trigonometry and astronomy.
In 1938 Konrad Zuse built the first freely programmable mechanical calculating machine (Z1), which already corresponded to the concept in today's sense. In the name of the Electronic Numerical Integrator and Computer (ENIAC), presented to the public in 1946, the word appears for the first time as part of a machine's name. Subsequently, computer established itself as the generic term for these new machines.
Fundamentally, two construction principles are distinguished: a computer is a digital computer if it processes digital data (numbers and text characters) with digital functional units; it is an analog computer if it processes analog data (continuously varying electrical quantities such as voltage or current) with analog functional units.
Today almost exclusively digital computers are used. They follow common basic principles that make their free programming possible. In a digital computer, two basic components are distinguished: the hardware, formed by the electronic, physically tangible parts of the computer, and the software, which describes the programming of the computer.
A digital computer consists at first only of hardware. The hardware provides, firstly, a memory in which data can be stored in portions, as on the numbered pages of a book, and retrieved at any time for processing or output. Secondly, the arithmetic unit of the hardware possesses the basic components for free programming, with which any processing logic for data can be represented: these components are, in principle, calculation, comparison and conditional branching. A digital computer can, for example, add two numbers, compare the result with a third number and then, depending on the result, continue at one place in the program or another. In computer science this model is theoretically captured by the Turing machine; the Turing machine represents the fundamental considerations of computability.
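These three building blocks, calculation, comparison and conditional branching, can be sketched in a few lines of Python (an illustration of the principle only, not of actual hardware behaviour; the numbers are arbitrary):

```python
# The three primitives of free programmability:
# calculation, comparison, and conditional branching.
a, b, threshold = 17, 25, 40

total = a + b          # calculation: the machine adds two numbers
if total > threshold:  # comparison + conditional branch:
    result = "large"   # ...execution continues at one place...
else:
    result = "small"   # ...or at the other
print(result)          # -> large
```

Any processing logic for data can, in principle, be built from combinations of exactly these steps.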
Only through software, however, does the digital computer become useful. Every piece of software is, in principle, a defined, functional arrangement of the components described above, calculation, comparison and conditional branching, where the components can be used arbitrarily often. This arrangement of components, called a program, is stored in the form of data in the computer's memory. From there it can be read out and processed by the hardware. This operating principle of digital computers has not changed substantially since its origins in the middle of the 20th century, even though the details of the technology have been considerably improved.
Analog computers work according to a different principle. In them, analog components (amplifiers, capacitors) replace the logic programming. Analog computers were formerly used more often for the simulation of control processes (see: control engineering), but have been almost completely displaced by digital computers. In a transitional period there were also hybrid computers, which combined an analog with a digital computer.
Possible applications of computers include:
Media creation (picture processing and word processing)
Administration and archiving applications
Control of machines and processes (printers, industrial production, e.g. by robots, embedded systems)
Calculations and simulations (e.g., BOINC)
Media reproduction (Internet, television, videos, entertainment applications like computer games, learning software)
Communication (chat, e-mail, social networks)
The principle generally applied today, named the von Neumann architecture after its description by John von Neumann in 1946, defines five main components for a computer:
the arithmetic unit (arithmetic logic unit, ALU),
the control unit,
the bus unit,
the memory as well as
the input and output units.
In today's computers the ALU and the control unit are mostly merged into one component, the so-called CPU (central processing unit).
The memory is a sequence of serially numbered "cells"; each of them can hold a small piece of information. This information is stored in the memory cell as a binary number, i.e. a sequence of ones and zeros. A characteristic feature of the von Neumann architecture is that such a binary number (for example 01000001, which corresponds to the decimal number 65) can be either part of the data (e.g. the number 65 or the letter A) or a command for the CPU (the "conditional branch" mentioned above).
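That one and the same bit pattern can serve as a number, as a character, or as an instruction is easy to demonstrate; note that the opcode table below is invented purely for illustration and corresponds to no real instruction set:

```python
cell = 0b01000001            # one memory cell holding the pattern 01000001

# Interpreted as a number, the pattern is 65...
print(cell)                  # -> 65
# ...interpreted as a character (ASCII), it is the letter 'A'...
print(chr(cell))             # -> A
# ...and in a hypothetical instruction set, the very same pattern
# could just as well be the opcode of a CPU command.
OPCODES = {0b01000001: "branch on condition"}
print(OPCODES[cell])         # -> branch on condition
```

Nothing in the memory cell itself says which interpretation is meant; that depends entirely on how the CPU reaches the cell.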
It is essential to the von Neumann architecture that program and data share one memory area (in general the data occupy the lower and the programs the upper memory area).
In the so-called Harvard architecture, by contrast, data and programs have their own (physically separated) memory areas, so data-write operations cannot overwrite programs.
In the von Neumann architecture the control unit is responsible for knowing what is at which place in memory. One can imagine this as the control unit holding a "pointer" to a certain memory cell that contains the next command to be executed. It reads this command from memory and recognises, say, the value "65" as a "conditional branch". It then moves on to the next memory cell, because it needs to know where to jump. It reads out this value too and interprets the number as the number (the so-called address) of a memory cell. Then it sets the pointer to that very memory cell, in order to read its next command from there; the jump has been carried out. If the command had instead been "read value", the control unit would not change the program pointer but would simply read out the contents of the subsequently given address and pass them on, for example, to the ALU.
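This fetch-and-execute cycle can be sketched as a toy simulation; the opcodes, their numeric values and the memory layout below are invented for illustration and do not correspond to any real CPU:

```python
# A toy fetch-execute loop. Memory holds both the program and its
# data in one list of cells, as in the von Neumann model.
JMP, LOAD, HALT = 65, 66, 0    # invented opcodes for this sketch

memory = [
    65, 4,     # cells 0-1: "jump" to address 4
    99,        # cell 2: never reached
    7,         # cell 3: plain data
    66, 3,     # cells 4-5: "read value" from address 3
    0,         # cell 6: halt
]

pointer = 0                    # the control unit's instruction pointer
loaded = None
while True:
    opcode = memory[pointer]   # fetch the next command
    if opcode == JMP:          # branch: set the pointer to the target
        pointer = memory[pointer + 1]
    elif opcode == LOAD:       # read value: fetch data, keep stepping
        loaded = memory[memory[pointer + 1]]
        pointer += 2
    elif opcode == HALT:
        break
print(loaded)                  # -> 7
```

The jump changes only the pointer; the "read value" command leaves the pointer on its normal path and merely fetches data, exactly the distinction described above.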
The ALU has the task of combining values from memory cells. It receives the values from the control unit, performs the operation (for example, adding two numbers that the control unit has read from two memory cells) and returns the result to the control unit, which can then use it for a comparison or write it back into a third memory cell.
Finally, the input and output units are responsible for entering the initial programs into the memory cells and for displaying the results of the computation to the user.
The von Neumann architecture is, so to speak, the lowest level of a computer's operating principle above the electro-physical processes in the conductor tracks. The first computers really were programmed in this way: the numbers of commands and of particular memory cells were written, one after the other, into the individual memory cells, just as the program required. To reduce this effort, programming languages were developed. From textual commands that represent semantically understandable content for the programmer (e.g. GOTO for the "unconditional branch"), they automatically generate the numbers in the memory cells that the computer ultimately processes as a program.
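How such a translation from textual commands to stored numbers might look can be sketched with a minimal "assembler"; the mnemonics and opcode values below are invented for illustration:

```python
# A minimal assembler sketch: mnemonic text commands become the
# numbers a machine actually stores (opcode values are invented).
OPCODES = {"GOTO": 65, "LOAD": 66, "HALT": 0}

def assemble(source):
    """Turn lines like 'GOTO 4' into a flat list of memory-cell values."""
    cells = []
    for line in source.splitlines():
        mnemonic, *operands = line.split()
        cells.append(OPCODES[mnemonic])          # the command, as a number
        cells.extend(int(x) for x in operands)   # any address operand
    return cells

program = "GOTO 4\nHALT"
print(assemble(program))   # -> [65, 4, 0]
```

Real assemblers and compilers are vastly more elaborate, but the principle is the same: readable text in, machine numbers out.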
Later, certain recurring procedures were collected into so-called libraries, so that the wheel did not have to be reinvented each time, e.g. interpreting a pressed keyboard key as the letter "A" and thus as the number 65 (in ASCII code). The libraries were bundled into higher-level libraries, which link sub-functions into complex operations (for example: displaying the letter "A", consisting of 20 single black and 50 single white dots on the screen, after the user has pressed the "A" key).
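This layering of library functions can be sketched as follows; the tiny 3x3 dot "font" for the letter "A" is invented for illustration, but the ASCII mapping is real:

```python
# Layered library functions: each level builds on the one below.
def key_to_code(key):
    """Low level: a pressed key becomes a number (ASCII code)."""
    return ord(key)

# Mid level: a character code becomes a pattern of dots
# (1 = black dot, 0 = white dot; a made-up 3x3 glyph for 'A').
FONT = {65: ["010",
             "111",
             "101"]}

def render(key):
    """High level: a pressed key becomes visible rows of dots."""
    return "\n".join(FONT[key_to_code(key)])

print(render("A"))
```

A program calling `render` need not know anything about ASCII codes or dot patterns; each layer hides the one beneath it.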
In a modern computer, many of these program layers work on top of or together with one another. Complex tasks are broken down into subtasks that have already been handled by other programmers, who in turn build on the preliminary work of yet other programmers whose libraries they use. At the lowest level, however, there is always the so-called machine code: the sequence of numbers with which the computer is actually controlled.
The precursors of the modern computer
Computer technology developed very quickly compared to other electrical appliances. The history of the computer's development reaches back to antiquity and is thus substantially longer than the history of modern computer technologies and of mechanical or electrical aids (calculating machines or hardware). It also encompasses the development of computational methods devised, for instance, for simple writing implements on paper and tablets. The following accordingly attempts to give an overview of these developments.
Counting as the basis of computer history
The concept of numbers cannot be traced back to any concrete roots and probably developed with the first needs of communication between two individuals. In all known languages one finds correspondences at least for the numbers one and two. The ability to distinguish different quantities of objects can also be observed in the communication of many animal species (for instance various primates, but also birds such as the blackbird).
The further development of these simple number systems probably led to the discovery of the first mathematical operations such as addition, subtraction, multiplication and division, and also of squares and square roots. These operations were formalised (expressed in formulas) and thereby became verifiable. From this, further considerations developed, for instance Euclid's method for the greatest common divisor.
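Euclid's procedure for the greatest common divisor survives essentially unchanged in modern code; a minimal Python version:

```python
def gcd(a, b):
    """Euclid's algorithm: repeatedly replace the pair (a, b) by
    (b, a mod b) until the remainder is zero; a is then the GCD."""
    while b != 0:
        a, b = b, a % b
    return a

print(gcd(48, 36))   # -> 12
```

That a formalised procedure from antiquity can be executed step by step, by hand or by machine, is precisely what makes it an algorithm in the modern sense.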
In the Middle Ages the Indian number system reached Europe via the Arab world (hence wrongly known as the Arabic number system) and permitted a greater systematisation in working with numbers. It allowed the representation of numbers, expressions and formulas on paper and the tabulation of mathematical functions, such as square roots or the common logarithm, as well as of trigonometry. At the time of Isaac Newton's works, paper and vellum were important resources for computational tasks and have remained so up to the present day, in which researchers such as Enrico Fermi filled page after page with mathematical calculations and Richard Feynman worked out every mathematical step by hand up to the solution, even though programmable computers already existed in his time.
Early development of calculating machines and arithmetic aids
The earliest device comparable, in rudimentary respects, with a modern computer is the abacus, a mechanical calculating aid presumably invented around 1100 B.C. in the Indo-Chinese cultural area. The abacus was used until the 17th century, when it was replaced by the first calculating machines. In some regions of the world it is still used as a calculating aid today. The abacus of Pythagoras served a similar purpose.
As early as the 1st century B.C., the first calculating machine was built in the form of the Antikythera mechanism. The device presumably served for astronomical calculations and worked with a differential gear, a technology not rediscovered until the 13th century.
With the end of antiquity, technical progress came to a standstill, and in the times of the migration of peoples much knowledge was lost (including, for example, the Antikythera mechanism, which was only rediscovered in 1902). The Middle Ages, too, hampered technical progress. From the modern era onwards, however, the engine of technical progress slowly began to turn again and has accelerated ever since, and it continues to do so to this day.
The slide rule, one of the most important mechanical calculating aids for multiplication and division
In 1614 John Napier published his table of logarithms, and in 1623 Wilhelm Schickard built the first four-function machine and thus the first mechanical calculator of the modern era, for which he is known to this day as the "father of the computer age". His construction was based on the interplay of gearwheels, drawn essentially from the realm of clockmaking, where they were already in use, which is why his machine was given the name "calculating clock". The machine was used in practice by Johannes Kepler for his astronomical calculations.
In 1642 Blaise Pascal followed with his calculating machine, the Pascaline. In 1668 Samuel Morland developed a calculating machine which, for the first time, did not add decimally but was adapted to the English monetary system. In 1673 Gottfried Wilhelm Leibniz built his first four-function machine, and in 1703 he invented the binary number system, which later became the foundation of digital computers and of the digital revolution built on them.
Mechanical computer of 1914
In 1805 Joseph-Marie Jacquard developed punched cards to control looms. In 1820 Charles Xavier Thomas de Colmar built the "Arithmometer", the first calculator to be mass-produced, which made the computer affordable for large enterprises. Charles Babbage developed the Difference Engine from 1820 to 1822 and, in 1837, the Analytical Engine, which however he could not build for lack of money. In 1843 Edvard and George Scheutz in Stockholm built the first mechanical computer based on Babbage's ideas. In the same year Ada Lovelace devised a method for programming computers according to Babbage's system and thereby wrote the first computer program. In 1890 the US census was carried out with the aid of Herman Hollerith's punched-card system. In the same year Torres y Quevedo built a chess machine that could checkmate a lone king with king and rook, and thus the first game computer.
The Macintosh of the US-American company Apple Inc., today referred to simply as the Mac, was the first microcomputer with a graphical user interface to be produced in larger quantities. To this day Apple's personal computers carry the product name Mac in combinations such as Mac mini, MacBook Air, MacBook Pro, iMac and Mac Pro.
The first Mac was the successor of the technically similar but commercially unsuccessful Apple Lisa, which cost 10,000 US dollars. The Macintosh 128K was introduced on 24 January 1984 by Apple co-founder Steve Jobs. The commercial "1984" for the Mac was shown during Super Bowl XVIII. For a price of 2,495 US dollars at the time (about 7,200 DM; the suggested retail price in Germany was 10,000 DM), one received a computer based on the Motorola 68000 CPU, clocked at 8 MHz, with access to 128 kilobytes of main memory (RAM), which quickly proved to be too little. A 3.5-inch floppy drive with 400 KB of storage and an integrated 9-inch monitor completed the first Macintosh.
Like its predecessor, the Lisa, the Macintosh was equipped with a graphical user interface and a mouse. At the time this was a completely new concept for personal computers, because all systems then on the market were operated via keyboard input on a command line. The operating system of the Macintosh originally had no name and was called simply "System" (with an appended version number). From version 7.5.1 onwards it was called Mac OS (derived from Macintosh Operating System). From the start it was designed for operation with the mouse and contained concepts that were revolutionary at the time, such as the "recycle bin", with which the deletion of files could be undone, the "desktop", drag & drop, selecting text or objects in order to change their attributes, and navigating the file system with the help of icons. Other basic concepts intended to overcome users' then widespread reluctance to use computers were the undo function and the largely uniform operation of different application programs.
Despite these innovations, the new computer initially sold only in small numbers. The reasons were seen in its high price and in the fact that, in its form and manner of use, it was far removed from what was then generally understood as a professional computer (a monitor with green characters on a black background and the input of long command lines). Only the successor models of the original Macintosh attracted a larger user base, which then reached a high overall market share, though it was numerically overtaken by Windows systems. By 2000 the Macintosh's market share had sunk to a low of between three and five percent, depending on the counting method. With the introduction of Mac OS X the market share rose continuously, reaching about 13% in the USA in 2011 and approximately 6% worldwide.
From the summer of 1994 until September 1997 the operating system Mac OS was licensed to other computer manufacturers (among others, Umax and Power Computing). The Macintosh-compatible computers resulting from these licences were called Mac clones.