Science & Technology - The Computing History

Index

Preface

The Computing History

The Turing Machine

Example: Simple Adding Machine

Visualization of this example

The Turing Test

Back to the Computing History

Epilogue

Bibliography

Science & Technology

Preface

Since this is a very broad topic, I will concentrate on computer science and computer technology.

Today's world is barely imaginable without computers: they are used practically everywhere - in nearly every enterprise, in more and more schools, and even in every other household.

So much more is possible with computers, especially since the World Wide Web was invented - and who would voluntarily do without it? Nobody, I think.

The Computing History

Back in 4000 B.C. the inhabitants of the first known civilization, in Sumer, keep records of commercial transactions on clay tablets. 1000 years later the abacus is invented in Babylonia. Around 250-230 B.C. the Sieve of Eratosthenes is used to determine prime numbers.
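As a small aside (a sketch of my own for illustration, not part of the historical record), the sieve is simple enough to write down in a few lines of Python: cross out every multiple of each prime, and whatever remains unmarked is prime.

def sieve_of_eratosthenes(limit):
    # Assume every number from 2 to limit is prime, then cross out
    # the multiples of each prime, just as Eratosthenes did.
    is_prime = [True] * (limit + 1)
    is_prime[0] = is_prime[1] = False
    for n in range(2, int(limit ** 0.5) + 1):
        if is_prime[n]:
            for multiple in range(n * n, limit + 1, n):
                is_prime[multiple] = False
    return [n for n in range(limit + 1) if is_prime[n]]

print(sieve_of_eratosthenes(30))   # [2, 3, 5, 7, 11, 13, 17, 19, 23, 29]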

About 1300 A.D. the more familiar wire-and-bead abacus replaces the Chinese calculating rods.

ill. - wire-and-bead abacus

Invention after invention follows. In the 1600s logarithms are devised, the first adding and subtracting machines are produced, and the first calculator using a stepped cylindrical gear is built.

This is the first mechanical adding machine, the 'Pascaline', named after its inventor Blaise Pascal.

ill. - Pascaline

1774 Philipp Matthäus Hahn builds and sells a small number of calculating machines precise to 12 digits. Three years later a multiplying calculator is invented.

The first typewriters, the analytical engine, the telegraph, and shortly afterwards the telephone and the first four-function calculator follow. In 1895 Guglielmo Marconi transmits a radio signal; 12 years later gramophone music constitutes the first regular radio broadcasts from New York.

In 1915 Manson Benedicks discovers that the germanium crystal can be used to convert alternating current to direct current, a discovery that foreshadows the microchip.

1924 T.J. Watson renames CTR[1] to IBM[2], which is one of the biggest computer companies today. In the picture you can see him with his 'Think' slogan.

ill. - T.J. Watson

In 1927 the first demonstration of television takes place in the United States. Two years later color television signals are successfully transmitted. Differential analyzers and very precise quartz crystal clocks are built.

Between 1930 and 1940 there are many important discoveries concerning computers, like the electric typewriter, the development of a binary circuit[3] based on Boolean algebra, the proposal for a digital calculating machine (which should be able to perform the four fundamental operations of arithmetic and operate in a predetermined sequence, following the famous 'point before line' rule that multiplication and division precede addition and subtraction) and a prototype electronic-digital computer.

This is Konrad Zuse. In 1934 he wants to build a better calculator than those currently available. Two years later he realizes that programs composed of bit combinations can be stored, and files a patent application for the automatic execution of calculations. In 1938 he completes the Z1, an electromechanical binary computer. A refined version, the Z2, follows, and in 1941 he finally builds the Z3, the first fully functional program-controlled electromechanical digital computer.

ill. - K. Zuse

1943 The construction of the ENIAC[4], the first fully electronic digital computer, begins.

1944 The Harvard Mark I, also known as the 'IBM Automatic Sequence Controlled Calculator', is produced by Howard Aiken.

1945 J. Presper Eckert and John Mauchly sign a contract to build the EDVAC[5], and Grace Murray Hopper finds the first computer 'bug', a moth that had caused a relay failure, while she is working on a prototype of the Harvard Mark II (which is finally completed in 1947 by Howard Aiken and his team).

Zuse's Z4 survives World War II and helps to launch postwar development of scientific computers.

Alan Turing is famous for his various contributions to computing, first of all for 'On computable numbers, with an application to the Entscheidungsproblem' (1937), in which he describes the concept of the Turing Machine. In 1950 he publishes an article about the criteria for the Turing Test of artificial intelligence, and much of today's research in artificial intelligence is based on his theory. Of all the authors of computing articles, Alan M. Turing is probably the most famous.

ill. - A.M. Turing

The Turing Machine

A Turing machine is a theoretical machine with a finite number of states. It consists of an infinitely long tape divided into squares on which symbols can be printed, and a head which is capable of doing three things:

  1. Read (or scan) what is on the tape, one square at a time.
  2. Write on the squares.
  3. Move left or right, depending on the instruction, one square at a time.

The tape consists of squares, each of which can hold a symbol or a blank. The tape is infinitely long but always holds only a finite string; everything after the last symbol is blank. The head scans one symbol at a time and does one of the following things: it prints a symbol, erases it or leaves it as it is, and then either moves to the neighbouring square on the left or right, or stays on the same square.

The machine is provided with a set of rules or instructions which determine the moves to be made. The head reads the current symbol, and this symbol together with the machine's current state determines which instruction is carried out.
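To make these rules concrete, here is a minimal Turing machine simulator in Python (my own sketch for illustration; the tape encoding and the rule format are assumptions, not part of any historical design). Each rule maps a (state, symbol) pair to a new state, a symbol to write, and a head movement:

def run_turing_machine(rules, tape, state, halt_state, max_steps=10000):
    # tape is a dict mapping square positions to symbols; '-' is a blank.
    # rules maps (state, symbol) to (new_state, symbol_to_write, move),
    # where move is -1 (left), +1 (right) or 0 (stay on the square).
    head = min(tape)                        # start on the leftmost symbol
    for _ in range(max_steps):
        if state == halt_state:
            return tape, head
        symbol = tape.get(head, '-')        # read the current square
        state, write, move = rules[(state, symbol)]
        tape[head] = write                  # print, erase or keep the symbol
        head += move                        # move the head (or stay)
    raise RuntimeError('machine did not halt')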

Example: Simple Adding Machine

e.g. 3 + 5 = 8

Tape for this example: - - - - - - - 1 1 1 - 1 1 1 1 1 - - - - - - - - - - - -

The 1s are the symbols and the -s are blank squares. The machine runs step by step along the tape and stops at the first symbol. It scans through the symbols until it reaches the blank between the 1s, writes a 1 into that blank, then moves on and scans through the rest of the 1s. When it reaches the blank after the row of 1s it stops, moves one square back to the left and erases the last symbol. Then it moves step by step back to the beginning of the row. Now it has added the 3 and the 5 and produced a row of 8 symbols.

Visualization of this example:

The square in [brackets] is always the square the machine is currently working on (the frames below show the final walk back to the start of the row):

- - - 1 1 1 1 1 1 [1] 1 - - - - - - - -

- - - 1 1 1 1 1 [1] 1 1 - - - - - - - -

- - - 1 1 1 1 [1] 1 1 1 - - - - - - - -

- - - 1 1 1 [1] 1 1 1 1 - - - - - - - -

- - - 1 1 [1] 1 1 1 1 1 - - - - - - - -

- - - 1 [1] 1 1 1 1 1 1 - - - - - - - -

- - - [1] 1 1 1 1 1 1 1 - - - - - - - -

- - [-] 1 1 1 1 1 1 1 1 - - - - - - - -

- - - [1] 1 1 1 1 1 1 1 - - - - - - - -
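The adding machine above can be written as a rule table for the simulator sketched in the Turing machine section (the state names and this particular encoding are my own illustrative choices):

rules = {
    ('scan',  '1'): ('scan',  '1', +1),   # walk right through the first block of 1s
    ('scan',  '-'): ('end',   '1', +1),   # fill the gap between the blocks with a 1
    ('end',   '1'): ('end',   '1', +1),   # walk right through the rest of the 1s
    ('end',   '-'): ('erase', '-', -1),   # passed the last 1; step back left
    ('erase', '1'): ('back',  '-', -1),   # erase the last 1 of the row
    ('back',  '1'): ('back',  '1', -1),   # walk left back to the start of the row
    ('back',  '-'): ('halt',  '-', +1),   # left end reached; halt on the first 1
}

tape = {i: '1' for i in range(3)}            # 1 1 1  (the 3)
tape.update({i: '1' for i in range(4, 9)})   # a blank, then 1 1 1 1 1  (the 5)
final, head = run_turing_machine(rules, tape, state='scan', halt_state='halt')
print(''.join(final.get(i, '-') for i in range(10)))   # 11111111-- : a row of 8 ones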




The Turing Test (for artificial intelligence)

A human judge communicates with a machine and with a human. He may ask them any questions he likes. At the end of the experiment the judge has to decide which of the two is the human and which is the machine. If the machine can no longer be distinguished from the human, it can be considered intelligent.

Back to the Computing History

We are in 1947, shortly after World War II. The magnetic drum memory is introduced as a data storage device for computers, the first transistor is developed, Richard Hamming devises a way to find and correct errors in blocks of data, the SSEC[6] is built, and many different computers are developed in the following years.

In 1949 the first high-level programming language, called 'Short Order Code', is developed by John Mauchly. 1951 Jay Forrester files a patent application for the matrix core memory, Grace Murray Hopper (a woman!!) develops the first compiler[7] (called A-0) and the first Univac[8] I is delivered to the US Census Bureau.

ill. - Univac I

1952 A Univac I predicts the outcome of the presidential election on television and raises public awareness of computers. In the same year Thomas Watson Jr. becomes president of IBM.

1953 The IBM 650 debuts and becomes the first mass-produced computer.

ill. - G.M. Hopper

Again many different computers are built, and IBM introduces and begins to install the RAMAC[9] for hard disk data storage.

ill. - T. Watson    ill. - J. McCarthy    ill. - RAMAC

Several different transistors are developed, a new compiler is devised, and John McCarthy founds MIT's[10] Artificial Intelligence Department in 1957. Two years later he develops Lisp[11] for artificial intelligence applications.

1958 Bell's development of the modem data phone enables telephones to transmit binary data.

In the same year Jack Kilby develops a prototype semiconductor IC[12] while Robert Noyce works separately on ICs at Fairchild Semiconductor Corp.

ill. - modem data phone

In 1959 Japan's first commercial transistor computer, the NEAC 2201, is demonstrated at an exhibition and Xerox[13] introduces the first commercial copy machine. Robert Noyce and Gordon Moore file a patent application for integrated circuit technology on behalf of the Fairchild Semiconductor Corp., while Jack Kilby at Texas Instruments designs a flip-flop IC. UNESCO[14] sponsors the first major computer conference.

1960 The first commercial computer with a monitor and keyboard input is introduced by DEC[15], a computer called the 'Perceptron' that can learn by trial and error through a neural network is built, and the standards for a new programming language, 'Algol 60', are established.

1961 George C. Devol patents a robotic device, which is soon marketed as the first industrial robot, and IBM's 7030 computer is completed; it runs about 30 times faster than its predecessor.

1962 The first video game is invented by Steve Russell, the Telstar communications satellite is launched and relays the first transatlantic television pictures, and the world's most powerful computer, named 'Atlas', is introduced.

1963 Joseph Weizenbaum develops a 'mechanical psychiatrist' called Eliza that appears to possess intelligence (based on Alan Turing's idea), and the American National Standards Institute accepts the ASCII[16] 7-bit code for information interchange.

1964 IBM announces the 'third generation' of computers, Basic[17] is developed by John Kemeny, and Doug Engelbart invents the mouse.


ill. - ASCII        

1965 DEC debuts the first minicomputer, project MAC leads to the Multics operating system, and at the University of Belgrade Rajko Tomović makes one of the earliest attempts to develop an artificial limb with a sense of touch.

ill. - the first mouse    ill. - artificial limb

1967-69 New programming languages are developed, a handheld four-function calculator is built, the first computers with integrated circuits are introduced, a Federal Information Processing Standard encourages use of the six-digit date format (YYMMDD) for information interchange, sowing the seeds of the 'Year 2000 Crisis', the Rand Corp. presents a decentralized communications network concept to ARPA[18], Intel[19] is founded, and Bell Labs begins to develop the Unix operating system.

1970 The floppy disk has its debut!!! It is the only data storage device of that era that has survived until today, even though it can only hold 1.44 MB of data. Even today floppy disks are sometimes very important (for example, if your PC won't start anymore and you have a little floppy boot disk, it can perhaps rescue all your data!).

1971-75 Ray Tomlinson sends the first network e-mail message, the Intel 4004 microprocessor - a 'computer on a chip' - is developed, shortly afterwards the Intel 8008 - the first 8-bit microprocessor - and not much later the Intel 8080 follow. Handheld calculators become popular, Nolan Bushnell founds Atari because his video game is so successful, many programming languages are developed (Modula-2, C[20], Smalltalk, Prolog), Alan Kay develops a forerunner of the PC - his 'office computer', based on Smalltalk, employs icons, graphics and a mouse - and Ethernet is invented.

ill. - C


Charles Simonyi at Xerox writes the first WYSIWYG[21] application, IBM introduces the laser printer, and the first PC, the Altair 8800, appears on a magazine cover.

ill. - Altair

1976-79 The Cray-1 is the first supercomputer with a vector architecture, IBM develops the ink-jet printer, Steve Jobs and Steve Wozniak design and build the Apple I and a year later incorporate Apple Computer. The Apple II in 1977 establishes the benchmark for personal computers, Bill Gates and Paul Allen found Microsoft, PCs from Tandy and Commodore come with built-in monitors (and thus require no television hookup), Wordstar[22] is introduced, the first electronic spreadsheet program is unveiled, Intel's first 16-bit processor - the 8086 - debuts, Benoit Mandelbrot continues his research into fractals by generating a Mandelbrot set, digital videodisks appear thanks to Sony and Philips, and the first cellular phones are tested in Japan and Chicago.

ill. - Mandelbrot set

1980-83 IBM selects PC-DOS[23] from upstart Microsoft as the operating system for its new PC, the Osborne 1 'portable' computer weighs 24 pounds and is the size of a small suitcase, Columbia Data and Compaq produce the first IBM PC 'clones', the first version of AutoCAD[24] is shipped, the Cray X-MP (two linked Cray-1 computers) proves three times faster than a Cray-1, commercial e-mail service begins among 25 cities, Lotus 1-2-3 includes graphics such as pie charts and bar graphs, the completion of the TCP/IP[25] switchover marks the creation of the global internet, and C++ (an extension of C) is developed.

ill. - Osborne 1

1984/85 The CD-ROM is invented and provides much greater storage capacity for digital data, MIDI standards are developed to interface computers and digital music synthesizers, Apple gives computer graphics a boost with its MacPaint program, better transistors and chips are manufactured, the motion picture The Last Starfighter uses extensive supercomputer-generated graphics, supercomputer speeds reach 1 billion operations per second with the release of the Cray-2, Microsoft brings Macintosh-like features to DOS-compatible computers with Windows 1.0, Intel introduces the 80386 chip with 32-bit processing, Paul Brainerd's PageMaker becomes the first widely used PC desktop publishing program, and Tony Kyogo builds a robot - the Omnibot 2000 - which can move, talk and carry objects.

ill. - Omnibot 2000

1986-90 Steve Jobs' NeXT computer debuts but attracts too few buyers to compete in the market, Tim Berners-Lee proposes the World Wide Web project to CERN[26], Intel's 80486 chip with 1.2 million transistors is introduced, Seymour Cray founds the Cray Computer Corp., Microsoft introduces Windows 3.0, and Tim Berners-Lee writes the initial prototype for the World Wide Web, which uses his other creations: URLs[27], HTML[28] and HTTP[29].

ill. - T. Berners-Lee

1991-95 The IBM, Motorola and Apple PowerPC alliance is announced, the Michelangelo virus generates concern but causes very little damage, Apple introduces the first personal digital assistant, the 'Newton', Intel's Pentium is introduced, students and staff at the University of Illinois' NCSA[30] create a graphical user interface for Internet navigation called NCSA Mosaic, and Jim Clark and Marc Andreessen found Netscape Communications (originally Mosaic Communications); their first browser becomes available in 1994. The Java programming language is unveiled and enables platform-independent application development, and Windows 95 is launched on August 24 with great fanfare.

1996-2001 Operating systems: Windows 98 and Windows NT follow. NT officially stands for 'New Technology', but at first it is mocked as Windows 'nice try'. It is much more stable than the other versions of Windows, and its successor Windows 2000 is even more stable, but Windows 98 is still the most widespread operating system. There is also Linux, a free operating system created by Linus Torvalds; it is developed further by many programmers because it is open source (everyone can get the source code and write their own features). Last but not least there is still the Apple MacOS, which is highly popular with graphic designers, but all in all it has only about 5% market share.

ill. - Tux: Linux Mascot

Microprocessors: The Intel Pentium MMX[31] generation is launched, shortly afterwards the Intel Pentium II, then the Intel Pentium III, and by now even the Intel Pentium 4 is being manufactured.

Besides Intel there is only one other microprocessor producer that can compete: AMD[32]. The AMD K6 is the first processor to be taken seriously as a competitor to the Pentium generation. With its K7 and finally the Thunderbird Athlon, AMD is the leader for a short time, but now Intel has taken the lead again with the Pentium 4. My personal favorite is the AMD Athlon.

Cellular phones and laptops are made ever smaller (some cell phones are the size of a matchbox today), handhelds are developed further, robots with artificial intelligence are built, and more and more computer peripherals, like scanners, webcams, microphones, cordless mice and touchscreens, are developed.

ill. - Athlon 1 GHz

The World Wide Web has spread very fast, so that now nearly everyone who owns a computer also has internet access. It is even possible to hold video conferences over the internet today. Much more has been achieved with computers in recent years, for example genetic fingerprinting and many other research results.

Computer development is the fastest development of our time - when you buy a computer it is already out of date within a few months, so if you always want to be up to date you need a lot of money.

Epilogue

It is no mystery that I am very interested in computers, so I think it is pretty obvious why I chose exactly this topic: simple interest. Well, I had not expected it to be so hard to find all the material and then write it up, but when I look at it now, I have to admit that I am proud of my work - and I am also relieved that it is over.

Bibliography

  1. Internet pages:

https://www.babylon.com

https://www.watch.impress.co.jp/akiba

https://www.turing.org.uk/turing

https://aleph0.clarku.edu/~anil

https://www.computer.org

https://www.knowledgemedia.org/netacademy/glossary.nsf/kw_id_all/868

https://www.abelard.org/turpap2/turpap2.htm

  2. Books: none


Footnotes

[1] Calculating, Tabulating and Recording Co., founded in 1911.

[2] International Business Machines Corp.

[3] Consists only of rows of 1 and 0, like 1 1 0 0 0 1 for 49. It is very simple to convert, and every computer today is based on this binary circuit.

[4] Electronic Numerical Integrator And Computer.

[5] Electronic Discrete Variable Automatic Computer: the first stored-program digital computer.

[6] Selective Sequence Electronic Calculator.

[7] A program that converts the source code of a program written in a high-level programming language into binary code, so that the computer can understand it.

[8] Universal Automatic Computer: the first large computer built in series.

[9] Random-Access Method of Accounting and Control; a better data storage device.

[10] Massachusetts Institute of Technology.

[11] List Processing - a programming language.

[12] Integrated Circuit, also known as a chip.

[13] Xerox Corporation is a leader in today's office technology, particularly copiers.

[14] United Nations Educational, Scientific and Cultural Organization: agency of the United Nations Organisation (UNO) that promotes international cooperation, specifically in the fields of education, science and culture.

[15] Digital Equipment Corp., founded in 1957 in Maynard, Massachusetts.

[16] American Standard Code for Information Interchange: standard character code for letters and symbols, e.g. 65 is the ASCII code for the capital A, 97 the one for the lower case a.

[17] Beginner's All-Purpose Symbolic Instruction Code, a programming language.

[18] Advanced Research Projects Agency.

[19] One of today's leading chip-producing corporations.

[20] C and C++ are the most widely used programming languages in today's programs.

[21] What You See Is What You Get: an application in which a document appears on screen exactly as it will appear when printed.

[22] A word processing program.

[23] Personal Computer - Disk Operating System.

[24] Computer Aided Design: a method of creating designs and blueprints using a computer, mostly used to create 3D (motion) pictures today.

[25] Transmission Control Protocol / Internet Protocol.

[26] European Council for Nuclear Research, located near Geneva, Switzerland.

[27] Uniform Resource Locator: the address of a certain file or directory on the internet, like https://www.yourname.com.

[28] Hyper Text Markup Language: the basic language used to write internet pages.

[29] Hyper Text Transfer Protocol: the protocol used to transfer data over the World Wide Web.

[30] National Center for Supercomputing Applications.

[31] Multi Media eXtensions.

[32] Advanced Micro Devices: in business since 1969, but only a serious competitor since 1998.


