Computer Usage and Programming - Programming and Languages


Chapter 1. A Brief History of the Development of Computing Technology

Many discoveries and inventions have directly and indirectly contributed to the development of the personal computer as we know it today.

The first computers of any kind were simple calculators. Even these evolved from mechanical devices to electronic digital devices.

1.1 Historical Milestones

The following is a timeline of some significant events in computer history. It is not meant to be complete, just a representation of some of the major landmarks in computer development:

1617

John Napier creates 'Napier's Bones,' wooden or ivory rods used for calculating.

1642

Blaise Pascal introduces the Pascaline digital adding machine.

1822

Charles Babbage conceives the Difference Engine and later the Analytical Engine, a true general-purpose computing machine.

1906

Lee De Forest patents the vacuum tube triode, used as an electronic switch in the first electronic computers.

1936

Alan Turing publishes 'On Computable Numbers,' a paper in which he conceives an imaginary computer called the Turing Machine, considered one of the foundations of modern computing. Turing later worked on breaking the German Enigma code.

1937

John V. Atanasoff begins work on the Atanasoff-Berry Computer (ABC), which would later be officially credited as the first electronic computer.

1943

Thomas (Tommy) Flowers develops the Colossus, a secret British code-breaking computer designed to decode secret messages encrypted by the German Lorenz cipher machines.

1945

John von Neumann writes 'First Draft of a Report on the EDVAC,' in which he outlines the architecture of the modern stored-program computer.

1946

The ENIAC, an electronic computing machine built by John Mauchly and J. Presper Eckert, is introduced.

1947

On December 23, William Shockley, Walter Brattain, and John Bardeen successfully test the point-contact transistor, setting off the semiconductor revolution.

1949

Maurice Wilkes assembles the EDSAC, the first practical stored-program computer, at Cambridge University.

1950

Engineering Research Associates of Minneapolis builds the ERA 1101, one of the first commercially produced computers.

1952

The UNIVAC I delivered to the U.S. Census Bureau is the first commercial computer to attract widespread public attention.

1953

IBM ships its first electronic computer, the 701.

1954

The IBM 650 magnetic drum calculator establishes itself as the first mass-produced computer, with the company selling 450 in one year.

1955

Bell Laboratories announces the first fully transistorized computer, TRADIC.

1956

MIT researchers build the TX-0, the first general-purpose, programmable computer built with transistors.

1956

The era of magnetic disk storage dawns with IBM's shipment of a 305 RAMAC to Zellerbach Paper in San Francisco.

1958

Jack Kilby creates the first integrated circuit at Texas Instruments to prove that resistors and capacitors can exist on the same piece of semiconductor material.

1959

IBM's 7000 series mainframes are the company's first transistorized computers.

1959

Robert Noyce's practical integrated circuit, invented at Fairchild Camera and Instrument Corp., allows printing of conducting channels directly on the silicon surface.

1960

The precursor to the minicomputer, DEC's PDP-1, sells for $120,000.

1961

According to Datamation magazine, IBM has an 81.2% share of the computer market in 1961, the year in which it introduces the 1400 Series.

1964

CDC's 6600 supercomputer, designed by Seymour Cray, performs up to three million instructions per second, a processing speed three times faster than that of its closest competitor, the IBM Stretch.

1964

IBM announces System/360, a family of six mutually compatible computers and 40 peripherals that can work together.

1965

Digital Equipment Corp. introduces the PDP-8, the first commercially successful minicomputer.

1966

Hewlett-Packard enters the general-purpose computer business with its HP-2115 for computation, offering computational power formerly found only in much larger computers.

1969

The root of what is to become the Internet begins when the Department of Defense establishes four nodes on the ARPAnet: two at University of California campuses (one at Santa Barbara and one at Los Angeles) and one each at SRI International and the University of Utah.

1971

A team at IBM's San Jose Laboratories invents the 8'' floppy disk.

1971

The first advertisement for a microprocessor, the Intel 4004, appears in Electronic News.

1971

The Kenbak-1, one of the first personal computers, is advertised for $750 in Scientific American.

1972

Hewlett-Packard announces the HP-35 as 'a fast, extremely accurate electronic slide rule' with a solid-state memory similar to that of a computer.

1972

Intel's 8008 microprocessor makes its debut.

1973

The Micral is the earliest commercial, non-kit personal computer based on a microprocessor, the Intel 8008.

1974

Researchers at the Xerox Palo Alto Research Center design the Alto, the first workstation with a built-in mouse for input.

1975

The January edition of Popular Electronics features the Altair 8800, which is based on Intel's 8080 microprocessor, on its cover.

1975

The visual display module (VDM) prototype, designed by Lee Felsenstein, marks the first implementation of a memory-mapped alphanumeric video display for personal computers.

1976

Steve Wozniak designs the Apple I, a single-board computer.

1976

The 5 1/4'' flexible disk drive and disk are introduced by Shugart Associates.

1976

The Cray-1 makes its name as the first commercially successful vector processor.

1977

Apple Computer introduces the Apple II.

1977

Commodore introduces the PET (Personal Electronic Transactor).

1978

The VAX 11/780 from Digital Equipment Corp. features the capability to address up to 4.3GB of virtual memory, providing hundreds of times the capacity of most minicomputers.

1979

Motorola introduces the 68000 microprocessor.

1980

Seagate Technology creates the first hard disk drive for microcomputers, the ST-506.

1980

The first optical data storage disk has 60 times the capacity of a 5 1/4'' floppy disk.

1981

Xerox introduces the Star, the first personal computer with a graphical user interface (GUI).

1981

Adam Osborne completes the first portable computer, the Osborne I, which weighs 24 lbs. (about 11 kg) and costs $1,795.

1981

IBM introduces its PC, igniting fast growth of the personal computer market. The IBM PC is the grandfather of all modern PCs.

1981

Sony introduces and ships the first 3 1/2'' floppy drives and disks.

1981

Philips and Sony introduce the CD-DA (Compact Disc Digital Audio) drive. Sony is the first with a CD player on the market.

1983

Apple introduces its Lisa, which incorporates a GUI that's very similar to the one first introduced on the Xerox Star.

1983

Compaq Computer Corp. introduces its first PC clone that uses the same software as the IBM PC.

1984

Apple Computer launches the Macintosh, the first successful mouse-driven computer with a GUI, with a single $1.5 million commercial during the 1984 Super Bowl.

1984

IBM releases the PC-AT (PC Advanced Technology), three times faster than original PCs and based on the Intel 286 chip. The AT introduces the 16-bit ISA bus and is the computer on which all modern PCs are based.

1985

Philips introduces the first CD-ROM drive.

1986

Compaq announces the Deskpro 386, the first computer on the market to use what was then Intel's new 386 chip.

1987

IBM introduces its PS/2 machines, which make the 3 1/2'' floppy disk drive and VGA video standard for PCs. The PS/2 also introduces the MicroChannel Architecture (MCA) bus, the first plug-and-play bus for PCs.

1988

Apple cofounder Steve Jobs, who left Apple to form his own company, unveils the NeXT.

1988

Compaq and other PC-clone makers develop Enhanced Industry Standard Architecture (EISA), which unlike MicroChannel retains backward compatibility with the existing ISA bus.

1989

Intel releases the 486 (P4) microprocessor, which contains more than one million transistors. Intel also introduces 486 motherboard chipsets.

1990

The World Wide Web (WWW) is born when Tim Berners-Lee, a researcher at CERN, the high-energy physics laboratory in Geneva, develops Hypertext Markup Language (HTML).

1993

Intel releases the Pentium (P5) processor. Intel shifts from numbers to names for its chips after it learns it's impossible to trademark a number. Intel also releases motherboard chipsets and, for the first time, complete motherboards as well.

1995

Intel releases the Pentium Pro processor, the first in the P6 processor family.

1995

Microsoft releases Windows 95, the first mainstream 32-bit operating system, in a huge rollout.

1997

Intel releases the Pentium II processor, essentially a Pentium Pro with MMX instructions added.

1997

AMD introduces the K6, which is compatible with the Intel P5 (Pentium).

1998

Microsoft releases Windows 98.

1998

Intel releases the Celeron, a low-cost version of the Pentium II processor. Initial versions have no cache, but within a few months Intel introduces versions with a smaller but faster L2 cache.

1999

Intel releases the Pentium III, essentially a Pentium II with SSE (Streaming SIMD Extensions) added. (SIMD = Single Instruction Multiple Data)

1999

AMD introduces the Athlon.

2000

Microsoft releases Windows Me (Millennium Edition) and Windows 2000.

2000

Both Intel and AMD introduce processors running at 1GHz.

2000

AMD introduces the Duron, a low-cost Athlon with reduced L2 cache.

2000

Intel introduces the Pentium 4, the latest processor in the Intel Architecture 32-bit (IA-32) family.

2001

Intel releases the Itanium processor, its first 64-bit (IA-64) processor for PCs.

2001

Intel introduces the first 2GHz processor, a version of the Pentium 4. It took the industry 28 1/2 years to go from 108KHz to 1GHz, but only 18 months to go from 1GHz to 2GHz.

2001

Microsoft releases Windows XP Home and Professional, for the first time merging the consumer (9x/Me) and business (NT/2000) operating system lines under the same code base (an extension of Windows 2000).

2002

Intel releases the first 3GHz-class processor, a 3.06GHz version of the Pentium 4. This processor also introduces Intel's Hyper-Threading (HT) technology (which enables a single processor to work with two application threads at the same time) to desktop computing.

2003

AMD releases the Athlon 64, the first 64-bit processor targeted at the mainstream consumer and business markets.

2006

Intel Dual Core, 1024 KB, bus 800, LGA775

AMD Athlon 64 X2, 2x512 KB, bus 2000, Socket AM2

2007

Intel Core 2 Quad, 12288 KB, bus 1333, LGA775

AMD Phenom, Triple Core, Quad Core, 4096 KB, bus 3600, Socket AM2

1.2 Stages in the Development of Computing Technology. Computer Generations

1.2.1 Generation Zero: Mechanical Calculating Machines (1642-1945)

Prior to the 1500s, a typical European businessperson used an abacus for calculations and recorded the result of his ciphering in Roman numerals. After the decimal numbering system finally replaced Roman numerals, a number of people invented devices to make decimal calculations even faster and more accurate.

Wilhelm Schickard (1592-1635) has been credited with the invention of the first mechanical calculator, the Calculating Clock (exact date unknown). This device was able to add and subtract numbers containing as many as six digits.

In 1642, Blaise Pascal (1623-1662) developed a mechanical calculator called the Pascaline to help his father with his tax work. The Pascaline could do addition with carry and subtraction. It was probably the first mechanical adding device actually used for a practical purpose. In fact, the Pascaline was so well conceived that its basic design was still being used at the beginning of the twentieth century, as evidenced by the Lightning Portable Adder in 1908, and the Addometer in 1920.

Gottfried Wilhelm von Leibniz (1646-1716), a noted mathematician, invented a calculator known as the Stepped Reckoner that could add, subtract, multiply, and divide. None of these devices could be programmed or had memory. They required manual intervention throughout each step of their calculations.

Although machines like the Pascaline were used into the twentieth century, new calculator designs began to emerge in the nineteenth century. One of the most ambitious of these new designs was the Difference Engine by Charles Babbage (1791-1871). Some people refer to Babbage as "the father of computing." Babbage built his Difference Engine in 1822. The Difference Engine got its name because it used a calculating technique called the method of differences. The machine was designed to mechanize the solution of polynomial functions and was actually a calculator, not a computer.

Babbage also designed a general-purpose machine in 1833 called the Analytical Engine. Although Babbage died before he could build it, the Analytical Engine was designed to be more versatile than his earlier Difference Engine. The Analytical Engine would have been capable of performing any mathematical operation. The Analytical Engine included many of the components associated with modern computers: an arithmetic processing unit to perform calculations (Babbage referred to this as the mill), a memory (the store), and input and output devices. Babbage also included a conditional branching operation where the next instruction to be performed was determined by the result of the previous operation.
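As an aside (not part of the original text), the method of differences mentioned above can be made concrete with a short C sketch. Assuming an arbitrary example polynomial p(x) = 2x^2 + 3x + 5, its second finite difference is constant, so the whole table of values can be produced with additions alone, which is exactly what the Difference Engine mechanized:

#include <stdio.h>

/* Tabulate p(x) = 2x^2 + 3x + 5 for x = 0..9 using only additions:
 * the second difference of a degree-2 polynomial is constant, so each
 * new value is obtained by adding running differences instead of
 * multiplying. (Illustrative sketch; the polynomial is arbitrary.) */
int main(void)
{
    long p  = 5;   /* p(0)                                   */
    long d1 = 5;   /* first difference, p(1) - p(0) = 5      */
    long d2 = 4;   /* second difference, constant = 2*2 = 4  */

    for (int x = 0; x < 10; x++) {
        printf("p(%d) = %ld\n", x, p);
        p  += d1;  /* next tabulated value    */
        d1 += d2;  /* next first difference   */
    }
    return 0;
}

Compiled with any C compiler, the program prints p(0) through p(9) without a single multiplication, mirroring the add-only operation of the engine.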

Ada, Countess of Lovelace and daughter of poet Lord Byron, suggested that Babbage write a plan for how the machine would calculate numbers. This is regarded as the first computer program, and Ada is considered to be the first computer programmer.

A perennial problem facing machine designers has been how to get data into the machine. Babbage designed the Analytical Engine to use a type of punched card for input and programming. Using cards to control the behavior of a machine did not originate with Babbage, but with one of his friends, Joseph-Marie Jacquard (1752-1834). In 1801, Jacquard invented a programmable weaving loom that could produce intricate patterns in cloth. Jacquard gave Babbage a tapestry that had been woven on this loom using more than 10,000 punched cards.

To Babbage, it seemed only natural that if a loom could be controlled by cards, then his Analytical Engine could be as well. Ada expressed her delight with this idea, writing, "[T]he Analytical Engine weaves algebraical patterns just as the Jacquard loom weaves flowers and leaves."

The punched card proved to be the most enduring means of providing input to a computer system. Keyed data input had to wait until fundamental changes were made in how calculating machines were constructed. In the latter half of the nineteenth century, most machines used wheeled mechanisms, which were difficult to integrate with early keyboards because they were levered devices. But levered devices could easily punch cards and wheeled devices could easily read them. So a number of devices were invented to encode and then "tabulate" card-punched data.

The most important of the late-nineteenth-century tabulating machines was the one invented by Herman Hollerith (1860-1929). Hollerith's machine was used for encoding and compiling 1890 census data. This census was completed in record time, thus boosting Hollerith's finances and the reputation of his invention. Hollerith later founded the company that would become IBM. His 80-column punched card, the Hollerith card, was a staple of automated data processing for over 50 years.

1.2.2 The First Generation: Vacuum Tube Computers (1945-1953)

Although Babbage is often called the "father of computing," his machines were mechanical, not electrical or electronic.

In the 1930s, Konrad Zuse (1910-1995) picked up where Babbage left off, adding electrical technology and other improvements to Babbage's design. Zuse's computer, the Z1, used electromechanical relays instead of Babbage's hand-cranked gears. The Z1 was programmable and had a memory, an arithmetic unit, and a control unit. Because money and resources were scarce in wartime Germany, Zuse used discarded movie film instead of punched cards for input. Although his machine was designed to use vacuum tubes, Zuse, who was building his machine on his own, could not afford the tubes. Thus, the Z1 correctly belongs in the first generation, although it had no tubes. Zuse built the Z1 in his parents' Berlin living room while Germany was at war with most of Europe. Fortunately, he couldn't convince the Nazis to buy his machine. They did not realize the tactical advantage such a device would give them. Allied bombs destroyed all three of Zuse's first systems, the Z1, Z2, and Z3. Zuse's impressive machines could not be refined until after the war and ended up being another "evolutionary dead end" in the history of computers.

Digital computers, as we know them today, are the outcome of work done by a number of people in the 1930s and 1940s. Pascal's basic mechanical calculator was designed and modified simultaneously by many people; the same can be said of the modern electronic computer.

Notwithstanding the continual arguments about who was first with what, three people clearly stand out as the inventors of modern computers: John Atanasoff, John Mauchly, and J. Presper Eckert.

John Atanasoff (1903-1995) has been credited with the construction of the first completely electronic computer. The Atanasoff Berry Computer (ABC) was a binary machine built from vacuum tubes. Because this system was built specifically to solve systems of linear equations, we cannot call it a general-purpose computer. There were, however, some features that the ABC had in common with the general-purpose ENIAC (Electronic Numerical Integrator and Computer), which was invented a few years later. These common features caused considerable controversy as to who should be given the credit (and patent rights) for the invention of the electronic digital computer.

John Mauchly (1907-1980) and J. Presper Eckert (1919-1995) were the two principal inventors of the ENIAC, introduced to the public in 1946. The ENIAC is recognized as the first all-electronic, general-purpose digital computer. This machine used 17,468 vacuum tubes, occupied 1,800 square feet of floor space, weighed 30 tons, and consumed 174 kilowatts of power. The ENIAC had a memory capacity of about 1,000 information bits (about 20 10-digit decimal numbers) and used punched cards to store data. John Mauchly's vision for an electronic calculating machine was born from his lifelong interest in predicting the weather mathematically. While a professor of physics at Ursinus College near Philadelphia, Mauchly engaged dozens of adding machines and student operators to crunch mounds of data that he believed would reveal mathematical relationships behind weather patterns.

1.2.3 The Second Generation: Transistorized Computers (1954-1965)

The vacuum tube technology of the first generation was not very dependable. In fact, some ENIAC detractors believed that the system would never run because the tubes would burn out faster than they could be replaced. Although system reliability wasn't as bad as the doomsayers predicted, vacuum tube systems often experienced more downtime than uptime.

In 1948, three researchers with Bell Laboratories-John Bardeen, Walter Brattain, and William Shockley-invented the transistor. This new technology not only revolutionized devices such as televisions and radios, but also pushed the computer industry into a new generation. Because transistors consume less power than vacuum tubes, are smaller, and work more reliably, the circuitry in computers consequently became smaller and more reliable. Despite using transistors, computers of this generation were still bulky and quite costly. Typically only universities, governments, and large businesses could justify the expense. Nevertheless, a plethora of computer makers emerged in this generation; IBM, Digital Equipment Corporation (DEC), and Univac (now Unisys) dominated the industry. IBM marketed the 7094 for scientific applications and the 1401 for business applications. DEC was busy manufacturing the PDP-1. A company founded (but soon sold) by Mauchly and Eckert built the Univac systems. The most successful Unisys systems of this generation belonged to its 1100 series. Another company, Control Data Corporation (CDC), under the supervision of Seymour Cray, built the CDC 6600, the world's first supercomputer. The $10 million CDC 6600 could perform 10 million instructions per second, used 60-bit words, and had an astounding 128 kilowords of main memory.

1.2.4 The Third Generation: Integrated Circuit Computers (1965-1980)

The real explosion in computer use came with the integrated circuit generation. Jack Kilby invented the integrated circuit (IC) or microchip, made of germanium.

Six months later, Robert Noyce (who had also been working on integrated circuit design) created a similar device using silicon instead of germanium. This is the silicon chip upon which the computer industry was built. Early ICs allowed dozens of transistors to exist on a single silicon chip that was smaller than a single "discrete component" transistor. Computers became faster, smaller, and cheaper, bringing huge gains in processing power.

The IBM System/360 family of computers was among the first commercially available systems to be built entirely of solid-state components. The 360 product line was also IBM's first offering where all of the machines in the family were compatible, meaning they all used the same assembly language. Users of smaller machines could upgrade to larger systems without rewriting all of their software. This was a revolutionary new concept at the time.

The IC generation also saw the introduction of time-sharing and multiprogramming (the ability for more than one person to use the computer at a time).

Multiprogramming, in turn, necessitated the introduction of new operating systems for these computers. Time-sharing minicomputers such as DEC's PDP-8 and PDP-11 made computing affordable to smaller businesses and more universities.

IC technology also allowed for the development of more powerful supercomputers.

Seymour Cray took what he had learned while building the CDC 6600 and started his own company, the Cray Research Corporation. This company produced a number of supercomputers, starting with the $8.8 million Cray-1, in 1976. The Cray-1, in stark contrast to the CDC 6600, could execute over 160 million instructions per second and could support 8 megabytes of memory.

1.2.5 The Fourth Generation: VLSI Computers (1980-????)

In the third generation of electronic evolution, multiple transistors were integrated onto one chip. As manufacturing techniques and chip technologies advanced, increasing numbers of transistors were packed onto one chip.

There are now various levels of integration: SSI (small scale integration), in which there are 10 to 100 components per chip; MSI (medium scale integration), in which there are 100 to 1,000 components per chip; LSI (large scale integration), in which there are 1,000 to 10,000 components per chip; and finally, VLSI (very large scale integration), in which there are more than 10,000 components per chip. This last level, VLSI, marks the beginning of the fourth generation of computers.
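Restated as a small, purely illustrative C sketch (the helper name and the sample counts are invented for this example), the classification above maps a component count per chip to its integration level:

#include <stdio.h>

/* Illustrative helper mapping a component count per chip to the
 * integration level defined in the text above. */
static const char *integration_level(long components)
{
    if (components <= 100)    return "SSI";   /* 10 to 100 components        */
    if (components <= 1000)   return "MSI";   /* 100 to 1,000 components     */
    if (components <= 10000)  return "LSI";   /* 1,000 to 10,000 components  */
    return "VLSI";                            /* more than 10,000 components */
}

int main(void)
{
    long samples[] = { 50, 500, 5000, 50000 };
    for (int i = 0; i < (int)(sizeof samples / sizeof samples[0]); i++)
        printf("%6ld components -> %s\n", samples[i], integration_level(samples[i]));
    return 0;
}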

To give some perspective to these numbers, consider the ENIAC-on-a-chip project. In 1997, to commemorate the fiftieth anniversary of its first public demonstration, a group of students at the University of Pennsylvania constructed a single-chip equivalent of the ENIAC. The 1,800 square-foot, 30-ton beast that devoured 174 kilowatts of power the minute it was turned on had been reproduced on a chip the size of a thumbnail. This chip contained approximately 174,569 transistors, an order of magnitude fewer than the number of components typically placed on the same amount of silicon in the late 1990s.

VLSI allowed Intel, in 1971, to create the world's first microprocessor, the 4004, which was a fully functional, 4-bit system that ran at 108KHz. Intel also introduced the random access memory (RAM) chip, accommodating four kilobits of memory on a single chip. This allowed computers of the fourth generation to become smaller and faster than their solid-state predecessors.

VLSI technology, and its incredible shrinking circuits, spawned the development of microcomputers. These systems were small enough and inexpensive enough to make computers available and affordable to the general public. The premier microcomputer was the Altair 8800, released in 1975 by Micro Instrumentation and Telemetry Systems (MITS). The Altair 8800 was soon followed by the Apple I and Apple II, and Commodore's PET and Vic 20. Finally, in 1981, IBM introduced its PC (Personal Computer).

Today, the average desktop computer has many times the computational power of the mainframes of the 1960s. Since the 1960s, mainframe computers have seen stunning improvements in price-performance ratios owing to VLSI technology. Although the IBM System/360 was an entirely solid-state system, it was still a water-cooled, power-gobbling behemoth. It could perform only about 50,000 instructions per second and supported only 16 megabytes of memory (while usually having kilobytes of physical memory installed). These systems were so costly that only the largest businesses and universities could afford to own or lease one. Today's mainframes, now called "enterprise servers," are still priced in the millions of dollars, but their processing capabilities have grown several thousand times over, passing the billion-instructions-per-second mark in the late 1990s. These systems, often used as Web servers, routinely support hundreds of thousands of transactions per minute!

The processing power brought by VLSI to supercomputers defies comprehension. The first supercomputer, the CDC 6600, could perform 10 million instructions per second, and had 128 kilobytes of main memory. By contrast, supercomputers of today contain thousands of processors, can address terabytes of memory, and will soon be able to perform a quadrillion instructions per second.

1.2.6 The Fifth Generation

What technology will mark the beginning of the fifth generation? Some say that the fifth generation will mark the acceptance of parallel processing and the use of networks and single-user workstations. Many people believe we have already crossed into this generation. Some people characterize the fifth generation as being the generation of neural network, DNA, or optical computing systems.

It's possible that we won't be able to define the fifth generation until we have advanced into the sixth or seventh generation, and whatever those eras will bring.

1.2.7 Moore's Law

So where does it end? How small can we make transistors? How densely can we pack chips? No one can say for sure. Every year, scientists continue to thwart prognosticators' attempts to define the limits of integration. In fact, more than one skeptic raised an eyebrow when, in 1965, Intel founder Gordon Moore stated, "The density of transistors in an integrated circuit will double every year."

The current version of this prediction is usually conveyed as "the density of silicon chips doubles every 18 months." This assertion has become known as Moore's Law. Moore intended this postulate to hold for only 10 years. However, advances in chip manufacturing processes have allowed this law to hold for almost 40 years (and many believe it will continue to hold well into the 2010s).
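The growth implied by an 18-month doubling period can be computed directly; the back-of-the-envelope C sketch below assumes a starting density of 2,300 transistors (roughly the Intel 4004) and a 30-year span, both illustrative assumptions rather than figures from the text:

#include <stdio.h>
#include <math.h>

/* Back-of-the-envelope projection implied by "density doubles every
 * 18 months". The starting density and the time span are arbitrary
 * assumptions chosen for the example. */
int main(void)
{
    double start_density = 2300.0;   /* approx. transistor count of the Intel 4004 */
    double years         = 30.0;     /* projection horizon                         */
    double doublings     = years * 12.0 / 18.0;            /* one doubling per 18 months */
    double projected     = start_density * pow(2.0, doublings);

    printf("%.1f doublings in %.0f years -> about %.3g transistors per chip\n",
           doublings, years, projected);
    return 0;
}

Thirty years at that rate is 20 doublings, a factor of roughly one million, which is why even a short extrapolation of the law produces such dramatic numbers.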

Yet, using current technology, Moore's Law cannot hold forever. There are physical and financial limitations that must ultimately come into play. At the current rate of miniaturization, it would take about 500 years to put the entire solar system on a chip!

Chapter 2. The Architecture of a Computer System

2.1 The Hierarchical Levels of a Computer

If a machine is to be capable of solving a wide range of problems, it must be able to execute programs written in different languages, from FORTRAN and C to Lisp and Prolog. The only physical components we have to work with are wires and gates. A formidable open space, a semantic gap, exists between these physical components and a high-level language such as C++. For a system to be practical, the semantic gap must be invisible to most of the users of the system.
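As a small illustration of that gap (an invented example, not taken from the original text), a single statement in a high-level language corresponds to several register-level steps; the assembly mnemonics shown in the comments are generic and hypothetical:

#include <stdio.h>

/* One high-level statement vs. the several machine-level steps it
 * implies. The "assembly" in the comments is a generic, hypothetical
 * notation used only to illustrate the semantic gap. */
int main(void)
{
    int a = 2, b = 3, c;

    c = a + b;   /* At a lower level, roughly:              */
                 /*   LOAD  R1, a    ; fetch a from memory  */
                 /*   LOAD  R2, b    ; fetch b from memory  */
                 /*   ADD   R1, R2   ; the ALU performs the add */
                 /*   STORE R1, c    ; write the result back */

    printf("c = %d\n", c);
    return 0;
}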

Programming experience teaches us that when a problem is large, we should break it down and use a "divide and conquer" approach. In programming, we divide a problem into modules and then design each module separately. Each module performs a specific task and modules need only know how to interface with other modules to make use of them. Computer system organization can be approached in a similar manner. Through the principle of abstraction, we can imagine the machine to be built from a hierarchy of levels, in which each level has a specific function and exists as a distinct hypothetical machine. We call the hypothetical computer at each level a virtual machine. Each level's virtual machine executes its own particular set of instructions, calling upon machines at lower levels to carry out the tasks when necessary. Figure 2.1 shows the commonly accepted layers representing the abstract virtual machines.

Fig. 2.1 The Abstract Levels of Modern Computing Systems

Level 6, the User Level, is composed of applications and is the level with which everyone is most familiar. At this level, we run programs such as word processors, graphics packages, or games. The lower levels are nearly invisible from the User Level.


