
Friday, 4 March 2011

Semiconductors and microprocessors

Computers using vacuum tubes as their electronic elements were in use throughout the 1950s, but by the 1960s they had been largely replaced by transistor-based machines, which were smaller, faster, cheaper to produce, required less power, and were more reliable. The first transistorised computer was demonstrated at the University of Manchester in 1953. In the 1970s, integrated circuit technology and the subsequent creation of microprocessors, such as the Intel 4004, further decreased the size and cost of computers while increasing their speed and reliability. By the late 1970s, many products such as video recorders contained dedicated computers called microcontrollers, and these started to appear as replacements for mechanical controls in domestic appliances such as washing machines. The 1980s witnessed home computers and the now ubiquitous personal computer. With the evolution of the Internet, personal computers have become as common in the household as the television and the telephone.
Modern smartphones are fully programmable computers in their own right, and as of 2009 they may well be the most common form of such computers in existence.

Programs

The defining feature of modern computers which distinguishes them from all other machines is that they can be programmed. That is to say that some type of instructions (the program) can be given to the computer, and it will carry them out. While some computers may have strange concepts of "instructions" and "output" (see quantum computing), modern computers based on the von Neumann architecture often take their machine code in the form of an imperative programming language.
In practical terms, a computer program may be just a few instructions or extend to many millions of instructions, as do the programs for word processors and web browsers, for example. A typical modern computer can execute billions of instructions per second and rarely makes a mistake over many years of operation. Large computer programs consisting of several million instructions may take teams of programmers years to write, and due to the complexity of the task almost certainly contain errors.

Stored program architecture


A 1970s punched card containing one line from a FORTRAN program. The card reads: "Z(1) = Y + W(1)" and is labelled "PROJ039" for identification purposes.
This section applies to most common RAM machine-based computers.
In most cases, computer instructions are simple: add one number to another, move some data from one location to another, send a message to some external device, etc. These instructions are read from the computer's memory and are generally carried out (executed) in the order they were given. However, there are usually specialized instructions to tell the computer to jump ahead or backwards to some other place in the program and to carry on executing from there. These are called "jump" instructions (or branches). Furthermore, jump instructions may be made to happen conditionally so that different sequences of instructions may be used depending on the result of some previous calculation or some external event. Many computers directly support subroutines by providing a type of jump that "remembers" the location it jumped from and another instruction to return to the instruction following that jump instruction.
Program execution might be likened to reading a book. While a person will normally read each word and line in sequence, they may at times jump back to an earlier place in the text or skip sections that are not of interest. Similarly, a computer may sometimes go back and repeat the instructions in some section of the program over and over again until some internal condition is met. This is called the flow of control within the program and it is what allows the computer to perform tasks repeatedly without human intervention.
Comparatively, a person using a pocket calculator can perform a basic arithmetic operation such as adding two numbers with just a few button presses. But to add together all of the numbers from 1 to 1,000 would take thousands of button presses and a lot of time—with a near certainty of making a mistake. On the other hand, a computer may be programmed to do this with just a few simple instructions. For example:
      mov  #0, sum    ; set sum to 0
      mov  #1, num    ; set num to 1
loop: add  num, sum   ; add num to sum
      add  #1, num    ; add 1 to num
      cmp  num, #1000 ; compare num to 1000
      ble  loop       ; if num <= 1000, go back to 'loop'
      halt            ; end of program. stop running
Once told to run this program, the computer will perform the repetitive addition task without further human intervention. It will almost never make a mistake and a modern PC can complete the task in about a millionth of a second.
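For comparison, here is a minimal sketch of the same computation as it might be written in a high-level language such as Python (the variable names are arbitrary); each line mirrors one of the assembly instructions above.
total = 0                # mov #0, sum  - set the running total to 0
num = 1                  # mov #1, num  - set the counter to 1
while num <= 1000:       # cmp / ble    - repeat while num <= 1000
    total = total + num  # add num, sum - add the counter to the total
    num = num + 1        # add #1, num  - add 1 to the counter
print(total)             # prints 500500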

Bugs

Errors in computer programs are called "bugs". Bugs may be benign and not affect the usefulness of the program, or have only subtle effects. But in some cases they may cause the program to "hang"—become unresponsive to input such as mouse clicks or keystrokes, or to completely fail or "crash". Otherwise benign bugs may sometimes be harnessed for malicious intent by an unscrupulous user writing an "exploit"—code designed to take advantage of a bug and disrupt a computer's proper execution. Bugs are usually not the fault of the computer. Since computers merely execute the instructions they are given, bugs are nearly always the result of programmer error or an oversight made in the program's design.
When the term "bug" first came into computing use, it referred literally to insects shorting circuitry in early relay and valve/tube computers; a famous example is the dead moth found in a relay of the Harvard Mark II in 1947, although the word had been used for engineering faults long before electronic computers.

Machine code

In most computers, individual instructions are stored as machine code with each instruction being given a unique number (its operation code or opcode for short). The command to add two numbers together would have one opcode, the command to multiply them would have a different opcode and so on. The simplest computers are able to perform any of a handful of different instructions; the more complex computers have several hundred to choose from—each with a unique numerical code. Since the computer's memory is able to store numbers, it can also store the instruction codes. This leads to the important fact that entire programs (which are just lists of these instructions) can be represented as lists of numbers and can themselves be manipulated inside the computer in the same way as numeric data. The fundamental concept of storing programs in the computer's memory alongside the data they operate on is the crux of the von Neumann, or stored program, architecture. In some cases, a computer might store some or all of its program in memory that is kept separate from the data it operates on. This is called the Harvard architecture after the Harvard Mark I computer. Modern von Neumann computers display some traits of the Harvard architecture in their designs, such as in CPU caches.
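To make this concrete, the following is a minimal sketch, in Python, of a stored-program machine with an invented four-instruction set (the opcodes are hypothetical, not those of any real processor). The program and its data live in the same list of numbers; only the control unit's interpretation distinguishes them.
memory = [
    1, 7,    # addresses 0-1: LOAD 7   - copy the number at address 7 into the accumulator
    2, 8,    # addresses 2-3: ADD 8    - add the number at address 8 to the accumulator
    3, 9,    # addresses 4-5: STORE 9  - copy the accumulator into address 9
    4,       # address 6:     HALT
    40, 2,   # addresses 7-8: the data the program operates on
    0,       # address 9:     where the result will be stored
]
pc = 0       # program counter
acc = 0      # accumulator
while memory[pc] != 4:                        # run until HALT
    opcode, operand = memory[pc], memory[pc + 1]
    if opcode == 1:                           # LOAD
        acc = memory[operand]
    elif opcode == 2:                         # ADD
        acc = acc + memory[operand]
    elif opcode == 3:                         # STORE
        memory[operand] = acc
    pc = pc + 2                               # advance to the next instruction
print(memory[9])                              # prints 42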
While it is possible to write computer programs as long lists of numbers (machine language), and while this technique was used with many early computers, it is extremely tedious and potentially error-prone to do so in practice, especially for complicated programs. Instead, each basic instruction can be given a short name that is indicative of its function and easy to remember—a mnemonic such as ADD, SUB, MULT or JUMP. These mnemonics are collectively known as a computer's assembly language. Converting programs written in assembly language into something the computer can actually understand (machine language) is usually done by a computer program called an assembler. Machine languages and the assembly languages that represent them (collectively termed low-level programming languages) tend to be unique to a particular type of computer. For instance, an ARM architecture computer (such as may be found in a PDA or a hand-held videogame) cannot understand the machine language of an Intel Pentium or AMD Athlon 64 computer that might be in a PC.
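An assembler for the hypothetical toy machine sketched above could be as simple as a table mapping each mnemonic to its numeric opcode; this is an illustration only, not a real assembler.
OPCODES = {"LOAD": 1, "ADD": 2, "STORE": 3, "HALT": 4}

def assemble(source):
    """Translate assembly text into machine code (a list of numbers)."""
    machine_code = []
    for line in source.splitlines():
        words = line.split(";")[0].split()          # strip comments and blank lines
        if words:
            machine_code.append(OPCODES[words[0]])  # mnemonic -> opcode
            machine_code.extend(int(w) for w in words[1:])  # numeric operands
    return machine_code

print(assemble("LOAD 7\nADD 8\nSTORE 9\nHALT"))     # [1, 7, 2, 8, 3, 9, 4]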

Stored-program architecture

Several developers of ENIAC, recognizing its flaws, came up with a far more flexible and elegant design, which came to be known as the "stored program architecture" or von Neumann architecture. This design was first formally described by John von Neumann in the paper First Draft of a Report on the EDVAC, distributed in 1945. A number of projects to develop computers based on the stored-program architecture commenced around this time, the first of these being completed in Great Britain. The first working prototype to be demonstrated was the Manchester Small-Scale Experimental Machine (SSEM or "Baby") in 1948. The Electronic Delay Storage Automatic Calculator (EDSAC), completed a year after the SSEM at Cambridge University, was the first practical, non-experimental implementation of the stored program design and was put to use immediately for research work at the university. Shortly thereafter, the machine originally described by von Neumann's paper—EDVAC—was completed but did not see full-time use for an additional two years.

Nearly all modern computers implement some form of the stored-program architecture, making it the single trait by which the word "computer" is now defined. While the technologies used in computers have changed dramatically since the first electronic, general-purpose computers of the 1940s, most still use the von Neumann architecture.
Beginning in the 1950s, Soviet scientists Sergei Sobolev and Nikolay Brusentsov conducted research on ternary computers, devices that operated on a base three numbering system of −1, 0, and 1 rather than the conventional binary numbering system upon which most computers are based. They designed the Setun, a functional ternary computer, at Moscow State University. The device was put into limited production in the Soviet Union, but it was supplanted by the more common binary architecture.
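Balanced ternary, the Setun's number system, writes every integer using only the digits −1, 0 and 1. A short sketch of the conversion, in Python, shows how the ordinary digit 2 is avoided by carrying:
def to_balanced_ternary(n):
    """Return the balanced-ternary digits of n, most significant first."""
    digits = []
    while n != 0:
        r = n % 3              # ordinary ternary digit: 0, 1 or 2
        if r == 2:             # 2 is not allowed: write -1 and carry 1
            r = -1
        digits.append(r)
        n = (n - r) // 3
    return digits[::-1] or [0]

print(to_balanced_ternary(8))  # [1, 0, -1], i.e. 9 + 0 - 1 = 8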

First general-purpose computers

In 1801, Joseph Marie Jacquard made an improvement to the textile loom by introducing a series of punched paper cards as a template which allowed his loom to weave intricate patterns automatically. The resulting Jacquard loom was an important step in the development of computers because the use of punched cards to define woven patterns can be viewed as an early, albeit limited, form of programmability.
The Most Famous Image in the Early History of Computing

This portrait of Jacquard was woven in silk on a Jacquard loom and required 24,000 punched cards to create (1839). It was only produced to order. Charles Babbage owned one of these portraits; it inspired him to use perforated cards in his analytical engine.
It was the fusion of automatic calculation with programmability that produced the first recognizable computers. In 1837, Charles Babbage was the first to conceptualize and design a fully programmable mechanical computer, his analytical engine. Limited finances and Babbage's inability to resist tinkering with the design meant that the device was never completed; nevertheless his son, Henry Babbage, completed a simplified version of the analytical engine's computing unit (the mill) in 1888. He gave a successful demonstration of its use in computing tables in 1906. This machine was given to the Science Museum in South Kensington in 1910.
In the late 1880s, Herman Hollerith invented the recording of data on a machine-readable medium. Prior uses of machine-readable media, described above, had been for control, not data. "After some initial trials with paper tape, he settled on punched cards ... To process these punched cards he invented the tabulator, and the keypunch machines. These three inventions were the foundation of the modern information processing industry." Large-scale automated data processing of punched cards was performed for the 1890 United States Census by Hollerith's company, which later became the core of IBM. By the end of the 19th century a number of technologies that would later prove useful in the realization of practical computers had begun to appear: the punched card, Boolean algebra, the vacuum tube (thermionic valve) and the teleprinter.
During the first half of the 20th century, many scientific computing needs were met by increasingly sophisticated analog computers, which used a direct mechanical or electrical model of the problem as a basis for computation. However, these were not programmable and generally lacked the versatility and accuracy of modern digital computers.
Alan Turing is widely regarded as the father of modern computer science. In 1936 Turing provided an influential formalisation of the concepts of algorithm and computation with the Turing machine, providing a blueprint for the electronic digital computer. Of his role in the creation of the modern computer, Time magazine, in naming Turing one of the 100 most influential people of the 20th century, states: "The fact remains that everyone who taps at a keyboard, opening a spreadsheet or a word-processing program, is working on an incarnation of a Turing machine".
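The formalisation itself is remarkably small: a Turing machine is a table of rules that tells a head, reading one symbol at a time from a tape, what to write, which way to move, and which state to enter next. The following minimal sketch in Python (a hypothetical two-state machine that inverts the binary digits on its tape) gives the flavour:
# (state, symbol read) -> (symbol to write, head movement, next state)
rules = {
    ("invert", "0"): ("1", +1, "invert"),
    ("invert", "1"): ("0", +1, "invert"),
    ("invert", " "): (" ",  0, "halt"),   # a blank cell ends the computation
}
tape = list("1011 ")
head, state = 0, "invert"
while state != "halt":
    symbol, move, state = rules[(state, tape[head])]
    tape[head] = symbol
    head += move
print("".join(tape).strip())              # prints 0100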

The Zuse Z3, 1941, considered the world's first working programmable, fully automatic computing machine.

The ENIAC, which became operational in 1946, is considered to be the first general-purpose electronic computer.

EDSAC was one of the first computers to implement the stored program (von Neumann) architecture.

Die of an Intel 80486DX2 microprocessor (actual size: 12×6.75 mm) in its packaging.
The Atanasoff–Berry Computer (ABC) was among the first electronic digital binary computing devices. Conceived in 1937 by Iowa State College physics professor John Atanasoff, and built with the assistance of graduate student Clifford Berry, the machine was not programmable, being designed only to solve systems of linear equations. The computer did employ parallel computation. A 1973 court ruling in a patent dispute found that the patent for the 1946 ENIAC computer derived from the Atanasoff–Berry Computer.
Konrad Zuse was the inventor of the program-controlled computer; he built the first working machine of this kind in 1941 and, in 1955, the first computer based on magnetic storage.
George Stibitz is internationally recognized as a father of the modern digital computer. While working at Bell Labs in November 1937, Stibitz invented and built a relay-based calculator he dubbed the "Model K" (for "kitchen table", on which he had assembled it), which was the first to use binary circuits to perform an arithmetic operation. Later models added greater sophistication including complex arithmetic and programmability.
A succession of steadily more powerful and flexible computing devices was constructed in the 1930s and 1940s, gradually adding the key features that are seen in modern computers. The use of digital electronics (largely invented by Claude Shannon in 1937) and more flexible programmability were vitally important steps, but defining one point along this road as "the first digital electronic computer" is difficult. Notable achievements include:
  • Konrad Zuse's electromechanical "Z machines". The Z3 (1941) was the first working machine featuring binary arithmetic, including floating-point arithmetic, and a measure of programmability. In 1998 the Z3 was proved to be Turing complete, and it is therefore regarded by some as the world's first operational computer.
  • The non-programmable Atanasoff–Berry Computer (commenced in 1937, completed in 1941) which used vacuum tube based computation, binary numbers, and regenerative capacitor memory. The use of regenerative memory allowed it to be much more compact than its peers (being approximately the size of a large desk or workbench), since intermediate results could be stored and then fed back into the same set of computation elements.
  • The secret British Colossus computers (1943), which had limited programmability but demonstrated that a device using thousands of tubes could be reasonably reliable and electronically reprogrammable. It was used for breaking German wartime codes.
  • The Harvard Mark I (1944), a large-scale electromechanical computer with limited programmability.
  • The U.S. Army's Ballistic Research Laboratory ENIAC (1946), which used decimal arithmetic and is sometimes called the first general purpose electronic computer (since Konrad Zuse's Z3 of 1941 used electromagnets instead of electronics). Initially, however, ENIAC had an inflexible architecture which essentially required rewiring to change its programming.

History of computing

The first use of the word "computer" was recorded in 1613, referring to a person who carried out calculations, or computations, and the word continued with the same meaning until the middle of the 20th century. From the end of the 19th century onwards, the word began to take on its more familiar meaning, describing a machine that carries out computations.

Limited-function early computers


The Jacquard loom, on display at the Museum of Science and Industry in Manchester, England, was one of the first programmable devices.
The history of the modern computer begins with two separate technologies—automated calculation and programmability—but no single device can be identified as the earliest computer, partly because of the inconsistent application of that term. Examples of early mechanical calculating devices include the abacus, the slide rule and arguably the astrolabe and the Antikythera mechanism, an ancient astronomical computer built by the Greeks around 80 BC. The Greek mathematician Hero of Alexandria (c. 10–70 AD) built a mechanical theater which performed a play lasting 10 minutes and was operated by a complex system of ropes and drums that might be considered to be a means of deciding which parts of the mechanism performed which actions and when. This is the essence of programmability.
The "castle clock", an astronomical clock invented by Al-Jazari in 1206, is considered to be the earliest programmable analog computer. It displayed the zodiac, the solar and lunar orbits, a crescent moon-shaped pointer travelling across a gateway causing automatic doors to open every hour, and five robotic musicians who played music when struck by levers operated by a camshaft attached to a water wheel. The length of day and night could be re-programmed to compensate for the changing lengths of day and night throughout the year.
The Renaissance saw the invention, in 1642, of the mechanical calculator, a device that could perform all four arithmetic operations without relying on human intelligence. The mechanical calculator was at the root of the development of computers in two separate ways. Initially, it was in trying to develop more powerful and more flexible calculators that the computer was first theorized (by Charles Babbage and Alan Turing) and then developed (the ABC, Z3, ENIAC...), leading to the development of mainframe computers. But the microprocessor, which started the personal computer revolution and is now at the heart of all computers regardless of size or purpose, was also invented serendipitously by Intel during the development of an electronic calculator, a direct descendant of the mechanical calculator.

Computer

Images: the Columbia supercomputer at NASA's Advanced Supercomputing Facility, Dell PowerEdge servers, a Delta-C personal computer, and an Acer Aspire 8920 laptop.
A computer is a programmable machine designed to read and sequentially execute a list of instructions that make it perform arithmetical and logical operations on binary numbers. Conventionally, a computer consists of some form of short- or long-term memory for data storage and a central processing unit, which functions as a control unit and contains the arithmetic logic unit. Peripherals (for example a keyboard, mouse or graphics card) can be connected to allow the computer to receive outside input and display output.
A computer's processing unit executes a series of instructions that make it read, manipulate and then store data. Test and jump instructions allow it to move within the program space and therefore to execute different instructions as a function of the current state of the machine or its environment.
The computer can also respond to interrupts that make it execute specific sets of instructions and then return and continue what it was doing before the interruption.
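A minimal sketch of this pattern, in Python and entirely hypothetical: between instructions the processor checks for a pending interrupt, saves its place, services the request, and then resumes exactly where it left off.
program = [
    lambda: print("step 1"),
    lambda: print("step 2"),
]
pending = ["keyboard"]                    # interrupt requests raised by devices

pc = 0
while pc < len(program):
    while pending:                        # an interrupt is waiting
        saved_pc = pc                     # save the current position
        print("servicing interrupt:", pending.pop(0))
        pc = saved_pc                     # return and continue as before
    program[pc]()                         # execute the next instruction
    pc += 1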
The first electronic computers were developed in the mid-20th century (1940–1945). Originally, they were the size of a large room, consuming as much power as several hundred modern personal computers (PCs).
Modern computers based on integrated circuits are millions to billions of times more capable than the early machines, and occupy a fraction of the space. Simple computers are small enough to fit into mobile devices, and can be powered by a small battery. Personal computers in their various forms are icons of the Information Age and are what most people think of as "computers". However, the embedded computers found in many devices from MP3 players to fighter aircraft and from toys to industrial robots are the most numerous.

Computer Engineering



An example of an FPGA programming/evaluation board, representative of one a computer engineer might use for processor, hardware, and software design
Computer engineering, also called computer systems engineering, is a discipline that integrates several fields of electrical engineering and computer science required to develop computer systems. Computer engineers usually have training in electronic engineering, software design, and hardware-software integration rather than only software engineering or electronic engineering. Computer engineers are involved in many hardware and software aspects of computing, from the design of individual microprocessors, personal computers, and supercomputers, to circuit design. This field of engineering focuses not only on how computer systems themselves work, but also on how they integrate into the larger picture.
Usual tasks involving computer engineers include writing software and firmware for embedded microcontrollers, designing VLSI chips, designing analog sensors, designing mixed signal circuit boards, and designing operating systems. Computer engineers are also suited for robotics research, which relies heavily on using digital systems to control and monitor electrical systems like motors, communications, and sensors.
The first accredited computer engineering degree program in the United States was established at Case Western Reserve University in 1971. As of October 2004, there were 170 ABET-accredited computer engineering programs in the US. Due to increasing job requirements for engineers who can concurrently design hardware, software, and firmware, and manage all forms of computer systems used in industry, some tertiary institutions around the world offer a bachelor's degree generally called computer engineering. Both computer engineering and electronic engineering programs include analog and digital circuit design in their curricula. As with most engineering disciplines, a sound knowledge of mathematics and the sciences is necessary for computer engineers.
In many institutions, computer engineering students are allowed to choose areas of in-depth study in their junior and senior year, because the full breadth of knowledge used in the design and application of computers is beyond the scope of an undergraduate degree.

External links

  • Computer Engineering Conference Calendar
  • Engineering Salary Calculator - Hardware Engineering Outlook
  • Your Career in the Electrical, Electronics, and Computer Engineering Fields
  • Computer Engineering 2004: Curriculum Guidelines for Undergraduate Degree Programs in Computer Engineering

Saturday, 26 February 2011

Computer Engineering Technology

Come out, meet your instructors and learn more about Computer Engineering Technology!
Computer engineers continue to be in high demand around the world. As the use of microprocessors and micro-controllers in the electronics industry continues to grow, so does the need for well-trained personnel. Camosun provides great facilities and instructors who have real-world experience in computer and electronic engineering. Computer engineering is projected to be one of the fastest-growing professional occupations over the next decade.
As a Computer Engineer, you'll be working in the innovation, design, integration and manufacture of computers, computer systems and computer-related hardware.
As a Computer Engineering Technologist, you'll have skills in:
  • Computer-aided design (CAD);
  • Instrumentation and data acquisition;
  • Micro-controller system design;
  • Process control hardware and software;
  • Software development and support;
  • Data communications systems and local area networks.

Computer Engineering programs

Electronics and Computer Engineering Access
Electronics and Computer Engineering Access is a great program if you're interested in Computer Engineering, Electronic Engineering or becoming an Electronics and Network Technician and need to upgrade your academic qualifications. If, after completing the Access program, you decide not to continue your education, you can seek employment in entry-level positions in areas such as electronics assembly, schematic capture, and printed circuit board production and repair. More information...
Computer Engineering Technology
The Computer Engineering Technology program provides you with skills in both the hardware and software aspects of microprocessors and micro-controllers and their applications, skills that are increasingly demanded in the electronics industry. More information...
Computer Engineering Bridge
This program, offered by Camosun College with the full support of the University of Victoria (UVic), will provide you with access to the third year of Engineering at UVic. This program is for graduates of the Computer Engineering Technology or Electronics Engineering Technology diploma program. You'll be able to further your education at the university level, working toward a lucrative, creatively challenging, wide-open career as a professional engineer in the field of computers, computer systems and computer applications. More information...
Network and Electronics Technician
If you're fascinated by computers, circuitry, light-speed optics and processing power, the Network and Electronics Technician program will help you enter the field of computers, networks and electronics.