THE HISTORY OF COMPUTERS
5 Generations of Computers:
Let's explore the fundamental shifts in computing across five generations, spanning from the era of vacuum tubes to the integration of artificial intelligence.
There have been transformative changes in:
- Size
- Cost
- Efficiency
- Power Consumption
- Reliability
- SIZE: The evolution of computers across five generations has witnessed remarkable changes in terms of size.
- In the first generation, computers filled entire rooms, with vacuum tubes dominating the landscape.
- The second generation brought about a reduction in size, thanks to the advent of transistors.
- Integrated circuits in the third generation marked a significant size decrease, enabling computers to become more compact.
- The fourth generation introduced microprocessors, leading to the development of personal computers that were significantly smaller.
- Finally, the fifth generation has seen a further reduction in size with the integration of very large scale integration (VLSI) chips, contributing to the creation of sleek and powerful devices, including laptops and mobile devices.
From room-sized machines to portable and pocket-sized devices, the size evolution across these generations reflects the incredible progress in computer technology.
- COST:
- First Generation: High Cost - The first computers, like the ENIAC, were custom-built and extraordinarily expensive, often funded by government agencies or large institutions due to the high cost of manufacturing and maintaining vacuum tube-based systems.
- Second Generation: Cost Reduction with Transistors - The introduction of transistors led to a significant reduction in costs compared to vacuum tubes. Transistors were smaller, more durable, and less expensive to produce, making computers more accessible to a broader range of organizations.
- Third Generation: Further Cost Reduction with Integrated Circuits - The use of integrated circuits brought another wave of cost reduction. These circuits were smaller, more reliable, and cheaper to manufacture than discrete transistors, making computers more commercially viable.
- Fourth Generation: Microprocessor Revolution and Mass Production - The advent of microprocessors allowed for standardized components and mass production, leading to a significant drop in costs. Personal computers became more affordable, contributing to the democratization of computing.
- Fifth Generation: Continued Affordability with VLSI - Very Large Scale Integration (VLSI) technology and advanced manufacturing processes have continued the trend of cost reduction. Standardized components and efficient production methods have made computing devices, including laptops and mobile phones, more cost-effective and widely accessible.
- EFFICIENCY:
The efficiency of computers has seen significant improvements across the five generations, driven by advancements in technology, architecture, and manufacturing processes.
- First Generation: Computers were large, power-hungry, and inefficient in terms of both space and energy consumption. The use of vacuum tubes contributed to high power requirements and frequent failures.
- Second Generation: The introduction of transistors led to improvements in efficiency, as these components were smaller, more reliable, and consumed less power than vacuum tubes. However, computers still required considerable space and energy.
- Third Generation: Integrated circuits marked a significant leap in efficiency, reducing the physical size of computers and lowering power consumption. The advent of time-sharing systems and more efficient programming languages also contributed to improved overall efficiency.
- Fourth Generation: Microprocessors brought about a major increase in efficiency. Computers became smaller, more powerful, and energy-efficient. The development of energy-efficient architectures and standardized components contributed to improved performance.
- Fifth Generation: The integration of very large scale integration (VLSI) chips and advancements in parallel processing and specialized architectures further enhanced efficiency. Energy-efficient designs became a priority, especially with the proliferation of mobile devices and the push for environmentally friendly computing solutions.
- POWER CONSUMPTION:
- First Generation: High Power Consumption - Vacuum tube-based computers, such as ENIAC, consumed vast amounts of power. The tubes generated significant heat and required extensive cooling systems, contributing to high energy consumption.
- Second Generation: Transistor Efficiency - The advent of transistors in the second generation marked a significant improvement in power consumption. Transistors were more energy-efficient and generated less heat compared to vacuum tubes, contributing to a reduction in overall power requirements.
- Third Generation: Integrated Circuit Advancements - The use of integrated circuits further improved power efficiency. These circuits were smaller, consumed less energy, and contributed to a reduction in the physical size of computers, as well as their associated power needs.
- Fourth Generation: Microprocessor Efficiency - The introduction of microprocessors in the fourth generation led to a substantial increase in power efficiency. These integrated chips were more energy-efficient and allowed for the development of smaller, more portable computers with lower power requirements.
- Fifth Generation: Energy-Efficient Designs - With the integration of Very Large Scale Integration (VLSI) and advancements in semiconductor technology, the fifth generation has seen a continued focus on energy efficiency. Design considerations for laptops, mobile devices, and data centers prioritize minimizing power consumption, reflecting an ongoing commitment to sustainability in computing.
- RELIABILITY:
- First Generation: Low Reliability - Vacuum tube-based computers in the first generation were prone to frequent failures and downtime. The tubes were delicate and often had a short lifespan, leading to less reliable computing systems.
- Second Generation: Improved Reliability with Transistors - The transition to transistors significantly enhanced the reliability of computers. Transistors were more durable and had a longer lifespan than vacuum tubes, resulting in more stable and dependable systems.
- Third Generation: Enhanced Stability with Integrated Circuits - The use of integrated circuits further improved reliability. These circuits reduced the number of physical connections, minimizing points of failure, and contributed to increased stability in computer systems.
- Fourth Generation: Microprocessor Advances - The introduction of microprocessors continued the trend of improved reliability. Standardized components and the integration of more functions into a single chip reduced the complexity of systems, leading to more reliable and user-friendly computers.
- Fifth Generation: High Reliability and Fault Tolerance - Advances in Very Large Scale Integration (VLSI) and sophisticated error detection and correction techniques have significantly increased the reliability of computers in the fifth generation. Fault-tolerant designs and redundancy measures contribute to highly reliable computing systems, especially in critical applications and data centers.
First Generation Computers
- Main electronic component → Vacuum tubes
- Main memory → Magnetic drums and magnetic tapes
- Power → Consumed a lot of electricity and generated a lot of heat
- Speed and size → Very slow and very large (often taking up an entire room)
- Input/output devices → Punched cards and paper tape
- Programming language → Machine language
- Examples → ENIAC, UNIVAC I, etc.
THE FIRST ELECTRONIC COMPUTER - ENIAC
ENIAC - Electronic Numerical Integrator and Computer
ENIAC, built by J. Presper Eckert and John V. Mauchly, was a general-purpose computer.
Funded by the United States Army.
ENIAC was a general-purpose machine used for computing artillery firing tables.
Artillery firing tables: tables that tell gun crews how to aim large-caliber, long-range weapons so that shells land on their targets.
- ENIAC was a U-shaped computer, 80 feet long, 8.5 feet high, and several feet wide.
- Each of the 20 10-digit registers was 2 feet long
- In total, ENIAC used 18,000 vacuum tubes
Size: ENIAC was roughly 100 times larger in physical size than modern computers.
Speed: Despite being much larger, ENIAC was vastly slower than today's computers - over 100 million times slower.
Performance: ENIAC could perform about 1,900 additions (basic arithmetic operations) per second. In contrast, today's computers can perform billions (or even trillions) of operations per second.
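A rough back-of-the-envelope check shows the two claims above hang together; the modern figure below is an assumed order of magnitude, not a measured value:

```python
# Back-of-the-envelope check (illustrative numbers, not measurements):
eniac_adds_per_sec = 1_900
modern_adds_per_sec = 2e11   # assumed order of magnitude for a modern multi-core chip

speedup = modern_adds_per_sec / eniac_adds_per_sec
print(f"Approximate speedup: {speedup:.1e}x")   # ~1e8, i.e. on the order of 100 million
```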
- Unlike today's computers with keyboards and screens, ENIAC's programming was a hands-on process. People had to physically plug in cables and set switches to tell the machine what to do
- To give ENIAC information, people used punched cards, which served as an early form of input device.
- Programming for typical calculations required from half an hour to a whole day
- ENIAC was a general-purpose machine, meaning it could be used for various tasks. However, it had limitations. One notable limitation was its relatively small storage capacity, and programming it was a difficult process
Commercial Developments
THE FIRST COMMERCIAL COMPUTER - UNIVAC I
- UNIVAC - Universal Automatic Computer
- The invention and development of the Universal Automatic Computer (UNIVAC) were spearheaded by J. Presper Eckert and John Mauchly.
- Remington Rand played a pivotal role in the history and commercialization of the Universal Automatic Computer (UNIVAC).
- One of the special features of UNIVAC was its early adoption of a stored-program architecture.
- Stored-program architecture - UNIVAC's adoption of the stored-program concept enabled it to store instructions and data in memory, streamlining operations and enhancing its computational efficiency.
- UNIVAC I was a rectangular cabinet-like structure, approximately 14.5 feet long, 7.5 feet wide, and 9 feet high.
- It used fewer vacuum tubes than ENIAC - approximately 5,000.
- Size - UNIVAC I was significantly smaller than ENIAC but still larger than modern computers, occupying a substantial amount of space within a room.
- Speed - UNIVAC I was markedly slower than modern computers, likely by millions of times in processing speed, highlighting the vast technological advancements in computational capabilities over time.
- Performance - UNIVAC I exhibited superior performance to ENIAC, operating at higher speeds and with enhanced computational capabilities.
- UNIVAC I aimed to replace the older punched-card accounting machines in commercial data processing.
- It used an operator keyboard and console typewriter for simple, or limited, input and magnetic tape for all other input and output.
- Operator Keyboard and Console Typewriter - Used for simple or limited data entry tasks; operators could input commands or data directly.
- Magnetic Tape - This medium was employed for extensive input and output needs.
- Printed Output on Tape and Tape Printer - Translated recorded data into printed copies for easy reading and interpretation by users.
- UNIVAC faced challenges with its large size, limited processing speed, frequent maintenance needs due to vacuum tube fragility, constrained memory, and complex, low-level programming demands compared to contemporary computers.
IBM
- In the 1950s and early 1960s, systems like the IBM 705 and IBM 1401, while less powerful, acted as central data hubs in businesses, laying the groundwork for modern mainframes.
- The IBM System/360 (S/360), introduced by IBM on April 7, 1964, encompassed a series of mainframe computers delivered from 1965 to 1978.
- Mainframe computers, also known as "big iron," are used by big companies for important tasks like handling large amounts of data and processing essential transactions.
- The System/360 was the first computer family designed to handle both business and scientific tasks of all sizes.
- Size - The System/360 represented a range of sizes, from smaller models like the Model 30, with memory starting at 8 KB, to larger models capable of up to 8 MB of main memory.
- Performance - The performance of the System/360 models varied widely: the slowest Model 30 could handle up to 34,500 instructions per second, while the high-performance Model 91, introduced in 1967, could execute a remarkable 16.6 million instructions per second.
- Transition from Vacuum Tubes to Transistors - The System/360 series switched from vacuum tubes to IBM's Solid Logic Technology (SLT), which used transistors and made computers more powerful, smaller, and more reliable thanks to the transistors' efficiency compared to vacuum tubes.
- Solid Logic Technology (SLT) - A way of building computer components using tiny electronic parts called transistors, helping make computers smaller, faster, and more reliable compared to older methods that used larger, less efficient parts like vacuum tubes.

Figure : Solid Logic Technology cards
- IBM System/360 computers: Models 40, 50, 65, and 75 were all introduced in 1964.
| Sr. No | IBM Model | Clock Rate | Range of Memory Sizes | Approximate Price |
|---|---|---|---|---|
| 1 | IBM Model 40 | 1.6 MHz | 32 KB–256 KB | $225,000 |
| 2 | IBM Model 50 | 2.0 MHz | 128 KB–256 KB | $550,000 |
| 3 | IBM Model 65 | 5.0 MHz | 256 KB–1 MB | $1,200,000 |
| 4 | IBM Model 75 | 5.1 MHz | 256 KB–1 MB | $1,900,000 |
- Disadvantages: IBM's size sometimes leads to slow decision-making, it faces stiff competition, and its high-end solutions can be costly for smaller businesses.
Cray-1, the first commercial vector supercomputer
- This particular machine held the unique distinction of being the fastest computer for scientific purposes as well as the most cost-effective computer in terms of performance. Seen from above, the computer resembles the letter C.
- Seymour Cray had earlier led the design of the Control Data Corporation CDC 6600 in Minnesota, a machine that included many ideas only now beginning to appear in the latest microprocessors.
- The CDC 6600 is often called the first RISC (Reduced Instruction Set Computer) because of the simplicity of its instruction set. The reason for its simplicity was the desire for speed.
The Apple IIc Plus
- At the same time that Seymour Cray was designing the world's most expensive computers, numerous designers across the globe were considering using the microprocessor to build a low-cost, home-use computer. Although there is no single personal-computer pioneer, the Apple II, developed by Steve Jobs and Steve Wozniak in 1977, established industry norms for affordability, mass production, and dependability.
- The Apple IIc Plus, released in 1988, utilized a Complex Instruction Set Computer (CISC) architecture. CISC processors typically have a rich set of instructions, allowing a single instruction to perform complex tasks. The IIc Plus used the 65C02 processor, an enhanced version of the 6502, an architecture commonly found in early personal computers.
- Despite having a four-year head start, Apple's personal computers came in second in terms of popularity. Following its 1981 release, the IBM Personal Computer went on to become the best-selling computer of all time.
- As a result of its popularity, Microsoft's operating system became the most widely used operating system and Intel's processors the most widely used CPUs. Even though it costs several times more than a music CD, the Microsoft operating system became the most widely sold CD! Of course, the IBM-compatible personal computer has changed significantly in the more than 30 years that it has been around.
- The earliest personal computers had just 64 kilobytes of RAM and 16-bit processors; the only nonvolatile storage was a sluggish, low-density floppy disk. Originally created by IBM to load diagnostic programs into mainframes, floppy disks went on to serve as a significant I/O and storage device in personal computers for many years.
Xerox Alto
- The Xerox Alto computer was developed at Xerox PARC (Palo Alto Research Center) in the early 1970s.
- It included a mouse, a bit-mapped display, a windows-based user interface, and a local area network connection (Ethernet).
- The Xerox Alto had a significant impact on the design of many later computers and software platforms, such as the Apple Macintosh, the IBM-compatible PC, Microsoft Windows, and Mac OS.
- It supported an operating system based on a graphical user interface (GUI).
Some key features and contributions of the Xerox Alto:
- A bit-mapped graphics display integrated with the computer.
- A mouse, which had been invented earlier but was included on every Alto and used extensively in its user interface.
- A local area network (LAN), which became the precursor to Ethernet.
- A user interface based on windows, featuring a WYSIWYG (what you see is what you get) editor.
MEASURING PERFORMANCE
Performance Observation
- ENIAC was to be 1000 times faster than the Harvard Mark-I
- IBM Stretch (7030) was to be 100 times faster than the fastest computer then in existence.
- The original measure of performance was the time required to perform an individual operation, such as addition.
What is average instruction execution time?
Average instruction execution time = Σ (time for each instruction type × that instruction's weight in the mix)
- Since instruction sets were similar, this was a more precise comparison than add times.
- From average instruction execution time, then, it was only a small step to MIPS.
- MIPS (Million instructions per second) had the virtue of being easy to understand; hence, it grew in popularity.
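As a small illustration of the average-instruction-time calculation and the step to MIPS, here is a minimal sketch; the instruction mix and timings below are made-up numbers, not figures for any real machine:

```python
# Hypothetical instruction mix: (weight in the mix, time per instruction in microseconds).
instruction_mix = {
    "load/store": (0.30, 2.0),
    "ALU":        (0.50, 1.0),
    "branch":     (0.20, 1.5),
}

# Average instruction execution time = sum of (time for each instruction * weight in the mix).
avg_time_us = sum(weight * time for weight, time in instruction_mix.values())

# MIPS (million instructions per second) follows directly: 1 / (average time in microseconds).
mips = 1.0 / avg_time_us

print(f"Average instruction execution time: {avg_time_us:.2f} microseconds")  # 1.40
print(f"Rating: {mips:.2f} MIPS")                                             # ~0.71
```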
THE QUEST FOR AN AVERAGE PROGRAM
As processors became more sophisticated and relied on memory hierarchies and pipelining, a single execution time for each instruction no longer existed; neither execution time nor MIPS, therefore, could be calculated from the instruction mix and the manual.
Let's look at the benchmarks.
Figure : Some benchmarks and their uses
GOOD TO KNOW ABOUT KERNELS
Kernel: Kernels are small, time-intensive pieces extracted from real programs and then used as benchmarks.
The best-known examples are the Livermore Loops and Linpack.
- The Livermore Loops consist of a series of 21 small loop fragments.
- Linpack consists of a portion of a linear algebra subroutine package.
Why kernels?
- Kernels isolate the performance of individual features of a computer and help explain the reasons for differences in the performance of real programs.
- Because scientific applications often use small pieces of code that execute for a long time, characterizing performance with kernels is most popular in this application class.
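To give a feel for what a kernel looks like, here is a minimal sketch in the spirit of Linpack's inner loop (a DAXPY-style operation, y = a·x + y). It is an illustrative example only, not the actual Livermore Loops or Linpack code:

```python
import time

def daxpy(a, x, y):
    """DAXPY-style kernel: y = a*x + y, the kind of loop that dominates Linpack."""
    for i in range(len(x)):
        y[i] = a * x[i] + y[i]

# Time the kernel on a fixed-size problem, the way a kernel benchmark would.
n = 1_000_000
x = [1.0] * n
y = [2.0] * n

start = time.perf_counter()
daxpy(3.0, x, y)
elapsed = time.perf_counter() - start

# Two floating-point operations (multiply and add) per element.
print(f"DAXPY over {n} elements: {elapsed:.3f} s, ~{2 * n / elapsed / 1e6:.1f} MFLOPS")
```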
PERFORMANCE EVALUATION
What is SPEC?
- SPEC is a group formed in 1988 for performance evaluation by representative companies such as Apollo/Hewlett-Packard, DEC, MIPS, and Sun.
HISTORY OF SPEC
SPEC89 : Contained six floating-point benchmarks but only four integer benchmarks. Calculating a single summary measurement using the geometric mean of execution times normalized to a VAX-11/780 meant that this measure favored computers with strong floating-point performance (a small sketch of this geometric-mean summary appears at the end of this section).
SPEC92 : Dropped matrix300 and provided separate means (SPECint and SPECfp) for integer and floating-point programs. In addition, the SPECbase measure, which disallows program-specific optimization flags, was added to provide users with a performance measurement that would more closely match what they might experience on their own programs.
SPEC95 : Added some new integer and floating-point benchmarks, and removed benchmarks that suffered from flaws or whose running times had become too small given the factor of 20 or more performance improvement since the first SPEC release. It also changed the base computer for normalization to a Sun SPARCstation 10/40, since operating versions of the original base computer were becoming difficult to find!
SPEC2006 : The most recent release discussed here. All the floating-point programs in SPEC2006 are new, and of the integer programs just two are from SPEC2000, one from SPEC95, none from SPEC92, and one from SPEC89. The sole survivor from SPEC89 is the gcc compiler.
In 2008, SPEC provided benchmark sets for graphics, high-performance scientific computing, object-oriented computing, file systems, Web servers and clients, Java, engineering CAD applications, and power.
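Here is the small sketch promised above of the geometric-mean summary SPEC has used since SPEC89, with each benchmark's execution time normalized to a reference machine. The benchmark names and times below are made up for illustration, not real SPEC data:

```python
from math import prod

# Made-up execution times in seconds for three benchmarks:
# the reference machine's time and the machine under test's time.
reference_times = {"bench_a": 500.0, "bench_b": 1200.0, "bench_c": 800.0}
measured_times  = {"bench_a":  25.0, "bench_b":   40.0, "bench_c":  32.0}

# Each benchmark's ratio is the reference time divided by the measured time
# (i.e., execution time normalized to the reference machine).
ratios = [reference_times[b] / measured_times[b] for b in reference_times]

# The single summary figure is the geometric mean of the ratios.
geometric_mean = prod(ratios) ** (1.0 / len(ratios))
print(f"Ratios: {[round(r, 1) for r in ratios]}")   # [20.0, 30.0, 25.0]
print(f"Geometric mean: {geometric_mean:.1f}")      # ~24.7
```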
A HALF-CENTURY OF PROGRESS
It's good to know the dramatic changes that have occurred in just over 50 years.
After adjusting for inflation, price/performance has improved by a factor of almost 100 billion in 55 years, or about 58% per year. Another way to say it is that we've seen a factor of 10,000 improvement in cost and a factor of 10,000,000 improvement in performance.
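A quick consistency check of those figures (the rounding here is mine):

```python
# 58% per year compounded over 55 years should land near a factor of 100 billion.
print(f"1.58 ** 55 = {1.58 ** 55:.1e}")                     # ~8e10, roughly 100 billion

# A 10,000x cost improvement combined with a 10,000,000x performance improvement
# gives the same overall price/performance factor.
print(f"10_000 * 10_000_000 = {10_000 * 10_000_000:.0e}")   # 1e11
```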
THE GROWTH OF EMBEDDED COMPUTING
The Fact
- The best-selling processor in the world remains an 8-bit microcontroller, used in cars, some home appliances, and other simple applications.
- For many years, the dominant use of embedded processors was for industrial control applications, and although this use continued to grow, the processors tended to be very cheap and the performance relatively low.
- To evaluate performance, the embedded community was inspired by SPEC to create the Embedded Microprocessor Benchmark Consortium (EEMBC).
- Started in 1997, it consists of a collection of kernels organized into suites that address different portions of the embedded industry. They announced the second generation of these benchmarks in 2007.
Growth of Embedded Computing
Embedded processors initially served industrial control applications with low-cost, low-performance chips, like the prevalent 8-bit microcontrollers used in cars and home appliances. In the late 1980s and early 1990s, new opportunities emerged, expanding to advanced applications like video games, set-top boxes, cell phones, and personal digital assistants.
The surge in information appliances and networking spurred a massive increase in embedded processors and performance demands. To assess performance, the Embedded Microprocessor Benchmark Consortium (EEMBC) formed in 1997, inspired by SPEC benchmarks. EEMBC introduced a second-generation benchmark suite in 2007, addressing various sectors within the embedded industry.
Figure : Nintendo Famicom Disk System - Ricoh 2A03 8-bit processor (Famicom) - released February 21, 1986
Since 1951, there have been thousands of new computers using a wide range of technologies and having widely varying capabilities.
The figure below summarizes the key characteristics of some machines mentioned in this section and shows the dramatic changes that have occurred in just over 50 years.