The earliest computers were mainframes that lacked any form of operating system. Each user had sole use of the machine for a scheduled period of time and would arrive at the computer with program and data, often on punched paper cards and magnetic or paper tape. The program would be loaded into the machine, and the machine would be set to work until the program completed or crashed. Programs could generally be debugged via a control panel using toggle switches and panel lights. It is said that Alan Turing was a master of this on the early Manchester Mark 1 machine, and he was already deriving the primitive conception of an operating system from the principles of the Universal Turing machine.
Symbolic languages, assemblers, and compilers were developed for programmers to translate symbolic program code into binary code that previously would have been hand-encoded. Later machines came with libraries of support code on punched cards or magnetic tape, which would be linked to the user's program to assist in operations such as input and output. This was the genesis of the modern-day operating system. However, machines still ran a single job at a time. At Cambridge University in England the job queue was at one time a washing line from which tapes were hung with different colored clothes pegs to indicate job priority.
As machines became more powerful, the time to run programs diminished and the time to hand off the equipment to the next user became very large by comparison. Accounting for and paying for machine usage moved on from checking the wall clock to automatic logging by the computer. Run queues evolved from a literal queue of people at the door, to a heap of media on a jobs-waiting table, or batches of punch-cards stacked one on top of the other in the reader, until the machine itself was able to select and sequence which magnetic tape drives processed which tapes. Where program developers had originally had access to run their own jobs on the machine, they were supplanted by dedicated machine operators who looked after the well-being and maintenance of the machine and were less and less concerned with implementing tasks manually. When commercially available computer centers were faced with the implications of data lost through tampering or operational errors, equipment vendors were put under pressure to enhance the runtime libraries to prevent misuse of system resources. Automated monitoring was needed not just for CPU usage but for counting pages printed, cards punched, cards read, disk storage used and for signaling when operator intervention was required by jobs such as changing magnetic tapes and paper forms. Security features were added to operating systems to record audit trails of which programs were accessing which files and to prevent access to a production payroll file by an engineering program, for example.
All these features were building up towards the repertoire of a fully capable operating system. Eventually the runtime libraries became an amalgamated program that was started before the first customer job and could read in the customer job, control its execution, record its usage, reassign hardware resources after the job ended, and immediately go on to process the next job. These resident background programs, capable of managing multistep processes, were often called monitors or monitor-programs before the term OS established itself.
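To make that job-to-job sequence concrete, here is a minimal toy sketch in Python of the kind of loop such a resident monitor performed. It is purely illustrative: the job structure, step names, and accounting fields are invented for this example and do not correspond to any historical monitor's code.

```python
# Toy sketch of a resident batch monitor's main loop (illustrative only;
# the job structure and accounting fields are invented for this example).

jobs = [
    {"name": "PAYROLL", "steps": ["compile", "link", "run"]},
    {"name": "INVENTORY", "steps": ["run"]},
]

def run_monitor(job_queue):
    usage_log = []                        # accounting: one record per job
    for job in job_queue:                 # read in the next customer job
        for step in job["steps"]:         # control its multistep execution
            print(f"{job['name']}: executing step '{step}'")
        usage_log.append((job["name"], len(job["steps"])))   # record usage
        # Hardware resources (tape drives, printers) would be released here
        # before the monitor immediately moves on to the next job.
    return usage_log

print(run_monitor(jobs))
```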
An underlying program offering basic hardware management, software scheduling, and resource monitoring may seem a remote ancestor to the user-oriented operating systems of the personal computing era. But there has been a shift in the meaning of OS. Just as early automobiles lacked speedometers, radios, and air conditioners, which later became standard, more and more optional software features became standard features in every OS package, although some applications such as database management systems and spreadsheets remain optional and separately priced. This has led to the perception of an OS as a complete user system with an integrated graphical user interface, utilities, some applications such as text editors and file managers, and configuration tools.
The true descendant of the early operating systems is what is now called the "kernel". In technical and development circles the old, restricted sense of an OS persists because of the continued active development of embedded operating systems for all kinds of devices with a data-processing component, from hand-held gadgets up to industrial robots and real-time control systems, which do not run user applications at the front end. An embedded OS in a device today is not so far removed as one might think from its ancestor of the 1950s.
The broader categories of systems and application software are discussed in the computer software article.
The mainframe
It is generally thought that the first operating system used for real work was GM-NAA I/O, produced in 1956 by General Motors' Research division for its IBM 704. Most other early operating systems for IBM mainframes were also produced by customers.
Early operating systems were very diverse, with each vendor or customer producing one or more operating systems specific to their particular mainframe computer. Every operating system, even from the same vendor, could have radically different models of commands, operating procedures, and such facilities as debugging aids. Typically, each time the manufacturer brought out a new machine, there would be a new operating system, and most applications would have to be manually adjusted, recompiled, and retested.
Systems on IBM hardware
This state of affairs continued until the 1960s, when IBM, already a leading hardware vendor, stopped work on existing systems and put all its effort into developing the System/360 series of machines, all of which used the same instruction and input/output architecture. IBM intended to develop a single operating system for the new hardware, the OS/360. The problems encountered in the development of the OS/360 are legendary, and are described by Fred Brooks in The Mythical Man-Month, a book that has become a classic of software engineering. Because of performance differences across the hardware range and delays with software development, a whole family of operating systems was introduced instead of a single OS/360.
IBM wound up releasing a series of stop-gaps followed by two longer-lived operating systems:
- OS/360 for mid-range and large systems. This was available in three system generation options:
- PCP for early users and for those without the resources for multiprogramming.
- MFT for mid-range systems, replaced by MFT-II in OS/360 Release 15/16. This had one successor, OS/VS1, which was discontinued in the 1980s.
- MVT for large systems. This was similar in most ways to PCP and MFT (most programs could be ported among the three without being re-compiled), but had more sophisticated memory management and a time-sharing facility, TSO. MVT had several successors, including the current z/OS.
- DOS/360 for small System/360 models had several successors including the current z/VSE. It was significantly different from OS/360.
IBM maintained full compatibility with the past, so that programs developed in the sixties can still run under z/VSE (if developed for DOS/360) or z/OS (if developed for MFT or MVT) with no change.
IBM also developed, but never officially released, TSS/360, a time-sharing system for the S/360 Model 67.
Several operating systems for the IBM S/360 and S/370 architectures were developed by third parties, including the Michigan Terminal System (MTS) and MUSIC/SP.
Other mainframe operating systems
Control Data Corporation developed the SCOPE operating system in the 1960s for batch processing, and later developed the MACE operating system for time sharing, which was the basis for the later Kronos. In cooperation with the University of Minnesota, the Kronos and later the NOS operating systems were developed during the 1970s; these supported simultaneous batch and timesharing use. Like many commercial timesharing systems, their interface was an extension of the DTSS time-sharing system, one of the pioneering efforts in timesharing and programming languages.
In the late 1970s, Control Data and the University of Illinois developed the PLATO system, which used plasma panel displays and long-distance time sharing networks. PLATO was remarkably innovative for its time; the shared memory model of PLATO's TUTOR programming language allowed applications such as real-time chat and multi-user graphical games.
UNIVAC, the first commercial computer manufacturer, produced a series of EXEC operating systems. Like all early mainframe systems, these were batch-oriented systems that managed magnetic drums, disks, card readers, and line printers. In the 1970s, UNIVAC produced the Real-Time Basic (RTB) system to support large-scale time sharing, also patterned after the Dartmouth BASIC system.
Burroughs Corporation introduced the B5000 in 1961 with the MCP (Master Control Program) operating system. The B5000 was a stack machine designed exclusively to support high-level languages, with no software, not even at the lowest level of the operating system, written directly in machine language or assembly language; the MCP was the first OS to be written entirely in a high-level language (ESPOL, a dialect of ALGOL), although ESPOL had specialized statements for each "syllable" (opcode) in the B5000 instruction set. MCP also introduced many other ground-breaking innovations, such as being one of the first commercial implementations of virtual memory. The rewrite of MCP for the B6500 is still in use today in the Unisys ClearPath/MCP line of computers.
GE introduced the GE 600 series with the General Electric Comprehensive Operating Supervisor (GECOS) operating system. After Honeywell acquired GE's computer business, it was renamed to General Comprehensive Operating System (GCOS). Honeywell expanded the use of the GCOS name to cover all its operating systems in the 1970s, though many of its computers had nothing in common with the earlier GE 600 series and their operating systems were not derived from the original GECOS.
Project MAC at MIT, working with GE and Bell Labs, developed Multics, which introduced the concept of ringed security privilege levels.
Digital Equipment Corporation developed many operating systems for its various computer lines, including TOPS-10 and TOPS-20 time sharing systems for the 36-bit PDP-10 class systems. Before the widespread use of Unix, TOPS-10 was a particularly popular system in universities, and in the early ARPANET community.
In the late 1960s through the late 1970s, several hardware capabilities evolved that allowed similar or ported software to run on more than one system. Early systems had utilized microprogramming to implement features on their systems in order to permit machines with different underlying hardware to appear the same as others in a series. In fact, most 360s after the 360/40 (except for the largest models, which were hardwired) were microprogrammed implementations. But soon other means of achieving application compatibility proved to be more significant.
The Universal Time-Sharing System (UTS) was an operating system for the XDS Sigma line of computers, succeeding BTM/BPM. UTS was announced in 1966, but because of delays did not actually ship until 1971. It was designed to provide multi-programming services for online (interactive) user programs in addition to batch-mode production jobs, symbiont (spooled) I/O, and critical real-time processes. System daemons, called "ghost jobs", were used to run monitor code in user space. The final release, D00, shipped in January 1973. It was succeeded by the CP-V operating system, which combined UTS with the heavily batch-oriented XOS.
The CP-V operating system, the compatible successor to UTS, was released in August 1973. CP-V supported the same CPUs as UTS plus the Xerox 560. CP-V offered "single-stream and multiprogrammed batch; timesharing; and the remote processing mode, including intelligent remote batch." Realtime processing was added in release B00 in April 1974, and transaction processing in release C00 in November 1974.
CP-R (Control Program for Real-Time) was a realtime variant of CP-V for Xerox 550 and Sigma 9 computer systems. CP-R supported three types of tasks: Foreground Primary Tasks, Foreground Secondary Tasks, and Batch Tasks.
In 1975 Xerox sold its computer business to Honeywell, which ported CP-V to its 6000 computer line and renamed it CP-6. CP-6 was discontinued in 1988.
Minicomputers and the rise of Unix
The Unix operating system was developed at AT&T Bell Laboratories in the late 1960s. Because it was essentially free in early editions, easily obtainable, and easily modified, it achieved wide acceptance. It also became a requirement within the Bell System operating companies. Since it was written in the C language, when that language was ported to a new machine architecture, Unix was also able to be ported. This portability permitted it to become the choice for a second generation of minicomputers and the first generation of workstations. Through widespread use it exemplified the idea of an operating system that was conceptually the same across various hardware platforms. It was still owned by AT&T, however, and that limited its use to groups or corporations that could afford to license it. It became one of the roots of the free software and open source movements.
Digital Equipment Corporation, meanwhile, created several operating systems for its 16-bit PDP-11 class machines, including the simple RT-11 system, the time-sharing RSTS operating systems, and the RSX-11 family of real-time operating systems, as well as the VMS system for the 32-bit VAX computer.
Another system that evolved in this time frame was the Pick operating system. The Pick system was developed and sold by Microdata Corporation, which created the precursors of the system. It is an example of a system that started as a database application support program and graduated to system work.
The microcomputer: 8-bit home computers and game consoles
Beginning in the mid-1970s, a new class of small computers came onto the marketplace. Featuring 8-bit processors, typically the MOS Technology 6502, Intel 8080, or Zilog Z-80, along with rudimentary input and output interfaces and as much RAM as practical, these systems started out as kit-based hobbyist computers but soon evolved into essential business tools.
Home computers
While many 8-bit home computers of the 1980s, such as the Commodore 64, the Apple II series, the Atari 8-bit family, the Amstrad CPC, the ZX Spectrum series, and others could load a third-party disk-loading operating system, such as CP/M or GEOS, they were generally used without one. Their built-in operating systems were designed in an era when floppy disk drives were very expensive and not expected to be used by most users, so the standard storage device on most was a tape drive using ordinary compact cassettes. Most, if not all, of these computers shipped with a built-in BASIC interpreter on ROM, which also served as a crude command line interface, allowing the user to load a separate disk operating system to perform file management commands and load and save to disk. The most popular home computer, the Commodore 64, was a notable exception, as its DOS was on ROM in the disk drive hardware, and the drive was addressed identically to printers, modems, and other external devices.
More elaborate operating systems were not needed in part because most such machines were used for entertainment and education, and seldom used for more serious business or science purposes.
Another reason is that the hardware they evolved on initially shipped with minimal amounts of computer memory (4 to 8 kilobytes was standard on early home computers) as well as 8-bit processors without specialized support circuitry such as an MMU or even a dedicated real-time clock. On this hardware, the overhead of a complex operating system supporting multiple tasks and users would likely have compromised the performance of the machine without really being needed.
Video games, and even the spreadsheet, database, and word-processing programs available for home computers, were mostly self-contained programs that took over the machine completely. Although integrated software existed for these computers, such packages usually lacked features compared to their standalone equivalents, largely due to memory limitations. Data exchange was mostly performed through standard formats like ASCII text or CSV, or through specialized file conversion programs.
Game consoles and video games
Since virtually all video game consoles and arcade cabinets designed and built after 1980 were true digital machines (unlike the analog Pong clones and derivatives), some of them carried a minimal form of BIOS or built-in game, such as the ColecoVision, the Sega Master System, and the SNK Neo Geo. There were, however, successful designs where a BIOS was not needed, such as the original Nintendo Entertainment System and its clones.
Modern-day game consoles and video games, starting with the PC-Engine, all have a minimal BIOS that also provides some interactive utilities such as memory card management, audio or video CD playback, and copy protection, and that sometimes carries libraries for developers to use. Few of these cases, however, would qualify as a true operating system.
The most notable exceptions are probably the Dreamcast game console, which includes a minimal BIOS like the PlayStation but can load the Windows CE operating system from the game disc, allowing easy porting of games from the PC world, and the Xbox game console, which is little more than a disguised Intel-based PC running a secret, modified version of Microsoft Windows in the background. Furthermore, there are Linux versions that will run on a Dreamcast and later game consoles as well.
Long before that, Sony had released a kind of development kit called the Net Yaroze for its first PlayStation platform, which provided a series of programming and development tools to be used with a normal PC and a specially modified "Black PlayStation" that could be interfaced with a PC and download programs from it. These operations generally require a functional OS on both platforms involved.
In general, it can be said that video game consoles and coin-operated arcade machines used at most a built-in BIOS during the 1970s, 1980s, and most of the 1990s, while from the PlayStation era and beyond they became more and more sophisticated, to the point of requiring a generic or custom-built OS to aid in development and expandability.
The personal computer era: Apple, Amiga, PC/MS/DR-DOS and beyond
The development of microprocessors made inexpensive computing available for the small business and hobbyist, which in turn led to the widespread use of interchangeable hardware components using a common interconnection (such as the S-100, SS-50, Apple II, ISA, and PCI buses), and an increasing need for "standard" operating systems to control them. The most important of the early OSes on these machines was Digital Research's CP/M-80 for the 8080/8085/Z-80 CPUs. It was based on several Digital Equipment Corporation operating systems, mostly for the PDP-11 architecture. Microsoft's first operating system, M-DOS, was designed along the lines of many PDP-11 features, but for microprocessor-based systems. MS-DOS, or PC-DOS when supplied by IBM, was based originally on CP/M-80. Each of these machines had a small boot program in ROM which loaded the OS itself from disk. The BIOS on the IBM-PC class machines was an extension of this idea and has accreted more features and functions in the decades since the first IBM PC was introduced in 1981.
The decreasing cost of display equipment and processors made it practical to provide graphical user interfaces for many operating systems, such as the generic X Window System that is provided with many Unix systems, or other graphical systems such as Microsoft Windows, the RadioShack Color Computer's OS-9 Level II/MultiVue, Commodore's AmigaOS, Apple's Mac OS, or even IBM's OS/2. The original GUI was developed on the Xerox Alto computer system at the Xerox Palo Alto Research Center in the early 1970s and commercialized by many vendors.
The rise of virtualization
Operating systems originally ran directly on the hardware itself and provided services to applications. With CP-67 on the IBM System/360 Model 67 and Virtual Machine Facility/370 (VM/370) on System/370, IBM introduced the notion of the virtual machine, in which the operating system itself runs under the control of a hypervisor instead of being in direct control of the hardware. VMware popularized this technology on personal computers. Over time, the line between virtual machines, monitors, and operating systems was blurred:
- Hypervisors grew more complex, gaining their own application programming interfaces, memory management, or file systems.
- Virtualization became a key feature of operating systems, as exemplified by Hyper-V in Windows Server 2008 or HP Integrity Virtual Machines in HP-UX.
- In some systems, such as POWER5 and POWER6-based servers from IBM, the hypervisor is no longer optional.
- Applications have been re-designed to run directly on a virtual machine monitor.
In many ways, virtual machine software today plays the role formerly held by the operating system, including managing the hardware resources (processor, memory, I/O devices), applying scheduling policies, and allowing system administrators to manage systems.
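The basic relationship described above, where a guest operating system runs normally until it attempts a privileged operation that the hypervisor must intercept, can be pictured with a small toy sketch. The following Python example is purely illustrative: the opcodes and the single-register "guest" are invented for this sketch and do not represent any real hypervisor or machine.

```python
# Toy trap-and-emulate loop (illustrative only; the opcodes and the
# one-register "guest" are invented for this sketch).

PRIVILEGED = {"OUT", "HALT"}        # operations the guest may not do directly

def hypervisor_run(guest_program):
    acc = 0                         # the toy guest's only register
    for op, arg in guest_program:
        if op not in PRIVILEGED:
            # Unprivileged work runs "directly", without hypervisor involvement.
            if op == "ADD":
                acc += arg
        else:
            # A privileged operation traps to the hypervisor, which emulates
            # it on the guest's behalf and then resumes the guest.
            if op == "OUT":
                print(f"[hypervisor] guest output: {acc}")
            elif op == "HALT":
                print("[hypervisor] guest halted")
                return acc
    return acc

# A guest that adds two numbers and asks the "hardware" to print the result.
hypervisor_run([("ADD", 2), ("ADD", 3), ("OUT", None), ("HALT", None)])
```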