INTRODUCTION TO PART I
FOUNDATIONS OF COMPUTER SECURITY
The foundations of computer security include answers to the superficially simple question “What is this all about?” Our first part establishes a technological and historical context for information assurance so that readers will have a broad understanding of why information assurance matters in the real world. Chapters focus on principles that will underlie the rest of the text: historical perspective on the development of our field; how to conceptualize the goals of information assurance in a well-ordered schema that can be applied universally to all information systems; computer hardware and network elements underlying technical security; history and modern developments in cryptography; and how to discuss breaches of information security using a common technical language so that information can be shared, accumulated, and analyzed.
Readers also learn or review the basics of commonly used mathematical models of information-security concepts and how to interpret survey data and, in particular, the pitfalls of self-selection in sampling about crimes. Finally, the first section of the text introduces elements of law (U.S. and international) applying to information assurance. This legal framework from a layman’s viewpoint provides a basis for understanding later chapters; in particular, when examining privacy laws and management’s fiduciary responsibilities.
Chapter titles and topics in Part I include:
1. Brief History and Mission of Information System Security. An overview focusing primarily on developments in the second half of the twentieth century and the first decade of the twenty-first century
2. History of Computer Crime. A review of key computer crimes and notorious computer criminals from the 1970s to the mid-2000s
3. Toward a New Framework for Information Security. A systematic and thorough conceptual framework and terminology for discussing the nature and goals of securing all aspects of information, not simply the classic triad of confidentiality, integrity, and availability
4. Hardware Elements of Security. A review of computer and network hardware underlying discussions of computer and network security
5. Data Communications and Information Security. Fundamental principles and terminology of data communications, and their implications for information assurance
6. Local Area Network Topologies, Protocols, and Design. Information assurance of the communications infrastructure
7. Encryption. Historical perspectives on cryptography and steganography from ancient times to today as fundamental tools in securing information
8. Using a Common Language for Computer Security Incident Information. An analytic framework for understanding, describing, and discussing security breaches by using a common language of well-defined terms
9. Mathematical Models of Computer Security. A review of the most commonly referenced mathematical models used to describe information-security functions
10. Understanding Studies and Surveys of Computer Crime. Scientific and statistical principles for understanding studies and surveys of computer crime
11. Fundamentals of Intellectual Property Law. An introductory review of cyberlaw: laws governing computer-related crime, including contracts, and intellectual property (trade secrets, copyright, patents, open-source models). Also, violations (piracy, circumvention of technological defenses), computer intrusions, and international frameworks for legal cooperation
CHAPTER 1
BRIEF HISTORY AND MISSION OF INFORMATION SYSTEM SECURITY
Seymour Bosworth and Robert V. Jacobson
1.1 INTRODUCTION TO INFORMATION SYSTEM SECURITY
1.2 EVOLUTION OF INFORMATION SYSTEMS
    1.2.1 1950s: Punched-Card Systems
    1.2.2 Large-Scale Computers
    1.2.3 Medium-Size Computers
    1.2.4 1960s: Small-Scale Computers
    1.2.5 Transistors and Core Memory
    1.2.6 Time Sharing
    1.2.7 Real-Time, Online Systems
    1.2.8 A Family of Computers
    1.2.9 1970s: Microprocessors
    1.2.10 The First Personal Computers
    1.2.11 The First Network
    1.2.12 Further Security Considerations
    1.2.13 The First “Worm”
    1.2.14 1980s: Productivity Enhancements
    1.2.15 1980s: The Personal Computer
    1.2.16 Local Area Networks
    1.2.17 1990s: Interconnection
    1.2.18 1990s: Total Interconnection
    1.2.19 Telecommuting
    1.2.20 Internet and the World Wide Web
    1.2.21 Virtualization and the Cloud
    1.2.22 Supervisory Control and Data Acquisition
1.3 GOVERNMENT RECOGNITION OF INFORMATION ASSURANCE
    1.3.1 IA Standards
    1.3.2 Computers at Risk
    1.3.3 InfraGard
1.4 RECENT DEVELOPMENTS
1.5 ONGOING MISSION FOR INFORMATION SYSTEM SECURITY
1.6 NOTES
1.1 INTRODUCTION TO INFORMATION SYSTEM SECURITY. The growth of computers and of information technology has been explosive. Never before has an entirely new technology been propagated around the world with such speed and with so great a penetration of virtually every human activity. Computers have brought vast benefits to fields as diverse as human genome studies, space exploration, artificial intelligence, and a host of applications from the trivial to the most life-enhancing.
Unfortunately, there is also a dark side to computers: They are used to design and build weapons of mass destruction as well as military aircraft, nuclear submarines,
and reconnaissance space stations. The computer’s role in formulating biologic and chemical weapons, and in simulating their deployment, is one of its least auspicious uses.
Of somewhat lesser concern, computers used in financial applications, such as facilitating the purchase and sale of everything from matchsticks to mansions, and transferring trillions of dollars each day in electronic funds, are irresistible to miscreants; many of them see these activities as open invitations to fraud and theft. Computer systems, and their interconnecting networks, are also prey to vandals, malicious egotists, terrorists, and an array of individuals, groups, companies, and governments intent on using them to further their own ends, with total disregard for the effects on innocent victims. Besides these intentional attacks on computer systems, there are innumerable ways in which inadvertent errors can damage or destroy a computer’s ability to perform its intended functions.
Because of these security problems and because of a great many others described in this volume, the growth of information systems security has paralleled that of the computer field itself. Only by a detailed study of the potential problems, and implementation of the suggested solutions, can computers be expected to fulfill their promise, with few of the security lapses that plague less adequately protected systems. This chapter defines a few of the most important terms of information security and includes a very brief history of computers and information systems, as a prelude to the works that follow.

Security can be defined as the state of being free from danger and not exposed to
damage from accidents or attack, or it can be defined as the process for achieving that desirable state. The objective of information system security1 is to optimize the performance of an organization with respect to the risks to which it is exposed. Risk is defined as the chance of injury, damage, or loss. Thus, risk has two elements:
(1) chance—an element of uncertainty, and (2) potential loss or damage. Except for the possibility of restitution, information system security actions taken today work to reduce future risk losses. Because of the uncertainty about future risk losses, perfect security, which implies zero losses, would be infinitely expensive. For this reason, risk managers strive to optimize the allocation of resources by minimizing the total cost of information system security measures taken and the risk losses experienced. This optimization process is commonly referred to as risk management. Risk management in this sense is a three-part process:
1. Identification of material risks
2. Selection and implementation of measures to mitigate the risks
3. Tracking and evaluation of risk losses experienced, in order to validate the first two parts of the process
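To make the optimization concrete, consider a minimal sketch in Python (not drawn from the Handbook’s own methods; all risks, mitigation plans, and dollar figures are invented for illustration). Expected annual loss is computed as chance times potential loss, and the plan chosen is the one that minimizes the sum of the cost of security measures and the residual expected losses:

    # Hypothetical sketch: choose the mitigation plan that minimizes
    # total cost = cost of security measures + residual expected risk loss.
    # All risks, plans, and dollar figures below are invented for illustration.

    risks = {
        # risk name: (annual chance of occurrence, potential loss in dollars)
        "payroll fraud": (0.30, 250_000),
        "data center fire": (0.02, 2_000_000),
    }

    plans = {
        # plan name: (annual cost of measures, fraction of expected loss removed)
        "do nothing": (0, 0.0),
        "basic controls": (20_000, 0.6),
        "extensive controls": (90_000, 0.9),
    }

    def expected_loss(risk_table):
        """Expected annual loss: sum of chance times potential loss over all risks."""
        return sum(chance * loss for chance, loss in risk_table.values())

    def total_cost(plan_cost, mitigation, risk_table):
        """Cost of measures plus the expected losses that remain after mitigation."""
        return plan_cost + (1 - mitigation) * expected_loss(risk_table)

    for name, (cost, mitigation) in plans.items():
        print(f"{name:20s} total cost: ${total_cost(cost, mitigation, risks):,.0f}")

    best = min(plans, key=lambda name: total_cost(*plans[name], risks))
    print("Lowest total cost:", best)

In this invented example the middle plan wins: spending nothing leaves the full expected loss in place, while the most elaborate controls cost more than the additional losses they avert.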
The purpose of this Handbook is to describe information security system risks, the measures available to mitigate these risks, and techniques for managing security risks. (For a more detailed discussion of risk assessment and management, see Chapters 47 and 54.)
Risk management has been a part of business for centuries. Renaissance merchants often used several vessels simultaneously, each carrying a portion of the merchandise, so that the loss of a single ship would not result in loss of the entire lot. At almost the same time, the concept of insurance evolved, first to provide economic protection against the loss of cargo and later to provide protection against the loss of buildings
by fire. Fire insurers and municipal authorities began to require adherence to standards intended to reduce the risk of catastrophes like the Great Fire of London in 1666. The Insurance Institute was established in London one year later. With the emergence of corporations as limited liability stock companies, corporate directors have been required to use prudence and due diligence in protecting shareholders’ assets. Security risks are among the threats to corporate assets that directors have an obligation to address.
Double-entry bookkeeping, another Renaissance invention, proved to be an excellent tool for measuring and controlling corporate assets. One objective was to make insider fraud more difficult to conceal. The concept of separation of duties emerged, calling for the use of processing procedures that required more than one person to complete a transaction. As the books of account became increasingly important, accounting standards were developed, and they continue to evolve to this day. These standards served to make books of account comparable and to assure outsiders that an organization’s books of account presented an accurate picture of its condition and assets. These developments led, in turn, to the requirement that an outside auditor perform an independent review of the books of account and operating procedures.
The transition to automated accounting systems introduced additional security requirements. Some early safeguards, such as the rule against erasures or changes in the books of account, no longer applied. Some computerized accounting systems lacked an audit trail, and others could have the audit trail subverted as easily as actual entries.
Finally, with the advent of the Information Age, intellectual property has become an increasingly important part of corporate and governmental assets. At the same time that intellectual property has grown in importance, threats to intellectual property have become more dangerous, because of information system (IS) technology itself. When sensitive information was stored on paper and other tangible documents, and rapid copying was limited to photography, protection was relatively straightforward. Nevertheless, document control systems, information classification procedures, and need-to-know access controls were not foolproof, and information compromises occurred with dismaying regularity. Evolution of IS technology has made information control several orders of magnitude more complex. The evolution and, more importantly, the implementation of control techniques have not kept pace.
The balance of this chapter describes how the evolution of information systems has caused a parallel evolution of information system security and at the same time has increased the importance of anticipating the impact of technical changes yet to come. This overview will clarify the factors leading to today’s information system security risk environment and mitigation techniques and will serve as a warning to remain alert to the implication of technical innovations as they appear. The remaining chapters of this Handbook discuss information system security risks, threats, and vulnerabilities, their prevention and remediation, and many related topics in considerable detail.
1.2 EVOLUTION OF INFORMATION SYSTEMS. The first electromechanical punched-card system for data processing, developed by Herman Hollerith at the end of the nineteenth century, was used to tabulate and total census field reports for the U.S. Bureau of the Census in 1890. The first digital, stored-program computers developed in the 1940s were used for military purposes, primarily cryptanalysis and the calculation and printing of artillery firing tables. At the same time, punched-card systems were already being used for accounting applications and were an obvious choice for data input to the new electronic computing machines.
1.2.1 1950s: Punched-Card Systems. In the 1950s, punched-card equipment dominated the commercial computer market.2 These electromechanical devices could perform the full range of accounting and reporting functions. Because they were programmed by an intricate system of plugboards with a great many plug-in cables, and because care had to be exercised in handling and storing punched cards, only experienced persons were permitted near the equipment. Although any of these individuals could have set up the equipment for fraudulent use, or even engaged in sabotage, apparently few, if any, actually did so.
The punched-card accounting systems typically used four processing steps. As a preliminary, operators would be given a “batch” of documents, typically with an adding machine tape showing one or more “control totals.” The operator keyed the data on each document into a punched card and then added an extra card, the batch control card, which stored the batch totals. Each card consisted of 80 columns, each containing, at most, one character. A complete record of an inventory item, for example, would be contained on a single card. The card was called a unit record, and the machines that processed the cards were called either unit record or punched-card machines. It was from the necessity to squeeze as much data as possible into an 80-character card that the later Year 2000 problem arose. Compressing the year into two characters was a universally used space-saving measure; its consequences 40 years later were not foreseen.
A group of punched cards, also called a “batch,” was commonly held in a metal tray. Sometimes a batch would be rekeyed by a second operator, using a “verify mode” rather than actually punching new holes in the cards, in order to detect keypunch errors before processing the card deck. Each batch of cards would be processed separately, so the processes were referred to as “batch jobs.”
The first step would be to run the batch of cards through a simple program, which would calculate the control totals and compare them with the totals on the batch control card. If the batch totals did not reconcile, the batch was sent back to the keypunch area for rekeying. If the totals reconciled, the deck would be sort-merged with other batches of the same transaction type, for example, the current payroll. When this step was complete, the new batch consisted of a punched card for each employee in employee-number order. The payroll program accepted this input data card deck and processed the cards one by one. Each card was matched up with the corresponding employee’s card in the payroll master deck to calculate the current net pay and itemized deductions and to punch a new payroll master card, including year-to-date totals. The final step was to use the card decks to print payroll checks and management reports. These steps were identical with those used by early, small-scale electronic computers. The only difference was in the speed at which the actual calculations were made. A complete process was still known as a batch job.
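The control-total reconciliation in that first step is easy to express in modern terms. The short Python sketch below is purely illustrative (the record layout, field names, and amounts are hypothetical, and real systems totaled fields punched on cards rather than in-memory records): it recomputes the batch total, compares it with the total carried on the batch control card, and sends a mismatched batch back for rekeying.

    # Illustrative sketch of the batch-control step: recompute the control total
    # and reconcile it against the total punched on the batch control card.
    # The record layout, field names, and figures are hypothetical.

    def reconcile_batch(records, control_total):
        """Return True if the recomputed batch total matches the control card."""
        computed = sum(record["amount"] for record in records)
        return computed == control_total

    batch = [
        {"employee": "1001", "amount": 1250},
        {"employee": "1002", "amount": 980},
        {"employee": "1003", "amount": 1415},
    ]
    control_card_total = 3645  # total carried on the batch control card

    if reconcile_batch(batch, control_card_total):
        print("Totals reconcile: batch accepted for sort-merge and processing.")
    else:
        print("Totals do not reconcile: batch returned to keypunch for rekeying.")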
With this process, the potential for abuse was great. The machine operator could control every step of the operation. Although the data was punched into cards and verified by others, there was always a keypunch machine nearby for use by the machine operator. Theoretically, that person could punch a new payroll card and a new batch total card to match the change before printing checks and again afterward. The low incidence of reported exploits was due to the controls that discouraged such abuse, and possibly to the pride that machine operators experienced in their jobs.
1.2.2 Large-Scale Computers. While these electromechanical punched card machines were sold in large numbers, research laboratories and universities were working to design large-scale computers that would have a revolutionary effect on
the entire field. These computers, built around vacuum tubes, are known as the first generation. In March 1951, the first Universal Automatic Computer (UNIVAC) was accepted by the U.S. Census Bureau. Until then, every computer had been a one-off design, but UNIVAC was the first large-scale, mass-produced computer, with a total of 46 built. The word “universal” in its name indicated that UNIVAC was also the first computer designed for both scientific and business applications.3
UNIVAC contained 5,200 vacuum tubes, weighed 29,000 pounds, and consumed 125 kilowatts of electrical power. It dispensed with punched cards, receiving input from half-inch-wide metal tape recorded from keyboards, with output either to a similar tape or to a printer. Although not a model for future designs, its memory consisted of 1,000 72-bit words and was fabricated as a mercury delay line. Housed in a cabinet about six feet tall, two feet wide, and two feet deep was a mercury-filled coil running from top to bottom. A transducer at the top propagated slow-moving waves of energy down the coil to a receiving transducer at the bottom. There it was reconverted into electrical energy and passed on to the appropriate circuit, or recirculated if longer storage was required.
In 1956, IBM introduced the Random Access Method of Accounting and Control (RAMAC) magnetic disk system. It consisted of 50 magnetically coated metal disks, each 24 inches in diameter, mounted on a common spindle. Under servo control, two coupled read/write heads moved to span each side of the required disk and then inward to any one of 100 tracks. In one revolution of the disks, any or all of the information on those two tracks could be read out or recorded. The entire system was almost the size of a compact car and held what, for that time, was a tremendous amount of data—5 megabytes. The cost was $10,000 per megabyte, or $35,000 per year to lease. This compares with some of today’s magnetic hard drives that measure about 3.5 inches wide by 1 inch high, store as much as 1,000 gigabytes, and cost less than $400, or about $0.0004 per megabyte.
Those early, massive computers were housed in large, climate-controlled rooms. Within the room, a few knowledgeable experts, looking highly professional in their white laboratory coats, attended to the operation and maintenance of their million-dollar charges. The concept of a “user” as someone outside the computer room who could interact directly with the actual machine did not exist.
Service interruptions, software errors, and hardware errors were usually not critical. If any of these caused a program to fail or abort, beginning again was a relatively simple matter. Consequently, the primary security concerns were physical protection of the scarce and expensive hardware, and measures to increase their reliability. Another issue, then as now, was human fallibility. Because the earliest computers were programmed in extremely difficult machine language, consisting solely of ones (1s) and zeros (0s), the incidence of human error was high, and the time to correct errors was excessively long. Only later were assembler and compiler languages developed to increase the number of people able to program the machines and to reduce the incidence of errors and the time to correct them.
Information system security for large-scale computers was not a significant issue then for two reasons. First, only a few programming experts were able to utilize and manipulate computers. Second, there were very few computers in use, each of which was extremely valuable, important to its owners, and consequently closely guarded.
1.2.3 Medium-Size Computers. In the 1950s, smaller computer systems were developed with a very simple configuration; punched-card master files were replaced by punched paper tape and, later, by magnetic tape, and disk storage systems. The
electromechanical calculator with its patchboard was replaced by a central processor unit (CPU) that had a small main memory, sometimes as little as 8 kilobytes,4 and limited processing speed and power. One or two punched-card readers could read the data and instructions stored on that medium. Later, programs and data files were stored on magnetic tape. Output data were sent to cardpunches, for printing on unit record equipment, and later to magnetic tape. There was still no wired connection to the outside world, and there were no online users because no one, besides electronic data processing (EDP) people within the computer room, could interact directly with the system. These systems had very simple operating systems and did not use multiprocessing; they could run only one program at a time.
The IBM Model 650, as an example, introduced in 1954, measured about 5 feet by 3 feet by 6 feet and weighed almost 2,000 pounds. Its power supply was mounted in a similarly sized cabinet, weighing almost 3,000 pounds. It had 2,000 (10-digit) words of magnetic drum primary memory, with a total price of $500,000 or a rental fee of $3,200 per month. For an additional $1,500 per month, a much faster core memory, of 60 words, could be added. Input and output both utilized read/write punch-card machines. The typical 1950s IS hardware was installed in a separate room, often with a viewing window so that visitors could admire the computer. In an early attempt at security, visitors actually within the computer room were often greeted by a printed sign saying:
Achtung! Alles Lookenspeepers!
Das computermachine ist nicht fur gefingerpoken und mittengrabben. Ist easy schnappen der springenwerk, blowenfusen, und poppencorken mit spitzensparken. Ist nicht fur gewerken bei das dumbkopfen. Das rubbernecken sightseeren keepen hans in das pockets muss. Relaxen und watch das blinkenlichten.
Since there were still no online users, there were no user IDs and passwords. Programs processed batches of data, run at a regularly scheduled time—once a day, once a week, and so on, depending on the function. If the data for a program were not available at the scheduled run time, the operators might run some other job instead and wait for the missing data. As the printed output reports became available, they were delivered by hand to their end users. End users did not expect to get a continuous flow of data from the information processing system, and delays of even a day or more were not significant, except perhaps with paycheck production.
Information system security was hardly thought of as such. The focus was on batch controls for individual programs, physical access controls, and maintaining a proper environment for the reliable operation of the hardware.
1.2.4 1960s: Small-Scale Computers. During the 1960s, before the introduction of small-scale computers, dumb5 terminals provided users with a keyboard to send a character stream to the computer and a video screen that could display characters transmitted to it by the computer. Initially, these terminals were used to help computer operators control and monitor the job stream, while replacing banks of switches and indicator lights on the control console. However, it was soon recognized that these terminals could replace card readers and keypunch machines as well. Now users, identified by user IDs, and authenticated with passwords, could enter input data through a CRT terminal into an edit program, which would validate the input and then store it on a hard drive until it was needed for processing. Later, it was realized that users also could directly access data stored in online master files.
1.2.5 Transistors and Core Memory. The IBM 1401, introduced in 1960 with a core memory of 4,096 characters, was the first all-transistor computer, marking the advent of the second generation. Housed in a cabinet measuring 5 feet by 3 feet, the 1401 required a similar cabinet to add an additional 12 kilobytes of main memory. Just one year later, the first integrated circuits were used in a computer, making possible all future advances in miniaturizing small-scale computers and in reducing the size of mainframes significantly.
1.2.6 Time Sharing. In 1961, the Compatible Time Sharing System (CTSS) was developed for the IBM 7090/7094. This operating system software, and its associated hardware, was the first to provide simultaneous remote access to a group of online users through multiprogramming.6 “Multiprogramming” means that more than one program can appear to execute at the same time. A master control program, usually called an operating system (OS), managed execution of the functional applications programs. For example, under the command of the operator, the OS would load and start application #1. After 50 milliseconds, the OS would interrupt the execution of application #1 and store its current state in memory. Then the OS would start application #2 and allow it to run for 50 milliseconds, and so on. Usually, within a second after users had entered keyboard data, the OS would give their applications a time slice to process the input. During each time slice, the computer might execute hundreds of instructions. These techniques made it appear as if the computer were entirely dedicated to each user’s program. This was true only so long as the number of simultaneous users was fairly small. After that, as the number grew, the response to each user slowed down.
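As a rough illustration of that time-slicing idea, the Python sketch below models each application as a generator and the operating system as a round-robin loop that runs one “slice” of each program in turn. The program names and units of work are invented, and a real OS interrupts programs pre-emptively in hardware rather than waiting for them to yield.

    # Simplified illustration of multiprogramming by time slicing.
    # Each "application" is a generator; yielding stands in for the OS interrupting
    # the program, saving its state, and moving on to the next program in the queue.
    from collections import deque

    def application(name, units_of_work):
        for i in range(1, units_of_work + 1):
            print(f"{name}: completed unit of work {i}")
            yield  # the OS interrupts here; the generator object holds the saved state

    def operating_system(programs):
        """Round-robin scheduler: give each program one time slice in turn."""
        ready = deque(programs)
        while ready:
            program = ready.popleft()
            try:
                next(program)          # run one time slice
                ready.append(program)  # not finished: back of the queue
            except StopIteration:
                pass                   # program has completed

    operating_system([application("payroll", 3), application("inventory", 2)])

Running the sketch interleaves the two programs’ output, which is the effect the text describes: each user’s program appears to have the machine to itself as long as the number of simultaneous programs stays small.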
1.2.7 Real-Time, Online Systems. Because of multiprogramming and the ability to store records online and accessible in random order, it became feasible to provide end users with direct access to data. For example, an airline reservation system stores a record of every seat on every flight for the next 12 months. A reservation clerk, working at a terminal, can answer a telephoned inquiry, search for an available seat on a particular flight, quote the fare, sell a ticket to the caller, and reserve the seat. Similarly, a bank officer can verify an account balance and effect money transfers. In both cases, each data record can be accessed and modified immediately, rather than having to wait for a batch to be run. Today, both the reservation clerk and the bank officer can be replaced by the customers themselves, who directly interface with the online computers.
While this advance led to a vast increase in available computing power, it also increased greatly the potential for breaches in computer security. With more complex operating systems, with many users online to sensitive programs, and with databases and other files available to them, protection had to be provided against inadvertent error and intentional abuse.
1.2.8 A Family of Computers. In 1964, IBM announced the S/360 family of computers, ranging from very small-scale to very large-scale models. All of the six models used integrated circuits, which marked the beginning of the third generation of computers. Where transistorized construction could permit up to 6,000 transistors per cubic foot, 30,000 integrated circuits could occupy the same volume. This lowered the costs substantially, and companies could buy into the family at a price within their means. Because all computers in the series used the same programming language and the same peripherals, companies could upgrade easily when necessary. The 360 family
quickly came to dominate the commercial and scientific markets. As these computers proliferated, so did the number of users, knowledgeable programmers, and technicians. Over the years, techniques and processes were developed to provide a high degree of security to these mainframe systems.
The year 1964 also saw the introduction of another computer with far-reaching influence: the Digital Equipment Corp. (DEC) PDP-8. The PDP-8 was the first mass-produced true minicomputer. Although its original application was in process control, the PDP-8 and its progeny quickly proved that commercial applications for minicomputers were virtually unlimited. Because these computers were not isolated in secure computer rooms but were distributed throughout many unguarded offices in widely dispersed locations, totally new risks arose, requiring innovative solutions.
1.2.9 1970s: Microprocessors. The foundations of all current personal computers (PCs) were laid in 1971 when Intel introduced the 4004 computer on a chip. Measuring 1/16 inch long by 1/8 inch high, the 4004 contained 2,250 transistors with a clock speed of 108 kilohertz. The current generation of this earliest programmable microprocessor contains millions of transistors, with speeds over 1 gigahertz, or more than 10,000 times faster. Introduction of microprocessor chips marked the fourth generation.
1.2.10 The First Personal Computers. Possibly the first personal computer was advertised in Scientific American in 1971. The KENBAK-1, priced at $750, had three programming registers, five addressing modes, and 256 bytes of memory. Although not many were sold, the KENBAK-1 did increase public awareness of the possibility for home computers.
It was the MITS Altair 8800 that became the first personal computer to sell in substantial quantities. Like the KENBAK-1, the Altair 8800 had only 256 bytes of memory, but it was priced at $375 without keyboard, display, or secondary memory. About one year later, the Apple II, designed by Steve Jobs and Steve Wozniak, was priced at $1,298, including a CRT display and a keyboard.
Because these first personal computers were entirely stand-alone and usually under the control of a single individual, there were few security problems. However, in 1978, the VisiCalc spreadsheet program was developed. The advantages of standardized, inexpensive, widely used application programs were unquestionable, but packaged programs, as opposed to custom designs, opened the way for abuse because so many people understood their user interfaces as well as their inner workings.
1.2.11 The First Network. A national network, conceived in late 1969, was born as ARPANET7 (Advanced Research Projects Agency Network), a Department of Defense–sponsored effort to link a few of the country’s important research universities, with two purposes: to develop experience in interconnecting computers and to increase productivity through resource sharing. This earliest connection of independent large-scale computer systems had just four nodes: the University of California at Los Angeles (UCLA), the University of California at Santa Barbara, Stanford Research Institute, and the University of Utah. Because of the inherent security in each leased-line interconnected node, and the physically protected mainframe computer rooms, there was no apparent concern for security issues. It was this simple network, with no thought of security designed in, from which evolved today’s ubiquitous Internet and the World Wide Web (WWW), with their vast potential for security abuses.
1.2.12 Further Security Considerations. With the proliferation of remote terminals on commercial computers, physical control over access to the computer room was no longer sufficient. In response to the new vulnerabilities, logical access control systems were developed. An access control system maintains an online table of authorized users. A typical user record would store the user’s name, telephone number, employee number, and information about the data the user was authorized to access and the programs the user was authorized to execute. A user might be allowed to view, add, modify, and delete data records in different combinations for different programs.
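Such a table can be pictured as a simple lookup structure. The Python sketch below is only an illustration of the idea (the user IDs, programs, and permission names are hypothetical): each user record lists the operations permitted in each program, and an access check consults that record before an operation is allowed.

    # Illustrative sketch of a logical access control table.
    # Each user record lists, per program, the operations the user may perform.
    # The user IDs, programs, and permission names are hypothetical.

    access_table = {
        "jsmith": {
            "name": "J. Smith",
            "employee_number": "4821",
            "permissions": {
                "payroll": {"view"},
                "inventory": {"view", "add", "modify"},
            },
        },
    }

    def is_authorized(user_id, program, operation):
        """Return True only if the user may perform the operation in the program."""
        user = access_table.get(user_id)
        if user is None:
            return False  # unknown user IDs are denied outright
        return operation in user["permissions"].get(program, set())

    print(is_authorized("jsmith", "inventory", "modify"))  # True
    print(is_authorized("jsmith", "payroll", "delete"))    # False
    print(is_authorized("unknown", "payroll", "view"))     # False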