Michael E. Whitman, Ph.D., CISM, CISSP
Herbert J. Mattord, Ph.D., CISM, CISSP
Andrew Green, MSIS
Kennesaw State University
Principles of Incident Response and Disaster Recovery, Second Edition
This is an electronic version of the print textbook. Due to electronic rights restrictions, some third party content may be suppressed. Editorial review has deemed that any suppressed content does not materially affect the overall learning experience. The publisher reserves the right to
remove content from this title at any time if subsequent rights restrictions require it. For valuable information on pricing, previous editions, changes to current editions, and alternate formats, please visit www.cengage.com/highered to search by
ISBN#, author, title, or keyword for materials in your areas of interest.
Principles of Incident Response & Disaster Recovery, Second Edition
Michael E. Whitman, Herbert J. Mattord, Andrew Green
Vice President, Careers & Computing: Dave Garza
Acquisitions Editor: Nick Lombardi
Product Development Manager: Leigh Hefferon
Senior Product Manager: Michelle Ruelos Cannistraci
Brand Manager: Kristin McNary
Marketing Development Manager: Mark Linton
Marketing Coordinator: Elizabeth Murphy
Senior Production Director: Wendy Troeger
Production Manager: Andrew Crouth
Senior Content Project Manager: Andrea Majot
Art Director: GEX
Cover image: iStock.com
© 2014 Course Technology, Cengage Learning
ALL RIGHTS RESERVED. No part of this work covered by the copyright herein may be reproduced, transmitted, stored or used in any form or by any means graphic, electronic, or mechanical, including but not limited to photocopying, recording, scanning, digitizing, taping, Web distribution, information networks, or information storage and retrieval systems, except as permitted under Section 107 or 108 of the 1976 United States Copyright Act, without the prior written permission of the publisher.
For product information and technology assistance, contact us at Cengage Learning Customer & Sales Support, 1-800-354-9706.
For permission to use material from this text or product, submit all requests online at cengage.com/permissions.
Further permissions questions can be emailed to permissionrequest@cengage.com.
Library of Congress Control Number: 2013932024
ISBN-13: 978-1-111-13805-9
ISBN-10: 1-111-13805-2
Course Technology, 20 Channel Center Street, Boston, MA 02210, USA
Cengage Learning is a leading provider of customized learning solutions with office locations around the globe, including Singapore, the United Kingdom, Australia, Mexico, Brazil, and Japan. Locate your local office at: international.cengage.com/region
Cengage Learning products are represented in Canada by Nelson Education, Ltd.
For your lifelong learning solutions, visit www.cengage.com/coursetechnology
Purchase any of our products at your local college store or at our preferred online store www.cengagebrain.com
Visit our corporate website at cengage.com.
Some of the product names and company names used in this book have been used for identification purposes only and may be trademarks or registered trademarks of their respective manufacturers and sellers.
Any fictional data related to persons, companies, or URLs used throughout this book is intended for instructional purposes only. At the time this book was printed, any such data was fictional and did not belong to any real persons or companies.
Course Technology and the Course Technology logo are registered trademarks used under license.
Course Technology, a part of Cengage Learning, reserves the right to revise this publication and make changes from time to time in its content without notice.
The programs in this book are for instructional purposes only. They have been tested with care, but are not guaranteed for any particular intent beyond educational purposes. The author and the publisher do not offer any warranties or representations, nor do they accept any liabilities with respect to the programs.
Printed in the United States of America 1 2 3 4 5 6 7 16 15 14 13
To Rhonda, Rachel, Alex, and Meghan, thank you for your loving support. —MEW
To my daughter, Becky. Always stay strong. —HJM
For my nieces, Lexidoodle and Alliecat, and my nephew Timmy. —AG
Brief Contents
PREFACE. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . xv
CHAPTER 1 An Overview of Information Security and Risk Management . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1
CHAPTER 2 Planning for Organizational Readiness. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 47
CHAPTER 3 Contingency Strategies for IR/DR/BC . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 89
CHAPTER 4 Incident Response: Planning. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 131
CHAPTER 5 Incident Response: Detection and Decision Making . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 165
CHAPTER 6 Incident Response: Organizing and Preparing the CSIRT . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 231
CHAPTER 7 Incident Response: Response Strategies . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 267
CHAPTER 8 Incident Response: Recovery and Maintenance . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 313
CHAPTER 9 Disaster Recovery: Preparation and Implementation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 369
CHAPTER 10 Disaster Recovery: Operation and Maintenance . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 409
CHAPTER 11 Business Continuity Planning . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 437
CHAPTER 12 Crisis Management and International Standards in IR/DR/BC . . . . . . . . . . . . . . . . . . . . . . . . . . . 477
APPENDIX A Sample Business Continuity Plan for ABC Co. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 529
APPENDIX B Contingency Plan Template from the Computer Security Resource Center at the National Institute of Standards and Technology . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 537
APPENDIX C Sample Crisis Management Plan for Hierarchical Access, Ltd.. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 565
GLOSSARY . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 577
INDEX . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 583
Table of Contents
PREFACE. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . xv
CHAPTER 1 An Overview of Information Security and Risk Management . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1
Opening Case Scenario: Pernicious Proxy Probing. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 2
Introduction. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 2
Information Security . . . . . . . . . . . . 3
    Key Information Security Concepts . . . . . . . . . . . . 4
Overview of Risk Management . . . . . . . . . . . . 12
    Know Yourself . . . . . . . . . . . . 13
    Know the Enemy . . . . . . . . . . . . 14
    Risk Identification . . . . . . . . . . . . 14
    Risk Assessment . . . . . . . . . . . . 18
    Risk Control Strategies . . . . . . . . . . . . 21
Contingency Planning and Its Components . . . . . . . . . . . . 23
    Business Impact Analysis . . . . . . . . . . . . 23
    Incident Response Plan . . . . . . . . . . . . 23
    Disaster Recovery Plan . . . . . . . . . . . . 24
    Business Continuity Plan . . . . . . . . . . . . 25
    Contingency Planning Timeline . . . . . . . . . . . . 25
Role of Information Security Policy in Developing Contingency Plans . . . . . . . . . . . . 29
    Key Policy Definitions . . . . . . . . . . . . 30
    Enterprise Information Security Policy . . . . . . . . . . . . 31
    Issue-Specific Security Policy . . . . . . . . . . . . 31
    Systems-Specific Policy . . . . . . . . . . . . 33
    Policy Management . . . . . . . . . . . . 34
Chapter Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 34
Review Questions. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 36
Real-World Exercises . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 37
Hands-On Projects . . . . . . . . . . . . 37
    Virtualization . . . . . . . . . . . . 37
    Ethical Considerations in the Use of Information Security Tools . . . . . . . . . . . . 38
Example . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 41
Closing Case Scenario: Pondering People . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 42
Endnotes . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 44
CHAPTER 2 Planning for Organizational Readiness. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 47
Opening Case Scenario: Proper Planning Prevents Problems. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 48
Introduction. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 49
Beginning the Contingency Planning Process . . . . . . . . . . . . 49
    Commitment and Support of Senior Management . . . . . . . . . . . . 51
Elements Required to Begin Contingency Planning . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 52
Contingency Planning Policy . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 54
A Sample Generic Policy and High-Level Procedures for Contingency Plans . . . . . . . . . . . . . . . . . . . . . . . . . . 55
Business Impact Analysis . . . . . . . . . . . . 57
    Determine Mission/Business Processes and Recovery Criticality . . . . . . . . . . . . 58
    Identify Resource Requirements . . . . . . . . . . . . 62
    Identify System Resource Recovery Priorities . . . . . . . . . . . . 63
BIA Data Collection . . . . . . . . . . . . 64
    Online Questionnaires . . . . . . . . . . . . 64
    Facilitated Data-Gathering Sessions . . . . . . . . . . . . 71
    Process Flows and Interdependency Studies . . . . . . . . . . . . 72
    Risk Assessment Research . . . . . . . . . . . . 75
    IT Application or System Logs . . . . . . . . . . . . 75
    Financial Reports and Departmental Budgets . . . . . . . . . . . . 75
    Audit Documentation . . . . . . . . . . . . 76
    Production Schedules . . . . . . . . . . . . 76
Budgeting for Contingency Operations . . . . . . . . . . . . 76
    Incident Response Budgeting . . . . . . . . . . . . 76
    Disaster Recovery Budgeting . . . . . . . . . . . . 77
    Business Continuity Budgeting . . . . . . . . . . . . 78
    Crisis Management Budgeting . . . . . . . . . . . . 79
Chapter Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 79
Review Questions. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 80
Real-World Exercises . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 81
Hands-On Projects . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 82
Closing Case Scenario: Outrageously Odd Outages . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 86
Endnotes . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 86
CHAPTER 3 Contingency Strategies for IR/DR/BC . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 89
Opening Scenario: Panicking over Powder . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 90
Introduction. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 91
Data and Application Resumption . . . . . . . . . . . . 93
    Online Backups and the Cloud . . . . . . . . . . . . 94
    Disk to Disk to Other: Delayed Protection . . . . . . . . . . . . 94
    Redundancy-Based Backup and Recovery Using RAID . . . . . . . . . . . . 98
    Database Backups . . . . . . . . . . . . 100
    Application Backups . . . . . . . . . . . . 101
    Backup and Recovery Plans . . . . . . . . . . . . 102
    Real-Time Protection, Server Recovery, and Application Recovery . . . . . . . . . . . . 102
Site Resumption Strategies . . . . . . . . . . . . 110
    Exclusive Site Resumption Strategies . . . . . . . . . . . . 110
    Shared-Site Resumption Strategies . . . . . . . . . . . . 113
    Service Agreements . . . . . . . . . . . . 115
Chapter Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 120
Review Questions. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 122
Real-World Exercises . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 123
Hands-On Projects . . . . . . . . . . . . 123
    Hands-On Project 3-1: Command-line Backup Using rdiff-backup . . . . . . . . . . . . 124
    Hands-On Project 3-2: Copying Virtual Images . . . . . . . . . . . . 126
Closing Case Scenario: Disaster Denied . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 129
Endnotes . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 129
CHAPTER 4 Incident Response: Planning. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 131
Opening Case Scenario: DDoS Dilemma . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 132
Introduction. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 133
The IR Planning Process . . . . . . . . . . . . 133
    Forming the IR Planning Team . . . . . . . . . . . . 135
Developing the Incident Response Policy . . . . . . . . . . . . 136
    Building the Computer Security Incident Response Team . . . . . . . . . . . . 138
Incident Response Planning . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 138
Information for attack success end case . . . . . . . . . . . . 140
    Planning for the Response During the Incident . . . . . . . . . . . . 140
    Planning for “After the Incident” . . . . . . . . . . . . 142
Reaction! . . . . . . . . . . . . 143
    Planning for “Before the Incident” . . . . . . . . . . . . 144
The CCDC . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 147
Assembling and Maintaining the Final IR Plan . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 152
Chapter Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 154
Review Questions. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 155
Real-World Exercises . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 156
Hands-On Projects . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 156
Closing Case Scenario: The Never-Ending Story . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 163
Endnotes . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 163
CHAPTER 5 Incident Response: Detection and Decision Making . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 165
Opening Case Scenario: Oodles of Open Source Opportunities . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 166
Introduction. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 167
Detecting Incidents . . . . . . . . . . . . 168
    Possible Indicators of an Incident . . . . . . . . . . . . 168
    Probable Indicators of an Incident . . . . . . . . . . . . 169
Technical Details: Rootkits . . . . . . . . . . . . 170
    Definite Indicators . . . . . . . . . . . . 172
    Identifying Real Incidents . . . . . . . . . . . . 173
Intrusion Detection and Prevention Systems . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 174
Technical Details: Processes and Services . . . . . . . . . . . . 175
    IDPS Terminology . . . . . . . . . . . . 183
    Why Use an IDPS? . . . . . . . . . . . . 185
    IDPS Network Placement . . . . . . . . . . . . 188
Technical Details: Ports and Port Scanning . . . . . . . . . . . . 193
    IDPS Detection Approaches . . . . . . . . . . . . 204
    Automated Response . . . . . . . . . . . . 206
Incident Decision Making . . . . . . . . . . . . 208
    Collection of Data to Aid in Detecting Incidents . . . . . . . . . . . . 210
    Challenges in Intrusion Detection . . . . . . . . . . . . 215
Chapter Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 215
Review Questions. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 217
Real-World Exercises . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 218
Hands-On Projects . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 219
Closing Case Scenario: Jokes with JJ . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 226
Endnotes . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 227
CHAPTER 6 Incident Response: Organizing and Preparing the CSIRT . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 231
Opening Case Scenario: Trouble in Tuscaloosa . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 232
Introduction. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 233
Building the CSIRT . . . . . . . . . . . . 233
    Step 1: Obtaining Management Support and Buy-In . . . . . . . . . . . . 234
    Step 2: Determining the CSIRT Strategic Plan . . . . . . . . . . . . 234
    Step 3: Gathering Relevant Information . . . . . . . . . . . . 240
    Step 4: Designing the CSIRT Vision . . . . . . . . . . . . 240
A Sample Generic Policy and High-Level Procedures for Contingency Plans . . . . . . . . . . . . 243
    Step 5: Communicating the CSIRT’s Vision and Operational Plan . . . . . . . . . . . . 249
    Step 6: Beginning CSIRT Implementation . . . . . . . . . . . . 249
    Step 7: Announce the operational CSIRT . . . . . . . . . . . . 250
    Step 8: Evaluating CSIRT Effectiveness . . . . . . . . . . . . 250
    Final Thoughts on CSIRT Development . . . . . . . . . . . . 252
Outsourcing Incident Response . . . . . . . . . . . . 252
    Current and Future Quality of Work . . . . . . . . . . . . 252
    Division of Responsibilities . . . . . . . . . . . . 253
    Sensitive Information Revealed to the Contractor . . . . . . . . . . . . 253
    Lack of Organization-Specific Knowledge . . . . . . . . . . . . 253
    Lack of Correlation . . . . . . . . . . . . 254
    Handling Incidents at Multiple Locations . . . . . . . . . . . . 254
    Maintaining IR Skills In-House . . . . . . . . . . . . 254
Chapter Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 254
Review Questions. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 256
Real-World Exercises . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 257
Hands-On Projects . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 257
Closing Case Scenario: Proud to Participate in Planning . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 264
Endnotes . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 264
CHAPTER 7 Incident Response: Response Strategies . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 267
Opening Case Scenario: Viral Vandal. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 268
Introduction. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 269
IR Response Strategies . . . . . . . . . . . . 269
    Response Preparation . . . . . . . . . . . . 270
    Incident Containment . . . . . . . . . . . . 270
The Cuckoo’s Egg . . . . . . . . . . . . 273
    Incident Eradication . . . . . . . . . . . . 274
    Incident Recovery . . . . . . . . . . . . 274
Incident Containment and Eradication Strategies for Specific Attacks . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 275
Egghead . . . . . . . . . . . . 276
    Handling Denial of Service (DoS) Incidents . . . . . . . . . . . . 278
    Malware . . . . . . . . . . . . 282
    Unauthorized Access . . . . . . . . . . . . 287
    Inappropriate Use . . . . . . . . . . . . 295
    Hybrid or Multicomponent Incidents . . . . . . . . . . . . 299
Automated IR Response Systems . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 301
Chapter Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 301
Review Questions. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 303
Real-World Exercises . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 304
Hands-On Projects . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 304
Closing Case Scenario: Worrisome Worms . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 310
Endnotes . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 310
CHAPTER 8 Incident Response: Recovery and Maintenance . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 313
Opening Case Scenario: Wily Worms Wake Workers . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 314
Introduction. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 314
Recovery . . . . . . . . . . . . 315
    Identify and Resolve Vulnerabilities . . . . . . . . . . . . 315
    Restore Data . . . . . . . . . . . . 316
    Restore Services and Processes . . . . . . . . . . . . 316
    Restore Confidence across the Organization . . . . . . . . . . . . 317
Maintenance . . . . . . . . . . . . 317
    After-Action Review . . . . . . . . . . . . 317
    Plan Review and Maintenance . . . . . . . . . . . . 318
    Training . . . . . . . . . . . . 319
    Rehearsal . . . . . . . . . . . . 319
    Law Enforcement Involvement . . . . . . . . . . . . 319
    Reporting to Upper Management . . . . . . . . . . . . 321
    Loss Analysis . . . . . . . . . . . . 321
Sample Impact Analysis . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 322
Incident Forensics . . . . . . . . . . . . 322
    Legal Issues in Digital Forensics . . . . . . . . . . . . 323
    Digital Forensics Team . . . . . . . . . . . . 324
Technical Details . . . . . . . . . . . . 325
    Digital Forensics Methodology . . . . . . . . . . . . 335
eDiscovery and Anti-Forensics . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 355
Chapter Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 356
Review Questions. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 358
Real-World Exercises . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 359
Hands-On Projects . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 359
Closing Case Scenario: Bureaucratic Blamestorms . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 365
Endnotes . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 365
CHAPTER 9 Disaster Recovery: Preparation and Implementation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 369
Opening Case Scenario: Flames Force Fan Fury . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 370
Introduction. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 370
Disaster Classifications . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 371
Forming the Disaster Recovery Team . . . . . . . . . . . . 373
    Organization of the DR Team . . . . . . . . . . . . 373
    Special Documentation and Equipment . . . . . . . . . . . . 376
Disaster Recovery Planning Functions . . . . . . . . . . . . 377
    Develop the DR Planning Policy Statement . . . . . . . . . . . . 378
    Review the Business Impact Analysis . . . . . . . . . . . . 382
    Identify Preventive Controls . . . . . . . . . . . . 382
    Develop Recovery Strategies . . . . . . . . . . . . 382
    Develop the DR Plan Document . . . . . . . . . . . . 383
    Plan Testing, Training, and Exercises . . . . . . . . . . . . 386
    Plan Maintenance . . . . . . . . . . . . 387
Information Technology Contingency Planning Considerations . . . . . . . . . . . . 387
    Client/Server Systems . . . . . . . . . . . . 388
    Data Communications Systems . . . . . . . . . . . . 389
    Mainframe Systems . . . . . . . . . . . . 390
    Summary . . . . . . . . . . . . 390
Sample Disaster Recovery Plans . . . . . . . . . . . . 391
    The Business Resumption Plan . . . . . . . . . . . . 393
The DR Plan . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 393
Chapter Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 394
Review Questions. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 395
Real-World Exercises . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 396
Hands-On Projects . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 396
Closing Case Scenario: Proactively Pondering Potential Problems . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 407
Endnotes . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 407
CHAPTER 10 Disaster Recovery: Operation and Maintenance . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 409
Opening Case Scenario: Dastardly Disaster Drives Dialing. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 410
Introduction. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 411
Facing Key Challenges . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 411
Preparation: Training the DR Team and the Users . . . . . . . . . . . . 412
    Plan Distribution . . . . . . . . . . . . 413
    Plan Triggers and Notification . . . . . . . . . . . . 414
    Disaster Recovery Planning as Preparation . . . . . . . . . . . . 414
    DR Training and Awareness . . . . . . . . . . . . 417
    DR Plan Testing and Rehearsal . . . . . . . . . . . . 421
    Rehearsal and Testing of the Alert Roster . . . . . . . . . . . . 422
Disaster Response Phase . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 423
Recovery Phase . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 424
Resumption Phase . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 424
Restoration Phase . . . . . . . . . . . . 425
    Repair or Replacement . . . . . . . . . . . . 425
    Restoration of the Primary Site . . . . . . . . . . . . 426
    Relocation from Temporary Offices . . . . . . . . . . . . 426
    Resumption at the Primary Site . . . . . . . . . . . . 427
    Standing Down and the After-Action Review . . . . . . . . . . . . 427
Chapter Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 428
Review Questions. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 429
Real-World Exercises . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 430
Hands-On Projects . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 430
Closing Case Scenario: Smart Susan Starts Studying . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 436
Endnotes . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 436
CHAPTER 11 Business Continuity Planning . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 437
Opening Case Scenario: Lovely Local Location . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 438
Introduction. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 439
Business Continuity Team . . . . . . . . . . . . 440
    BC Team Organization . . . . . . . . . . . . 441
    Special Documentation and Equipment . . . . . . . . . . . . 442
Business Continuity Policy and Plan Functions . . . . . . . . . . . . 443
    Develop the BC Planning Policy Statement . . . . . . . . . . . . 444
    Review the BIA . . . . . . . . . . . . 448
    Identify Preventive Controls . . . . . . . . . . . . 448
    Create BC Contingency (Relocation) Strategies . . . . . . . . . . . . 448
    Develop the BC Plan . . . . . . . . . . . . 449
    Ensure BC Plan Testing, Training, and Exercises . . . . . . . . . . . . 453
    Ensure BC Plan Maintenance . . . . . . . . . . . . 453
    Sample Business Continuity Plans . . . . . . . . . . . . 453
Implementing the BC Plan . . . . . . . . . . . . 453
    Preparation for BC Actions . . . . . . . . . . . . 454
    Returning to a Primary Site . . . . . . . . . . . . 457
    BC After-Action Review . . . . . . . . . . . . 459
Continuous Improvement of the BC Process . . . . . . . . . . . . 459
    Improving the BC Plan . . . . . . . . . . . . 459
    Improving the BC Staff . . . . . . . . . . . . 463
Maintaining the BC Plan . . . . . . . . . . . . 465
    Periodic BC Review . . . . . . . . . . . . 465
    BC Plan Archivist . . . . . . . . . . . . 466
Chapter Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 466
Review Questions. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 467
Real-World Exercises . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 468
Hands-On Projects . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 469
Closing Case Scenario: Exciting Emergency Environment. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 475
Endnotes . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 475
CHAPTER 12 Crisis Management and International Standards in IR/DR/BC . . . . . . . . . . . . . . . . . . . . . . . . . . . 477
Opening Case Scenario: Terrible Tragedy Today . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 478
Introduction. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 478
Crisis Management in the Organization . . . . . . . . . . . . 479
    Crisis Terms and Definitions . . . . . . . . . . . . 479
    Crisis Misconceptions . . . . . . . . . . . . 481
Preparing for Crisis Management . . . . . . . . . . . . 482
    General Preparation Guidelines . . . . . . . . . . . . 482
    Organizing the Crisis Management Team . . . . . . . . . . . . 483
    Crisis Management Critical Success Factors . . . . . . . . . . . . 485
    Developing the Crisis Management Plan . . . . . . . . . . . . 487
    Crisis Management Training and Testing . . . . . . . . . . . . 490
Ongoing Case: Alert Roster Test at HAL . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 491
Post-crisis Trauma . . . . . . . . . . . . 494
    Posttraumatic Stress Disorder . . . . . . . . . . . . 494
    Employee Assistance Programs . . . . . . . . . . . . 495
    Immediately after the Crisis . . . . . . . . . . . . 495
Getting People Back to Work . . . . . . . . . . . . 496
    Dealing with Loss . . . . . . . . . . . . 496
Law Enforcement Involvement . . . . . . . . . . . . 497
    Federal Agencies . . . . . . . . . . . . 498
    Local Agencies . . . . . . . . . . . . 502
Managing Crisis Communications . . . . . . . . . . . . 502
    Crisis Communications . . . . . . . . . . . . 502
The 11 Steps of Crisis Communications . . . . . . . . . . . . 503
    Avoiding Unnecessary Blame . . . . . . . . . . . . 508
Succession Planning . . . . . . . . . . . . 509
    Elements of Succession Planning . . . . . . . . . . . . 510
    Succession Planning Approaches for Crisis Management . . . . . . . . . . . . 512
International Standards in IR/DR/BC . . . . . . . . . . . . 513
    NIST Standards and Publications in IR/DR/BC . . . . . . . . . . . . 513
    ISO Standards and Publications in IR/DR/BC . . . . . . . . . . . . 513
    Other Standards and Publications in IR/DR/BC . . . . . . . . . . . . 515
Chapter Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 517
Review Questions. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 519
Real-World Exercises . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 519
Hands-On Projects . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 520
Closing Case Scenario: Boorish Board Behavior . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 525
Endnotes . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 525
APPENDIX A Sample Business Continuity Plan for ABC Co. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 529
APPENDIX B Contingency Plan Template from the Computer Security Resource Center at the National Institute of Standards and Technology . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 537
APPENDIX C Sample Crisis Management Plan for Hierarchical Access, Ltd. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 565
GLOSSARY . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 577
INDEX . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 583
Preface
As global networks expand the interconnection of the world's technically complex infrastructure, communication and computing systems gain added importance. Information security has grown in importance as a professional practice and has emerged as an academic discipline. Recent events, such as malware attacks and successful hacking efforts, have pointed out the weaknesses inherent in unprotected systems and exposed the need for heightened security of these systems. In order to secure technologically advanced systems and networks, both education and the infrastructure to deliver that education are needed to prepare the next generation of information technology and information security professionals to develop a more secure and ethical computing environment. Therefore, improved tools and more sophisticated techniques are needed to prepare students to recognize the threats and vulnerabilities present in existing systems and to design and develop the secure systems needed in the near future. Many years have passed since the need for improved information security education was first recognized, and as Dr. Ernest McDuffie of NIST points out:
While there is no doubt that technology has changed the way we live, work, and play, there are very real threats associated with the increased use of technology and our growing dependence on cyberspace….
Education can prepare the general public to identify and avoid risks in cyberspace; education will ready the cybersecurity workforce of tomorrow; and
education can keep today’s cybersecurity professionals at the leading edge of the latest technology and mitigation strategies.
Source: NIST
The need for improvements in information security education is so great that the U.S. National Security Agency (NSA) has established Centers of Academic Excellence in Information Assurance, as described in Presidential Decision Directive 63, “The Policy on Critical Infrastructure Protection,” May 1998:
The program goal is to reduce vulnerabilities in our National Information Infrastructure by promoting higher education in information assurance, and producing a growing number of professionals with IA expertise in various disciplines.
Source: National Security Agency
The technical nature of the dominant texts on the market does not meet the needs of students who have a major other than computer science, computer engineering, or electronic engineering. This is a key concern for academics who wish to focus on delivering skilled undergraduates to the commercial information technology (IT) sector. Specifically, there is a clear need for information security, information systems, criminal justice, political science, and accounting information systems students to gain a clear understanding of the foundations of information security.
Approach This book provides an overview of contingency operations and their components as well as a thorough treatment of the administration of the planning process for incident response, disaster recovery, and business continuity. It can be used to support course delivery for information-security-driven programs targeted at information technology students, as well as for IT management and technology management curricula aimed at business or technical management students.
Learning Support—Each chapter includes a Chapter Summary and a set of open-ended Review Questions. These are used to reinforce learning of the subject matter presented in the chapter.
Chapter Scenarios—Each chapter opens and closes with a case scenario that follows the same fictional company as it encounters various contingency planning or operational issues. The closing scenario also includes a few discussion questions. These questions give the student and the instructor an opportunity to discuss the issues that underlie the content.
Hands-On Learning—At the end of each chapter, Real-World Exercises and Hands-On Projects are provided. These give students the opportunity to examine the contingency planning arena outside the classroom. Using these exercises, students can pursue the learning objectives listed at the beginning of each chapter and deepen their understanding of the text material.
Boxed Examples—These supplemental sections, which feature examples not associated with the ongoing case study, are included to illustrate key learning objectives or extend the coverage of plans and policies.
New to This Edition This edition provides a greater level of detail than the previous edition, specifically in the examination of incident response activities. It incorporates new approaches and methods that have been developed at NIST. Although the material on disaster recovery, business continuity, and crisis management has not
been reduced, the text’s focus now follows that of the IT industry in shifting to the prevention of, detection of, reaction to, and recovery from computer-based incidents, along with the avoidance of threats to the security of information. We are fortunate to have had the assistance of a reviewer who worked as a contributing author for NIST, ensuring alignment between this text and the methods recommended by NIST.
Author Team Long-time college professors and information security professionals Michael Whitman and Herbert Mattord have jointly developed this text to merge knowledge from the world of academic study with practical experience from the business world. Professor Andrew Green has been added to this proven team to add a new dimension of practical experience.
Michael Whitman, Ph.D., CISM, CISSP Michael Whitman is a professor of information security and assurance in the Information Systems Department, Michael J. Coles College of Business at Kennesaw State University, Kennesaw, Georgia, where he is the director of the KSU Center for Information Security Education (infosec.kennesaw.edu). Dr. Whitman has over 20 years of experience in higher education, with over 12 years of experience in designing and teaching information security courses. He is an active researcher in information security, fair and responsible use policies, and computer-use ethics. He currently teaches graduate and undergraduate courses in information security. He has published articles in the top journals in his field, including Information Systems Research, Communications of the ACM, Information and Management, Journal of International Business Studies, and Journal of Computer Information Systems. He is a member of the Association for Computing Machinery and the Association for Information Systems. Under Dr. Whitman’s leadership, Kennesaw State University has been recognized by the National Security Agency and the Department of Homeland Security as a National Center of Academic Excellence in Information Assurance Education three times; the university’s coursework has been reviewed by national-level information assurance subject matter experts and determined to meet the national training standard for information systems security professionals. Dr. Whitman is also the coauthor of Principles of Information Security, 4th edition; Management of Information Security, 4th edition; Readings and Cases in the Management of Information Security; Readings and Cases in Information Security: Law and Ethics; The Hands-On Information Security Lab Manual, 3rd edition; Roadmap to the Management of Information Security for IT and Information Security Professionals; Guide to Firewalls and VPNs, 3rd edition; Guide to Firewalls and Network Security, 2nd edition; and Guide to Network Security, all published by Course Technology. In 2012, Dr. Whitman was selected by the Colloquium for Information Systems Security Education as the recipient of the 2012 Information Assurance Educator of the Year award.
Herbert Mattord, Ph.D., CISM, CISSP Herbert Mattord completed 24 years of IT industry experience as an application developer, database administrator, project manager, and information security practitioner before joining the faculty of Kennesaw State University in 2002. Dr. Mattord is an assistant professor of information security and assurance and the coordinator for the Bachelor of Business Administration in Information Security and Assurance program. He is the operations manager of the KSU Center for Information Security Education and Awareness (infosec.kennesaw.edu) as well as the coordinator for the KSU certificate in Information Security and Assurance. During his career as an IT practitioner, Dr. Mattord has been an adjunct professor at Kennesaw State University; Southern Polytechnic State University in Marietta, Georgia; Austin Community College in Austin, Texas; and Texas State University–San Marcos. He currently teaches undergraduate courses in information security, data communications, local area networks, database technology, project management, systems analysis and design, and information resources management and policy. He
was formerly the manager of corporate information technology security at Georgia-Pacific Corporation, where much of the practical knowledge found in this textbook was acquired. Professor Mattord is also the coauthor of Principles of Information Security, 4th edition; Management of Information Security, 4th edition; Readings and Cases in the Management of Information Security; Readings and Cases in Information Security: Law and Ethics; The Hands-On Information Security Lab Manual, 3rd edition; Roadmap to the Management of Information Security for IT and Information Security Professionals; Guide to Firewalls and VPNs, 3rd edition; Guide to Firewalls and Network Security, 2nd edition; and Guide to Network Security, all published by Course Technology.
Andrew Green, MSIS Andrew Green is a lecturer of information security and assurance in the Information Systems Department, Michael J. Coles College of Business at Kennesaw State University, Kennesaw, Georgia. Mr. Green has over a decade of experience in information security. Prior to entering academia full time, he worked as an information security consultant, focusing primarily on the needs of small and medium-sized businesses. Before that, he worked in the healthcare IT field, where he developed and supported transcription interfaces for medical facilities throughout the United States. Mr. Green is also a full-time Ph.D. student at Nova Southeastern University, where he is studying information systems with a concentration in information security. He is the coauthor of Guide to Firewalls and VPNs, 3rd edition, and Guide to Network Security, both published by Course Technology.
Structure The textbook is organized into 12 chapters and 3 appendices. Here are summaries of each chapter’s contents:
Chapter 1. An Overview of Information Security and Risk Management This chapter defines the concepts of information security and risk management and explains how they are integral to the management processes used for incident response and contingency planning.
Chapter 2. Planning for Organizational Readiness The focus of this chapter is on how an organization can plan for and develop the organizational processes and staffing appointments needed for successful incident response and contingency plans.
Chapter 3. Contingency Strategies for IR/DR/BC This chapter explores the relationships between contingency planning and the subordinate elements of incident response, business resumption, disaster recovery, and business continuity planning. It also explains the techniques used for data and application backup and recovery.
Chapter 4. Incident Response: Planning This chapter expands on the incident response planning process to include processes and activities that are needed as well as the skills and techniques used to develop such plans.
Chapter 5. Incident Response: Detection and Decision Making This chapter describes how incidents are detected and how decision making regarding incident escalation and plan activation occurs.
Chapter 6. Incident Response: Organizing and Preparing the CSIRT This chapter presents the details of the actions that the CSIRT performs and how they are designed and developed.
Chapter 7. Incident Response: Response Strategies This chapter describes IR reaction strategies and how they are applied to incidents.
Chapter 8. Incident Response: Recovery and Maintenance This chapter describes how an organization plans for and executes the recovery process when an incident occurs; it also expands on the steps involved in the ongoing maintenance of the IR plan.
Chapter 9. Disaster Recovery: Preparation and Implementation This chapter explores how organizations prepare for and recover from disasters.
Chapter 10. Disaster Recovery: Operation and Maintenance This chapter presents the challenges an organization faces when engaged in DR operations and how such challenges are met.
Chapter 11. Business Continuity Planning This chapter covers how organizations ensure continuous operations even when the primary facilities used by the organization are not available.
Chapter 12. Crisis Management and International Standards in IR/DR/BC This chapter covers the role of crisis management and recommends the elements of a plan to prepare for crisis response. The chapter also covers the key international standards that affect IR, DR, and BC.
Appendices. The three appendices present sample BC and crisis management plans and templates.
Text and Graphic Conventions Wherever appropriate, additional information and exercises have been added to this book to help you better understand what is being discussed in the chapter. Icons throughout the text alert you to additional materials. The icons used in this textbook are described here:
Notes present additional helpful material related to the subject being described.
Offline boxes offer material that expands on the chapter’s contents but that may not be central to the learning objectives of the chapter.
Technical Details boxes provide additional technical information on information security topics.
Real World Exercises are structured activities to allow students to enrich their understanding of selected topics presented in the chapter by exploring Web-based or other widely available resources.
Hands-On Projects offer students the chance to explore the technical aspects of the theories presented in the chapter.
Instructor’s Materials The following supplemental materials are available for use in a classroom setting. All the supplements available with this book are provided to the instructor on a single CD-ROM (ISBN: 9781111138066) and online at the textbook’s Web site.
Please visit login.cengage.com and log in to access instructor-specific resources.
To access additional course materials, please visit www.cengagebrain.com. At the CengageBrain.com home page, search for the ISBN of your title (from the back cover of your book) using the search box at the top of the page. This will take you to the product page, where these resources can be found.
Additional materials designed especially for you might be available for your course online. Go to www.cengage.com/coursetechnology and search for this book title periodically for more details.
Electronic Instructor’s Manual—The Instructor’s Manual that accompanies this textbook includes additional instructional material to assist in class preparation, including suggestions for classroom activities, discussion topics, and additional projects.
Solution Files—The Solution Files include answers to selected end-of-chapter materials, including the Review Questions and some of the Hands-On Projects.
ExamView—This textbook is accompanied by ExamView, a powerful testing software package that allows instructors to create and administer printed, computer (LAN-based), and Internet exams. ExamView includes hundreds of questions that correspond to the topics covered in this text, enabling students to generate detailed study guides that include page references for further review. The computer-based and Internet testing components allow students to take exams at their computers, and also save the instructor time by grading each exam automatically.
PowerPoint Presentations—This book comes with Microsoft PowerPoint slides for each chapter. These are included as a teaching aid for classroom presentation. They can also be made available to students on the network for chapter review, or they can be printed for classroom distribution. Instructors, feel free to add your own slides for additional topics you introduce to the class.
Information Security Community Site—Stay Secure with the Information Security Community Site! Connect with students, professors, and professionals from around the world, and stay on top of this ever-changing field.
● Visit www.cengage.com/community/infosec.
● Download resources such as instructional videos and labs.
● Ask authors, professors, and students the questions that are on your mind in our Discussion Forums.
● See up-to-date news, videos, and articles.
● Read author blogs.
● Listen to podcasts on the latest Information Security topics.
Acknowledgments The authors would like to thank their families for their support and understanding for the many hours dedicated to this project, hours taken in many cases from family activities. Special thanks to Karen Scarfone, coauthor of several NIST SPs. Her reviews and suggestions resulted in a more readable manuscript. Additionally, the authors would like to thank Doug Burks, primary developer of
the Security Onion project used in this textbook. Doug’s insight and suggestions for the Hands-On Projects helped make them more robust and practical for students to use.
Reviewers We are indebted to the following individuals for their respective contributions of perceptive feedback on the initial proposal, the project outline, and the individual chapters of the text:
Karen Scarfone, Scarfone Cybersecurity Gary Kessler, Embry-Riddle Aeronautical University
Special Thanks The authors wish to thank the editorial and production teams at Course Technology. Their diligent and professional efforts greatly enhanced the final product:
Michelle Ruelos Cannistraci, Senior Product Manager
Kent Williams, Developmental Editor
Nick Lombardi, Acquisitions Editor
Andrea Majot, Senior Content Project Manager
Nicole Ashton Spoto, Technical Editor
In addition, several professional and commercial organizations and individuals have aided the development of the textbook by providing information and inspiration, and the authors wish to acknowledge their contribution:
Bernstein Crisis Management
Continuity Central
Information Systems Security Association
Institute for Crisis Management
National Institute of Standards and Technology
Oracle, Inc.
Purdue University
Rothstein Associates, Inc.
SunGard
Our colleagues in the Department of Information Systems and the Michael J. Coles College of Business, Kennesaw State University
Dr. Amy Woszczynski, Interim Chair of the Department of Information Systems, Michael J. Coles College of Business, Kennesaw State University
Dr. Kathy Schwaig, Dean of the Michael J. Coles College of Business, Kennesaw State University
Our Commitment The authors are committed to serving the needs of the adopters and readers. We would be pleased and honored to receive feedback on the textbook and its supporting materials. You can contact us through Course Technology.
Chapter 1
An Overview of Information Security and Risk Management
An ounce of prevention is worth a pound of cure. —Benjamin Franklin
Upon completion of this material, you should be able to:
● Define and explain information security
● Identify and explain the basic concepts of risk management
● List and discuss the components of contingency planning
● Describe the role of information security policy in the development of contingency plans
Introduction This book is about being prepared for the unexpected, being ready for such events as incidents and disasters. We call this contingency planning, and the sad fact is that most organizations don’t incorporate it into their day-to-day business activities. Such organizations are often not well prepared to offer the proper response to a disaster or security incident. By July 2012, Internet World Stats estimated that there were over 2.4 billion people online,1 representing one third of the world’s 6.9 billion population. Each one of those online users is a potential threat to any online system. The vast majority of Internet users will not intentionally probe, monitor, attack, or attempt to access an organization’s information without authorization; however, that potential does exist. If even less than 1/10 of 1 percent of online users make the effort, the result would be almost two and a half million potential attackers.
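The scale of that exposure is simple arithmetic, and a quick calculation makes the figure concrete. The short Python sketch below merely reproduces the estimate from the preceding paragraph; the population figures are the ones cited in the text, not independent data.

online_users = 2_400_000_000   # roughly 2.4 billion people online (Internet World Stats, July 2012)
hostile_fraction = 0.001       # "less than 1/10 of 1 percent" of online users

potential_attackers = online_users * hostile_fraction
print(f"Potential attackers: {potential_attackers:,.0f}")   # prints 2,400,000 - almost two and a half million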
Opening Case Scenario: Pernicious Proxy Probing

Paul Alexander and his boss Amanda Wilson were sitting in Amanda’s office discussing the coming year’s budget when they heard a commotion in the hall. Hearing his name mentioned, Paul stuck his head out the door and saw Jonathon Jasper (“JJ” to his friends) walking quickly toward him.

“Paul!” JJ called again, relieved to see Paul waiting in Amanda’s office.

“Hi, Amanda,” JJ said, then, looking at Paul, he added, “We have a problem.” JJ was one of the systems administrators at Hierarchical Access LTD (HAL), a Georgia-based Internet service provider that serves the northwest region of metropolitan Atlanta.

Paul stepped out into the hall, closing Amanda’s door behind him. “What’s up, JJ?”

“I think we’ve got someone sniffing around the e-mail server,” JJ replied. “I just looked at the log files, and there is an unusual number of failed login attempts on accounts that normally just don’t have that many, like yours!”

Paul paused a moment. “But the e-mail server’s proxied,” he finally said to JJ, “which means it must be an internal probe.”

“Yeah, that’s why it’s a problem,” JJ replied. “We haven’t gotten this kind of thing since we installed the proxy and moved the Web and e-mail servers inside the DMZ. It’s got to be someone in-house.”

JJ looked exasperated. “And after all that time I spent conducting awareness training!”

“Don’t worry just yet,” Paul told him. “Let’s make a few calls, and then we’ll go from there. Grab your incident response book and meet me in the conference room in 10 minutes. Grab Tina in network operations on the way.”
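In the scenario, JJ noticed the probe by reviewing the mail server’s authentication logs for failed login attempts. The following Python sketch is a hypothetical illustration of that kind of log review; the log format, file path, and alert threshold are assumptions made for the example and are not drawn from the scenario or from any particular mail server product.

import re
from collections import Counter

# Assumed log line format, for illustration only, e.g.:
#   2012-07-14T03:12:55 auth failure user=palexander src=10.0.4.27
FAILURE_PATTERN = re.compile(r"auth failure user=(?P<user>\S+)")

def count_failed_logins(log_path):
    """Count failed login attempts per account in an authentication log."""
    failures = Counter()
    with open(log_path) as log_file:
        for line in log_file:
            match = FAILURE_PATTERN.search(line)
            if match:
                failures[match.group("user")] += 1
    return failures

if __name__ == "__main__":
    counts = count_failed_logins("/var/log/mail_auth.log")   # assumed path
    for account, attempts in counts.most_common():
        if attempts > 10:                                     # illustrative threshold
            print(f"{account}: {attempts} failed attempts - investigate")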
In the weeks that followed the September 11, 2001, attacks in New York, Pennsylvania, and Washington, D.C., the media reported on the disastrous losses that various organizations were suffering. Still, many organizations were able to continue conducting business. Why? The reason is that those organizations were prepared for unexpected events. The cataclysm in 2001 was not the first attack on the World Trade Center (WTC). On February 26, 1993, a car bomb exploded beneath one of the WTC towers, killing 6 people and injuring over 1,000. The attack was limited in its devastation only because the attackers weren’t able to acquire all the components for a coordinated bomb and cyanide gas attack.2
Still, this attack was a wake-up call for the hundreds of organizations that conducted business in the WTC. Many began asking the question, “What would we have done if the attack had been more successful?” As a direct result, many of the organizations occupying the WTC on September 11, 2001 had developed contingency plans. Although thousands of people lost their lives in the attack, many were able to evacuate, and many organizations were prepared to resume their businesses in the aftermath of the devastation.
A 2008 Gartner report found that two out of three organizations surveyed had to invoke their disaster recovery or business continuity plans in the two years preceding the study.3 Considering that nearly 80 percent of businesses affected by a disaster either never reopen or close within 18 months of the event, having a disaster recovery and business continuity plan is vital to sustaining operations when disasters strike.4 Given these risks, it is imperative that management teams create, implement, and test effective plans to deal with incidents and disasters. For this reason, the field of information security has been steadily growing and is taken seriously by more and more organizations, not only in the United States but throughout the world.
Before we can discuss contingency planning in detail, we must introduce some critical concepts of which contingency planning is an integral part. The first of these, which serves as the overall disciplinary umbrella, is information security. This refers to many interlinked programs and activities that work together to ensure the confidentiality, integrity, and availability of the information used by organizations. This includes steps to ensure the protection of organizational information systems, specifically during incidents and disasters. Because information security is a complex subject, which includes risk management as well as information security policy, it is important to have an overview of that broad field and an understanding of these major components. Contingency planning is an important element of information security, but before management can plan for contingencies, it should have an overall strategic plan for information security in place, including risk management processes to guide the appropriate managerial and technical controls. This chapter serves as an overview of information security, with special consideration given to risk management and the role that contingency planning plays in (1) information security in general and (2) risk management in particular.
Information Security The Committee on National Security Systems (CNSS) has defined information security as the protection of information and its critical elements, including the systems and hardware that use, store, and transmit that information. This definition is part of the CNSS model (see Figure 1-1), which serves as the conceptual framework for understanding information security. The model evolved from a similar model developed within the
computer security industry, known as the C.I.A. triangle. An industry standard for computer security since the development of the mainframe, the C.I.A. triangle illustrates the three most critical characteristics of information used within information systems: confidentiality, integrity, and availability.
Information assets have the characteristic of confidentiality when only those persons or computer systems with the rights and privileges to access them are able to do so. Information assets have integrity when they are not exposed (while being stored, processed, or transmitted) to corruption, damage, destruction, or other disruption of their authentic states; in other words, the information is whole, complete, and uncorrupted. Finally, information assets have availability when authorized users—persons or computer systems—are able to access them in the specified format without interference or obstruction. In other words, the information is there when it is needed, from where it is supposed to be, and in the format expected.
In summary, information security (InfoSec) is the protection of the confidentiality, integrity, and availability of information, whether in storage, during processing, or in transmission. Such protection is achieved through the application of policy, education and training, and technology.
Key Information Security Concepts In general, a threat is an object, person, or other entity that represents a potential risk of loss to an asset, which is the organizational resource being protected. An asset can be logical, such as a Web site, information, or data, or it can be physical, such as a person, computer system, or other tangible object. A threat can become the basis for an attack—an intentional or unintentional attempt to cause damage to or otherwise compromise the information or the systems that support it. A threat-agent is a specific and identifiable instance of a general threat, one that exploits vulnerabilities in the protections set up around the asset. NIST defines a vulnerability as “a flaw or weakness in system security procedures, design, implementation, or internal controls that could be exercised (accidentally triggered or intentionally exploited) and result in a security breach or violation of the system’s security policy.”5 Vulnerabilities that have been examined, documented, and published are referred to as well-known vulnerabilities. Some vulnerabilities are latent and thus not revealed until they are discovered and made known.
[Figure 1-1 The CNSS security model: a cube whose three axes are the critical information characteristics (confidentiality, integrity, availability), the information states (storage, processing, transmission), and the security measures (policy, education, technology). © Cengage Learning 2014]
There are two common uses of the term exploit in information security. First, threat-agents are said to exploit a system or information asset by using it illegally for their personal gain. Second, threat-agents can create an exploit, or a means to target a specific vulnerability, usually found in software, to formulate an attack. A defender tries to prevent attacks by applying a control, a safeguard, or a countermeasure; these terms, all synonymous with control, represent security mechanisms, policies, or procedures that can successfully counter attacks, reduce risk, resolve vulnerabilities, and generally improve the security within an organization.
The results of a 2012 study that collected, categorized, and ranked the identifiable threats to information security are shown in Table 1-1. The study compared its findings with a prior study conducted by one of its researchers.
The threat categories shown in Table 1-1 are explained in detail in the following sections.
Trespass Trespass is a broad category of electronic and human activities that can breach the confidentiality of information. When an unauthorized individual gains access to the information an organization is trying to protect, that act is categorized as a deliberate act of trespass. In the opening scenario of this chapter, the IT staff members at HAL were more disappointed than surprised to find someone poking around their mail server, looking for a way in. Acts of trespass can lead to unauthorized real or virtual actions that enable information gatherers to enter premises or systems they have not been authorized to enter.
Threat Category 2010 Ranking Prior Ranking
Espionage or trespass 1 4
Software attacks 2 1
Human error or failure 3 3
Theft 4 7
Compromises to intellectual property 5 9
Sabotage or vandalism 6 5
Technical software failures or errors 7 2
Technical hardware failures or errors 8 6
Forces of nature 9 8
Deviations in quality of service from service providers 10 10
Technological obsolescence 11 11
Information extortion 12 12
Table 1-1 Threats to information security6 Source: 2003 study, © Communications of the ACM, used with permission
The classic perpetrator of deliberate acts of espionage or trespass is the hacker. In this text, hackers are people who bypass legitimate controls placed on information systems in order to gain access to data or information against the intent of the owner. More specifically, a hacker is someone who uses skill, guile, or fraud to attempt to bypass the controls placed around information that belongs to someone else.
Software Attacks Deliberate software attacks occur when an individual or group designs software to attack a system. This software is referred to as malicious code, malicious software, or malware. These software components or programs are designed to damage, destroy, or deny service to the target systems. Some of the more common instances of malicious code are viruses and worms, Trojan horses, logic bombs, bots, rootkits, and back doors. Equally prominent among the recent incidences of malicious code are the denial-of-service attacks conducted by attackers on popular e-commerce sites. A denial-of-service (DoS) attack seeks to deny legitimate users access to services by either tying up a server’s available resources or causing it to shut down. A variation on the DoS attack is the distributed DoS (DDoS) attack, in which an attacker compromises a number of systems, then uses these systems (called zombies or bots) to attack an unsuspecting target.
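Because a DoS attack succeeds by exhausting a server’s capacity, one simple defensive measure is to track the request rate per source address and flag sources that exceed a threshold. The Python sketch below illustrates that idea only; it is not a production defense, and it also shows why the distributed form is harder to stop, since a DDoS spreads the same load across many sources that may each stay under the threshold.

import time
from collections import defaultdict, deque

class RequestRateMonitor:
    """Flag source addresses whose request rate exceeds a threshold (illustrative only)."""

    def __init__(self, max_requests=100, window_seconds=1.0):
        self.max_requests = max_requests
        self.window = window_seconds
        self.history = defaultdict(deque)   # source address -> timestamps of recent requests

    def record(self, source_ip, now=None):
        """Record one request; return True if the source now looks like a flood."""
        now = time.monotonic() if now is None else now
        timestamps = self.history[source_ip]
        timestamps.append(now)
        while timestamps and now - timestamps[0] > self.window:
            timestamps.popleft()             # discard requests that have aged out of the window
        return len(timestamps) > self.max_requests

monitor = RequestRateMonitor(max_requests=100, window_seconds=1.0)
if monitor.record("203.0.113.7"):
    print("Possible denial-of-service flood from 203.0.113.7")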
A potential source of confusion when it comes to threats posed by malicious code is the distinction among the method of propagation (worm versus virus), the payload (what the malware does once it is in place, such as denying service or installing a back door), and the vector of infection (how the code is transmitted from system to system, whether through social engineering or by technical means, such as an open network share). Various concepts related to the topic of malicious code are discussed in the following sections.
Viruses Computer viruses are segments of code that perform malicious actions. The code attaches itself to an existing program and takes control of that program’s access to the targeted computer. The virus-controlled target program then carries out the virus’s plan by replicating itself and inserting itself into additional targeted systems.
Opening an infected e-mail or some other seemingly trivial action can cause anything from random messages popping up on a user’s screen to the destruction of entire hard drives of data. Viruses are passed from machine to machine via physical media, e-mail, or other forms of computer data transmission. When these viruses infect a machine, they may immediately scan the local machine for e-mail applications; they may even send themselves to every user in the e-mail address book.
There are several types of viruses. One type is the macro virus, which is embedded in automatically executing macro code, common in word-processed documents, spreadsheets, and database applications. Another type, the boot virus, infects the key operating system files located in a computer’s boot sector.
Worms Named for the tapeworm in John Brunner’s novel The Shockwave Rider, worms are malicious programs that replicate themselves constantly without requiring another program to provide a safe environment for replication. Worms can continue replicating themselves until they completely fill available resources, such as memory, hard drive space, and
network bandwidth. These complex behaviors can be invoked with or without the user downloading or executing the file. Once the worm has infected a computer, it can redistribute itself to all e-mail addresses found on the infected system. Further, a worm can deposit copies of itself onto all Web servers that the infected system can reach, so that users who subsequently visit those sites become infected themselves. Worms also take advantage of open shares found on the network in which an infected system is located, placing working copies of the worm code onto the server so that users of those shares are likely to become infected.
Back Doors and Trap Doors A virus or worm can have a payload that installs a back door or trap door component in a system, which allows the attacker to access a system, at will, with special privileges. Examples of these kinds of payloads are SubSeven, Back Orifice, and Flashfake.
Polymorphism One of the biggest ongoing problems in fighting viruses and worms is the polymorphic threat. A polymorphic threat is one that changes its apparent shape over time, making it undetectable by techniques that look for preconfigured signatures. These viruses and worms actually evolve, changing their size and appearance to elude detection by antivirus software programs. This means that an e-mail generated by the virus may not match previous examples, making detection more of a challenge.
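Signature-based detection is essentially a lookup: compute a fingerprint of a file and compare it against a database of known-bad fingerprints. The Python sketch below uses a plain SHA-256 hash as a stand-in for a signature; real antivirus engines use much richer patterns, but the limitation it demonstrates is the same one polymorphic malware exploits, because any change to the file’s bytes produces a fingerprint that no longer matches.

import hashlib

def sha256_of(data):
    return hashlib.sha256(data).hexdigest()

# Toy "signature database" containing the fingerprint of one captured specimen
known_sample = b"...stand-in bytes for a captured malware specimen..."
KNOWN_BAD_HASHES = {sha256_of(known_sample)}

def is_known_malware(file_bytes):
    """Exact-match signature check: defeated by any change to the file's bytes."""
    return sha256_of(file_bytes) in KNOWN_BAD_HASHES

print(is_known_malware(known_sample))              # True - matches the stored signature
print(is_known_malware(known_sample + b"\x00"))    # False - a one-byte mutation evades the check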
Propagation Vectors The way that malicious code is spread from one system to another can vary widely. One common way is through a social engineering attack—that is, getting the computer user to perform an action that enables the infection. An example of this is the Trojan horse, often simply called a Trojan. A Trojan is something that looks like a desirable program or tool but is in fact a malicious entity. Other propagation vectors do not require human interaction, leveraging open network connections, file shares, or software vulnerabilities to spread themselves.
Malware Hoaxes As frustrating as viruses and worms are, perhaps even more time and money are spent on resolving malware hoaxes. Well-meaning people can disrupt the harmony and flow of an organization when they send random e-mails warning of dangerous malware that is, in fact, fictitious. While these individuals feel they are helping out by warning their coworkers of a threat, much time and energy is wasted as everyone forwards the message to everyone they know, posts the message on social media sites, and begins updating antivirus protection software. By teaching its employees how to verify whether a malware threat is real, the organization can reduce the impact of this type of threat.
Human Error or Failure This threat category includes acts performed by an authorized user, usually without malicious intent or purpose. When people use information systems, mistakes sometimes happen as a result of inexperience, improper training, incorrect assumptions, and so forth. Unfortunately, small mistakes can produce extensive damage with catastrophic results. This is what is meant by human error. Human failure, on the other hand, is the intentional refusal or unintentional inability to comply with policies, guidelines, and procedures, with a potential loss of information. An organization may be
doing its part to protect information, but if an individual employee fails to follow established protocols, information can still be put at risk.
Theft The threat of theft—the illegal taking of another’s property—is a constant problem. Within an organization, property can be physical, electronic, or intellectual. The value of information assets suffers when they are copied and taken away without the owner’s knowledge. This threat category also includes acts of espionage, given that an attacker is often looking for information to steal. Any breach of confidentiality can be construed as an act of theft.
Attackers can use many different methods to access the information stored in an information system. Some information gathering is quite legal—for example, when doing research. Such techniques are collectively referred to as competitive intelligence. When information gathering employs techniques that cross the threshold of what is considered legal or ethical, it becomes known as industrial espionage.
Also of concern in this category is the theft or loss of mobile devices, including phones, tablets, and computers. Although the devices themselves are of value, perhaps even more valuable is the information stored within. Users who have been issued company equipment may establish (and save) VPN-connection information, passwords, access credentials, company records, customer information, and the like. This valuable information becomes a target for information thieves. In fact, it has become commonplace to find lost or stolen devices in the trash, with the hard drives or data cards (like phone SIMs) removed or the data having been copied and erased. The information is more valuable and easier to conceal than the device itself.
Users who travel or use their devices away from home should be extremely careful when leaving a device unattended at a restaurant table or in a conference room or hotel room. In fact, most globally engaged organizations now have explicit policy directives that prohibit taking these portable devices to certain countries and direct employees required to travel to take sanitized, almost disposable, devices that are not allowed contact with internal company networks or technology.
Compromises to Intellectual Property Many organizations create or support the development of intellectual property as part of their business operations. FOLDOC, an online dictionary of computing, defines intellectual property (IP) this way:
The ownership of ideas and control over the tangible or virtual representation of those ideas. Use of another person’s intellectual property may or may not involve royalty payments or permission but should always include proper credit to the source.7
Source: FOLDOC
IP includes trade secrets, copyrights, trademarks, and patents, all of which employees use to conduct day-to-day business. Once an organization has properly identified its IP, breaches of the controls placed on access to that IP constitute a threat to the security of this information.
Often, an organization purchases or leases the IP of other organizations and must therefore abide by the purchase or licensing agreement for its fair and responsible use.
Of equal concern is exfiltration, the unauthorized removal of information from an organization. Exfiltration is most commonly associated with disgruntled employees, and the need to protect intellectual property from unauthorized disclosure to third parties further illustrates the severity of this issue. Theft of organizational IP, such as trade secrets or trusted information like customer personal and financial records, is a commonplace issue. Data exfiltration is also being made tougher to combat because of the increasing popularity of “bring your own device” (or BYOD) programs, which allow employees to attach their own personal devices to the corporate network. These devices are frequently not as secure as the systems owned and maintained by the organization. If compromised by attackers prior to attaching to the corporate network, BYOD systems can easily be used as conduits to allow data to be exfiltrated. Additionally, unhappy employees can use these devices to copy data, then leave the organization with that valuable asset in their hands and no one the wiser.
Among the most common IP breaches is the unlawful use or duplication of software-based intellectual property, more commonly known as software piracy. Because most software is licensed to a particular purchaser, its use is restricted to a single user or to a designated user in an organization. If the user copies the program to another computer without securing another license or transferring the license, he or she has violated the copyright. Software licenses are strictly enforced by a number of regulatory and private organizations, and software publishers use several control mechanisms to prevent copyright infringement. In addition to the laws surrounding software piracy, two watchdog organizations investigate allegations of software abuse: the Software & Information Industry Association (SIIA), the Web site for which can be found at www.siia.net, and the Business Software Alliance (BSA), which can be found at www.bsa.org.
Sabotage or Vandalism This threat category involves the deliberate sabotage of a computer system or business or acts of vandalism to either destroy an asset or damage an organization’s image. The acts can range from petty vandalism by employees to organized sabotage by outsiders. A frequently encountered threat is the assault on an organization’s electronic profile—its Web site.
A much more sinister form of hacking is cyberterrorism. Cyberterrorists hack systems to conduct terrorist activities through network or Internet pathways. The United States and other governments are developing security measures intended to protect the critical computing and communications networks as well as the physical and power utility infrastructures.
Technical Software Failures or Errors This threat category stems from purchasing software with unknown hidden faults. Large quantities of computer code are written, published, and sold before all the significant security-related bugs are detected and resolved. Also, combinations of particular software and hardware may reveal new bugs. While most bugs are not a security threat, some may be exploitable and may result in potential loss or damage to information used by those programs. In addition to bugs, there may be untested failure conditions or purposeful subversions of the security controls built into systems. These may be oversights or intentional shortcuts left by programmers for benign or malign reasons. Collectively, shortcut access routes into programs that bypass security checks are called trap doors; they can cause serious security breaches.
Software bugs are so commonplace that entire Web sites are dedicated to documenting them—for example, Bugtraq (www.securityfocus.com) and the National Vulnerability Database (http://nvd.nist.gov). These resources provide up-to-the-minute information on the latest security vulnerabilities and a very thorough archive of past bugs.
Technical Hardware Failures or Errors Technical hardware failures or errors occur when a manufacturer distributes equipment containing a known or unknown flaw. These defects can cause the system to perform outside of expected parameters, resulting in unreliable service or lack of availability. Some errors are terminal, in that they result in the unrecoverable loss of the equipment. Some errors are intermittent, in that they only periodically manifest themselves, resulting in faults that are not easily identified. For example, equipment can sometimes stop working or can work in unexpected ways. Murphy’s Law says that if something can possibly go wrong, it will. In other words, it’s not whether something will fail but when.
Forces of Nature Forces of nature, also known as force majeure, or acts of God, pose some of the most dangerous threats imaginable because they often occur with very little warning. Fire, flood, earthquake, lightning, volcanic eruptions, even animal or insect infestation—these threats disrupt not only the lives of individuals but also the storage, transmission, and use of information.
Deviations in Quality of Service by Service Providers This threat category covers situations in which a product or service is not delivered to the organization as expected. Utility companies, service providers, and other value-added organizations form a vast web of interconnected services. An organization’s information system depends on the successful operation of such interdependent support systems, including power grids, telecom networks, parts suppliers, service vendors, and even the janitorial staff and garbage haulers. Any one of these support systems can be interrupted by storms, employee illnesses, or other unforeseen events.
An example of this threat category occurs when a construction crew damages a fiber-optic link for an ISP. The backup provider may be online and in service but may only be able to supply a fraction of the bandwidth the organization needs for full service. This degradation of service is a form of availability disruption. Internet service, communications, and power irregularities can dramatically affect the availability of information and systems.
Technological Obsolescence This threat category involves antiquated or outdated infrastructure that leads to unreliable and untrustworthy systems. Management must recognize that when technology becomes outdated, there is a risk of a loss of data integrity from attacks. Strategic planning should always include an analysis of the technology currently in use. Ideally, proper planning will prevent the risks stemming from technological obsolescence, but when obsolescence is identified, management must take immediate action. IT professionals play a large role in the identification of obsolescence.
Information Extortion The threat of information extortion is the possibility that an attacker or trusted insider will steal information from a computer system and demand compensation for its return or for an agreement to not disclose the information. Extortion
is common in credit card number theft. Unfortunately, organized crime is increasingly involved in this area.
Other Threat Listings The Computer Security Institute conducts an annual study of computer crime, the results of which are shown in Table 1-2. Malware attacks continue to cause the most financial loss (with a reported loss of over $42 million in 2009 alone), and malware continues to be the most frequently cited attack. Nearly 70 percent of respondents noted that they had experienced one or more malware attacks in the 12-month reporting period—and that doesn’t include companies that are unwilling to report attacks. The fact is, almost every company has been attacked. Whether or not that attack was successful depends on the company’s security efforts.
Type of Attack or Misuse 2010/11 2008 2006 2004 2002 2000
Malware infection (revised after 2008) 67% 50% 65% 78% 85% 85%
Being fraudulently represented as sender of phishing message 39% 31% (new category)
Laptop/mobile hardware theft/loss 34% 42% 47% 49% 55% 60%
Bots/zombies in organization 29% 20% (new category)
Insider abuse of Internet access or e-mail 25% 44% 42% 59% 78% 79%
Denial of service 17% 21% 25% 39% 40% 27%
Unauthorized access or privilege escalation by insider 13% 15% (revised category)
Password sniffing 11% 9% (new category)
System penetration by outsider 11% (revised category)
Exploit of client Web browser 10% (new category)
Other Attacks/Misuse categories with less than 10% responses not listed above include (listed in decreasing order of occurrence/reporting):
Financial fraud
Web site defacement
Exploit of wireless network
Other exploit of public-facing Web site
Theft of or unauthorized access to PII or PHI due to all other causes
Instant Messaging misuse
Theft of or unauthorized access to IP due to all other causes
Exploit of user’s social network profile
Theft of or unauthorized access to IP due to mobile device theft/loss
Theft of or unauthorized access to PII or PHI due to mobile device theft/loss
Exploit of DNS Server
Extortion or blackmail associated with threat of attack or release of stolen data
Table 1-2 Top Ten CSI/FBI survey results for types of attack or misuse (2000–2011)8 Source: CSI/FBI surveys 2000 to 2010/11 (www.gocsi.com)
Overview of Risk Management One part of information security is risk management, which is the process of identifying and controlling the risks to an organization’s information assets. All managers are expected to play a role in the risk management process, but information security managers are expected to play the largest roles. Very often, the chief information officer (CIO) will delegate much of the responsibility for risk management to the chief information security officer (CISO).
Given that contingency planning is considered part of the risk management process, it is important to fully understand how risk management works and how contingency planning fits within that process. Risk management consists of two major undertakings: risk identification and risk control. Risk identification is the process of examining, documenting, and assessing the security posture of an organization’s information technology and the risks it faces. Risk control is the process of applying controls to reduce the risks to an organization’s data and information systems. The various components of risk management and their relationships to one another are shown in Figure 1-2.
As an aspiring information security professional, you will have a key role to play in risk management. As part of an organization’s management team, you may find yourself on the team that must structure the IT and information security functions to perform a successful defense of the organization’s information assets—the information and data, hardware, software, procedures, and people. The IT community must serve the information technology needs of the broader organization and, at the same
[Figure 1-2 Components of risk management: risk identification (inventorying assets, classifying assets, and identifying threats and vulnerabilities) and risk control (selecting strategy and justifying controls). Risk assessment is the documented result of the risk identification process. © Cengage Learning 2014]
The information security team must lead the way with skill, professionalism, and flexibility as it works with the other communities of interest to appropriately balance the usefulness and security of the information system.
Looked at another way, risk management is the process of identifying vulnerabilities in an organization’s information systems and taking carefully reasoned steps to ensure the confidentiality, integrity, and availability of all the components of the organization’s information system. Each of the three elements in the C.I.A. triangle is an essential part of an organization’s ability to sustain long-term competitiveness. When the organization depends on IT-based systems to remain viable, information security and the discipline of risk management move beyond theoretical discussions and become an integral part of the economic basis for making business decisions. These decisions are based on trade-offs between the costs of applying information systems controls and the benefits realized from the operation of secured, available systems.
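One way to make that trade-off concrete is a simple annualized loss calculation. The chapter does not prescribe these formulas, so treat the sketch below, including every dollar figure and rate, as an illustrative assumption rather than the authors’ method.

# Hypothetical cost-benefit sketch: all figures are invented for illustration.
# SLE = single loss expectancy, ARO = annualized rate of occurrence,
# ALE = annualized loss expectancy, ACS = annualized cost of the safeguard.
sle = 50_000        # estimated loss per incident, in dollars (assumed)
aro_before = 0.50   # expected incidents per year without the control (assumed)
aro_after = 0.10    # expected incidents per year with the control (assumed)
acs = 12_000        # annual cost of the proposed control (assumed)

ale_before = sle * aro_before               # 25,000
ale_after = sle * aro_after                 # 5,000
net_benefit = ale_before - ale_after - acs  # 8,000

print(f"Control economically justified: {net_benefit > 0} (net {net_benefit:,.0f}/yr)")

A positive net benefit suggests the control pays for itself; a negative one argues for a cheaper control or for accepting the risk.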
An observation made over 2400 years ago by Chinese General Sun Tzu is relevant to information security today:
If you know the enemy and know yourself, you need not fear the result of a hundred battles. If you know yourself but not the enemy, for every victory gained you will also suffer a defeat. If you know neither the enemy nor yourself, you will succumb in every battle.9
Source: Oxford University Press
Consider for a moment the similarities between information security and warfare. Information security managers and technicians are the defenders of information. The many threats mentioned earlier are constantly attacking the defenses surrounding information assets. Defenses are built in layers, by placing safeguard upon safeguard. You attempt to detect, prevent, and recover from attack after attack after attack. Moreover, organizations are legally prevented from switching to offense, and the attackers themselves have no need to expend their resources on defense. To be victorious, you must therefore know yourself and know the enemy.
Know Yourself
First, you must identify, examine, and understand the information and systems currently in place within your organization. To protect assets, which are defined here as information and the systems that use, store, and transmit information, you must understand what they are, how they add value to the organization, and to which vulnerabilities they are susceptible. Once you know what you have, you can identify what you are already doing to protect it. Just because you have a control in place to protect an asset does not necessarily mean that the asset is protected. Frequently, organizations implement control mechanisms but then neglect to periodically perform the necessary review, revision, and maintenance of their own systems. The policies, education and training programs, and technologies that protect information must be carefully maintained and administered to ensure that they are still effective.
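As a small illustration of that maintenance point, the hypothetical Python sketch below flags controls whose periodic review is overdue; the control names, dates, and six-month review period are all assumed for the example.

# Hypothetical sketch: flag controls whose periodic review is overdue,
# echoing the point that a control in place is not the same as a control
# that is still effective. Dates and review periods are invented.
from datetime import date, timedelta

controls = [
    {"name": "firewall rule set", "last_reviewed": date(2012, 1, 15)},
    {"name": "security awareness training", "last_reviewed": date(2012, 11, 1)},
]
review_period = timedelta(days=180)  # assumed policy: review every six months

today = date(2013, 3, 1)             # fixed date so the example is reproducible
for c in controls:
    overdue = today - c["last_reviewed"] > review_period
    print(c["name"], "-> review overdue" if overdue else "-> current")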
Know the Enemy
Once you are informed of your organization’s assets and weaknesses, you can move on to the other part of Sun Tzu’s advice: know the enemy. This means identifying, examining, and understanding the threats facing the organization. You must determine which threat aspects most directly affect the organization and the security of the organization’s information assets. You can then use your understanding of these aspects to create a list of threats prioritized by how important each threatened asset is to the organization.
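A prioritized threat list can be as simple as weighting each threat by the value of the asset it endangers and a rough likelihood of occurrence. The sketch below is only an illustration; the scoring scheme and all numbers are assumptions, not a method prescribed in this chapter.

# Hypothetical sketch of building a prioritized threat list: each threat is
# weighted by the value of the asset it endangers and a rough likelihood.
# The weights and scale are assumptions, not the authors' methodology.
threats = [
    {"threat": "malware infection", "asset_value": 90, "likelihood": 0.67},
    {"threat": "laptop theft/loss", "asset_value": 60, "likelihood": 0.34},
    {"threat": "denial of service", "asset_value": 80, "likelihood": 0.17},
]

for t in threats:
    t["priority"] = t["asset_value"] * t["likelihood"]

for t in sorted(threats, key=lambda t: t["priority"], reverse=True):
    print(f'{t["threat"]:<20} priority score {t["priority"]:.1f}')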