
ETHICS AND TECHNOLOGY
Controversies, Questions, and Strategies for Ethical Computing

HERMAN T. TAVANI
Rivier University


VP & Executive Publisher: Donald Fowley
Executive Editor: Beth Lang Golub
Editorial Assistant: Katherine Willis
Marketing Manager: Chris Ruel
Marketing Assistant: Marissa Carroll
Associate Production Manager: Joyce Poh
Production Editor: Jolene Ling
Designer: Kenji Ngieng
Cover Photo Credit: Bernhard Lang/Getty Images, Inc.
Production Management Services: Thomson Digital

This book was set in 10/12 TimesTenLTStd-Roman by Thomson Digital, and printed and bound by Edwards Brothers Malloy. The cover was printed by Edwards Brothers Malloy.

This book is printed on acid-free paper.

Founded in 1807, John Wiley & Sons, Inc. has been a valued source of knowledge and understanding for more than 200 years, helping people around the world meet their needs and fulfill their aspirations. Our company is built on a foundation of principles that include responsibility to the communities we serve and where we live and work. In 2008, we launched a Corporate Citizenship Initiative, a global effort to address the environmental, social, economic, and ethical challenges we face in our business. Among the issues we are addressing are carbon impact, paper specifications and procurement, ethical conduct within our business and among our vendors, and community and charitable support. For more information, please visit our website: www.wiley.com/go/citizenship.

Copyright © 2013, 2011, 2007, 2004 John Wiley & Sons, Inc. All rights reserved. No part of this publication may be reproduced, stored in a retrieval system or transmitted in any form or by any means, electronic, mechanical, photocopying, recording, scanning or otherwise, except as permitted under Sections 107 or 108 of the 1976 United States Copyright Act, without either the prior written permission of the Publisher, or authorization through payment of the appropriate per-copy fee to the Copyright Clearance Center, Inc., 222 Rosewood Drive, Danvers, MA 01923, website www.copyright.com. Requests to the Publisher for permission should be addressed to the Permissions Department, John Wiley & Sons, Inc., 111 River Street, Hoboken, NJ 07030-5774, (201)748-6011, fax (201)748-6008, website http://www.wiley.com/go/permissions.

Evaluation copies are provided to qualified academics and professionals for review purposes only, for use in their courses during the next academic year. These copies are licensed and may not be sold or transferred to a third party. Upon completion of the review period, please return the evaluation copy to Wiley. Return instructions and a free of charge return mailing label are available at www.wiley.com/go/returnlabel. If you have chosen to adopt this textbook for use in your course, please accept this book as your complimentary desk copy. Outside of the United States, please contact your local sales representative.

Library of Congress Cataloging-in-Publication Data

Tavani, Herman T.
  Ethics and technology : controversies, questions, and strategies for ethical computing / Herman T. Tavani, Rivier University.—Fourth edition.
  pages cm
  Includes bibliographical references and index.
  ISBN 978-1-118-28172-7 (pbk.)
  1. Computer networks—Moral and ethical aspects. I. Title.
  TK5105.5.T385 2013
  175—dc23
  2012028589

Printed in the United States of America

10 9 8 7 6 5 4 3 2 1


In memory of my grandparents, Leon and Marian (Roberts) Hutton,

and Antonio and Clelia (Giamberardino) Tavani


CONTENTS AT A GLANCE

PREFACE xvii

ACKNOWLEDGMENTS xxvii

FOREWORD xxix

CHAPTER 1. INTRODUCTION TO CYBERETHICS: CONCEPTS, PERSPECTIVES, AND METHODOLOGICAL FRAMEWORKS 1

CHAPTER 2. ETHICAL CONCEPTS AND ETHICAL THEORIES: ESTABLISHING AND JUSTIFYING A MORAL SYSTEM 33

CHAPTER 3. CRITICAL REASONING SKILLS FOR EVALUATING DISPUTES IN CYBERETHICS 74

CHAPTER 4. PROFESSIONAL ETHICS, CODES OF CONDUCT, AND MORAL RESPONSIBILITY 101

CHAPTER 5. PRIVACY AND CYBERSPACE 131

CHAPTER 6. SECURITY IN CYBERSPACE 174

CHAPTER 7. CYBERCRIME AND CYBER-RELATED CRIMES 201

CHAPTER 8. INTELLECTUAL PROPERTY DISPUTES IN CYBERSPACE 230

CHAPTER 9. REGULATING COMMERCE AND SPEECH IN CYBERSPACE 269

CHAPTER 10. THE DIGITAL DIVIDE, DEMOCRACY, AND WORK 303

CHAPTER 11. ONLINE COMMUNITIES, CYBER IDENTITIES, AND SOCIAL NETWORKS 337

CHAPTER 12. ETHICAL ASPECTS OF EMERGING AND CONVERGING TECHNOLOGIES 368

GLOSSARY 411

INDEX 417


TABLE OF CONTENTS

PREFACE xvii
  New to the Fourth Edition xviii
  Audience and Scope xix
  Organization and Structure of the Book xxi
  The Web Site for Ethics and Technology xxiii
  A Note to Students xxiv
  Note to Instructors: A Roadmap for Using This Book xxiv
  A Note to Computer Science Instructors xxv

ACKNOWLEDGMENTS xxvii
FOREWORD xxix

CHAPTER 1
INTRODUCTION TO CYBERETHICS: CONCEPTS, PERSPECTIVES, AND METHODOLOGICAL FRAMEWORKS 1

Scenario 1–1: A Fatal Cyberbullying Incident on MySpace 1
Scenario 1–2: Contesting the Ownership of a Twitter Account 2
Scenario 1–3: “The Washingtonienne” Blogger 2
1.1 Defining Key Terms: Cyberethics and Cybertechnology 3
  1.1.1 What Is Cybertechnology? 4
  1.1.2 Why the Term Cyberethics? 5
1.2 The Cyberethics Evolution: Four Developmental Phases in Cybertechnology 6
1.3 Are Cyberethics Issues Unique Ethical Issues? 9
Scenario 1–4: Developing the Code for a Computerized Weapon System 10
Scenario 1–5: Digital Piracy 11
  1.3.1 Distinguishing between Unique Technological Features and Unique Ethical Issues 11
  1.3.2 An Alternative Strategy for Analyzing the Debate about the Uniqueness of Cyberethics Issues 12
  1.3.3 A Policy Vacuum in Duplicating Computer Software 13
1.4 Cyberethics as a Branch of Applied Ethics: Three Distinct Perspectives 14
  1.4.1 Perspective #1: Cyberethics as a Field of Professional Ethics 15
  1.4.2 Perspective #2: Cyberethics as a Field of Philosophical Ethics 18
  1.4.3 Perspective #3: Cyberethics as a Field of Sociological/Descriptive Ethics 21
Scenario 1–6: The Impact of Technology X on the Pleasantville Community 21
1.5 A Comprehensive Cyberethics Methodology 24
  1.5.1 A “Disclosive” Method for Cyberethics 25
  1.5.2 An Interdisciplinary and Multilevel Method for Analyzing Cyberethics Issues 26
1.6 A Comprehensive Strategy for Approaching Cyberethics Issues 27
1.7 Chapter Summary 28
Review Questions 28
Discussion Questions 29
Essay/Presentation Questions 29
Scenarios for Analysis 29
Endnotes 30
References 31
Further Readings 32
Online Resources 32

CHAPTER 2
ETHICAL CONCEPTS AND ETHICAL THEORIES: ESTABLISHING AND JUSTIFYING A MORAL SYSTEM 33

2.1 Ethics and Morality 33
Scenario 2–1: The “Runaway Trolley”: A Classic Moral Dilemma 34
  2.1.1 What Is Morality? 35
  2.1.2 Deriving and Justifying the Rules and Principles of a Moral System 38
2.2 Discussion Stoppers as Roadblocks to Moral Discourse 42
  2.2.1 Discussion Stopper #1: People Disagree on Solutions to Moral Issues 43
  2.2.2 Discussion Stopper #2: Who Am I to Judge Others? 45
  2.2.3 Discussion Stopper #3: Morality Is Simply a Private Matter 47
  2.2.4 Discussion Stopper #4: Morality Is Simply a Matter for Individual Cultures to Decide 48
Scenario 2–2: The Perils of Moral Relativism 49
2.3 Why Do We Need Ethical Theories? 52
2.4 Consequence-Based Ethical Theories 53
  2.4.1 Act Utilitarianism 55
Scenario 2–3: A Controversial Policy in Newmerica 55
  2.4.2 Rule Utilitarianism 55
2.5 Duty-Based Ethical Theories 56
  2.5.1 Rule Deontology 57
Scenario 2–4: Making an Exception for Oneself 58
  2.5.2 Act Deontology 59
Scenario 2–5: A Dilemma Involving Conflicting Duties 60
2.6 Contract-Based Ethical Theories 61
  2.6.1 Some Criticisms of Contract-Based Theories 62
  2.6.2 Rights-Based Contract Theories 63
2.7 Character-Based Ethical Theories 64
  2.7.1 Being a Moral Person vs. Following Moral Rules 64
  2.7.2 Acquiring the “Correct” Habits 65
2.8 Integrating Aspects of Classical Ethical Theories into a Single Comprehensive Theory 66
  2.8.1 Moor’s Just-Consequentialist Theory and Its Application to Cybertechnology 67
  2.8.2 Key Elements in Moor’s Just-Consequentialist Framework 69
2.9 Chapter Summary 70
Review Questions 70
Discussion Questions 71
Essay/Presentation Questions 71
Scenarios for Analysis 72
Endnotes 72
References 73
Further Readings 73

CHAPTER 3
CRITICAL REASONING SKILLS FOR EVALUATING DISPUTES IN CYBERETHICS 74

3.1 Getting Started 74
Scenario 3–1: Reasoning About Whether to Download a File from “Sharester” 75
  3.1.1 Defining Two Key Terms in Critical Reasoning: Claims and Arguments 75
  3.1.2 The Role of Arguments in Defending Claims 76
  3.1.3 The Basic Structure of an Argument 76
3.2 Constructing an Argument 78
3.3 Valid Arguments 80
3.4 Sound Arguments 83
3.5 Invalid Arguments 85
3.6 Inductive Arguments 86
3.7 Fallacious Arguments 87
3.8 A Seven-Step Strategy for Evaluating Arguments 89
3.9 Identifying Some Common Fallacies 91
  3.9.1 Ad Hominem Argument 92
  3.9.2 Slippery Slope Argument 92
  3.9.3 Fallacy of Appeal to Authority 93
  3.9.4 False Cause Fallacy 93
  3.9.5 Begging the Question 94
  3.9.6 Fallacy of Composition/Fallacy of Division 94
  3.9.7 Fallacy of Ambiguity/Equivocation 95
  3.9.8 Appeal to the People (Argumentum ad Populum) 95
  3.9.9 The Many/Any Fallacy 96
  3.9.10 The Virtuality Fallacy 97
3.10 Chapter Summary 98
Review Questions 98
Discussion Questions 98
Essay/Presentation Questions 99
Scenarios for Analysis 99
Endnotes 99
References 100
Further Readings 100

CHAPTER 4
PROFESSIONAL ETHICS, CODES OF CONDUCT, AND MORAL RESPONSIBILITY 101

4.1 Professional Ethics 102
  4.1.1 What Is a Profession? 103
  4.1.2 Who Is a Professional? 103
  4.1.3 Who Is a Computer/IT Professional? 104
4.2 Do Computer/IT Professionals Have Any Special Moral Responsibilities? 105
  4.2.1 Safety-Critical Software 105
4.3 Professional Codes of Ethics and Codes of Conduct 106
  4.3.1 The Purpose of Professional Codes 107
  4.3.2 Some Criticisms of Professional Codes 108
  4.3.3 Defending Professional Codes 109
  4.3.4 The IEEE-CS/ACM Software Engineering Code of Ethics and Professional Practice 110
4.4 Conflicts of Professional Responsibility: Employee Loyalty and Whistle-Blowing 112
  4.4.1 Do Employees Have an Obligation of Loyalty to Employers? 112
  4.4.2 Whistle-Blowing Issues 114
Scenario 4–1: Whistle-Blowing and the “Star Wars” Controversy 115
  4.4.3 An Alternative Strategy for Understanding Professional Responsibility 117
4.5 Moral Responsibility, Legal Liability, and Accountability 117
  4.5.1 Distinguishing Responsibility from Liability and Accountability 118
  4.5.2 Accountability and the Problem of “Many Hands” 119
Scenario 4–2: The Therac-25 Machine 120
  4.5.3 Legal Liability and Moral Accountability 120
4.6 Risk Assessment in the Software Development Process 121
Scenario 4–3: The Aegis Radar System 121
4.7 Do Some Computer Corporations Have Special Moral Obligations? 122
  4.7.1 Special Responsibilities for Search Engine Companies 123
  4.7.2 Special Responsibilities for Companies that Develop Autonomous Systems 124
4.8 Chapter Summary 125
Review Questions 126
Discussion Questions 126
Essay/Presentation Questions 126
Scenarios for Analysis 127
Endnotes 128
References 128
Further Readings 130

CHAPTER 5
PRIVACY AND CYBERSPACE 131

5.1 Are Privacy Concerns Associated with Cybertechnology Unique or Special? 132
5.2 What is Personal Privacy? 134
  5.2.1 Accessibility Privacy: Freedom from Unwarranted Intrusion 135
  5.2.2 Decisional Privacy: Freedom from Interference in One’s Personal Affairs 135
  5.2.3 Informational Privacy: Control over the Flow of Personal Information 136
  5.2.4 A Comprehensive Account of Privacy 136
Scenario 5–1: Descriptive Privacy 137
Scenario 5–2: Normative Privacy 137
  5.2.5 Privacy as “Contextual Integrity” 137
Scenario 5–3: Preserving Contextual Integrity in a University Seminar 138
5.3 Why is Privacy Important? 139
  5.3.1 Is Privacy an Intrinsic Value? 140
  5.3.2 Privacy as a Social Value 141
5.4 Gathering Personal Data: Monitoring, Recording, and Tracking Techniques 141
  5.4.1 “Dataveillance” Techniques 141
  5.4.2 Internet Cookies 142
  5.4.3 RFID Technology 143
  5.4.4 Cybertechnology and Government Surveillance 145
5.5 Exchanging Personal Data: Merging and Matching Electronic Records 146
  5.5.1 Merging Computerized Records 146
Scenario 5–4: Merging Personal Information in Unrelated Computer Databases 147
  5.5.2 Matching Computerized Records 148
Scenario 5–5: Using Biometric Technology at Super Bowl XXXV 149
5.6 Mining Personal Data 150
  5.6.1 How Does Data Mining Threaten Personal Privacy? 150
Scenario 5–6: Data Mining at the XYZ Bank 151
  5.6.2 Web Mining 154
Scenario 5–7: The Facebook Beacon Controversy 154
5.7 Protecting Personal Privacy in Public Space 156
Scenario 5–8: Shopping at SuperMart 157
Scenario 5–9: Shopping at Nile.com 157
  5.7.1 Search Engines and the Disclosure of Personal Information 158
Scenario 5–10: Tracking Your Search Requests on Google 159
  5.7.2 Accessing Online Public Records 160
Scenario 5–11: Accessing Online Public Records in Pleasantville 161
Scenario 5–12: Accessing a State’s Motor Vehicle Records Online 162
5.8 Privacy-Enhancing Technologies 162
  5.8.1 Educating Users about PETs 163
  5.8.2 PETs and the Principle of Informed Consent 163
5.9 Privacy Legislation and Industry Self-Regulation 164
  5.9.1 Industry Self-Regulation Initiatives Regarding Privacy 164
Scenario 5–13: Controversies Involving Google’s Privacy Policy 166
  5.9.2 Privacy Laws and Data Protection Principles 166
5.10 Chapter Summary 168
Review Questions 169
Discussion Questions 169
Essay/Presentation Questions 170
Scenarios for Analysis 170
Endnotes 171
References 171
Further Readings 173

CHAPTER 6
SECURITY IN CYBERSPACE 174

6.1 Security in the Context of Cybertechnology 174
  6.1.1 Cybersecurity as Related to Cybercrime 175
  6.1.2 Security and Privacy: Some Similarities and Some Differences 175
6.2 Three Categories of Cybersecurity 176
  6.2.1 Data Security: Confidentiality, Integrity, and Availability of Information 177
  6.2.2 System Security: Viruses, Worms, and Malware 178
Scenario 6–1: The Conficker Worm 178
  6.2.3 Network Security: Protecting our Infrastructure 179
Scenario 6–2: The GhostNet Controversy 179
6.3 “Cloud Computing” and Security 180
  6.3.1 Deployment and Service/Delivery Models for the Cloud 181
  6.3.2 Securing User Data Residing in the Cloud 182
6.4 Hacking and “The Hacker Ethic” 183
  6.4.1 What Is “The Hacker Ethic”? 184
  6.4.2 Are Computer Break-ins Ever Ethically Justifiable? 186
6.5 Cyberterrorism 187
  6.5.1 Cyberterrorism vs. Hacktivism 188
Scenario 6–3: Anonymous and the “Operation Payback” Attack 189
  6.5.2 Cybertechnology and Terrorist Organizations 190
6.6 Information Warfare (IW) 191
  6.6.1 Information Warfare vs. Conventional Warfare 191
Scenario 6–4: The Stuxnet Worm and the “Olympic Games” Operation 192
  6.6.2 Potential Consequences for Nations that Engage in IW 192
6.7 Cybersecurity and Risk Analysis 194
  6.7.1 The Risk Analysis Methodology 194
  6.7.2 The Problem of “De-Perimeterization” of Information Security for Analyzing Risk 195
6.8 Chapter Summary 196
Review Questions 196
Discussion Questions 197
Essay/Presentation Questions 197
Scenarios for Analysis 197
Endnotes 198
References 198
Further Readings 200

CHAPTER 7
CYBERCRIME AND CYBER-RELATED CRIMES 201

7.1 Cybercrimes and Cybercriminals 201
  7.1.1 Background Events: A Brief Sketch 202
  7.1.2 A Typical Cybercriminal 203
7.2 Hacking, Cracking, and Counterhacking 203
  7.2.1 Hacking vs. Cracking 204
  7.2.2 Active Defense Hacking: Can Acts of “Hacking Back” or Counterhacking Ever Be Morally Justified? 204
7.3 Defining Cybercrime 205
  7.3.1 Determining the Criteria 206
  7.3.2 A Preliminary Definition of Cybercrime 207
Scenario 7–1: Using a Computer to File a Fraudulent Tax Return 207
  7.3.3 Framing a Coherent and Comprehensive Definition of Cybercrime 208
7.4 Three Categories of Cybercrime: Piracy, Trespass, and Vandalism in Cyberspace 208
7.5 Cyber-Related Crimes 209
  7.5.1 Some Examples of Cyber-Exacerbated vs. Cyber-Assisted Crimes 209
  7.5.2 Identity Theft 211
7.6 Technologies and Tools for Combating Cybercrime 213
Scenario 7–2: Intercepting Mail that Enters and Leaves Your Neighborhood 213
  7.6.1 Biometric Technologies 214
  7.6.2 Keystroke-Monitoring Software and Packet-Sniffing Programs 215
7.7 Programs and Techniques Designed to Combat Cybercrime in the United States 216
  7.7.1 Entrapment and “Sting” Operations to Catch Internet Pedophiles 216
Scenario 7–3: Entrapment on the Internet 216
  7.7.2 Enhanced Government Surveillance Techniques and the Patriot Act 217
7.8 National and International Laws to Combat Cybercrime 218
  7.8.1 The Problem of Jurisdiction in Cyberspace 218
Scenario 7–4: A Virtual Casino 218
Scenario 7–5: Prosecuting a Computer Corporation in Multiple Countries 219
  7.8.2 Some International Laws and Conventions Affecting Cybercrime 220
Scenario 7–6: The Pirate Bay Web Site 221
7.9 Cybercrime and the Free Press: The WikiLeaks Controversy 221
  7.9.1 Are WikiLeaks’ Practices Ethical? 222
  7.9.2 Are WikiLeaks’ Practices Criminal? 222
  7.9.3 WikiLeaks and the Free Press 223
7.10 Chapter Summary 225
Review Questions 225
Discussion Questions 226
Essay/Presentation Questions 226
Scenarios for Analysis 226
Endnotes 227
References 228
Further Readings 229

CHAPTER 8
INTELLECTUAL PROPERTY DISPUTES IN CYBERSPACE 230

8.1 What is Intellectual Property? 230
  8.1.1 Intellectual Objects 231
  8.1.2 Why Protect Intellectual Objects? 232
  8.1.3 Software as Intellectual Property 232
  8.1.4 Evaluating an Argument for Why It is Wrong to Copy Proprietary Software 233
8.2 Copyright Law and Digital Media 235
  8.2.1 The Evolution of Copyright Law in the United States 235
  8.2.2 The Fair-Use and First-Sale Provisions of Copyright Law 236
Scenario 8–1: Making Classic Books Available Online 237
Scenario 8–2: Decrypting Security on an e-Book Reader 237
  8.2.3 Software Piracy as Copyright Infringement 238
  8.2.4 Napster and the Ongoing Battles over Sharing Digital Music 239
Scenario 8–3: The Case of MGM v. Grokster 241
8.3 Patents, Trademarks, and Trade Secrets 242
  8.3.1 Patent Protections 242
  8.3.2 Trademarks 243
  8.3.3 Trade Secrets 243
8.4 Jurisdictional Issues Involving Intellectual Property Laws 244
8.5 Philosophical Foundations for Intellectual Property Rights 245
  8.5.1 The Labor Theory of Property 245
Scenario 8–4: DEF Corporation vs. XYZ Inc. 246
  8.5.2 The Utilitarian Theory of Property 247
Scenario 8–5: Sam’s e-Book Reader Add-on Device 247
  8.5.3 The Personality Theory of Property 248
Scenario 8–6: Angela’s B++ Programming Tool 249
8.6 The Free Software and the Open Source Movements 250
  8.6.1 GNU and the Free Software Foundation 250
  8.6.2 The “Open Source Software” Movement: OSS vs. FSF 251
8.7 The “Common-Good” Approach: An Alternative Framework for Analyzing the Intellectual Property Debate 252
  8.7.1 Information Wants to be Shared vs. Information Wants to be Free 254
  8.7.2 Preserving the Information Commons 256
  8.7.3 The Fate of the Information Commons: Could the Public Domain of Ideas Eventually Disappear? 257
  8.7.4 The Creative Commons 259
8.8 PIPA, SOPA, and RWA Legislation: Current Battlegrounds in the Intellectual Property War 260
  8.8.1 The PIPA and SOPA Battles 261
  8.8.2 RWA and Public Access to Health-Related Information 261
Scenario 8–7: Elsevier Press and “The Cost of Knowledge” Boycott 262
  8.8.3 Intellectual Property Battles in the Near Future 263
8.9 Chapter Summary 264
Review Questions 264
Discussion Questions 265
Essay/Presentation Questions 265
Scenarios for Analysis 265
Endnotes 266
References 267
Further Readings 268

CHAPTER 9
REGULATING COMMERCE AND SPEECH IN CYBERSPACE 269

9.1 Background Issues and Some Preliminary Distinctions 270
  9.1.1 The Ontology of Cyberspace: Is the Internet a Medium or a Place? 270
  9.1.2 Two Categories of Cyberspace Regulation 271
9.2 Four Modes of Regulation: The Lessig Model 273
9.3 Digital Rights Management and the Privatization of Information Policy 274
  9.3.1 DRM Technology: Implications for Public Debate on Copyright Issues 274
Scenario 9–1: The Sony Rootkit Controversy 275
  9.3.2 Privatizing Information Policy: Implications for the Internet 276
9.4 The Use and Misuse of (HTML) Metatags and Web Hyperlinks 278
  9.4.1 Issues Surrounding the Use/Abuse of HTML Metatags 278
Scenario 9–2: A Deceptive Use of HTML Metatags 279
  9.4.2 Hyperlinking and Deep Linking 279
Scenario 9–3: Deep Linking on the Ticketmaster Web Site 280
9.5 E-Mail Spam 281
  9.5.1 Defining Spam 281
  9.5.2 Why Is Spam Morally Objectionable? 282
9.6 Free Speech vs. Censorship and Content Control in Cyberspace 284
  9.6.1 Protecting Free Speech 284
  9.6.2 Defining Censorship 285
9.7 Pornography in Cyberspace 286
  9.7.1 Interpreting “Community Standards” in Cyberspace 286
  9.7.2 Internet Pornography Laws and Protecting Children Online 287
  9.7.3 Virtual Child Pornography 288
Scenario 9–4: A Sexting Incident Involving Greensburg Salem High School 290
9.8 Hate Speech and Speech that can Cause Physical Harm to Others 292
  9.8.1 Hate Speech on the Web 292
  9.8.2 Online “Speech” that Can Cause Physical Harm to Others 294
9.9 “Network Neutrality” and the Future of Internet Regulation 294
  9.9.1 Defining Network Neutrality 295
  9.9.2 Some Arguments Advanced by Net Neutrality’s Proponents and Opponents 296
  9.9.3 Future Implications for the Net Neutrality Debate 296
9.10 Chapter Summary 297
Review Questions 298
Discussion Questions 298
Essay/Presentation Questions 299
Scenarios for Analysis 299
Endnotes 300
References 300
Further Readings 301

CHAPTER 10
THE DIGITAL DIVIDE, DEMOCRACY, AND WORK 303

10.1 The Digital Divide 304
  10.1.1 The Global Digital Divide 304
  10.1.2 The Digital Divide within Nations 305
Scenario 10–1: Providing In-Home Internet Service for Public School Students 306
  10.1.3 Is the Digital Divide an Ethical Issue? 307
10.2 Cybertechnology and the Disabled 309
  10.2.1 Disabled Persons and Remote Work 310
  10.2.2 Arguments for Continued WAI Support 311
10.3 Cybertechnology and Race 312
  10.3.1 Internet Usage Patterns 312
  10.3.2 Racism and the Internet 313
10.4 Cybertechnology and Gender 314
  10.4.1 Access to High-Technology Jobs 315
  10.4.2 Gender Bias in Software Design and Video Games 317
10.5 Cybertechnology, Democracy, and Democratic Ideals 317
  10.5.1 Has Cybertechnology Enhanced or Threatened Democracy? 318
  10.5.2 How has Cybertechnology Affected Political Elections in Democratic Nations? 322
10.6 The Transformation and the Quality of Work 324
  10.6.1 Job Displacement and the Transformed Workplace 324
  10.6.2 The Quality of Work Life in the Digital Era 328
Scenario 10–2: Employee Monitoring and the Case of Ontario v. Quon 329
10.7 Chapter Summary 331
Review Questions 332
Discussion Questions 332
Essay/Presentation Questions 333
Scenarios for Analysis 333
Endnotes 334
References 335
Further Readings 336

CHAPTER 11
ONLINE COMMUNITIES, CYBER IDENTITIES, AND SOCIAL NETWORKS 337

11.1 Online Communities and Social Networking Services 337
  11.1.1 Online Communities vs. Traditional Communities 337
  11.1.2 Blogs in the Context of Online Communities 339
  11.1.3 Assessing Pros and Cons of Online Communities 339
Scenario 11–1: A Virtual Rape in Cyberspace 342
11.2 Virtual Environments and Virtual Reality 343
  11.2.1 What is Virtual Reality (VR)? 344
  11.2.2 Ethical Controversies Involving Behavior in VR Applications and Games 345
  11.2.3 Misrepresentation, Bias, and Indecent Representations in VR Applications 349
11.3 Cyber Identities and Cyber Selves: Personal Identity and Our Sense of Self in the Cyber Era 351
  11.3.1 Cybertechnology as a “Medium of Self-Expression” 352
  11.3.2 “MUD Selves” and Distributed Personal Identities 352
  11.3.3 The Impact of Cybertechnology on Our Sense of Self 353
11.4 AI and its Implications for What it Means to be Human 355
  11.4.1 What is AI? A Brief Overview 355
  11.4.2 The Turing Test and John Searle’s “Chinese Room” Argument 357
  11.4.3 Cyborgs and Human-Machine Relationships 358
Scenario 11–2: Artificial Children 361
  11.4.4 Do (At Least Some) AI Entities Warrant Moral Consideration? 361
11.5 Chapter Summary 363
Review Questions 363
Discussion Questions 364
Essay/Presentation Questions 364
Scenarios for Analysis 365
Endnotes 365
References 366
Further Readings 367

CHAPTER 12
ETHICAL ASPECTS OF EMERGING AND CONVERGING TECHNOLOGIES 368

12.1 Converging Technologies and Technological Convergence 368
12.2 Ambient Intelligence (AmI) and Ubiquitous Computing 369
  12.2.1 Pervasive Computing 371
  12.2.2 Ubiquitous Communication 371
  12.2.3 Intelligent User Interfaces 371
  12.2.4 Ethical and Social Issues in AmI 372
Scenario 12–1: E. M. Forster’s Precautionary Tale 373
Scenario 12–2: Jeremy Bentham’s Panopticon 375
12.3 Bioinformatics and Computational Genomics 376
  12.3.1 Computing and Genetic “Machinery”: Some Conceptual Connections 376
  12.3.2 Ethical Issues and Controversies 376
Scenario 12–3: deCODE Genetics Inc. 377
  12.3.3 ELSI Guidelines and Genetic-Specific Legislation 380
12.4 Nanotechnology and Nanocomputing 381
  12.4.1 Nanotechnology: A Brief Overview 382
  12.4.2 Optimistic vs. Pessimistic Views of Nanotechnology 383
  12.4.3 Ethical Issues in Nanotechnology and Nanocomputing 386
12.5 Autonomous Machines and Machine Ethics 389
  12.5.1 What is an Autonomous Machine (AM)? 390
  12.5.2 Some Ethical and Philosophical Questions Involving AMs 393
  12.5.3 Machine Ethics and Moral Machines 398
12.6 A “Dynamic” Ethical Framework for Guiding Research in New and Emerging Technologies 402
  12.6.1 Is an ELSI-Like Model Adequate for New/Emerging Technologies? 402
  12.6.2 A “Dynamic Ethics” Model 403
12.7 Chapter Summary 404
Review Questions 404
Discussion Questions 405
Essay/Presentation Questions 405
Scenarios for Analysis 405
Endnotes 406
References 407
Further Readings 409

GLOSSARY 411

INDEX 417


PREFACE

As the digital landscape continues to evolve at a rapid pace, new variations of moral, legal, and social concerns arise along with it. Not surprisingly, then, an additional cluster of cyberethics issues has emerged since the publication of the previous edition of Ethics and Technology in late 2009. Consider, for example, the ways in which Cloud-based storage threatens the privacy and security of our personal data. Also consider the increasing amount of personal data that social networking sites such as Facebook and major search engine companies such as Google now collect. Should we worry about how that information can be subsequently used? Should we also worry about the filtering techniques that leading search engines now use to tailor or “personalize” the results of our search queries based on profiles derived from information about our previous search requests? Some analysts note that the information-gathering and profiling practices and techniques currently used in the commercial sector can also be adopted by governments, and they point out that these practices could not only support the surveillance initiatives of totalitarian governments but also threaten the privacy of citizens in democratic countries.

Also consider the impact that recent cyberwarfare activities, including the clandestine cyberattacks allegedly launched by some nation states, could have on our national infrastructure. Additionally, consider the national-security-related concerns raised by the WikiLeaks controversy, which has also exacerbated an ongoing tension between free speech on the Internet and standards for “responsible reporting” on the part of investigative journalists. And the recent debate about “network neutrality” causes us to revisit questions about the extent to which the service providers responsible for delivering online content should also be able to control the content that they deliver.

Other kinds of concerns now arise because of developments in a relatively new subfield of cyberethics called “machine ethics” (sometimes referred to as “robo-ethics”). For example, should we develop autonomous machines that are capable of making decisions that have moral implications? Some semiautonomous robots, which serve as companions and caregivers for the elderly and as “babysitters” for young children, are already available. Recent and continued developments in robotics and autonomous machines may provide many conveniences and services, but they can also cause us to question our conventional notions of autonomy, moral agency, and trust. For example, can/should these machines be fully autonomous? Can they qualify as (artificial) moral agents? Also, will humans be able to trust machines that they will increasingly rely on to carry out critical tasks? If we do not yet know the answers to these questions, and if no clear and explicit policies are in place to guide research in this area, should we continue to develop autonomous machines? These and related questions in the emerging field of machine ethics are but a few of the many new questions we examine in the fourth edition of Ethics and Technology.

Although new technologies emerge, and existing technologies continue to mature and evolve, many of the ethical issues associated with them are basically variations of existing ethical problems. At bottom, these issues reduce to traditional ethical concerns having to do with dignity, respect, fairness, obligations to assist others in need, and so forth. So, we should not infer that the moral landscape itself has been altered because of behaviors made possible by these technologies. We will see that, for the most part, the new issues examined in this edition of Ethics and Technology are similar in relevant respects to the kinds of ethical issues we examined in the book’s previous editions. However, many emerging technologies present us with challenges that, initially at least, do not seem to fit easily into our conventional ethical categories. So, a major objective of this textbook is to show how those controversies can be analyzed from the perspective of standard ethical concepts and theories.

The purpose of Ethics and Technology, as stated in the prefaces to the three previous editions of this book, is to introduce students to issues and controversies that comprise the relatively new field of cyberethics. The term “cyberethics” is used in this textbook to refer to the field of study that examines moral, legal, and social issues involving cybertechnology. Cybertechnology, in turn, refers to a broad spectrum of computing/information and communication technologies that range from stand-alone computers to the current cluster of networked devices and technologies. Many of these technologies include devices and applications that are connected to privately owned computer networks as well as to the Internet itself.

This textbook examines a wide range of cyberethics issues—from specific issues of moral responsibility that directly affect computer and information technology (IT) professionals to broader social and ethical concerns that affect each of us in our day-to-day lives. Questions about the roles and responsibilities of computer/IT professionals in developing safe and reliable computer systems are examined under the category of professional ethics. Broader social and ethical concerns associated with cybertechnology are examined under topics such as privacy, security, crime, intellectual property, Internet regulation, and so forth.

NEW TO THE FOURTH EDITION

New pedagogical material includes

• a newly designed set of end-of-chapter exercises called “Scenarios for Analysis,” which can be used for either in-class analysis or group projects;

• new and/or updated (in-chapter) scenarios, illustrating both actual cases and hypothetical situations, which enable students to apply methodological concepts/frameworks and ethical theories covered in Chapters 1 and 2;

• new sample arguments in some chapters, which enable students to apply the tools for argument analysis covered in Chapter 3;

• updated “review questions,” “discussion questions,” and “essay/presentation questions” at the end of chapters;


• an updated and revised glossary of key terms used in the book;

• an updated Ethics and Technology Companion Site with new resources and materials for students and instructors.

New issues examined and analyzed include

• ethical and social aspects of Cloud computing, including concerns about the privacy and security of users’ data that is increasingly being stored in “the Cloud”;

• concerns about the “personalization filters” that search engine companies use to tailor our search results to conform to their perceptions of what we want;

• questions about Google’s (2012) privacy policy vis-à-vis the amount of user data that can be collected via the search engine company’s suite of applications;

• concerns about cyberwarfare activities involving nation states and their alleged launching of the Stuxnet worm and Flame virus;

• controversies surrounding WikiLeaks and the tension it creates between free speech and responsible journalism, as well as concerns involving national security;

• concerns affecting “network neutrality” and whether regulation may be required to ensure that Internet service providers do not gain too much control over the content they deliver;

• controversies in “machine ethics,” including the development of autonomous machines capable of making decisions that have moral impacts;

• questions about whether we can trust artificial agents to act in ways that will always be in the best interests of humans.

In revising the book, I have also eliminated some older, now out-of-date material. Additionally, I have streamlined some of the material from previous editions that has been carried over into the present edition.

AUDIENCE AND SCOPE

Because cyberethics is an interdisciplinary field, this textbook aims at reaching several audiences and thus easily runs the risk of failing to meet the needs of any one audience. I have nonetheless attempted to compose a textbook that addresses the needs of computer science, philosophy, social/behavioral science, and library/information science students. Computer science students need a clear understanding of the ethical challenges they will face as computer professionals when they enter the workforce. Philosophy students, on the other hand, should understand how moral issues affecting cybertechnology can be situated in the field of applied ethics in general and then analyzed from the perspective of ethical theory. Social science and behavioral science students will likely want to assess the sociological impact of cybertechnology on our social and political institutions (government, commerce, and education) and sociodemographic groups (affecting gender, race, ethnicity, and social class). And library science and information science students should be aware of the complexities and nuances of current intellectual property laws that threaten unfettered access to electronic information, and should be informed about recent regulatory schemes that threaten to censor certain forms of electronic speech.


Students from other academic disciplines should also find many issues covered in this textbook pertinent to their personal and professional lives; some undergraduates may elect to take a course in social and ethical aspects of technology to satisfy one of their general education requirements. Although Ethics and Technology is intended mainly for undergraduate students, it could be used, in conjunction with other texts, in graduate courses as well.

We examine ethical controversies using scenarios that include both actual cases and hypothetical examples, wherever appropriate. In some instances I have deliberately constructed provocative scenarios and selected controversial cases to convey the severity of the ethical issues we consider. Some readers may be uncomfortable with, and possibly even offended by, these scenarios and cases—for example, those illustrating unethical practices that negatively affect children and minorities. Although it might have been politically expedient to skip over issues and scenarios that could unintentionally offend certain individuals, I believe that no textbook in applied ethics would do justice to its topic if it failed to expose and examine issues that adversely affect vulnerable groups in society.

Also included in most chapters are sample arguments that are intended to illustrate some of the rationales that have been put forth by various interest groups to defend policies and laws affecting privacy, security, property, and so forth, in cyberspace. Instructors and students can evaluate these arguments via the rules and criteria established in Chapter 3 to see how well, or how poorly, the premises in these arguments succeed in establishing their conclusions.

Exercise questions are included at the end of each chapter. First, basic “review questions” quiz the reader’s comprehension of key concepts, themes, issues, and scenarios covered in that chapter. These are followed by higher level “discussion questions” designed to encourage students to reflect more deeply on some of the controversial issues examined in the chapter. In addition to the “essay/presentation questions” that are also included in each chapter, a new set of “Scenarios for Analysis” has been added in response to instructors who requested some unanalyzed scenarios for classroom use. Building on the higher level nature of the discussion questions and essay/presentation questions, these scenarios are intended to provide students and instructors with additional resources for analyzing important controversies introduced in the various chapters. As such, these scenarios can function as in-class resources for group projects.

Some essay/presentation questions and end-of-chapter scenarios ask students to compare and contrast arguments and topics that span multiple chapters; for example, students are asked to relate arguments used to defend intellectual property rights, considered in Chapter 8, to arguments for protecting privacy rights, examined in Chapter 5. Other questions and scenarios ask students to apply foundational concepts and frameworks, such as ethical theory and critical thinking techniques introduced in Chapters 2 and 3, to the analysis of specific cyberethics issues examined in subsequent chapters. In some cases, these end-of-chapter questions and scenarios may generate lively debate in the classroom; in other cases, they can serve as a point of departure for various class assignments and group projects. Although no final “solutions” to the issues and dilemmas raised in these questions and scenarios are provided in the text, some “strategies” for analyzing them are included in the section of the book’s Web site (www.wiley.com/college/tavani) entitled “Strategies for Discussion Questions.”


ORGANIZATION AND STRUCTURE OF THE BOOK

Ethics and Technology is organized into 12 chapters. Chapter 1, “Introduction to Cyberethics: Concepts, Perspectives, and Methodological Frameworks,” defines key concepts and terms that will appear throughout the book. For example, definitions of terms such as cyberethics and cybertechnology are introduced in this chapter. We then examine whether any ethical issues involving cybertechnology are unique ethical issues. We also consider how we can approach cyberethics issues from three different perspectives: professional ethics, philosophical ethics, and sociological/descriptive ethics, each of which represents the approach generally taken by a computer scientist, a philosopher, and a social/behavioral scientist. Chapter 1 concludes with a proposal for a comprehensive and interdisciplinary methodological scheme for analyzing cyberethics issues from these perspectives.

In Chapter 2, “Ethical Concepts and Ethical Theories: Establishing and Justifying a Moral System,” we examine some of the basic concepts that make up a moral system. We draw a distinction between “ethics” and “morality” by defining ethics as “the study of morality.” “Morality,” or a moral system, is defined as an informal, public system comprising rules of conduct and principles for evaluating those rules. We then examine consequence-based, duty-based, character-based, and contract-based ethical theories. Chapter 2 concludes with a model that integrates elements of competing ethical theories into one comprehensive and unified theory.

Chapter 3, “Critical Reasoning Skills for Evaluating Disputes in Cyberethics,” includes a brief overview of basic concepts and strategies that are essential for debating moral issues in a structured and rational manner. We begin by describing the structure of a logical argument and show how arguments can be constructed and analyzed. Next, we examine a technique for distinguishing between arguments that are valid and invalid, sound and unsound, and inductive and fallacious. We illustrate examples of each type with topics affecting cybertechnology and cyberethics. Finally, we identify some strategies for spotting and labeling “informal” logical fallacies that frequently occur in everyday discourse.

Chapter 4, “Professional Ethics, Codes of Conduct, and Moral Responsibility,” examines issues related to professional responsibility for computer/IT professionals. We consider whether there are any special moral responsibilities that computer/IT professionals have as professionals. We then examine some professional codes of conduct that have been adopted by computer organizations. We also ask: To what extent are software engineers responsible for the reliability of the computer systems they design and develop, especially applications that include “life-critical” and “safety-critical” software? Are computer/IT professionals ever permitted, or perhaps even required, to “blow the whistle” when they have reasonable evidence to suggest that a computer system is unreliable? Finally, we examine some schemes for analyzing risks associated with the development of safety-critical software.

We discuss privacy issues involving cybertechnology in Chapter 5. First, we examine the concept of privacy as well as some arguments for why privacy is considered an important human value. We then look at how personal privacy is threatened by the kinds of surveillance techniques and data-collection schemes made possible by cybertechnology. Specific data-gathering and data-exchanging techniques are examined in detail. We next consider some challenges that data mining and Web mining pose for protecting personal privacy in public space. In Chapter 5, we also consider whether technology itself, in the form of privacy-enhancing technologies (or PETs), can provide an adequate solution to some privacy issues generated by cybertechnology.

Chapter 6, “Security in Cyberspace,” examines security threats in the context of computers and cybertechnology. Initially, we differentiate three distinct senses of “security”: data security, system security, and network security. We then examine the concepts of “hacker” and “hacker ethic,” and we ask whether computer break-ins can ever be morally justified. Next, we differentiate acts of “hacktivism,” cyberterrorism, and information warfare. Chapter 6 concludes with a brief examination of risk analysis in the context of cybersecurity.

We begin our analysis of cybercrime, in Chapter 7, by considering whether we can construct a profile of a “typical” cybercriminal. We then propose a definition of cybercrime that enables us to distinguish between “cyberspecific” and “cyber-related” crimes to see whether such a distinction would aid in the formulation of more coherent cybercrime laws. We also consider the notion of legal jurisdiction in cyberspace, especially with respect to the prosecution of cybercrimes that involve interstate and international venues. In addition, we examine technological efforts to combat cybercrime, such as controversial uses of biometric technologies.

Chapters 8 and 9 examine legal issues involving intellectual property and free speech, respectively, as they relate to cyberspace. One objective of Chapter 8, “Intellectual Property Disputes in Cyberspace,” is to show why an understanding of the concept of intellectual property is important in an era of digital information. We consider three theories of property rights and make important distinctions among legal concepts such as copyright law, patent protection, and trademarks. Additionally, we consider specific scenarios involving intellectual property disputes, including the original Napster controversy as well as some recent peer-to-peer (P2P) networks that have been used for file sharing. We also examine the Free Software and the Open Source Software initiatives. Finally, we consider a compromise solution that supports and encourages the sharing of digital information in an era when strong copyright legislation seems to discourage that practice.

Chapter 9, “Regulating Commerce and Speech in Cyberspace,” looks at additional legal issues, especially as they involve regulatory concerns in cyberspace. We draw distinctions between two different senses of “regulation” as it applies to cyberspace, and we also consider whether the Internet should be understood as a medium or as a “place.” We also examine controversies surrounding e-mail spam, which some believe can be viewed as a form of “speech” in cyberspace. We then ask whether all forms of online speech should be granted legal protection; for example, should child pornography, hate speech, and speech that can cause physical harm to others be tolerated in online forums?

Chapter 10 examines a wide range of equity-and-access issues from the perspective of cybertechnology’s impact on sociodemographic groups (affecting class, race, and gender). The chapter begins with an analysis of global aspects of the “digital divide.” We then examine specific equity-and-access issues affecting disabled persons, racial minorities, and women. Next, we explore the relationship between cybertechnology and democracy, and we consider whether the Internet facilitates democracy or threatens it. We then examine some social and ethical issues affecting employment in the contemporary workplace, and we ask whether the use of cybertechnology has transformed work and has affected the overall quality of work life.


In Chapter 11, we examine issues pertaining to online communities, virtual-reality (VR) environments, and artificial intelligence (AI) developments in terms of two broad themes: community and personal identity in cyberspace. We begin by analyzing the impact that cybertechnology has for our traditional understanding of the concept of community. In particular, we ask whether online communities, such as Facebook and Twitter, raise any special ethical or social issues. Next, we examine some implications that behaviors made possible by virtual environments and virtual-reality applications have for our conventional understanding of personal identity. The final section of Chapter 11 examines the impact that developments in AI have for our sense of self and for what it means to be human.

Chapter 12, the final chapter of Ethics and Technology, examines some ethical challenges that arise in connection with emerging and converging technologies. We note that cybertechnology is converging with noncybertechnologies, including biotechnology and nanotechnology, generating new fields such as bioinformatics and nanocomputing that, in turn, introduce ethical concerns. Chapter 12 also includes a brief examination of some issues in the emerging (sub)field of machine ethics. Among the questions considered are whether we should develop autonomous machines that are capable of making moral decisions and whether we could trust those machines to always act in our best interests.

A Glossary that defines terms commonly used in the context of computer ethics and cyberethics is also included. However, the glossary is by no means intended as an exhaustive list of such terms. Additional material for this text is available on the book’s Web site: www.wiley.com/college/tavani.

THE WEB SITE FOR ETHICS AND TECHNOLOGY

Seven appendices for Ethics and Technology are available only in online format. Appendices A through E include the full text of five professional codes of ethics: the ACM Code of Ethics and Professional Conduct, the Australian Computer Society Code of Ethics, the British Computer Society Code of Conduct, the IEEE Code of Ethics, and the IEEE-CS/ACM Software Engineering Code of Ethics and Professional Practice, respectively. Specific sections of these codes are included in hardcopy format as well, in relevant sections of Chapter 4. Two appendices, F and G, are also available online. Appendix F contains the section of the IEEE-CS/ACM Computing Curricula 2001 Final Report that describes the social, professional, and ethical units of instruction mandated in their computer science curriculum. Appendix G provides some additional critical reasoning techniques that expand on the strategies introduced in Chapter 3.

The Web site for Ethics and Technology also contains additional resources for instructors and students. Presentation slides in PowerPoint format for Chapters 1–12, as well as graphics (for tables and figures in each chapter), are available in the “Instructor” and “Student” sections of the site. As noted earlier, a section on “Strategies,” which includes some techniques for answering the discussion questions and unanalyzed scenarios included at the end of each of the book’s 12 chapters, is also included on this site.

The book’s Web site is intended as an additional resource for both instructors and students. It also enables me to “update the book,” in between editions, with new issues and scenarios in cyberethics, as they arise. For example, a section entitled “Recent Controversies” is included on the book’s Web site. I invite your feedback as to how this site can be continually improved.

A NOTE TO STUDENTS

If you are taking an ethics course for the first time, you might feel uneasy about embarking on a study of moral issues and controversial topics, because ethics is sometimes perceived to be preachy and its subject matter is sometimes viewed as essentially personal and private in nature. Because these are common concerns, I address them early in the textbook. I draw a distinction between an ethicist, who studies morality or a “moral system,” and a moralist, who presumes to have the correct answers to all of the questions; a primary objective of this book is to examine and analyze ethical issues, not to presume that any of us already has the correct answer to any of the questions I consider.

To accomplish this objective, I introduce three types of conceptual frameworks early in the textbook. In Chapter 1, I provide a methodological scheme that enables you to identify controversial problems and issues involving cybertechnology as ethical issues. The conceptual scheme included in Chapter 2, based on ethical theory, provides some general principles that guide your analysis of specific cases as well as your deliberations about which kinds of solutions to problems should be proposed. A third, and final, conceptual framework is introduced in Chapter 3 in the form of critical reasoning techniques, which provide rules and standards that you can use to evaluate the strengths of competing arguments and to defend a particular position you reach on a certain issue.

This textbook was designed and written for you, the student! Whether or not it succeeds in helping you to meet the objectives of a course in cyberethics is very important to me, so I welcome your feedback and would sincerely appreciate hearing your ideas on how this textbook could be improved. Please feel free to write to me with your suggestions, comments, and so forth. My email address is htavani@rivier.edu. I look forward to hearing from you!

NOTE TO INSTRUCTORS: A ROADMAP FOR USING THIS BOOK

The chapters that make up Ethics and Technology are sequenced so that readers are exposed to foundational issues and conceptual frameworks before they examine specific problems in cyberethics. In some cases, it may not be possible for instructors to cover all of the material in Chapters 1–3. It is strongly recommended, however, that before students are assigned material in Chapter 4, they at least read Sections 1.1, 1.4–1.5, 2.4–2.8, and 3.1. Instructors using this textbook can determine which chapters best accommodate their specific course objectives. Computer science instructors, for example, will likely want to assign Chapter 4, on professional ethics and responsibility, early in the term. Social science instructors, on the other hand, will likely examine issues discussed in Chapters 10 and 11 early in their course. Philosophy instructors may wish to structure their courses beginning with a thorough examination of the material on ethical concepts and ethical theory in Chapter 2 and techniques for evaluating logical arguments in Chapter 3. Issues discussed in Chapter 12 may be of particular interest to CS instructors teaching advanced undergraduate students.

Many textbooks in applied ethics include a requisite chapter on ethical concepts/theory at the beginning of the book. Unfortunately, they often treat them in a cursory manner; furthermore, these ethical concepts and theories are seldom developed and reinforced in the remaining chapters. Thus, readers often experience a “disconnect” between the material included in the book’s opening chapter and the content of the specific cases and issues discussed in subsequent chapters. By incorporating elements of ethical theory into my discussion and analysis of the specific cyberethics issues I examine, I have tried to avoid the “disconnect” between theory and practice that is commonplace in many applied ethics textbooks.

A NOTE TO COMPUTER SCIENCE INSTRUCTORS

Ethics and Technology can be used as the main text in a course dedicated to ethical and social issues in computing, or it can be used as a supplementary textbook for computer science courses in which one or more ethics modules are included. As I suggested in the preceding section, instructors may find it difficult to cover all of the material included in this book in the course of a single semester. And as I also previously suggested, computer science instructors will likely want to ensure that they allocate sufficient course time to the professional ethical issues discussed in Chapter 4. Also of special interest to computer science instructors and their students will be the sections on computer security and risk analysis in Chapter 6; open source code and intellectual property issues in Chapter 8; and regulatory issues affecting software code in Chapter 9. Because computer science instructors may need to limit the amount of class time they devote to covering foundational concepts included in the earlier chapters, I recommend covering at least the critical sections of Chapters 1–3 described previously. This should provide computer science students with some of the tools they will need as professionals to deliberate on ethical issues and to justify the positions they reach.

In designing this textbook, I took into account the guidelines on ethical instruction included in the Computing Curricula 2001 Final Report, issued in December 2001 by the IEEE-CS/ACM Joint Task Force on Computing Curricula, which recommends the inclusion of 16 core hours of instruction on social, ethical, and professional topics in the curriculum for undergraduate computer science students. [See the online Appendix F at www.wiley.com/college.tavani for detailed information about the social/professional (SP) units in the Computing Curricula 2001.] Each topic prefaced with an SP designation defines one “knowledge area” of the CS “body of knowledge.” They are distributed among the following 10 units:

SP1: History of computing (e.g., history of computer hardware, software, and networking)

SP2: Social context of computing (e.g., social implications of networked computing, gender-related issues, and international issues)


SP3: Methods and tools of analysis (e.g., identifying assumptions and values, making and evaluating ethical arguments)

SP4: Professional and ethical responsibilities (e.g., the nature of professionalism, codes of ethics, ethical dissent, and whistle-blowing)

SP5: Risks and liabilities of computer-based systems (e.g., historical examples of software risks)

SP6: Intellectual property (e.g., foundations of intellectual property, copyrights, patents, and software piracy)

SP7: Privacy and civil liberties (e.g., ethical and legal basis for privacy protection, technological strategies for privacy protection)

SP8: Computer crime (e.g., history and examples of computer crime, hacking, viruses, and crime prevention strategies)

SP9: Economic issues in computing (e.g., monopolies and their economic implications; effect of skilled labor supply)

SP10: Philosophical frameworks (e.g., ethical theory, utilitarianism, relativism)

All 10 SP units are covered in this textbook. Topics described in SP1 are examined in Chapters 1 and 10, and topics included in SP2 are discussed in Chapters 1 and 11. The methods and analytical tools mentioned in SP3 are described at length in Chapters 2 and 3, whereas professional issues involving codes of conduct and professional responsibility described in SP4 are included in Chapters 4 and 12. Also discussed in Chapter 4, as well as in Chapter 6, are issues involving risks and liabilities (SP5). Intellectual property issues (SP6) are discussed in detail in Chapter 8 and in certain sections of Chapter 9, whereas privacy and civil liberty concerns (SP7) are discussed mainly in Chapters 5 and 12. Chapters 6 and 7 examine topics described in SP8. Economic issues (SP9) are considered in Chapters 9 and 10. And philosophical frameworks of ethics, including ethical theory (SP10), are discussed in Chapters 1 and 2.

Table 1 illustrates the connections between the SP units and the corresponding chapters of this book.

TABLE 1 SP (“Knowledge”) Units and Corresponding Book Chapters

SP unit      1      2      3     4      5     6     7      8     9      10
Chapter(s)   1, 10  1, 11  2, 3  4, 12  4, 6  8, 9  5, 12  6, 7  9, 10  1, 2



ACKNOWLEDGMENTS

In revising Ethics and Technology for a fourth edition, I have once again drawn from several of my previously published works. Chapters 1–4, on foundational and professional issues in cyberethics, incorporate material from four articles: “The State of Computer Ethics as a Philosophical Field of Inquiry,” Ethics and Information Technology 3, no. 2 (2001); “Applying an Interdisciplinary Approach to Teaching Computer Ethics,” IEEE Technology and Society Magazine 21, no. 3 (2002); “The Uniqueness Debate in Computer Ethics,” Ethics and Information Technology 4, no. 1 (2002); and “Search Engines and Ethics,” Stanford Encyclopedia of Philosophy (2012).

Chapter 5, on privacy in cyberspace, also draws from material in four works: “Computer Matching and Personal Privacy,” Proceedings of the Symposium on Computers and the Quality of Life (ACM Press, 1996); “Informational Privacy, Data Mining, and the Internet,” Ethics and Information Technology 1, no. 2 (1999); “Privacy Enhancing Technologies as a Panacea for Online Privacy Concerns: Some Ethical Considerations,” Journal of Information Ethics 9, no. 2 (2000); and “Applying the ‘Contextual Integrity’ Model of Privacy to Personal Blogs in the Blogosphere” (coauthored with Frances Grodzinsky), International Journal of Internet Research Ethics 3 (2010). Chapters 6 and 7, on security and crime in cyberspace, draw from material in three sources: “Privacy and Security” in Duncan Langford’s book Internet Ethics (Macmillan/St. Martin’s, 2000); “Defining the Boundaries of Computer Crime: Piracy, Trespass, and Vandalism in Cyberspace” in Readings in CyberEthics, 2nd ed. (Jones and Bartlett, 2004); and “Privacy in ‘the Cloud’” (coauthored with Frances Grodzinsky), Computers and Society 41, no. 1 (2011).

In Chapters 8 and 9, on intellectual property and Internet regulation, I drew from material in “Information Wants to be Shared: An Alternative Approach for Analyzing Intellectual Property Disputes in the Information Age,” Catholic Library World 73, no. 2 (2002); and two papers coauthored with Frances Grodzinsky: “P2P Networks and the Verizon v. RIAA Case,” Ethics and Information Technology 7, no. 4 (2005) and “Online File Sharing: Resolving the Tensions between Privacy and Property,” Computers and Society 38, no. 4 (2008). Chapters 10 and 11, on the digital divide, democracy, and online communities, draw from material in two papers: “Ethical Reflections on the Digital Divide,” Journal of Information, Communication and Ethics in Society 1, no. 2 (2003) and “Online Communities, Democratic Ideals, and the Digital Divide” (coauthored with Frances Grodzinsky) in Soraj Hongladarom and Charles Ess’s book Information Technology Ethics: Cultural Perspectives (IGI Global, 2007).

Chapter 12, on emerging and converging technologies, incorporates material from my book Ethics, Computing, and Genomics (Jones and Bartlett, 2006), and from three recently published papers: “Can We Develop Artificial Agents Capable of Making Good Moral Decisions?” Minds and Machines 21, no. 3 (2011); “Trust and Multi-Agent Systems” (coauthored with Jeff Buechner), Ethics and Information Technology 13, no. 1 (2011); and “Ethical Aspects of Autonomous Systems” in Michael Decker and Mathias Gutmann’s book Robo- and Information-Ethics (Berlin: Verlag LIT, 2012).

The fourth edition of Ethics and Technology has benefited from suggestions and comments I received from many anonymous reviewers, as well as from the following colleagues: Jeff Buechner, Lloyd Carr, Jerry Dolan, Frances Grodzinsky, Kenneth Himma, James Moor, Martin Menke, Wayne Pauley, Mark Rosenbaum, Regina Tavani, and John Weckert. I am especially grateful to Fran Grodzinsky (Sacred Heart University), with whom I have coauthored several papers, for permitting me to incorporate elements of our joint research into relevant sections of this book. And I am most grateful to Lloyd Carr (Rivier University) for his invaluable feedback on several chapters and sections of this edition of the book, which he was willing to review multiple times; his astute comments and suggestions have helped me to refine many of the positions I defend in this book.
