Responsible Software Engineering: With Real-World Case Studies from Google

Author: Daniel J. Barrett

Technology

Today's software applications need more than a friendly interface and correct algorithms. They also need to be responsible: to be beneficial for society and not cause harm. In an era of AI chatbots, deep fake images and videos, social media bubbles, expanding privacy regulations, and a warming planet, it's more important than ever to practice responsible software engineering so your products earn your users' trust—and deserve it.

Responsible Software Engineering gathers the wisdom of over 100 Google employees to help you anticipate the effects of your software on the world and its inhabitants. It features expert advice and practical case studies so you can build better applications that are more ready for real-world situations:

• Treating people more fairly, regardless of their beliefs, culture, skin tone, and other attributes
• Operating more safely, to reduce the risk of physical, psychological, or financial harm
• Better protecting people's privacy, particularly when collecting personal information
• Incorporating wisdom from the social sciences, law, ethics, and other fields that many engineers may be unfamiliar with
• Reducing emissions of carbon dioxide (CO2), to address the risks of climate change

Join Daniel J. Barrett, a senior manager at Google and long-time software engineer, to dive into these issues and more, including real-world, large-scale case studies. You'll receive expert advice on how to anticipate the effects of your application on the world and its inhabitants, so you can have more confidence that your products "do the right thing."

📄 File Format: PDF
💾 File Size: 26.2 MB

📄 Text Preview (First 20 pages)

📄 Page 1
Responsible Software Engineering: With Real-World Case Studies from Google. Daniel J. Barrett.
📄 Page 2
ISBN: 978-1-098-14916-1 | US $59.99 | CAN $74.99 | SOFTWARE ENGINEERING

Today’s software applications need more than a friendly interface and correct algorithms. They also need to be responsible: to be beneficial for society and not cause harm. In an era of AI chatbots, deep fake images and videos, social media bubbles, expanding privacy regulations, and a warming planet, it’s more important than ever to practice responsible software engineering so your products earn your users’ trust—and deserve it. Responsible Software Engineering gathers the wisdom of over 100 Google employees to help you anticipate the effects of your software on the world and its inhabitants. It features expert advice and practical case studies so you can build better applications that are more ready for real-world situations:

• Treating people more fairly, regardless of their beliefs, culture, skin tone, and other attributes
• Operating more safely, to reduce the risk of physical, psychological, or financial harm
• Better protecting people’s privacy, particularly when collecting personal information
• Incorporating wisdom from the social sciences, law, ethics, and other fields that many engineers may be unfamiliar with
• Reducing emissions of carbon dioxide (CO2), to address the risks of climate change

Daniel J. Barrett, PhD, has been a software engineer and technical writer for almost 40 years. He’s worked at companies of all sizes, from startups to large corporations, including seven years at Google. He is also the author of Linux Pocket Guide, Efficient Linux at the Command Line, and many other books for software engineers.

“In a world of AI, software engineers need to think more than ever about the security consequences of their products. Responsible Software Engineering is an eye-opening tour through the issues and strategies to make software safer.”
Bruce Schneier, author of A Hacker’s Mind: How the Powerful Bend Society’s Rules, and How to Bend Them Back

“Responsible Software Engineering tackles the urgent, but often invisible, decisions every developer faces. Barrett transforms nuanced real-world challenges into practical guidance, showing how we can effectively consider the broader implications of the software we build. This is essential reading that deserves a spot on the shelf of every developer or technology leader.”
Adam Scott, senior engineering director at College Board and author of JavaScript Cookbook, JavaScript Everywhere, and the Ethical Web Development series
📄 Page 3
Daniel J. Barrett Responsible Software Engineering With Real-World Case Studies from Google
📄 Page 4
Responsible Software Engineering
by Daniel J. Barrett

Copyright © 2025 Daniel J. Barrett. All rights reserved.

Published by O’Reilly Media, Inc., 141 Stony Circle, Suite 195, Santa Rosa, CA 95401.

O’Reilly books may be purchased for educational, business, or sales promotional use. Online editions are also available for most titles (https://oreilly.com). For more information, contact our corporate/institutional sales department: 800-998-9938 or corporate@oreilly.com.

Acquisitions Editors: John Devins and Megan Laddusaw
Development Editor: Michele Cronin
Production Editor: Gregory Hyman
Copyeditor: Doug McNair
Proofreader: Piper Content Partners
Indexer: Potomac Indexing, LLC
Cover Designer: Karen Montgomery
Interior Designer: David Futato
Cover Illustrator: Karen Montgomery
Interior Illustrator: Kate Dullea

September 2025: First Edition

Revision History for the First Edition
2025-09-04: First Release

See http://oreilly.com/catalog/errata.csp?isbn=9781098149161 for release details.

The O’Reilly logo is a registered trademark of O’Reilly Media, Inc. Responsible Software Engineering, the cover image, and related trade dress are trademarks of O’Reilly Media, Inc.

The views expressed in this work are those of the author and do not represent the publisher’s views. While the publisher and the author have used good faith efforts to ensure that the information and instructions contained in this work are accurate, the publisher and the author disclaim all responsibility for errors or omissions, including without limitation responsibility for damages resulting from the use of or reliance on this work. Use of the information and instructions contained in this work is at your own risk. If any code samples or other technology this work contains or describes is subject to open source licenses or the intellectual property rights of others, it is your responsibility to ensure that your use thereof complies with such licenses and/or rights.

ISBN: 978-1-098-14916-1 [LSI]
📄 Page 5
Table of Contents

Preface  vii

1. Responsible Software Engineering: A Quick Introduction  1
   What Is Responsible Software Engineering?  2
   A Little Help from Some Specialists  4
   What Is Responsible Engineering Not?  5
   A Little History  6
   Adopting a Responsible Mindset  8
   Summary  11

2. Creating AI Systems That Work Well for Everyone  13
   What Is Fairness?  15
   Why Is Fairness Hard?  18
   Fairness Is Different from Accuracy  20
   Fairness Is Relative  21
   Bias Is Always Present  22
   AI Input Can Be Ambiguous  24
   AI Output Can Be Hard to Evaluate  26
   Evaluating Fairness  26
   Parity Issues  27
   Stereotyping Issues  30
   Accuracy Issues  33
   Combinations of Issues  33
   Resources for Evaluating Fairness  35
   Mitigating Fairness Issues, in Brief  36
   People- and Process-Related Suggestions  36
   Technology Solutions  38
📄 Page 6
   Case Study: Oversexualized Generated Imagery  41
   Summary  45

3. Incorporating Societal Context  47
   What Is Societal Context?  49
   Issues of Abstraction  51
   Making Your Causal Assumptions Explicit  52
   Mitigating Bias in the Care Management Algorithm  56
   Best Practices  57
   Identifying Agents, Artifacts, and Precepts  57
   Creating a Welcoming Environment for Exchanging Viewpoints  60
   Case Study: Detecting Toxic Comments  64
   Summary  69

4. Anticipating and Planning for Downstream Consequences  71
   Safety and Harm  73
   Types of Harm  74
   Testing for Safety  75
   How Is Safety Related to Ethics?  78
   Common Justifications for Sidestepping Ethical Behavior  80
   Methods for Anticipating Consequences  82
   Testing with Breadth  82
   Codesigning with Users  83
   Reviewing a List of Harms  84
   Practicing Future Regret  85
   Running Tabletop Exercises  85
   Implementing Abuser and Survivor Testing  86
   Stress-Testing Your Applications  88
   Trying Chaos Engineering  89
   Educating Yourself About Other People’s Lives  89
   Case Study: Google’s Moral Imagination Workshop  90
   Preparations  91
   What Next?  96
   Summary  99

5. Securing and Respecting Users’ Privacy  101
   What Is Privacy?  102
   Personally Identifiable Information  104
   Data Collection, Trade-offs, and Convenience  104
   Privacy from the User’s Perspective  107
   No Surprises  108
   Transparency  110
📄 Page 7
   Consent  111
   Control  113
   Privacy from a Data Perspective  115
   Minimization  115
   Retention  116
   Anonymization  117
   From Tools to Policy  123
   Case Study: Protecting Privacy During the COVID Pandemic  124
   Living and Working in a Privacy-Focused World  128
   Summary  129

6. Measuring and Reducing Your Code’s Carbon Footprint  131
   Measuring Carbon Emissions  132
   Principles of Power  134
   Beyond Direct Carbon Emissions  139
   Controlling Your Code’s Carbon Footprint  140
   Controlling Processor Usage  140
   What About Coding for Performance?  143
   Controlling the Code’s Location  144
   Optimizing for Time of Day  146
   Getting Involved  147
   Case Study: Cooling a Data Center with AI  148
   Summary  150

7. Building a Culture of Responsible Software Engineering  151
   Setting Policy  152
   Sponsorship and Support  153
   Misunderstandings About a Culture of Responsibility  154
   Spreading the Word  157
   Messaging  157
   Educating New Hires  159
   Establishing Processes  161
   Creating Incentives  163
   Incentives from the Top Down  164
   Incentives from the Bottom Up  165
   Learning from Mistakes  166
   Measuring Success  167
   Case Study: The Responsible Innovation Challenge  168
   Summary  171

Index  173
📄 Page 8
(This page has no text content)
📄 Page 9
Preface

This book was created in 14 seconds.

That’s the time I spent watching one compelling scene in The Social Dilemma, a documentary film from 2020.[1] The film describes how social media companies engage in practices that may be harmful to our health, our ties to other human beings, and democracy itself. I found some scenes convincing and others overly dramatic, but those particular 14 seconds altered my whole outlook on software development.

The scene in question is an interview with a former Facebook engineer named Justin Rosenstein, who was a developer of the Like button. His team’s “entire motivation” for inventing likes, he said in the film, was to “spread positivity and love in the world.” As their invention reached billions of users, however, he and his team found that they hadn’t anticipated some serious negative effects on society. “The idea that...teens would be getting depressed when they don’t have enough likes, or it could be leading to political polarization, was nowhere on our radar.”[2]

This scene nagged at me. I mean, I was coding web applications back in 2007, when the Like button was conceived. What if fate had placed me at Facebook on Rosenstein’s team? Would I have thought ahead about potential risks of the Like button? Or would I just have been swept up in the coolness of the invention? I couldn’t know. But I was fascinated to learn that such experienced software engineers, acting with the best of intentions, could deliver a product with these unforeseen complications. I began to wonder: can we, as a community of software engineers, learn to anticipate and prevent these sorts of unwanted effects of the systems we build? This book is my answer to that question.

[1] Not to be confused with The Social Network, a drama about the origins of Facebook starring Jesse Eisenberg.
[2] Read a full transcript of The Social Dilemma.
📄 Page 10
What’s in This Book?

This book is about writing software responsibly for the real world—a world that’s complex, multicultural, hard to predict, and downright messy. Applications that work beautifully during development and testing may behave unexpectedly when real people and their lives enter the picture. Anticipating and mitigating these issues is called responsible software engineering.

I’ll cover a broad selection of responsible software-engineering principles to help you build better applications that are more ready for real-world situations:

• Treating people more fairly, regardless of their beliefs, culture, skin tone, abilities, and other attributes
• Operating more safely, to reduce the risk of physical, psychological, or financial harm
• Protecting people’s privacy better, particularly when collecting or using their personal information
• Incorporating wisdom from the social sciences, law, ethics, and other fields that many engineers may be unfamiliar with
• Reducing emissions of carbon dioxide (CO2), to address the risks of climate change
• Gaining, maintaining, and deserving users’ trust in your products

If you’re a software engineer or you work with software engineers to create products, and if you care about the effects of your software on your users’ lives, then this book is for you. (If you don’t care about these effects, I doubly hope you’ll read this book!)

Today, in 2025, some of the topics and terms in this book have become much more politicized than when I began writing it in 2021. I’m pretty sure, though, that none of us wants to be denied a job or health care because of an unfair algorithm. None of us wants our most sensitive, private information, or our children’s information, to be collected or revealed without our permission. None of us, I hope, wants to build software with unintended effects that harm people. I wrote this book to share knowledge and best practices to help make algorithms more fair, information more private, and software effects more predictable.
📄 Page 11
What’s Not in This Book?

This book is a broad look at responsible software engineering. It’s filled with general guidance, specific tips, and detailed case studies from Google, where I worked for seven years. However, it does not include a few notable things:

There’s very little code.
If you’re looking for source code to make your software more responsible, this is not the book for you, although I do suggest a few open source libraries to try. In addition, check out Machine Learning for High-Risk Applications: Approaches to Responsible AI by Patrick Hall, James Curtis, and Parul Pandey (O’Reilly).

This book is not official Google policy.
It is my own work, informed by over a hundred interviews with my fellow Google employees (“Googlers”) and other professionals.

I draw many examples in this book from the experiences of Googlers. This should be no surprise, given the book’s subtitle of Real-World Case Studies from Google, but I want to call out this fact directly in case you’re wondering whether this book is a big advertisement for Google products. It’s not. I include these focused examples to create teachable moments about software engineering—the responsible kind and otherwise—and to share stories that you may never have heard before. I also don’t mean to imply that Google’s practices are more or less responsible than those of other software companies. Many companies hire great engineers, and all companies make mistakes. What matters is how they deal with those mistakes afterward. I hope my Google-related case studies provide you with interesting insights into responsible software engineering in practice.

A Note About the Characters

This book features three cartoon characters named Ree, Cwip, and Endy, who are introduced in Chapter 1. They are intentionally drawn with androgynous features and medium skin tones. I wanted them to look generic enough to avoid stereotyping yet still portray distinct personalities. For example, Cwip’s ideas are consistently irresponsible or just plain bad, and I didn’t want this behavior to be associated with any particular group, based on Cwip’s outward appearance. O’Reilly illustrator Kate Dullea, editor Michele Cronin, and I spent several months designing and redesigning the characters, and a survey of 60 test readers suggested that we met our goal. I hope you’ll agree.
📄 Page 12
Conventions Used in This Book

The following typographical conventions are used in this book:

Italic
Indicates new terms, URLs, email addresses, filenames, and file extensions.

Constant width
Used for program listings, as well as within paragraphs to refer to program elements such as variable or function names, databases, data types, environment variables, statements, and keywords.

This element signifies a tip or suggestion.

This element signifies a general note.

This element indicates a warning or caution.

O’Reilly Online Learning

For more than 40 years, O’Reilly Media has provided technology and business training, knowledge, and insight to help companies succeed. Our unique network of experts and innovators share their knowledge and expertise through books, articles, and our online learning platform. O’Reilly’s online learning platform gives you on-demand access to live training courses, in-depth learning paths, interactive coding environments, and a vast collection of text and video from O’Reilly and 200+ other publishers. For more information, visit https://oreilly.com.
📄 Page 13
How to Contact Us

Please address comments and questions concerning this book to the publisher:

O’Reilly Media, Inc.
141 Stony Circle, Suite 195
Santa Rosa, CA 95401
800-889-8969 (in the United States or Canada)
707-827-7019 (international or local)
707-829-0104 (fax)
support@oreilly.com
https://oreilly.com/about/contact.html

We have a web page for this book, where we list errata and any additional information. You can access this page at https://oreil.ly/responsible-software-engineering.

For news and information about our books and courses, visit https://oreilly.com.

Find us on LinkedIn: https://linkedin.com/company/oreilly-media.

Watch us on YouTube: https://youtube.com/oreillymedia.

Acknowledgments

Thank you to the multitudes of Googlers who shared their wisdom and made this book possible. It has been such a privilege to meet and learn from over 160 world-class experts in AI, privacy, safety, carbon emissions, ethics, law, and other topics in the realm of responsible software engineering.

Thank you to my colleagues who guided me as I developed and proposed the book idea to senior leaders at Google. Kevin O’Malley and Ricardo Olenewa helped me get started, and Annie Jean-Baptiste, Jess Holbrook, Salim Virji, and Shylaja Nukala added key insights for navigating the process.

Thank you to the executive sponsors who approved the book project and fielded my questions along the journey: Bram Bout, Will Carter, Jen Gennai, Jason Freidenfelds, Alice Friend, Maggie Johnson, and Ian Wilbur. Also, thank you to Marian Croak for early encouragement. Special thanks to my manager, Mohamed Dekhil, for his support so I could give this book the time and attention it needed.

Whenever one writes for the public within a public company, invariably, the content must pass muster with the legal, global communications, and public policy departments, plus product owners and other gatekeepers. This time-consuming process, to my surprise and delight, was friendly and smooth. I’m grateful to the enthusiastic, skilled professionals who helped me walk the line between educating the public and protecting Google’s confidential information: Aki Estrella, Amy Coyle, Ben Bariach,
📄 Page 14
Beth Gavin, Brian Gabriel, Carina Koszubatis, Cary Bassin, Chelsea Russo, Chrissy Moy, Chrissy Patterson, Dawn Bloxwich, Duncan Smith, Eli Liliedahl-Allen, Elijah Lawal, Elise Bigelow, Emily Liu, Ian Wilbur, James Pond, Jerry Torres, Jessica Valdez, Julia Wu, Karl Ryan, Kelly Hanson-Schaefer, Kelsea Carlson, Liam Foster, Makiko Izuta, Marissa Urban, Matthew Flegal, Michael Zwibelman, Michelle Alborzfar, Miguel Guevara, Molly Beck, Ndidi Elue, Nicole Schone, Rachel Stigler, Reena Jana, Renee Schneider, Ruth Ann Castro, Ryan Woo, Sandy Karp, Shanice Onike, Shannon Leong, Shira Almeleh, Taylor Montgomery, Thor Wasbotten, Tim Taylor, Tom Kuhn, Yael Marzan, and Zoe Ortiz.

My very deepest thanks go out to the wise and generous Googlers who spoke with me about their work and their passion for responsible software engineering in practice. Folks, I am incredibly grateful for all that you taught me. This book could not exist without you: Adam Bender, Alex Beutel, Alicia Chang, Amanda McCroskery, Ana Radovanovic, Andrew Smart, Andrew Trenk, Andrew Zaldivar, Andrey Petrov, Anna Escuer, Anne Peckham, Annie Jean-Baptiste, Anthony House, Armete Mobin, Auriel Wright, Barry Rosenberg, Ben Hutchinson, Ben Treynor, Ben Zevenbergen, Benjamin Treynor Sloss, Beth Tsai, Brandon Jones, Brock Taute, Cary Bassin, Chad Brubaker, Chris Gamble, Christina Greer, Christine Robson, Christopher Bian, Courtney Heldreth, Craig Swanson, Dale Allsopp, Dan Kane, Darcy Lima, David Madras, David Patterson, David Westbrook, Dennis Kraft, Diane Korngiebel, Donald Martin Jr., Eli Romanova, Emilio Garcia, Emily Liu, George Fairbanks, Ian Schneider, Iason Gabriel, Jamila Smith-Loud, Jaspreet Bhatia, Jilin Chen, Johnny Soraker, Julie Ralph, Julie Rapoport, Karan Gill, Ken Burke, Kendal Smith, Kevin Rabsatt, Lily Yu, Lorenzo Dini, Lucy Vasserman, Manya Sleeper, Marisa Leung, Mark Chow, Matthew Gray, Michael Madaio, Miguel Guevara, Milica Stojmenović, Muthoni Richards, Neal Eckard, Nina Bhatti, Pam Greene, Parker Barnes, Partha Basu, Paul Nicholas, Rachel Stigler, Raiden Hasegawa, Reena Jana, Remi Denton, Renee Shelby, Rony Yuria, Sameer Sethi, Sanders Kleinfeld, Sandy Karp, Sasha Brown, Savannah Goodman, Scott Robson, Shvveta Walia, Stephan Somogyi, Susan Hao, Susanna Ricco, Tamar Savir, Ted Osborne, Teri Karobonik, Tiffany Deng, Titus Winters, Tod Hilton, Tom Manshreck, Tom Stepleton, Tulsee Doshi, Valentina Nesci, Vincent Dao, Will Hawkins, William Quan, X Eyee, Yoni Halpern, Yuchi Liu, and Zach Eddinger. Whew!

Extra-special thanks go to Donald Martin Jr., who coauthored Chapter 3, and to Ben Zevenbergen and Amanda McCroskery, who met with me for weeks to codesign the case study on moral imagination in “Case Study: Google’s Moral Imagination Workshop” on page 90. Writing these parts of the book would have been impossible without our close collaborations.

Thank you to my editor at O’Reilly, Michele Cronin, for her guidance; senior editor John Devins, for believing in and signing the book; and senior editor Megan Laddusaw. I’m also grateful to illustrator Kate Dullea for bringing the characters Ree, Cwip, and Endy to life, and to O’Reilly’s production team for designing a somewhat
📄 Page 15
nonstandard-looking O’Reilly book. Also, for their insightful and well-informed fact-checking and feedback, I thank O’Reilly’s external technical reviewers: Alex Hamerstone, Andy Petrella, Anirudh Topiwala, Chris Devers, Gen Kallos, Jayant Chowdhary, and Jess Males. Special thanks to Ziad Obermeyer for reviewing the medical example that opened Chapter 3 and drew on research by Ziad and his collaborators.

Finally, a gigantic thank-you to my amazing family—Lisa, Sophia, Kay, and Luna—for their love and support that saw me through this four-year project.
📄 Page 16
(This page has no text content)
📄 Page 17
CHAPTER 1
Responsible Software Engineering: A Quick Introduction

A friend of mine writes articles for the national media. A few years ago, one of their articles drew the attention of extremists, who responded with thousands of aggressive, hateful messages over email, social media, and the telephone—including threats of violence. It was a stressfest for my friend. They blocked the threatening phone callers and started using Google Voice to screen calls. In case you’re not familiar with Google Voice, it’s a virtual phone line that uses AI to transcribe voicemail messages and email them to you.

The hate storm eventually dissipated. Time passed, and life returned to normal. My friend continued using Google Voice anyway. Until one day, they received an unsettling voicemail transcript (shown in Figure 1-1): “Dead dead dead dead dead dead dead...”

Figure 1-1. An unfortunate voicemail message as transcribed by Google Voice
📄 Page 18
What did the message mean? Was it a threat? Was the horrible hate storm starting up again? Feeling worried, my friend connected to Google Voice and listened to the message directly.

The call was not a threat. It was a series of beeps—the classic sound of a disconnected phone call in the United States. The AI software that powers Google Voice had transcribed the “beep, beep, beep” sound as “dead, dead, dead.”

Clearly, the team of engineers who developed Google Voice did not intend for their product to frighten anyone. Their mission was to create a great application. This incident demonstrates that software products, even those developed with the best of intentions, can have unexpected effects on users.

In my friend’s case, the only consequences of receiving the mistranscribed voicemail message were a few minutes of discomfort, and they can chuckle about it today. But in other cases, the risks can be more significant. On the same day that I wrote this paragraph, a major tech company announced a smartphone feature that can mimic any human voice from a small number of samples. This feature appears to have been developed with good intentions, to help people who have lost their voice due to illness or disability. But major Wall Street investment firms use voice recognition to permit their clients to access their financial accounts without a password. It doesn’t take a genius to predict what happens when voice mimicking and voice recognition technologies meet.[1]

Part of a software engineer’s job is to foresee and prevent harmful effects of their applications as they run in the real world. This idea is called responsible software engineering. In this chapter, I’ll introduce some concepts of responsible software engineering that form the foundation of the rest of the book. You’ll learn what responsible software engineering is and isn’t, cruise through a little history, and preview the major themes to come.

What Is Responsible Software Engineering?

Responsible software engineering means developing software products to be socially beneficial and to not harm the earth or its inhabitants. Let’s unpack that definition one piece at a time:

Socially beneficial
Socially beneficial software products primarily serve the well-being of the public.

[1] In fact, it’s already happening. See Flitter, Emily, and Stacy Cowley, “Voice Deepfakes Are Coming for Your Bank Balance”, New York Times, August 30, 2023.
📄 Page 19
To not harm the earth
This means to optimize products so they do not squander resources, such as electricity and water in data centers, and to contribute as little as possible to harmful carbon dioxide emissions that intensify climate change.

Or its inhabitants
This means designing products that don’t hurt people physically, economically, psychologically, or in other ways we’ll discuss. I say “inhabitants” rather than “people” to include other living things on our planet when relevant.

My definition of responsibility focuses on how we develop products, not on the products themselves. Pretty much any product can cause harm, depending on how it’s used. Email programs and social media platforms, for example, bring social benefits because they help us communicate quickly over long distances, yet they can also be used in harmful ways, like spreading falsehoods or malware on a large scale. It’s more realistic for us to judge whether a particular application or platform was engineered responsibly. A social media platform that does nothing to block the spread of malware would fail this test because its creators arguably have a social responsibility to protect their users.

Responsible software engineering is about more than just technology. It also includes the social context in which technology is deployed. Here’s what I mean. Suppose you created the most powerful map app in the world, which helps people take the most efficient routes to their destinations. A responsible design could mean not only calculating those routes accurately but also considering the societal impact on the neighborhoods along those routes. Once people start using your app, it might redirect far more traffic than before into those neighborhoods and increase congestion, pollution, and even accidents in those areas. So, you might want to work directly with the inhabitants of the areas to mitigate those unwanted side effects. Of course, it’s impossible to foresee every possible effect your technology will have, but I’ll present some guidance in Chapter 4.

Responsible software engineering is also about more than AI. Responsibility extends to any software engineering that has a pervasive influence on society. An example is the design of certain cryptocurrencies with massive energy needs that exceed the electricity usage of whole countries.

Finally, responsible software engineering isn’t just about testing your applications carefully. Testing is certainly critical (and Chapter 4 discusses it), but it happens relatively late in development, after you’ve made many design decisions and begun coding. If you practice responsible software engineering earlier—like when you’re gathering requirements, creating the design, or even just brainstorming—you may proactively catch issues that would be expensive to fix later.
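As one small illustration of catching a potentially harmful effect before it reaches users, recall the Google Voice transcript from the start of this chapter. The sketch below is hypothetical and not Google Voice's actual design; the names, data structure, and confidence threshold are assumptions made purely for illustration. It shows a simple defensive pattern: flag low-confidence speech-to-text segments instead of presenting them to the user as verbatim speech, so a transcript like "dead, dead, dead" arrives marked as uncertain rather than as an apparent threat.

```python
from dataclasses import dataclass

@dataclass
class TranscriptSegment:
    text: str          # text produced by the speech-to-text model
    confidence: float  # model confidence in the range [0.0, 1.0]

# Hypothetical threshold; a real product would tune this against labeled data.
LOW_CONFIDENCE = 0.6

def render_voicemail(segments: list[TranscriptSegment]) -> str:
    """Build the message shown to the user, marking uncertain segments
    instead of presenting them as verbatim speech."""
    parts = []
    for seg in segments:
        if seg.confidence < LOW_CONFIDENCE:
            parts.append(f"[unclear audio, possibly: {seg.text!r}]")
        else:
            parts.append(seg.text)
    return " ".join(parts)

# Example: a disconnected-line beep pattern mistranscribed with low confidence.
voicemail = [TranscriptSegment("dead dead dead dead", confidence=0.31)]
print(render_voicemail(voicemail))
# -> [unclear audio, possibly: 'dead dead dead dead']
```

Surfacing the model's uncertainty is only one possible mitigation; the broader practice of anticipating failure modes like this one is the subject of Chapter 4.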
📄 Page 20
If the scope of responsible software engineering sounds immense, don’t sweat it right now. I’ll walk you through lots of examples throughout the book, including case studies, so you can see how other engineers apply responsible software engineering in the real world.

A Little Help from Some Specialists

Responsible software engineering is a huge topic, so I’ve invited a trio of specialists to help us throughout this book as we grapple with thorny issues. Let me introduce you.

Our first specialist is Ree, a software engineer. Ree wants to develop “ree-sponsible” apps but doesn’t have the skills yet to do so. Ree is curious and wants to do the right thing for users and for Ree’s employer. Ree will be learning right along with you.

Our second specialist is Cwip, who is a Creative, Well-Intentioned Person. Cwip likes to come up with ideas for software products and features, but as we’ll see, Cwip’s ideas are often at odds with responsible software practice. Perhaps you know or even work with someone like Cwip. They mean well, but….