Video: Introducing the Responsible Technology Toolkit
Transcript
00:00
(upbeat music) - Hello, welcome to the Responsible Technology Toolkit. My name's John Ridpath, and I'm the author and presenter of this course. I believe that the tech industry is mostly populated by people who want to do the right thing, people who want to navigate the tricky path into the future in a conscientious, thoughtful, and inclusive way. If that sounds like you, I hope that this course can help you on that journey. In this introductory video, we're going to cover three things.
00:29
Firstly, I'll talk a bit about who I am and share my own responsible technology journey. Secondly, I'll talk about how we're defining responsible technology in this course. And finally, I'll give an overview of what you can expect to learn. So, let's get started with my responsible technology story. I started out my career in journalism and then pivoted to web development, and that combination of communication and technology eventually led me into the world of education.
00:52
About a decade ago, I started working at a technology education business, designing courses on topics like artificial intelligence, cybersecurity, and coding. I loved helping people figure out what these technologies and trends meant for them. Back then, I was very much a tech optimist: I believed in the positive power of technology, and particularly in open standards like the World Wide Web. But then something started to feel off for me. I began to see more stories like this one.
01:17
This is a 2014 news story about Facebook's mood manipulation experiment. For one week, Facebook's data scientists adjusted the newsfeed algorithm of almost 700,000 Facebook users. Some people were shown content with happy and positive words, and some were shown content that was determined to be sadder than average. When the week was over, they saw that these manipulated users tended to mirror what they'd been shown: those fed more positive content were more likely to post positive content themselves, and those fed sadder content were more likely to post negative content.
01:42
This probably isn't the worst thing that's happened in tech over the last decade, but I think it's a good example of something that wasn't illegal, but was creepy, sinister, and unethical. I began to spot more and more examples of technology causing harm, interfering in people's decisions, enabling socially destructive forms of capitalism, breaching people's privacy, and discriminating against people.
02:03
And just as I started to pay attention to these more negative stories, I began to spot some more optimistic ones: stories of people fighting against the harms of technology. For example, I read about Joy Buolamwini, who was fighting bias in facial recognition algorithms and started something called the Algorithmic Justice League. Finally, I decided it was time to get involved in responsible technology myself, and I did that by going back to journalism: I wrote a piece for Eye Magazine, a design and visual culture magazine.
02:31
My piece was called Ethics in the Age of Data Capitalism. It came out in Autumn 2019, and it's about designers who are tackling questions of tech ethics in their work. In the process of writing that article, I connected with Projects by IF, a strategy and design firm that specializes in trust. I then went on to work with IF: first, I helped build out their Responsible Technology by Design Framework, and then I worked alongside IF with clients like Google, the BBC, and Genomics England, work that focused on making trustworthy products and services easier to design, launch, and maintain.
03:04
And all of this work has led me to this point: creating this course. When I started looking into responsible technology, it wasn't really clear where to go or what to do. There were lots of interesting ethical discussions, but nothing felt practical. For that reason, I became interested in helping other people get their heads around this space and hit the ground running, which brings us to our definition of responsible technology. The first thing to say is that this is a really broad space with some overlapping and contrasting ideas, and the terminology isn't settled.
03:32
For example, you might see people using ethical technology and responsible technology interchangeably, and you might see people arguing in favor of one versus the other. Now, of course, there is a difference between ethics and responsibility, but I'm quite relaxed about that higher-level terminology. Whether we're talking about ethics or responsibility, what we're really talking about is the intersection of technology and human values. And when I talk about technology, I'm generally referring to things like apps and websites, internet-connected devices, artificial intelligence, software, and networks, both in terms of how these are produced and how they're placed in the wider culture.
04:08
For example, the business models enabled by these technologies, things like e-commerce, online advertising, on-demand entertainment, marketplaces, the sharing economy. And by human values, I'm referring to topics like social justice, human flourishing, inclusivity, equity, privacy, civil liberties, and democracy. Of course, a lot of different topics fall under that umbrella of technology and human values. Here's a list of some of those topics in a document called the Responsible Tech Guide.
04:37
This is produced by All Tech Is Human, an organization supporting the responsible technology ecosystem. As you can see, there's a huge range of topics, from accessibility to virtual reality ethics. Each of these topics could be a course in its own right. So, let's talk about what we are going to cover in this course. We're going to focus on a series of problem spaces that lie at the heart of responsible technology, and I'm gonna share some practical tools that can help you navigate those spaces.
05:02
Firstly, we'll look at how you can spot the difference between wishful worries and genuine harms. This matters because it's easy to be drawn into speculative ethical debates in this space. Some of these ideas might make for fascinating sci-fi, but I'd argue they're not a priority compared with the genuine harms being experienced in society right now and exacerbated by technology. I'll share the Wishful Worry Filter, a tool I've created that can help focus your attention on the things that matter.
05:29
Then, we'll tackle one of the biggest obstacles to bringing responsible technology into our work: how can you persuade your team, your client, or the business you work for to invest real time and real money in making technology more responsible? We'll be exploring one solution to that challenge: Consequence Scanning. Consequence Scanning is an agile process that has been embraced within tech companies such as Salesforce.
05:51
And I'll share a Miro template that I've created that can help you run your own Consequence Scanning workshop. Then, we'll explore a topic that sits in a more radical space, design justice. We'll look at how the design justice principles can help us center marginalized communities in our design work, and challenge society's structural inequalities. I think Consequence Scanning and design justice speak to the incredible breadth of the responsible technology space.
06:14
Tools like Consequence Scanning support the idea that we can make a change within existing corporate structures, whereas something like design justice advocates for more radical system change. Next, we'll look at the topic of data, something that lies at the heart of all of our digital experiences and is powering some of the technologies with the greatest potential for harm, things like machine learning.
06:34
We'll be taking a look at datasheets and other types of documentation, like system cards, that can help us work with data in a more reflective, transparent, and accountable way. And after that, we'll take a look at patterns. Firstly, we'll cover deceptive patterns, which are common techniques used in websites and apps that manipulate you in some way, for example, making it difficult for you to cancel a subscription. We'll also be looking at responsible patterns and some of the catalogs created by members of the wider responsible technology movement.
07:02
These catalogs and libraries can be really helpful if you need some inspiration about more responsible ways of creating technology. And finally, we'll look at some of the ways that responsibility can be engineered deeper into the tech stack. We'll be talking about PETs, which stands for Privacy Enhancing Technologies, and we'll be looking at how you can navigate this space using tools like the PETs Adoption Guide. So, as you can see, I've focused on creating a snapshot of some of the tools available in the responsible technology space, and I hope that these tools will be useful in two ways.
07:31
Firstly, I hope that you can use them as the authors intended to directly improve some of the products and services that you are working on. Secondly, I hope they can act as a more tangible way into this topic and open your mind to new ways of thinking. Alongside each video lesson, I'll be sharing plenty of resources and links for you to explore each topic. And if you have any questions or feedback as you're going through the course, please do get in touch.
07:52
Other than that, I think it's just time to get started.