Design Justice 101

with Sasha Costanza-Chock


What is Design Justice?

How can we employ it to disrupt power systems supporting the matrix of domination?

In this episode, we interview Sasha Costanza-Chock about the 101 of Design Justice and how we can use it as a force for collective liberation.

Sasha Costanza-Chock is a researcher and designer who works to support community-led processes that build shared power, dismantle the matrix of domination, and advance ecological survival. Sasha is the Director of Research & Design at the Algorithmic Justice League and is the author of Design Justice: Community-Led Practices to Build the Worlds We Need.

Follow Sasha on Twitter @schock.

If you enjoyed this episode, please make sure to subscribe, submit a rating and review, and connect with us on Twitter at @radicalaipod.



Transcript

Design-Justice.mp3: Audio automatically transcribed by Sonix. This transcript may contain errors.

Speaker1:
Welcome to Radical AI, a podcast about technology, power, society, and what it means to be human in the age of information. We are your hosts, Dylan and Jess, two PhD students with different backgrounds researching AI and technology ethics.

Speaker2:
In this episode, we interview Sasha Costanza-Chock. Sasha was one of those people, when we were first starting this podcast, that was one of our dream guests, and so it was amazing for us to finally be able to sit down with Sasha and to find out more about their work, especially around design justice.

Speaker1:
And one of the reasons why we wanted to have Sasha on the show, which we'll get into in a little more detail during the interview itself, was because of the book that they authored, titled Design Justice: Community-Led Practices to Build the Worlds We Need, and you'll see pretty quickly why this book is so relevant to AI ethics and technology ethics at large. But needless to say, we are huge fans of this book, and it's actually available for free online. So if you're interested in it after listening to this episode, if you haven't heard of the book before, we highly recommend going and checking it out. And if you do like it, please support Sasha as an author by buying the book from your local indie bookstore. We have the links to do all of that in the show notes. But in this interview, as I already alluded to, we talk to Sasha about the 101 of design justice and how we can use it as a force for collective liberation.

Speaker2:
For those of you who don't know Sasha: Sasha Costanza-Chock is a researcher and designer who works to support community-led processes that build shared power, dismantle the matrix of domination, and advance ecological survival. Sasha is the Director of Research & Design at the Algorithmic Justice League and is the author, as we already said, of Design Justice: Community-Led Practices to Build the Worlds We Need. Now we're so excited to present this interview with Sasha to all of you.

Speaker3:
It's our pleasure today to be on the line with Sasha. Sasha, thank you so much for joining us.

Speaker4:
Thanks for having me.

Speaker3:
And we're just going to dive right into this conversation about design, power, and justice. I'm going to ask you a very broad question: what are your thoughts on the intersection between these three things, between design, power, and social justice?

Speaker4:
Sure. So I think that design is a set of processes and practices that human beings use to make and remake the world and to plan out what we want to see happen in the future. And so that's deeply linked to power on a lot of different levels, including the ways that those who already occupy positions of power get to design things, get to design the future, get to design technologies, including AI, but also including everything: the built environment, the apps that we interface with, the systems that we use on a daily basis that structure our lives. And so the relationship to social justice would be that if we want to create a world that is more just, where we are working to constantly dismantle structural, historical forms of inequality, like racism and white supremacy, like patriarchy and heteronormativity and cisnormativity, like ableism and abled supremacy, and like, frankly, capitalism and systems of structural class inequality that have persisted over time and have developed into their current incarnations in part through the work of people planning and designing and building systems and structures that maintain them. Well, if we want to dismantle these systems, then we need to think about what is the role of people doing different types of design work in dismantling oppressive systems and constructing new, more liberatory systems, and systems that can help us have a planet that will be habitable for future generations.

Speaker5:
One of the reasons that Dylan and I were so excited to have you on our show is because you explore a lot of these topics in depth in your book Design Justice, and we'll be getting into the nitty gritty details of that book throughout this interview. But to start off, we were wondering: what motivated you to write this book?

Speaker4:
Well, I've been working on this sort of relationship between technology and social movements for a long time, for most of my career. I went through a lot of different twists and turns in my career. Actually, it's not a straight line. So, you know, as an undergraduate, I actually studied electronic music and cultural theory, and I did a lot of work, both as a musician, you know, making beats and making electronic music, and organizing this group called the Tone Burst Collective, which was like a group of musicians and event producers in the Boston area in the mid to late nineties. And we were really into and interested in rave culture. But we were also interested in the ways that we saw it being kind of appropriated or taken up and capitalized on by larger record labels and event production companies and that type of stuff. So we were sort of like politicized rave producers and musicians, and that was a time when I got really interested in this relationship between sort of technology and existing forms of inequality. I remember I read this book called Noise: The Political Economy of Music by Jacques Attali, who's a French theorist. That book is about how changing forms of music and music technology over time have reflected changes in broader social structures and power structures. So I was really into that at that time. From there, I went to my first job after college, which was actually in a public arts project in Puerto Rico, where we were doing public arts all around the city of San Juan in low income neighborhoods through this city sponsored project called Educate.

Speaker4:
And I was part of the music team. And my role was to support a team of musicians to go into different communities and work with people, to write songs and then record them. And so I had this little mobile recording studio, a Roland PSAT, and we produced an album called Sueño Mi Barrio, which is kind of a wordplay; it's basically like the sounds of my neighborhood and the dreams of my neighborhood. And when I came back, I continued in sort of the arts space. I was the assistant director of the Boston Cyberarts Festival. And that was also an opportunity to kind of think about how the cultural sector was influenced by changes in technology. But it was also a moment when I was really looking at questions of race and gender and who was getting access to these forms of what we were calling, at the time, cyber art or electronic art. And I was also getting very involved in the indie media network: Indymedia.org, also known as the Independent Media Centers, was a global network of citizen or community journalists linked to social movements that emerged in the late nineties and went on into the early 2000s.

Speaker4:
And it predated social media as we know it today. So before YouTube, this network of radical independent journalists was figuring out how to produce video documentation and, you know, upload it and share it globally. Before there was, you know, Twitter or Facebook or other large corporate social media sites, we were using free software and hacking together ways for people to publish directly to the web their experiences from within social movements. And that spread very, very quickly around the world: it launched in 1999 as a way for people to do grassroots coverage of the protests against the World Trade Organization in Seattle, and Indymedia spread all around the world and ended up with about 160 local independent media centers, in countries on every continent. And I was very involved in that network. And so through that work, I became really interested in free software, free, libre, and open source software. And I became very interested in the sort of possibilities of these emerging technologies of the net. So between my background in music and, you know, electronic music production, and thinking about and practicing what it meant to use these new tools to intervene in the cultural sector, and then my connection with and learning about free software development through the activities of the Indymedia network and its sort of groundedness in social movements, I would say those two experiences really shaped my politicization around technology and around design and power.

Speaker4:
What came after that was sort of going back into the academy. You know, I got a master's degree from the University of Pennsylvania, from the Annenberg School for Communication, and then later a doctorate from the University of Southern California, also from the Annenberg School for Communication (but the other one). You know, all of my work, my academic work, my scholarly production, has been deeply informed by my personal lived experience as a participant in these social movement networks, as well as in these networks of cultural production, and also by my experience as a trans person. That's some of the background story. I would say that the book Design Justice, of course, you know, fast forward to 2020, when the book dropped, the book opens with a personal story from my own experience of going through airport security systems and the millimeter wave scanner, which is the machine that you kind of step into, and you put your arms up over your head, and it spins around you and creates an image of your body at millimeter resolution. And it's meant to, like, identify risks, right? But as a trans person, in the book I narrate the experience of going through that machine and learning about how the TSA agents who are operating the machine on the other side have a little touch screen that has literally a blue boy button and a pink girl button with those classic, like, bathroom icons.

Speaker4:
And based on how they read you when you're approaching the millimeter wave scanner, they select blue for a boy or pink for a girl, and the scan of your body is being compared against a statistically normalized image of a body that's either male or female, in a binary. And so as a trans person and as a gender nonconforming person, I always get flagged by that system as being more risky. Because if they read me as a boy and select, you know, the boy button, my breasts are too large according to the statistical construct, based on, you know, the fact that I use hormone replacement therapy and I have breasts, so they get flagged. And if the other thing happens, if they select girl, then my groin area always gets flagged. And then that means that I have to have a physically invasive, you know, further investigation by a TSA officer who's going to run their hands over the risky areas of my body. And this is a very common experience for trans people, to be flagged by airport security. It's also a really common experience for people with other types of bodies to have. So, for example, ProPublica did stories about how black women's hair is often flagged by these systems; Sikh men and other folks who wear turbans, you know, get flagged by these systems; disabled people who may have different body configurations, different limbs, or may have assistive devices connected to their bodies will often get flagged by these systems.

Speaker4:
So basically, if you're in a type of body that isn't considered normal, and I'm using air quotes, and the system, the millimeter wave scanning system, hasn't been trained to recognize your type of body, you're going to be flagged for a more invasive search. And if you inhabit a body that's in multiple marginalized categories, so say you're a black disabled trans person, then you're even more likely to be flagged. And then also, that would make you more likely to have more harmful and negative consequences of being pulled out by the security system. Because then there's a human interaction that takes over, and depending on people's assumptions, their racism, their Islamophobia, et cetera, et cetera, you may then be more likely to be funneled into, you know, more harmful potential outcomes. So I use that personal story to talk about, you know, this much broader question of how the systems, the technological systems that we're designing and building and installing and using everywhere, have structural forms of inequality and bias built into them. And the book is like an extended conversation about how that operates and what we can do about it.
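To make the flagging logic Sasha describes a bit more concrete, here is a minimal toy sketch in Python. This is not the actual millimeter wave scanner software; the body regions, reference values, and threshold below are all invented purely to illustrate the binary-template comparison, and the real system works on full millimeter-resolution imagery.

```python
# Toy model of the binary-template flagging Sasha describes.
# NOT the real TSA algorithm: all regions, values, and the
# threshold below are invented purely for illustration.
from dataclasses import dataclass

@dataclass
class BodyScan:
    chest: float  # hypothetical per-region measurements
    groin: float

# Statistically "normalized" binary reference bodies (made-up numbers).
REFERENCE = {
    "boy": BodyScan(chest=1.0, groin=2.0),
    "girl": BodyScan(chest=3.0, groin=0.5),
}

THRESHOLD = 1.0  # deviation beyond this gets flagged as "risky"

def flag_regions(scan: BodyScan, operator_button: str) -> list[str]:
    """Compare a scan to the operator-selected binary template and
    return the regions that deviate enough to trigger a pat-down."""
    ref = REFERENCE[operator_button]
    flags = []
    if abs(scan.chest - ref.chest) > THRESHOLD:
        flags.append("chest")
    if abs(scan.groin - ref.groin) > THRESHOLD:
        flags.append("groin")
    return flags

# A body that fits neither binary template cleanly is flagged either way:
scan = BodyScan(chest=3.0, groin=2.0)
print(flag_regions(scan, "boy"))   # ['chest']
print(flag_regions(scan, "girl"))  # ['groin']
```

The point of the sketch is that the harm doesn't require any malicious rule: a single binary reference plus an anomaly threshold is enough to systematically flag every body the templates weren't built around.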

Speaker3:
One thing that strikes me about that example, and also different examples throughout the book, is how daily some of these experiences are, like how seemingly, you know, quote unquote ordinary these experiences are. Like if you look at, say, the media cycle, you know, the day that we're recording this, the Facebook news is dropping around whistleblowing and all of that, and we don't have to dive too far into that rabbit hole. But you know, you see these moments of exposure of some of these algorithms or some of these systems. But it sounds like when we're talking about design, it's not just those big moments, but also those daily moments. And I'm wondering if you could talk a little bit more, when you think about design, maybe coming from social movements or from music, you gave us a definition at the top, but just a little bit more about design and then how we move into justice space.

Speaker4:
Sure. So, I mean, when I think about design justice, and I talk about this in the book too, I have kind of two parallel definitions or ways of thinking about it. On the one hand, I think of design justice as a framework for analysis about how design distributes benefits and burdens between various groups of people, and very explicitly, design justice thinks about how design can reproduce or challenge the matrix of domination. The matrix of domination is Patricia Hill Collins's term for the intersections of white supremacy, patriarchy, and capitalism, and we can add ableism, settler colonialism, and other forms of structural inequality. So it's a way to think about and analyze different design objects, tools, processes, interfaces through that lens of: how is this particular design either reproducing and/or challenging the forms of inequality that we know are there, structuring our lives in the tiny, everyday micro interactions that we have, as well as in what that adds up or aggregates to, larger, persistent forms of ongoing inequality. So it's a framework for analysis, but also design justice is a community of practice. So it's not a term that I, you know, came up with. It's a term that comes out of this whole sort of, you know, growing group of people who are working to try and make sure that the benefits and burdens of design are more equitably distributed, and also to make sure that more people get to participate in design decisions in meaningful ways, and also to rethink and redefine what design really means: take it out of the realm of just, oh, design is something that, you know, credentialed professionals do or people in Silicon Valley do, and also think about how there are many, many community based, indigenous, and diasporic design traditions and forms of knowledge and practices that deserve more recognition and validation and support.

Speaker5:
Let's go a little bit deeper into, I guess, the second thread of design justice, which is this community of practice, and specifically the word community. Because something that we've talked about on this show quite a lot is that one of the critiques of Big Tech is that it's a room full of a small number of homogeneous people who are making decisions for a lot of people who are not in that room, or at least not represented in that room. And one of the things that you discuss in your book Design Justice is this method of doing design called community based design. Could you explain to us what that method entails and how it can help us promote justice and raise equality in the way that we design technology?

Speaker4:
Yeah, definitely. I think that there's a long history of people trying to think about how can we do design in ways that are more accountable to communities, that are more responsive to and reflective of the vast and beautiful diversity of humanity, rather than just being something that's controlled by a small handful of, you know, frankly, mostly white cis men from particular class backgrounds in countries in the global north. I mean, that's a little bit reductive, but in general. So there's a whole history of participatory design as a practice. Actually, I would say there are multiple threads of where that comes from. So there's a whole sort of Nordic tradition of participatory design, with these classic examples of, you know, unionized print shop workers collaborating together with software designers to come up with new ways for printing to operate, new hardware and software; that was something called the UTOPIA project. There's the Participatory Design Conference that, you know, ran for many, many years, and there are different terms that people will use. So participatory design, co-design, asset based community design, and of course, one of the most famous ones is human centered design. And I think we need to talk about that a little bit; there's definitely a critique to be had there. To me, I think there's something to learn from all of these approaches.

Speaker4:
There are common threads from all of these traditions, which are about saying: instead of having just a smart person in a room who's going to design something and then roll it out to the user, it's a recognition that in reality, people should be involved in the design process. Both because you're going to develop more usable and accessible products that way, products, systems, services, but also because there's a normative component, right? Which is that it's ethically and morally just for people whose lives are going to be reshaped by a particular thing that you're making to be able to have some decision making power about what it's going to be like. I think the difference between some of these different approaches is that some of them cut out the second part of what I just said and focus only on the former. So there's an extractive version of participatory design or human centered design, which isn't interested in sharing decision making power or credit or attribution, and certainly not profits, with communities that might participate in the design of an object. So there's an extractive version of participatory design that says, well, if we get the thing, we get the prototype, in front of the users early on, we can gather some really great insights about what people like and what they don't and how to make the interface easier to navigate.

Speaker4:
So we'll do that. That's just user testing. And then we'll have a better product, and we can then make more money. I'm not saying that user testing is bad. I'm saying that there's another element here, to go from sort of extractive consultation to community led design processes, in that the latter is concerned with not only will the product or service be better; it's also concerned with who's going to get credit, who's going to get profit, who controls the future development of this thing. Because if you're not asking those questions, then a lot of times what you're doing is just kind of extracting knowledge from people. And in the worst case, this is just straight thievery, where, you know, I have some venture capital and some smart people, so I'm going to go, like, basically grab some ideas from a community, and then I'll make a product and I'll sell it back to them. And they won't get, you know, credit or profit share. And they certainly won't get any say in how this is used, what the business model is, who, you know, I'm going to sell their data to, and so on and so forth. So design justice is asking us to think about that set of questions as well.

Speaker3:
And so then how do we do that, I guess, is the question that I'm left with. And I guess it's kind of a two part question, some of which you've alluded to, which is: what are some of the barriers to doing this better? Is it just that there's a system in this capitalistic world that we live in where Silicon Valley just holds power, and so it doesn't open up to everyone? Or is there other stuff going on? And then, how do we move into, I hate to use that term, quote unquote solution space? I guess the question, again, is: how do we do this better?

Speaker4:
Yeah, definitely. I mean, to that, I would say I would encourage people to check out the work of the Design Justice Network over at designjustice.org. So this is a whole growing community of people who are, you know, doing our best to practice design justice in different domains. We have sort of like a chapter based structure: there are what we call local nodes in different places around the country and around the world, who are meeting up to talk about what design justice might look like in practice in that particular location or that particular subfield. People are sharing that knowledge or writing about it or publishing, you know, how to put the design justice principles into practice in your workplace; there's like a zine underway for that. We organize exhibitions of design justice projects, and we do sort of like member feature stories, where we talk about the way that different members of the Design Justice Network are doing this work. So, for example, there's a great member story about a member, Leyla KDDI, who is a luthier, which is a guitar maker, who has a company called Tuna Tones Instruments. And Tuna Tones uses design justice in the creation of guitars, because it turns out that the, you know, guitar industry did standardize measurements around human bodies, and the standardization was around certain types of bodies from certain locations, you know, in the world.

Speaker4:
And so the standardized length of a guitar neck and the frets and the materials, it's all set up to privilege certain types of bodies over others. So Tuna Tones, you know, designs for different sized bodies. For example, for people with smaller than the industry standard bodies, which would be most women and younger people and other people with smaller bodies, the only guitars that are available for them are typically seen as children's guitars, and so they're made with, like, less high quality craftsmanship and materials and so on and so forth. So Tuna Tones will make, you know, smaller guitars, because there are lots of people who, for various reasons, have smaller bodies, and they deserve to have, you know, well-crafted guitars as well. That's a small example of what it might look like to apply design justice in a particular sector or space. And there are many, many other examples. So in the network, we have people doing graphic design. We have people who are architects or planners who are working on, you know, thinking about design justice and the built environment. There are people doing software design and development.

Speaker4:
So for example, there's a bunch of people working on this idea of consentful tech, and you can check out consentfultech.io. Una Lee, who's one of the co-founders of the Design Justice Network, as well as Boaz Sender and a bunch of other people, have been involved in this project, which is to kind of say: OK, we're all having this conversation about data privacy, but data privacy is a frame that, I don't know, it's kind of a mixed bag. It's kind of individualized, you know, sort of very much about individual personal control over, you know, my data. And it doesn't necessarily center the social aspects of how data gets gathered and circulated and used. So thinking about a reframe around consent, drawing from consent culture in, you know, thinking about sex and consent, where you have the FRIES framework, where consent should be freely given, revocable, informed, enthusiastic, and specific, right? So what would it mean for us to consent to use technology, and for our data to be used, through that really strong consent model? So if I give some company or another person the right to use my information or my data, that should be freely given. It should be revocable, so I should be able to say, well, I said you could use my data, but I don't want you to use it anymore.

Speaker4:
So, you know, you must delete it. Informed and enthusiastic, so that's interesting: what would it mean to enthusiastically consent to the use of my data, rather than what we do now, which is click-through annoyed consent? And specific: so basically, instead of just saying yes, you can have my data and do whatever you want with it, what would it mean if we were able to say, well, yeah, you can use my data, but only, you know, to, I don't know, improve the usability of my service. But I'm not giving you the right to include my data as you're training your AI system, or I'm not giving you the right to sell my data to a third party, and so on and so forth. So if you had specific buckets that you could consent to or not. So there's a bunch of people building a framework, like a software framework, for consentful tech that people could then try and implement in, you know, platforms and applications that they're developing. So those are just a couple of examples, but really, it depends on, you know, what area we're talking about, how you apply design justice in different aspects of life.
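As a rough illustration of what the FRIES consent model could look like in code, here is a minimal sketch of a consent record with specific, separately grantable and revocable purposes. This is a hypothetical design, not the actual framework being developed at consentfultech.io; all class and purpose names here are invented.

```python
# Hypothetical sketch of FRIES-style consent (freely given, revocable,
# informed, enthusiastic, specific); not the consentfultech.io framework.
from dataclasses import dataclass, field
from enum import Enum

class Purpose(Enum):
    """Specific, separately consentable uses of a person's data."""
    IMPROVE_USABILITY = "improve_usability"
    TRAIN_AI_MODELS = "train_ai_models"
    SELL_TO_THIRD_PARTIES = "sell_to_third_parties"

@dataclass
class ConsentRecord:
    user_id: str
    granted: set[Purpose] = field(default_factory=set)

    def grant(self, purpose: Purpose) -> None:
        # Freely given and specific: opt-in, one purpose at a time.
        self.granted.add(purpose)

    def revoke(self, purpose: Purpose) -> None:
        # Revocable: withdrawing consent is as easy as granting it,
        # and obligates the service to delete the data ("you must delete it").
        self.granted.discard(purpose)

    def allows(self, purpose: Purpose) -> bool:
        # Anything not explicitly granted is denied by default.
        return purpose in self.granted

# Usage: consent to usability analytics, but not AI training or resale.
record = ConsentRecord(user_id="alice")
record.grant(Purpose.IMPROVE_USABILITY)
assert record.allows(Purpose.IMPROVE_USABILITY)
assert not record.allows(Purpose.TRAIN_AI_MODELS)
record.revoke(Purpose.IMPROVE_USABILITY)
assert not record.allows(Purpose.IMPROVE_USABILITY)
```

The key design choice is that consent is deny-by-default and purpose-specific, so yes to usability analytics never silently implies yes to AI training or resale.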

Speaker3:
And just quickly, on the other part: why are we in this mess in the first place, I guess, beyond capitalistic greed?

Speaker4:
Um, well, I don't know if we could just skip over that so quickly. I mean, it's a big, big question. How are we in this mess? Well, you know, starting 500 years ago with settler colonialism, indigenous genocide, the construction of a worldwide capitalist system, the creation of scientific racism, the elimination of multiple gendered peoples that historically were common across the world and its replacement with an overly simplistic binary system that was imposed through violence and force, through the process of settler colonialism, enslavement, and the transatlantic transportation of African people to provide fuel for the newly birthed capitalist economy. Yeah, I think those are the reasons why we're in this position now. So we have hundreds of years of damage, to each other and to the planet, to living and non-living things, to undo. We have many other forms of knowledge and types of practice to draw from and recuperate, to try and, you know, undo the harm that was done through the replacement of all these many forms of organizing society and ecology with a sort of one world system where everything becomes standardized and scalable. And the vision that there's only one way to organize the planet and our societies has to be challenged and undone. And that's going to be a long process that will probably take generations more. But if we don't do that work, I mean, the answer is that we all die, because the system that we've built right now is destroying the planet and making it uninhabitable for most species, and soon that will include us. So that's the work that we have to do if we want to survive as a species and if we want to try and undo some of the harm and damage that's been done. I don't know if we'll be able to do it, but we have to try.

Speaker5:
Well, that's a perfect segue, actually, because let's talk about what it looks like to try this work. And full disclosure: last semester in my PhD program, I took a class called Community Based Design that was taught by Ricarose Roque, who I believe was actually advised by you years ago for her PhD. And in that class, of course, we spent a lot of time going through your book Design Justice, and we talked about what it actually means to do real community based design and to work with members of a community: to build rapport, to continuously work with them, getting feedback and co-creating over the course of what could even be decades for some of the things that you could design together with the community. And so when I hold that reality of what, you know, meaningful community based design should look like against the reality of the current state of technology, which is rapid innovation and mass scalability and automation, they seem to be in complete contrast with each other. Because you can't really work closely with communities over an extended period of time, at least that intimately, and also scale; at least in my mind, it doesn't really seem to go together. So I'm wondering what your thoughts are on this, and how we can actually effectively incorporate some of this design justice framework into these really, really huge technologies that we're seeing that are global and impacting dozens, if not hundreds or thousands, of communities at once.

Speaker4:
Well, why don't you ask me the question about how to do community based design, if you just took the state of the art class on it? So I'd love to know, you know, more about what you all learned in that class, but everything that you just said definitely resonates with the way that I think about it. So it's about sort of, like, building deep relationships, you know, in a community, rather than, like, parachuting in and grabbing something and leaving. It's about localization and specificity, more so than scalability. I think it's Alex Hanna who has a recent article against scale; there might be co-authors as well. OK, so we just double checked, and this article is called Against Scale: Provocations and Resistances to Scale Thinking, and it's by Alex Hanna and Tina Park. Yeah, Against Scale is basically looking at how there's just this valorization of scalability, especially in, you know, product design and in, you know, apps and platform design. And there's lots of reasons why that exists, because that can be very profitable. But making our decisions about how we work and organize our world with a central focus on profitability has gotten us into this horrible mess that we're in. Lots of things that are terrible for us as individuals, as communities, as a global community, and as a, you know, network of interlocking biospheres, lots of things that are terrible for us, people and planet, are very profitable. And so that can't be the criterion that we use.

Speaker4:
So I think that sometimes it's about just developing small, local, and highly customized solutions, I guess. Another thing I would say is, a lot of times there's no solution necessary, the idea of, like, approaching everything with a problem frame, which we're often taught in design schools or in business schools. I'm a fan of asset based community led design, where instead of starting by focusing on problems that people are experiencing, you can also say: what's working really well? What's something that's fantastic here? What's something that's beautiful or valuable for some other type of reason? And what would it look like to organize an ethical and accountable way to share that with more people? So, yes, there's a scalability element there, but what would it mean to amplify this positive thing that's already taking place, whether it's just, you know, amplifying it locally? So taking something that's working really great in one household or friend group and extending that across the community, or taking something that works really well for one neighborhood and extending it across the city. I think it's fine to think about different levels of scale, as long as we're not always and only making our decisions about what to focus on and what to invest in based on maximum scalability and maximum profitability. And if that means, you know, designing things for just one person, well, sometimes that's fine. And if it means spending a lot of time with people, there's a lot of value in that as well.

Speaker4:
If it means building deeper relationships over just short term and transactional relationships, that's actually something that can be really beautiful as well. So what does it take to do all of those things? What does it take to scale non-scalability? Well, one place we could look to would be public investment. So I think that we've given up a lot of the territory of the imagination around how to design and create new technologies and technological systems to the mythology that the private sector always and only, you know, does it better. So, yes, lots of innovation, you know, happens in private firms, and also lots of innovation happens that's led by users, or just people, in ways that are non-remunerated. Eric von Hippel has a wonderful book about this, Democratizing Innovation, where he tries to actually quantify and analyze the proportion of technological innovation that actually comes from the edges, from people taking products that don't really work that well for them and modifying them and customizing them informally. And then we could think about public sector investment and look at COVID and the COVID vaccines. You know, the underlying science that allowed us to rapidly get the new vaccines out, it's all public funding over a long, long period of time that makes that possible. And now we're in a battle, because the companies don't want to relax the patents or share the production knowledge in a way that would actually make it possible for us to vaccinate the planet and avoid, or reduce the possibility of, future mutations and perpetual COVID life, because they're claiming, well, we created it. Well, you didn't just create it.

Speaker4:
You got massive, massive, massive amounts of public investment, both to do the underlying science and then, even in the last sprint, to deliver. And you see that same pattern repeated over and over again in different sectors. So the internet and the underlying infrastructure of everything we're doing, you know, was developed with public funding. Free, libre, and open source software actually powers the majority of the servers on the planet, and some of that came from just volunteer participation, some of it came from philanthropy, some of it came from the private sector investing in open source development, and some of it came from public funds. So I do think we need to sort of take back that terrain of imagination. When we think about the collapse of Facebook, WhatsApp, and Instagram, you know, this week we're all dealing with the outage, at the same time as the whistleblower dropped all of this info on The Wall Street Journal, and we're having this whole conversation about, you know, Facebook as Big Tobacco, knowingly doing all of this harm to people. Well, if we think about alternatives, to me, and I did a Twitter thread about this the other day, but to me, an alternative is not just to say, OK, so let's leave and let's go to some other corporate site that has the same business model.

Speaker4:
You know, that will just get us back to the same place, even if Facebook does collapse. To me, we need to think about the already existing alternatives that people have built, you know, to those tools. So, you know, get off WhatsApp, get on Signal. Signal emerged, actually, in some ways, from the network of people that were involved in indie media, where we started this interview. So, you know, radical activist hackers created prototypes for end to end message encryption, which became Signal, and that same protocol was adopted by WhatsApp; they also created prototypes for what would later become Twitter, and so on and so forth. So there are constant waves of community and social movement led innovation that get grabbed by and monetized by large firms. But there are constantly other sort of models bubbling up all over the place. And if we restructured public investment and said, well, what would it mean to have sort of heavy public investment in community led design processes to create the technologies that we really need? I think that would start to take us in another direction: to build systems that are less harmful, that are more responsive and accessible, and that actually have built in governance mechanisms that can respond to people's needs, not just to the profit imperative.

Speaker3:
We would be remiss if I didn't mention your relatively new role with the Algorithmic Justice League. And so as we move towards closing, we would love to hear more about that and your vision for what the impact of that role can be on some of the topics that we've discussed here in this interview.

Speaker4:
Definitely. Yeah. So I'm really excited. I think this is actually maybe the first interview where I get to share that I've left my position at MIT, and I am now the Director of Research and Design at the Algorithmic Justice League. The Algorithmic Justice League, or AJL.org, was founded by Joy Buolamwini, who of course, you know, is a field leader in the critical analysis of the ways that facial recognition technologies specifically, and AI systems more broadly, reproduce race and gender bias and harm. And so I've joined AJL to advance the organization's mission. The mission of AJL is to raise awareness about the impacts of AI, equip advocates with empirical research, build the voice and choice of the most impacted communities, and galvanize researchers, policymakers, and industry practitioners to mitigate AI harms and biases. And so, in my new role at AJL, I'm responsible for, you know, leading the team to conduct original research, including community based research, into both AI bias and harms, as well as into alternatives that we might want to develop. And I'm also working on product design. So, informed by the research that we do, we are developing tools and prototypes and examples and other types of resources for people to use to rethink AI systems, and to hold companies and governments to account when they deploy AI systems that are actually biased and harmful, and also to imagine, you know, other possible future ways that AI systems might be created.

Speaker5:
Well, Sasha, congratulations on the new role. We are big fans of the Algorithmic Justice League, and we will be sure to include links to that website, as well as many of the various links and resources that were brought up throughout this conversation, because there were some really good ones, and all those will be in our show notes. But for now, thank you so much for coming on our show and telling us all the ins and outs of design justice, and for the incredible and important work that you're doing in this space.

Speaker1:
We want to thank Sasha again for joining us today for this wonderful and eye-opening conversation, and as usual, now it is time for our debrief of the conversation and our immediate reactions and takeaways. And for me, one of my immediate reactions just talking about design in general is this like hyper awareness that I think like comes back to me only during conversations like this of just how much design influences our world, not only presently, but historically, and also the kinds of futures that we hope to create, or maybe that we don't hope to create, but that are created through our design decisions. So that's kind of like the place that I'm sitting in right now. Dylan, what are some of your immediate takeaways?

Speaker2:
Yeah, I mean, I think I'm sitting with where we ended up in this interview, which was talking a little bit about imagination. And I think that so often, both within technology and other areas of our lives, we're conditioned to think of one way of doing things. Or, you know, for me, I'm a creature of habit, so, like, one way of driving to the grocery store or to school or wherever, and there are so many more options out there if we only take a step back to see them. And I think what Sasha's scholarship really brings to the forefront is that in order to see some of those possibilities, and in order to use those possibilities to really drive justice and liberation, we need to ask the folks who are not currently at the status quo table who are doing this design work, specifically marginalized groups. And so for me, I think where I'm at is really leaning into that concept of imagination. And how do we imagine a new world for everyone? And also just how difficult that is to do, just how ingrained we can become in the way that we do things and the way that we have designed technology. And how do we liberate ourselves from that, I think, is the question. And Sasha definitely begins to answer that question. And now I think it's up to us to take that baton and try to run with it.

Speaker1:
Yeah, that is interesting, trying to imagine even just the day to day things that I know I take for granted all of the time, like the standards and the foundations and these, like, design decisions that were made forever ago that are just so ingrained in our daily lives that we don't ever really think to question them, let alone be aware of them in the first place. Like traffic signs: that was a design decision that somebody made forever ago to regulate the flow of traffic, and I don't ever really, like, question those design decisions. And I feel like it'd be interesting to imagine a world where all of the standards that we currently exist and live within the bounds of, if those were different, if they weren't designed the way that they were, or maybe if they were designed by people who weren't historically in power. It reminds me a bit of, like, speculative and critical design and, like, using design as a way to reimagine this, like, positive future that we hope to live in. And it gets into that, like, sci fi world, which I know we both really enjoy talking about in the context of technology, too. So it's just, it's fascinating, like, having that flipping of the script and the narrative in my head, and just trying to be aware of the things that drive my everyday actions, even if I'm not, like, consciously aware of it on a regular basis, you know?

Speaker2:
Yeah. And one way that Sasha gets at this is by naming community led design practices and exploring how those play out in the real world to dismantle what Sasha calls the matrix of domination. I really appreciate how direct Sasha is in naming this and breaking it down: Sasha talks about white supremacist heteropatriarchy, ableism, capitalism, colonialism, and I think that's really important in this work. One of the reasons why we started this podcast in the first place was to name some of these factors, some of these things that may be invisible within our structures, within different ways that we move through the world without realizing it, and to really, again, name those, and then, through naming it, allow us the space to be able to implement change within that. And so Sasha's work is, I think, really powerful, because it both names this matrix of domination that is present in our design choices, especially when we're not intentional about what those choices are, and then shows how we might combat those through community led design practices.

Speaker1:
Yeah, I thought it was really great in the interview when, I can't remember exactly what the question was, but it was something along the lines of, like, how did we get here, basically, and you were like, minus the, like, capitalism, whatever, like, the structure our society is built on, and Sasha was like, well, I don't think we could just brush past that, that's kind of, like, the foundation of this work. And I just love that, I guess, that stepping in to say, like, wait a minute, wait a minute, that's kind of, like, what's causing all this in the first place. And I totally agree, like, I think that if we can't begin to be aware of these systems that drive our lives, we can't begin to critique them, and then we can't even begin to explore what a better solution might be, or what a better design decision might be, what a more inclusive or more welcoming or accessible design decision might be, if we don't recognize what got us here in the first place. So, total agreement with that.

Speaker2:
And as we close today, we again want to say that Sasha's book is available for free online. Please do check our show notes for that link, amongst other links to some of the resources that Sasha brought up over the course of this interview. But for now, for more information on today's show, please visit the episode page at radicalai.org.

Speaker1:
If you enjoyed this episode, we invite you to subscribe, rate, and review the show on iTunes or your favorite pod catcher. Catch our regularly scheduled episodes the first Wednesday of every month, with some bonus episodes in between. Join our conversation on Twitter at @radicalaipod. And as always, stay radical.
