Episode 19: Ghost Work and the Role of Compassion in Tech Ethics with Mary Gray



In what way does technology make us more or less visible to each other? What is Ghost Work and how might it impact the future of work? How can AI Ethicists relate more intimately with compassion?

To answer these questions and more we welcome Dr. Mary L. Gray to the show. Dr. Mary L. Gray is a Senior Principal Researcher at Microsoft Research and Faculty Associate at Harvard University’s Berkman Klein Center for Internet and Society. Along with her research, Mary teaches at Indiana University, maintaining an appointment as an Associate Professor of the Media School, with affiliations in American Studies, Anthropology, and Gender Studies. She is also the co-author, with Siddharth Suri, of Ghost Work: How to Stop Silicon Valley from Building a New Global Underclass. Mary is an anthropologist and media scholar by training, and focuses on how everyday uses of technologies transform people’s lives.

Follow Mary L. Gray on Twitter @marylgray

If you enjoy this episode please make sure to subscribe, submit a rating and review, and connect with us on Twitter at @radicalaipod.

Relevant Links from the Episode:

Information about Mary’s co-authored book - Ghost Work: How to Stop Silicon Valley from Building a New Global Underclass

Mary’s Personal Website


Transcript

Mary Gray_mixdown.mp3 was automatically transcribed by Sonix. This transcript may contain errors.

Welcome to Radical A.I., a podcast about radical ideas, radical people and radical stories at the intersection of ethics and artificial intelligence.

We are your hosts, Dylan and Jess.

In this episode, we interview Dr. Mary L. Gray, Senior Principal Researcher at Microsoft Research and Faculty Associate at Harvard University's Berkman Klein Center for Internet and Society. Along with her research, Mary teaches at Indiana University, maintaining an appointment as an Associate Professor of the Media School, with affiliations in American Studies, Anthropology, and Gender Studies. She is also the co-author, with Siddharth Suri, of Ghost Work: How to Stop Silicon Valley from Building a New Global Underclass. Mary is an anthropologist and media scholar by training and focuses on how everyday uses of technologies transform people's lives.

A few of the big picture questions that we ask and discuss in this interview include: In what way does technology make us more or less visible to each other? In what way does technology make us more or less understood by one another?

What is ghost work and how might it impact the future of work? What is the role of compassion in changing systems of technology?

It was such a pleasure to be able to talk to Mary about these topics and more especially because Mary wears so many different hats in her research life and in her work life and in her life. She is one of the foremost experts on bringing queer theory into this work.

Also in making sure that we as scholars and also as industry professionals are looking not just in urban settings, but also in rural settings and really trying to widen that circle for where our research is occurring and where we're looking at our downstream impacts. As technologists, it is our pleasure to present to you all this interview with Mary.

So it is our absolute pleasure to welcome Mary Gray to the show. Mary, how are you doing today?

I'm doing good. Thanks so much for having me here.

Absolutely. And as we begin today, we've seen your name come up a lot recently in various different places, so we know that you're super busy. And before we get into everything that you are doing, we were wondering if you could just tell us a little bit more about yourself and, really, what makes you tick right now?

Yeah. I mean, it's funny, because I feel like I keep coming back to why I ask these questions, because often I'm asked these questions around what constitutes ethics, what it means to make change, to create change through scholarship, when it can be so personally frustrating and completely absorbing. You know, I'm trying to remind myself, what is it that brings me to this work? Why don't I go off and do more gardening or dog walking? And it comes back to my earliest work, thinking as a political organizer, as a queer youth activist: what difference does the Internet make? I mean, I genuinely started most of my scholarship, my intellectual work, thinking, what difference can we make as political actors who want to effect change and challenge systemic oppression and injustice? What role will this technology play, a technology that at the time was relatively new and really mainstreaming in the early 90s when I began thinking about these questions, which, as I say that out loud, I'm not sure I want to say that out loud. But, you know, that question is what took me to graduate school. And at the time, I was studying anthropology. Anthropology and Native American studies were my areas of study as an undergraduate.

And they saved me in so many ways. I came out of a part of the country, Southern California, that for the most part had a very set, relatively narrow collection of things that women did or were expected to do. And I didn't really feel like I fit many of the boxes that were available to me. And going off to college was incredibly disorienting. And I latched on to these two disciplines that are mostly about teaching us to think of the world as this constant running experiment of what humans and groups of people can make of the world, how they make sense of the world, how they challenge norms, how they reproduce norms in the same stroke. And so those two disciplines, which actually were at each other's throats, to be honest, are what shaped my taste in questions. They are the lenses, the equipment, that I came in with. And then as I was starting to do political organizing, particularly outside of cities, but with colleagues in cities, too, we started wondering, you know, it is a kind of basic question in anthropology: technologies, tools, are part of what changes society, and we wanted to understand not just the possibilities, but the constraints. What are the conditions that are going to make this technology, which could connect people in quite radically different ways, reorient us to each other? Help people see each other differently? Help them see themselves differently? How will it change the work of building solidarity around questions?

I was wondering if you could say a little bit more about your answer to that question of, you know, what difference does the Internet make? Maybe even as an anthropologist, what makes the Internet, or technology as we think of it now, different than other forms of tools that we have?

Yeah, I feel like the thing that was the hardest to learn was that actually it's never just the technology doing anything. Part of the modern condition is to imagine innovation will change everything, that technology will interrupt or intervene or take some of the weight off our shoulders to do the very hard work of brokering a new way of being in the world. And it doesn't. It can't fix things. It can't change things in and of itself. So, in fact, what I learned, and in some ways it was leaving anthropology formally and getting training in media studies, mostly through these really amazing scholars who were themselves at the margins of their own disciplines in sociology, history, political science.

It was learning that when we're studying, quote unquote, what technology does, really what we're doing is looking for the relationships that we build, the way we imagine technology will find us, bridge our differences, and that it's in that imagining, that projection, those relationships we build with technology and with each other, that things start shifting. So it's never the "it" alone, whatever we want to fill that box with, whether it's a laptop. I think of The Charisma Machine by Morgan Ames, or other really fantastic scholarship on thinking about how much we load up technologies as though they were carriers, keepers, of our dreams.

But at the end of the day, at least in the research I've done ethnographically, if we look at what happens when it lands in people's lives, it turns out they pick up those technologies and they are placing them, reworking them, in the relationships they already have. They're working through the constraints they already deal with every day.

Mary, you seem to wear a lot of hats, and I see your work in academia and also in industry.

And you come from this Native American studies and anthropology background. But you're also kind of labeled as an ethicist. And I know that with that many titles, there's probably a lot of research questions and projects that you're actively working on that you have worked on in the past. So I'm wondering if you can unpack that a little bit for us and for the listeners of what are the questions that you are actively asking right now?

Yeah. Hey, Jess, I realize I didn't say hello at the beginning. I think for me, the through line, and I want to be really honest about this, is that I often look at a project that I'm drawn into, and then, usually after I'm halfway through it, I reflect on how it connects to the broader questions that concern me. And it might be this:

I'm really asking the same question over and over again, which is: in what ways do technologies make us more and less visible to each other, more and less understood by each other? And in answering that question, or arriving at some tentative sense of an answer to that question, I'm looking at what are the assumptions, the tacit theories, we're working with all the time that are blown apart when we manage to articulate our identities differently, say through web pages or through our tweets. Or, in the case of other research I've done around labor, thinking about how technologies that are effectively now worksites distributed around the world remake the relationships that people have to their visibility and their coherency as a worker, and seeing the incomplete, the partial, project that comes with what we think people should do with certain technologies, and then this groundswell of how people manage, where that sense of power, control or clarity gets noisy and falls apart.

So some of the hats that Jess mentioned were anthropologist, and then also this ethicist world.

Jess and I are both people who are trying to bridge some of that humanities plus technologist gap, or maybe it's not a gap.

And we would love to hear some of your thoughts on that, on wearing those hats. So do you identify as an ethicist? Is that something that you bring to work? Do you identify as an anthropologist? Or do you just not like labels?

I like labels. I mean, in queer fashion, because I ultimately did come out as queer, I think of queerness as a practice, not really an identity. I'm pretty anti-identitarian. And, you know, for other scholars listening to this, in some ways it's understanding what kind of work can I do that will resonate with particular communities that do have a very cohesive, in some ways constrained, sense of identity, and be able to translate or make sense to them. You know, I think the troubling thing about the label of an ethicist is that it suggests that it's a way of thinking or a set of theoretical frameworks. And to me, ethics is a praxis. I mean, it is both practice and theory, and it is a constant questioning of how do I arrive at "this is what I'm supposed to do." That is at the heart of most of what interests me. I mean, queer studies is one of my favorite go-to toolkits for: how do I look at what is assumed to be normative, typical, and interrogate and contextualize how we arrived there? What's the genealogy of arriving at "this is what we're supposed to do"? Because in all cases that's a story about what we're supposed to do. It doesn't make it less valuable or important, but it's a deliberation of different people, stakeholders, who are looking at what do we believe we are supposed to do. And to me, the radical intervention of queer studies is to always pose the question: we arrived at this as a narrative. It's not a given. It's not natural. It's not how things should be. It's how we can make them. It is to constantly open up the project of "this will be how we make it." You know, I think I lay claim to, or maybe pulled toward me, the ethicist's hat.

I did that because, you know, now that I'm in an industry setting, my hope is that I can say there are established practices within critical, humanistic social sciences that could apply here. You know, we mostly are a world of data-driven insights when we're talking about computer science and engineering, and I think one of the most valuable contributions of critical humanistic social sciences is that it can say, in all cases, those data points are really stand-ins for you and for me and for us, and it can flesh out and populate and humanize that world so that we never escape the reality of what we're talking about. It's really the debris, some instantiation, of our social exchanges. And there's a set of disciplines that are really good at reading the tea leaves of social interactions and social exchanges. So it's bringing that domain expertise of seeing data as really material to read, as moments of social engagement, that I hope is what I can bring to the conversation around AI and ethics, definitely.

And speaking of data, I know that data is so important for part of your work, especially when it comes to your book Ghost Work: How to Stop Silicon Valley from Building a New Global Underclass. And first, it would be great if you could just kind of introduce the book and what it's all about. But then I specifically am curious what your thoughts are about the importance of data, and how you think that it's representative of humans and social interactions, and more than just a number.

Yeah, I must hat tip a few people. Folks should read Lisa Gitelman's collection Raw Data Is an Oxymoron, or read Ruha Benjamin's Race After Technology. You know, there are a number of fantastic pieces of scholarship that help challenge the notion that data is something static. It is also a narrative. We have a story around what is valuable. We have a story around where data comes from. And I say that so that we see it as something to interrogate, to not naturalize. Again, queer studies is fantastic at constantly calling the question on where we naturalize something as a given, to be taken. Right? So the thing about data is that it's presented as, it's there, it's just there. It's sitting on a table. It's sitting on your desktop. Why can't I have it? And so Ghost Work is really, you know, again, for me, a continuation of the questions that I love to ask.

It was coming to Microsoft Research, having finished a project before that, Out in the Country, that was thinking about rural young people who were using the Internet to come out and connect to queer communities. It was getting to Microsoft Research and understanding that artificial intelligence had a particular narrative about being able to replace people, and wanting to understand how that would work by looking quite literally at what's being built. What are the kinds of algorithmic models that can discern and divine human decision making that are going to replace people? What are the assumptions about what people bring to human decision making that are embedded in them? And so the beginning of that work was really just asking, so how is this stuff built? What is artificial intelligence? And I did not know, if we think about machine learning, I had no idea, how much the development of artificial intelligence is really attributable to so much user-generated content being sucked up by tech companies, content that could effectively replace the very, very difficult work of annotating and cleaning datasets so that they had a ground truth of what exactly a person decided, and could have many versions of what a decision looks like.

This is a clip to crop to the right. This is someone who said, this is a cat. And then at the same time, not realizing that even after, you know, a tech company can suck up all of that data that we are unaware is being taken from us, from our social exchanges, in most cases it still requires a ton of people hired in the moment to annotate that data and make sure it's a clean training dataset for developing a model of decision making. So that blew my mind. And the book really started with asking, who are these people you're hiring, my dear friends who are developing algorithmic models for human decision making? And when they answered, you know, "I don't know," I was a bit surprised. And then I was quite compelled when I would continue to ask that question, and the responses to that question ranged from "I don't really know" to "human computation means I don't have to know. I can just have somebody do this task and not have to know who they are. And many people are doing it. And then what I need comes back to me. It's amazing." And then meeting the folks who actually felt both curious and concerned that they don't know, or were afraid to find out, who wondered, what are the conditions of somebody doing this work? And this was like 2012, 2013. And that is when I met my co-author and collaborator, Siddharth Suri, who was the first person I met who did quite a bit of behavioral experiments using human computation, and who, when I asked, "Do you know who these people are?" said, "I don't know, but I'd like to find out." That was really the beginning of it. And it hooks to your question, Jess, of what constitutes data, like how do we construct data? And it turns out it's not just through the human decisions that we're looking at and pulling off of myriad platforms, social media and other data sources, but through human labor that's integral to making it valuable.

One dynamic that's coming to mind for me, and Jess and I have been talking about this quite a bit in our own work, is that we sometimes have some critique of Silicon Valley or some critique of the tech industry.

And at the same time, we have colleagues that are within it. We've both been within it ourselves. And we're talking to you, who is, you know, working for Microsoft, which some people see as one of the pillars of Silicon Valley, while you're also coming out with a book that's saying, you know, how to stop Silicon Valley.

And I'm wondering how you navigate that space of being kind of part of it and also able to critique it.

I want to say, I hope everyone, when they read this book or they hear me speak or they have a chance to connect at a conference, asks that question: what's the relationship between the work I produce and being funded, underwritten, by an institution that produces these technologies? And the reason I think it's an important question to ask is because all scholarship is located somewhere. Some institution is funding and setting priorities for researchers. And when I was at Indiana University, full time on faculty there for most of my first book, finishing that book, I was in a very particular setting that, you know, resonated with me, resonated with that project.

I literally could, and did many times, take that work back out to the country to talk with young people with whom I'd worked. So for this project, being in an industry-funded, you know, basic research lab meant I had to come to grips with what exactly is my project here, what do I want to speak to, and who is my audience.

And interestingly, you know, there's a real luxury here at Microsoft Research in that it's really the last industry-based lab that I'm aware of that has no check on its research. It's not vetted through a legal department. PR appreciates when we let them know before a publication comes out that might be seen as critical. Microsoft itself does not take a stand on our research, precisely so they cannot comment pro or con on what we find. And I think it's actually unique to Microsoft Research, precisely because it's in a funny spot as a tech giant. One, it's not Silicon Valley. But at the same time, you know, it is an elder statesperson in this setting. And it has the luxury of having built most of its wealth on technologies that are seen as closed systems, you know, as kind of value-neutral. Now, we absolutely all know that's not true, and if you're coming from a different position, we can absolutely talk about the politics of all of this software. But importantly, it really took on the mantle of Bell Labs. It fed this basic science called computer science when it was just coming up. So it has always held this very particular position, carried the mantle of "we are here to advance this science."

And I think that me joining the lab, with my colleagues Tarleton Gillespie, Kate Crawford, danah boyd, Nancy Baym, we're an interesting set, the Social Media Collective. We stand out. But in some ways it's a signal of a few people within Microsoft Research saying, gosh, this was 10 years ago, gosh, we are starting to do stuff that seems to engage society. And of course, when we came in the door, we were like, yes, you have been all along, let us talk to you about that. But we could probably all say, and agree, that so much of technology now is more obviously the socio-technical system we've all been waiting for in science and technology studies: it shapes and is shaped by society. And I'm emphatic about that, because I don't think that's obvious to our colleagues in computer science and engineering. And I think it's actually important for political work to have a full spectrum of agitation.

I used to be a big rock thrower, and right now I feel like I'm part of a political work that's there to offer explanation, and to hold my judgment for as long as I can, to hold as much compassion as I can for colleagues who are just learning that they're actually building social systems with consequences, severe consequences. And, maybe the next topic for us, I actually believe they can be doing so much better. They just have never been taught how.

And the social consequences and this shaping of society is something we see from any huge tech company.

This is something that you've talked a lot about in your work, and something that Dylan and I actually really appreciate is that your work is advocating for these ghost workers and for so many invisible people. And I would love if you could talk a little bit more about what some of that ghost work looks like, even in a modern-day scenario, where we're here on Zoom because we're living in the Covid-19 pandemic. So what's a current application where we can see ghost work right now, living in the time of Covid?

Yeah. So the work that we were studying, it's both the work of data labeling, captioning, translation work, content review and moderation that is meant to, as we were earlier discussing, clean up training data so that it can effectively be used for machine learning. But that's one strain of the work. Much of what we were studying, and what I want to connect it to, is this other stream of work that is the continuation of the casualization, and really the dismantlement, of full-time employment.

So, to see that broader trend: what happens when you take technologies, algorithms, application programming interfaces and the Internet, and put those to the task of dismantling a full-time job of office work and turning it into a string of tasks better distributed, and reorienting so that what you're accomplishing in that moment is really aggregating people quickly for a snap judgment, a moment of human intervention, and then letting computation continue the process of delivering a service. So, content moderation. If I'd said two years ago that content moderation is a thing that people do, I don't think most people would have been able to understand that. And I actually feel like we're just now coming to this awareness. That's a great example. And I'd point everybody to Sarah Roberts's fantastic work on this, Behind the Screen, and also Tarleton Gillespie's work, Custodians of the Internet, to realize technology companies also have been buying into this story that they will be able to automate human decision making in these critical functions like content moderation. But it turns out, and any humanist will catch this in a moment, that it's actually quite difficult to discern when something is being said out of love or hate. Right? So capturing and automating the filtering of hate speech, that's just spitting in the wind, precisely because so much of language constantly changes, and so much of whether what we experience is hateful or loving is contextual.

Who says it, when, why? And that's the kind of work that we were studying that's turned into that second stream of contract work: this very quick combining of computation and human creativity and snap judgment. And when I say snap, mostly it's thinking, I need to have somebody answering requests very quickly. That's true of telehealth. That's text-based customer support. It's quite mundane, and it's that world. I like using the phrase "information service work" to see how much information service work is now the way in which service economies draw in digital tooling to be able to route requests, to be able to bring teams of people together to work on a project, and, when you need somebody who has a different set of programming skills, for example, to bring them to the table. That's the world of work that we're describing that is under the surface, that most people do not track or see. And it isn't a 20-year career at a firm. It is being a contract worker to fill a particular job that is time-bound, and then moving on from that work. And technology can effectively be applied, in many cases, not just to hire, but to devalue the people who are doing that work, because they come in, they come out, they come in, they come out. We know the economic principle of churn works against workers' best interests.

So we're living in a really, I'm going to say a really incredible time in history right now. Incredible in all the different ways that we could define incredible. And there's a lot of processes happening in parallel. There's a moment of reckoning around race in the United States right now.

There's also, as I mentioned earlier, the Covid-19 pandemic we're living through.

We have folks like yourself, and like our former guest Lilly Irani, talking about, you know, Amazon Mechanical Turk and how we produce research.

And I'm wondering if you see these all as part of a similar pattern or how you connect all of these various things that are happening in this, quote unquote, incredible time that we're living in.

So to me, it's the collision of these two structural changes that have been going on for 20 years. And in the book there's a history chapter that really takes it back much further. But in more recent times, and particularly through the Internet's capacity to distribute information, we have the rise of the use of the Internet to contractualize whatever can be contractualized: any task you can find on your desk that can be passed out. And that can mean a company can look for cheaper labor elsewhere that can fulfill the work that needs to be done. That's the heart of Lilly Irani's Chasing Innovation, seeing where we start moving the centers and margins of value globally. But there's this second significant structural shift, which is every business, not just niche jobs, every business saying, oh, I've got this mechanism where, as I strive for, you know, some sort of automation of a particular business process, for marketing, writing, branding, podcasting, you name it, I want to figure out which of the pieces of what I need to do I can identify as rote, and then script them and automate them. It's the surge of that approach across all sectors: I want to break off those pieces and turn them into something that is not part of my full-time employees' day job. And so it's watching the collision of those two realities arise in service economies, in information services, that mostly define economic activity.

And that is the delivery driver who is picking up a food order and bringing it to someone's home, as much as it is someone who's doing telehealth, who's a nurse navigator helping me stay on my medication. Right? See those two as workers within service economies that now turn to digital technologies and tooling, but who are also very much information workers. And then there's that second structural shift of more and more companies saying, to meet my service economy needs, I'm going to use this mechanism that can source, schedule, manage, shift and bill what I need done through a mixture of the Internet and some software. I mean, I want us all to think in really mundane ways when we talk about AI. We're not talking full automation. We're mostly talking about some businesses that chase full automation, sure, you can bet banks are chasing full automation, and most of work being hit by the semi-automation and taskification of their day job. And that's the thing I'm hoping people will track: the accountability that I want the tech industry, and the engineers building the platforms and the software and all of that, to see is, you are constructing a worksite. You're building a worksite. There is no point in history where work conditions were ever good

if they were left to be determined by the people doing the hiring or by the people who were doing the tooling. We have always needed work conditions to be defined and fortified by policy, by our social safety net. So if you think about it, everything built for our social contract and our safety net around employment is actually hooked to a very specific technology: the factory floor. Right? So if we see that environment as a particular kind of technology, that means there are social relations that are shot through the structuring of people's work. I'm not just talking about Taylorism. I'm talking about the way in which we imagine who will enter that building, how we incent more people to have opportunities to enter that building, shift work as a way of organizing the day, all of that. That's what's radically changed in this setting, and really, arguably, around the globe, and that's why we started in the United States. It didn't change for many people who had been living in informal economies, cash economies, that exchange services every day, and now they have a mobile phone that helps them do that. So to me, it's orienting to the idea that we might have just experienced a blip of seeing full-time employment as the way to orient our social contract. And this way of organizing on-demand markets through technologies is, for me, an opening up of, how do we think about equipping every working adult, so that we have a set of essential benefits that are not a part of a factory floor or a timecard?

Yeah, I want to keep digging into this second structural change, actually, talking about automation and specifically the future of work. I'm curious how you think automation will impact, and is impacting, the future of work in terms of, you know, more jobs, fewer jobs.

Is this good? Is this bad? Yeah. What is your opinion on this?

Yeah, it's really interesting, because when we first started this research, it was the beginning of the rise of the robots: Martin Ford's work, you know, plenty of really interesting books that were mostly speculative. They were all written by economists or technologists who had kind of fever dreams of robots taking over. And I think that, to me, is the value of the methodology we take to the study of what work looks like today, if we're applying this mechanism that more and more companies are applying to distribute the work they need done.

What that suggests is that the future of work is more about: what will the organization of an employment relationship look like? How will it be algorithmically managed, in part?

Right.

That tracking back, to me, helps us understand the future of work. That's why I think the present is a better place to look. Well, what does that look like? It looks like people not having a reliable person to call when their account is closed or suspended for no good reason. Well, that needs to be fixed. It looks like not thinking very deeply about what work needs to happen right now, thinking: is what is on Mechanical Turk actually a task that needs to be time-bound? So in our research, realizing there were plenty of projects where they could have communicated, I don't need this right now, and somebody could have taken a break. They could have built in a pause button. Why isn't that pause button there? Because they weren't thinking about it. Platforms were not thinking, why should we provide that? The thing that blew my coauthor's mind: just how many people collaborate on a platform. We have zero clue. And now our research shows how much the people who are most invested in task-based work are off-platform, providing social connections, mentoring, support that you would expect to find in any office setting. Of course that's happening. None of that's facilitated. Why? Not because it's not technologically possible, but because there's so much conservative reading of the current legal statutes, and a lack of legal direction on: if you support people collaborating, are you an employer of record? Are you liable for what an employer should be providing? And because most firms, especially small firms, are working with margins that mean they're not going to hire everyone as an employee in the United States, because that comes with, right now, a set of benefits that must be provided by an employer. That's stupid.
I just want to, like, if I had to say what's unethical, because I rarely just call anything unethical: it's unethical to leave something as fundamental as health care to whether you happen to have a good job or not. When we look back in history, we will see just how cataclysmic a decision, and how unethical a decision, that is.

And at the end of the day, I don't believe this is about good or bad intention or good or bad thinking.

It's: what are our actions? What will the consequences be? How do we constantly interrogate where we are to move in a direction of where we want to be? What is the deliberation around that shared vision of the future? What does it look like? So I do see more. I mean, I'm not projecting; like, what has every business learned in the pandemic?

Oh, some of my work can actually be accomplished by people who are working outside of a work site. They don't need to be in a shared work site. So what can I offload to a form of employment that does not cost me a work site? That is neither a good nor a bad thing.

That's a thing that's happening. What do we do about it?

One of the ways that we see this Radical AI podcast project is as a political project. And by political, I don't mean partisan, right? I mean, you mentioned Foucault earlier, right? Like, we're talking about power here, and systems of power, and how those interconnect.

And I'm really curious for you.

Because you mentioned your political history, and it still seems like you identify as a political person. And you've also thrown in the word radical a few times here. And we always like to ask our guests,

as, you know, the Radical AI podcast, what you mean by that and whether you identify as radical in some way.

No, I don't think I am radical.

I mean, I wish I was, and I have been in the past. And I think a radical political actor carries the banner, perhaps sometimes with quite a bit of friction and anger, to call out what is not happening. And, you know, I'm saying this out loud; I haven't really thought of that. I don't think we have a good sense of what we think radical is anymore. I'm very much moved by pragmatism; that's some of my training, a grounding.

I believe in nonviolence. I think that's a radical act.

And at the same time, I feel like the role I have to play in the setting I'm in at the moment is precisely to be a bridge builder and peacemaker: to be able to absorb the critiques of my colleagues who want more to change faster, and to absorb the critiques of my colleagues who feel like they are being pushed and don't know what direction to go, and to be able to listen.

To have compassion.

And to sit with the discomfort of wanting things to change and knowing there are people who are bearing far more of all the risk and the loss of things being the way they are.

Before we began this conversation, we were talking a little bit about what we wanted to talk about in this dialogue, and we mentioned a little bit about possible color-blind racism that's happening in technology spaces, and also the well-being of healthcare workers.

But I kind of want to frame that question in terms of this idea of compassion, because when we talk to folks, even just informally, it's really hard to have compassion.

I think it's really difficult. And I think, in the public spaces that you interact with, you're really modeling that for folks. So if you can just solve compassion for us in the next few minutes, that would be wonderful. But how do we have compassion for this world that we've been given?

It's a team sport. So I actually believe that compassion isn't one person's job. It's: how do we, within settings where we want to create change, have agreement or negotiation, or create space for each other to act in different ways and understand where those actions are coming from? For me, it's cultivating a sense of understanding so that I'm not assuming someone is acting in malice; that in fact that individual, even when I feel like they are doing something terrible, believing, and it is a belief, it's not a truth, believing that there is some capacity for them to change, and that I will be changed as they change. That is what feeds my ability to sit with where things are, not in a static way, but in knowing that in a move where I am telling somebody what to do, that is its own kind of violence. Because I feel like right now, you know, I don't know anybody whose heart doesn't feel broken over and over again.

It's actually been quite surprising to see the mainstreaming of white people caring differently about black lives, and caring differently about their own complicity, their own complacency, their own participation in circulating oppression.

And I feel like that, again: I think one of the most important things I learned when I came up and out through queer studies and through the queer movement was this understanding that it's not about finding how we're all the same under the skin. It is genuinely saying: we are so different, and I am better for taking in your difference and understanding your difference. I am a better person for that. And that compassion, to me, does come from this place of believing in the training I received, and in thinking about my role, the racism I perpetuate every day as a white person. As I was saying to somebody this week: I don't transcend that in my life as an individual. That's not how this works. It is constantly coming to that awareness.

Now, what I change, it's not going to be about me as an individual. It is being part of this collective sense of wanting things to be different, because it is so diminishing to have power because of historical violence, and there's not a person who escapes that. And that's true of every social norm. So, you know, straight folks out there: accept that every day you are part of what perpetuates oppression. And then what do we do about that? I think that's where the compassion comes from: accepting that there's just no way to move in the world as a human without perpetuating systems that naturalize how things should be. And in that, that means there is destruction and violence against folks who don't fall within those categories. And here, to me, is the connection to ethics: ethics is precisely coming to that awareness, sitting with that awareness, and being part of reflecting on how we can do otherwise.

Like, to let it be this open question that's constantly remaking itself. You don't get to land on "debias the data" and be done. We moved really quickly through that. I see these moments where I'm like, OK, some older white dudes aren't with me, but that's OK. I would say most of the folks coming up, and most folks listening to this podcast, understand there is no "debias the data" after which we get to move on. And this goes to the core of what we were talking about earlier. That means much of what we have to learn is to stop thinking about this as a data set to perfect, and to assume that this is about constantly rebuilding our social relationships with each other, and that technologies have always been a part of that. And that is what we come to when we're building anything: that we are entering a social world steeped in pain and oppression and beauty, and there is no escaping the tensions across those things. So maybe that, for me, is where compassion comes from, because then I'm not looking at somebody who's saying something super frustrating and completely dismissing the possibility that they might change, or that our conversation might change someone who's listening.

As Dylan and I have been running this project, especially since we're situated in the ethics space, we see this come up again and again: not just compassion, but this discomfort, and learning to sit in that discomfort and be okay with it and accept it, so that we can utilize it and do something with it and create critiques that are actually actionable. And so we definitely appreciate your definition, and also the work that you're doing on all of this. But unfortunately, Mary, we are out of time for this interview. So for folks who would like to engage more deeply with your work and maybe get in touch with you, what's the best way for them to do that?

Probably at me on Twitter, @marylgray. And you can look at my website, marylgray.org. And if you want more about the book Ghost Work, you can go to ghostwork.info; it has a primer, and it's got some materials for people who might be teaching the book, or any questions people have.

Well, thank you so much for coming on again, Mary. It's been a pleasure to have you.

Thanks so much for the invitation.

Again, we want to thank Mary Gray for coming on the show today and for a wonderful conversation in which, again, we covered so much ground. I felt like there was so much content that I wanted to dig into, more than we had time for. Mary is just such a brilliant scholar and such a compassionate person, and we could talk more about compassion in our conversation. But Jess, I'm curious, what stood out to you in this interview with Mary?

First, I think I'd just like to say that talking to Mary made me feel so comforted.

It just kind of felt like we were having a conversation with a colleague and casually chatting about, you know, the future of work. And those aren't really casual conversations. It was just a very calming conversation, which was nice, because sometimes talking about these things can be a little bit uncomfortable, and it can, you know, raise feelings of anxiety or fear for the future. And so it's kind of nice just to be able to have those conversations and be, not emotionally removed, but to feel supported by the people that you're having those conversations with, and to have that safe space to be able to explore topics that might make you feel a little bit of discomfort. But I think one of the topics that really stood out to me was when Mary was explaining her position of being in Silicon Valley, but then also being critical and critiquing Silicon Valley. And that's something that I just think is really special, especially for someone who wears so many hats like Mary does. There are a lot of people who are in positions in Silicon Valley, whether they're in positions of power or maybe feeling disempowered by wherever they lie.

It's interesting to be able to take the critique that many people have of Silicon Valley and potentially have the ability to do something with that when you are, quote unquote, in the belly of the beast. And it reminded me a little bit of the advice that Miriam Sweeney gave in our interview with her about the importance of being critical. And I loved Miriam's words, too, because she was talking about how being critical and critiquing things doesn't have to be something that's, you know, negative or super pessimistic. It can actually be a really great way to hold the things that are so important to you more accountable. And she said, and this is like a direct quote from her interview, that it just means you're holding those systems, you know, politicians, companies, to a new accountability standard. And I felt like Mary had some really great insight on what it feels like to be in that position of critiquing things as well as, you know, being funded by them and doing work for them and being situated within the things that you're critiquing. And so it's definitely an interesting lens to view that critique from. And I've said critique about 20 times.

So I'm going to pass the mic to you, Dylan.

What were your immediate reactions to this interview? Mostly just critique, for laughs.

Like, when Mary gets on the call. So, whenever you get on a call with someone, like a Zoom call, and you haven't talked to them before, and maybe it's taken a while to get set up, and they're a big name in the industry and in the world, like Mary is, you always kind of wonder how those first five seconds are going to go: like, what energy are you going to bring to it?

What energy is the guest going to bring to it? Maybe they're having, like, a chaotic day or whatever. And this was one of those times where we got on the call, and, I say this as a minister, I immediately felt ministered to. There's just something about Mary's presence in this space where I feel like she had so much immediate respect for us, not just as colleagues in a professional sense, but just as people who are trying to figure out the answers to these same questions and doing it in a collaborative way. And that was the first thing that struck me: her presence.

And I think this came up when Mary began talking about her role as a senior principal researcher and as a professor, and, again, all of the many hats that she wears: how she sees herself as a bridge builder and a peacemaker. And I was really taken aback when she said that, because of what I was expecting. This was in the radical conversation, right, where we asked her what radical meant to her and all that.

And I really had expected her to be like, oh, yeah, this is what radical means to me, and this is how I situate myself in it. And I just felt really disarmed by her understanding of her role as a bridge builder, peacemaker, and pragmatist in that space.

And I think I'm still processing what that means, and what that looks like maybe for me, because that's how I want to see myself in that space. And it's like, well, how do you do that? How do you bring the pragmatism together with the peacemaking and with the bridge building? And I think what it sounded like for Mary is that it all comes together in this concept of compassion and the ability to have difficult conversations, to sit with discomfort, but to remain at the table with one another, even through all of that discomfort. And I think the last thing that I would like to highlight from this interview is the conversation at the end about how we're all complicit: about how, as a straight person, I am complicit in the oppression of the LGBTQ+ community.

As a white person, I'm complicit in the oppression and the marginalization of the black community. And, you know, the list goes on. And I think that was the most uncomfortable part of this conversation for me.

And maybe it was for you too, honestly, listeners, when you heard Mary talking about that. And I think that's so important, no matter where we are situated, in the academy or industry or just as people new to the space. That is so pivotal to doing the work and being invited to do the hard work to make a more equitable space.

And maybe this is a great opportunity as well, Dylan, for us to invite ourselves, and also our listeners who might be feeling a little bit of discomfort at some of the things that we talked about in this interview, or in any interview, or just topics that come up in the ethics field, to sit with that discomfort. This is something that we really encourage you to do as you listen to some of the conversations that we have on this show. If you feel like you're pushing back on an idea, if you feel like you're being pulled to challenge your own opinions or predispositions, sit with that discomfort. Question why you may think what you think, and invite yourself to feel compassion.

And I think that's one of our hopes for this show: to invite folks, especially through the conversations started with our guests, to invite listeners into these uncomfortable spaces, because, again, that's where the work is.

That's where the growth is. That's where the development is. And honestly, with a show called Radical AI, if we're not making you uncomfortable at some point, we're probably not doing our job. So we would love to hear about it.

You can send us an email if you don't want to say it publicly, or you can tweet at us, about ways that some of these conversations are maybe pushing your boundaries, in either wonderful ways or, honestly, difficult ways. I know Jess and I are pretty open on the show about different conversations we have and the ways they're challenging our own assumptions and biases and all of that. So we would love to have your commiseration, but also to have you continue those conversations. And for more information on today's show, please visit the episode page at radicalai.org.

If you enjoyed this episode, we invite you to subscribe, rate, and review the show on iTunes or your favorite podcatcher. Join our conversation on Twitter at @radicalaipod. And as always...
