Casteist Technology and Digital Brahminism with Thenmozhi Soundararajan and Seema Hari



What is casteism and how does it play a role in Silicon Valley and technology development? What is digital brahminism? How do these impact and harm individuals and communities?

To answer these questions and more, we invited Thenmozhi Soundararajan and Seema Hari to the show.

Thenmozhi Soundararajan is a Dalit rights artist, technologist, and theorist. Currently, Thenmozhi is the co-founder and Executive Director of Equality Labs. Seema Hari is an engineer and an anti-caste and anti-colorism activist.

Nikhil Dharmaraj (one of RAI’s Spring 2021 Interns) was the moderator for this conversation.

Follow Thenmozhi on [Twitter @dalitdiva] [Instagram @dalitdiva]

Follow Seema on [Twitter @SeemaHari] [Instagram @seemahari]

Follow Nikhil on [Twitter @nikdharmaraj] [Instagram @nikhil.dharmaraj]

If you enjoy this episode, please make sure to subscribe, submit a rating and review, and connect with us on Twitter at @radicalaipod.



Transcript


Speaker1:
Welcome to Radical AI, a podcast about technology, power, society and what it means to be human in the age of information. Today, we have a special surprise for all of you. Your typical hosts, Jess (myself) and Dylan, will not be conducting the interview in this episode. We have a special surprise host instead. If you've been following along with our podcast and organization for a while, you might remember that over the last six months we've been running a Radical AI podcast internship with two amazing interns, Nikhil Dharmaraj and Lena Wang. Both of these interns have been working hard over the last six months on several projects, including a series of roundtable dinner discussions about radical visions for technology, which was led by our intern Nikhil, and also, in collaboration with the White House Foundation, an entire curriculum and set of resources on technology and power, which was led by our intern Lena. We were so happy to see so many of you at the roundtable events that Nikhil led over the last few months, and be sure to stay tuned for more information about Lena's projects and curriculum in the coming months as they're launched. And speaking of the incredible work that these interns have done, today is a bittersweet day, because today marks the final day of the RAI spring 2021 internship. To mark the celebration, we are releasing our very first interview conducted by a guest host, one of our interns, Nikhil Dharmaraj. In this episode, Nikhil speaks to Thenmozhi Soundararajan and Seema Hari about technology, casteism and surveillance. Thenmozhi is a Dalit rights artist, technologist and theorist. Currently, Thenmozhi is the co-founder and executive director of Equality Labs. Seema is an engineer and an anti-caste and anti-colorism activist. And now we'll hand it over to Nikhil to take it from here.

Speaker2:
Hi all, welcome back to another episode of the Radical AI podcast. I am so excited to be joining you all today with two incredible guests on the show, Seema Hari and Thenmozhi Soundararajan. Today we're going to be having a very important and fruitful conversation about caste, technology and surveillance. So thank you so much, and welcome, Seema and Thenmozhi. To go ahead and kick off this conversation, I wanted to start by asking you all: how can we conceptualize networks of caste and the production of technology, both in South Asia and the South Asian diaspora? And maybe as we go through this conversation, also for our listeners who don't know, it would be great to give a working definition and understanding of caste frameworks in general.

Speaker3:
I think that this is such an interesting question, because this has been the heart of the work that Equality Labs has been doing for the last five years: really talking about the ways that we're seeing caste apartheid reorganize itself within the digital space. When you think about the work that Professor Noble has done to illuminate how structural bias is embedded in code and embedded in supply chains and embedded in workplaces, those are all of the ways that we're seeing caste be reorganized in digital realms. And this is something that I often talk about, in terms of the theoretical work that we do around caste and tech bias, as digital Brahminism, where you're basically seeing the logics of caste not only be transported, but really become the underpinning of so much of the digital infrastructure, both for South Asia and for South Asian tech talent that might travel as part of these global supply chains for all of these conglomerates. And the ways that we see caste show up are profound. We are seeing it in terms of casteist hostile workplaces, where caste-oppressed workers see slurs and harassment and discrimination become normalized in workplaces where, even when they know that this is a problem, they aren't properly trained to identify and arrest some of these conditions. We are also seeing this in terms of data stewardship.

Speaker3:
And when we think of, you know, surveillance capitalism, I think that in the South Asian context, this is a surveillance capitalism emboldened not just by white supremacy, but by a white supremacist and Brahminical framework around millions of Dalit and other caste-oppressed bodies and communities and geographies. And I also think that we are seeing widespread disinformation networks, which are primarily dominant-caste networks, emboldened by companies who are not implementing their own guidelines to protect and do their basic duty of care within the South Asian markets, but also, quite critically, are allowing the normalization of casteist and religious slurs. So the ecosystem for bias and discrimination and the digitization of caste apartheid is in so many realms, and I think that you can't just speak about it being one element, one thing. It is the way that all systems of oppression work. These, you know, techno-utopian outlooks that CEOs might pitch to venture capitalists are not the material reality of how these technologies get brutally implemented on hundreds of millions of users, and the bias that's baked in, you know, perpetuates and creates even more divides. And so our challenge as technologists who care about caste and racial and gender equity is that we need to have an analysis that helps us understand what is broken about these systems and then work forward to address the challenges from that place.

Speaker4:
Yeah, I absolutely agree, and Thenmozhi really did a good job at explaining the macro effects of all of this. Coming from the tech industry, for me, the way that it scares me the most is how technology is really brutal in its analysis. Right? Whenever you're making a decision as a human being, there's a lot of nuance involved. But as soon as you bring tech into those decisions, and like Thenmozhi described, when you have all these techno-utopian CEOs who are saying, hey, we're going to use algorithms to solve everything. Well, algorithms are based on one-and-zero decisions, a string of one-and-zero decisions, and a lot of these things cannot be categorized as one-zero decisions. And because of that, what you see happening is that algorithms are actually increasing the divide, I would say, increasing the digital divide and the socioeconomic divide, and contributing in ways that are more harmful than when humans would make those decisions, I would argue. So for me, it's really thinking about, you know, even the production of technology: when you think about how people are hired now, that is based on algorithms, right? LinkedIn might be running a machine learning algorithm to pick out a pool of candidates for you. But how do you know that the algorithm is not biased against Dalit and caste-oppressed people? Because they might be the ones who never get hired eventually, and they might not be the ones who are picked in these pools in the first place.

Speaker4:
So that's what scares me the most. And, you know, I don't even know the answers to how you combat that. I feel like we need the researchers and the people who are writing these algorithms to consider all these factors in their training and their algorithms, et cetera. And, you know, there are so many technology workers who are coming from Indian institutes and coming in for their master's degrees as well. And because of the lack of caste protections and the lack of protection against discrimination for caste-oppressed folks, we are kind of at the mercy of managers and professors, et cetera, to protect us. There's nothing else protecting us, and it prevents us from speaking up. Even if something happens, you are so afraid to speak up because there's no protection, and the second thing is that you risk losing your visa or your job, which then sends you back to where you came from. And, you know, you might have actually escaped a lot of negative experiences, or you might be escaping something from your home country. So there are a lot of things in play. And I feel like algorithms and technology are actually emboldening people and embedding the discrimination that's happening. So we really need to look at it from the point of view of empowering the researchers and the engineers who are making these algorithms to be educated, at the very least at the most basic level, about what caste means and how it manifests in social groups.

Speaker2:
Absolutely. Thank you both so much for those incredibly illuminating answers; the frameworks you both laid out make a ton of sense. One thing I wanted to zone in on a little bit: I know, Thenmozhi, you mentioned the term caste apartheid. Would you be able to provide a definition or framework for those listeners who might not be familiar with caste apartheid as a concept?

Speaker3:
Sure. And it's so funny, because I think sometimes when you're in the middle of work that you've done for a long time, you forget that you have to take a bird's-eye view, particularly because, in the United States, though we have such a great concentration of tech companies, most folks actually lack basic competencies as to what caste is. So just to give people a basic framework: caste is a system of oppression, analogous to but not the same as race. And I think, like race, it is a social fiction; there is no biological foundation to caste. But, you know, the premise of caste is that it was set up in its origins in scripture. And the idea is that the Brahmins, who were the priests who developed this system, basically carved up the rest of society such that as you go down the pyramid, you are considered more and more polluted because you have less and less desirable jobs. So at the top are the Brahmins, who are the folks who focus on knowledge and spiritual practices and ritual, who kind of hold the container around what is considered divine and pure and basically get to write the spiritual paradigm. Underneath that, you have the Kshatriyas, who are the rulers.

Speaker3:
And then you have the Vaishyas, who are the merchants, and then you have a peasant caste called the Shudras. And outside of that system is a group of people who were considered so outcast that they were untouchable, because they were seen as spiritually defiling. And communities that were seen as untouchable faced punishing violence and extreme social exclusion, which is one of the reasons why we use the term caste apartheid, because essentially where you land in this system can determine who you marry, what job you have, what side of town you live on, whether you have access to water, and your proximity to violence and structural privileges or lack thereof. And for many who were deemed untouchable, this was just a grueling, violent experience. And part of that resistance was that people said, we don't want to call ourselves untouchable, that's an epithet, so we use the term Dalit. And so, within that little mini history lesson, for anyone who is in the tech field and not someone who spends a lot of time in the social sciences, I want to really emphasize: you don't need to be a history major to understand that violations of caste are civil rights violations, and all of the things that you would understand as violations for protected classes of people, like race and gender and sexual orientation, are exactly the same things that you're hearing from people who are caste oppressed, from slurs in the workplace to discrimination and harassment and termination.

Speaker3:
You know, it's all very practical in terms of how you would look at this from a DEI framework. So I don't think that you need to know the entire history of it, thousands of years of the caste system, to know that it's a problem, that we're seeing caste bias rampant across all aspects of tech, and that it needs to be addressed meaningfully. And it starts by adding caste as a protected category, which opens the door to data collection and opens the door for open and transparent conversations about the kinds of discrimination that might be happening inside your company, as well as positive investments in terms of coaching and apprenticeships and recruitment, and having caste-oppressed employees not only feel safe about coming out in their workplaces, but actually feel confident going up the, you know, the success pipeline of a particular company. And this is a big deal, because when you think about the amount of South Asians who work in the tech field, and that India is a market that most companies want to basically conquer, you know, it's not just the moral thing to do.

Speaker3:
It's actually great business sense, because the growth of the Indian market is with the next billion users, who are, you know, majority caste-oppressed peoples, because they are the last to get online. So it makes sense to have a diverse workplace that can actually speak to those users and really move them into places of confidence and tooling, and create content that really speaks to this moment. So, you know, it's a really important moment that I want to encourage anyone within the tech field who's listening to this podcast to consider. It's great business sense, you're on the right side of history, and, more important, you'll be compliant with the law, as caste discrimination is illegal. And the amount of caste discrimination that we're seeing is so wild, it would really behoove many of the companies that are listening to consider that they don't want to be the next Cisco, the next company that's sued by a state for caste discrimination because they didn't take the bull by the horns and work on it proactively, as opposed to through litigation.

Speaker4:
Absolutely. I agree with everything that Thenmozhi said, and I really want to underscore the point about reaching the next frontier in India, about reaching all the users who are just starting to come online. I see that a lot in the tech industry, and the biggest problem is that people are looking at it from the lens of people who are sitting in Silicon Valley. So they're like, oh, we need to build an app for this. And I've been in so many situations where I've had to explain to them that, you know, these are my people; they don't have smartphones, they have feature phones. They don't have phones with Android on them. I mean, now they do, but earlier they didn't even have that, and people were still thinking of creating solutions centered around Android apps. So you need that person in the room to help you think about human-centered design, right? We have all these Silicon Valley technologists and UX researchers talking about human-centered design, but they don't have any decision makers who understand the nuance of the people on the ground, the people who will eventually use their apps. So I really believe in increasing that representation. I think the representation is really low; I don't even think we have done enough data collection to understand what the representation of caste-oppressed groups is within the huge technology workforce in America that comes from India, but I think that number is really low. And it all starts with acknowledging that this is a problem and then taking the steps that Thenmozhi just laid out, where you make caste a protected category, and then people are comfortable about disclosing their caste, and then you provide positive reinforcement and positive upliftment opportunities for those people.

Speaker2:
Yeah, absolutely. Thank you so much for those frameworks; they are incredibly useful to keep in mind and really relevant to the current moment. I know also that in this conversation the Cisco case came up, and I'm curious to learn a little more about that, and then also to think about the term you suggested, Thenmozhi, digital Brahminism. How can we understand digital Brahminism and how it manifests, not just in the culture of these workplaces, like with the Cisco case, but also in the products of technology themselves?

Speaker3:
So I think what's so important with the Cisco case is that, as we're looking at landmarks in terms of American institutions really being forced to confront how big of a problem caste is, I think the State of California's Department of Fair Employment and Housing suing Cisco is a pretty big one, because what the DFEH did essentially was investigate, you know, the complainant, who described pretty horrific experiences, which included aggressive kinds of intimidation and harassment. And the DFEH doesn't take a case like this lightly; they're actually very clear, before they investigate, that they're not going to pursue every complaint. The fact that they were able to do this really shows that they believe the case has deep merit and that they're going to pursue the litigation all the way to the end. And cases like these can last many years, so it's not an insignificant amount of resources. It's a bellwether, I think, in terms of American institutions understanding that it is time we addressed caste. And I think this is also a bellwether for the whole sector, because it was because of BIPOC scholars like Safiya Noble, or the folks behind the Algorithmic Justice League, that we are having critical interventions in terms of the development of AI and in terms of the people who are stewards of our data under surveillance capitalism, because the communities that are being surveilled are not anywhere near the stewardship or design of these conversations.

Speaker3:
And I think with digital Brahminism, unfortunately, it's the same thing, you know, in terms of caste-oppressed communities. And I think what's important is that we stop building tech without intention, and we need to start thinking about what the blueprints for our liberation are and then build technology that serves in that capacity. Because right now we have technology that basically serves venture capital pipelines that have no human rights impact assessments. So when these technologies get deployed in our markets and on our bodies and on our data, there are immeasurable ways that they are causing harm, and there is no easy remedy, because once you kind of crack that egg, there's nothing else people can imagine. You know, it's too big to fail, what are we supposed to do? Well, I think we have to do better than that, particularly coming into this moment. Coming out of COVID, we have a responsibility that what comes next doesn't create conditions that could harm us in the future. And so I think that's really what I'm most concerned about right now.

Speaker4:
And to the second part of your question, Nikhil, I do see this manifesting in technology products as well. You know, the first example that I gave was algorithms, but there are also features that you see online where you're like, why has this been designed through such a Brahminical lens? For example, on Indian matrimonial websites, you can filter by caste. You could even filter by skin color at a certain point, and there was a lot of organizing around that, and they removed the skin color filter. But you're still allowing people to sort by these old traditional ideas of caste, right? So there are a lot of examples of this even in technological products themselves, and I believe that's because the engineers who are building this are mostly from caste-privileged backgrounds, and they think this is just a normal thing that they have to do under the name of tradition, right? They haven't really looked at it from any other lens whatsoever. So I think that it goes beyond just workplace dynamics; it's in the products all around us. For example, on Instagram too, there was an account called Buffalo Intellectual who did an analysis of how many anti-caste activists are verified. Like, none of us get verified, because, you know, the people at Instagram who decide the verification, who have the decision-making capability of who gets verified, are probably from a predominantly caste-privileged group.

Speaker4:
And so they don't believe that these people need to get verified, or whatever that algorithm is deciding at the end. So I see this manifesting a lot in technological products themselves, and I feel like there is no person in the room who is capable of making these decisions differently, and that's why we are seeing all of this. But like Thenmozhi said, we have to build for our liberation; that line really stuck with me. We have to envision our bright future, what we want to see in the future for our liberation, and then build backwards from that, instead of having these really broken products that we keep on fixing but that never really get better. So, yeah, thank you, Thenmozhi, for that line. That will stick with me forever.

Speaker3:
And I do think it's important that we be ambitious for what we want. Honestly, I am not interested in creating a research complex that's all about, let's fix shitty tech and research at our own cost while our people are dying, because that's literally what's happening. Our people are dying while this crap is happening. Like, just think about the violence that disinformation networks have done in South Asia, and in India alone, think about the chaos. You know, I was just talking to a Muslim friend who lost 30 members of her family to COVID. Thirty members. And the deaths started early in the pandemic, because if you remember, there was disinformation being pushed, called Corona Jihad, that was targeting Indian Muslims, and as a result, throughout the entire pandemic, Indian Muslims were denied medical care from multiple institutions. This is fucking criminal. And, you know, the bananas thing is, when you take a company like Facebook, they are so negligent, from the fact that they had to fire one of their top racists, who quote-unquote resigned, but we all know it was because of her political biases, to the fact that they have said that they are afraid to remove groups that are extremist in nature because they are afraid of what would happen to their staff and their offices. So what they're essentially saying is that it's unsafe for us to moderate and maintain the guidelines, and, oh well, too bad, you know, for basically hundreds of millions of Indian users. But I would ask the question: if you find it unsafe to even moderate in the country, then what business do you have even being in the country doing business in the first place? And we're not asking these questions because, again, of the colonial and Brahminical dynamics at play with these companies, the fact that you have a colonial administrator in Silicon Valley who works with dominant caste people to basically allow hate speech to become normalized, disinformation to run rampant, and calls for violence to occur without consequences.

Speaker3:
That is the danger of digital Brahminism. And I want to make very clear, having an analysis around Brahminism isn't about targeting one caste. It's about the system that creates caste apartheid, and that is really critical, because, you know, when you think about talking about anti-Blackness, the consequences of race are borne on Black and brown bodies and Indigenous bodies, but there was a very clear ideology that set up white folks and created a social fiction. So too with Brahminism. And we can't shy away from using the term Brahminism or be unfamiliar with it, because it is the tool for understanding this supremacist system. So I really want to emphasize how critical it is for us to use the right terminology, because when we can accurately diagnose the problem, we can find the best medicine. But when we're kind of in the margins, hunting for the right terms, we're just clawing around trauma and clawing around the consequences without the accuracy to be able to design for it. And there are many, many tools that we can use right now to design for this, but only by being really conscious about our next steps.

Speaker2:
Yeah. Thank you so much for those incredible answers. I think that what you have described as digital Brahminism is such an important framework to keep in mind, and it sort of leads me into my next question: I'm curious about how we can understand digital Brahminism in a longer history, where Brahmin appropriation of knowledge has been a very important focal point of anti-caste scholarship. How do we understand digital Brahminism as situated in that history?

Speaker3:
Well, for me, because, again, this is a body of work that I've been developing: I think that if you go back to Phule's book on slavery, that is one of the kind of polemical texts that really outlines the crimes of Brahminism and the ways that it has enslaved the caste-oppressed masses. So when you start with that articulation of Brahminism, it actually goes very deeply into the bureaucratic mechanisms of caste and how the top castes kind of manipulate the other castes' professions in order to maintain their power, and the ways that they connected up with the English. There is a direct through line from Phule to Ambedkar and other anti-caste thinkers to current thinking in terms of caste and its digital forms, which I think is actually very crucial for us to be able to root ourselves in, because we do have language, we do have a foundation and an ideological understanding of the system. What we need to do is frame it in the context of how we understand bias to work in tech. And it happens in terms of the places where people get developed as technologists, so it happens in the context of workplaces and hostile workplaces. It happens in terms of the iteration of the priorities of a market or the design of a particular product. It happens in terms of who defines that data set and who is a carceral body to be framed within those contexts. Right? And then I also think it has to do with the way data is controlled and the way that data is weaponized, and who gets to set the parameters of that from a stewardship perspective. So those are all really key places where we will see opportunities for research and dialogue and discourse. But this moment around digital Brahminism is so severe because of all the consequences that are coming out of it right now.

Speaker4:
Yeah, and for me, it's also about the people who are in positions of power and who are, you know, releasing the information. The stewards of curation, of what gets out to the general public, are the ones who are from privileged backgrounds. So you will see this not just in tech but also in storytelling, et cetera, where the decision makers who get to tell the stories are from caste-privileged backgrounds. So, you know, the knowledge that they are willing to disclose is the knowledge that protects them as well. You see this a lot when filmmakers are making movies about any anti-caste movement: they are referencing either Gandhi or some other Brahminical leader and not Ambedkar, who was the champion for our civil rights, right? You see the same thing, I think, even in research and everywhere else, where you might have junior researchers who are contributing to the work, but ultimately the people who are releasing the information and the people who are doing the research are in the majority from privileged backgrounds. And the same thing happens with tech as well. You can fight an internal battle to make an algorithm better, but ultimately it's the head of Facebook who is going to make a decision about how algorithms are controlled in India, and how they can make really easy tools for the state to use, for the governments to use, but not easy tools for human beings to report disinformation and to report, you know, harassment and bullying. Because how easy would it be for Instagram or Facebook to put another category in, like, hey, I'm getting bullied and this person is using casteist slurs against me, or any of those kinds of rules? But they don't do it, because their decision makers are from these caste-privileged backgrounds, and they don't want this to come to the forefront. So, yeah, I just wanted to add that.

Speaker2:
One thing that also strikes me: a lot of the scholarship that you have been referencing, which highlights structural bias and structural violence as it pertains to technology, has also been around surveillance. So I'm really interested to hear your thoughts on making sense of caste as a social variable when we are studying and understanding surveillance, particularly given the moment right now in India with the rise of Hindu fascism.

Speaker3:
So I think that one of the things that's so important for people to know is that Dalit and caste-oppressed peoples are one of the most carceralized communities in the global context, and we have faced that carcerality in every aspect of our lives. And it gets translated into structures of policing and surveillance very easily. And, you know, for folks that don't even know: last year, before the pandemic hit, India was in a genocidal crisis, where they were about to create one of the largest networks of detention camps to target Muslims and caste-oppressed peoples, and there's an entire data infrastructure that would be part of that. So at a very real material level, tech would be weaponized by people who would use these data sets to find people, denaturalize them, put them into jail and begin the process of what it means to really enact genocide. And I think this is a very critical thing to look at, because it's both about the tech that is used to carceralize, whether it's the use of CCTV and other kinds of monitoring that happens, but also, there were several instances last year, during wide-scale protests, where the Indian government was identifying people through facial recognition and was open about doing that.

Speaker3:
And I also think that we are seeing a large-scale collaboration of American platforms with a genocidal administration, and I think we have to ask ourselves what this means. You know, when you think about IBM and its role in the Holocaust, there should be a response from American corporations about never again. So what are the ethics of operating in a genocidal context, and what level of transparency and oversight is required so that there isn't harm done to vulnerable human rights defenders? And this is super important, because just a couple of months ago Google cooperated with the Delhi police to remand a young activist who had shared a toolkit about the farmers' protest, which was a set of protests related to Punjabi farmers who were striking against what they viewed as, and what are, very unjust revisions to farm laws. And the response was draconian, and so too was the use of surveillance, where they basically asked Google to divulge the IP addresses of everyone who opened up this Google doc. And that gave the Delhi police a list of people to come in and remand, which means to kidnap them in the middle of the night, not let their parents know, not give them legal support, and put them into custody with the police.

Speaker3:
Why would you do that? Remember when Google was like, do no evil? This seems like the opposite: let's not just collaborate with but be complicit in evil, straight up. And you can't tell me that they don't know what's going on, because the head of Alphabet is Sundar Pichai. He's a Tamil Brahmin from fucking India; he knows exactly what's happening. So how is it that on his watch his company is throwing under the bus young environmental, caste-oppressed and religiously oppressed activists like this? Where is their response to that? It's nothing. And I think that's where we have to look at the fact that Black and brown bodies are excellent for markets but really bad for human rights for these companies. So we need to really hold them accountable for what's happening, because, you know, we don't want reparations at the end of this. We need investments that address this problem now, and investments at an ambitious enough level for us to really ask for more, demand more, and really start to be architects of the future as opposed to, like, repairmen for their shitty systems. Because what's happening right now is absolutely not working.

Speaker2:
Also, building off of a lot of that anti-surveillance work that you mentioned as so critical to the entire struggle: I'm very curious to learn a little bit more about how this sort of anti-casteist vision for technology can align itself with and stand in solidarity with many of the other justice-oriented visions of technology offered by scholars like Safiya Noble, or by the Algorithmic Justice League, projects which address anti-Blackness or other forms of systemic violence in technology. How and where does anti-casteist technology fit in that puzzle of solidarity?

Speaker3:
There is a really powerful group of Black scholars who are doing collaboration around these issues, and I think we need to have more engagement and the development of an entire new generation of anti-caste thinkers who are opening up the space of what this means. So I think we're going to be seeing a lot of that in the next couple of months. Like, I know that we'll be writing some pieces around digital Brahminism to try to help set a frame and a context around this, but, you know, this is the frontier we have to cross in order to really build out what we need for our people.

Speaker4:
I totally agree. I think, you know, that it is an act of solidarity across all of these justice movements. And in my experience, at least, it's been really easy for me to explain to people what our experiences are and how this should be a protected category. It's only been challenging when I have to convince caste-privileged people. But across the countries and across the companies that I've looked at, explaining the struggle to any ordinary human being has not been that hard, because they can see it in action in many other walks of their life, and, like Thenmozhi said, the protection is just the same protection that we offer for gender and race and all these other constructs. So I do see this all the time, because this struggle for justice has a common thread across other justice movements as well. And I think that when we are talking about equity, and when we are talking about AI, I don't think any conversation around that would be complete without thinking about anti-casteism in AI, or in any company that's working in technology, because India is such a big part of the technology story of any company. I don't think they can talk about equity in a holistic sense without really incorporating these discussions, and, like we said, not just in terms of increasing the representation in the workforce, but also looking at it from the angle of data collection and surveillance, and the product and design decisions that they make. So, yeah, I completely agree with Thenmozhi.

Speaker2:
As we look forward to the future, I wanted to shift gears and ask: in what ways, if any, can technology be a liberating force for Dalit and other caste-oppressed communities in the struggle against casteism and the fight for liberation? And more specifically, how does that inform and motivate the work that you both do as activists, storytellers and technologists?

Speaker3:
Well, I think that technology is a tool, right? Technology is not the platform for liberation; it's a tool towards our liberation. And I think that's significant, because that level of detachment lets us make some critical decisions about when we need tech to be free and when tech is an inhibitor to being free. And I think some of what we need to start being more critical about is why we are acting as if these corporate, surveilled platforms are democratic engagements, particularly when we're mobilizing and doing so much social media in these surveilled places that are collaborating with people who are our oppressors. We have to think about how we can ask and demand more for what we need for our autonomy and freedom. So I think thinking about creating a next generation of Dalit entrepreneurs who can start to build a liberation tech ecosystem, one that looks at workers' rights, that looks at a vision for what caste equity could look like and all the different ways that we're seeing those failures in big tech, could be very powerful. Because our community does have tech talent; our community does have people who are programmers and who are entering, you know, the workforces of these bigger entities that are just terrible. But imagine if we were able to create an angel fund for entrepreneurs to be able to build along equitable lines. So I think that is something that's super important to look at. And I also think it's important to document the harm and to observe and research what's happening. And so we need stakeholders in all of the pipelines that help create equitable tech, whether it's researchers, developers, UX designers or content creators.

Speaker3:
We need to step forward into this digital realm and counter digital Brahminism with an age of Ambedkar, where we are able to take an Ambedkarite vision into the next century. And that is a challenge Dalit and caste-oppressed peoples are more than up to. It's very critical for us to take this direction meaningfully, as far as we can, because this is how we take Ambedkar's caravan for justice forward: by lifting his analysis into these spaces that he could only have imagined. And for folks that are just hearing of Ambedkar for the first time, he is a Dalit civil rights leader who was the architect of the Indian constitution, a trailblazer in desegregating caste apartheid across many different Indian institutions, and a legendary thinker and polymath. And I feel like if he were alive today, he would have also found a way to become a developer, just because of the kind of thinker and genius that he was. But he doesn't have to be alive today, because we are all children of Ambedkar who are fighters for caste abolition. And so I think we need to take that inspiration, move into these realms, and understand that technology is never absent of bias; it is actually formed out of political economies that have bias and that create profit in ways that are inherently biased. And so if we can start to really unpack that and understand these contexts, we can build around it; we can actually create our own path to the future.

Speaker4:
Well, Thenmozhi just gave me goosebumps, like she usually does, with everything she said. But I also wanted to add that one little thing that technology has helped me with, towards my own liberation, is just helping me be connected with all of these people around the world, finding my own voice in that moment, finding people to learn from, finding activists, technologists, entrepreneurs, filmmakers, everybody, in a way that I didn't feel seen before. I wasn't able to find those people and find that community before, and I have found that now. And I think that's something that technology as a tool has helped us with. And, you know, with everything that Thenmozhi said, as we get funding, as we come together around Ambedkar's vision, I truly have an optimistic outlook towards this. I do feel like we can build an equitable future, but it begins with envisioning that and then working backwards from it to build it with all of the resources that we have at hand.

Speaker2:
Absolutely, and I echo your words. The insights you shared, Thenmozhi and Seema, were truly amazing and really inspiring, and they give us a lot of hope for the future. As we close out this incredible conversation, I wanted to ask about both of your individual journeys through technology, in your life, in research and in the work that you do, as you see it pertaining to these conversations. And with that, I would also love to ask for places where listeners can engage more with your work: Twitter handles, emails, websites, whatever you think would be best to share with our listenership today.

Speaker3:
I think that for folks who want to track conversations related to caste and tech, I'd definitely encourage them to check out Equality Labs on Instagram and Twitter and Facebook as well; we put stuff there. I also really recommend that folks track what's going on with the Cisco case and follow up on updates related to that. Also, the Alphabet Workers Union just released a statement related to caste in tech, and they will, from time to time, put stuff out there. So there are a lot of ways to keep connected, and I think that following our socials is a really good way for us to push out resources and get connected wherever possible.

Speaker4:
Yeah, I second that about Equality Labs. And then, for me, I also track all the things that are happening in India as well. The Internet Freedom Foundation tracks a lot of the surveillance issues that are happening in India and how big tech is helping the Indian fascist government. So those are two things that I look at usually.

Speaker2:
Fantastic, and we'll include all those links in the show notes as well. Thank you both so much for coming on the show. I think that today we really had an incredible conversation, and I'm leaving it feeling both troubled and inspired by all that you shared, and I'm sure that all our listeners today feel the same way. So thank you both so much for all your work and your time today.

Speaker5:
We want to thank Thenmozhi and Seema again for joining us today for this really wonderful conversation.

Speaker1:
And Nikhil, a big congratulations to you for finishing out this internship.

Speaker5:
Thank you so much. Yes, it was really a great experience to be an intern with the podcast this semester.

Speaker1:
We really enjoyed having you around. And also, you originally approached Dylan and me about doing an interview on this topic a few months ago. So I'm wondering, what was it that drew you to this conversation, and what does this topic mean to you?

Speaker5:
Yeah, I think with my own background, being South Asian from a Brahmin Savarna family in Silicon Valley and having benefited a lot from these structures of complicity and violence myself, I have really been interested in thinking about caste and technology from the perspective of how we can organize upper-caste Indian communities, and also other communities, to be allies while centering Dalit feminist leadership. And so the idea of making an episode about caste and technology really appealed to me, to get these conversations going and to amplify the incredible voices and insights of people like Thenmozhi and Seema. So I was really grateful to have this conversation and to be able to platform and center this really important discussion.

Speaker1:
Thank you, Nikhil. We're really grateful to you for facilitating this important conversation, and also thank you for the amazing work you've done with us over the past six months. Well, that wraps up this episode and also the Radical AI internship. For more information on today's show, please visit the episode page at radicalai.org. If you enjoyed this episode, we invite you to subscribe, rate and review the show on iTunes or your favorite podcast app. We'll be taking a bit of a break from our regularly scheduled episodes this summer, so you can catch our new episodes every month, and you can expect our regular programming to come back around August. But until then, join our conversation on Twitter at @radicalaipod. And as always, stay radical.
