How to Stay Safe Online with Seyi Akiwowo


How can technology be designed to fight online abuse and harassment?

What is the difference between cancel culture and appropriate accountability?

How can you stay safe online?

In this episode we interview Seyi Akiwowo to discuss her newly released book: How to Stay Safe Online: A digital self-care toolkit for developing resilience and allyship.

Seyi is the founder and CEO of Glitch, a charity that’s been on a mission to end online abuse by making digital citizens of us all since 2017. Seyi is also an author, a consultant and writer within the political and tech space, and a former TED speaker.

Follow Seyi on Twitter @seyiakiwowo

Follow Glitch on Twitter @GlitchUK_

If you enjoyed this episode please make sure to subscribe, submit a rating and review, and connect with us on Twitter at @radicalaipod.



Transcript

staying-safe-online.mp3: Audio automatically transcribed by Sonix. This transcript may contain errors.

Speaker1:
Welcome to Radical AI, a podcast about technology, power, society, and what it means to be human in the age of information. We are your hosts, Dylan and Jess, two PhD students with different backgrounds researching AI and technology ethics.

Speaker2:
In this episode, we interview Seyi Akiwowo to discuss her newly released book, How to Stay Safe Online: A Digital Self-Care Toolkit for Developing Resilience and Allyship.

Speaker1:
Seyi is the founder and CEO of Glitch, a charity that's been on a mission to end online abuse by making digital citizens of us all since 2017. Seyi is also an author, a consultant and writer within the political and tech space, and a former TEDx speaker.

Speaker2:
And without further ado, let's get right into the interview.

Speaker1:
We're on the line today with Seyi Akiwowo. Seyi, welcome to the show.

Speaker3:
Thanks so much for having me.

Speaker1:
Absolutely. And we're really excited for this conversation today, where we're going to discuss a lot of the themes that come up in your new book, released this August, titled How to Stay Safe Online: A Digital Self-Care Toolkit for Developing Resilience and Allyship. So if you wouldn't mind, could you begin by just giving us a little bit of a sneak peek into what this book is all about?

Speaker3:
Yeah, this book has been a labor of love, and it comes from my lived experience of being a black woman online in the 21st century, as well as my expertise having run Glitch, which is a charity working to make the online space safe for women and girls through a lot of advocacy, campaigning, and training that we deliver all over the world, but particularly in the UK. It's been about trying to sum up my mistakes, my lived experience, and interviews with other amazing experts like Jameela Jamil, Yassmin Abdel-Magied and Laura Bates around their expertise and experience, to try and give everyday women particularly, and I had everyday black women in mind, a way to navigate the choppy waters of being online. Someone who understands that the place has so much potential and ability for connection and discovery and education, but is a bit taken aback by the amount of harms that exist online. I really wanted to make sure that this book was not just a toolkit that supported women to be safe, i.e. the equivalent of "don't walk home late and don't leave your drink unattended" kind of advice, but really took a broad definition of safety: let's understand the sector and the arena that we're entering, so we first of all can be clear about the informed decisions that we need to be making. And then let's also look at that person's ecosystem, their community, their friends, their family, their employers, their teachers. What can they do to make sure that women are safe online? And there was a reason that I focused on black women, because we know that black women are 84% more likely to be abused online than white women. So if we're going to focus on the most minoritized person, that's how we can somewhat guarantee that everybody is having a safer experience online.

Speaker2:
Can you say more about how we're talking about safety? Or, I guess another way to ask: is unsafety the same as online abuse? Could you break down these terms a little bit more for us?

Speaker3:
Yeah, really good question. I don't think they're the same. I think unsafety is a broader term that online abuse sits in, and I think there are also different forms of online abuse. I try in this book to center online abuse and violence for sure, but also to talk about the other things that make us unsafe, that actually could lead us to become perpetrators. So thinking about too much screen time; thinking about not being a critical thinker about information you're reading and sharing; thinking about digital citizenship, essentially, is what I talk about: understanding our rights and responsibilities. If we don't understand safety in the broadest sense, we can become someone's perpetrator, and one of those acts could be abuse. I talk about the cycle of harm as well, about how hurt people hurt people, and trauma, and how if we are decreasing that spaciousness, that time to respond rather than react, we are increasingly making the online space and our interactions on these platforms unsafe for ourselves, and therefore unsafe for the people in our network. So yes, I definitely think that online abuse is one of the most terrifying forms of unsafety on the platforms, and it's important to focus on because it doesn't get enough spotlight. People do ignore and trivialize online abuse. So it was really important to center that, but also to talk about how that's not the only form of harm we're talking about online. We're talking about unkindness, we're talking about the kind of drip effect that could potentially lead to abuse. And we're talking about that unseen anxiety as well of being online, that kind of guardedness and hardening of our hearts online that I actually believe causes us to become unconscious perpetrators, which I talk about in the book.

Speaker1:
I'm curious what the difference is between people who are interacting in, let's say, the "real world", I'm doing air quotes here, versus people who are interacting in the digital world. As humans, do we just have this proclivity to be unkind, or to create unsafe spaces, or to abuse others? Or is there something unique to the digital space that cultivates this in people?

Speaker3:
I don't think that the behaviors that we're seeing online are unique to the online space. In the first chapter of my book, I actually talk about why this book shouldn't need to be written. And I talk a lot about how tech companies have framed the online harms that we're seeing on the platforms as unintended consequences, when actually, if safety by design had been there at the very beginning of the creation of these platforms, we wouldn't be seeing a lot of these things. I also take quite a nuanced and balanced approach, balanced in terms of fairness, not balanced in terms of trying to be devil's advocate in any way. Balance means speaking the truth that tech companies didn't create misogyny, tech companies didn't create white supremacy, and tech companies didn't create patriarchy. I don't want that to be the message people think I'm pushing here, because that's not the case. But tech companies have helped weaponize patriarchy and have helped weaponize white supremacy, and that's where I would like us to get better and explore conversations around how we hold tech companies accountable. Schools don't create online bullying. Schools don't create bullying. But schools still have a responsibility to safeguard every single child and teacher in their remit and institution. Why have tech companies been able to get away without that level of accountability and safeguarding? I'm not a psychologist, but as I've been doing more and more interviews for the book and talking about the book, criminal behavior online has become fascinating to me.

Speaker3:
And maybe that is something that I'll explore later down the line. But what I do know from sociology, and from the expertise of having run trainings for women, is that there are people out there who know that they are being abusive online, know that they are targeting women online, and I would probably group them in your kind of red bucket: people that are just going to be relentless online, just as you have offline. Offline we have the rule of law, we have social contracts with each other, social norms, and some people still just choose to be narcissists and deviants. I don't think there are as many as we think there are, but I do think there is a group of people who are just in the red zone. Then I think there are people in the amber zone, and there are people in the kind of dark orange part of the amber zone who have deviant types of behavior online, but actually as a result of a lot of trauma that they've experienced, grooming that they've experienced.

Speaker3:
That's something that doesn't get talked about enough online unless it's people of color. We actually don't talk about the white supremacist radicalization that is happening on the platforms. And so I think you've got people who are lost but searching for belonging, who have faced trauma, have faced such hardships, particularly in such a short space of time, and I think the pandemic is one of them, where a lot of people are looking for something, are looking for belonging. And, you know, without trying to make this a huge, big kind of philosophical conversation, I think with the role of the church changing in people's lives, the role of the pubs and social spaces changing in people's lives, people are yearning for belonging. We are natural creatures that want to be interdependent. And so what happens to those people in that dark orange bucket is that they go to these harmful places or spaces and are groomed in a pipeline, algorithmically as well, which is where we need to hold tech companies accountable: where they're showing suicide ideation content, showing incel content, and it's actually amplified on the platform. But there are also people who are using mainstream media rhetoric to then channel them into these hidden places like 4chan and Reddit threads. Those are my dark amber.

Speaker3:
Then you've got the light amber. I think of the light amber as people who just have no clue, who just haven't been taught the do's and don'ts of the online space, haven't been told right from wrong, haven't been supported in their social norms. And we actually haven't formed social norms as a society when it comes to the online space. We haven't set out our golden classroom rules, as we would normally do when we start our first day at school. We haven't done that with the Internet, despite the Internet being in its teenage years. So there are people there that are just experimenting, making mistakes, but with no malice or bad intention. Maybe part of this is what I call unconscious perpetrating, or another term that I heard Melissa Ryan use, which I love a lot: toxic fandom. People who love, you know, Harry Styles, people who love Beyonce, doing it in the name of love, but actually then perpetuating harm because they're being so tribal. That's where I think there are interventions around education, setting social norms, digital citizenship conversations. Then you've got your greens, your people who actually, for whatever reason, don't engage in that space online. I think that's terrible. I think we need to be setting new kinds of parameters.

Speaker3:
How do we engage them to be role modelling? Because actually a lot of them don't want to be online because of how toxic it is. So we're seeing this exodus of really great people leaving the online space. I remember when I used to work with Amnesty International quite closely, a lot of the great human rights teachers of our lifetime were saying, I can't do Twitter, I can't do these platforms. We're losing people who can provide that kind of role modelling and thought leadership, that intergenerational conversation that we so desperately need in our movements around social justice, because they're like, this is just lawless. So I think we've got complex groupings of people, and that's why this festers online. But I don't think it's that abnormal; we have it offline too, with why people join gangs, why people join extremist groups, why people display deviant behaviour. What is the role of society? What is the role of making it a public health issue? How much is poverty and squalor and food poverty, and the fact that in the UK we've got a cost of living crisis that we've never seen before, how much of that is actually driving people's behaviours online?

Speaker2:
One thing I'm hearing, especially when we're talking about this amber zone in the middle, I guess both parts of it, is that there's a desire for accountability, but then there's also a desire for empathy, right? Some folks have been traumatized, some folks are coming from a place of hurt and are acting out of it while searching for belonging and all of that. And I'm curious what your thoughts are on that tension between needing to do something about this very real and harmful issue, and also that there does need to be some level of care for folks who are perpetuating and perpetrating in this space as well.

Speaker3:
Yes, Dylan, I definitely think it's a dance between the two. I think it's a dance between the two in ourselves. You know, when do we show ourselves self-compassion, and when are we our own worst enemy, super cruel to ourselves with that hypercritical voice? I think we have that dance in ourselves. And then in society: when's the right time for rehabilitation and reform, and when for deterrence? I don't have the answer, but I definitely know that I'm on a journey of trying to explore it, and I welcome more conversations like this, because I think that's the key, right? I don't think it's about one binary answer, but an exploration: conversations, case by case, a framework of principles is what I think we need to form. The catalyst for my learning, my exploration, around this was George Floyd's murder in 2020 and the amount of abolitionist work I was exposed to. Thanks to the silver lining of lockdown, I should say, not the pandemic, there was time to read, and I got to read a lot of abolitionist work, and a lot around how many feminist movements and feminist NGOs get set up in ways that feed this pipeline to prison. And I was like, I don't want to be that. You know, I obviously support Black Lives Matter.

Speaker3:
I've done a lot of activism work in my previous career and in kind of community activism. And so now, being the CEO and founder of a charity, how else do I show up? Yes, we did a statement around supporting any Black Lives Matter activists with digital self-care and self-defense training: if they were going to do anything in the news, anything in the media, their online presence kind of being amplified, we delivered free training for them. Yes, we were holding tech companies to account around things like digital blackface, things around safety for black women. But I was like, what do we do internally? There is a lesson and a learning for all of us. And that was making sure that our policy positions weren't just perpetuating a kind of carceral approach: that any time someone was doing cyberflashing, any time somebody was stalking, any time somebody was causing any of these kinds of online abuse offenses, we weren't just automatically, on autopilot, thinking that the only way to justice was deterrence, but actually centering the victim. What do they want? And I talk about this in the book. Ian Wright, footballer for Arsenal, had a case where he was racially abused on Instagram, and he was able to start the process of justice through the courts.

Speaker3:
It was found that the person who did it was a young lad, and because Ian had chosen, for his wellbeing, to forgive, because we know what unforgiveness does to us, right? It hurts us. It's poison for us while the other person goes free. Because he decided to forgive that person, the judge used his statement to say that the boy could go free. I don't believe in that either. I don't believe in eroding what the victim needs, his day in court and a look at the evidence and the facts, or in using the victim's forgiveness to say all is well. Actually, what's the education and the reeducation around racism and around white supremacy? I think those things, the repair after the apology, after the acknowledgement, are key. I also talk in the book about doxing: putting someone's information online, where they work, when they've done something stupid or offensive or harmful online. I don't know how helpful it is to be doxing them. If their information is public on their own social media platform, that's one thing. But to do forensic, keyboard-warrior type stuff to find them and then expose them, I'm not sure how helpful that is, because all you're doing is getting them fired.

Speaker3:
Are you actually getting the reformed behavior that you want? I talk about this in the book with the Euros and the horrific racism that the footballers experienced after the penalty shootout. A guy who worked at a very well-known estate agent was doxed, and then they suspended him and investigated. And I was just like, yes, but what else? Because what does that say to him, his network, his friends who do believe in these very colonialist and racist ideologies? How do we actually get them to change their value system, and not just suppress it so that they go to other channels to say what they still say and believe what they still believe? That is what I have a problem with. So I think this tension, this dance: when do we punish, and when do we reform and rehabilitate? When do we all understand that we're on a lifelong journey of learning, and therefore need to be compassionate about that? But that doesn't mean that we allow people to get away with being harmful and threatening. I think it's still a conversation we need to keep having.

Speaker1:
But in the spirit of having that conversation here, I guess I'm going to ask the question that I've been dying to ask since I saw the title of your book, which is: how do we stay safe online? What are some of the best practices that you've uncovered in your research?

Speaker3:
Yeah. So I think the first thing is understanding what online is, and understanding the arena. A lot of people, rightly so, don't unpick what's actually happening in the tech market, and so I try to lay that scene out. If you're going to be a part of something, a member of something, you've got to know what the environment is first. The second thing is around digital self-care, a term that I coined about five years ago when I was setting up Glitch and some training resources. It's about understanding what your boundaries are: what are you saying yes to and what are you saying no to? How are you looking after yourself, and how are you centering your body's needs? Because your body and your intellect and your emotions all have amazing wisdom that makes up you. And if you're in hypervigilance, in that hypervigilant state of fight, flight, freeze and fawn, actually you're just reacting, and you could cause yourself a lot of harm and do others a lot of harm. So how do you center digital self-care? How do you center community self-care? That is a key thing, and within it is having intentions and values: understanding where you want to be online and why. Are you online because of work?

Speaker3:
Are you online because it's fun? And then reflecting that in how private your account is and the name that you choose, really being intentional. What I feel has happened to a lot of us is this creeping, behind the scenes, getting more and more online without actually deciding, I'm going to be online. You know, we don't just fall into karate class. We don't just fall into dance class. We don't just fall into gaming sites, right? There's a period of logging in, creating your profile. But with social media, it's so easy, and so there isn't that moment to really set your intentions for why you're on there. And I think for a lot of people, influencer work is really prominent, really aspirational; if you're going to use your full name, understand what that means for your career prospects, what it means for your family members, what it means for any children you have going forward. Really be intentional, even as a parent: deciding, am I going to put my kid's face, my kid's most precious and embarrassing moments, online for the world to see forever and ever, knowing that there are a lot of pedophiles and a lot of disgusting people out there? That doesn't mean that you shouldn't do it.

Speaker3:
But it's like, are you making an informed decision? That is what I mean by digital self-care. And then I think the final tips can be summed up around how you equip yourself. What are the security measures you can put in place? What's the community you're going to put in place? Maybe you're standing for politics, standing for elected positions. Maybe you're about to star in a movie and want to build an acting career, whatever it is that means you're going to have an online presence. You're going to start a blog. I mean, look, if you're going to have a really well-made Pinterest board these days, you know, there are a lot of followers there. Or you're going to join, you know, the online mum community.

Speaker3:
What is your defense? What are you thinking about your cybersecurity? What is your digital self-defense, and what's the community you're going to build to support you in that? I talk about all of these kinds of tips, to make sure that we are equipping ourselves and staying informed. And then finally: how do you join the mission of making the online space safe by holding tech companies accountable and holding governments accountable? I talk about what regulation could look like, what policies could look like, what schools could be doing, what employers could be doing to support their staff. And I talk about a lot of the mistakes that I've made, hoping that people don't have to make the same mistakes that I did. So that's how I think you can somewhat try and stay safe online and have a joyous time. That's what I'm trying to do: trying to get people to enjoy the Internet like I did in the '90s and early 2000s, when I grew up on Windows 95 with the big bulky computer and dial-up and MySpace and MSN. Those were joyous times, and I want to bring that back for a lot of people.

Speaker2:
One thing that stood out to me in this conversation, or one thing I guess I've been reflecting on, is who the "we" is that's both experiencing and perpetuating abuse, or who might be most vulnerable to harm. Because it does seem, especially as we talk about white supremacy online and misogyny online, that there are certain groups who are targeted more than other groups in this space. And I'm wondering how you're thinking about digital identity in terms of either vulnerability or support or action items for folks.

Speaker3:
Yeah, I think the same groups that are vulnerable offline are vulnerable online, and actually a little bit more so, because you can target masses of people online, you can spread a lot of misinformation online, and things can go viral a lot more quickly. It's quite hard to do that offline. So I do think it's important for us to acknowledge that there are vulnerable, minoritized groups that are disproportionately affected by online abuse, and therefore the harm is also greater, right? What that means for that person from that community, particularly if it's a socially conservative community, is enormous, is huge. But there's a flip side to that with the whole discussion around digital blackface, and people have different kinds of definitions of that, but like how there's currency in masking yourself as being black or tanned online to sell music or be an influencer. And I think that also is up for discussion. So we know that we can somewhat see race online, and gender online, because people try to benefit from it. I remember being on a panel once and somebody told me about trolls they used to work with or had interviewed, and one lady who was a paid troll, a paid bot, said that she had to stop presenting as a woman when she was trolling people, because other bots, other trolls, were trolling her for being a woman.

Speaker3:
So it is really fascinating how much gender stereotypes and racial bias, basically unconscious and conscious bias, seep into our online spaces. We cannot pretend that, oh, because we're online, we're colorblind. People's choices of avatars, emoji colors, all of that, I think, is huge. And it's also part of safety, right? Inclusion has to be part of safety. It's taken so long to get disabled representation in emojis. It's taken so long to make closed captioning easier for disabled communities. It's taken so long for us to have this language and these terms around why it's really uncomfortable for a white woman to pretend to be darker than she is. And then that gets trivialized in the media as, oh, we're policing whatever, you know. But these things evidently have some form of currency, because people are doing it and people are profiting from it.

Speaker1:
I want to take it back to something that you mentioned previously about accountability. Part of how we stay safe online, it sounds like, is holding people accountable for actions that are impacting others negatively. And when I think about holding people accountable, I feel like there are probably more and less productive ways to do that. What's coming to mind is the difference between calling people out and calling people in. And something that comes up in your book, or at least was mentioned as a potential theme surrounding the topics in your book, is the tension between cancel culture and accountability. I see cancel culture as this calling people out, versus accountability, which can be framed as more calling people in. So I'm just curious about that tension and how you see it in this space when we're holding people accountable for some of the abuse that they may be causing or doing.

Speaker3:
Yeah. Again here, I think it's about us coming together to form some principles, and if we can't do that, some principles and values for ourselves. Like, you know, the basic ones: how would we like to be treated? How would we like to be corrected? How would we like to be held accountable? If we can't have conversations in these grand, old-school-theater-style conversations and debates, let's at least have one within ourselves. I don't think that conversation is being had. I think people are getting caught up in the excitement of it all, in the energy of it all, and thinking it's just fun, and then they'll dip out. But it actually causes a lot of harm. I do think there is room for accountability online. It's powerful for smaller voices that are often railroaded offline to hold institutions to account, whether that's over unfair bills, treatment of widows, or treatment of asylum seekers and refugees in certain countries. I think it's a powerful tool, and it has held institutions to account. It's even helped people get refunds, you know, during the pandemic, when airlines weren't being transparent about their processes. It's certainly held institutions to account, and it's certainly held individuals who hold powerful safeguarding positions in organisations to account.

Speaker3:
I remember, so Diane Abbott is the first black woman MP in the UK, and I think we were in a Twitter conversation where I was mentioned, and I basically got an insight into the horrific trolling that she receives. I think I was showing some solidarity with her, and this white woman replied, and it was horrific. Horrific. So I clicked on her Twitter profile: who are you? Because I have a thing around when's the right time to engage with a troll, you know, checking: are they actually a troll? And I give a kind of makeup of that. Do they have a profile photo? Have they got certain flags in their bio? You can work out the kind of trollish account from who they've got for followers, and when you look at those, who their other accounts probably are. So I checked to see if this was even worth engaging with. And honestly, Jess and Dylan, I was horrified. This woman held a safeguarding responsibility at a youth football club in South London. And when you clicked on the link of the organization she worked with, there were young black boys and girls.

Speaker3:
She used this horrible term about Diane, and she works with those children. So I had to talk to the institution. I didn't talk to her; I talked to the institution. I said, your X, Y, Z person at your job has just said this; you need to look into this. So I think there is an importance in holding people, and institutions as well, accountable for their actions. What I think we're seeing in cancel culture is people seeking perfectionism. And I think we're also seeing a massive unfairness in cancel culture. Take Lizzo, who used the word "spaz" in her lyrics for a song. In American English, "spaz" doesn't mean what it means in British English. So there was a use of social media to call Lizzo in, and many disabled activists used it as an opportunity to educate her. But then what did we see other people do? Bring in her weight. Bring in her skin color. And they wouldn't let it go. Once she heard, she changed the lyrics immediately, and they were still going for it. Actually, there's a little bit of a mob-style torch-bearing: let's go to their houses and make them pay. There's a lot of anger that sits in it, and I don't think anger should be a principle or value that drives accountability.

Speaker2:
One of the places that you move towards in one of the last chapters of the book is collectivity, and also allyship and collective allyship. I think we've talked around that so far, but I'm wondering if you want to put a finer point on it: what do you mean by collective action, or allyship and collective allyship, in this space?

Speaker3:
We are not going to be safe online if we think this is an individual game. I talk in chapter two about the horrific experience that I went through that set up Glitch and therefore birthed my career and interest in online safety and gender. I could have stopped — and I'm not saying this in a kind of martyr-ish way — but there was an option, a choice, to just stop at: well, I got justice. I'm a local councillor, I can speak to the police and I can try to hold the people who abused me online, terrified me and threatened me accountable. But I realized that not every Black woman can go to the police the way that I can. Not every woman can do that. There is this tension between the police and the communities that I'm from. And tech companies also weren't responsive. So yes, I could have had a connection with an MP who knew somebody at Google who could help me. But that's not how it should work — backdoor dealings and all of that. It frustrates me. Let's have things out in the open, and transparency. So in order for us to be safe, we all need to use our sense of privilege. We all need collective action. Your liberation is my liberation; my liberation is your liberation. We have to take those principles when it comes to

Speaker3:
safety and freedoms, and being able to protect our freedoms online. This is a freedom of expression issue for a lot of minoritized communities. So we need collective action. We'd be mistaken — we'll be wasting our time, we'll be like the folks at the walls of Jericho who went round and round the walls over and over and over again — if we really think that how to stay safe online is about the individual. It's about our collective. It's about our collective responsibility. And knowing that in the collective we all have agency and we all have relative degrees of power. You've got the tech companies, who can do huge amounts. You've got governments, who can do huge amounts. You've got employers, who can do so much. You've got parents and teachers, who can do so much. And I try to give so many ideas to help people see that you don't have to be extreme like me and set up a charity because you want to do something about it. You can just help report stuff online. You can just reply and love-bomb minoritized voices with love hearts to help amplify their work where the algorithm shadow-bans them. There's so much we can do in collective action so that we're making a safer online experience for everybody, because our liberation and everyone's liberation is tied to it.

Speaker1:
Something that's coming to my mind as I'm hearing you speak about the ways we can cultivate allyship — with our communities, our friends, our loved ones, our digital communities — is that there's a bigger picture at play here. Individuals can do a lot, but they can only do so much, and we're also limited by the large institutions that create the platforms we have to navigate and build allyship within. So I'm wondering if you can give just a little bit of a sneak peek, because I know you go into this extensively in your book, of some of the accountability measures we can set up for large tech corporations and big tech to support this kind of allyship with each other.

Speaker3:
Well, the first thing I'll say — so it's really true to the name of your awesome podcast — is around algorithms, automated decision-making, coding and risk assessment. In all of that, I think we first of all need a code of practice around how AI and automated decision-making are deployed on social media platforms, and in that code of practice, making sure there's a gendered notion to it. Glitch has done some work with amazing organizations and lawyers on a gendered code of practice, so that we could say: this is the standard in which you can operate online, but you take into account gender and gendered harm on your platform. That's also important because we need to look at what's been happening recently. The conversation has shifted now, Dylan and Jess. We're not talking about platforms needing to do content moderation of online abuse — we're actually seeing platforms amplifying the abuse, amplifying content that is against their own community guidelines. Molly Russell was a young teenager who died by suicide. The courts in the UK ruled that social media platforms — particularly Pinterest and Instagram — had played a role in Molly dying by suicide, because the algorithms were amplifying suicidal ideation. This is different now to content moderation not being done properly — women reporting abuse and it not being taken seriously.

Speaker3:
Platforms are actively amplifying things that are against their community guidelines. We saw that with Andrew Tate over the summer. Andrew Tate is essentially an incel, but he's one of many, which is why I think it's very important to not just focus on him but on the ideology that is being spread on these platforms. It's one thing for it to exist under the name of freedom of expression — sure, we can debate that separately, but fine, let's take it that it can exist, because we all have the right to say what we need to say. But it's another thing for it to be amplified by the platforms to young boys who now believe these things. There's a BuzzFeed investigation that I highly recommend listeners search for, which looked at some of the people who were listening to and following Andrew Tate and how it radicalized them. We shouldn't be supporting any extremist conversations — any extremists on the left, the right, the up or the down. None. And that's my concern. That's what needs to be regulated — regulated by a code of practice, a risk assessment. There needs to be a set of metrics predicated on this idea of a duty of care that tech companies have to users before they roll out any more innovation on their platforms, before they're allowed to buy any more social media platforms — because that's another problem, right? We shouldn't be having this monopolizing of the market.

Speaker3:
I talk about that in the book. And the third thing I want to say about what tech companies can be doing is around promoting education, because I truly understand platforms did not create extremism. They did not create patriarchy. But to borrow the words: you're either on the side of the oppressor or you're on the side of the liberator. It's the truth. You can't be neutral; the platform cannot be neutral. We've seen what fake neutrality has done. Platforms should be trying to promote racial literacy education, decolonization education, education around patriarchy and why it's also an oppressive tool for men. They should be trying to promote this stuff on the platforms. They should be promoting the idea that we've got rights and responsibilities on the platform too. There is so much that tech companies can do. And if I had to pick one, I think it's purely about transparency. I don't get how tech companies can run and make multiple billions of pounds while paying or commissioning zero-hours contracts to agencies around the world to do content moderation with very, very little training. And we only know this information because of whistleblowers — because people are putting their lives on the line to give us this information.

Speaker3:
And I think that's wrong. I don't think our governments know near enough to do regulation, or near enough to hold tech companies to account when it comes to hearings, when it comes to speaking in parliament or speaking to public officials. We need transparency. How can tech companies be one of the only institutions, one of the only businesses, that isn't transparent? That doesn't seem fair. Financial institutions have to be transparent — after the financial crash, it was imperative; they had to be transparent. I can't buy a can of ginger beer, I can't buy a packet of sweets, without having the nutritional information to know that I'm making an informed decision about how much sugar I'm taking in, how much fat I'm consuming. Why is it that tech companies can be available 24 hours a day and we have no nutritional guidelines around what this is doing? And yet all the psychologists explain to us that too much screen time has this impact, how the algorithms are shortening our attention spans; they're worried about the repercussions of lockdown and young people being online for the last two years. We know all of this, and yet it's not transparent and we're not allowed to make informed decisions. So I really want us to be able to have consumer power when it comes to tech platforms.

Speaker2:
I imagine education is part of the reason why you wrote this book, this toolkit. And I'm sure this book will be taught in schools, at the university level and beyond. So as we move to close, I'm wondering: in an ideal world, how do you want this book to be used as it gets out into the world?

Speaker3:
Oh, it's a lovely question. Thank you very much. I want this book to inspire — and that is it, inspiring — and I'm abdicating control, because I want to be blown away by where people get to when they've realized that they can ask for more, that they deserve more, that they deserve to have freedom of speech and expression and safety online. I want to know what they get inspired and activated to go on to do, and how that shakes up policies in schools, how that shakes up policies at work, how that means we're having a public conversation — that it's in the public consciousness how we talk to each other online, how we navigate online, how platforms need to operate and provide a specific standard. Look at all the work that women and activists have done to make sure that clubs have to be safe for women. You know — security, a number you can text, a fake name you can give at the bar if you're feeling unsafe when it comes to dating or anything like that. The work that we have put on businesses to keep us safe: health and safety regulation. Take it a step further. You don't go into a restaurant expecting to get food poisoning.

Speaker3:
I just hope the book inspires those kinds of thinking, those kinds of models, that kind of inner belief and care and agency. I hope it sparks agency for people, and I hope that we can have another conversation not so long from now about joy. That's what I really want to be talking about: how do we cultivate joy and reimagine the internet, not fight these — for me — obvious things that should not happen on the platforms. You know, women have worked so hard to have equal pay, equal power and the right to exist, freedom from rape in marriage and all of that. They've worked so hard, and we're having similar conversations online — conversations we should have been having in the nineties and the early 2000s, definitely when Facebook was FaceMash or whatever it was, rating girls. The conversations we should be having now are: what does liberation look like? What does joy look like? What does amplification of voices look like? What does citizen power on platforms look like? What does it look like to elect officials on platforms? Those are the conversations I want to be having: how tech can facilitate joy.

Speaker1:
On that note, I think that brings this conversation to a close, and I'm glad that we were able to end such a difficult topic on such a light-hearted, positive and future-thinking note. So thank you so much for coming on the show, for writing this book and for sharing this knowledge with everyone. We really appreciate it, and you have definitely inspired us and our listeners. So thank you.

Speaker3:
Thank you so much for having me, Jess and Dylan.

Speaker2:
We want to thank Seyi again for joining us today and for this wonderful conversation. Jess, what's on your mind?

Speaker1:
What is on my mind is accountability for big tech, which is a difficult thing to talk about, because I feel like accountability is such a high-level, vague buzzword. Everybody's always saying we need more accountability for engineers and for large tech corporations — but what does accountability actually mean? What does it mean in the day-to-day? What does it mean in terms of protocols? What does it mean in terms of regulations? I don't know the answer to those questions, but at a high level, what was going through my mind was that there are sort of three different levels of accountability that big tech could have a role in when it comes to reducing online abuse. The first level — the lowest level, I guess — is to just stop amplifying the abuse that humans already do to one another offline, through some of the things Seyi was talking about, like better content moderation policies. The next level up is to stop perpetuating the types of abuse that we as humans already do to each other offline — whether that's bringing in more moderators or not allowing platforms to have so much anonymity; there are different interventions there. And then the third level, which I guess is the ideal level, at least the one coming to my mind right now, would be for big tech to try to fix the problems that humans already have when it comes to abusing or mistreating each other. And I don't really know what that would look like in a realistic world — I can think of idealistic, potentially unrealistic designs for it, but I don't know what it would look like realistically. So, in the spirit of the annoying high-level talk about accountability, I guess that's what was coming to my mind: these high-level ideas of what accountability actually means.
But how does that sit with you? What were you thinking about after this interview?

Speaker2:
What I really appreciated about this conversation, and also Seyi's new book, is the fact that we get some answers — at least the beginning of some answers — to the questions you're asking about how we make a change. In this case, we have a book that can serve as a resource, because something that came up a lot in this conversation was literacy and education, and how we provide tools to the people who need them most. So hopefully this book is a great next step in providing people those tools, because I think what we talked about around transparency and accountability, and how those two things are connected, is key here. And I loved what she said about food as a metaphor: food has a label on it; it needs to be clear about what is inside of it. So why do we not hold online spaces and other platforms to the same standard? For more information on today's show, please visit the episode page at radicalai.org.

Speaker1:
If you enjoyed this episode, we invite you to subscribe, rate and review the show on iTunes or your favorite podcatcher. Catch our regularly scheduled episodes the last Wednesday of every month, with some bonus episodes possibly in between. Join our conversation on Twitter at @radicalaipod. And, as always, stay radical.
