Let's Talk About Sex: Digital Pornography and LGBTQIA+ Censorship w/ Alex Monea


What is the history of digital pornography?

How do algorithms perpetuate LGBTQIA+ content censorship?

What is the role that content moderation and corporate ownership play in perpetuating misogyny and heteronormativity?

In this episode, we interview Alex Monea about the history of digital pornography and LGBTQIA+ content censorship. We also discuss his new book, published by MIT Press: "The Digital Closet: How the Internet Became Straight."

Alexander Monea is Assistant Professor in the English Department and Cultural Studies PhD Program at George Mason University. He researches data ethics and the intersection between computation and marginalization.

Follow Alex on Twitter @alex_monea.

If you enjoyed this episode, please make sure to subscribe, submit a rating and review, and connect with us on Twitter at @radicalaipod.


Relevant Resources Related to This Episode:

Ways to Purchase / Read The Digital Closet: How the Internet Became Straight


Transcript

Digital Closet.mp3: audio automatically transcribed by Sonix. This transcript may contain errors.

Speaker1:
Welcome to Radical AI, a podcast about technology, power, society, and what it means to be human in the age of information. As always, we are your hosts, Dylan and Jess, two PhD students with different backgrounds researching AI and technology ethics. And welcome to Season Three.

Speaker2:
Yay! And to kick off this third season, in our first episode we interview Alex Monea on the history of digital pornography and the suppression of sexual speech from LGBTQIA+ communities, as we explore his new book, The Digital Closet: How the Internet Became Straight.

Speaker1:
Alex Monea is Assistant Professor in the English Department and Cultural Studies PhD Program at George Mason University. He researches data ethics and the intersection between computation and marginalization.

Speaker2:
And today we're just going to jump right into it. So here's our interview with Alex Monea on his new book.

Speaker1:
We are on the line today with Alex Monea to talk about his new book, The Digital Closet: How the Internet Became Straight. And once we read the synopsis of this book, we were like, wow, we have to have Alex on this podcast, and I think you'll all see why momentarily. But Alex, could you talk a bit, high level, about what this book is and also why it was important for you to write it?

Speaker3:
Yeah, of course. And thanks for having me on, Dylan and Jess, it's great to be here. I think the major thrust of the book is arguing that the content moderation algorithms that increasingly control what is visible and what is invisible online are, in most instances, biased against LGBTQIA+ content. They tend to over-block, or identify as false positives, LGBTQIA+ community pages, sex education material, art, literature, activism, and other sorts of non-pornographic content. And in the instance of pornography, they also inordinately make LGBTQ+ pornography harder to find without really doing much to make heteronormative pornography harder to find. So across the board, whether it's pornographic content or everyday content, I think that the LGBTQ+ community online is bearing the brunt of these content moderation algorithms. And while the title might be a little bit hyperbolic to get people interested, I think there's a systematic trend across the Internet of increasingly rendering LGBTQ+ content invisible online.

Speaker2:
So actually, let's go back to the title of this book, and specifically the second half: How the Internet Became Straight. So without just asking the question "how did the Internet become straight?", I think I'm specifically wondering if you can tell us a bit of the history of digital pornography in general and how it's been policed and regulated online.

Speaker3:
So pornography online, especially its moderation by Internet platforms and ISPs, is a really interesting part of Internet history that isn't dug into much at all. So, for instance, Google from its inception has had a huge interest in understanding what pornography is, what signals indicate that a piece of content is pornographic, so that they can control when and where pornography shows up. There's a consistent claim across the industry that advertisers are very hesitant to have their ads appear alongside pornographic search results. And there's also a very big concern that organic search results that are non-pornographic might have a pornographic ad next to them. So in either instance, they don't want pornography showing up when it's unexpected, and for them this is central to their monetization scheme. So the way that Google derives money through advertising is threatened by not being able to control when and where pornography shows up. And one of the things that I found that was really interesting, you know, I came at this by researching Google, most of my earlier research was on Google, was that Matt Cutts, who was the head of spam, making sure that spam didn't destroy Google's search results, was first tasked when he joined the company over 20 years ago with developing a porn filter. And so he would try to find textual classifiers, right? So keywords that indicated pornography, link topologies that indicated pornography, and other sorts of user behaviors, such as click-through and amount of time spent on site, to try to build classifiers for pornography.
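To make the idea of a text-signal porn classifier concrete, here is a minimal, purely illustrative sketch. It is not Google's filter; the toy pages, labels, keyword features, and use of scikit-learn are all assumptions made for illustration, and it immediately runs into the false-positive problem discussed in this interview.

```python
# Illustrative sketch only: a toy "porn filter" built from text signals.
# NOT Google's actual system; training pages and labels are made up.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

pages = [
    "free xxx sex videos hot adult premium membership",   # labeled pornographic
    "museum exhibit of renaissance oil painting",          # labeled non-pornographic
    "explicit adult cam show join now members only",       # labeled pornographic
    "city council meeting minutes and budget proposal",    # labeled non-pornographic
]
labels = [1, 0, 1, 0]

# Keyword (n-gram) features feed a simple linear classifier.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(pages, labels)

# A keyword model like this scores pages containing words such as "sex"
# toward the pornographic class regardless of context, which is exactly the
# over-blocking of sex education and LGBTQ+ health content described here.
print(model.predict_proba(["LGBTQ sex education and sexual health resources"])[0])
```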

Speaker3:
And there are stories of him running around the office in 2000, 2001, trying to get colleagues to break his algorithm by locating porn on Google that he was trying to block. And so for 20 years, Google has been really invested in trying to control when and where pornography shows up, and it's linked to a lot of significant developments at the company. So their research into computer vision actually largely developed out of pornography censorship. So the algorithms that we use to tag photos in Google Photos or to use Google Image Search, those algorithms all have their roots in porn censorship at Google. So it's been a historically very important but invisibilized aspect of running an Internet platform that's been central to the business models of a lot of these companies, as the Web turned into an increasingly free service with advertising-based revenue generation, or what Shoshana Zuboff calls surveillance capitalism. Right. And so I think you see it developing across the Web; there are probably similar stories at all of these companies. And I just think the shift really happens when you move from, I don't know, what some people might call Web 1.0 to Web 2.0.

Speaker3:
I'd call it a shift towards platformization, right? Once the Internet stops being this sort of, like, Wild West that we imagine it, where anyone with an Internet connection can throw up a GeoCities page or an Angelfire page and show up in the top ten results on Google, and instead, you know, most people experience the Internet through the lens of a few specific platforms, whether it's Instagram, Facebook, Twitter, or even Google search results, you've really concentrated the ability to control what is visible, what's discoverable online into the hands of very few people. Because not only is this a small number of companies, these companies are usually small: they have few employees, and they have even fewer people working on these sorts of policies that determine what content is visible, these sorts of algorithms that try to automate it, and the offshore content moderation labor that they purchase to try to fix any gaps in that system that they built. So porn censorship has always been central to the business of Internet platforms, and as they increasingly took control of the Internet, the heteronormative biases of this small handful of companies, and of the small handful of people working at them, exercise an outsized influence on how we experience the Internet today.

Speaker1:
So this is striking for me as I also think about the rise of OnlyFans and, I guess, the narrative or the mythos that pornography is actually more in the hands of creators. And so it sounds like we have both of those things happening: that dynamic of people being able to create their own content, and then also this massive restriction and surveillance of legitimate porn versus non-legitimate porn, and where porn can be posted, and this moderation. Can you unpack that a little bit?

Speaker3:
I'd say that when I start talking about this with people, most frequently, if I'm talking to a cisgender and hetero person, they're very surprised by this and they assume that this can't be true, whereas when I present this to nonbinary people, trans people, queer people, a lot of them say, well, duh, of course this is the case, everyone's been talking about it for years, right? So part of it is really just, like, what corners of the Internet you spend time on, which content creators you're talking to; that gives you a sense of this. There's definitely a pervading mythos that, you know, the Internet brought, like, a tidal wave of pornography that was uncontrollable, that everyone gave up on regulating it post-1990s, and that no one's been organizing against it ever since, right? Which, as I show and unpack in the book, is really a bizarre ideology we have, best captured in the Internet meme of Rule 34, right: if it exists, there is porn of it. Whereas the opposite tends to be true, right? People have been very motivated and successfully organizing against pornography online without cessation since the late 1990s. They've had signal victories of getting it kicked off platforms, and they have had undue influence on some of these platforms' content moderation policies.

Speaker3:
So there's, on the one hand, this aspect of what's going on on the quote-unquote legitimate platforms, the non-pornographic platforms that tend to block all pornography and, in so doing, over-block a lot of non-pornographic content. I think your question has more to do with the little corners of the web that are left for pornographic content. And what I found when I looked into those was a number of things, right? So first off, the idea that amateur content is free from the strictures of the mainstream pornography industry doesn't often tend to be the case, for a few reasons. One, the porn industry frequently understands amateur pornography as sort of its research and development laboratory, where they don't have to invest any new money but can find new trends that they might then commercialize to greater success. So they actually generate a lot of money for mainstream porn producers by serving as that research laboratory. Second, if you're trying to get noticed, you know, you have to be in the top 0.1% of performers on OnlyFans or on Pornhub to really make any money; you have to work with the system that you've been provided with. And the analyses that have been done tend to show that the metadata that tube sites use to classify pornography is profoundly heteronormative and exercises an influence on what pornographic content even gets made.

Speaker3:
Right. So if you're an amateur pornographer and you're looking to make money by doing that on the Internet, you're going to look at the metadata that you can tag your content with, the ways that you can make it discoverable on a Pornhub or on an OnlyFans, and that tends to heteronormalize the content as well. And in what studies have been done, you know, it's usually white cis women that are dominating these platforms, and the more marginalized intersections of identity you participate in, the more likely you are to make less money, to be hounded by trolls, to face online and offline precarity, and to be shadow-banned, censored, or demonetized. So there's a lot of different angles coming at it there. But I think the case is that a lot of these sort of corners of the Internet where porn is left to exist, because of the metadata structures, because they're small spaces that are easy for the highly invested, vertically integrated porn companies to dominate, because of preexisting social biases, they tend to continue to privilege heteronormative porn and have a lot of downsides for queer pornographers, feminist pornographers, things like that.

Speaker2:
So we've done a pretty good job at this point of describing the landscape of some of these heteronormative practices; like, you describe metadata as one example, and also, earlier, I think you mentioned that there are just heteronormative biases in general that are sort of permeating from these large tech companies and influencing the content that people are able to discover in the first place. And one of the things that we love to do on this show is get specific. So do you have any examples of metadata or biases that might, I guess, promote this heteronormative agenda? And also, actually, while we're at it, can we define heteronormativity? Because I feel like this word is used pretty often in the discipline of tech ethics, and I don't know if there's a common definition that we're all grounded in. I'm curious if you think there's one. So maybe we'll start there and then we can go into some examples.

Speaker3:
Yeah, thanks. That's super helpful. I'll say that the hardest part of the book for me to write was trying to define heteronormativity, right? It's one of these concepts, it's almost like what Supreme Court Justice Potter Stewart said about pornography, right? I know it when I see it, but it's hard to define. Heteronormativity is one of these sort of intuitive concepts that I think a lot of us have a grasp on and could maybe point out an example of when it's happening, but trying to give a formal definition of it is quite difficult. And when I was working on the book, I think one of the reasons that I found for that is that heterosexuality is a really difficult concept to define. It's shot through with tons of internal contradictions and ambiguities, and that's part of its power, right? How it's able to persist over time, mutate to different contexts, avoid critiques, and sort of dominate is because it is flexible and can move from definition to definition rather fluidly. So when you're thinking about heteronormativity, it's going to be things that privilege heterosexuality, and so you have to define what the hell heterosexuality is, which is sort of the trick, right? When the Puritans came to the US, sexual norms in the US were largely based around procreative sex, right? You weren't supposed to enjoy sex, you weren't supposed to have sex unless you were trying to create a child.

Speaker3:
And this sort of dominated sexual ethics in the US for, I don't know, 200 years; of course there were deviations from this, and regimes of discipline to sort of keep people in line with it, but this is really the norm. In the 19th century, you have a sort of competing definition that starts to evolve, right? Originally, heterosexuality emerges in European psychiatric discourses to describe people that have an abnormal and unhealthy desire for opposite-sex coupling. Right. So heterosexuality is first used as a term to describe men and women that can't stop having sex with each other when they're not trying to procreate. But it quickly becomes a discourse that understands a biological drive towards sex, towards orgasm, and that's really in tension with the procreative drive, right? So you have this sort of theological ethic around procreative sex that comes into conflict with a more evolutionary, psychiatric argument around a biological drive towards sex and orgasm, and the two sort of fuse into a compact or an agreement in the late 1800s and early 1900s, where it's understood that people have a drive towards sex (primarily this is understood as white, cisgender, middle-class men having a drive towards sex) and it is increasingly acceptable to funnel that into non-procreative sex.

Speaker3:
First, you can have sex with your spouse when you're not trying to procreate. Then there's some slippage and it becomes: you can have sex with someone that you are in love with and intend to marry, but before you're married. And this sort of constant conflict between the procreative impulse and the libidinal impulse is what continues to structure heterosexuality, and you keep reaching these agreements. One example I give in the book is that I think you can see a really interesting instance of this in the polyamory movement today. Right. It's able to achieve more public legitimacy and visibility because it's couching the idea of multiple sex partners in the language of love and familial dynamics and long-term relationships. And so there's this sort of nexus between the nuclear family and procreation, between anatomical, like, physiological drives towards orgasm, and between love or amorousness, that sort of builds the center of heterosexuality. And I think a lot of the normativity is trying to push people into the cultural forms that that gets understood through, whether that's through traditional gender roles, whether it's through cisnormative understandings of anatomical sex, or whether it's through understandings of what kinds of sex acts are normal and where you can engage in them. All of that stuff tends to be way more flexible and contextual than this core of, like, procreativeness and libidinal desire.

Speaker3:
But because the concept ambulates between those two poles and is flexible to circumstances, it can take account of that, right? And so there's a lot of different things that define or participate in heterosexuality but aren't necessarily dealbreakers. Right. So, like, you can think of heterosexuality as being two people rather than multiple people. You can think of heterosexuality as being conducted in private rather than in public. You can think of it as being conducted with someone that's the opposite gender at birth rather than the same gender or a nonbinary gender. None of these are dealbreakers. And when you start applying it to concrete instances, you can quickly find contradictions that make no sense. Right. So if we go with the standard definition of heterosexuality: if a trans man and a trans woman partner up, are they straight? Heterosexuality has a really hard time explaining things like that, right? So it's just a really weird, vague concept that gets tied into the politics of race, class, and gender in concrete instances, and that really shapes how it's deployed and how the norms evolve. But the kernel has to do with this sort of ambiguous tension between the procreative impulse and libidinal sexual desire. Sorry, that's too long.

Speaker1:
Oh, that was great, it's super helpful. And one thing that I was thinking of while you were talking is identity in general and identity politics, and also that transmission of non-digital identity, and how that gets translated into digital spaces, including things like stigma. I was struck when you said, you know, a lot of people aren't writing about this, and one of my questions, you know, in my head is why. But I think my question to you is a bigger question of identity and identity politics. And maybe we'll look at the specifics: the alt-right is doing something here, and there are other politics that are at play. I know you use that as a case study, so maybe using that as a case study, let's enter back into digital identity and the specific identities that we're looking at, and why these are the ones that are continuing to be targeted in this space.

Speaker3:
Yeah, sure. So it's really an interesting alliance that gets built, even though it's often tacit and the participants aren't all in communication with each other. There's a sort of shared agenda across anti-porn feminism, evangelical Christian conservatism, and the alt-right online today in terms of their opposition to pornography. Each of them comes at it from a very different angle or arrives there from a very different route, but they all tend to share a commitment to censoring pornography, to getting it offline, and they have been quite successful in doing that. In the book, I spend a lot more time examining the path that the alt-right takes to get to this position, largely because the sort of alliance between Christian conservatives and anti-porn feminists is well-trodden history, right? It's catalogued in a lot of books and articles; anyone that starts reading about the topic is going to find that center stage from the seventies, eighties, and nineties. The alt-right is a weird new addition to what I call this alliance of strange bedfellows, and it arrives there really from a path of misogyny, right? When you start looking at different branches of what gets termed the alt-right, there's really not a lot of coherent connection across them. Somehow they all manage to collectively organize and rally support around electoral candidates, but there's not really much of a shared ideology outside of a strong commitment to misogyny and, to a lesser extent, but still very horrifying, to white supremacism or, in the worst cases, ethnonationalism.

Speaker3:
But misogyny, to me, seemed to be the thing that connected them the most strongly. And I think in a lot of instances the anti-pornography stance comes about through an anger at women being able to successfully monetize sexuality. They understand themselves as having a sort of natural claim to ownership over women's bodies and access to them for sex, and so the fact that not only are they not getting sex in the real world, but they're having to pay for it online, really irritates them. Right. And so that's a huge organizing trigger for a lot of these folks. I also see a lot of people talking about returns to the masculinity of yesteryear and traditional family and gender roles, and in doing so they tend to bring with that a lot of cis- and heteronormative baggage, and it's a major connection point with evangelical Christian conservatives: returns to family values, returns to traditional gender roles. The Proud Boys are huge on this, for instance. And so I think there's a lot of ways that they arrive at this anti-pornography stance, whether it's wanting to return to a nuclear family, whether it's wanting to assert their dominance over women and their right to access them without paying, whether it's the fear that the fiscal empowerment that it brings will increase the capacity of feminism to destroy their rights, or however they want to interpret it.

Speaker3:
They arrive at an anti-pornography stance and they organize around it frequently. So one of the things I cover in the book is what the alt-right called the THOT audit. THOT, for viewers who don't know, is an initialism that stands for "that hoe over there," and it's used to refer to women that are attractive but assumed to be sexually available. Right. And the THOT audit, for the alt-right, was an attempt to systematically report sex workers online to the IRS for not paying appropriate taxes. They found, though, that reporting someone to the IRS is way more difficult than you'd think: you have to have, like, their Social Security number, their name, their address, all this information about them; it's like a 30-page document or something that you have to fill out. And the goal was to get a reward: if you report someone successfully to the IRS, you get, like, a 30% reward on the money they recoup. But quickly the alt-right found that this was way too much work and that it wasn't having the intended effect. And so they pivoted, and they started using web crawlers to crawl different sex work websites, including places like OnlyFans, to pull usernames and, most importantly, links to other sites.

Speaker3:
Right. So most sex workers have to advertise themselves in clothing, or within the community guidelines, on major social media platforms. Right. So OnlyFans doesn't have a search feature, so if you want to drive people to your page, you have to have an Instagram and a Twitter and a Facebook or whatever, where you sort of produce content that is risqué but will be allowed to stay on the platform, and then you have to drive people to your OnlyFans page. And because of this, there's often a nexus of links across all of your content that can be used to identify you. And so the alt-right put together web crawlers to sort of grab this data, and they used it to start systematically reporting sex workers on social media platforms and on financial service platforms online in an attempt to deplatform them. And it was very successful, right, although limited in its duration. Well, the THOT audit as a term is limited in its duration; people are still doing this, they just don't call it that, and it's not organized the same way. But the de-platforming of sex workers was a huge example of the alt-right's ability to mobilize and to leverage platform politics against sex workers. And particularly, the people that drew their ire were LGBTQ+ people, people of color, people with disabilities, people that were fat, people that had any other form of marginalization that would trigger an alt-right crowd.

Speaker2:
Yeah. Let's actually pause on that topic for a second, because for a little while there, when I was hearing you describe these attacks against sex workers, it sounded a bit like there was just an urge for suppression of digital pornography in general, and until that last sentence that you said, I was curious about how this suppression impacts LGBTQIA+ folks specifically, more so than just anybody who falls outside of that group. And so, I guess, maybe bringing it back a little bit to technology specifically: something that you mentioned at the beginning of this interview that's one of the big topics in your book is content moderation. And so I'm curious: how is content moderation, whether it's manually done by humans or automatically done by machine learning algorithms, policing or suppressing pornographic content in a way that is unfair, more so for people who are LGBTQIA+ than anybody else?

Speaker3:
Sure. And I think that gets at some of the heart of the new findings in the book. So I think there are two aspects here that you can look at. You can look at the algorithms themselves that are sort of automating content moderation online, and then you can look at the human element: the policies that are made, as well as the way that they're implemented by human content moderators, whether they're co-located with a company like Facebook or offshore in the Philippines or in India. As far as the algorithms go, the one that I had the most access to analyze was Google SafeSearch, because Google opened up its computer vision algorithm, which they call the Cloud Vision API, to developers. And so through developer tools you're able to toy with it a little bit and get some sense of how it's working. And there was some public record on the data sets that it was trained with, and so those were also accessible. I found some things when I was looking at the way that Google image search worked, as an example of the algorithms that a lot of these companies have that operate very similarly and are trained on similar data sets. First off, the data set that it's trained on is hugely biased. So SafeSearch, image recognition technologies at Google, things like that are largely trained on the dataset ImageNet, which was implemented in the late aughts and was basically a large repository of labeled images that you could use to train computer vision algorithms.

Speaker3:
It drew all of its ontology, the words that could be used to describe photos, from WordNet, which is a similar set of language terms from the nineties, and both WordNet and ImageNet, because it draws on WordNet for all of its classifiers, are filled with heteronormative bias. So, for instance, WordNet connects homosexuality to pedophilia and to bestiality in its ontology. It connects miscegenation to interspecies breeding. It connects masturbation to self-abuse. It really is sort of a catchall of historically leveraged anachronisms for slandering the queer community. All of those sort of get embedded in the ontology that is used in the data set that these algorithms are trained on. Additionally, when you look at the actual image dataset, the pictures that are available that are teaching these algorithms how to see, you find more biases. So, for instance, ImageNet contained a category for "sluts," and the entire category is filled with what labelers thought were scantily clad women, and interspersed there are a few, presumably, trans women. But for the most part, the entire category of what it means to be derogatorily sexually available is scantily clad women. Right. A social bias, an understanding that anyone that is trans is thus also a "slut," is also preserved in this category. Similarly, they have a category for what they call "closet queens," which they describe as closeted gay men.
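For readers who want to inspect this kind of lexical ontology themselves, here is a small sketch using NLTK's copy of WordNet. The term "homosexuality" is taken from the discussion above, but the exact synsets, definitions, and hypernym paths returned depend on the WordNet version installed, so treat the output as something to audit rather than a fixed result.

```python
# Sketch of auditing WordNet's ontology for a loaded term, using NLTK.
# Results vary by WordNet version; this only shows how to surface the
# definitions and hypernym paths that dataset builders inherit.
import nltk
nltk.download("wordnet", quiet=True)
from nltk.corpus import wordnet as wn

term = "homosexuality"  # term discussed above; swap in any word to audit
for synset in wn.synsets(term):
    print(synset.name(), "-", synset.definition())
    # Hypernym paths show where the concept sits in the ontology,
    # i.e. which broader categories it gets filed under.
    for path in synset.hypernym_paths():
        print("   ", " -> ".join(s.name() for s in path))
```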

Speaker3:
Now, the category isn't very populated; they don't have a ton of images for it, and the ones they do have are comically bad, right? It's, like, two leather daddies holding each other. Right. But there are problems here as well. Right. So a lot of the images that they do have are men hugging, particularly if one man is hugging a man from behind. Right. And there's a sense here that male intimacy is weird and is inherently problematic, or needs to be hidden and thus unearthed. There's a sense here of sort of producing automated algorithmic gaydar, which is quite scary, and having these sorts of categories in the ontology calls for their population. Right. So if I'm farming out tasks on Amazon Mechanical Turk or a similar platform and I give people the option to label things with "slut" or with "closet queen" or with things like that, I'm calling those things into existence and inviting them to populate my dataset. How these things actually impact the way that Google SafeSearch is working is hard to trace out exactly, right, because I don't have access to the full algorithm to really run a lot of examples through and figure out what's going on. I did start to run some datasets through and found some interesting things, though. One was that it had a high tendency to classify art as pornography, and this was even art that Google has indexed in Google Image Search and knows is art.

Speaker3:
So I can go to Google Image Search, search for something like the Venus de Milo, download the picture, and then run it through its API, and then it will tell me that it's porn. It's even worse with stuff that isn't classified as art, right? Stuff that isn't in a museum setting, and that Google thus knows is, like, a historically important oil painting or sculpture. So things like community art or fan art or things like that. That means that the classifier is much worse at distinguishing between pornography and art when it comes to community art, low-budget art, fan art, things like that. And lots of historians of museums and of art history have demonstrated the sort of power relations that the museum holds in society as a legitimator of what constitutes art, and it's oftentimes at the expense of queer communities. And so one of the things that's clearly going to be happening is that community art and underfunded art and art underrepresented in museums, like that from the LGBTQ+ community, is going to be unduly censored. Some of the other instances I found are harder to relate to an exact training mechanism in SafeSearch, but there are tons of other examples of content being over-broadly blocked. I'm also happy to talk about sort of the human end of moderation and the policies that companies like Facebook put into place to determine what's acceptable on the platform.
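The experiment Monea describes, downloading an artwork image and running it through the Cloud Vision API's SafeSearch annotation, can be reproduced along these lines. This is a minimal sketch assuming you have the google-cloud-vision package installed and Google Cloud credentials configured; the filename is a placeholder, and the API returns likelihood ratings rather than a hard porn/not-porn verdict.

```python
# Minimal sketch of running one image through Cloud Vision's SafeSearch
# annotation, the API discussed above. Requires the google-cloud-vision
# package and GCP credentials; "venus_de_milo.jpg" is a placeholder filename.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("venus_de_milo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

annotation = client.safe_search_detection(image=image).safe_search_annotation

# Each field is a likelihood enum (VERY_UNLIKELY through VERY_LIKELY).
for field in ("adult", "racy", "violence", "medical", "spoof"):
    print(field, vision.Likelihood(getattr(annotation, field)).name)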

Speaker1:
Maybe we can do that through... I'm going to put my designer hat on for a second, where all of this sounds really overwhelming: you have the social dynamic, you have the socio-technical dynamic, you have the technical recommender system dynamic. There's a lot of problems here. And I'm not going to ask you the obnoxious question, which is "what should we do?", but I'm going to ask you the obnoxious question of what do we do to begin to dismantle these systems? Is it political organizing? Is it maybe the designers who are building out these algorithms, and just awareness and education? Is it something that the consumers can do, or all of the above, or none of the above?

Speaker3:
Partly I want to hedge on this and say that I come at this from a privileged position. You know, I'm a white male academic who views it from that perspective, and it really needs the contribution of a lot more folks than me, even if that means them saying everything in the book I did is wrong and we need to start over. And so I'd really want feedback from listeners, from people on the street that are organizing, from sex workers, on whether some of these are the right approaches. But I do try to outline some of them, because I think that it's generous and important to do in a book, even if it's the point where you're most vulnerable, because you're most easily demonstrated to be wrong. And I think it is a mix of all of the above, right? I think these data sets are really pernicious because they oftentimes become benchmarks, and then it's really hard to change them in retrospect, right? So even if you want to change ImageNet now, it disrupts its ability to serve as a benchmark for comparing computer science projects over the past decade. Right. So getting data sets right up front is hugely important. And I think that the hacker ethos that pervades Silicon Valley is detrimental to this, right? The idea that the best way to move forward is to slap something together and then patch it as it breaks is a problematic way to approach things that are going to become standards. And so I think investing upfront to build data sets the right way, to analyze them for bias, rather than just looking for the cheapest, most easily available thing and fixing it later, would pay dividends.

Speaker3:
I think doing that is going to require more diversity in the workforce. When you start looking at their policies, it's very clear that there aren't enough trans people in the room, because so many of the policies these companies make assume that anatomical definitions of gender are easy sort of starting points for building policy, right? Almost all of these companies have an understanding of quote-unquote "female nipples," right, as, like, the line you can't cross in terms of content on their sites. And when you have trans people that are transitioning posting pictures of their chest at each period during their hormone treatment, asking when they're being censored, it throws these sorts of policies into jeopardy. When you have men with gynecomastia getting censored for this, it really shows that these sorts of axiomatic tenets that they're taking wouldn't be there if they had better input at the beginning. So having more diversity on the teams, and not just at mid-level management, I mean actually on the research projects as they're determining what data set to use, what parameters to use on the algorithm, what sensitivity to set it to, what accuracy constitutes a publishable or implementable result, things like that, I think, would be helpful. From the outside, I don't know about you all, but I don't really trust Mark Zuckerberg to make the right decision, or Jeff Bezos to make the right decision.

Speaker3:
I don't think that a few billionaires are going to make social progress for us, and so I think we need watchdog groups outside of these organizations that collect data and that produce reports and that can effectively advocate for users. So I would love to have an organization that collects data on queer content being censored online, produces reports on where it's being censored and why, analyzes the sort of impact it's having, and can hold these companies accountable in a language that they respond to, right, with some quantitative data. I also think that using our regulatory bodies more effectively to control this is important, although you run into the problem there that we also have to fix our democracy so that our regulatory bodies are responsive to our citizens, so you start running into bigger issues. I think that demanding that companies implement anti-censorship commitments would be an important step forward. And really, I mean, I'm probably more radical, and so I think about things like running ISPs and social media platforms as public utilities as advantageous, and kicking the billionaires out. I think really the movement ought to be towards social media that is built for everyone and that facilitates encounters with difference, while sort of preserving some of the lessons we've learned over the past two decades about how anonymity doesn't share its benefits with everyone equally and how we can maintain accountability while also having an Internet that is heterogeneous and surprising.

Speaker2:
Well, unfortunately, Alex, as usual, we could talk about this topic for a much longer time than we have, but we are reaching the end of our time. So, I know you mentioned before, if people are interested in this topic, but also if they maybe have other ideas for ways that we can improve on a lot of the themes that you mentioned in your book, is there a best way for them to get in touch with you or to connect with the research community that's doing this work?

Speaker3:
Yeah. I mean, they can @ me on Twitter or email me, both of which would be welcomed, even if they just want to give examples of how this has happened to them, because I'm starting to try to collate them and make sure that they are more permanent. One of the issues was that people would make complaints that their content was getting taken down, but those complaints are ephemeral: they're tweets that get deleted or that are hard to see once the account goes private. There's really no archive of the stuff that's been lost, and so I'd love to hear from people on that. As far as the research community: get in touch with me about building it. I haven't found it yet, and I'd like to see it.

Speaker1:
Sure. Well, again, for listeners, and we'll say this in the intro and the outro, we'll be plugging it constantly: the name of this book is The Digital Closet: How the Internet Became Straight, published by MIT Press and, of course, written by Alex Monea. Alex, thank you so much for joining us today.

Speaker3:
Thanks for having me. It's been a pleasure to be here.

Speaker2:
We want to thank Alex again for joining us today for this wonderful conversation. And as usual, now it is time for Dylan and me to debrief our immediate reactions and initial takeaways from this conversation, which, if you didn't know, we do at the end of every episode, at least for a few minutes. So, Dylan, let's do the thing that we do where we talk about our immediate reactions. What's yours?

Speaker1:
Yeah, you know, sex, sexuality, sex work, pornography: these are topics we haven't covered that much on the show, and topics that, even when we were brainstorming the Radical AI podcast in the first place, were, like, some of the big topics that we had written up on our little planning whiteboard, including maybe looking at, like, sex robots or looking at OnlyFans or looking at something, because there are so many important conversations that are happening in this space that are under-covered, that are not as fully... what's the word I'm looking for? Underrepresented.

Speaker2:
Represented.

Speaker1:
Represented. Yeah, underrepresented. And that was something that I'm thinking about right now in what Alex said about the research community, but then also probably in a lot of elements of industry, or areas of folks who are designing technological systems and websites and social media platforms. There aren't a lot of people, unless you're in, like, the porn industry, in which case you're probably pretty specifically focused on pornography and sexuality and things like that, there aren't a lot of folks who are looking at this through a critical eye. And this conversation reminded me how important it is that we begin to have these conversations about representation, not just about the topic, but about the identities and the digital identities that are playing out in this online pornographic space and beyond. And it made me wonder, you know, what are the barriers coming from an academic space right now? Like, why aren't people talking about this? Like, sexuality is part of our human experience, and yet there's something about pornography specifically that I, at least, have seen very few people talking about. And so it makes me wonder: is there still a stigma around it? Is it just not seen as legitimate in scholarship? This interview raised lots of questions for me. Jess, what are you thinking about?

Speaker2:
It's interesting hearing you say, like, basically, we're humans and sex is a part of being human, and it's a topic that is just not publicly spoken about regularly; like, it's sort of this taboo topic in most circles, especially in professional circles and in, like, academia. And it actually reminds me a bit of the research that you're working on right now, Dylan, with death. Like, the other thing that comes to my mind that is, like, so inherently human, in fact, actually the only guarantee that we have as humans, is dying, and it's one of the most taboo topics that people refuse to engage with and talk about. And so I totally agree. I think it's fascinating that this is a concept that surrounds so much of humanity and what it means to be human and creating new humans, but it's something that is given so little time in the spotlight. And another word that you said that I really was reflecting on was representation, because it's interesting, there are almost, like, two sides to this. So there's, like, representation... there are actually probably many more sides to this, but I'll focus on two. So there's representation in terms of, like, who are the voices and identities being represented in explicit material, in pornographic content online, and is it equally balanced to represent the identities that we see in the real world? And then on the other side, you have this representation of identity as seen through the machines and as seen through the algorithms.

Speaker2:
And this part of the conversation that I latched on to, of course, being my, like, nerdy computer science self, was when we were talking about the algorithms and the training data and this ImageNet data set that's used as a benchmark. And it's fascinating, because I think about representation in that data set as almost, like, a bad thing. It's definitely a contentious topic, because I'm currently working on computer vision for my internship, and I understand that computer vision data sets should be representative and have all people from all different backgrounds who look all different kinds of ways, because we want to make sure that these models and algorithms are robust to all different kinds of people. But when we're annotating these data sets and attempting to categorize these people, it's almost like representation isn't great if the annotations are malicious. Like, in the examples that Alex gave, where they had, like, the "closet queens" annotation or, like, the "slut" annotation, like, in those circumstances, representation is kind of being, like, defamed in a way. Like, representation is not good if the things that people are extrapolating from this representation, or annotating from this representation, are not beneficial to the community that's being represented. Does that kind of make sense?

Speaker1:
Absolutely. And it points to the massive complexities and systems that are at play here, both technical and also social. And one of the parts of this interview that also stood out to me is that even these annotation categories are not acontextual. They're coming from a history, a social history that is, I guess, millennia in the making, that has encoded over time different understandings of sex and sexuality. And obviously there's a huge diversity across the world and across time and space about what sex looks like, what it means, what it signifies. But right now, the categories, annotations, and benchmarks of identity that are being embedded into some of these technological systems are based within a societal status quo, which is largely heteronormative. And I think it's really important for all of us, folks who are studying this and folks who are designing these technological systems, to remember that we are living in a history, which also means that we have agency in determining what the future is of some of these categories. But the work can't just happen in the technological space, although that's an important space for us to work in. It also needs to happen in a broader societal space.

Speaker2:
And it is interesting, like, thinking about ways to solve this, and going into, I guess, like, the tail end of the conversation that we had with Alex, because I agree with a lot of the ideas that he had. Like, I love the idea of coming up with more robust benchmark data sets and better annotation guidelines that are less Western-centric, that are less heteronormative, a little bit more inclusive, that definitely don't include, like, a "slut" category, for example. That would be a great start, but he's totally right that these data sets are so entrenched and ingrained in the foundation of the machine learning research community, and even the practical community that lives within industry is using these data sets as benchmarks to test their own models as well. They've been used for so long that it's so hard to convince people to change these data sets, or to come up with a new one, or to retroactively go back and modify and alter the research that was conducted with these data sets. And so I'm sitting in a space of, like, hopefulness for the future, that we have good people working on this sort of problem and coming up with new annotation categories that are more inclusive and new benchmark data sets that are more robust and have been tested for, like, ethical harms. But I'm also skeptical that this is a change that will happen any time soon, because it's going to take a long time for people to adopt these new data sets that are more robust, and then an even longer time for a data set to be trusted as a benchmark and to be used for long enough to be compared to other data sets and to be able to be used to compare models in the way that benchmarks are used.

Speaker1:
And one thing I'm thinking a lot about is the role of the consumer in this case, especially the role of the pornography consumer. Because if you look at the numbers of people who consume pornography on a daily basis, and I'm also thinking especially of folks who are coming of age in their teenage years, and some of the representation that folks are seeing, it really makes a difference. And so one question that I'm still sitting with is: how do we ethically consume pornography, or is there a way to ethically consume pornography, as it's currently represented in our technical systems? So I'm just sitting with that, but I know we don't have that much more time.

Speaker2:
And of course, as we promised, we do have information on how to obtain this aforementioned book that we are raving about in this episode. So if you would like to get yourself a copy of Alex's book, we would like to direct you to our show notes page, where you can also find more information on today's show in general, at radicalai.org. And on the show notes you'll find a few links to either purchase the book through MIT Press or purchase the book through Alex's personal website. Or, if you would like to purchase the book through a local indie bookstore near you, there is a nifty little tool that we have linked on our site to help you find the place closest to you to do that.

Speaker1:
Again, welcome back, and welcome to Season Three. We're so excited for the lineup of episodes that we have coming to you, monthly at this point, unless there are any special bonus episode surprises, which we will see. But as always, if you enjoyed this episode, we invite you to subscribe, rate, and review the show on iTunes or your favorite podcatcher, and, of course, as I just said, catch our regularly scheduled episodes. Join our conversation on Twitter at @radicalaipod. And as always, stay radical.
