Tim Panton: This is the Distributed Future podcast where we talk to experts in their field to try and understand what the future might look like, particularly the future of the intersection between tech and society. And I'm Tim Panton.
Vimla Appadoo: And I'm Vimla Appadoo.
Tim: And I'd like our guest for this episode to introduce themselves now, please.
Karen Reilly: Hi, I'm Karen Reilly and I've worked as a fundraiser and product manager for various open source projects, particularly security and censorship circumvention.
Tim: So, open source in the kind of really nitty gritty important space here, then. Not trivial stuff.
Karen: No. And not only is the technology that underlies censorship circumvention and security very complex-- there's a lot of cryptography to it-- but the end users that we're talking about are journalists and activists and everyday people who are facing censorship. And the regimes behind that are well-resourced and they do not shy away from violence, so the stakes are very high. With that comes a lot of responsibility not to go in as a group of primarily US, UK, and EU citizens saying, "We are the pale saviours of your movement!" That's one of the ways I've come to discussions about inclusion and diversity and equity in open source, because the first project that I really worked on was one of those things where you cannot come at it without inclusion, without an awareness of the power dynamics that led to the situation your end users are in-- which includes a lot of colonialism. If you're talking about Iran, you're talking about the repercussions of what the US and UK governments did in 1953. And you have a duty to talk to people about what they need before you make mistakes that could potentially mean physical harm. So not only do I come at this from a feminist perspective, from the perspective that technology should resemble society at large-- you know, society is not just made up of cisgender, heterosexual White men; it's made up of more people than that, so open source communities shouldn't be dominated by people who are pale and male-- but I also come at it from a very practical standpoint of knowing what your technology is going to do, and consulting with the people that you aim to serve. And I use the word 'serve' very deliberately. You should ask people what they need. And the best way to do that is not to develop technology from afar and bestow it on the end users, but to talk to the people from these communities, which also include a fair number of software developers themselves.
One of the people who speaks most strongly about this is Nighat Dad from the Digital Rights Foundation, who focuses on helping people facing harassment, particularly online, in Pakistan. She'll say, "Look, we don't need a White dude from Silicon Valley to make an app for us. Whatever we need to develop, we can make ourselves." So in a lot of ways, it's easy to fall prey to this idea in technology and in open source that we're making technology because people can't make things for themselves, when it's more a case of time and resources-- particularly because the people who would make that technology are not getting paid for it. It's not that they can't, it's that they don't have the resources and support necessary to do it.
Vimla: I think as someone who is not... I'm not as technical-minded, I work in experience and design. And it really resonates with me in understanding what communities need and the cultural awareness that sits around that. The bit that I probably would like a bit more context on is the reality of doing that in your work. What does it look like to have those conversations in your field, and who in your team is having them?
Karen: In the past, I've worked with developers who need to be convinced of the value of user experience, of design, or even of a graphical user interface in the first place. I did talk to one programmer who said, "Well, if people want security, they should just learn how to use the command line." [laughter] You know, I've taught journalists how to encrypt email. And yes, when I'm frustrated with Enigmail and Thunderbird, I will just use the command line myself, because it's easier than getting the graphical user interface to do what I want it to do consistently. But when you're talking about people in other fields, they're just as intelligent and driven as a cryptographer. It's just that they're not in that field. And I've had to use examples like, "Okay, there's a person who's a surgeon in a conflict zone. And not only are they patching up civilians who should not have been placed in the middle of this violence, but they want to document who's bringing this violence and how. What they don't have time for is to switch from Windows or Mac to Linux, or from iOS to a specialty kind of Android that is ideologically open source pure, and use the command line to encrypt a message to a journalist or something like that." That's the way those conversations have gone.
Tim: I totally feel that. Many years ago, I had a public row with Phil Zimmermann about the idea that you didn't need a GUI to encrypt email. I was like, "You definitely do. We've got megapixel screens and touch and everything, why do we have to read hex digits to each other?" The conversation didn't go particularly well.
Karen: Yeah. It's basically a form of programmer supremacy, this idea that programming and creating software is the pinnacle of human achievement. And look, I'm a nerd myself. My first computer was a Commodore 64 and as a child, I was typing in code from 3-2-1 Contact magazine to play little games and things. So I can really tell a lot of people in these conversations who weren't even born when I started to use a computer, "Look, you think you've seen a lack of graphical user interfaces. Really, get off my lawn." But we made progress. I like having touchscreens, I like having voice-to-text. I'm hoping that we go towards a future of fully automated luxury gay space communism. And that does not look like encrypting email via the command line.
Vimla: Okay. Tell us more about programmer supremacy and what that means.
Karen: It's this idea, like I said, that the STEM fields are the height of what you can do intellectually, and that people who don't programme computers as a hobby or for a living are people who just can't hack it in technology. This is particularly irksome when it comes to user research, you know, the psychology of using technology. I mean, you've had guests in the past who were talking about how machine learning becomes a tool that, I think Dr. Abeba Birhane said, works for the status quo. And these multidisciplinary examinations of how tech is formed and used are absolutely vital for society as a whole, but also for addressing some of the problems that a lot of open source programmers are very passionate about solving. They'll say, "I make this because it's an alternative to Google. I make this because it's an alternative to walled gardens, to closed source, to black boxes, to surveillance capitalism," without acknowledging that the humanities need to be included when you're creating alternatives to these systems. One example that ties in with so many themes in open source communities is this idea that you can just say, "Be excellent to each other," and that solves all your social problems. One of the reasons why I haven't used some open source social networks in the past, for example, is that they didn't have a blocking feature. And that is the absolute bare minimum of a feature that makes a network safe for people to use. In fact, I spun up-- and I won't name it-- but I spun up a social network, and the very first post that I saw was from somebody who is an apologist for gender-based violence in technology. And I noped out and I closed that down. There was no way for me to block that person, and so that was the end of my use of that social network.
Really, open source social networks without blocking and filtering are one of the best examples of this fallacy that there's nothing the humanities have that is relevant to technology.
Vimla: That's really interesting. It's that idea of "if it's safe for me, it's safe for everyone." But you don't represent everyone, and in programming in particular, the power dynamic sits with white privilege. And that's the problem, right?
Karen: Absolutely.
Tim: I think that kind of exceptionalism does have a really ugly consequence of not being prepared to listen to other people. Because they know they're the pinnacle, nobody else's views matter. And they don't even ask for them-- you see that a lot.
Vimla: I don't even think it's the views that matter, it's even more ingrained in "This is the way you do it," and if you can't use it, the blame is on you. Rather than on that cultural and contextual understanding of how and when. I remember working on a project and I came in too late, and the solution was an app to report domestic abuse without understanding the restrictions that were happening in those households of accessing phones and technology in order to have the app in the first place. So it failed because the context wasn't understood. And that's because the idea and the implementation was driven by a roomful of men with decision-making power who saw an easy tech solution to a very complicated cultural problem.
Tim: Looking forwards, what are the tools that we should be applying to try and make this better?
Karen: Well, this issue of programmer supremacy is really tied up in white supremacy and sexism and transphobia. One of the ways forward is going to require a lot of introspection, and realising that tech is not always the solution. If tech is the solution, then you need to talk to the people who are going to be using the technology. And your teams need to be a lot more diverse. From my own experience: I got into tech because I'm the daughter of immigrants, including engineers. We had the money to buy a computer back in the '80s, and they were really expensive. I had the leisure time to use a computer, because I didn't have to get a job to support my family very early. We had generational wealth, and that isn't something that a lot of people have. And then once people get a job, they may have caretaking responsibilities for older generations that weren't able to build up wealth. They have caretaking responsibilities for children, and whatever money they bring in on a tech worker salary doesn't go very far because they're supporting multiple generations. So just having an awareness that time is a currency that is unequally distributed would be one thing to start with, particularly in open source. Who gets to contribute? Who gets to build up their Git profile showing that they've made contributions to various projects? And how do we have a more open attitude towards who works on open source technology? And I think one of the places that starts is... I'm not saying that developers with privilege should start self-flagellating, but they should really start with, "Okay, how inclusive do I want my project to be?" That's where I put on my product management hat: "Who am I making this for? Who are the users? What do they want? What do they need?" Now, if I'm just making something to solve a problem for myself-- and I fall into this category, because my code is janky and I'm not going to subject anybody else to it-- that's fine.
Put it up in a repository somewhere and just label it: "This is just something for my own purposes, I don't recommend that you use it. But if you want to, the default is just fork it and make it resemble something that works for you. That's it. There will be no pull requests, there will be no improvements on this. I'm content with how broken it is." [Tim chuckles] If you're making it for developers-- if you're making a development environment or something like that, of course it's for other developers-- then you decide again: "Is this just for me and my friends? Or do I maybe want to localise it? Do I want contributions to this?" And then you're on another level where you have to start talking to people and think about being inclusive. And that's where you start with a code of conduct, with a contributor covenant or something like that. A great talk that I refer to a lot of the time is by Donnie Berkholz, and he's given it at various conferences. It's called Assholes Are Killing Your Project. He shows-- I think it's the Debian project-- this chart where you have one jerk in the project, and you have absolutely brilliant people in the project, and then you have people that are sort of in the middle; they're kind of casual, they don't need the thing, but they do want to contribute to it because it's cool. And that one jerk drives away a bunch of people-- you see it on the chart-- and then the project deals with that person. And then the contributor numbers never recover. You don't have to care about diversity, equity and inclusion-- you really don't have to care about any social problems whatsoever-- to understand that if you welcome assholes into your project, you're not going to have as many contributors. He also shows research indicating that for every negative interaction in a community, it takes several more positive interactions to make up for it.
So from a very practical perspective: have a code of conduct, have a plan for incident response, and no matter how well a person can programme, don't use "but he does good work" as a defence for toxic behaviours. Because it doesn't matter how good they are at programming if they're going to be toxic to people and drive away not only the people making your technology with you, but the people using it.
And then there's another level, and this is where you're talking about people who are making technology for social change-- like, you know, Patricia Aas was on talking about the dominance of commercial browsers and the importance of making open source browsers. You're talking about a social problem that derives from the way that we get and use technology, and you want a broader user base. So if you're making something for everyone, then you have to consult a lot more people. And that's going to include people who are not technologists themselves. Then you have these different communities interacting. And if you've got the programmer supremacists among you, telling people, "Well, you don't deserve privacy, you don't deserve this, because you haven't learned to programme," that's a massive problem. This is where I've come into conflict with a lot of developers who say, "Well, if you're not using open source then you're just lazy and you don't care about freedom," ignoring the many issues with user interfaces, with accessibility, or even with security-- those safety features that are lacking in a lot of projects. It's not that people are lazy, it's not that they couldn't learn to programme or use the command line, they just have other stuff to do. There's a difference between can't and can't be bothered, and you're going to have to make technology for the can't-be-bothered crowd. Because frankly, I'm old and I'm less patient with broken stuff. I don't want to tinker with things. I've done my time messing about with Wi-Fi drivers by hand on some Linux distro that I'm putting onto an ancient laptop. I've done that. I'm tired. You know, I also have hobbies. I can, but I don't want to. And that is valid. And then there are people who have disabilities-- sometimes temporary, sometimes permanent. So if you're not building accessibility into an alternative, then people physically or mentally cannot use the thing.
That's a discussion that's also not happening. I went to FOSDEM-- an open source gathering in Brussels-- before the pandemic, and I happened to have a Windows machine with me because I needed speech-to-text at some point. And somebody actually scanned the network, saw a Windows machine, and tweeted about it. It was really like, "A witch! A witch! We found a witch. There's somebody who is using the evil Bill Gates operating system!" So I actually engaged with them and said, "Look, I invite you to try to do speech-to-text on a Linux machine like I have, and you'll find that you can't get your work done on a Linux machine." And this is not to discount that there are brilliant projects like Talon, which allows you to code with your voice-- not only is it accessibility technology, it looks really cool. I mean, this is the sci-fi future that we're all hopefully trying to work towards, where you can programme a computer with your voice. That's amazing! But it still relies on speech engines that, at least for open source technologies, are not as refined as the commercial software whose user experience I hated-- installing it, paying for it, it's really expensive, it only runs on Windows. I detested the entire process of using it, but it works. It does what I need it to do. And that wasn't enough of an argument for the people who were objecting to the evil Windows device on their open source conference network.
Tim: So just to clarify that, even once you'd engaged with that conversation, they still weren't convinced and they didn't apologise and say, "I see your point of view, I just hadn't thought of it that way." They still dug in and said you should be running open source even if it doesn't work for you.
Karen: Yeah. Furthermore, they noted that I was tweeting from an iPhone, and then they blocked me. [laughs] Which is funny because, you know, I do tech support for people. Just surrounding me, I have an iPad, I've got a couple of Android phones, I've got an iPhone, I've got a Windows machine, I've got a Linux box. I have all the things, and I was actually using a Sailfish OS phone until it died. I was an early adopter of that-- I actually ran out and bought one in Helsinki when I was doing a talk. So it's not that I can't use the technology, it's not that I'm unwilling to use open source technology; I'd tried every software that purported to do the thing, and I used the one that did the thing. And yeah, when discussions of accessibility and open source come up, it can get downright hostile. There is a fair amount of ableism in technology communities overall-- open source, closed source, corporate, NGO, whatever. But to see it in person, and that hostile, was really something that soured me on the whole experience, and I actually left and went out to see a museum that had punch cards for lace weaving, you know, really early computers. I basically said, "No, no, I'm out. I cannot deal with this today, I'm gonna go look at the early history of computing." And it was a lovely experience. [crosstalk]
Vimla: I was gonna say that's the problem. Like, the more people who are in the community are pushed out because of that difference or lack of understanding the inclusivity element, it continues to perpetuate the problem of it getting smaller and more inward-facing and less open to conversation. So it's just a real meta point of that's it! That's exactly why these things are important and it's important to have these conversations and to understand, because the alternative is you just leave. And that is what happens, people just leave.
Tim: I think something that people don't... People don't say when they're leaving, because they're already too annoyed and they've already spent their energy. So they're gone. And so unless you do, as you were saying about Debian, and actually look at the statistics and do the analysis, it's not immediately obvious to the people who are still cranking the code that this is happening until suddenly they find themselves in an environment they didn't want to be in because everybody interesting has left. That's a really sad thing to happen but it does and people don't notice it until it's too late. Because they don't have to notice it.
Karen: Yeah. And the response to both technical issues and social issues is "works on my machine." You know, the technology does what this person and the other people that they've let into their tree house want it to do. And they don't care. I, myself, tinker with things. I take broken things and sometimes make them more broken, and I'm entertained along the way, but that's not... There's a level of annoyance that we cannot demand of people, you know? On the accessibility thing, Regine Gilbert has a great book about accessibility and technology, and there's a chapter called If It's Annoying, It's Probably Not Accessible. It conveys so many things in one title about how important accessibility is, because people will simply look the other way if something is annoying and not accessible. So, how much annoyance are you expecting somebody to put up with to use your technology, to participate in your community? And if you're not annoyed by the power dynamics in a project, you're not going to see them. And you're not going to see that the technology that you're creating is not fit for purpose for a lot of different people. And you're going to be content with that. Then your project is either going to chug along with a certain subset, a small minority of technology users, and you'll be content with that-- or it's going to die. In either case, it's not going to make the social impact that somebody maybe wanted it to make. And I find that sad, you know? Working on inclusion in open source is not about people wanting to storm in and destroy communities. It's basically saying, "Look, these are your stated goals, and you're not going to reach your stated goals unless you become more inclusive." These conversations are going to be met with a fair amount of fragility. It's going to require some introspection; it's going to involve accepting that maybe there are some unconscious biases.
And that's an uncomfortable subject, but if you can work through it, then what emerges is software that is easier to use-- maybe even fun to use-- and communities that are more pleasant to be in. If you make your community safe for people who aren't white, who aren't cisgender men... If you open up and if you have those difficult conversations, then what emerges on the other side is, you know, having discussions instead of arguments, and gathering the best ideas instead of just the ideas that are promoted by the loudest people in the room. I myself have managed teams at for-profits where I was the only person who wasn't a cisgender man. The way we ran meetings was neurodiversity-inclusive; it was based on a fair number of feminist principles. But in the way I ran meetings, I didn't say any of this. There was an anonymous review of managers, and the comments and the ratings were, "Look, in our meetings I feel heard and I believe that I'm taken seriously." The thing is, we're not talking about going into these communities and demanding that everybody act exactly like they work at Apple or Google or Microsoft. We're not talking about abandoning your own culture and your own views, it's just making a little bit more space. Especially if you don't explicitly talk about this as inclusion work, a lot of the time people say, "Yeah, actually, this is pretty nice." And then maybe you say, "Haha, you've been in an intersectional, feminist meeting. We've worked on decolonizing the project and [laughs] the computers didn't all blow up. Haha, you've been tricked." No, it works out for everybody. It doesn't have to be such a fraught, unpleasant experience. But the status quo is unpleasant, annoying, and sometimes even dangerous for people who weren't welcomed before.
Tim: One of the things that we had very early on in the podcast was a talk from GitLab about how they manage a distributed workforce. And I keep coming back to that one, because I think it is really interesting: meetings were quite inconvenient for a lot of people and quite difficult, and they tended towards, you know, loudest-voice-wins. A lot of things are wrong with meetings as a decision process. And the story out of that was that their process is much more document-based. I mean, it's partly because they're kind of version-control people and they like documents. But it's also because it allowed people to contribute at their own pace, literally in their own time zone, and to manage their own time in the way that they wanted to. Which I thought was fascinating. And I hear similar things from you: that if you construct an environment that works for everybody, and you have a broad definition of everybody, then actually it turns out that even the people who were relatively happy before get happier, because it genuinely is a better environment. Am I in the right direction there, or am I getting that wrong?
Karen: No, absolutely. And this is particularly relevant not just for different time zones-- which goes back to, "Look, if you're making technology for a global audience, you're going to have to ask the people using it. You're going to have to make space," and that involves asynchronous work-- but it's also a neurodiversity issue. Really broadly, there are people who talk to think, and there are people who need quiet and a lack of interruptions to think. There are also people for whom exchanging information isn't the goal of some discussions. For some people, the words carry most of the meaning; they're going to say what they mean, maybe in a tone of voice that's interpreted by some people as flat or even hostile. And there are some people for whom a conversation and the words themselves are just the carrier for vibes. There's nothing wrong with these different communication styles, but they do clash in meetings. They clash in meetings where there's no agenda-- which should be written and distributed beforehand-- they clash in meetings that go on too long, and they clash in meetings that are repeated because you haven't written things down. Where I've seen this: I've sometimes worked with product people-- and I'm thinking of one person in particular-- a product manager who didn't use design docs. They didn't know what they were. This wasn't an I've-examined-this-and-I-don't-want-it; they really hadn't heard of them, hadn't studied them, and had to be taught about them. And a lot of the time I've been like the ambassador from the people who just want to be left alone to code, where I will take an email, a chat, a short discussion. One of my favourite direct reports would just come up to me and say, "This is crap." And then I would go look at it. Sometimes it would be readily apparent that this thing was broken and I knew why.
And then sometimes I would have to go back and say, "Um, you're gonna have to elaborate on that. We don't have to have a face-to-face conversation, but I am going to need more detail." Then I would go off and fix the thing, a lot of the time by talking to people. So I'm not averse to conversations. But where those conversations need to end up is in some sort of written artefact, and that's inclusive because you're assuming not everybody can come to the meeting. Not everybody can withstand the communication style in the meeting, but nonetheless there will be meeting notes-- which are not going to be taken only by the people in the room who aren't dudes. That's another thing to be aware of: the glue work, the admin, can't just be done over and over again by the same people, who tend to be of the same gender and sometimes the same race. But you're going to take that meeting, you're going to take meeting notes, and you're going to turn those into design docs or tickets or some sort of written artefact so that you don't repeat the same topics over again; so that you can learn, so that you can say, "Okay, this is the assumption I'm making behind this. This is what we're going to try, and this is how it ended up working." And then you build a consensus. You know, docs are also this idea that we're going to build consensus together. We're going to agree on something. We're going to try it and we're going to test it. So it's a matter of inclusion. Don't make people talk very animatedly, don't judge people by the amount of eye contact they give you, and assume that at the end of the day, whether you're neurotypical or neurospicy, we're gonna write things down. That's good for inclusion, and that's also just good for learning from your mistakes as you're creating technology.
Vimla: It's also just good practice. [laughs] It's good for everything, that communication and asynchronous working. The amount of stuff that gets lost in translation just because it hasn't been written down doesn't get better over time, it only gets worse.
Karen: Yeah, and I have my own very strong personal opinions about the history of how I've had to adapt my communication styles. But the great thing about this is that people don't have to say, "Hi, my name is... I have autism, or I am autistic, and I don't like to make eye contact, and I have a sensory processing thing, and somebody keeps eating into their microphone, which is making my fight-or-flight response go off the rails." You don't have to overshare information, you don't have to ask for accommodations. You can mute and then you can contribute to the written conversation, and it's fine. So that's one of those things where, you know, inclusion doesn't have to be oversharing either. It's also just saying, "Look, we're gonna do it in a way that fits the communication style of a lot of people," and, as you say, it's just best practice. I've had to argue with tech founders about the utility of the written word before, and that goes back to the annoying part. It's like, "Really? Really? For tens of thousands of years people have been writing things down and then reading them afterwards, and you're gonna say that documenting a meeting is going to be out of date as soon as we type it up? Come on now! I'm out of here." You don't even have to say, "Look, I'm not neurotypical like the rest of the company, I'm not enjoying the endless meetings." Just from a best-practices standpoint: we're in tech. We write things down.
Vimla: Yeah.
Tim: Can you think of an example-- I mean, I'm not going to ask you to name an example-- but can you think of an open source community where this is working? Give us some hope that there are places where these things are working well.
Karen: Well, there's a couple of things. One of the biggest open source projects in the world is OpenStack-- or Linux itself, where things are improving. It's not necessarily the surface layer of the internet, but the machines running at the infrastructure level. That's a lot of open source, and it's a really good example of seeing a specific need and seeing that it's useful for lots of different types of people. I mean, OpenStack started with Rackspace and NASA, you know? In the days when computers were larger and heavier and you couldn't just move data from one computer to the next really easily-- you're studying rocket science and you need to do a lot of calculations, and for each project you would have to obtain physical hardware. And then once you were done with the project, these boxes were just sitting around. So the whole idea of cloud computing was, "We have complex topics to tackle, and we want to be able to move this hardware around for different projects." That was the genesis of the cloud. And it's grown into a community where individual contributors and corporations are contributing code. In terms of scale, that's really successful. And then there are individual projects that have gone through some horrendous thing socially and then seen the value of codes of conduct. In the so-called Internet freedom, privacy-enhancing technology and anti-censorship technology space, there's been a lot of progress: it went from "Be excellent to each other," to some oddly specific codes of conduct that might even require content warnings to talk about in the prohibited conduct section, to a lot more inclusive communities. It used to be that the conferences around this were very small; you'd have the same group of people flying out to a different country, and then you'd have maybe a handful of people from that country who were there.
And I would look around at them and think, "A good third of us live in Washington DC, why did we get on a 14-hour flight just to see the same faces?" Then Sandra Ordonez, who was working at one of these organisations, took this small, exclusive treehouse model of conferences and turned it into the Internet Freedom Festival, which embraced the idea of, "Look, you have people using these technologies. They have domain-specific expertise, they have experiences that are valuable for these developers to hear. So why don't we get people together?" And not only did they have a code of conduct, they actually explicitly disinvited some people who were the subject of multiple credible accusations of gender-based violence. They basically said, "No, not here. We are not doing this." That's grown into Team CommUNITY, which-- if you want to build a community around open source technologies, particularly to protect rights and freedoms-- you can actually hire to help you with a lot of things. They put out a community mental health report that has a lot of valuable insights if you're leading a project. Especially 2015 and 2016 were really difficult times to be in open source, and particularly in this part of it. But the community has emerged and it's doing a lot better. Now, I won't say it's perfect. There are some places in Berlin in the open source community where I actually would warn people. You know, you want to see who's around you. There are some faces that you need to recognise before you determine whether you're safe in a space. So there's still a lot of work to be done. But comparing the early years to the awareness that we have now, it's getting a lot better. Basically, if you're wanting to contribute to a project, see if they have a Contributor Covenant. See if the leaders of the project refuse to be on panels at conferences that are all white and all male. Look for those signals. 
And now in the middle of a pandemic that's not over, also look for the public health pledge, as well, for people who are saying, "Look, I'm not going to a conference and I'm not going to be on a panel unless it's safe for immunocompromised people." So just seeing that these discussions are happening is really encouraging. It's a good time to be in open source compared to just a few years ago.
Tim: I wanted to highlight one of the things you said because I think it's a valuable lesson, which is about how you would warn people. If you're on the inside of a community that's kind of going bad but it doesn't really affect you yet, you can hear those warnings. And my experience is that typically people will only give you that warning once. They will tell you there's a problem here, and then they're gone. If you don't take note of that warning, it's too late. You really have to listen very carefully to those warnings when they are given and take them seriously. They're not often said very strongly, because the person has typically already given up on the community, but they feel it's fair to give a warning as they leave. And if you don't listen to that, you've missed a really good opportunity to fix your community. That's my kind of peripheral experience, and I think that was implicit in what you were saying there.
Karen: Yeah, and those warnings may seem very short, given only once, but the price that people have paid for sharing that information has been very high. The people that disparage these efforts call them drama or rumours, and oftentimes they'll be talking, frankly, like police departments that fail to protect people against gendered violence. They'll be all about technologies to combat surveillance until they've heard that in their community there's somebody who has been the subject of multiple credible accusations. And these people who oppose any type of surveillance-- you know, they might even reject the use of a smartphone, they've still got a flip phone because they don't want the police to be able to surveil them-- suddenly, they're the strongest advocates for police that you have heard in a long time. "Why didn't this person go to the police? Why didn't they press charges? Why aren't they trying to put this person in prison?" And when you get that sort of warning, it's against this backdrop of communities that will say that they're pro-whistleblower, but they'll push out whistleblowers. They'll say they're against state violence, but they'll ask people to bring state violence into their experiences. They say that they are for human rights, but it stops at an accusation of harassment or gendered violence. So when you're getting that warning, that's after a lot of thought. It's also not a model that we should embrace, because if you're not a part of that whisper network-- if somebody doesn't know you and doesn't know that they can trust you not to go to the person they've been warned about and say, "Do you see what so-and-so is saying about you?"-- if you are new to the community, then you're vulnerable. So I think going forward, that's why codes of conduct aren't just about responding to an incident, though they're very useful for that. They're also a signal. 
And that's why it has to go beyond 'be excellent to each other, don't harass people.' It has to be very specific: 'These are the behaviours that are absolutely not acceptable.' It says, "Look, we're aware of your tricks, we're saying no to them, and we will do something about them." The hope is that it signals to victims that they will receive support. That's why it's important-- you can't just slap a code of conduct on a conference, you have to back it up with an incident response plan, and be willing to kick somebody out of that conference. But to me, the ultimate goal of a code of conduct is that if somebody wants to come to a conference and do harm, and you signal strongly enough, they're not going to come. You're saying, "That's not going to be tolerated here." A lot of abusive people rely on myths about gendered violence, they rely on bias against their targets, and they rely on bystanders who haven't thought through what they would do if they saw this happen in front of them. And that's the thing about the whisper network as well: if somebody is coming to you saying that this person is a run-of-the-mill jerk or that they're abusive, a lot of the time the people who will swear up and down that this person has not done anything have seen it. They saw it with their own eyes, they heard it with their own ears, and they're just going to deny it. So that's another signal. You have a code of conduct, you have an incident response plan, it's very specific. That's another signal: "Look, we're not going to have bystanders dominating this event. You're not going to get away with it. If you try something, we're not just going to say we didn't see it." To me, that's the bare minimum that an open source community should embrace. Because the alternative requires multiple content warnings to discuss. And it means you're going to have a monoculture, and that means the tech event isn't going to be as good as it could be.
Vimla: I think it's also recognising that safety means different things to different people, and the only way you understand what safety means is to talk to people about it. It's what you said at the very beginning: involving communities in those conversations early on to understand their needs, and having those needs come from the community rather than be determined by the people building the programme, hosting the event, or putting the technology out into the world.
Karen: Yeah. The other part of that is that trust is earned. You know, I'm a pasty European, cis-looking woman. Whatever difficulties I face in life are not because of my race. And frankly, white women are complicit in a lot of horribly racist things. So I can't go to somebody and demand their trust. That's absolutely not the way it works. And it's also inappropriate for me to centre myself in discussions of what it's like to be a person of colour, to be Black in tech, or to be trans in tech. Not my lane. So the other thing is to seek out communities that don't just say, "Oh, we welcome people who are disabled," and do the whole diversity and inclusion checklist. It's about time for the leadership to be more diverse. Inclusion without power is going to go awry. Another person who talks about this is Dr. Kecia Thomas, who coined the term 'pet to threat.' She's an industrial and organisational psychologist who talks about the experiences of people of colour, and specifically Black women, in organisations where you've been included, you've been hired, you've got the skills and the background and the education. And people will say, "Yeah! Good on us, we hired somebody from this background." And then the more excellence you bring to your work, the more you're seen as a threat. It's a pattern that repeats over and over and over again. And so me, as a pasty woman, I can't go to somebody and say, "Oh yeah, I've read this thing! And I'm aware of it and we're totally not going to do that." To me, the better thing is to pass the mic, get off the stage, and not be so focused on being in a leadership position. You know, give up this idea that you always have to be in charge. Give up the idea that you're the cleverest person in the room. It's fine not to be in power. It really is. 
And if you're worried about other people having power over you, what does that say about what you do with power, when the people on your team and in your community don't look like you? That's a tell! That bears some thinking about. So basically, you know, we can turn over the whole system, we can do things a lot differently, and it's gonna be okay. I would say that's the main thing. [crosstalk] The people who are being discriminated against...
Vimla: Sorry, you've really hit the nail on the head there and I just really want to hone in on that point of, if you're scared about your power being taken away, what are you doing with your power? Because it shows that you're doing something that's wrong if you're that scared of it being given to someone else. And I've said that time and time again to people like, "What is it that you're scared of in this power dynamic shifting?" Because if you were doing it with a good heart and good intentions and to be inclusive, you wouldn't care.
Karen: Yeah. Take my example of being the most boring kind of third culture kid imaginable, particularly in Germany. In a lot of discussions about putting in a code of conduct, about dealing with abusive people, specifically in the German hacker scene, a lot of the response was, "Oh, well, these American feminists are coming in! And they're trying to destroy our German culture." And I'll get that response in a conversation that's happening in English, and then I switch to German. [Tim laughs] And then I'll say, "Oh, that's funny, because my mother never taught me about that"-- my mother, because this conversation is now happening in German, because I have a German parent and I was raised partially in Germany-- "that never came up in conversation about German culture. I wasn't taught that gender-based violence is a key component of German engineering, but I'd be interested to hear where that comes from, with some citations." And for some reason, they never have a good answer to that. Not that the answer would be good, but [laughs] you know? It's like the call is coming from inside the house-- what are you gonna do now? It's this idea that if you had a switch that turned you into a member of the dominant group immediately, that's what it would look like, because I basically have that. It's depressing and amusing at the same time, where it's like, "All these outsiders are coming to destroy us!" And I'm like, "No, no, I'm half you. And I'm telling you that it's gonna be fine. It'll be okay." So for the people who are in a dominant position: if they would just calm down a little bit and listen to the people who are being oppressed, abused, and made to feel unwelcome-- just sit there with their anger, which you've earned by either being an active part of the problem or ignoring it-- just take that in and acknowledge it. Because then we can reach the point where we can get past the fragility, all of these, "Oh, what are you saying? 
That I, myself, am evil, that I'm a racist?" And if we can get to those in-depth discussions about social issues within the open source community and basically say, "Look, let's do some Sociology 101, let's talk about systemic bias and individual bias. Let's talk about what a microaggression actually is." If we can work through those and get to communication methods that aren't violent, if we can put some checks and balances on ourselves and say, "Okay, what are my unconscious biases? How do they factor into the way we're creating communities, building them, or allowing them to be built, and into what we build in the end?" It's a way of doing things that is just practically and morally better in the end. We're not coming for your computers, we're not saying that you need to pledge allegiance to Apple, you know? We're just saying, "Look, things could be better." We could really make alternatives to commercial surveillance capitalism. We could make technologies that work better and cost less. Everything could be so much better! It's just that these social issues are not irrelevant. We can't take a pure tech approach to everything. There is no such thing as a pure tech approach. [laughs]
Vimla: We're really glad we were able to give you this platform because it's been an incredible conversation and I just wanted to clap for you the whole way through. So thank you for that, and I hope we get to speak again soon.
Tim: We do a final question, which is what would you hope-- looking maybe five years out into the future-- what would you hope that it would look like?
Karen: So in five years' time in open source, I have great hopes for open hardware. Because we've got a lot of tools now that are more widespread, like 3D printing, it's already easier to make your own physical objects, and it would be amazing to have something like a 3D printing repository for a lot more things. And also to adopt the communication methods that work best, no matter where they come from. So, to have an open source project that is free, libre, open source, and in communication with the end users, wherever they are, whatever their goals are. So that when somebody asserts that there's no reason to use the commercial version of something, that's actually true from a usability standpoint and from an accessibility standpoint. And on the accessibility front, disabled people are early adopters of technology. So by making technology more inclusive, you're basically letting the cyborgs take part. The future of accessibility technology is something I'm really excited about, because not only is it a must for a lot of people, but a lot of accessibility tech is what the sci-fi, non-dystopian future looks like. So I'm really excited about all these things-- the social sciences, physical objects, 3D printing and fabrication, and accessibility-- coming together, and making things that are a pleasure to use, that really promote freedom, and that can evolve quickly as needs evolve. That's where I think we can really get to with open source technology.
Tim: Oh, I'm looking forward to living in that future. It sounds good, actually. Great. I want to thank you so much for doing this, we really do appreciate it.