Tim Panton: [00:02] So, this is the Distributed Future podcast and I'm Tim Panton.

Vimla Appadoo: [00:09] And I'm Vimla Appadoo.

Tim Panton: [00:10] And the podcast is an attempt to see a little way into the future by talking to people who are working in areas that we might not know enough about and who might be able to help us understand what the future might look like. Jillian, perhaps you could introduce yourself.

Jillian York: [00:25] Sure. Hi, my name is Jillian York, and for the past nine years I've been the Director for International Freedom of Expression at the Electronic Frontier Foundation. I'm working these days mostly on issues around platform accountability and transparency. And I'm also--I've got a book coming out in March called 'Silicon Values.'

Tim Panton: [00:43] And do those two things overlap? Is the book about your work or tangential, or how does that fit together?

Jillian York: [00:51] Yeah, it's a huge overlap actually. So the book is kind of more of a documented history than a polemic, although it certainly does make some strong arguments. But I'm looking at the past, let's say, 11 or 12 years of the ways that major social media platforms have dealt with their policies and with content moderation. And it looks really specifically at a few stories and themes. So around the Arab uprisings almost 10 years ago and what happened there, the ways that companies have shifted their approach to certain issues over time like hate speech and violence and terrorism, and there's also a strong focus on sex and sexuality.

Tim Panton: [01:30] Do you see that as a progression? Like, do you see that change carrying on into the future? Or do you think it's unpredictable?

Jillian York: [01:42] I think it's pretty unpredictable, but, you know, I think that the most interesting things that I'm seeing right now-- and I would say that we're kind of at an inflection point at the moment around a lot of these issues-- a lot of them are really peaking. So, for example, the ways that platforms and governments are handling counter-terrorism has reached kind of an apex at the moment. And what I'm seeing in the future is kind of a split off between how these different platforms deal with these issues. We see them all as kind of a monolith, but over the past year during the pandemic, what I've seen looking closely is a real split off between Facebook and Twitter and how they deal with certain things. I'm seeing Facebook kind of grow so big that it's almost like the Soviet Union before the collapse, where nobody knows what any other department is doing.

Tim Panton: [02:26] You mean internally it's so big--not that it's got so many users, but internally it's collapsing. I mean, interesting.

Jillian York: [02:34] Yeah. I am not sure if it's a collapse or a split, but yeah. Sorry, go ahead.

Vimla Appadoo: [02:40] I was actually going to say I don't even think it's about the users, it's the fingers it has in all the pies. It's owning Instagram and WhatsApp and Facebook together, and the reliance that people have on it without necessarily putting all of that together is what scares me the most. And I say that as a user of all of those platforms, because I'm fully aware of how complicit I am in this massive system of surveillance, but do nothing about it.

Jillian York: [03:07] Yeah. I understand. I mean, I'm also complicit in it and I find it really difficult because I think that often the arguments to boycott these platforms don't make sense to me when I think about the users elsewhere in the world who have come to rely on them for even basic things for a number of really valid reasons. I mean, you've got businesses in developing countries that rely on Facebook as their website because it's too expensive or not accessible to build a website.

Tim Panton: [03:32] Yeah. Then there's the whole thing about Facebook cross subsidizing data in those countries as well. So you don't necessarily--like it might be massively cheaper to do it through Facebook than any other way.

Jillian York: [03:47] Exactly. And Facebook knows that and they've promoted it that way. They've given free access to people in certain countries. They had this program, which I think is still ongoing, where they call it Free Basics--and now I think it's got another name--where they gave people free access to Facebook even when their data plans had run out. So this was operating in Kenya, India, Tunisia... And some governments have wised up to it and the problems with it and shut it down, but in other places it's just too valuable for people to shut down that way.

Tim Panton: [04:18] Right. That's kind of scary though, isn't it?

Jillian York: [04:22] Yeah. Oh, absolutely.

Tim Panton: [04:25] So I guess maybe we should check--like track a little bit back and talk a tiny bit about the EFF because not everybody will necessarily know what it is or what it does.

Jillian York: [04:36] Sure. Yeah. So EFF is only eight years older than--or eight years younger, rather--than I am, which is kind of amazing. We were founded in--now I'm going to share my age--but we were founded in 1990, and a lot of the first things that we did were legal cases around copyright and fair use and free speech online. And we've grown from being like a handful of lawyers to a 90-something-person organisation that does a variety of different things. So we still have that strong legal element to our work in the U.S., but we also have folks working on things internationally, from the Digital Services Act in Europe to, like, Turkey's latest bill to limit speech on social media platforms. But we also have this whole other side of what we do, which is building technologies that make people safer and more secure and educating people about safety and security online. And in the past couple of years we've even built out further--for example, we have this program called the Electronic Frontier Alliance that works with small organisations, and sometimes student organisations, to build an alliance across the U.S. that can fight on legislative issues. So we're really a very complex and growing organisation. And you know, I've been there for nine years, so I've seen a lot of these changes happen over time. I think it's a really incredible model and there aren't too many other organisations like us out there.

Tim Panton: [05:56] Yeah. We have like--there's a thing we kind of keep tripping over on this podcast, of finding really vitally important work being done by like guilds almost. Like, you know, not-for-profit associations or these sorts of organisations that just do spectacularly interesting work that people-- nobody else would do it, you know, and it needs to be done.

Jillian York: [06:23] Yeah, it's true. And I think in the specific work that I do--I mean, both these platform issues, and I also do some work on government surveillance and censorship globally--there's a lot of academia working on it, most certainly, and especially in the past five to eight years. But I've been in this space at least since 2008, I think, 2007 maybe. And in the UK there's groups like Privacy International and Open Rights Group. In the U.S. we've got, of course, you know, these huge groups like Human Rights Watch and the ACLU, but then also this kind of growing group of digital rights organisations that are often working in coalition or collaboration with each other. It's been really incredible to see that build up over the years. I've gotten to watch most of it happen and it's really a community now. We don't always agree on everything, but there's a lot of strong agreement on some really key issues.

Tim Panton: [07:19] Sounds like there's room for another book there as well, actually.

Vimla Appadoo: [07:22] Yeah.

Jillian York: [07:24] I think somebody is doing it as their dissertation. So I've got to read the field first, but yeah, I think there's definitely another book there. [laughs]

Vimla Appadoo: [07:31] What do you think would happen if the community didn't exist or if the people didn't step in to kind of do this work?

Jillian York: [07:38] Oh, wow. That's such a great question. I think that we would have a very different internet if groups hadn't stepped in--we're talking 30 years ago now? 30 years ago, yeah. I think that we would have a much more monopolised internet than we do now. I think that we would have lost on a number of fronts. I mean, you know, copyright is still a very contentious issue, and I don't think we have all the answers or all the problems solved, but if people hadn't stepped in, I think it would be even more like megalithic corporations--megalithic, sorry, monolithic corporations--owning the rights to everything. It's already bad enough as it is now. And then I think when it comes to things like privacy, which is not my specialty--but I was there, I was at EFF for the Snowden revelations and things like that--I think, you know, if EFF hadn't been there, we wouldn't have gotten some of those first tips on what the NSA was doing, because one of the earlier whistleblowers walked into our office and gave us stuff. So I think really, we can't overstate the importance of nonprofits and academia in this fight.

Tim Panton: [08:49] I was thinking about what you were saying about the point of inflection--that we're somewhere near a point of inflection. And again, that's a kind of recurring theme at the moment, which is a little disconcerting actually. But I was thinking specifically about this week--and this will date the podcast terribly--but this week there's just been a storm of things happening that make you wonder what's going to happen to the internet. I mean, particularly these court cases against Facebook. Quite what's going to happen there, I don't know. And then there's the thing with Apple and the App Store really documenting what each individual app does with your data. I think of those changes this week, and they're massive.

Jillian York: [09:43] Yeah, and you've got the PornHub story and you've got--I mean, there's just so much happening right now. And you know, there's part of me that feels, and I know this will sound a little conspiratorial, but I don't think it is, given some of what we know, there's some part of me that feels like the pandemic has created some cover for governments to get away with a lot, you know? We've seen this with like some of the executions in the U.S. right now, and some of the extraditions that happened earlier this year. I think that without in-person organising, it's become really scary. And I think that that's true with the tech companies as well because they have a lot more resources than we do. And so when they're working on these things they're able to connect a lot better, work a lot faster, move fast, and break things, so to speak, while all of the nonprofits are kind of struggling to keep up. And the same is true, of course, for stuff going on in the U.S. with police and the ability to organise around that. And so I think that the pandemic has demonstrated that we're not really there with technology, and yet at the same time, the technology companies keep pushing forward. And I find that really scary.

Tim Panton: [10:50] So you mentioned PornHub and I kind of wanted to--there's this old adage that, you know, where porn is today technically, everybody else will be in five years time. Do you think that's true?

Jillian York: [11:02] Yes. Kind of. So part of my book looks at this and specifically looks at how the combination of smartphones, high-speed internet and, I forgot what the third component is without it in front of me. I finished writing in June and then I put a lot of it out of my mind, I guess.

[Tim laughs]

Jillian York: But this kind of thing that happened around 2005, 2006, enabled so much. And I mean, it obviously didn't just enable porn, but it did enable like these hubs, these porn hubs, which don't really connect closely to the rest of the tech industry to really thrive and often to thrive on unpaid labor, sexual abuse, child sexual abuse imagery. And so I think what's happening right now, and I admittedly am on vacation so I haven't been following the story too closely, but I read a lot of what sex workers have to say about this. And it's a lot more complex than it's being presented as. Yes, of course these companies have been exploiting people and particularly exploiting children in a lot of ways. But at the same time, the way that governments are using this as an excuse to crack down is going to have a huge impact on sex workers' rights. I think that that's a complex issue to talk about too, because a lot of the work that people do is not lawful, and yet at the same time a lot of sex work is completely legal and a lot of times sex workers who are engaged in either lawful or unlawful practices are still being targeted for their regular existence on social media or on using payment platforms. So this is one of the areas where I think we need so much more complexity of thought instead of this very black and white thinking that happens around sex and sex work.

Vimla Appadoo: [12:42] Yeah. I think that also rings true for some of the stuff around policing and, just generally, the ethical standards that we put around anything that has human life implications: we try to make it as black and white as we can, without thinking about the nuance or the empathy for the person who is living that life.

Jillian York: [13:05] Yeah. I mean, that's what I found so fascinating this year with the movement to defund police, because up until this year, the way that I saw the conversation from the media, I mean, I knew organisers, of course, but the way I saw the conversation, especially as a person living outside of the U.S. now, was around how to make policing better. And now it's come to this fruition of, you know what? Do we really need cops? Do we really need cops doing things like traffic, you know, routine traffic stops? Is this really what we want from society? And I've been so happy to see it progress to that. You know, I mean, I'm not sure what the answers are when it comes to things like violence, but when it comes to these basic things, I'm completely convinced. And it only took a short period of time to convince me.

Tim Panton: [13:46] It's funny, sort of wherever you grow up, you kind of assume that that's how the police work everywhere, and that's so not true. I remember the first time I went to France kind of as a late teenager, realising that they had like multiple different police forces with different roles. And you're like, how does that work? You know? And so this sort of concept of the police is very much something you grow up with, and getting your head around the fact that other people or other countries may do it differently or see it differently is really hard, actually. Kind of, you know, it's a big effort, I think. And a lot of people who don't necessarily hang out with the police a great deal kind of find that quite tricky.

Vimla Appadoo: [14:35] Yeah. I was going to say not only that police forces around the world are different, but that the lived experiences of people and their relationship with the police is so different. Yeah, it's just really, really interesting to see how different people view the police.

Jillian York: [14:58] Yeah, no, I mean, I completely agree. I grew up in kind of a homogenous, mostly white town in a somewhat rural part of the U.S. And although I didn't have great experiences with the police, I obviously wasn't able to witness any racially based police discrimination. And we had police who would ride around downtown on horseback. I'm not kidding. I mean, it was obviously kind of a throwback, but still, it was something where the police were your friend. They were there to help you. And then moving to Germany--you know, it's complicated here for sure, but the police have so much more training and so much more sensitivity training than most urban police do in the U.S. And so, even though we have plenty of problems with cops here--I mean, when the Black Lives Matter protests were happening this summer, and we had huge turnout in Berlin, I was doing a lot of research and trying to find out how many incidents there had been of police wrongfully killing people here, and particularly people of colour. And I believe that the number they came up with was like 20 over the past five to 10 years, which is still too many--one is always too many--but when you compare that to the U.S., even at scale, it's just still such a different phenomenon.

Tim Panton: [16:13] And they are, I mean, in large part, those are armed police in Berlin, or in Germany. So it's not like--I mean, the British argument's always been, well, you know, we don't arm the police and therefore that's why they don't kill people as often. That's a huge oversimplification, but it's a factor. But the German police are routinely armed anyway, so it's not that; it is the training.

Jillian York: [16:38] It is, yeah, and they don't bother people here about minor things. I mean, I've seen cops walking around in parks that are thick with marijuana smoke and the police don't do anything about that. And you know, from my perspective, that's how it should be. I think, you know, legalize anyway, but even in an illegal context, I think it's really important that the cops, if they exist, if they are to exist, that their focus is on preventing harm, not on going after people for small violations. When I look at the U.S. and just see the number of people of colour in prison for the most minor infractions, it's heartbreaking and it's wrong. And I think that that's why we, you know, why this movement has gained such strength this past year, is just the amount of people who've written incredibly powerful, persuasive things, and have convinced a lot of us differently.

Tim Panton: [17:27] How much of that do you think was read on social media? I mean, is social media helping as well as hindering?

Jillian York: [17:37] Yeah. I still think it is. I mean, I don't want to--I'm not the expert when it comes to social media use in the U.S. I think I'm really just an observer there. And there are a lot of smarter people than me, but at the same time, just as an observer, I think that the movement for black lives has gained so much power through social media and has connected so many people across, not just across boundaries in the U.S. but across borders. And so, yes, there's negativity to it, obviously, as well. There's harassment, there's misinformation, there's potential infiltration of movements, but I wouldn't know nearly the amount that I do if I relied solely on the traditional media for this. And the same goes for the Me Too movement, the same goes for democracy movements all over the world, and social justice movements. And so I think that we, and my book covers this as well, but I think we have to see it as a double-edged sword or whatever the positive version of that is.

Vimla Appadoo: [18:30] What do you think the relationship is between social media leading the news coverage and then mainstream media picking it up, or vice versa? Who do you think holds the sway at the moment?

Jillian York: [18:42] Yeah. I mean, I think the mainstream media is complicated and complex. You know, the news over the past day and a half has been around the Caliphate podcast with the New York Times and just how much of a failure that was. And I do think that journalists, because of the way that journalism has changed in the past decade, are often pushed to work really quickly, and that's why we're seeing so many of these mistakes happen. I don't want to blame journalists. You know, I think they're doing the best they can most of the time. I mean, there's some that--you know, the New York Post can go, whatever. But I think that the thing with social media is that, yes, on the one hand, it's free labor--people are using it for telling stories and sharing information, and I hate that all of that goes unpaid. And yet at the same time, there's a lot of things that still wouldn't be printed, because they require more fact checking or they require more sources or whatever. And so I think, you know, a lot of the work that's been done around misinformation and around manipulation of, like, videos and things like that--I mean, even Bellingcat, a lot of that started on social media by people doing it for free.

Tim Panton: [19:53] Yeah. Bellingcat's an interesting kind of, not journalism, not really social media. I don't know what category to put that in.

Jillian York: [20:05] Yeah. It's complex. I haven't followed them closely, but in the beginning, when a lot of their work was around what was happening in Syria, there was so much misinformation and so much media BS around reporting there--I'm thinking like 2012, 2013--I think it was incredibly vital to see people actually trying to document things from the ground, or by working with people on the ground and by detecting where videos come from and all of that, you know. And I think their work has gotten a lot more complex and complicated, to use two of my favourite words, but that model, I think, is really important.

Tim Panton: [20:43] Do you think it's a future model for the rest of journalism, or does it really only work in kind of war zones?

Jillian York: [20:51] You know, I'm not sure. I mean, I think one of the things that's, to bring it back to like my focuses for a minute, I think one of the things that's been so difficult to report on when it comes to social media platforms is that it's so hard to get access to the deepest bits inside these companies. You don't get access to engineering teams or Mark Zuckerberg himself most of the time, you're getting, you know, the same policy and PR staff whose job it is to talk to the media. And it makes me wonder, you know, as we're seeing more and more of these leaks, how much of them are going to require verification, whether these skillsets can be applied to that kind of work. And, you know, I mean, it makes me think a little bit of spycraft and you know, what you have to do there in that kind of work to detect things.

Tim Panton: [21:36] How different are social media companies structurally from, like, big oil, or any other big organisation with secrets they don't necessarily want to tell? Do you think they're structurally different?

Jillian York: [21:51] I don't know enough about big oil to really say, but what I can say is that I do think there are teams that have secrets that no one else knows about. And the bigger the company gets, the worse it gets. I think Google and Facebook have reached this, like, beyond point. I mean--okay, let me give one really clear example. When it comes to content moderation, Facebook outsources a great deal of it to third-party companies in other parts of the world. And so they've got their public-facing rules that users are supposed to abide by, and those are pretty complex as it is. Then on top of that, you've got an internal guidebook for how to apply those rules. And then on top of that, they've got these training manuals that they put out every few weeks to train the content moderation staff, who have pretty high turnover already, as far as we're aware. And so you've got three different sets of rules that are constantly shifting and changing, and there's no way that everyone in the company can know all of that information. On top of that, we've seen mistakes--literal, just factual errors--in the guidebooks or training manuals that are created every couple of weeks. And so I think that just demonstrates how problematic it is when these companies get really big. I would also imagine that, you know, when you think about all the different offices around the world--we've seen some really concerning issues of bias coming from some of the regional offices. India is a great example of this, with the case that happened recently--I'm blanking on the woman's name, but one of the regional managers at Facebook who had some sort of Hindu nationalist biases. And so, yeah, I do think that it's gotten to the point where these companies can't keep track of their own staff and what they're doing at times, and that's very troubling.

Vimla Appadoo: [23:38] Yeah. And this question is going to be a bit bolder, where do you think the line is between kind of content moderation, surveillance, and freedom of speech? What does that look like?

Jillian York: [23:54] Yeah, I mean, I think--so when the Snowden revelations came out, there were some great arguments--they weren't new, to be fair, I think they were coming from, you know, opposition movements throughout history--but the thing that always sticks in my mind is how we can't have progress when everyone's being surveilled, because a lot of really important organising around a variety of issues throughout history has had to happen in private because things were illegal. So even just to give a very basic example of this, LGBTQ or queer organising had happened behind closed doors until the past 20 years, for the most part. And if people had been more heavily surveilled, we may not have been able to see that progress. And so that is the clear argument for me against surveillance. But freedom of speech also goes hand in hand with that: without the ability to speak out against those injustices, we also can't move forward. And so I think it's really important to link the two. I also think that there's a deeper component to this that we don't often talk about because we're moving so quickly, and that's the role that shame plays in a lot of things. And shame is often enforced externally, but it's also inside of us. And so, you know, I think that part of the reason that we want privacy is because of shame, but another part of it is because of external risk. And so there's a lot of great complex thinkers on this. I think I'm only scratching the surface of it, but I do think that we have to think about these things in coordination with each other.

Tim Panton: [25:22] Is shame still a thing? Like I'm kind of wondering sometimes.

Jillian York: [25:30] Yeah. It's a good question. I mean, maybe shame isn't even the right word anymore. You're right. I mean, it does feel like we're reaching a point where shame no longer exists, at least in certain corners of the world, or certain corners of the internet. And that definitely presents interesting issues because right now, like in the U.S. at least, the problem is largely not freedom of speech anymore, although it is for certain groups. I think that's really important to point out that there are groups that are still completely marginalized, and I think right now, two of the biggest ones are sex workers and Muslims. But I mean, it's true that like the movement for Black Lives has not been heavily censored, but rather the issues are that the police are still the ones that hold the power. And so, that doesn't connect directly to shame, sorry, I kind of missed the boat on that one, but I think, you know, there's not shame there, but there's still this external threat. And so, yeah, it depends on which movement or which group we're talking about.

Vimla Appadoo: [26:28] But I think shame exists in the sense of the repercussions of not agreeing with the status quo, whichever side of the fence you sit on. There's, like, self-censorship that might happen if you disagree or just hold different opinions. And I think that's the interesting bit: it doesn't feel like there is a safe platform for that conversation to happen anymore.

Jillian York: [26:53] I agree, and I think it's tricky because I have a lot of--so when it comes to things like identity politics and cancel culture, I have a lot of empathy for why those things exist, and I agree with a lot of the stuff coming from that side. And yet I think that the reason it's become such black and white, or binary, thinking is because of the amount of external potential for violence, potential for harm. And so a lot of times people on that side of it are just kind of fighting for basic dignity and respect. So it makes sense that people are going to come out swinging and say, okay, you know, we have to use this language, and force that on everyone else. I don't love it, but I get the conditions that created it. Whereas the other side--this kind of, this latest, like, persuasion community and those kinds of thinkers--I find them to be disingenuous. That sort of, "Oh, we have to listen to all sides. We've got to listen to everybody." Really? Do we? Because I don't think that listening to Nazis has ever gotten us anywhere. [Vimla laughs] We're not going to sit around and sing Kumbaya with them. And so, yes, while I wish that there was a little bit more deepening in complexity of thought on the side that I'm most ostensibly on, I think we just can't reach that point until the external threats are gone.

Vimla Appadoo: [28:06] Yeah.

Tim Panton: [28:09] We had an interesting conversation with Jacob Lefton about how to defuse conflict in social media, and the thing that came out of that for me was that actually nobody enjoys it.

Jillian York: [laughing]

Tim Panton: If you get a very conflicted social media exchange, nobody comes out of it having had an enjoyable experience. And that was--I mean, it's obviously true, but until you'd heard it said, it was a bit of a revelation to me. I thought it was really interesting.

Jillian York: Yeah, I know. It's a great point. [laughing] I haven't thought about it before.

Tim Panton: So, a little more around the book. What might we learn from it, apart from history maybe?

Jillian York: Well, I hope that it gives people a greater understanding of the ways that companies are thinking and operating in the world, and whether or not we really want corporations to be the ones that define what's acceptable for us to do and say. I've always had a lot of trouble with the idea that we're allowing corporations, or even pushing them sometimes, to censor speech. Again, it's one of those things where I understand why people don't want hate speech on platforms, and yet I also see that as another very reactive, short-term kind of thinking that doesn't consider the impact on people in other parts of the world. When I see civil rights groups in the US calling for hate speech to be censored by Facebook, I totally get it, and I understand the short-term relief that that offers people. But I don't know that there's always a consideration from their side of the long-term impact that might have when dictatorships start following the same sets of rules and pushing Facebook to take down the people that they consider to be opposition or dissidents. So, that's one thing. The other thing is that I hope it gives people a greater consideration of the fact that everyone on every side of the political spectrum believes in censorship in some way or another, and that censorship itself is not inherently a-- What's the word? We don't have to have a value judgment behind the term. To 'censor' is to remove information, and it doesn't have to be a state that does it; sometimes it's a company that does it. And trade-offs come behind all of these choices.

Tim Panton: Yeah, I gave a talk many years ago, weirdly, at Bletchley Park, about the importance of secrets and how all of the things like patent law and financial trading law and whatever are knee-deep in the assumption that professionals can keep secrets. Like, none of this stuff actually works if your lawyer can be compelled to reveal what you said to them--the legal system doesn't work anymore. I'm not a total fan of complete openness; I don't think we've built a society which expects that. And I think part of what we're hearing is that we don't really know how to deal with some of the flood of information that's coming out as a result of more open communication.

Jillian York: Yes. [giggles] I think that's true. I'm a very big believer in transparency, especially when it comes from governments and these companies. But at the same time, obviously, there is a need for secrets in the world. And I often think that governments are protecting the wrong kinds of secrets, and so are these companies. But again, I think there's just such a strong argument for complex thinking and broader thinking around what these things mean for us.

Tim Panton: In theory, you might think that Facebook, being a kind of broader medium than Twitter's haiku-like format, would be more nuanced, but that doesn't seem to be the case. Do you have any thoughts about that?

Jillian York: Yes. I mean, Facebook-- I'm not a big user of it anymore, although I do keep a profile so that I can check in on the features and how things are working. I think that it's become so complex that they don't even know how to manage it anymore. I mean, when I saw the new Facebook format, the new UI, and played around with it, I kept hitting brick walls where it just stopped working and was really buggy. It reminds me of-- What's the term? Like an animal built by committee, where you've got everybody fighting for their own features, and things just keep getting built on top of other things to the point where it no longer makes sense. Obviously, that happens with governments, with laws, with a lot of different things in our world, and so I think there's a strong argument to be made there for breaking things back down to basics: auditing the entire service, auditing all of the rules, and maybe starting from a fresh point rather than trying to build more things on top of each other. We've seen what happens when you try to build layers on top of buildings; they often crumble. [laughs]

Tim Panton: So, weirdly, breaking up Facebook would be doing them a favor, then?

Jillian York: I think so, yes. I don't think that it solves a lot of the problems that I talked about; I don't think that breaking up Facebook is what helps protect democracy from Facebook, for example. And yet, I do think that breaking up these platforms is doing them a favor, and it's doing users a favor. We saw the thing with Twitter Fleets coming in a couple of weeks ago, and even though Twitter is a different company, Twitter, Facebook, and Instagram are all copying each other on these features. So I do think that breaking things up, separating them apart, and having more competitive and diverse thinking between these different platforms is ultimately a good thing for society.

Tim Panton: I've had my head in code for the last three months, and I have no idea what Fleets actually do.

Jillian York: They’re the same thing as Instagram stories.

Vimla Appadoo: I don't think anyone has an idea of what they do, because they don't fit with how you use Twitter. But I think the great point there is that there's no USP across the platforms anymore, because it's all just becoming homogenous.

Jillian York: Yes, it's true. I think that's what drives me away from actually wanting to use some of these platforms: just the homogeneity of it. They're also collaborating and copying each other when it comes to the rules sometimes, too. I mean, I don't think it's any kind of coincidence that we've seen this huge anti-sex-work and anti-sex push from these platforms over the past few years. I have no doubt that their policy people are looking to each other for guidance.

Tim Panton: I guess it's risky to be the one out in front or the one behind, so there's a sort of natural grouping; you don't want to be the one that everybody picks on, I suppose. But they're big enough to be able to defend themselves anyway, so maybe that's not a very good argument.

Jillian York: As I keep saying, it's so complicated. [laughing]

Tim Panton: But I think we should be welcoming that. I mean, welcoming more nuanced thought is what we should be trying to encourage, and I'm not sure how we do that. Maybe a long format like this is good for that, but—

Vimla Appadoo: Yes. There's definitely something in transparent complexity. There's nothing wrong with things being complex and complicated if it's transparent. I think the problem I have with Facebook, much like what you said, is that it's very hidden and you never get to the root cause. It feels clunky because no one really knows what that root is anymore, and so you're just left with this really opaque system that's for the developers to hold and manage and understand. That's where I think it gets dangerous.

Jillian York: Yes. No, I really agree. I think the leadership-- That's what I'm trying to say here: the leadership of these companies is so far removed from real life in some cases. I mean, it's been interesting to watch Twitter and the changes that Jack Dorsey seems to have made in the past year, to himself as well as to the platform. I think some of the hiring they've done is really smart, in that they've brought a lot more diverse thinking into those big rooms and boardrooms or whatever. And then I compare that to Facebook, where Mark Zuckerberg, like a boy king who's never had another job in his life, is pushing us toward destruction in some ways, and it doesn't seem like he has anyone around him telling him not to do that. I think that's why I keep comparing Facebook to the Soviet Union before its collapse, because it does feel like there's just this level of protectionism happening at the very top that is not healthy.

Tim Panton: Do you think he doesn't hear about the problems, or he doesn't believe them, or what?

Jillian York: I think he doesn't want to hear them, you know? And I don't think it's just him. He's surrounded by people at the very top of the company who also don't go to the meetings, who don't listen. I've seen Sheryl Sandberg speak live on several occasions, like in person, and she comes off like a politician reading a speech from a card. It's troubling, and I don't see that same thing when Jack Dorsey gets up there to speak, or when his deputies do. I'm not praising Twitter; there are a lot of problems with what they do as well. But I'm just really troubled by the kind of political nature of Facebook leadership and the way that they act like a state.

Vimla Appadoo: Yes, that's why I find it terrifying, and when people celebrate that as well it troubles me. [laughing] It's just really put me on edge, and—

Tim Panton: Celebrate it how? What do you mean?

Vimla Appadoo: So, when it hit the news that Zuckerberg could run for president and it was celebrated as a new form of politics, or when big tech companies get a say to influence policy or governance. It worries me that we're being driven by the wrong empathy. So, we're empathizing with the business rather than the people.

Jillian York: Yes, absolutely. I couldn't agree more. And just back to a point that you mentioned and I forgot to comment on: models like this, like the conversation we're having right now, are helpful towards building a greater complexity of thought. I think that's so important, and I've been fascinated to watch this new podcast boom happen this year when we're all stuck at home and don't have anything else to do anyway. It's been great. It's so rare that I've ever gotten to have these hour-long or hour-and-a-half-long conversations in the past. Even when I do live panels, there's often this very rehearsed-- like, the prep calls are longer than the panel itself, and then everyone gets to ask one or two questions. I've always found that to be a waste of time in a lot of ways. I mean, I do them because I think it's important to get thoughts out there, but I love this longer format. I used to hate listening to these things, and now I've become much more of a listener.

Vimla Appadoo: Yes, we've had more time not only to do them but to listen to them. I think that's just been a blessing in disguise this year, the amount of time that you've been given back.

Jillian York: It's true. [laughing]

Vimla Appadoo: Yes. [laughing] And the other question I wanted to ask was around-- and I've only just come across it, so I don't know a lot about it-- but is ClearPass the kind of new social media platform that's--


Jillian York: Yes, you know, I've got to say I don't know a lot about it either.

Vimla Appadoo: Yes, all I've seen is an article written about it, about moderation and how it's trying to pave the way for a new type of social media, but it's actually privy to the kind of white tech bro setup that has always existed in social media in Silicon Valley. I think there's something really sad in that we're not really doing anything differently, even though we're claiming to.

Jillian York: Yes, I think that's true. I mean, this year has definitely been a blessing in a lot of ways, and yet-- I think the most frustrating thing to me has been the way that governments have treated-- Well, basically the way that capitalism has forced us to continue doing things the same way we've been doing them, working at the same pace, and watching how difficult that's been-- not just for my friends with kids, who I think are struggling the most, but even for my friends who are comfortable, living in houses that are big enough, and don't have children or pets. It's unbelievable to me that we're expected to go about our work and our lives in the same way as if we were able to have social relief and time outside. Yes, I think there's a lot of reflecting that we need to do on that. [laughs] I know it's happening right now, but a lot of people are just dealing with first-hand struggles and don't even have the capacity to think beyond that. So, I'm happy that I have the luxury to do so.

Tim Panton: Yes, it's certainly true that there are people whose working style is very much person-based and who find it really difficult to do the sorts of work that they do over media like this. It doesn't suit them. I have a friend who-- I mean, he's a coder, but he just likes to code with other people in the room, and he finds this a real struggle. He's found this year really difficult for that reason. This is not me at all, but I get that there are people who have totally different ways of seeing their workplace.

Jillian York: Yes, it's true. We've even had some of these little divisions inside my workplace, between people who really miss being able to drop into other people's offices and people who are getting a lot more done because they work better this way. I've been working remotely and from home, nine time zones away, for six years now, and I love it. [giggles] I haven't had a lot of difficulty doing work during the pandemic. In fact, I would say I'm doing some of my best work because I'm not traveling. But at the same time, for me, the inability to go out dancing or to get the kind of social relief that I'm used to has been a huge challenge, and I've gone through waves of depression during this period. It's not that I'm having trouble with my job; I'm having trouble with everything else. [laughing] The absence of everything else.

Tim Panton: Right. Yes, it's kind of weirdly pointless to be in a great city when you're not allowed out into it.

Jillian York: [laughing] Yes, exactly. We had quite some relief this summer in Berlin, where we had 20 to 50 cases a day, so people were going out, everything outdoors was open, and the weather was great. So, I feel really lucky, because I know that didn't happen everywhere and that some people, or some cities, never really recovered from the first wave. But it's now so much more stark how much I'm missing that, because it's been like three months now where we haven't been able to do any of that.

Vimla Appadoo: Yes, I've had a similar realization of the need to play. Like, when you're working a lot, you don't get that playtime. I really, really miss it.

Jillian York: Yes, me too. Right before the shops closed, I bought a bunch of craft supplies, and I admit I haven't really taken much out of the box yet, but I need to kind of force myself to do that. It feels weird to force yourself to play, but my partner's great at it. They write music-- not for a living, but for fun-- and have been doing that every day, making some really incredible stuff and improving on their craft. And I'm regretting that I never really built up hobbies before the pandemic, [laughing] because I'm just sitting here going, like, I don't know how to draw. [laughing]

Vimla Appadoo: Yes, one of my reservations is that work became a hobby for me, and so when that's taken away I was like, "Oh, how else do I spend my time?" [laughing]

Jillian York: Yes, exactly. [laughing] My hobbies were dancing and yoga, and travel, and I can't do those things now. [laughing]

Tim Panton: Yes. I mean, yoga is interesting in that you can do it remotely, but getting the vibe right is super difficult.

Jillian York: Yes, I miss having the other people in the room. [laughing]

Tim Panton: Coming back to your thing about the new social networks. I wonder whether Parler and Clubhouse have both fallen right into the trap of, "Oh, we don't need moderation." And, "Oh, boy, you do!" [laughing] And they have fallen down that hole. Do we know of any kind of incoming social media where it does work, or where they've got a different moderation approach? I'm thinking maybe of something like Matrix or any of the other kind of distributed, decentralized networks. Discord seems to be okay.

Jillian York: Yes, I've had conversations with a few of the emerging ones that haven't really been released yet, and people are reaching out to me, and to other people who do the kind of work that I do, for advice on how to do this right, and so that's been kind of cool. I'm not going to name any of them yet because they're not fully operational, but yes. I mean, it was funny, because when Parler first came out they had these rules against sexual expression and nudity, and yet they were billing themselves as the most free speech platform, and I was like, "This is such a misunderstanding of obscenity law and a misunderstanding of what free expression is." But they quickly got rid of that, and I think the zero-content-moderation idea doesn't really work. I used to argue for it, to be honest. I used to say companies shouldn't be moderating speech at all, and on some level I still feel like companies-- emphasis on the word companies-- shouldn't be moderating speech. And yet, at the same time, I've kind of come around to think, okay, this is necessary, and we have to just find ways to do it with consent and get it right.

Vimla Appadoo: Well, I really struggle with this, and I completely agree with you that it shouldn't be the companies. Philosophically, one of the things I struggle with is who gets to decide what is right and wrong, because it's so opinion- and perspective-based. But how do you--? How does that happen?

Jillian York: Yes, there's not a good answer; I really struggle with that as well. I think that we're never going to have full agreement in societies, and some of the things right now that are being pushed really hard are things that I agree with, and yet I'm not sure it's a helpful long-term strategy to ban people from talking about certain controversial subjects, even if those people are completely wrong. Obviously, when there's the potential for real-world harm and violence, then we have to do something, but I think that preventing people from having even really problematic conversations often only pushes them harder into their set of beliefs. That's why I think a lot of the reactive, short-term pushes for censorship are not going to be the solutions in the long term.

Vimla Appadoo: Yes, because in the same way that that kind of silencing sparked the left-liberal social movements, it will do the same for the right. When you push it offline or behind closed doors, it will kind of turn in on itself to create a different type of movement. I guess that's a genuine fear: blindness. The less you see it in public spaces or public domains, as horrible as it is-- if it's not there, it's growing in the real world with no one to challenge it.

Jillian York: It's true. I've been in Germany-- obviously, you can't use certain symbols and certain words here. There was very clear reasoning for that in the aftermath of World War II that made a lot of sense, but in the long term, what it's done is push Nazis to use different symbology, sometimes more hidden, and to mask themselves in other movements. There was a period in Berlin in the early 2000s, or late 90s, where a lot of the Nazis were joining the pro-Palestine movement, because then they could say some of the things they wanted to say under the cover of a positive social justice movement. That really scares me, and when I look at it, I see a lot of US folks praising the German NetzDG law, the law on hate speech on social media platforms, and going, "Yay, Twitter doesn't have Nazis in Germany!" And my answer to that is, "Oh, yeah, it does. You just don't see them as easily."

Vimla Appadoo: Exactly.

Tim Panton: That's fascinating, because I was going to draw the distinction between it happening behind closed doors, making it more difficult for it to be a recruiting technique, and therefore you don't have this sort of YouTube slippery-slope problem. We haven't talked about YouTube as a social network, because in a sense it isn't, but I do think it's almost the most interesting from the point of view of censorship-- probably the most interesting target-- because its algorithm is so aggressively pushing you toward extremes. Given that, it gets kind of weird that we're doing nothing about it.

Jillian York: It is kind of weird. Yes, YouTube is a fascinating example. I wrote less about it in my book than the other platforms, but one of my favorite stories is that in 2008, Eric Schmidt, who was then CEO, was arguing in favor of allowing YouTube to host Al Qaeda videos. [laughs] Oh, how much has changed over the years. I think a lot of that was Syria. At first, YouTube was like, "Okay, we're going to let the platform show graphic violence, because it's so important for people and civilians to see what's happening in this country and for people to document human rights abuses." And then ISIS beheaded journalists on YouTube, and then they were like, "Okay, no. Now we actually have to censor all of this." And now they're censoring more documentation that's coming out of Syria, in ways that are really troublesome. There's this group called Syrian Archive-- they do a lot of different things, but part of their work is mining YouTube for videos that can show what's going on and that can potentially be used in tribunals in the future. And YouTube has almost entirely shut those types of videos down, because they've swung so hard in the opposite direction that they're not considering those use cases. And I think that goes back to the question of who decides, and right now the answer is a bunch of Americans who don't really have a lot of understanding of how the platforms are being used elsewhere.

Tim Panton: Well, or indeed degrees in philosophy or jurisprudence or anything else.

Jillian York: [laughing] Absolutely, yes. And increasingly, Facebook in particular is hiring former law enforcement and former government people, and I think that's just really scary.

Tim Panton: Yes. Well, I mean, they stole our-- no, didn't steal-- they gave a job to our ex-Deputy Prime Minister, for reasons which escape me completely, but maybe they thought it was a good idea.

Vimla Appadoo: Yes, I think that way around scares me more, when Facebook hires ex-politicians. But then-- Actually, right now I think it scares me even more. [laughing]

Jillian York: Yes, I'm not sure which is the scarier one at this point to be honest. [laughing]

Tim Panton: I'm kind of interested to hear where you see the future, because we're coming to the natural end of this conversation, but I do want to get a sense of where you see this going. You said it's an inflection point. Can I push you to [Jillian giggles] come down on one side or the other and say where you think it might end up?

Jillian York: Well, the final chapter of my book is called The Future is Ours to Write. [laughing] So, I actually don't have an answer for you, because I think that we're still writing this. That's what I think is so interesting about the moment that we're in. Because soon we're going to be able to go back to the office, and soon we're going to have in-person organizing again, which does allow for a lot more secrets, so to speak-- or at least private conversations to happen without the threat of surveillance. And I think that once we're back in that kind of action, the future is unpredictable, both in terms of the bigger picture of world governance and all that, but also the smaller, or increasingly bigger, picture of what happens with these companies. Right now there are a lot of leaks coming out of Facebook, and I don't know if that's going to be possible when people are back in the office and under more surveillance. Yet on my side of things, I know that we're going to be able to do better organizing when we're able to meet in person again, because it's just so hard to have some of those conversations on Zoom. So yes, I think that partly because of the pandemic, but partly just because of the way that life and the world work, the future is still unwritten. But let me say this to give you a little bit of a smarter answer: I think that European regulation is going to play a huge role. I think the Digital Services Act-- at least as far as I know so far-- is going to play a positive role in what the future of these platforms looks like, especially when it comes to transparency and accountability. I'm worried about what's going on in the US with Section 230 reform.
Personally, I don't think that there is smart reform possible on this. I think that it's a really important law, and while I don't want companies to just be able to do whatever they want rampantly, I also think that making them liable has the potential to create more harm than good. So, those are a couple of the things that are really on my mind. And I guess the third one, to round that out: I think that the people in these companies have a lot more power than they think they do. I don't mean the people at the top; I mean the people making policy moves, working in the regional offices, etc. And I think that they really are going to have to start taking a stronger stand in their jobs.

Tim Panton: Yeah. I'm totally with that, and I think it also comes down to the people making technical decisions on a day-to-day basis. Because that increasingly bleeds into real life, basically.

Vimla Appadoo: Yes. I say it all the time: we have a responsibility as people delivering technology, at whatever level that is, to make sure it serves the whole of society, not just a single part of it.

Jillian York: Yeah, absolutely. Yeah.

Tim Panton: Cool. I think that's a great place to end. Thank you very much for spending your time on a Friday afternoon with us; we really appreciate that. What we'll do is ask: if there are any links or anything you think people might want to read around the discussion-- obviously your book, but also anything else we've talked about where a link would clarify or help people dig deeper into a subject-- send them over and we'll drop them into the show notes. Yes, I'm going to unclick the record button.