Tim: I'm Tim Panton, and this is the Distributed Future Podcast. The goal in this podcast is to talk to people who are doing interesting things and hopefully from them, understand what the future might look like because they're the experts in their own issues and they kind of know what's going to happen, what's coming down the pipe. I encourage you to subscribe because otherwise you'll miss an episode and not find out about the future, which we all need to know about realistically. I think that for this particular episode, you might want to kind of go back and listen to Episode 45, which was from Word On The Curb, Hayel Wartemberg talking about how to get positive messages into social media, which I think might be kind of relevant to this conversation. The other one that is slightly kind of more tangential but I think that people would be interested to listen to is number 14 with Melissa Pierce, who was talking about how to detoxify a brand. In that case, she was talking about how to get older women acquainted with cannabis without it feeling like it was kind of druggies behind the bike shed in high school. That was a fascinating conversation and I think is also possibly relevant to what we're going to do. But anyway, speaking of what we're going to do, one more thing, which is to say that Vim isn't here-- maternity leave-- which is great, except that it leaves you in my capable hands. But the good news is that we have a guest here this week who is going to tell us about what the future of social media might be. So Dave, if you'd care to introduce yourself, we'll get going.
Dave: Yeah, sure. My name is David Troy and I am a technologist and entrepreneur based in Baltimore, Maryland in the United States. And for the last several years, I guess since around 2007 or so, I've been studying social media data online to try to understand more about how people use social media, how communities form on social media, and really how social media influences society and vice versa. So understanding the relationship between our online worlds and our in-real-life worlds, and how that affects our lives and our existence as humans is very much of interest to me. I've been, for the last five years or so, really paying a lot of attention to disinformation and how it functions, and how social media has been a very potent force for the distribution of disinformation, and altering our social makeup as a society.
Tim: I think the first time I came across this as a concept was a piece of art that you'd done, it must have been 2008 or something, the Twittervision thing which was capturing Twitter data and then presenting it in a kind of social graph. Well, not social graph but in a graphical format that told you something about society. Then you did kind of a subsequent thing whose name I've forgotten, but there was actually a graph and you can see how a city is maybe split into two different topic areas or something. You were the first person I came across who was doing this big data analysis on social media way back when. And you've stayed doing it ever since, or have you gone off and done something else as well?
Dave: Well, I've been doing a lot of different things but, you know, what I was really interested in with the first project that you mentioned, the Twittervision, was really understanding the geography of kind of like where people were. Like, if people are self-reporting their locations, where are they? I actually developed a little scheme for people to report their locations inside of tweets using the convention of putting the letter L and then a colon, and then their location. Then we could geocode that and map that onto a map so you could actually just get a sense of like where people were around the world using this, and why, and what their voice was and all of that. But then I quickly realised that well, that's interesting and was pretty compelling. What I was really interested in was this idea of networks, the idea of what sort of cliques form within a community. You mentioned that in a given city there might be different kinds of networks that emerge out of that. You might have people that are one race or another, or they might be really interested in one topic area. So you got people that are interested in sports versus people that are interested in media, or cooking, or food, or whatever. All of that stuff kind of comes out in that distillation of that data. I found that really really fascinating and I actually thought about it in terms of something that could be used for good. Like if, for example, local governments had a really deep understanding of that kind of information, they could design their cities and their services to be better and to make cities and places stronger. What I didn't really realise in retrospect, looking at all of that, was that you could use that same kind of data to do a lot of really nefarious social engineering if you were so inclined, and that that was in fact what, you know, Robert Mercer and Steve Bannon and Cambridge Analytica were brewing up at that same moment.
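The "L:" convention Dave describes can be sketched in a few lines. This is a hypothetical reconstruction: the exact pattern Twittervision used isn't given in the conversation, and the example tweet is invented. Actually turning the extracted place name into map coordinates would be a separate step handed off to a geocoding service.

```python
import re
from typing import Optional

# A minimal sketch of the "L:" location convention: users put the letter L,
# a colon, and then their location inside a tweet. The regex is an
# illustrative assumption, not the original Twittervision implementation.
LOCATION_PATTERN = re.compile(r"\bL:\s*([^.!?]+)", re.IGNORECASE)

def extract_location(tweet: str) -> Optional[str]:
    """Return the self-reported location embedded in a tweet, or None."""
    match = LOCATION_PATTERN.search(tweet)
    return match.group(1).strip() if match else None

print(extract_location("Watching the harbour at sunset. L: Baltimore, MD"))
print(extract_location("No location in this tweet"))
```

Once extracted, each place name would be geocoded to a latitude/longitude pair and plotted, which is how the world map of self-reported voices was built up.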
That's kind of how I got led into this disinformation thing; I was really trying to counter their efforts using the same kind of data that I had already been exploring.
Tim: This is a terrible thing to attempt to do on a podcast but what was fascinating to me on that was the visual blobs of sports fans versus music fans. So you kind of get a blob of Country and Western and you get a blob of the local football team, and you'd see very small overlaps between them but actually, it kind of isolated big blobs with very few interconnections. What I think I hear you saying is that that was a place where you could insert division if you were so minded. You can like fracture a society by finding those gaps and whacking a chisel between them. Is that a fair analysis?
Dave: Yeah, I would say so. And basically that's the same kind of process that works in what people refer to as psychological operations. In a psychological operation, the first thing you do is target audience analysis, which is to basically figure out: who are you talking to and what do you want them to do? And what are going to be the behavioural triggers that will get them to do the thing that you want them to do? And notice, none of this has anything to do with facts or ethics or anything like that, it's all about how do you get the behaviour that you want out of that target population. And that could be exploiting their fears, it could be creating fears within that particular group. For example, if I've identified that the progressive politicians and their supporters are in a different blob than the sports fans, I could potentially start telling the sports fans rumours about the progressives and get the sports fans to start to have animus towards the progressives. Then at that point, you start to actually alter the social structure itself and to sort of reify these groups into their own special walled areas. The more that process advances, the more reactive those groups can become. So you can start to message them with really anything you want that will get the behaviour that you want out of that target population. It's an extremely nefarious and I would say cynical way to manipulate people. It doesn't have anything to do with appealing to their intellect or to their sense of logic, this is all about just getting one group to respond in a way that you want them to respond.
Tim: I guess the presumption there is that a group will carry on behaving as a group, like outside the cheering for one football team, there is a bunch of other things that that group has in common in terms of the way that they respond to messaging. Is that well founded scientifically?
Dave: Yeah, I would say that these kinds of networks tend to persist over time even within the context of a city. And that has to do with the fact that social capital, which is really what binds these groups together, kind of persists. It doesn't really go away within the confines of a group. Now, social capital between groups is a little bit more tenuous and if you can find ways to sever that you can kind of alter how cross-knitted these different groups might be. But within a given group it tends to persist over time, especially if you do things that harden their internal social capital. Just as an example in the United States, we've seen a lot of kind of crazy stuff going on with COVID disinformation and most recently there's been all this debate around critical race theory. I would say that, for the most part, people don't even know what critical race theory is or says, but they know they don't like it. And that is a common feature of people in this group: they feel like it's something that's being shoved down their throats and it's something that's coming from these liberals and East Coast elites and all of that kind of thing. It does tend to persist over time and these groups are pretty identifiable and can be targeted to behave in very specific ways.
Tim: So the way that you're targeting them is very specific to the group. I mean, your disinfo is aimed at a particular group. There's some kind of cross-fertilisation, like you see some of the same kind of disinfo in other countries, but it's quite localised. What works in one country doesn't necessarily work in another even within the same kind of target group. So we still have a geography here.
Dave: Yeah. I would say it's more cultural than necessarily geographic because clearly, certain things are going to play better with certain cultural targets than others. Just take a really broad example, like look at misinformation and disinformation around 5G and COVID. That's been kind of a global phenomenon and it plays pretty well in the US, it plays well in the UK, it plays well in Germany. There's a lot of reasons for that and in fact there's some overlapping different groups. For example, there's a left wing kind of hippie granola kind of crowd that thinks that anything having to do with science or whatever is bad. So they don't want to deal with vaccines, and they don't want to deal with 5G and it's gonna give everybody cancer and all of that. But there's equally a right wing faction that is resistant to vaccines because they think that Bill Gates is putting chips in it and that 5G is some kind of evil government plot to track everybody and all of this. So there's actually kind of an overlap where you can get a single cohesive group out of two different parts of the network that agree on these things. And then if you can build social capital within that group, that becomes its own force and its own target that can be manipulated in its own right. It's a very pragmatic, very cynical kind of approach where it really doesn't matter what the message specifically is, it's all a question of whether it gains resonance within a particular section of the network. And if you can activate two sections at once with a single set of messaging, all the better. That's just very convenient and practical and efficient.
Tim: What does it cost to run a campaign like this?
Dave: Well, I think that that's a difficult thing to talk about in any clear terms because what it really boils down to is a kind of networked insurgency. It's a kind of networked warfare where there are people that are buying what amounts to Facebook ads and running them online and they're achieving some of these goals by way of just these sort of traditional ad purchase mechanisms. But that's, I would say, a pretty small portion of the spectrum that's being deployed here. So you have a lot of people who are running their own independent information operations, let's say independent "journalists" who are doing work that pulls in the direction that the information operation wants to pull in, but they have their own like YouTube channels that they're monetizing, or maybe they're on Rumble and they're receiving cryptocurrency donations from supporters, or they have a big audience on Gab or Parler or something like this. And then there's a whole variety of what I would consider to be corporate and state-backed entities that are pulling in the same direction as well. It's really difficult to kind of just say that there's like one vector here. It's multiple vectors all driving in the same direction.
Tim: What I hear that as being is: you can lay down and crystallise these things, and then everybody else will do the work for you. Is that right?
Dave: Yeah.
Tim: It's relatively cheap to start the ball rolling if you're smart about which ball it is and the direction you pick. If you get it right, then everybody else will just roll it forward for you.
Dave: Exactly, I think that was really well put. If you look at who's behind a lot of this, you've got people like-- at least on the US side there's-- we'll talk about Europe in a second but on the US side, you've got Michael Flynn who is a four-star general. A three-star general, sorry. And Steve Bannon, who is ex military. And what these guys understand is networked insurgency. Flynn was deployed in the [Balkans] and in Iraq, Afghanistan, and knows that if you want to get a thing done, the thing you want to do is to get the word out to the mullahs. And the mullahs are going to start talking about stuff to their followers and those followers are in turn going to go do things undirected by anybody at the top and with a great deal of plausible deniability, because each layer of that network acts as a cutout, right? So you can't just say, "Well, I was ordered by so and so to do this." No, it was just like, "Well, it became clear to me that this was a good idea so I went and did it." And sure enough that's how stuff gets done in this kind of networked warfare. And in Europe, Steve Bannon and Mike Flynn have allies in people like Vladislav Surkov, who is one of Putin's chief propagandists and strategists, as well as Alexander Dugin, who is this advisor to Putin who has been helping him crystallise a strategy for geopolitics. He actually wrote a book called The Foundations of Geopolitics that talks about a global realignment away from NATO and the EU. They want to put the US and UK into a strategic bilateral alignment and then give Russia and China, I would say, a split reign over Asia and even Europe. It's a very big kind of network strategy.
Tim: To what extent is the ability to do this new? I mean, to what extent does the existence of things like YouTube and Twitter and Facebook, with their specific monetization-of-attention strategies, make this more possible than it used to be?
Dave: Yeah, I think the biggest differentiator between what's happening now and what was possible maybe 50 years ago, and that's looking at traditional media, is that you've got scale now. The internet reaches everybody. It's not something that was just reaching early adopters. In the early days of the Internet starting, let's say, around 1990, which is sort of the very earliest pulse that you could get on the internet for most people, it was kind of utopian, because it was reaching nerds. People at universities, people that had a deep interest in the internet, cybernetics, all that kind of stuff... Those people were populating those networks. And so that was people like you and me, right? And in those early heady days, it felt very utopian because there was this idea that you could just sort of take these values that were brewing at that time and scale them up to the world scale, that the world would be better and more like that. Well, that isn't how it happened. [laughs] Instead, the world eventually showed up on the internet and started to infect how everything worked on the internet. So these current systems that we have were kind of built with that earlier utopian vision in mind, and/or with a certain amount of naivety, right? Nobody really sat down and thought through in detail what would happen sociologically if the entire planet was exposed to these systems. And so what we're dealing with now is really the aftermath of the societal consequences of this exposure. You never hear about startups focusing on ethics right out of the gate. [Tim laughs] Like, they don't have like a chief ethicist or a philosopher in residence or something that's helping them go through these issues. They certainly don't know anything about sociology most of the time. So it's not surprising that these consequences are occurring.
Tim: I'm not sure I agree, I think you're being overgenerous. [Dave laughs] It was always clear to me that there was a kind of an unpleasant side to the internet. I remember, almost- I'm trying to think when this will have been. But very early in the 90s, being on a group called [Alt Flame Roommate], which was absolutely glorious. It was hysterical to watch, but it was basically college students bitching about their roommates and giving them death threats and being amazingly petty about their food. And all the things that roommates do, but with an audience. And the audience loved it. Even from that, you could see that it would be really easy to be super unpleasant in this medium and get people to watch.
Dave: Yeah. I definitely agree with your experience there and certainly recall a lot of that kind of stuff early on. I guess what I'm getting at is that the ethos that kind of emerged out of that moment was that, you know, the Internet was this new special place and it needed to be protected, and it was mostly going to tend towards goodness. And that while there was this crazy undercurrent that would sometimes be there, that that was people's right to have that kind of crazy undercurrent and it was okay. Because at that time, it really hadn't scaled up to be something that was going to have massive societal consequences. But now that you do scale that up, we start to see all of these warts that were there all the way from the beginning. Yeah, I agree with that take as well.
Tim: Okay. Yeah. I'm curious to see whether you see structural differences between-- let's pick Facebook, Twitter and YouTube as kind of the big examples that everybody knows. Do you see differences in the way that they operate such that that causes differences in how they're used for this kind of campaigning?
Dave: Yeah. I think they have natural sort of modes that people engage them with and I think that each of them has kind of different societal results. It's very difficult to kind of compare them one to one because they do just operate differently and we're not even able to get data on a lot of things, so it's kind of speculative to some extent. But I think it's reasonable to make the assumption and make the statement that design informs how things are used, which in turn informs the effects that they have. And so we've heard, you know, certainly early on about the YouTube recommendation algorithm, and how if you start watching one video and then keep watching the videos that it recommends, you end up going down these rabbit holes of extremism. I think that in general, they've become aware that that's a problem and I think that to some extent, they want to avoid getting a black eye over the most obvious examples of that. So, you know, when people report things, I think they do take a look at that and try to figure out, "Well, how can we keep this from ending up in a completely dark place?" And to some extent, why do they have these crazy dark videos on the platform to begin with? But I think that's different on YouTube than it is on Facebook than it is on Twitter and those platforms are all changing how they work on a daily basis, with a lot of it being kind of a black box. So you don't necessarily know what effects they're having, you just know that they're having some effects.
Tim: I was going to ask you, like, if you're spending a lot of time looking at this stuff, how do you become immune to-- how have you stayed immune to being influenced by it? Does it not kind of radicalize you as well?
Dave: I think that that's a concern. A lot of it boils down to really thinking in terms of networks of influence rather than looking at individual atomized pieces of content as standalone things to be sort of judged or taken in or whatever. A lot of it boils down to like, "Okay. Well, who is this? Why are they saying this? Who else are they connected to? What is their intention? Where do they want to drive people to?" Just as an example, I came across some really wild COVID and 5G-related disinformation around May 2020 and I was like, "Good lord, who is doing this? It's completely batshit insane." [laughs] And it was coming out of a group called the International Tribunal for Natural Justice and at that moment, the particular thing I was looking at seemed to be particularly targeted at the UK and I was like, "This is a crazy group." Once I started studying who that group was, I realised it tied directly into the QAnon network and the people behind that, and it all sort of started to make sense. But you know, if you encounter these things kind of atomized and out in the wild, they kind of sound like they make sense sometimes and they seem to have people that sound like they have credentials, and you just don't know. A lot of it is just network analysis and understanding what's connected to what and what their motives are, and trying to just put things as quickly as possible into buckets of related influence so that you're not taking things in on their own.
Tim: So you're kind of applying... Are you applying maths to the problem, or is this a kind of philosophical process? How do you do that packetization?
Dave: We've done a lot of network analysis on different things over the years. So especially when you find something totally new and you don't really know what it is, it can be helpful to do network analysis because you can see the way that stuff around it clusters and what kind of themes emerge out of that, and then usually that'll be something that you're familiar with and so you can kind of say, "Okay, this is these people doing this thing." But after enough of that kind of analysis, a lot of it becomes kind of just a sixth sense that you develop and you can kind of get a sense really quickly of how to do that kind of analysis just in your head. The other thing that we have the advantage of is there's a network of researchers and scholars and reporters and other people that are in, I would say, a very loose affiliation to kind of trade notes on this stuff. So if there's something that I don't quite recognise off the top of my head, I will send it over to whatever relevant expert I know that can just very quickly sort it out and go, "Oh, yeah. These are the crazy cult people in North Dakota," and I will be able to put that into the right bucket that way. I literally have like a handful of cult experts on my Signal chat that I can reach out to at any given time and say, "Hey, this looks like a crazy new cult. Is this actually something new or is this just a rewarming of something else that's been around since 1975? You tell me. You guys know." That's very helpful.
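The clustering step Dave describes-- seeing how the accounts around a new piece of content group together-- can be illustrated with a toy sketch. The account names and edges below are invented, and real analyses use proper community-detection algorithms on much larger interaction graphs; this simplified version just groups accounts into connected clusters via breadth-first search.

```python
from collections import defaultdict, deque

# Invented who-interacts-with-whom edges; real data would come from
# retweets, replies, follows, etc.
interactions = [
    ("sports_fan_1", "sports_fan_2"),
    ("sports_fan_2", "sports_fan_3"),
    ("foodie_1", "foodie_2"),
]

def clusters(edges):
    """Group accounts into connected clusters (a crude stand-in for
    community detection)."""
    graph = defaultdict(set)
    for a, b in edges:
        graph[a].add(b)
        graph[b].add(a)
    seen, groups = set(), []
    for node in graph:
        if node in seen:
            continue
        group, queue = set(), deque([node])
        while queue:
            n = queue.popleft()
            if n in seen:
                continue
            seen.add(n)
            group.add(n)
            queue.extend(graph[n] - seen)
        groups.append(group)
    return groups

print(clusters(interactions))
```

On this toy data the sports fans and the foodies fall into two separate blobs, which is the kind of structure that then gets labelled ("these are the crazy cult people in North Dakota") by comparing it against known networks.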
Tim: That seems to say that there have always been cults; it's just that they're kind of more weaponized by the technology, which you're describing as naive, and I'm thinking that's too generous. But anyway. So you think it's just a weaponization of an existing behaviour set?
Dave: Yeah, I think that that's a fair assessment. And I think if you look at the longer term history of secret societies and cult religions throughout, let's just say, the last 2000 years just to pick an easy number, this kind of thing is not new. I think what makes it new now is that it's much much easier to stand up a tribe than it's ever been. It's easier now to message at them and reinforce the social capital within that tribe to induce the cultish behaviour. I would say maybe three or four years ago, I started getting the idea that we really need to understand cults as part of this whole situation, because it's really the dynamics of cults that get people to dispense with truth and reason, and also to start to really hate their out groups and to see them as being physical threats. So while there was so much hand wringing around "why do people believe things that aren't true? Why did they dispense with science? Why do they vote against their interests? Why do they seem so ready to commit political violence?", you know, the answers are all there in the studies around cults. If you read people like Robert Jay Lifton, who wrote the book Thought Reform and the Psychology of Totalism, he's got in there eight criteria for cultish behaviour that apply to most of what we're seeing right now. Then there's a guy named Steve Hassan, who's a good friend of mine and a leading expert on cults. He was in the Moonies cult himself and got out in the late 70s, and has since written extensively on how cults function and the kinds of things that they do to people to exert undue influence on them, to get them to behave in certain ways. And then most recently, he's done a PhD on really understanding all of that in a clearly quantitative way. He and I are in close touch.
I don't think you can understand the disinformation problem and the social media design issues without also understanding radicalization and cult behaviours, because it's really just the same stuff. And so if you think about it from that perspective, disinformation is just an instrument that can bring about cultish behaviours. And it does so by destroying social capital and ties into family and friends, and creating toxic parasocial capital in new networks that allow people to feel a sense of belonging around a shared cause.
Tim: So, where do we go from here? If we know this and we start to understand the problems faced, what would a better social network look like? How would it avoid these problems?
Dave: Yeah. This is where we're kind of in a choose-your-own-adventure moment in society where we kind of have to make a decision about what direction we're going to move in with this. I would say that there's two basic schools of thought that are starting to really become prominent, although we haven't yet had this conversation in the kind of detail that we will need to and I suspect that this is what's going to emerge over the next year or so; which is that we can either continue to have these fairly centralised social networks that are run by autocratic leaders like Mark Zuckerberg, or we can start to think about decentralising them. Now, decentralisation is a very loaded topic and I'll get into that in a second. The other piece that we need to consider is regulation and how regulation might apply to either centralised or decentralised solutions. Now, one disadvantage to something like Facebook is that, you know, it is very much a reflection of Mark Zuckerberg's personal values and whims. He owns I think about 55% of the voting shares in the company and what he says is the law of the land for something like 3 billion people that are using Facebook. That has some inherent weaknesses. Some people, as I mentioned, are talking about decentralisation. The only problem with the decentralisation narrative is that nobody has really given a sufficient amount of thought as to how decentralisation might promote this kind of cultish behaviour that we talked about.
I'll try to lay out just a little bit of a sample of how that could be a problem. If you give people the ability to set up their own little social network and have it be, maybe, in some way broadly interoperable with other social networks-- so they can kind of choose how their little sub network talks to other networks and what kind of things they do, and then within those sub networks they have the ability to moderate behaviour based on whatever rules exist within that little sub network-- there really isn't much within that structure to prevent them from becoming extremely cultish themselves. So if a little subgroup decides that it wants to become cultish and to exclude everybody that doesn't think like they do, and to practice daily chants at 3:00pm and use coded language and do all the kind of awful things that humans tend to do, I'm not sure that there's any real mechanism by which that can be stopped. And suppose that that little cultish group starts to turn political and they decide that they want to take out their political enemies and organise to conduct terror attacks or other kinds of actions that are negative for society, to put it mildly-- I don't know how you really stop that, particularly if that's all hidden behind encryption. I think it's unnecessary, but people talk about using Blockchain as a mechanism for that. The idea being that if you could sort of use this distributed ledger technology, it would be totally unregulatable. And so if you have, let's say, an authoritarian state that wanted to squash dissident behaviour, that kind of structure would enable that to be resilient against that kind of intervention, in theory. Now, I think that that's pretty naive, but that's kind of what the advocates of the decentralisation agenda will tell you.
Tim: It'd be really interesting to look and see whether-- like, this has already played out to some extent, I imagine, in Minecraft-- I know it sounds like the wrong place to be looking, but Minecraft has had this kind of small servers run by individuals with their own rules thing for a while now. And I wonder whether, like, have there been clannish Minecraft servers that were full of hate? Or have they all been sweetness and light? Does the eight-bit thing make it intrinsically charming?
Dave: I think maybe Minecraft is one possible place you could look, but I think you can also look at things like Discord, which tends to attract a similar kind of audience and has all sorts of behaviours going on there. When you look at things like 4chan, where anything goes and a lot of weird behaviour kind of emerges out of that, a lot of what we know today as, like, the alt right emerged out of meme culture that came from 4chan and the channels on it.
Tim: Just briefly on 4chan-- maybe we'll track back to it in a minute-- but the Bronies had a minor victory in the 4chan world; they kind of managed to hold off the forces of darkness for a couple of years by being unrelentingly nice. Is that a scalable solution?
Dave: I think you have to look at that in the context of, like, social capital and what is that actually doing? Like, what does that do in terms of the network dynamics? Are they building social capital amongst themselves that enables them to kind of withstand this attack from the outer network? How does that function? I mean, this would have required somebody to go do like a full, you know, sociological ethnography on that community and I don't know that anybody did that. But it's possible that that research is out there, you never know. But I think that's an interesting inquiry and the short answer is, I don't know that we know. But I also think that these sorts of things that tend to work or can work at small scales, sort of in the corners of the internet, may or may not be applicable when we get to the everybody stage where we've got everybody participating, and also you've got all kinds of incentives built into the system for people to manipulate it and to gain political advantage or to gain cultural advantage. We could go on a whole nother show about this but one of the things that's happening right now is that Steve Bannon and his allies are playing at this cultural level, which I liken to a network layer two-type thing, and the politicians like Democrats and people pushing policy are operating at something like layer four or five, you know? [laughs] And so the machinations that go on at layer two end up sort of foiling the plans of the people at layers four and five, and they don't know what's going on because they're like, "Oh, I don't know, we put out a good plan and people said it was a good idea but they're not voting for it. Why is that?" And it's because of these tectonic shifts that are going on at layer two. I think that's another analogy we have to keep in mind.
Tim: I think there's also a delay thing there, that by the time you've legislated for something and imposed regulation on it, it's often moved on. It's no longer the problem it used to be, because people have moved on to another platform or, you know, they've started sending GIFs instead of typing or whatever it was. The thing has moved. How do you tackle that with legislation?
Dave: I mean, I don't know that we know; I think we're going to have to figure that out. But all I can say is that if we are going to try to regulate this stuff, we need to come at it asking: what are the effects on social capital? Does it tend to make people more radicalised or less radicalised? That's a loaded set of questions, and there will be subjective answers, but if we don't grapple with it at the societal and social level, rather than just looking at whether the information is true or false and that kind of thing, then you get into the thought-police kind of stuff. I think that's more dangerous and less tenable as a place to regulate. But if you can say that this platform tends to have radicalising tendencies, not about content but about social structure, that seems to me like something more tenable as a place to apply pressure.
Tim: Do you think that's measurable, though, social structure?
Dave: I think it broadly is. Basically, what you would need to do is look at people's social networks and at how inward-facing they are, and also at what parasocial connections they have and whether they tend to be connected to a broader range of things or a narrower range of things. And frankly, it is really, really difficult to come up with normative statements about that, because, you know, if somebody is super into trading cards and board games, and that's about it... is that bad? Should they have the right to only be interested in those things? Arguably, they should. But I do think it isn't really a question of what's happening to any one individual. It's more about the effect that a platform tends to have on a large swath of society. So, for example, Twitter was criticised recently because their platform tended to promote right-wing content more frequently than other kinds of content. And Twitter was very quick to acknowledge that and say, "Hey, we don't even really know why this is, but it seems to be true and we're as surprised as anybody. But here it is." So I think you can look broadly at whether the platforms tend to bring culture closer together or to divide it into more atomized units. You can measure that, and it might be a reasonable metric to inspect and graph, and to try to push in the right direction rather than the wrong one.
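One way to make the "inward-facing" measurement Dave describes concrete is the Krackhardt–Stern E-I index from social network analysis, which scores each node by the balance of its external versus internal ties. The sketch below is a minimal, hypothetical illustration, assuming we already have an edge list of ties and a group label for each account; it is not anything the platforms are confirmed to use.

```python
from collections import defaultdict

def ei_index(edges, group):
    """Krackhardt-Stern E-I index per node:
    (external ties - internal ties) / total ties.
    -1.0 means completely insular; +1.0 means only outward-facing ties."""
    ext = defaultdict(int)       # ties crossing group boundaries
    internal = defaultdict(int)  # ties within the same group
    for a, b in edges:
        if group[a] == group[b]:
            internal[a] += 1
            internal[b] += 1
        else:
            ext[a] += 1
            ext[b] += 1
    scores = {}
    for node in group:
        total = ext[node] + internal[node]
        if total:
            scores[node] = (ext[node] - internal[node]) / total
    return scores

# Toy example: two tight clusters joined by a single bridging tie.
edges = [("a", "b"), ("b", "c"), ("a", "c"),   # cluster 1
         ("d", "e"), ("e", "f"), ("d", "f"),   # cluster 2
         ("c", "d")]                           # the bridge
group = {"a": 1, "b": 1, "c": 1, "d": 2, "e": 2, "f": 2}
scores = ei_index(edges, group)
# "a" has only internal ties -> -1.0; bridge nodes "c" and "d" score higher
```

Averaged over a whole platform, a drift of this kind of score towards -1 would be one crude signal of the atomization Dave is talking about, though as he notes, turning that into a normative judgment about any individual is a separate and harder question.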
Tim: The Twitter thing was fascinating because the only place where that disproportionate amplification of right-wing themes wasn't the case was Germany, where there is already legislation about some things that you can't say on Twitter. So it's kind of interesting.
Dave: And there's also cultural norms around that, too, that maybe don't exist in other places.
Tim: I'm not sure how effective those are these days. I think, particularly around the vaccine and 5G things, there aren't any cultural norms about what you can't say anymore.
Dave: Right.
Tim: So, listening to what you're saying, I was wondering... Is religion an antidote to this? Something that is trying to bring people together in a shared purpose. Is that an antidote, or is it a dangerous sign? I can't get my head around which it is.
Dave: Yeah, it's kind of being used in both ways right now. One of the things you have to grapple with when you're working in this space is: what's the difference between a religion and a cult? [Tim laughs] Some people say there isn't much of one, and that religion is just a cult plus time. I'm a little bit less of that mindset. I think that honest religions are faith traditions that give people a place to express spirituality and community. But the most important factor that makes a religion a religion and not a cult is that you can leave any time you want, and you're not being pressured into participating or into engaging in some social milieu or shared project that you don't necessarily want to engage in. Because once you cross that border, you're really in the realm of undue influence, which is the hallmark of a cult. So religions that function in that way, faith traditions that give people community and are positive forces in their lives, can be useful here, because they tend to exhibit this property of having cross-cutting social connections that create balance in people's lives and make them less cultish. But, you know, for every religion and every local parish or church that operates in that way, where people do have free will and agency, there are organisations that are not operating as ethically, some of which have been weaponized. I mean, Putin is making every effort to weaponize the Orthodox Church to advance his agenda. There's a whole variety of evangelical and Protestant traditions in the United States where people are really herded into participating in certain kinds of political behaviour. I think a lot of those are fairly unethical, and some of them are just outright totally unethical. There's a good documentary on HBO right now about a religious cult in Tennessee that was being run by a woman named Gwen Shamblin.
It was all about weight loss, and Jesus, and the elect, and all this crazy stuff. [laughs] Those kinds of groups are very, very vulnerable to being weaponized for political purposes because they can get people to all behave the same way; they're all socially tied to each other, and there's a lot of pressure that can come from the leadership and members of the church to behave in a certain way. Lastly, there's a thing going on in the United States called Gloo, G-L-O-O, which is an offshoot of Cambridge Analytica that is targeting vulnerable people who are experiencing, or have experienced, things like alcoholism, drug addiction, divorce, financial woes... and recruiting them into church communities. That sounds innocuous enough on the surface, but they're using data analysis to do this, and some of the data they've got has come from de-anonymized records that they've managed to piece back together and connect to specific people. The reason they're doing this is that once people join these church communities, they're more likely to vote Republican. They see this as a way to grow their numbers by targeting vulnerable people. It's really an offshoot of a group that existed before, called Campus Crusade for Christ, which was trying to do this kind of thing on campuses. So they've merged that approach with this targeting-the-vulnerable approach, steering vulnerable people in specific geographies into the watersheds of specific churches that are aimed at radicalising them. I wish it was not that cynical and insane, but that's exactly what's going on. There's a documentary that people can watch called People You May Know that outlines this. It's based on the research of a guy named Brent Allpress. Fascinating stuff.
Tim: We'll make sure that we put links to these things in the show notes if we can, and then people can follow up this stuff. But something you said there which is kind of interesting about the desirable property of cross communication between social groups. How would you build a social network where that was emphasised and encouraged? What attributes would that network have? What kind of monetization? What structure would it have that might tend to do that?
Dave: The first thing you might do, if you were thinking about how to go about that, is to think about what kinds of social structures already have that property, and what we could use for inspiration. So, just to pull a couple of ideas off the top of my head: think about a small town. A small town has a bunch of random people who are sort of born into it. They grow up there, they go to the same high school, they eat at the same coffee shops, they go to church. There might be three or four different churches, there might be a synagogue or a mosque, or whatever. But in general, people need to participate in a shared civic reality and to do so in a relatively civil way. I don't know if this is the same in the UK as it has been in the US, but there used to be an old saying that over the dinner table you don't talk about politics or religion. There's a reason for that: those topics can be divisive and are sort of uncouth in a world where you are trying to maintain cross-cutting social connections. It's just not polite to bring up stuff that people can't necessarily change or which is contentious for them. And in the name of civility, we tend to put those things aside in favour of cheerful exchange with each other. So I think that kind of thing happens in small towns. The other place it tended to happen a lot in the 20th century was the military, you know? You had people serving from very different backgrounds, rich and poor, East Coast and West Coast, rural and urban, and they came to trust each other. And so within the community of veterans, you would often find people who are like, "Look, I don't know anything about people in Iowa. I can just tell you that this guy I served with from Iowa was a good guy and I trust him, and I don't think that all people from Iowa are bad or something like that."
We have lacked the mechanisms for that kind of mixing. So I could see, in the context of online connections, that it might be reasonable to start to encourage things that very deliberately engineer connections between people from different networks. And you know what networks they're coming from, because you can actually measure that empirically using data. Not everybody is going to be interested in doing that, and that's an unfortunate aspect of humanity: not everybody wants to be exposed to people who aren't like them. In fact, there's this principle of homophily, where people want to be exposed to people exactly like themselves. But I don't think it takes a ridiculous amount of investment in building those kinds of social ties to have things tip in a better direction. One of the things right now is that we've just tipped in the opposite direction, and we may just be sitting on that fulcrum. So pushing back towards more civility, more comity, more cross-cutting social ties may well push us in the direction we need to go, and I just don't think it's that hard to start building things that encourage that kind of behaviour.
Tim: Do you think that that's something you could engineer into an algorithm? I'm thinking about TikTok, for example. They have very specific tricks that they use to build loyalty to TikTok and get you enthusiastic about it. Do you think that the sorts of things that you're talking about are things that you could do in an algorithmic way?
Dave: Yeah, I think it can be done in an algorithmic way. I think the challenge is that those things don't necessarily maximise profit. [laughs] You know, the things that maximise profit tend to be showing people exactly the stuff they want to see, the stuff that makes them self-validate and feel a sense of belonging. The things that expand their horizons probably cost the platforms money. So if you were to design a regulation, you might need to say something like, "Okay, you need to spend 3% of your profits on building cross-cutting social ties. And if we don't like the results from 3%, we're going to make it 5%. [laughs] And we're going to keep increasing the percentage until conditions improve." [laughs] That might work.
Tim: And do you think there are legislators out there who are kind of open to doing that?
Dave: I think there are starting to be conversations around that. I know Sheldon Whitehouse seems to get a lot of this. But I think this is the conversation we have to start, and we need to educate people on how this works. People like me and my peers have been studying this long enough to have a really innate sense of what's actually going on, but most people arriving at it naively don't, and they're going to be like five years behind in the study of this. So one goal we could have right now is to find ways to educate lawmakers quickly around some of these issues, so that instead of pursuing the most naive ideas first, they can actually hit the ground running and do some stuff that might actually make sense.
Tim: Let's hope this podcast is a very small contribution towards that. We'll put some links in the show notes so that people can follow up on some of the things you've been talking about and some of the shows you suggested people might want to watch. And yeah, I'd encourage all our listeners to go out and be polite, and talk to somebody you didn't necessarily always talk to. It sounds like it's going to help us solve the problem. So Dave, thank you so much for doing this. I really appreciate it. It's been great. If you've got anything else you want to promote or mention, now's a good moment. Otherwise, I'm good.
Dave: Yeah. I'm doing a weekly newsletter, kind of cutting through some of the noise and getting at some of the most important stories and trends that are going on right now. It's sort of beyond the political and into the parapolitical if you will. So if folks want to check that out, you can subscribe to my newsletter at davetroy.medium.com.
Tim: Brilliant. I'll put that in the show notes as well. Yeah, really good to talk to you again. It's been too long. We should...
Dave: Absolutely, I look forward to seeing you in person soon.
Tim: Indeed. Indeed. We should do that. Anyway, great. Thanks so much for this, and I'll press the record button and we'll be done. Brilliant. Thanks so much. Bye.
Dave: All right. Sounds good, Tim. Thank you. Bye now.