Conflict resolution
Tim: [00:00:00] I'm Tim Panton
Vim: [00:00:01] And I'm Vimla Appadoo
Tim: [00:00:02] and this is
the Distributed Future podcast, where we interview people who are doing interesting things and hopefully ask them a little bit about what society will look like in the future, based on their current expertise and practice.
I have no idea what we've got this week, this episode. So, Vim, tell me what's on the schedule.
Vim: [00:00:27] Yeah. So today we're talking democracy, in all its glory, and understanding how to move from hostile conversations to positive conversations, particularly online, across social media. What I find really interesting in this conversation with Jacob is how to facilitate those conversations, and what the future of polarized opinions might look like. It's something very, very evident in today's society.
Tim: [00:00:58] And is there any good news in that? Because it seems like disinformation and polarized opinion are really dominant in a lot of social channels.
Vim: [00:01:11] Yeah, I think so. The premise of the conversation is: yes, it's difficult, it's tough, and it's a big ask of people to be the voice of reason in those polarizing conversations. But the positive aspect is that the more we do it, the less polarizing things become. So there's this hurdle we have to get over as a society: putting ourselves into the uncomfortable position of stepping away from our opinions and opening ourselves up to criticism or external thoughts.
Tim: [00:01:45] Right. But knowing when that is appropriate, and when people are just wrong. I heard this thing the other day, not on social media but on the radio. It was a phone-in, and somebody was talking about medical advice, and the presenter said, I'm going to stop you, because this is wrong. And I just thought that was really interesting. It wasn't rude, but it wasn't polite either. It was just: this is wrong, and that isn't stuff we're going to say on this channel. I mean, they sometimes do that for legal reasons, but it was interesting to actually see it happen.
Vim: [00:02:33] Yeah. I think you get less and less of that online, and that's the issue. We're more likely to be defensive in arguments online than we are in person. So, face to face (I'm going to use an extreme example), if someone is racist to my face, I will be very tactful in how I approach that. Whereas if that happens online, I might be quicker in my response, or less thoughtful about how I write the message. And that's what we're trying to overcome at the moment: how do you still bring that sense of humanism into your online world?
Tim: [00:03:16] Yeah. Particularly when the algorithms tend to, what's the word, encourage dissent and offence. It's a matter of embarrassment to me that my most widely retweeted tweet was a piece of snark. It wasn't wrong or libellous or anything, but it was a bit snarky about a public personality. And that is the tweet that got retweeted most. Of all the things I've said, it's probably one of the ones I'm least proud of, and it went the furthest. That's kind of weird, but the algorithms just drive that.
Vim: [00:03:56] But the thing is, you've got the ability to step away and understand that. Whereas what we're finding at the moment is that it's our unconscious, the way we're wired. For your average user, getting that feedback from something snarky tells your brain to do the same thing again, to get the same feedback and that same endorphin rush. And then we end up in a cycle of that just becoming who you are online.
Tim: [00:04:23] Right. Well, I've had that. I think I've said this to you before, but I've had somebody come up to me after I've given a presentation and say, that was surprisingly reasonable of you. And I'm like, what? Where did you get the idea that I'm unreasonable? And the answer is: online. That was a bit of a wake-up call. It's like, whoa, where did that happen?
Vim: [00:04:50] Yeah. How did that become me?
Tim: [00:04:51] Yeah. Right, right. Yeah.
Vim: [00:04:53] Yeah. That's really interesting.
Tim: [00:04:57] So does Jacob have a prescription for us, or is it more of a walk round of things we could do? How does that work?
Vim: [00:05:08] Well, so the organization that Jacob works for (and he does a much better job of explaining this) aims to build these online facilitators, to have those conversations, and to give them the tools they need to be able to do it. What's really interesting is that they measure the impact of that kind of positive messaging, so they can prove that by changing the discourse you create less polarizing opinions and views.
Tim: [00:05:34] Oh, that's cool. So it's actual science as well as social science. That's neat. This is going to date the podcast, but I'll do it anyway: I saw this morning that Facebook had sent all of their content moderators home. I'm not sure of the exact facts here, but basically the rumour is that they sent the moderators home because they normally work in big open-plan offices, and as a result Facebook had to switch to fully algorithmic moderation while that transition was happening. And it basically banned a whole ton of perfectly factual medical posts.
Vim: [00:06:20] Wow.
Tim: [00:06:21] Yes. There were a lot of people early this morning complaining that their links to, whatever it was, the Surgeon General's site, had been taken down by Facebook. That's crazy. Presumably because it seemed like everyone was talking about it, the algorithm just assumed it must be disinformation. Which tells you an awful lot of really depressing things, actually.
Vim: [00:06:55] There's a lot to unpack. In this podcast we talk a lot about Cambridge Analytica and Facebook, and the responsibility of organizations, and all of that kind of stuff.
Tim: [00:07:09] It's funny, though. I had assumed, and it turns out I was being optimistic about this, that if you knew people, you wouldn't behave like that. But it's not simply online versus offline that does it. I'm in a local group here, and mostly people are reasonably civil, but they're much ruder than they would be in person, if they met in the shop.
Vim: [00:07:34] Yeah.
Tim: [00:07:36] That's weird.
Vim: [00:07:40] You're hidden behind a screen, so you don't need to... you know, nothing's going to happen to you.
Tim: [00:07:45] Yeah, yeah.
Vim: [00:07:48] Politeness and civility come from those face-to-face interactions, and in the time and space we find ourselves in now (we mention it again in the podcast), a time of self-isolation and having to work remotely, it's going to be interesting to see how civility is affected by this. Are we going to lose that part of our social norms?
Tim: [00:08:14] Wow. Or does real-time video take the edge off that risk? I kind of hope that seeing how shocked somebody looks when you say something will help. Yeah.
Vim: [00:08:28] Yeah. But I've already started to see that it's easier to talk over people in a video call than it is in a meeting.
Tim: [00:08:43] I did some work years ago with a guy who had a really nice project, which was essentially enforced politeness. Basically, if you talk over somebody, it records both parties and plays them back. So if you talk over somebody, you get to finish whatever it is you want to say, but what they're saying gets recorded, and then, whilst they're listening to what you said, you listen to what they said. So the conversation isn't lost. You don't quite have to wait your turn, but your turn will come, and you know the other person will eventually hear it. It's kind of interesting. It took a while to get used to, and actually, in the end, I decided I didn't like it.
Vim: [00:09:36] But does it work in real time? That sounds like it could be quite effective.
Tim: [00:09:42] It works in real time. It takes a little getting used to, but you learn that you can actually finish the thought even if somebody tries to interrupt you, because you hear them starting to interrupt. What you learn is that you can just finish the thought and then shut up, and then you'll hear what it was they wanted to say, and they will hear what you finished saying. By the time that's over, you've both heard each other's point of view. Like I say, it's a socially odd thing, but it's interesting.
Vim: [00:10:17] Yeah. Yeah. Really interesting.
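To make the mechanism concrete, here is a minimal sketch of the turn-buffering idea Tim describes, assuming a toy model where speech arrives as text chunks; the class and names are illustrative, not the actual project's code:

```python
from collections import deque

class PolitenessMixer:
    """Toy model of the 'enforced politeness' scheme: when two parties
    talk over each other, both streams are recorded, and each party
    hears the other's buffered speech once they themselves fall silent.
    Nothing is lost; overlapping turns are serialized."""

    def __init__(self):
        # Speech each party has produced that the other has not yet heard.
        self.buffers = {"A": deque(), "B": deque()}

    def speak(self, who, chunk):
        # Record the chunk so the other party can hear it later.
        self.buffers[who].append(chunk)

    def listen(self, who, speaking_now):
        # A party only hears the backlog once they have stopped speaking.
        other = "B" if who == "A" else "A"
        if speaking_now:
            return []  # still talking: the other side's speech stays buffered
        heard = list(self.buffers[other])
        self.buffers[other].clear()
        return heard

mixer = PolitenessMixer()
# Both talk at once: each side's words are captured, not dropped.
mixer.speak("A", "I think the real issue is...")
mixer.speak("B", "But hang on, what about...")
# A falls silent first, so A now hears B's buffered interruption.
print(mixer.listen("A", speaking_now=False))  # ['But hang on, what about...']
# B then falls silent and hears what A said during the overlap.
print(mixer.listen("B", speaking_now=False))  # ['I think the real issue is...']
```

The point is that overlap never destroys information: each side's speech is held until the other side stops talking, so both turns are eventually heard in full.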
Tim: [00:10:21] Anyway, that was a slight digression. So, right, let's listen to Jacob, and see whether we can be super polite, or automatically polite.
Vim: [00:10:32] Yeah, that's the aim.
Tim: [00:10:35] Alright. Right.
Vim: [00:10:36] Bye.
Tim: [00:10:37] Bye.
So, unfortunately, due to a technical mess-up on my part, the audio in the following part of the podcast, the interview with Jacob, isn't as good as one would wish it to be. But the interview itself is well worth listening to. I apologize for the quality, but please do stick with it, because the conversation is well worth your time.
Jacob Lefton Interview
Vimla: [00:11:03] Hi, this is Vimla Appadoo, and you're listening to the Distributed Futures podcast. I've got Jacob Lefton here with me from Build Up, a peace tech movement working to understand how to shift peace in the world. Jacob, would you like to introduce yourself?
Jacob: [00:11:18] Yeah, I'm happy to. So Build Up is an organization that was founded in, I would say, 2014, early 2015, around the emergence of the idea of peace tech practice, which is the idea of strategically using technology in peacebuilding processes. Through the last half decade our thinking on this has grown and expanded, and our projects have grown and expanded, to the point where what we're talking about now is that we transform conflict in the digital age, whether it's through technology tools, or arts tools, or other participatory methods.
But basically, what we work to do is combine peacebuilding best practices, participatory methodologies, and digital technology, or arts, or other innovation processes, to identify and address emerging challenges to peace. And we find these emerging challenges, I think, when you read the news: you see them pretty much everywhere in the world these days. You've got disinformation, you've got various injustices, you've got questions about what happens with AI, and then you have traditional peace problems and challenges, where you have armed groups and political instability and such. So what we're able to do is use innovation practices to engage in new ways with this broad landscape of peacebuilding, with the aim of engaging more people in a participatory manner, so that at the local level they can own the process they're engaging in.
Vimla: [00:12:58] That sounds incredible. And I've got to say, in my naivety in this space, I hadn't heard of peace tech as a movement before doing some research. Do you find that that's common out there in the big bad world? Do people understand what peace tech is and how it can be applied?
Jacob: [00:13:20] I would say there's a broad understanding that maybe technology can help and make things better, while there's also an understanding that technology has been used, and is used, to make things worse. So there's this idea of technology as a tool, and there are many people in, say, Silicon Valley and other innovation and entrepreneurial spaces who want to be able to use technology to create positive impact in the world. One of the things we do in this space is take a very specific peacebuilding lens, which is a lens you apply when you start to work in a conflict context. In the kind of classical, I would say governmental or UN, sense, this is: after an armed conflict, what do you do to make sure that the society that's recovering doesn't fall back into armed conflict, and that you can deal with the justice issues that have come up, and the reconciliation issues that have come up, and the sort of personal trauma and healing that needs to happen as well? When you apply that lens to technology for good, you start to see lots of challenges with the basic approach an organization would take without that conflict-sensitivity understanding. So we bring this conflict sensitivity, combine it with the innovation approaches, and move forward with that.
Vimla: [00:15:02] That's really, really interesting. So conflict happens at numerous levels in society. There's war, if we think of that as the most extreme end of the spectrum, but then you can scale back to your everyday conflict: what you were talking about on social media, or proliferation in the news, or even just microaggressions walking down the street. How do you think all of that is taken into consideration in the work you're doing at the moment?
Jacob: [00:15:37] Yeah, I would say so. Probably, when we get down to interpersonal conflict, the internal conflict I have, or the conflict I have one-on-one with the people around me, this is maybe a little bit less our focus than the larger community- and society-level conflicts. So we have areas of work in mediation, for example, at the top level with government actors, and also bringing people from the communities into mediation processes that tend to be exclusive. And we have work that goes into processes that support one-on-one conflict resolution, but uses the technology and the innovation process to make sure that that support can be deep and replicated, and also functions well. But that's without going into the details of projects.
Vimla: [00:16:46] Yeah. I think that's one of the key points that's often missing in our kind of impact on the world, I guess: really lifting people into the conversation, so that it's done with them, not to them. That was something I saw time and time again when I was working in the public sector: this attitude of, well, this is just what the law says, therefore it's what you have to abide by, rather than, what is your circumstance, what's happened, what's your voice and your story that needs to be taken into consideration as part of this. So I love that you're using the communities that exist, so that people are bought into the peace tech itself. How would you go about finding those facilitators and getting them involved?
Jacob: [00:17:38] So, for a project like the Commons project, maybe I'll talk a little bit about it, since we haven't yet. The Commons project is an intervention on social media which looks to address polarized conversations, and political polarization, on social media. We started thinking about it shortly after the 2016 election in the United States, and in 2017 it became an approach: how can you use automation and interpersonal communication, so automation, bots, and trained facilitators, to engage with other people on social media? Up until that point, we'd looked at the body of research, and there was a lot of work mapping polarization and defining it, but there were very few data-driven practitioners really engaging with it. And those who did engage with it either did so within a very limited context, or, when you put out a public call for people who want to have better political conversations, you get a very self-selecting group. So we wanted to break through that wall and reach people who are maybe at the edge of the polarization, who probably don't quite realize they're there.
Or maybe they realize it and don't want to get out of there. So for us, it became an internal project at Build Up, done with local partners. For example, we run fellowship programs: one is active in Syria, one just wrapped up in Myanmar, maybe we're starting another one, and we have programs around Yemen and a couple of other places. What that means is we accompany local partners using peacebuilding tools and public innovation tools in their process. And in this case, because our organization is a little bit more than half American, we were kind of the conflict-affected people ourselves. So we ended up doing that.
As we did that, as we moved forward, we wanted a group that cut across the political spectrum with us, because it's very important to us that this project is multi-partial rather than non-partial. It shouldn't be just a bunch of leftists coming in and saying, hey, do you want to have a better political conversation. It's really about what the society is needing. So we essentially activated our networks and said, hey, we're interested in this: who wants to learn to be a facilitator, who wants to have actually quite difficult conversations online? And we ended up with a number of people who joined us. Unfortunately, I think, because people let go of it for a variety of reasons, the group ended up being more liberal, sort of centrist liberal, rather than really including the conservative voices, in part because of the contextual pushback, other work, et cetera. So there were a couple of challenges in the development process. But the aim and the intention is set: it's not about winning a political argument, it's really about how you have one.
Vimla: [00:21:32] Yeah, that's so interesting. You know, the world is in political turmoil at the moment, and there are a lot of mirror scenarios happening across the US and the UK. Quite often people compare the Trump election to Brexit, and try to understand the socioeconomic difficulties or divides that sparked both, although I don't know loads about either, just what my echo chamber tells me. But it's interesting to see the difference in conversations across different social media platforms. The conversation on Facebook is very different to the conversation on Twitter, which might be very different to the conversation on Instagram. How do you pick and choose which social media platforms you have these conversations on?
Jacob: [00:22:25] Yeah, that's a good question, and unfortunately I think the answer is probably a little bit anticlimactic. Really, it's resources. Twitter is by far one of the easiest platforms to access information from, and we're very familiar with Facebook, and we are less familiar with Instagram. Part of it comes from my own personal ability: I've built a lot of the code, I know how to navigate the Twitter API and the Facebook API, and I just didn't have the time to learn Instagram's. It is on our radar, though; we'd definitely like to engage with it.
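As a rough illustration of the kind of Twitter API access Jacob mentions, here is a minimal sketch assuming the Tweepy library and a Twitter API v2 bearer token; the query, topic, and field choices are illustrative, not a description of Build Up's actual code:

```python
import tweepy  # assumes the Tweepy library; you need your own API bearer token

# Find recent public tweets on a polarized topic, so a facilitator could
# locate live conversations to join. Query syntax is Twitter API v2 search.
client = tweepy.Client(bearer_token="YOUR_BEARER_TOKEN")  # placeholder token

response = client.search_recent_tweets(
    query='("gun control" OR "second amendment") -is:retweet lang:en',
    tweet_fields=["author_id", "created_at", "conversation_id"],
    max_results=10,
)

for tweet in response.data or []:
    # conversation_id identifies the whole thread, so a human facilitator
    # can read the full exchange before deciding whether to engage.
    print(tweet.conversation_id, tweet.text[:80])
```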
Vimla: [00:23:12] Yeah. And I guess having a conversation is one thing, but then how would you measure the impact of that? Or even the impact of an argument, I guess. Is that something that's happening?
Jacob: [00:23:27] This is a really good question, and it gets at the heart of how you measure change. There's a huge conversation about how you measure peace in the peacebuilding field. There are the UN Sustainable Development Goals, which have metrics that measure peace, but I would say, from my point of view, those essentially measure negative peace: the absence of violence, but not positive peace, meaning how well people are treating each other, whether they're getting justice, et cetera. So how we measure this kind of conversation is especially challenging, because with many of the people we engage with, we have one conversation. So we have a theory of change, which basically states: if we can elicit a change in the behavior of individual social media users, encouraging more connection across the political spectrum, better discourse, and the identification of shared values, building respect, then we contribute to a healthy political system and society. And so what we're looking for, in the brief pieces of contact we have with people, is whether they express satisfaction with the conversation. I think that's one of the key things we landed on, having done this twice, in a little pilot and then a scaled-up version, for a total of about nine months of active conversation hosting. We've basically come to the point where we consider the project, I call it an intervention, but it's kind of a coaching intervention, where the idea is that we're not going to change people's behavior immediately, in a big-explosion kind of way. We don't necessarily want huge turmoil; we want to build a slow and steady process where people recognize the types of conversations they've been in that make them feel good, and start to understand the tools they can use to continue to have conversations like this. So it's essentially exposing people to things that make them feel good, and giving them access to the tools they can use to get that experience again.
Vimla: [00:26:15] That's an amazing way of looking at it, because, speaking from personal experience, responding to something I disagree with on Twitter, and the back and forth of what is then a really negative experience, leaves me feeling crap. And part of me thinks, I need to have these conversations if I want anything to change, but I'm never left with a positive feeling. Whereas if I could have that conversation but leave with a deeper understanding of the other perspective, or a change in my perspective, or just a conversation rather than an argument, I'd probably feel more positive about it and continue to do it. Which I guess is the nub of what you're trying to do: shifting that kind of triggered or aggressive approach to polarizing opinions, so that it becomes a conversation, not an argument.
Jacob: [00:27:10] Yeah, exactly. And I would say that, you know, it's a challenging space to work in. There is an activist approach to having conversations which basically says, if you're not with me, then you're against me. And there's a tendency to close up in the conversation process. When one person closes up, or pushes hard against another person, the other closes up, and then we tend to get groups tightening ranks, and you tend to get an increase in in-group cohesion rather than an increase in inclusion across groups. And this process is where you get the polarization: people use different language from each other, or they use the same language but mean different things, or work from different sets of facts; what those are is another conversation. But what we want from a conversation on the Commons is not, are they right and am I wrong, or am I right and are they wrong. It's more about: how can I see them as a member of a community that I live in, and how can I make space for them to step over and feel like they can express themselves, in a way that helps them feel like they're being heard as a human being?
Vimla: [00:28:56] Yeah, that's really interesting, because I've often been on the side of calling it out. When you see something you disagree with, that incites hate or could be dangerous: call it out, report it, do something about it, don't engage. Which isn't always the right thing to do, but, as you say, there's a deeper understanding to be had of that perspective. But where is that line? When should you stop engaging and start reporting, or doing the other side of it?
Jacob: [00:29:34] Yeah. This is, I think, a line that every individual needs to draw based on where they feel comfortable. It's very much about one's personal identity, and the work one has done to understand where my boundary is, where my guard is, what is my character, my side. So it's completely reasonable that someone decides, this issue touches way too close to home, I'm very much not interested in engaging with this. I'm thinking about our process and the conversations that happened around, for example, abortion, which is a very divisive issue in the US, like abortion and the border, and then of course Trump as a character, as a president, this idea of Make America Great Again. I recall at one point in the intervention process we had some conversations on Facebook around abortion issues, and some of the women who were facilitators were less comfortable with the conversation than some of the men. I'm a little bit more distant on this; I can come in and have a conversation. Whereas for some of the women it was a more personal point, and they just said, you know, there are things like that I'm not going to engage with, and that's totally fine. I think it's about knowing oneself. And in the process we did a lot of back-channel organization: here's a hard conversation, let's get through it together. People would talk to me about, okay, how do we engage with this person who seems to be very, very expressive about their particular opinion? Where are you at, maybe this is a conversation you don't want to deal with, or maybe you think there's something here. So let's try to craft some language as a group that helps you, as a facilitator, keep personal identity out of it.
Vimla: [00:31:55] Has there ever been a part of this where you might anonymize the facilitators, or set up fake profiles to have these conversations through? Or is that against the ambition of the project?
Jacob: [00:32:07] Yeah, that's a good question, and it's something we've worked through different iterations of. When we initially started, we wanted something tweeting at people from an account, and then we wanted facilitators to come into the conversation. We basically asked people about this, and experimented as well. And what we found is, one, on an ethical level it's really important to us to be very clear that we're people, we are real people, and this does create a level of potential exposure for facilitators, so there is a risk in that. But one of the things we've found, for example, is that on Twitter, if you have a new account and a lower number of followers, you don't get engaged with as much, because you don't look like an important person. And there's another piece of research on using Twitter bots in an anti-racist capacity, which basically showed that if the Twitter bot has more followers, people are more likely to respond positively to the intent of the message.
Vimla: [00:33:12] Interesting.
Jacob: [00:33:13] So there's an influencer sort of problem in there as well. And when people create new accounts, you have this situation where they end up looking like bots. We've certainly been called, you know, trolls, Russian trolls, and we've been accused of working for George Soros, and, what was the company that got a lot of notoriety over social media in 2016? Cambridge Analytica, yes. So when these types of things come up, the people administering the project say, no, we're not doing that; we're very much this little organization, and you can read about us here, to try to keep it in a legitimate, human space as much as possible.
Vimla: [00:34:04] Yeah, it needs to be authentic, otherwise you lose that; it adds to the dilution. And is remaining apolitical important? Obviously the Commons project started with the 2016 election, and I imagine that a big part of this is not telling people their opinions are wrong, but understanding opinions. So how do you remain non-biased, or bipartisan, I guess?
Jacob: [00:34:36] Yeah, this is a very good question. It's hard, right? We come into these conversations with ideas of what we think is right and what's wrong, and on one hand it's really important not to cut those off from our thoughts. That's why we talk about this as a multi-partial process: we're specifically not saying it's non-partial. If I'm going to engage in a conversation on political topics, it's kind of impossible for me to divorce who I am from what I'm bringing into it. The person I'm talking to is going to see me, if only through the lens of the questions I ask. So the idea we're really moving forward with is that there is space for many different political views, but the ways we have conversations about these political views need to be healthier. So we model a helpful approach. We provide space for a person to state their opinion; we ask them questions, or validate them, without judging; and then we try to lead them into a conversation that's less about the specific details of the topic, and more about awareness of the health of the political conversation they're in, and, if they feel it's unhealthy, whether they'd like access to tools for engaging with that poor health of the political conversation.
Vimla: [00:36:22] Yeah, that makes sense. And I guess it's something I've been trying to do myself: be less triggered, less polarized. Like you said at the very beginning, it's not, you disagree with me, therefore you're wrong; it's, we have different views, but let's understand one another. I think Twitter is probably one of the harder platforms to do that on, because with Facebook, for example, you'd have some sort of connection with the person you're speaking to, through friends of friends or whatever it might be. Twitter is an open field: you often have no connection with the people you're speaking to. Unless you're like me, in which case my echo chamber fully supports everything I do, and, you know, that's dangerous as well. So, just today there's been a protest in London against the deportation of 50 Jamaican men, for crimes they committed years ago that wouldn't even stand in court anymore. And I tweeted it out: how is this still happening, what's everyone doing, you know, the standard injustice tweets. And I wanted someone to explain to me why it's happened, or how it's happened, or to give a different perspective. I really, really wanted to engage in a conversation, because I just couldn't understand it, but no one in my sphere had the other perspective. So the conversation stopped, and it ended up being like, oh, okay, this is just how I've curated that safe space for me. And it's dangerous, because it means I don't learn the other perspectives, and I end up being more hurt when the outcomes don't match my values or my belief systems.
Jacob: [00:38:08] Yeah, we're seeing that a lot. You know, an election shows up, and then all of a sudden things are different than you expected.
Vimla: [00:38:17] Yeah. So, speaking about elections: what impact do you think you'll have, or do you think you'll have an impact at all, on the upcoming election?
Jacob: [00:38:30] This is a hard question to answer right now. We're looking for funding to work through it, and it's a bit difficult because it seems like groups working for social good are reticent to engage in this type of project, even though they're very excited about it. There's hesitancy. Meanwhile, marketing organizations, political organizations, governmental organizations, governments, et cetera, are heavily engaged in this, in an antagonistic way. So what I would say is, we're looking for people who want to support what is a similar strategy with a different aim.
Vimla: [00:39:15] I think you've really hit the nail on the head. You know, the extreme example is Cambridge Analytica, which used hard data and tools to change people's opinions through what they saw, their beliefs and their values, and that was fine. Whereas switching it on its head, having the conversation around it, not even trying to change people's opinions but trying to open a debate, isn't seen in the same light, because it's not as black and white. And that's really difficult to accept, because one had very concrete outcomes and measurable impact, which is what their business model was built on, versus this kind of, let's just see what happens if we have a different conversation, and it might be really good, and let's learn from it.
Jacob: [00:40:07] Yeah. I mean, I would say that I'm convinced that if we did this at scale, it would have a positive impact. It's hard to tell on Twitter, because it's such a large ecosystem, but from what we were looking at, we have a couple of quantitative hints that the conversations we had had a positive effect of bringing people across certain groups a little closer together. And on Facebook and on Twitter we have a huge host of qualitative data, basically from the conversations we've had with people: reaching the end of the conversation, and the person saying, you know, thank you, this is interesting. Furthermore, we have the data from our facilitators, which basically shows that the greatest change came from the people who were facilitating. So the greatest depolarization. Not saying that they changed their political views at all, but they felt depolarized: they felt they could have a conversation with someone who disagreed with them without completely closing up and shutting down, because they had the tools to know, how do I ask questions so that I understand this person's position? How do I show that I care about them as a person, even if we disagree? And then how do we begin to examine the balance of that? The facilitators practiced this a thousand times, so they gained a much deeper understanding of where they sit. And I think one of our goals was essentially to get more people facilitating these types of conversations: how can you work on your own conversations?
Vimla: [00:41:58] Yeah. And it's really interesting that you say that, because one of the things I have to constantly remind myself of is that someone having different values and beliefs to me doesn't change my values and beliefs. I think we live in a world where, like you said at the beginning, when someone has those different, polarized views to you, we suddenly see it as a really personal attack on who we are, when actually it doesn't change who you are, it doesn't change what you believe in. It's just...
Jacob: [00:42:27] Coexistence. Yeah, and it can be an uneasy coexistence, right? So I want to be sensitive to that. There are places and spaces that, legitimately, are not designed for depolarizing conversations; these spaces are designed for protecting and uplifting an in-group. But we think those two kinds of spaces can coexist: you can have in-groups that support each other, but you can also learn to make sure that, as groups support each other, there's a corpus of shared values that helps everyone get along in society. You know, regardless of how much we disagree with our neighbors, they still share the street.
Vimla: [00:43:23] Yeah, and that's an amazing way of thinking about it. Just a final question to finish on: what do you think the future looks like in this space?
Jacob: [00:43:37] Yeah. I mean, I'm hopeful, on one hand, because people are starting to discover methods for engaging in these harder conversations. I am an optimist, so I come at this from an optimistic point of view, but we're learning new tools. Peacebuilding is a relatively new field, peace tech all the more so as a corner of it, and social media is this grand new, very disruptive space where we're essentially having our political conversations. And these are spaces that are owned by private companies, and private companies essentially have the ability to self-regulate them, to a large degree, which means they ultimately make the decisions on how the platforms work. And so they ultimately make the decisions on how these conversations are shaped, and who they respond to when the conversations need to be different. That makes it a little bit difficult to work in this space: you never quite know whether things are going to change on us. And the organizations that own these spaces don't necessarily want to share any insights with us, so it's hard sometimes to verify whether their fixes actually fix things, without being actively engaged on the platforms. So, on the one hand, we're learning a lot; we have some very good methodologies that come down to experience and practice. On the other hand, it's a tough ecosystem, a very tough place to be working, between the large companies and the communities and people, and the changing nature of the conversations.
For example, I read an article the other day that says Twitter may be implementing a tool that allows you to change who can comment on your tweets: public, or just friends, or specific groups of people, or just the people tagged in it. If that happens, it might dramatically change what we're doing, and we'd have to re-examine our outreach strategy. But the core facilitation tools, the core conversations we're having, are still a strong methodology, things we've learned that we can take into other processes as well. So there are, you know, pluses and minuses. The future is... I mean, yeah, even from an optimistic point of view it looks a bit grim sometimes. But we are learning, and we're continuing to go forward.
Vimla: [00:46:28] Yeah. And I think one of the key things I'm going to take away from this conversation is that we all have a voice to use, and it's up to us how we want to use it. That's a really positive message for me. And, you know, we can do it in a way that makes us feel safe and leaves us feeling positive. So why wouldn't we do that?
Jacob: [00:46:53] I think that's a good takeaway.
Vimla: [00:46:55] Thank you. Thank you so much for your time today, and apologies for all of the tech faff at the beginning.
Vimla: [00:47:04] Yeah. So if you want to send over any links or anything you want to have included in the podcast when this goes live, please send them over. It will be going live on Spotify, Apple Podcasts, and on our website as well, probably in about a month's time; we've got a bit of a backlog that we're getting through. So it will be a little while, but I'll ping you when it goes live.
Jacob: [00:47:29] Sure, sounds great. I will send some links. And if you have any follow-up questions, or anything where you feel like, oh, this was interesting and I want more information, please feel free to reach out.
Vimla: [00:47:44] Thank you so much.
Jacob: [00:47:45] Yeah.
Vimla: [00:47:46] Thanks. I hope you have a great day.
Jacob: [00:47:48] You too. Thank you. Thanks. Bye.