Tim: [00:00:02.29] Hi. I'm Tim Panton.
Vim: [00:00:04.16] And I'm Vimla Appadoo.
Tim: [00:00:06.12] And this is the Distributed Future Podcast where we talk to people and try and learn about what the future might look like, usually looking at what the present is doing now in sort of niche areas, and hopefully that tells us what the future is going to do. And we're particularly interested in the intersection of security, technology, and society and how those things mesh together. This episode is sparked by some kind of identity stuff we saw actually on Twitter, about virtual humans. And I was thinking about how difficult it would be to tell the difference between a virtual human and a real human when you see some of the things that they've been doing. Vim, did you look at the pictures and the animations it generated?
Vim: [00:00:56.17] Well, I'm assuming that on that website it's the images in the background. But that was my assumption.
Tim: [00:01:05.11] Yeah, there were a couple of videos in there. And they had very real-seeming pseudo people who didn't exist, which is quite kind of weird actually.
Vim: [00:01:16.17] Yeah.
Tim: [00:01:16.17] When you look really carefully, you thought perhaps you might be able to tell cuz they all had absolutely identical perfect teeth.
Vim: [00:01:24.13] [laughs] Interesting. Which suits an American market though, doesn't it? Really.
Tim: [00:01:30.02] Right. Right. Right. Right. So--
Vim: [00:01:32.03] I--
Tim: [00:01:32.08] --maybe they were just Americans.
Vim: [00:01:34.10] Yeah, I think it's about the intent of those images. Like if it's intended to be seen as you scroll through a feed, you're not really gonna pay that much attention to it anyway, or judge whether they're real or not. There was a lot of this during Obama's presidency of people doing kind of hyper-realistic videos of him doing fake speeches. And it does terrify me to think we're getting to a point of unknown reality, and it's beyond fake news. It's like, "Is the person that I see in this video real? Do they actually exist?"
Tim: [00:02:18.00] Right. But there's two things there. One of which is that, is this a real person or is it a pure fabrication? But the other one is, if this is a real person, did they really say that? Are those really their lips moving? Okay?
Vim: [00:02:34.00] Yeah. And then--Sorry.
Tim: [00:02:37.22] No, go on.
Vim: [00:02:38.19] Then it's even further of, "Well, then if it's not them, whose opinion or script am I hearing? Who has scripted this? Who's put it together, and what was their intent with this video?"
Tim: [00:02:54.12] Yeah. And that's assuming that they might not have your best interests at heart. Then there's this fun thing where--if you look at what targeted advertising does, you could go even further and have the people in the adverts being synthesized so that you find them attractive.
Vim: [00:03:16.29] Mmm. Yeah.
Tim: [00:03:18.16] You'd get a totally different person representing a political cause or washing powder or whatever, and they'd be maybe targeted based on dating profiles.
Vim: [00:03:31.13] Yeah. I watched The Social Dilemma last weekend. There was nothing in there that surprised me particularly, especially after seeing HyperNormalisation and knowing what I do about targeted ads and profiling. But it did home in on the predictability aspects of it, which--I don't know if you've seen Devs, the TV show that was on the BBC?
Tim: [00:04:02.06] Oh, yeah. Yeah, yeah, yeah. Yeah, yeah.
Vim: [00:04:04.14] In it they're building technology that is accurate to 99.9% in predicting us, not just our thoughts but our movements as well. And I think that isn't that far off our reality. We will get to a point of--and I brought this up in a previous podcast--questioning our free will and free choice. Because if technology is able to predict every move or every purchase or every thought, where do we actually have any control over what we do anymore?
Tim: [00:04:43.06] Yeah. I'm slightly more optimistic about this than you because a lot of the targeted advertising is just terrible.
Vim: [00:04:52.01] Mhm.
Tim: [00:04:52.10] And if you look at the way that targeted advertising has been sold, it's been vastly oversold compared with its actual capabilities. Or that's my perception. Hey, maybe I've just been fooled.
Vim: [00:05:07.21] Yeah. Yeah, yeah. Or maybe it's so wrong for you it's actually right. [laughs] Like it's kind of gone full circle, and it's definitely not what you wanna see, but it actually is what you wanna see.
Tim: [00:05:20.06] Yeah, we were talking about that predictability in a really funny conversation with somebody the other day, about how--with these digital humans, how can you differentiate yourself from a digital human, if you want to be what's the future version of authentic? And they were saying, "Well, you have to do something that a machine would never do, like something illogical or irrational, or something that the advertisers sponsoring it wouldn't welcome."
Vim: [00:05:56.04] Yeah.
Tim: [00:05:56.09] And therefore you can prove that you're real.
Vim: [00:05:58.28] So I've got two thoughts on this. If humanity is based on emotion and on illogical responses to emotion, then in my mind that was always gonna separate synthetic and real. However, I do think that that degree of illogical reasoning can also be hard coded. So really simply, nine out of 10 times--no, nearly 100% of the time--it's a human's instinct to pull away from something hot. But there's also a 0.01% chance of keeping it there because you're overriding your instincts to keep touching it. And that's quite a simple thing to put into a synthetic human: nine out of 10 times you pull away, one out of 10 times you keep it on there. And that's the kind of thing that you then extrapolate, and it kind of builds that unpredictability into an AI form.
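The mechanism Vim is describing here really is that simple to sketch. As a purely illustrative toy (the function name, probabilities, and strings are all made up for this example, not taken from any real system), an "instinct with a rare override" is just a weighted random draw:

```python
import random

def instinct_response(rng, override_prob=0.1):
    """Simulated reaction to touching something hot: the instinct to
    pull away fires most of the time, but with a small probability the
    agent 'overrides' it, crudely mimicking an illogical human choice.
    The 0.1 override probability is illustrative only."""
    return "keep touching" if rng.random() < override_prob else "pull away"

rng = random.Random(42)
responses = [instinct_response(rng) for _ in range(1000)]
print(responses.count("keep touching"))  # roughly 100 out of 1000
```

Whether a weighted coin flip counts as "unpredictability" in any meaningful sense is exactly the question Tim raises next.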
Tim: [00:07:04.05] I think what you're saying there is that you can simulate emotion with random actions.
Tim: [00:07:16.23] If that's true, it's really depressing. Like I--
Vim: [00:07:20.23] I don't know if that's even possible, but in my head it can't be that far removed from where we're at.
Tim: [00:07:28.15] Yeah. I don't know. I think this stuff is really complicated, and it's coming at us really fast. And I [unintelligible 00:07:37.00] there must be hopefully there are people thinking about it more deeply than we are. But--
Vim: [00:07:41.22] Yeah. Yeah.
Tim: [00:07:41.28] --like I hadn't read them.
Vim: [00:07:44.01] No. Well, when I was working in machine learning for a little bit--not as a coder but just trying to bring human-centered design into it--that's how it was explained to me: when you're building an algorithm, particularly to sell people stuff, you add in an element of serendipity as one of the things that you consider. Someone might be looking to buy a bike, but every five images or every five links you might show a scooter, just in case they think, "Actually, this is what I want." So it is something that's been built into the way that we access ads and services, and it is a part of it. So if you put that into the way we're thinking about humans and building AI with human-like behavior, that serendipitous moment where you offer something completely random just isn't that far removed.
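The "every five links, show a scooter" pattern Vim describes can be sketched in a few lines. This is a hypothetical illustration, not any real recommender's code; the function name, the every-fifth-slot rule, and the item names are all invented for the example:

```python
import random

def recommend_with_serendipity(ranked, off_profile, rng, every=5):
    """Return the ranked recommendations, but swap a random off-profile
    item into every `every`-th slot -- the 'show the bike shopper a
    scooter' idea. Purely illustrative; real systems are more subtle."""
    feed = []
    for position, item in enumerate(ranked, start=1):
        if position % every == 0:
            feed.append(rng.choice(off_profile))  # serendipity slot
        else:
            feed.append(item)
    return feed

rng = random.Random(0)
bikes = [f"bike_{i}" for i in range(10)]
surprises = ["scooter", "skateboard", "rollerblades"]
feed = recommend_with_serendipity(bikes, surprises, rng)
print(feed)  # the bikes, with slots 5 and 10 replaced by surprise items
```

Seen this way, the "random" scooter is itself a deliberate, scheduled part of the algorithm, which is Vim's point: engineered serendipity is not that far removed from engineered unpredictability in a synthetic human.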
Tim: [00:08:46.23] Yeah, yeah. So that then leads back to this thing about how do you prove that you're you and not--
Vim: [00:08:53.16] Yeah.
Tim: [00:08:53.26] --a simulation of somebody a bit like you? And how do I tell that I'm genuinely talking to a real human rather than--And does it matter, I suppose? But assuming it does matter to me that I'm talking to a real human rather than an algorithm, how do I know that they're real? And this is the kind of thing that is really coming at the social networks at the moment trying to deduce like trying to prove real identity somehow or--And yeah. I don't know if you've followed it, and actually you might have been in government at the time when the government ID service thing kicked off. Were you there for that?
Vim: [00:09:33.29] No, I was on the peripheries of it.
Tim: [00:09:37.06] Right. Cuz there was this whole effort which appears to be collapsing completely now where the government was trying to outsource identity to like trusted third parties like the post office which I kind of vaguely--
Vim: [00:09:52.01] Yeah.
Tim: [00:09:52.11] --think is okay maybe. But other things like the credit reference agencies, like I think Experian was one.
Vim: [00:09:59.16] Yeah.
Tim: [00:09:59.16] Experian is the company that's had I think the single largest data breach in human history. They were a trusted identity provider for the British government.
Vim: [00:10:09.15] Yeah.
Tim: [00:10:10.03] And--
Vim: [00:10:11.06] Yeah.
Tim: [00:10:11.21] And Pornhub is the other one, or rather the company that owns Pornhub is the other big identity provider that got into this. And bless them--I don't generally agree with them, but the Inland Revenue, the UK tax department, declined to take part in this government-backed identity scheme. And they said, "We'll run our own identity. We want to manage the identity of our--" What do they call it? Do they call us clients now? I can't remember.
Vim: [00:10:36.20] Mhm. Customers.
Tim: [00:10:39.25] Yes. Well, except that it's the wrong way around. They take money off us. No, actually that's right. Yeah. But anyway--
Vim: [unintelligible 00:10:45.21] customers assumes choice in how you spend your [unintelligible 00:10:49.28]
Vim: [00:10:51.03] Personal choice about using government tax.
Tim: [00:10:53.16] Yeah, although tax is more negotiable than you think from the outside.
Vim: [00:10:57.12] Yeah. [laughs] Yeah, it's definitely--No, you can't opt out.
Tim: [00:11:02.11] No, not completely. Although some people do. [laughs]
Vim: [00:11:06.17] Yeah.
Tim: [00:11:08.10] Yeah, so that whole thing kicked off. And it's been kicking off this week again.
Vim: [00:11:13.13] Mhm.
Tim: [00:11:13.13] And then they're gonna tie it back to access to websites. There will be websites you're not able to access unless you can prove you're over 18 with a government-sponsored identity provider, so that's all getting quite complicated quite fast.
Vim: [00:11:33.02] Yeah. That's interesting because quite simply I was on Facebook before I was the legal age to be on Facebook.
Tim: [00:11:42.05] Right.
Vim: [00:11:42.05] And had that existed there, then it probably wouldn't have stopped me from using it. But I'm sure there would have been a way around it as well.
Tim: [00:11:56.11] Well, yeah. So then you're kind of getting the whole fake ID thing.
Vim: [00:12:00.22] Yeah.
Tim: [00:12:00.24] Only digital.
Vim: [00:12:03.01] [unintelligible 00:12:04.06] fake IDs.
[Vim and Tim laugh]
Tim: [00:12:06.08] Well, yeah, what do we do with those? [laughs]
Vim: [00:12:08.23] No one drinks anymore, do you know? [laughs]
Tim: [00:12:11.17] Well, not in lockdown or rather--Actually, do you have to prove--Yeah, yeah, yeah. So I think that you have to prove that you're over 18 to receive a delivery of alcohol.
Vim: [00:12:24.01] Yeah, you do.
Tim: [00:12:26.05] Yeah, and I don't know how you check actually. Like if you--
Vim: [00:12:28.14] You're meant to show your ID when they give it to you.
Tim: [00:12:32.02] Yeah. But, well, who checks the ID? Like it's 9:00 at night and you're the Sainsbury's delivery driver.
Vim: [00:12:43.07] Yeah.
Tim: [00:12:43.16] Are you really gonna be able to check whether this is a legitimate driving license?
Vim: [00:12:48.03] Yeah. I once lost my driving license, and I have a Mauritian ID card. So I was using that cuz I didn't wanna take my passport out, and it got rejected in a lot of pubs and bars because it wasn't a recognized form of ID. And I kind of was like, "Yep, it's got my date of birth on it and my--Now why wouldn't you take this as real?" But I get that falls into a whole other conversation about identity and culture.
Tim: [00:13:21.02] Well, it sort of isn't, because what that tells you is that it's government-sponsored IDs that actually have the highest trust value.
Vim: [00:13:34.02] Yeah.
Tim: [00:13:34.26] And so it's an odd decision of the British government to try and get out of that business.
Vim: [00:13:39.03] Yeah. Well, is it? Or is it actually a way of trying to force people to be verifiable through government-selected means?
Tim: [00:13:51.03] Or in a broader context.
Vim: [00:13:53.07] Yeah: in order to access over-18 websites online, you have to have registered with the UK government through any of these forms, so that we can--not track you, but have an understanding of who you are and where you are.
Tim: [00:14:11.26] Yeah. And that won't get abused.
Vim: [00:14:13.28] It's really interesting for me because I'm working on a project at the moment around community-led security. And the way we're trying to gain community insights is through storytelling and a futures-thinking exercise of utopian and dystopian visions of different levels of control and identity and all of this kind of stuff. From kind of, your GP has been paid to give your medical records to the police--what do you do? How do you feel about it? What do you think the outcomes are gonna be?--to the polar opposite of, your GP has asked for your consent to give your medical records to the police. What do you do? Do you give your consent? Because we, as a society, aren't that informed about how this is all happening. And when you break it down into really simple products and services and exchanges, it shows the level of complacency we have around it. Like actually, do I care about whether my GP gives those medical records? Or do I only care because you're asking me whether I want to or not?
Tim: [00:15:30.02] Right. That was the great cookie experiment, wasn't it? The idea was that putting up a cookie banner would remind you that people are asking for data you didn't necessarily expect to give them, and give you an opportunity to say no. And that was, I think, largely subverted by the technology people who just made it such an unpleasant experience that everyone clicks 'Yes'.
Vim: [00:15:55.19] Yeah.
Tim: [00:15:56.29] There's a lesson there somewhere.
Vim: [00:16:02.23] Oh, yeah. That's not how you do informed consent, even if it's within the law. It's exactly the same as the Apple T's and C's that are always used as the joke: you just scroll to the bottom and instantly tick it regardless of what it's gonna be. And I can remember a really funny South Park episode that riffed off that. The South Park characters were buying an Apple product, and the terms and conditions actually meant that they were giving their consent to take part in The Human Centipede.
Tim: [00:16:42.03] Right.
Vim: [00:16:43.04] And this is my kind of thing like, "Why don't I actually know what I'm saying yes to at all?" Because I [unintelligible 00:16:49.23]
Tim: [00:16:49.21] Right.
Vim: [00:16:50.19] So--
Tim: [00:16:52.10] Yeah, I know. One of the games companies actually did put in their T's and C's something like, you've given your soul to the devil, as one of the clauses. And everyone agreed to it. I think they only ran it for a month or so. Everyone agreed to it until somebody spotted it, and they gave him a prize--I've forgotten what it was--for actually having noticed that they'd put something ridiculous in their terms and conditions. So, yeah. [unintelligible 00:17:24.01]
Vim: [00:17:24.01] Yeah. There are quite good--well, not good, but funny--cookie policies that say it's just for the cookie recipes and stuff like that, which is funny but also shows how few people would actually click on it to read it.
Tim: [00:17:47.02] Now, we are rebuilding a website at the moment. And one of our things is that actually we don't want to track anything on it, or rather we don't want to use external trackers. And we wanna do the absolute minimum tracking. We wanna check the load on it and make sure it's stable, and that becomes a little bit--
Vim: [00:18:14.00] Yeah.
Tim: [00:18:14.25] But we're not interested in individuals. But it turns out that actually if you go to like a website builder and you say this, they look totally horrified at the idea that you might not want to have like the full suite of Google Analytics and all of this.
Vim: [00:18:24.20] Yeah.
Tim: [00:18:25.00] And I'm like, "Well, we don't do that. We don't need the data, and we don't want to collect it."
Vim: [00:18:30.02] Yeah.
Tim: [00:18:30.08] And they're like, "Oh, well, it'll cost you more." And so we're actually paying extra to have a site with less functionality. I get why, cuz it's not what they normally do, so they can't just cookie-cutter another one.
Vim: [unintelligible 00:18:42.10] to them.
Tim: [00:18:44.00] Yeah, yeah.
Vim: [00:18:45.00] That's interesting. So, with Culture Shift we are exactly the same. We're not interested in any identifiable information or data. And we just don't wanna track people using the site, because it's people reporting bullying and harassment, or experiences that they're trying to get support for. So we're not interested in who they are or what's happened to them. But we are interested in how difficult it is to fill out a form to get support, because that is how we need to try and remove those barriers.
Tim: [00:19:22.25] Right.
Vim: [00:19:23.10] And trying to explain that to people is really difficult. And actually, you don't have to tell people all of that backing, but in order to kind of explain it: we don't care who you are, but we do wanna know how you use the site, which is quite interesting. And what I found personally is, like I was saying before, the more you try to ask and inform, the less likely people are to comply. Whereas if you stay silent and you don't talk about it, that complacency rises.
Tim: [00:20:05.11] Yeah.
Vim: [00:20:05.15] There's much more of an inertia of, "What? I don't care. I'm much more interested in using the service." As soon as there's a hint of choice, it's then like, "Well, no. No, I might not do it."
Tim: [00:20:19.22] Yeah, yeah. Yeah, I know--
Vim: [00:20:20.19] And I do believe--Sorry.
Tim: [00:20:22.19] I don't know how you fix that.
Vim: [00:20:11] No, cuz you're trying to do the right thing.
Tim: [00:20:28.27] Yeah, I think the only thing you can do is to make it implicit that that is what you're doing rather than asking a question. Make it kind of almost the point of the site, something you're really upfront about: this is what this site does. And anything that isn't core mission to the site, you just don't do.
Vim: [00:20:52.26] Yeah.
Tim: [00:20:53.01] But then, like I said, for sort of stability and quality purposes, how do you do that? We're doing a similar thing for somebody else, and what we're actually ending up doing is running a quite large beta program, so that in the beta program people understand that they will be tracked.
Vim: [00:21:16.19] Yeah.
Tim: [00:21:16.23] And so we'll find all of the issues in the beta program. And then when we roll out to real life, we'll hopefully know enough that we can turn all of that off--all of the questionnaires and all the nagging prompts and stuff. And then--
Vim: [00:21:33.16] Yeah. Yeah.
Tim: [00:21:36.16] But I guess we'll miss something, and we'll never get told until somebody complains kind of loudly enough to get through.
Vim: [00:21:43.09] Yeah. Yeah.
Tim: [00:21:24.11] Which is tricky.
Tim: [00:22:04.15] Yeah, that's what incognito mode is for.
Vim: [00:22:08.04] Yeah, yeah. True. Although some of the T's and C's on social media websites override incognito, so they can still track you even if you're in incognito.
Tim: [00:22:21.04] Yeah, there's a big effort in the standards bodies to make that less and less successful.
Vim: [00:22:27.07] Yeah. I--
Tim: [00:22:28.17] But it's still doing nothing.
Vim: [00:22:30.28] Yeah, I also wonder if organizations should be forced to ask multiple times, to continually prompt people to reassess cookie policies and consent, so you don't just set it once with something like Facebook. Maybe every month you get prompted to reassess what you still consent to.
Tim: [00:22:52.03] Yeah, Facebook is actually perfectly usable in incognito mode.
Vim: [00:22:56.26] Yeah.
Tim: [00:22:57.13] I do that the whole time. Google weirdly isn't. Google is unusable in incognito mode. It does keep asking you to log in which is like well, it drives you towards DuckDuckGo and the others who don't take that view. But--
Vim: [00:23:10.03] Yeah. I feel [unintelligible 00:23:11.25] meant to be evil though.
Tim: [00:23:14.07] Meant to be not evil.
Vim: [00:23:15.27] Yeah.
Tim: [00:23:16.00] But I think we're way past that point.
Vim: [00:23:18.19] Yeah. [laughs]
Tim: [00:23:20.07] But, again, what they know about you is just amazing. Although, weirdly--and this is the other thing that struck me this week--how much this sort of knowledge actually ages. So I had a thing with the bank where the little token's battery is running out, so I want to replace it. And so I rang them up to say this. And they said, "Well, you'll have to answer your security questions." And I said, "Well, that's going to be interesting cuz I don't even remember when I set them up."
Vim: [00:23:51.12] Yeah.
Tim: [00:23:51.18] And she said, "Oh, well, I hope it's a football team because that's the only thing that people never change."
Tim: [00:23:58.20] And of course it wasn't. All of those "what is your favorite X" questions--they change over time. Like you go off people, or singers, or ice creams, or whatever. You change. And so you actually have to cast your mind back to, "Who was my favorite singer 15 years ago?"
Vim: [00:24:27.03] Yeah.
Tim: [00:24:27.22] I don't know. And so it's ridiculous to kind of cast those kinds of questions in stone. The whole thing just fails for me, that sort of test of identity of like what your answer to a question 15 years ago was. I like to think we evolve a bit.
Vim: [00:24:47.11] Yeah. It's quite sad if we don't. I do think there's something really interesting about who we are, our online identities versus our offline identities and how companies or even the government to a certain extent are trying to merge the two and also how conscious we are of how different those identities are.
Tim: [00:25:11.08] But do you think that lockdown has blurred those lines?
Vim: [00:25:15.25] Yeah. Yeah, it has. We've had to become our online identities in some sense.
Tim: [00:25:22.05] Andwhether that in some cases that means projecting more of ourselves into online than we necessarily did in the past, and I think that's true for a lot of people. Or in some cases it's like kind of changing our online identities to be more representative of who we kind of really are.
Vim: [00:25:41.12] Mmm. I had quite a harsh reality check a couple of weeks ago because I noticed myself, polarizing is too strong a word, but definitely becoming less able to listen to other opinions particularly about race. And I had this horrible moment of like, "Oh my gosh, it's happening." Like that merging of like the echo chamber. I'm becoming my echo chamber in ways I've never wanted to or realized before. I follow a lot of strong people of color on social media who talk a lot about injustice and racial injustice and all this kind of stuff, but I definitely don't ever wanna get to a point where I can't have a constructive conversation about race. And I felt myself getting to that point, and it terrified me. I was kind of like, "I've become that person that just can't debate well anymore online or offline." Because those identities or that influence has happened without me even realizing, and it was terrifying. Absolutely terrifying cuz I consider myself a well-informed person about this.
Tim: [00:27:00.02] And I think what that's about--one of the potential causes of that--is the lack of serendipity. Like the thing about kind of real life is that you do bump into random people who you aren't looking for, who aren't in a curated environment, and maybe you do have a quick chat at the bus stop or whatever and learn something. Maybe you don't. A lot of the time you really don't. But some of the time you do, and that's where we get our new inputs from. And I think the problem with digital media is that it's so actively curated, either by ourselves or by the services, that we don't see a whole swathe of views and people. And also people self-curate the views that they put out there, and not always in a good way. It's tricky, this. I don't know how you kind of induce more warm serendipity without--I don't know. It's hard. It's really hard stuff.
Vim: [00:28:06.17] Yeah. Yeah, especially when it's something that is important to you. Like it's--one second, the dog's gonna bark. I forgot what I was saying. Building in serendipity moments.
Tim: [00:28:24.29] Yeah. And how do we do that? How do we get the digital version of that?
Vim: [00:28:30.25] Yeah. Yeah. But also without someone in the room kind of going, "Just to play devil's advocate," which is a phrase I hate. So I think that there is something in, yeah, trying to do it in ways that are constructive and true and, like you said, authentic. And it reminds me of the conversation we had, the podcast we did, around trying to embed that into social media as a result of the 2016 election, and how you need to be trained almost to instigate those types of conversations or those serendipitous moments. Because it's--
Tim: [00:29:13.05] Do you think that's effectively a new kind of manners that we'd have to learn? There used to be social conventions about how you met people, and there probably still are; we're just not aware of what they are.
Vim: [00:29:28.08] Yeah. Yeah. Yeah, I think you're right. And I think that's what the focus on emotional intelligence and cultural intelligence and being human is all lending itself towards: how we build that back in. Because for a long time the focus has been purely on academic achievement and IQ and hard skills. And I think we're seeing the reverse happen now, where it's getting back to self-reflection: who am I? Who do I wanna be? How do I live in a multicultural society? How do we work together as humanity? And I think the pandemic and the social movements that have happened throughout it have played a big part in that, particularly in the UK where there was a really strong sense this time last year of everyone being in it together. I think we've lost it now, but it lent itself to a much deeper, empathetic understanding.
Tim: [00:30:37.12] I think a lot of that is true for adults, but I think children have had a completely different experience of the pandemic. And I think the lessons that they've learned are gonna be very hard to unlearn. I think their attitude towards technologies, and particularly kind of real-time technologies, is gonna be tied to their experience of being forced to sit in front of interactive learning--
Vim: [00:31:08.20] Mhm.
Tim: [00:31:09.06] --in a way that it's been pretty unsatisfactory actually--
Vim: [00:31:14.29] Yeah.
Tim: [00:31:15.05] --for all of them. And I had an interesting experience where a friend of mine whose seven-year-old had a birthday during lockdown who couldn't have a birthday party--
Vim: [00:31:27.21] Yeah.
Tim: [00:31:27.21] --and hasn't been hanging out with his school friends obviously for the last how many weeks it is at this point and won't do that over like Zoom because Zoom is a school thing.
Vim: [00:31:40.19] Yeah.
Tim: [00:31:40.22] So we built this little racetrack and drove around. And what we're--
Vim: [00:31:44.10] I saw your tweet about that.
Tim: [00:31:46.18] Which was lovely. They had like five of them driving around these little robots. But what was really interesting and what I kind of took as a big win was that they also would put them all into like an audio conference at the same time. And they just like chatted and did the stuff that seven-year-olds do, and I wasn't listening to this which is even funnier actually cuz you could just tell what they were doing, what they were saying from how they were driving the cars. But what was really interesting was that one of the parents contacted me afterwards and said they are now starting to do social conversations over, well, not actually Zoom but one of those technologies in a way that they weren't before cuz they could see that it's fun and it's not only for school.
Vim: [00:32:33.09] Yes.
Tim: [00:32:33.25] And I think there's a bunch of things that there's the way that technology is being used for remote schooling. There's a bunch of prices we're gonna pay for that down the line. I don't know what they are, but it worries me.
Vim: [00:32:49.19] Yeah. Yeah, I definitely agree. And I don't think it's the schools' fault at all. I think the way schools have responded has been outstanding, because it's completely unprecedented. But I think if you did have more technologists in schools, there would have been really different solutions to videos and video calling. Well, exactly like you've proven there--you built the racecourse thing. And even instead of like pair coding, imagine doing paired maths or paired English or all of those different things that you could in theory do, which, not that I've seen, hasn't happened.
Tim: [00:33:38.14] Yeah, I think the whole doing things doing homework together that kind of stuff, break up into groups and do this.
Vim: [00:33:46.22] Yeah.
Tim: [00:33:46.25] I guess some people have done that, some teachers have done that. They're already, as you say, doing an amazing amount of work and like extra work than they have over and above their normal kind of load.
Vim: [00:34:03.00] Yeah.
Tim: [00:34:04.06] So asking even more of them is unreasonable. But--
Vim: [00:34:36.14] Yeah. Well, I'm not saying the teachers should have done it. I'm saying there's an opportunity to rethink group work in education, and bringing different skills into that mix is only gonna benefit the outcome. Imagine a service designer, or a technologist, or a systems thinker, or any of these kinds of roles that are rife in technology, in that field. That's the kind of thing that gets me excited--to think about how many different options there could have been to rethink learning.
Tim: [00:34:45.02] Right. Right.
Vim: [00:34:46.25] And now I think--
Tim: [00:34:46.25] I wonder what their attitude towards identities can--Kind of circling back to the beginning, I wonder what impact that's gonna have on their attitude towards identity.
Vim: [00:34:58.28] Mmm. Yeah, I don't think kids would care. We tried to watch The Social Dilemma with a 13-year-old and a 10-year-old, and they just didn't care at all. Granted, it's not a program for kids. Because they use the internet a lot, we were trying to show them just what is happening when they're doing it, especially from a young age. And they just didn't. Yeah, didn't care at all. Just--
Tim: [00:35:32.18] Yeah.
Vim: [00:35:32.21] --wasn't helping.
Tim: [00:35:33.06] Yeah. Interesting.
Vim: [00:35:35.10] Mmm.
Tim: [00:35:36.07] Although, they do evolve their own ways of like dealing with this in terms of generating like separate accounts for different purposes in their lives like--
Vim: [00:35:49.23] Yeah.
Tim: [00:35:49.25] --accounts their parents don't know about, or accounts that their closest friends know about but the kind of general peer group don't, and that kind of stuff. So they do get that, definitely--the sort of phenomenon of having multiple Instagram accounts for different purposes. They do get compartmentalization, but I don't think it's necessarily in the way that you're thinking about it.
Vim: [00:36:16.08] No. No, but that is probably one way to move. It goes back to my point of, if you've got multiple online identities but one real identity, how do you then verify who you really are? And that's not verifying in the physical sense; I mean from an emotional, psychological sense. Like who are we? [unintelligible 00:36:42.12]
Tim: [00:36:43.23] Wait, aren't we the sum of all of those? Isn't that what we're--
Vim: [00:36:46.04] Yeah.
Tim: [00:36:46.17] I kind of hope so anyway.
Vim: [00:36:51.08] Mmm. Are we though or do we put on a facade for those different versions of us? Like today I'm gonna be that funny account where I send all my jokes versus the serious one where I post all my quotes or whatever it is.
Tim: [00:37:15.10] But do you think that those aren't both part of you?
Vim: [00:37:19.17] Hmm. Yeah.
Tim: [00:37:20.15] Aspects of you. And like in real life people who are actual close friends would know both sides of you from that point of view from--
Vim: [00:37:30.06] Maybe. Maybe. Because I don't understand it as much, I think about it too deeply. And actually it's just really simple, like you said: it's just all of our personality funneled into these different outlets. And that's not a bad thing.
Tim: [00:37:47.06] If you wanna find the people who've been thinking about this for years, it's actors, because they're doing that big time and always have. So maybe that would be the place to go: read some kind of deep-thinking actor's autobiography, and you'll probably find the answer to those questions.
Vim: [00:38:10.21] Yeah. Yeah. So, so interesting.
Tim: [00:38:14.02] So we haven't solved any of this.
Vim: [00:38:16.13] Well, I don't think we can. [laughs]
Tim: [00:38:18.10] Maybe we'll get it all nailed down Saturday morning.
Vim: [00:38:21.19] Yeah. I don't know. There's so many levels: the individual, then the societal, then the kind of global elements of what the impacts are. Like it--
Tim: [00:38:40.00] Yeah, it's gonna come up. Like this whole thing is gonna come up in a really big way with vaccine passports.
Vim: [00:38:47.19] Yeah.
Tim: [00:38:48.08] Like that's gonna be the ultimate transnational identity. Apart from actual passports and potentially driving licenses, this is going to be the identity document--
Vim: [00:39:01.07] Yeah.
Tim: [00:39:01.15] --that matters in the next few years. And who gets their hands on that technology is gonna be really, really interesting to watch.
Vim: [00:39:09.07] Mmm. Yeah, and it goes back to what I was saying about you don't get a choice of opting in or opting out if it means you can't leave the country or go to the shops. And then what?
Tim: [00:39:26.10] Right. It's going to get to the point where you won't be able to go to a festival without one.
Vim: [00:39:32.19] Yeah, yeah, yeah.
Tim: [00:39:35.03] Yeah, yeah. It's gonna be interesting. I'm watching that with deep interest. We'll see how it plays out. I suspect the blockchain people will win, but that'll be interesting.
Vim: [00:39:44.01] Yeah. [unintelligible 00:39:46.05]
Tim: [00:39:46.18] Anyway--
Vim: [00:39:48.00] This podcast is always good for me because I'm not used to being the pessimist in the conversation.
Vim: [00:39:53.01] And I always am on this, so it's great to get to embrace that part of my personality.
[Vim and Tim laugh]
Tim: [00:40:00.26] Well, I'm not used to being the optimist. And I do try a little bit on this. Anyway, cool. Yeah, I think that's a good place to stop with a small dash of optimism perhaps.
Vim: [00:40:12.02] Yeah.
Tim: [00:40:12.19] What do you think?
Vim: [00:40:13.12] I like it. I think that's a really good conversation.
Tim: [00:40:15.27] Cool. All right, I should press the button.