
[00:00:00]Tim: I'm Tim Panton
Vim: and I'm Vimla Appadoo
Tim: and this is the Distributed Future podcast. This episode is about how our smart objects are starting to leak secrets because we share them. I think the simplest example, and the one that I first really came across, is when you borrow a hire car.
Quite often the radio already has everybody else's phone numbers in it. The previous person has synced their Android phone to it, and it's got all their phone numbers and the numbers they called, and you know
Vim: Yeah
Tim: and like, that's annoying, and if you're careful it doesn't happen. But the conversation we have with Stella is about how that then leaks further.
You know, what happens if you're in the middle of a messy divorce but you're still sharing a car? Then knowing that you've rung your [00:01:00] solicitor, or you've rung your mother-in-law or whatever, may actually be part of that negotiation, and you might not really want the car to tell somebody that.
Vim: that's so interesting.
Tim: So there's this assumption that your owned devices can now carry secrets to other people who are allowed to use them, but somehow they don't understand whose secrets are whose, or something. It's kind of intricate. It's an interesting space and it's very much still evolving, but I think
Vim: yeah.
Tim: No, it's not obvious how we work around it, apart from
Vim: No. And I guess it's that extra leap on from internet history and my browser history.
Tim: so yeah, it's shared objects around the house.
Vim: Yeah, so if you are speaking to your Alexa, or even sharing a laptop to a certain extent, how do you keep, and I know there are kind [00:02:00] of the incognito browsers and things like that,
but how do you keep whatever you're doing a secret from other people that you share those devices with?
Tim: Right, right. And I mean, it used to be that you kind of logged in, but on tablets you don't really log in; there's sort of no concept of switching users, particularly. So it doesn't really apply in the same way.
Vim: No, not at all. But even with your Alexa device there's no sort of recognition around that either, like no kind of per-person verification on it.
Tim: Yeah, I mean, simple stuff: you get this situation where your preferences get muddled up with other people in the family's preferences.
So
Vim: yeah,
Tim: You know, like the smart TV starts showing you episodes of something that somebody else in the family watches, or it encourages you to watch them, and that may be fun, but it may not be what you had [00:03:00] in mind.
Vim: Yeah,
Tim: They might not want you to know their guilty pleasure, you know, that they watch whatever it is.
Vim: Yeah. Well, the same with shared Spotify accounts as well. I know friends of mine who are parents who let their children have access to their Spotify, and their playlists, like their daily recommended playlists, then become warped by whatever their children are listening to.
Tim: right exactly exactly.
So there's this sort of idea that an object doesn't have different faces it shows to different people, or it doesn't seem to, and maybe it needs to; maybe it needs to understand who it's talking to.
Vim: But is that where the question comes in of how important it is to us as users to have that personalization as well?
So in my mind the compromise is: you either let the technology have access to you as a whole person so that you get a [00:04:00] tailored service, but that means anyone else who uses that device, that technology, can also see what you're doing; or you opt out of that and have a less tailored service because your data is not being collected and stored or shared.
Tim: Right, but this is more about the device having the concept that multiple people are allowed to use it but they don't necessarily share everything between themselves. So it's sort of somehow decoupling physical access to the thing, and permission to use it, from the data that it's pulled in.
Vim: Yeah
Tim: But actually that has [00:05:00] a whole bunch of serious implications as well. You know, the ex who's a stalker and that kind of stuff, who may still have an account on a shared tablet that you've left there, or whatever. All of these things, we're still learning, I think, how to deal with them, and one of Stella's points was about how, if you don't have a diverse team, you don't understand half the threats that might be involved.
Vim: That's interesting. Threats in what sense?
Tim: Well, in the sense that if you don't have anybody on your team who's experienced a messy divorce, then you don't necessarily know that you'd want to keep those things private. Or, you know, if you don't have any women on the team, there may be another set of things that women would choose to keep private that the boys on the team don't have a clue about,
Vim: Yeah,
Tim: [00:06:00] but wouldn't know to keep private. And there are cultural differences: there are things that are shared within a family in some cultures and are not in others. It's a set of quite complicated rules, and the less diverse your team is, the fewer of those rules you're going to automatically pick up on, I think.
Vim: or even understand or even try to consider at the very beginning.
Tim: Right, right, exactly. So that was actually quite salutary. And the rental car one has actually become quite a thing. It's a standard hacker technique now to rent a car near a corporate headquarters, because you can pick up a lot of interesting phone numbers with names on them really quickly.
Vim: Yeah, that's scary. That's really really scary.
Tim: Well, so [00:07:00] they now have a policy: like they wash the car between rentals, they're supposed to wipe the data on the head unit for the same reason, but
Vim: Yeah,
Tim: you're still assuming that they're going to, and maybe they will and maybe they won't.
Vim: Oh, yeah,
Tim: but super
Vim: you have no idea
Tim: super hard to remember to do it when you hand the keys over at the end of your hire.
Vim: Yeah, it wouldn't even cross my mind to.
Tim: Right. I mean, you do know that you've uploaded them, because it prompts you saying, you know, do you want to upload your contacts?
But in my experience you pretty much can't use it until you do, which is pretty bizarre.
Vim: And what interest is it for the car rental company to do that?
Tim: Oh, it isn't; it's just a standard feature of the car.
Vim: Right, right.
Tim: So the thesis is that almost all cars are bought by individuals who keep them for five or six years, or two or three years, or whatever.
And so it's convenient to have all of their phone numbers [00:08:00] loaded into the car. But if you're only going to rent it for a day, that equation, the convenience thing, doesn't balance the same way at all.
Tim: Yeah, and the car manufacturers don't make cars specifically for rental, so they don't have different behaviours.
Vim: That blows my mind a little bit.
Tim: Right, right. I mean, once you start looking for these things, they crop up all over the place. The one that got me the other day is that I've got a credit card that sends me a notification every time I spend money on it. But if I spend money on it and I've left my phone on the table at home,
then whoever's sitting at the table can see what I'm spending money on.
Vim: Yeah, that's really interesting.
Tim: So you don't actually know who's seen it.
Vim: Yeah. I thought the same when... So I use Monzo, and you can now have a [00:09:00] joint Monzo account with anyone that you choose, and it obviously means that if you're spending money off that joint account, by accident or whatever, the other person who has access to that account can see what you're spending.
So I accidentally used it in the wrong instance, and the person I share the account with was like, oh, I think you used that for the wrong thing, or did you mean to buy this on that card? And it was that moment of, oh yeah; that moment where that little slip-up means that you've seen something that I've bought.
Tim: Right, right, exactly. And it may or may not matter, but it's not a thing that we've been naturally thinking about.
Vim: No, not at all.
Tim: And Stella is very funny about this. She said, you know, it's easy, you just ask your hacker friends. And it's like, well, not everybody has hacker friends, you know.
Vim: Yeah. Yeah, exactly. And not only that, but to a much lesser extent it's the same thing. I think we [00:10:00] spoke about this in the last interview we did: the way people feel like they know who you are through your online persona. So even people that I do know, like my mum, will be like, oh, I saw you went here the other day because you posted it on Facebook, and it's that thing of, oh my God,
I have no idea really who has seen that and knows where I was.
Tim: Right, right. And I now don't post in real time; I post pictures a day or two later
Vim: Yeah
Tim: and talk about places a day or two later.
Vim: Yeah
Tim: I almost never post when I'm actually at a place. So, you know, although you'll know where I've been, you don't necessarily know where I am right now.
Stella: Okay, so I am London-based and I work in security awareness, doing training for a medium-sized business, and we are looking very much at how we can make things much [00:11:00] more engaging and relevant for people. One of the things we work on with government agencies, and that I also do, coincidentally, on the volunteer side out of private interest, is that I work a lot with groups that represent domestic abuse survivors, or targeted groups, to help them make informed choices with regards to IoT.
And then on the other side of that we also work, or try to work, and would like to work a lot more, with the people who design products and who are involved in that whole process at the other end, at the creative side, to understand how they think and to help them design much more with those communities in mind.
Tim: right?
So this first really raised its visibility for me when I got into a hire car a while back, and the hire car [00:12:00] offered to sync its contacts with me, and it was full of other people's contacts. People had just left all of their phone numbers in this hire car's radio. And after a while I discovered why, which is basically that you couldn't use it as a sound system in any way without syncing your phone with it; the moment you touched Bluetooth,
it wouldn't let you continue unless it had all your contacts. And it's like, well, I'm not doing that. Then I started thinking, well, probably it's not a total disaster if I do do it, although I didn't; but for some people that would be, you know, a life-or-death situation. And how that service design got done is a mystery to me.
Actually, obviously nobody thought about it.
Stella: Yes. Yeah, I think so. Somebody put a comment up about tech the other day, about everything really, saying that we've designed for convenience and not for community. And I think we design everything now very much around sharing and not around [00:13:00] privacy.
So yeah, I don't think it would have been designed like that out of malice or anything. I know that a lot of hire car companies now, I can't remember which one it was on the day, but I read that a couple of companies now, as part of the post-hire valet service,
will wipe, well, they're supposed to wipe, the history of the phones in the system. I know that when I had my Ford last year, because I used Sync, I made the choice to sync, and also because I was setting it up and I knew there were privacy issues with it, I deliberately did that, because I had to get into it to establish the privacy issues around it, of which there were quite a few. But after that, every time I switched the engine on it would tell me that I had to notify my passengers
that their privacy might be violated, effectively. And I think that's a real issue when you're designing things. I mean, if you've got to put a notice up like that, [00:14:00] then perhaps you should be thinking about why you're doing it.
Tim: Well, and to some extent what the benefit is. Like, you know, I suppose you're saying it's convenience, basically, isn't it?
Stella: It makes money, doesn't it? I mean, when I spoke to Ford about this off the record, and I had a number of people in their engineering team over in Silicon Valley speaking to me, off the record as such, but I can reveal that what they said was, we see it as an issue for when you resell the car. Which is a completely relatable, straightforward thing, isn't it, to see that as your issue: when I sell it,
I don't want anybody to be able to see what I've been doing. But they don't really want to talk about the fact that it is actually an issue. And then the CEO of Ford went on NPR last year and said, well, we're using all that data. All the data; they're scraping everything, which includes your GPS, even if you're not using the in-car GPS, and includes [00:15:00] every single voice recording. And they're not the only brand of car doing this: every single conversation, every single sound that is made in the car is being recorded in a lot of these new models.
And that's aside from everything else you might be doing: your location, your speed, your braking, everything. And he admitted that they're going to monetize that. So from here, two, three, four years ahead, I think, you know, you're going to get into a car and it'll say, hey, do you want to go to
Applebee's again with Patricia, like you did last week at 6:00? And then do you want to go to the Marriott? You know, it's that precise.
Tim: Right, right. And if the person you're with isn't the person you got into the car with last time, that's not necessarily a good thing.
Stella: Right. And that's the example I have to use to make people listen, because they see that as a threat.
Oh my goodness, my personal life; oh my goodness, my eating habits. Even if they're not having an affair, they might not want their co-workers to know they eat at Applebee's on the regular. It's that kind of thing, [00:16:00] and I think we shouldn't need it to be relatable to us. We should just be thinking, actually, you know, should these things be designed in this way? Should this amount of data,
is it really necessary for my conversations to be recorded? And I can see how that would be an issue for, say, insurance. You know, I would love to know, I haven't done any study on it, but I'm sure that already there are situations where people are having car crashes, and insurers are probably analyzing that data and deciding whether or not they're going to make an insurance payout, whether or not they reveal that to the owners of the car, the premium holders, or not.
Tim: Right.
Stella: I mean I would be interested
Tim: There was somebody who bought a couple of written-off Teslas and found that the video of the crash was actually still in the Tesla,
Stella: Yes,
Tim: but it had been written off. So even then... Well, the other thing I tend to do in this space, [00:17:00] depending a little bit on who I'm talking to, to try to get the message of privacy over, is:
if you're talking to a lawyer, if they're having their conversations recorded, they suddenly realize that this isn't a good idea, because their professional ethics say, you know... Well, I mean, the conversation they're having may invalidate a patent application.
Stella: Yeah,
Tim: And if that's recorded, it could count as somebody else having prior knowledge of it, and then that's gone. If you're having a discussion about, you know, your share price prior to an IPO, that's privileged information and you're in breach of a whole load of, you know, Department of Commerce regulations. The moment you start recording things that people are assuming are secret for perfectly legitimate reasons,
and they don't even have to be personal secrets, they could be corporate secrets, [00:18:00] you're just leaking this stuff out and undermining a whole lot of assumptions. And I think that's what all of this comes down to: people make assumptions about privacy and secrecy that are now no longer valid.
Stella: Yes, and I don't think that they're able to make informed choices.
I mean, I got a new phone the other week and I had to reset my Apple Watch, for instance, and all the health things were automatically opted in, so I had to go through and opt out. As I was saying to somebody this morning, I like to have the health app on my watch, but I don't want to be told every five minutes that I need to breathe, or my heart rate is too high, or I need to stand up.
You know, I don't need that judgment in my life; I have my mother for that. But also, that should be opt-in. I'm relatively tech-savvy and I still have to go through and opt out of these things. And so it's rather like, you know, if I go to the doctor, I [00:19:00] trust that doctor to an extent. I mean, I think working in this sector makes you trust everybody a little bit less, but I'm not too nihilistic.
So I go there and I trust them not to be selling information about my health issues or my concerns or my questions to insurance companies or to my boss or anybody else with outside interests. And I think that trust is, broadly speaking, kept. But then if you go and use a platform or a service, or indeed you have to drive a car,
you have a right to expect the same sort of thing, the same sort of respect for your data, for any information you're giving it. And I think it's very difficult for people: you can be incredibly tech-savvy, you can be the leetest hacker in the world, and you could still fall victim to this, really, because of the assumptions we make about trust. And I think trust is very important, and it shouldn't be betrayed.
I get quite angry about that kind of thing.
Tim: I [00:20:00] think my all-time favourite, you mentioned the watch, my all-time favourite with the Apple Watch is an example somebody gave us on the podcast a couple of weeks ago: this woman found that her toddler had discovered that they could unlock the MacBook just by moving mummy's arm while she was asleep,
so the watch was in range of it. And that's not malicious, but it's an unintended... It's a set of assumptions that have been made by, pretty obviously, single young people.
Stella: It really works; I love that feature. I mean, I've actually advised people that it's a good idea to have that set up.
If you're using a Mac and you've got a watch, then as far as screen lock goes, it means that if people are working on a Mac in their office, from an awareness point of view, when they [00:21:00] leave the proximity of their device it gets locked. So I advise people to set that up, because it's great.
You know, in theory you have fewer devices that are just left completely unlocked when people step away from their desk or visit the bathroom, or whatever else, and forget to lock them. So it takes the responsibility away from you. Yeah, children are a threat model in themselves; just get dogs.
It's much easier. But it's interesting, because I was talking to somebody from UCL this morning, and they're doing some really, really interesting work, and I can send you the link so you can put it in your show notes if you'd like to, I don't know how you present things. They're really looking, and I'm hoping to help them out a little bit, to work with people who are developing platforms, services, whatever else, and we're thinking about what we could offer people, a structured tool or an exercise, could we talk to people about
how [00:22:00] to put some brakes in place where things are being designed, so that people can use those to make good decisions, sound decisions. Because we're talking about more than just ethics, and we're talking about highly intelligent, capable people
who are designing these things, but none of us are perfect and nobody can think about everything. So I think we would love to talk to people if they have what they think is really good practice, or if they think that they would like more help in this area, and we can maybe talk to them about how to design something for them, or work with them, or what processes would be useful, because I think people are genuinely interested
in doing good things. I don't think people genuinely set out on a Monday morning to make nefarious decisions. I do think it's difficult, however, to get board-level buy-in; you know, if you think about the CEO of Ford, he's happy that he's got his stuff [00:23:00] monetized.
Tim: Yeah, but that probably doesn't apply to him personally.
Pretty sure he has an opt-out on that.
Stella: Yeah, I'm sure he's damn sure. Although you never know, because we've seen with Facebook that you have to actually shut the entire platform down to delete all your CEO's posts, don't you, the ones he made prior to actually realizing that privacy was important. But I do think that there's a larger will to develop things the proper way.
I think the problem is... What I would say to people, if they're trying to think about how to develop things for people, for positive groups, is: if you're developing a platform or a product or a service, think about how your hacker friends would use it and deviate it from its original use. Think about that,
and think about how you yourself would deviate it, or move away from the original purpose, because I think that's what people in tech do. They're there, and they're good at it, because that mind works [00:24:00] towards taking things apart and breaking things down, trying to work out how this could be better, how this could improve, what's going to break this, what's going to fix it.
And I think we sometimes develop things and then forget that people are not going to use them as they're intended to be used. And that shouldn't be too difficult to remember, because most of the people I know who are that way inclined spend the entire day taking things apart.
Tim: I think the problem is that we don't all have hacker friends, and, you know, it sort of should be mandatory.
Stella: If you work in tech, if you're a dev, if you're working on developing an app or
you're in that kind of space, then in theory you would. But I understand that people don't. I mean, I worked at a place where several of the team developing things had absolutely zero interest in [00:25:00] security or privacy or anything, or "privacy", as my boss says I should say it, because I still say it in the American way, so I do apologize, but I was in America,
so I've got my American accent again. It just took me an hour in Target to get back to my bad habits.
Tim: all of your bad habits.
Stella: Yeah, all of them, back again. But I think, again, that's why we would be really interested in talking to people, and I know certainly the team at UCL are doing an amazing job, and they have done incredibly insightful and very thoughtful work, and they're aware that there are limitations
on people when they're designing things and thinking about things, and you can't think of everything; none of us can.
Tim: I mean, you definitely can't. I had a surprise the other day, which was kind of interesting. I realized that something I wanted to watch was on Netflix. I don't have a Netflix account,
but my daughter was here [00:26:00] a while back and she used our TV to watch Netflix, and her account was still logged in. So I messaged her and said, do you mind if we use your Netflix account? She said, no, go ahead, but don't judge my choices.
Stella: oh no, yes
Tim: And this made me realize that something as innocent as a TV is actually... I mean, as it happened, I know what she watches on TV because we chat about it, or at least I think I do. But, you know, was she going to be happy for me to use that account? And it turns out that Netflix actually caters for exactly that sharing within a family; it's a known model and they've designed for it, which is really nice.
But it was that whole arc of thinking: TVs aren't something you think about in the context of privacy.
Stella: No. I mean, even when I set up my Apple TV, you can make a new account, but it will [00:27:00] automatically sync with your phone, which obviously works for me with the children, but it syncs everything up, even your photos.
And I don't want all my photos being synced up and visible on my TV to anybody who might be using my TV, because my photos are private. You know, I have screenshots I take for work, or because I'm thinking about buying stuff for the children, for instance, and I don't want them going, oh look, that's what mum's going to buy us for Christmas. Some things are just private.
So yeah, these things are set up like that. And I had that situation last week: my parents were at my house, I don't really watch any TV, and they were restricted to using my Apple TV and the apps within it, so Netflix and iTunes, sorry, the iPlayer and so on, and of course all my choices are on there, because I haven't locked it down to the extent of expecting an adult to be [00:28:00] walking around on there
using my login codes and things. So yeah, I think a lot of things are set up as if everything in the home and everything in our groups is completely open, as if none of us have anything we'd ever want to hide, and as if hiding were a bad thing. And it isn't, because you could have the most open relationship with anybody in the world and still want to buy them a gift, or, you know, maybe vent about them or something.
But equally, and this is a couple or three years ago, I remember I wrote something on one of the security blog sites about privacy within your group, within your family, in your intimate circle, and I had a lot of security people saying, oh, but my girlfriend and I, or whoever, share each other's passwords.
We don't have passwords on our devices, we read each other's devices, we just go in. And I think we perhaps need, [00:29:00] as a society, to have a discussion about that: about how much should be private, and why it's good to be a little bit private. Because I think if we come at it from that angle we might make some progress, because at the moment it's all about oversharing: oh yeah, where did you go that weekend?
I saw you were at Coachella or something, and you don't even know them; it's just because you were on somebody's Facebook. So people are giving away a lot. And maybe we've become suspicious; we've been trained a little bit by the government as well to be suspicious of people who are using encrypted chat apps like WhatsApp. I know abuse survivors who are reticent to use things like WhatsApp,
because they don't normally have them on their phone and the people in their circle will be suspicious if they do have them on their phone, because there is so much talk of terrorists using WhatsApp and Signal. Even though, for you or I perhaps, WhatsApp is [00:30:00] very mainstream, you know.
It wouldn't be something that you would consider unusual or weird for somebody to have on their phone. Maybe the less common ones in the UK or the US, like Kik or Line, might be more unusual.
But, you know, these are still not nefarious; they're not only used for bad things. Anything can be used for bad things: white vans can be used for quite a lot of bad things, they're regularly used in terrorist attacks, and yet we don't ban those and we don't judge people for driving them. So there's this thing around tech where we've got this push for privacy versus this push for sharing, because also we can monetize it, and we can make people, you know, drop their barriers a little
and make them feel like sharing within an intimate circle is a good thing, or: what's to be feared from my car, my hire car? Really, you know, I trust the company; I'm not even thinking about who might use this after me.
Tim: Right, exactly. I mean, that was exactly my thing: [00:31:00] I hadn't really thought about this until it asked me a question to which my reply was
no, but then it wouldn't let me do anything. I mean, obviously I could drive, but I actually couldn't listen to my choice of music.
Stella: Yeah, I think so as well, because I hired several cars when I first moved back to the UK, and on a few of them, you know, you went into Apple CarPlay or whatever and you had five or six different people's phones in there; I was astounded. And I think if we were already saying to people, right, change the device name on your phone, make it not your name,
so that if people are on the train looking for a hotspot, for a free Wi-Fi network, they can't find you; they don't know you're on that train.
Tim: Right? Right.
Stella: And then also it doesn't relate straight back to you if you do happen to have that issue.
You know, my phone isn't named after me. So already, for those kinds of things... [00:32:00] But yeah, I think if people saw that, if they saw Debbie's, Mike's and Steve's iPhones already in there, they might think, oh, look at all those people's details. But I don't know whether it stops you from syncing your own phone, or gets you to take the care to delete it afterwards.
Tim: Yeah. I mean, I think the thing I'm particularly interested in is the sort of non-obvious stuff that you're sharing accidentally. My favourite one at the moment is probably notifications, these things that come up even though the phone is locked. So if your phone is sitting on the kitchen table and you are on the laptop and you buy something, and it comes off your credit card, but your credit card [00:33:00] is sending notifications to your phone, then the person who is in the kitchen knows that you've just bought them a holiday, on your phone, right in front of them.
It's this kind of weird oversharing, but it's these sort of little routes that you wouldn't necessarily have thought about that your data now takes.
Stella: Yeah. This is why I'm no fun at parties, because this happens. You know, you meet people and they start to ask you what you do, and then you're either completely honest or you make some kind of story up, because you don't want everybody saying, oh, scary,
depending on who you're with. But yeah, then you end up sitting with people who have a preview of every single notification on their phone, even on the lock screen, like you're saying, and I end up saying, why have you got this? Please, please stop this.
And they say, I don't care. And I say, yes, but what if, like you said, you wanted to surprise your husband with a surprise 40th, or a 60th, birthday trip to, you know, Bora Bora, and there it is: I'm leaving a hint for my significant other and giving everything [00:34:00] away. People don't think about that, but this is designed like that. You know, it's important that your banking app communicates with you about fraud, and you can turn off all of these notifications,
but at the same time it's kind of useful to have them, because then if there is fraud on your account and you go several days without really noticing it, your bank might come to you and say, yeah, but you turned off notifications. Because I know somebody who had an issue with their car very recently when a small warning light came on,
and she took it to the garage straight away, because she could, and because she's like that; she's very cautious about these things, where other people might not be. I certainly ignore most of the warning lights on my car. And when she got there she asked, is it under warranty?
They said, that's what we have to check, and she thought that meant they were going to check the paperwork, and she said, I know it's new, it's brand new, I've had it six months. They said, no, it depends on how quickly you brought it in.
Tim: really? Wow.
Stella: Yeah, so I think sometimes we're tying people in. I mean, I don't know; I haven't read the terms and conditions [00:35:00] of my banking app too closely, and I am really serious about this kind of thing. But I wonder
if there are things like that coming, or already there, where if you're not reacting to warnings then, you know, maybe you'll be penalized somewhere. So we've got people just leaving all the notifications on because they don't want to miss something. So I'll leave that one on for NatWest or whatever, because
it's useful: I want to know if somebody's spending on my account. But does that necessarily have to be a message preview, and do all your direct messages and things, your Twitter notifications, your Facebook messages, need to come up? And then of course it reveals who you're talking to,
sometimes the messages themselves, what platforms you're using, that kind of thing. It's all in there, isn't it? And I would say the vast majority of my [00:36:00] non-security, non-privacy friends have phones which regularly display everything.

Tim: right.
Yeah, and my thing with that is, it's all very well if the phone is the only path into that service, right? Your phone previewing your WhatsApp messages is a risk, but it's not totally unexpected. Unexpected is where you've got an account that's shared across multiple devices, some of which you may have completely forgotten about. Like, I have a tablet whose charging connector broke, and I realized a little while ago that I can actually charge it wirelessly.
So I have now charged it up, and the home screen was full of messages: everything I'd done for the last three months is backed up on there. And I could easily have forgotten that and given it to, you know, a kid or somebody, and it's really still logged in.
Stella: And it will tell you as well; my new phone will tell me what [00:37:00] has been recently searched on my iPad,
just yesterday, and I'm thinking, you know, it's all linked to my account, so that's fair enough. And I occasionally let my children use it, and eventually I will give it to them, but they're young, and things like that,
so it's a bit different. But if I was letting somebody else use it, which I might, like, here, just use my iPad to do that search, then I'm able to see what they've been searching for, which I wouldn't necessarily want to, or have the right to; I mean, they're just using my device.
Unless they were doing something illegal or whatever else that I'd want to be informed about. These are very small things, but they matter when they stack up.
Tim: Yeah, exactly. So my example with that one is that if your main email account is Gmail and you're logged in on that [00:38:00] tablet, then pretty much all of your password recovery is open to abuse through that single thing.
So if I had given this tablet to somebody else, not thinking about that, then they could basically generate new passwords for all my accounts using it. It turned out that's not true, for a good reason, but in general that would be possible, and you don't think about that single kind of thing:
that being logged in three months ago on a particular device is actually really crucial to, you know, your banking security or whatever. The way it bleeds across devices is the thing that I think is really odd and unexpected. It's not quite anthropomorphic; it's something about them being physical devices.
You hold this thing in your hand, and somehow your mental model is that it [00:39:00] uniquely contains that data, and of course it doesn't have to.
Stella: No, and you're completely right. I think it's one of those things about using accounts across devices and things like that, and then who you're sharing with and how much you might want to share with them, even though you might be linked to them in some way. It's a bit like sharing your calendar.
Tim: I think what we are lacking is this sense of graduation. What I liked about what Netflix had done was this acknowledgement that this was behaviour that people would want: they would 'lend' their account to family members on a shared device, but allow them to be compartmentalized, so that
my preferences don't necessarily bleed into my daughter's preferences and vice versa. So I don't end up watching K-pop and opera, and she doesn't end up watching sci-fi, or whatever.
Stella: Yeah, definitely. [00:40:00] I would give money to Spotify to be able to recognize that if I'm listening to Disney or the Spider-Man soundtrack, that deviates from my standard listening.
Well, I don't think I should ever admit to that; it gives away far too much about myself. But, like, today I was on Twitter, I always am, and they put content warnings on Game of Thrones spoilers. They don't do any of that every time there's a terrorist attack:
it takes them hours, days, even months to block content which might be very, very disturbing, and you're still allowed to do very bad things [00:41:00] and make very discriminatory, awful, harmful comments. And I think that's another thing: we're talking about how 'difficult', in inverted commas, it is to make
good privacy choices and help people to do that, but then we're completely capable of doing it when it really matters. You know, Game of Thrones is a massive revenue maker; they don't want to offend anybody who's making a really popular programme, so they block that. They're able to say, well, we don't want to upset people who may or may not want to see spoilers, and I'm thinking, wow, you know,
I wish the biggest problem in my life was whether or not I saw a spoiler for a programme I like; I could probably get over that. But it's the same way that when I once uploaded a clip from a nightclub, it had 20 seconds or less of a recording of an artist in the background, and Facebook detected that.
Tim: Oh yeah,
Stella: Yeah, so I could not upload it. Well, I did actually manage to upload it eventually. I remember at the time, I mean I was slightly under the influence of alcohol, [00:42:00] and it wouldn't upload, and I was getting very cross with my phone for not uploading this clip of me dancing, because I obviously wanted to share it with everybody.
And the next morning I could see about six, seven or eight messages from Facebook saying, you know, do you have the rights to this, please check, and then the final one saying, we can detect that this has music in the background; if you don't have the rights to this music,
you should delete the post, or we will delete it, or maybe we will delete it for you; I can't remember. But it's that level: they can go to that level of protection and of thought when it comes to, I mean, censorship is a difficult word and I'm wary of using it, but there is a level of care that is taken to protect certain ideas, certain principles, certain things or certain institutions
that is not given to protecting other things.
Tim: Yeah, and my favourite example of that is living in Germany; it's a real eye-opener. [00:43:00] You know, companies are perfectly capable of complying with German law on Nazi symbols; they can cope with that.
They can implement that because they have to; it's German law. So it's perfectly possible to do these things. It's just a question of being pushed hard enough into doing them, and of thinking they're important for one reason or another. And I think that's kind of my aim with this conversation: just to push this stuff a little bit further up the list of things that you would think about when you were designing something.
You know, just try and think about how, as you say, things might get misused by the jealous boyfriend, or the person who's just wandered into your home and doesn't mean you any good, the burglar or whatever; there's a whole bunch of people.
And I think that's the thing that we're [00:44:00] sort of forgetting. We are assuming, and I'm as guilty as anybody, we usually assume that physical access to a device is the same as permission, and that's just not enough.
Stella: Yes, I think so. Permission, consent, coercive consent. And when I speak to the people at UCL, you know, I've got a huge amount of respect for the work they're doing, and the NCSC as well,
really trying to show the harms, to think about the harms that can be done, and to think about human behaviour. It's very difficult to change behaviour as such, but I think we can certainly change what behaviour we facilitate and what we encourage. And also I think we need to change what harms we tolerate, because whenever I flag up things that are harmful or could cause [00:45:00] problems, and I'm not the only one,
you really have to yell. I mean, you really have to become a problem, like a wasp buzzing around, for them to finally swat at you and acknowledge you.
Tim: right
Stella: So, yeah.
Tim: Okay, cool. I think that's a really good place to leave it. Brilliant. Listen, thank you very much for that, I do appreciate it.