Quantified self and privacy

[00:00:00] Vim: Hi, this is Vimla, and you're listening to Distributed Futures. I'm here with Tim.
Tim: Hi, I'm Tim and this week we're talking about Quantified Self.
Vim: Yeah, so, interestingly, I think now there's almost an expectation that people will take more of an interest in measuring their day-to-day life. I've been running some workshops recently about what digital means to people outside of their working life, and probably nine times out of ten people will comment on Fitbits or trackers that measure their steps, that kind of thing. It's interesting to see, when I question what that means to them,
the only answer you really get is "it's good to measure my health, I like to know what I'm doing every day". That's kind of it. There's no [00:01:00] understanding of where that data sits or what happens to it afterwards, or any kind of comparison either.
Tim: Yeah, I mean, I think it has a real huge potential, and I think we're just on the edge of it. I was talking just now, actually, to a couple of medics about
a lot of medicines being prescribed based on bulk statistics. So, you know, this question of whether everyone over 60 should be taking statins and things like that. The statistics on that are good, but only en masse. So it's like, if you knew more, if you were actually naturally healthy because you're doing enough exercise, then maybe you don't need to take this particular drug or whatever.
But because they don't have those statistics on you individually, they only have them in bulk for the population, they prescribe for the whole population. So there's a [00:02:00] potential that if you did know, and you could prove that you did enough exercise in a week or whatever, then that might affect,
you know, how you manage your aging, for example. A whole bunch of other potential medical conditions could be helped by knowing more about your personal circumstances. We always lie about, you know, how much exercise we do and how much we drink, all of those things, so actually having some real numbers would be kind of interesting, because I think we don't know that.
Vim: We don't know that, and I think we're not encouraged to use it for that either. It is interesting that you're speaking to medics about it, because I wonder, is that kind of data recognized as evidence? Like, how much sway does it have if you were to present a doctor with: well, here's all the exercise
I've done over the last six [00:03:00] months, here's how my heart rate looks, but these are the symptoms I'm still having. I wonder what the interplay with that would be.
Tim: It's amazingly topical, because Apple have just released this thing on the Apple Watch that does an electrocardiogram. So it actually graphs your heart rate.
And one of the features is that it will produce a PDF that you can give to your clinician. Wow. That's an explicit feature: here is a standard-format document that you can give to a clinician to show them, you know, your heart rate and whatever. So that interface, how you present this information to clinicians,
is something Apple in particular are starting to think about. I think it's fascinating.
Vim: It's really fascinating. I'd be interested to know who owns that data. Is your heart rate [00:04:00] then owned by Apple? Are they collecting and aggregating that across the board?
Tim: So Apple, I mean, I can't speak for them obviously, but they've got a reputation for being much more interested in it being your data.
It's your data, stored in your phone. I mean, it's collected by the watch, but it's transmitted to your phone, and it's stored on your phone in a way that Apple supposedly don't have access to.
Vim: yeah.
Tim: And then the algorithms they run on it, if I understand it correctly, run on the phone, basically using the phone's graphics processor to do the analysis, and they've got a new AI engine.
I'm sounding like an Apple fanboy, but I think it's really interesting that they're taking a totally different tack on this from some of the other organizations, from the more cloud-based companies who are centralizing this data in a much more aggressive way than they [00:05:00] are. So I think that contrast
really comes to the fore this week with that.
Vim: Yeah, absolutely. I just find the whole thing really interesting, particularly when we start thinking about tracking what we're eating alongside the exercise that we're doing, and what that actually means for the decisions we're making, like the real
conscious knock-on effect that it has, if at all.
Tim: yeah
Vim: For me personally, it probably has the opposite effect. I know that I will have run whatever in a day, and therefore I'll be like, oh well, that just means I can eat more, rather than the more positive thing of, oh, I've just done some exercise, and leaving it at that.
Tim: I think the most positive outcome from some of those things is you thinking, well, actually, I haven't really done quite enough [00:06:00] today, so I'll walk to the next bus stop along rather than take the first bus that comes. For me, that's the level of useful feedback one could get, rather than, you know, oh, I've got to get a hundred thousand steps a year or something. Those big targets, I think, are unattainable: either you're going to meet them or you're not. But some of the smaller targets, a little nudge,
I think those sit in that space. You know, I could see that working for me, although I can't stand having anything on my wrist. I can't type when I've got something on my wrist, so, you know, I leave my watch off.
Vim: Yeah, but you still wear a watch, no?
Tim: No, I've not worn a watch in years, [00:07:00] because I own one but I never wear it; I'm afraid I'll lose it.
Vim: Yeah.
Tim: I'd take it off and leave it, like, in the co-working space or whatever, and then never find it again. I got through several watches that way, and after that I just gave up on it.
Vim: So I've got a Garmin fitness tracker, so all of this is really interesting for me,
particularly because there's such a differentiator between devices in how they measure steps and heart rate and all of this stuff. I actually went for a run with my partner, and we did exactly the same route, and I think there was a point-three-mile difference in how far they both said we had run.
Tim: So do you have to calibrate these for, like, your leg length and your average stride and that kind of thing, or what?
Vim: Well, this was based on GPS, so it was the distance run rather than the raw step count. I would have assumed the step [00:08:00] counts would have been different, but yeah, just based on the GPS,
it was different distances that we'd run.
Tim: That's crazy. Well, I suppose they might have assumed a slightly different path. I mean, were you running in a straight line, around a circle, or what?
Vim: A straight line.
Tim: That's crazy. There's no reason why they should be different.
Vim: Yeah, which makes you then think about what the differences are between all of the devices, and how accurate they are in telling you
information about yourself. I think a few people have done tests on this, wearing multiple devices on one wrist throughout the day and seeing what the differences are, and there are big differences.
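A quick aside on why two GPS devices can disagree on the same straight run: a tracker reports distance as the sum of straight-line legs between its GPS fixes, so a few metres of positioning noise per fix makes the recorded path zigzag and the sum grow. Here is a toy sketch in Python; the coordinates, sampling rate, and noise level are all made up, and this is not any vendor's actual pipeline:

```python
import math
import random

def haversine_m(p, q):
    """Great-circle distance in metres between two (lat, lon) fixes."""
    r = 6371000.0  # mean Earth radius, metres
    lat1, lon1, lat2, lon2 = map(math.radians, (*p, *q))
    a = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def track_length_m(points):
    """What a tracker reports: the sum of straight legs between fixes."""
    return sum(haversine_m(a, b) for a, b in zip(points, points[1:]))

# A perfectly straight ~1 km run due north, one fix every ~10 m.
clean = [(51.5 + i * 0.00009, -0.1) for i in range(101)]

# The same run as a noisier device might record it: each fix nudged by
# a few metres of GPS error, so the recorded path zigzags slightly.
random.seed(1)
noisy = [(lat + random.gauss(0, 0.00003), lon + random.gauss(0, 0.00003))
         for lat, lon in clean]

clean_m = track_length_m(clean)   # close to the true 1 km
noisy_m = track_length_m(noisy)   # systematically longer
```

Zigzag noise adds length on average, and smoothing or a different sample rate changes it again, so two devices on the same run rarely agree exactly.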
Tim: It's like, we have this joke in our house about how, if you look at a weather site and you don't like the [00:09:00] results, like, you know, it says it's going to rain today and that's not what you want to hear,
you just go and look at another weather site until you find one that agrees with what you want.
Vim: I must try that.
Tim: I mean, you know, there's a limit. It ends up being a two-out-of-three thing: well, okay, the majority think it's going to rain, I'll have to accept it.
But I suppose what that says is that we're not totally rational about data that affects what we want to do.
Vim: Yeah, absolutely. And I think one of the big things that comes up in the interview that follows is the collection of data, and what happens to it after a long amount of time, when the organization that holds that data decides to change their policy on it.
Tim: Right,
Vim: But what are you then left to do as a consumer, like, to make those [00:10:00] decisions? So the example in the conversation is 23andMe, and the announcement they made that they're partnering with a pharmaceutical company, to use the data they've collected to make some kind of large-scale pharmaceutical decisions.

And it begs the question: if you had known, when you first signed up with 23andMe, that that was what was going to happen five years later, whatever the time distance was, would you have signed up? The decisions we make in real time about what data we're giving away, or how we're using it,
are kind of up to the discretion of the company years down the line.
Tim: Yeah, I mean, yes, except it was always pretty obvious that was going to happen with 23andMe, I think. You know, what else did you think they were going to do with that data? And I suppose there are worse [00:11:00] places
they could have given it to; they could have given it to the insurance companies. But yeah, I take the point. I think it is a problem that you don't know where that data is going to go. And that's, again, the thing that Apple and a few other people are trying to do: trying to say, well, it never leaves your phone unless you want it to, and the transfer medium is really obvious.
It's a PDF that you are giving to your physician, so it's really clear-cut what's going on. Whether they will continue to succeed in doing that, I think, is against market pressure, because they're basically being undercut by people whose devices are subsidized by the fact that they're selling data to other people. So that's a competitive issue that's hard for them to deal with, but it's a tricky space, this, I think. And [00:12:00] you know, what future use is going to be made of it is kind of
unknown, really. We don't know what use it could be put to. Particularly with things like 23andMe: you don't know what those genetic markers mean. They might discover later that there are genetic markers for particular diseases, so the fact that you gave 23andMe your data ten years ago might affect your insurance tomorrow, because somebody in a lab discovers something this year.
Vim: Yeah,
Tim: And it's not even risks you already know about; that's the point I'm trying to make.
Vim: Yeah. Do you think it's something that people should consider, or just not even bother with?
Tim: It would personally put me off collecting that data in the first place. Unless you're really [00:13:00] clear where the data is going to go and what it means, I would tend to not collect it, or not allow it to be collected.
I mean, that's what the US military are now doing: basically, on-duty or on-base soldiers are not allowed to use any cloud-connected fitness device.
Vim: Yeah,
Tim: Because it gives away too much information about where that battalion is or whatever, you know. It's just a huge secrecy leak.
It's kind of interesting, actually. I'm currently reading a book about the Second World War and Enigma, and a lot of it is about code-breaking, but a lot of it's about operational security: you know, people just being lazy, or letting data out that shouldn't have been let out, data that reveals things that
can cause a [00:14:00] whole convoy to get sunk, or not. There are huge outcomes from what felt at the time to be relatively minor security leaks. I mean, there's this thing about how
they basically managed to break the code on a regular basis because there was one weather ship that always used the same phrase to start its messages. And so, once they learned that about the weather ship, they had a known message to decode against, and that just made the problem
hundreds of thousands of times simpler. That basically gave them two years of breaking this code. So something that seems like a really trivial breach of protocol can actually result in massive damage. And I think we're in that space here: you know, data you're [00:15:00] giving away now that you think doesn't matter may actually
affect your future pension rights, or your ability to get insurance or a mortgage or whatever. And without you realizing it, ten years down the line it may stop you doing something you really want to do.
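An aside on why that one repeated phrase was so devastating: an Enigma machine never enciphered a letter to itself, so a guessed plaintext (a "crib", like the weather ship's stock opening) could be slid along an intercepted message, and every offset where a crib letter lined up with an identical ciphertext letter was impossible. A minimal Python illustration with an invented intercept; the real Bletchley process was far more involved than this alignment step:

```python
def crib_offsets(intercept, crib):
    """Offsets where the crib could sit. Enigma never mapped a letter to
    itself, so any alignment with a letter-for-letter match is ruled out."""
    return [i for i in range(len(intercept) - len(crib) + 1)
            if all(c != p for c, p in zip(intercept[i:], crib))]

# Invented six-letter intercept; the guessed plaintext is "WET", the start
# of "WETTER" (German for "weather").
possible = crib_offsets("QFZWRK", "WET")  # offset 3 is impossible: W meets W
```

Pinning the crib down to a few candidate positions is what then let the bombes test rotor settings efficiently, turning a hopeless search into a routine one.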
Vim: Yeah. I also wonder if, you know, it's better in some ways if insurance companies or
brokers or whatever use the pure data, as it were, rather than putting the onus on the user to fill out loads of forms and all of that kind of stuff. I wonder if that is what the future should be.
Tim: Well, I think only if it's absolutely clear what use that data is going to be put to, and that it can't be changed
afterwards. It should be [00:16:00] used for the thing you've handed it over for, assessing your risk or whatever it is, but not for future use. That future, undefined usage, I think, is the problem. People can make a reasonable assessment about whether they want to tell their doctor something, or their insurance company something,
in that context. But what they don't know is what it will later mean, because this is an evolving world; you don't know what the downsides are. And I think unless people are very clear about that, basically that they will destroy the data after they've used it for the thing that you've agreed to have
it used for, that's a problem. If after that they keep it and reuse it for other things, you don't know what those things are going to be.
Vim: Yeah.
Tim: I'm a bit of a, I don't know what the word is, but I'm a bit of an [00:17:00] extremist on this stuff. That's kind of where I've ended up these days.
Vim: Do you think, then, that the general population knows enough about data and how it's used?
Tim: Oh, good grief, no. Because we don't know; that's the thing, we don't know how it's going to get used. We don't know how bad it could be. Even the experts in the field don't know how bad it could get. I mean, some of us have thought about it a bit, but there are all sorts of weird spaces. Like the guy,
I think I mentioned this at some other point in this podcast, the guy where there was a murder conviction that was based around Fitbit data. I can't remember if it proved he was or wasn't there, based on where his Fitbit said he was.
Vim: That's interesting.
Tim: Right, right.
So, I mean, it was nothing to do with his fitness; it just geolocated him, more proof that he was [00:18:00] or wasn't there. I don't remember the details, but I remember thinking, good grief, that's something that you wouldn't have associated with a fitness tracker.
Vim: Yeah.
Tim: Yeah, I should really know the details of that; I'm being way too vague.
Vim: Maybe something happened with a fitness tracker somewhere. Yeah. Yeah.
Tim: It's a challenge for Google, basically, isn't it? Like, can I Google that and come up with an answer? So yeah, I mean, I think it's a huge area, and one we need to keep an eye on. But I think also the tools for keeping that privacy, or for informing people what it is they're giving up, aren't there either. So it's not just
an education problem; the tool set is barely there.
Vim: Yeah,
Tim: Maybe we should find somebody to talk to who deals in tools in that space.
[00:19:00] Vim: Yeah,
Tim: I'll go digging, see if I can find somebody.
Vim: No, that sounds great. I think that's a really good point to move on to the conversation with Ian,
because he covers a lot of this in his talk.
Tim: Well, I'm looking forward to it.
Ian: Okay, so I'm Ian Forrester. I work for BBC R&D, and I'm very interested in the future, and in a distributed future which is democratic.
Vim: What do you mean by "democratic"?
Ian: I'd like to see a future where it's much more democratic, so people are able to choose. Right now, it seems like things are just happening, and there aren't many people active in shaping the future. So I'm a big believer in that [00:20:00] quote about the future.
I can't remember exactly what the quote is, but it's all about: you make the future. It's not like it just comes along and hits you. So, yeah.
Vim: Yeah, that's really interesting. What makes you think it's not democratic now?
Ian: Well, I think to have a real democratic kind of opinion on this stuff, and to vote on some of it, you kind of need to be informed.
Vim: Yeah,
Ian: And there's a lot of technology right now, especially from some of the big corporations, where they're kind of hiding stuff. So it's very hard to find out exactly, to get a real, you know, a real feeling or a real idea about what's actually happening. [00:21:00] So you can't make that decision, that kind of responsible decision, because you don't know.
Vim: Yeah,
Ian: I think a clear example is what happened with Google and the location data stuff. Now, it kind of logically makes sense, but a lot of people thought that when you pressed the button, it would turn off location.
Vim: Yeah,
Ian: They never guessed that it would still load up and do a quick ping of where you are. It's interesting what data it gets about you, you know. So I think that's an idea of where my thinking is.
Vim: Yeah. And do you think, if more people knew about that stuff, it would change their habits of using their technology?
Ian: Um, [00:22:00] no. I think it's one of those things where, ideally, I would like people to be aware, and then they would change their behavior.
I kind of thought that when Edward Snowden came out and said, look, all your data going via the US is being snooped upon, and revealed all that stuff, I thought, wow, people will jump on this. But there's a whole bunch of other factors which apply. So, for example, peer pressure and social pressure are massive.
So even now, people say, oh, why are you not on WhatsApp? Because I value my security, I value my privacy. I do not trust what they're going to do; I've read those terms and conditions. You [00:23:00] know, I don't want to be involved in that. But I understand most people are like, well, actually, it's a good way of talking to my friends. And I think it's more about
trying to make a decision, but I want it to be an informed decision, rather than just following along blindly.
Vim: Yeah,
Ian: It won't change things on its own, but I think it can with enough people. So, for example, my dad was asking about WhatsApp, and I told him exactly what it does, and he was like, oh, I don't know if I want that.
Vim: Yeah,
Ian: But now he can make a more informed decision, not just go along with the story.
Vim: Yeah, absolutely. And do you think the onus is on the company or the individual?
Ian: both?
Vim: Yeah.
Ian: Yeah. The thing is, though, if [00:24:00] you focus on kind of regulating the company and all of this stuff, then you just get into
"I told you so" territory. People need to be informed, and they need to have that appetite for being informed, to want to be informed.
Vim: Yeah,
Ian: It's tricky. This is why my previous answer was very much, I think it could, but I don't think it will, because a lot of people are very time-poor, you know, and some people
are money-poor, you know, and it's just too difficult. If it's between speaking to your mates for free and paying ridiculously high costs on your phone bill, then obviously, you know. But yeah, [00:25:00] it's a tricky one, you know?
Vim: Yeah. So I remember one of the first talks I saw you give was at a BarCamp years ago, and you were wearing one of these cameras around your neck that took a picture every five seconds or so.
Ian: A long time ago
Vim: It was a really long time ago. So what has shifted in your mindset about collecting that kind of data for yourself, and then the wider ramifications of doing that on a bigger scale?
Ian: So, yeah, I think this comes right back to the Quantified Self stuff.
I think the thing I found interesting about that was, you know, a lot of people had never seen that camera before. A lot of people, especially in the UK, won't come up to you and say, hey, what's that [00:26:00] thing? Maybe other cultures would say something. But the people who did ask me were kind of asking why I was doing it.
Vim: Yeah,
Ian: And I'd basically say, yeah, it's no use to anyone, it's all public. And if you ask me not to film you, that's fine; I can just delete you.
Vim: Yeah,
Ian: But there's an element of trust, and I think because it's my data, data that I've collected, it's a different thing. The thing, and I think this is where you're heading, is when you've got a third party involved.
So, for example, there's another camera which does a similar thing, and if you use it, it automatically syncs those pictures to the cloud, to someone else's machine.
Vim: Yeah,
Ian: And so [00:27:00] I chose that camera because it doesn't do that, because I don't want another person involved in that.
And I think that's really important. I get to view or to delete without anyone else being involved.
Vim: Yeah, that's really important, particularly, as you mentioned, the trust other people have in you taking those pictures, the trust they've put in you as a stranger.
Ian: Yeah, exactly. There's a social trust in it, you know. And if it's a thing that's just recording, and you can't see who the person is, there's no way to have that kind of discussion.
That's a very different thing from me walking around with a camera.
Vim: Yeah, absolutely. So we're kind of touching on [00:28:00] Quantified Self. For anyone listening that doesn't know what the Quantified Self is, would you mind giving a quick description?
Ian: Okay.
So the Quantified Self is people who basically track themselves, track their own data, for their own benefit, to know more about themselves. A very obvious example is step counting. You know, how many steps do I take every day? I take, say, six thousand steps a day, the government says I should be doing ten thousand steps a day, the machine records it, and that means I can then look at it and go,
okay, I should do another block, run around the block, for example. That's a very simple example of the Quantified Self.
Vim: Probably the most common.
[00:29:00] Ian: Yeah, it's probably the most common. Next to it, food tracking has also started to become quite big. But I believe steps are the most visible, because there are gadgets that will do it for you pretty much automatically. I think the interesting part about stuff like the Fitbit, though, is that you have to bear in mind that you've once again got someone else in the middle of that.
So what Fitbit do, and this is a discussion that we have quite a lot in the Quantified Self community, is that your Fitbit decides what a step is.
Vim: Yeah.
Ian: Because, I mean, when you actually look at the data, it's messy. I've got another tool which shows me the actual raw samples,
and it's very [00:30:00] messy. So what they do is run an algorithm over it and say, right, we think that is a step. We don't know for sure, but we think it's a step. And there have actually been quantifiers who have put on multiple gadgets and applications, walked around the block a few times, and got different step counts.
Vim: That's interesting.
Ian: Yeah. So how can you get different steps if you're wearing them at the same time? It's the positioning of the device, you know, and what algorithm is being used, which is the secret sauce of the likes of Fitbit, so you can't have any insight into it.
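The "your Fitbit decides what a step is" point is easy to make concrete: a pedometer reduces a messy accelerometer signal to a count, and the count depends entirely on parameters like the detection threshold. A deliberately naive Python sketch with made-up numbers; real firmware filters, gates on cadence, and so on, far more carefully:

```python
def count_steps(magnitudes, threshold):
    """Count a 'step' each time the signal rises through the threshold.
    The threshold is, in effect, this device's definition of a step."""
    steps, above = 0, False
    for m in magnitudes:
        if not above and m > threshold:
            steps, above = steps + 1, True
        elif m < threshold:
            above = False
    return steps

# Made-up accelerometer magnitudes (in g): firm strides mixed with
# shuffles and sensor noise.
trace = [1.0, 1.4, 1.0, 1.2, 1.0, 1.5, 0.9, 1.1, 1.0, 1.6, 1.0]

strict = count_steps(trace, 1.3)    # only firm strides count
lenient = count_steps(trace, 1.05)  # shuffles count too
```

Same trace, two thresholds, two different counts: neither is "the" truth, which is why devices on the same wrist disagree, and why having the raw samples rather than the summary matters.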
Vim: Yeah.
Ian: Yeah, that's quite scary.
Vim: Yeah, that is.
So what do you think needs to happen for that to become less [00:31:00] scary?
Ian: So I think the most important thing, and it's the true sense of the Quantified Self, knowledge through data, is that you have to have the actual data. What Fitbit do is provide you with the information,
the analysis of the data; they provide you with "this is how you stepped, this is a step". But what you actually really need is the data itself. You should be able to give the data to Fitbit, if you trust Fitbit, and it will go, oh, you did fifty thousand steps yesterday, wow, you're amazing.
Or you give it to someone else you trust, or another service you trust, and they say, actually, no, you just did your 12,000 steps. Good going, but, you know, you could definitely do [00:32:00] more.
Vim: Yeah,
Ian: it's it's about having the data. Yeah, once you got the data and you can kind of shop around that's that's what it gets really interesting.
Vim: So at the moment the market is split: you buy the device and then expect the service. But what you're saying is, you have the data, you are your data, and you give that to other services.
Ian: Yeah, absolutely. This is why I find GDPR really interesting, because you can actually have the data. Also,
I was one of the founders of the data portability group a long time ago.
Vim: amazing
Ian: Exactly, yeah. But actually having the data, even though you may not understand it, and you wouldn't want to have to read it, means being able to then go: oh, I trust you, I'm going to give you this subset of data, and then see the results.
Vim: Yeah,
Ian: And [00:33:00] then you get this kind of ecosystem where they're competing against each other. So, just because you bought a Fitbit, you're kind of locked into Fitbit. But what if there's a new start-up doing fitness tracking, with amazing graphs like you've never seen before?
You should be able to choose to go over to them. And, you know, I kind of doubt that Fitbit will let you take your data, and even if they did give the other company the data, they wouldn't give you the actual data; they'd give you the summary of it.
Vim: Yeah, but you want the raw data.
Ian: Yeah, the raw data.
Vim: And do you think that's a democratized process? What I mean is, do you think that kind of access and knowledge is accessible to everyone?
Ian: Um, hmm. [00:34:00] I don't think so. It's a hard one, because a lot of people will say, oh yeah, it's out there, it's out there. But it's not easily grokkable. It's not very easy to understand and use; you kind of need quite a technical mindset to understand what it's actually doing. And there are all these other limitations: the obvious ones, being time-poor and money-poor, you know, those kinds of things are massive factors.
Also, depending on which country you're in, you may not have access to it. Or if you can't speak, for example, English, then being able to access most of the internet is quite difficult. You have to do lots of [00:35:00] translation, and the translations are not very clear. So those kinds of things are always a problem.
It's a shame. I mean, the Wikipedia project is great, but if you look at the other languages,
Vim: Yeah
Ian: some of the Wikipedias have very, very few pages, which is a real shame.
Vim: Yeah.
Ian: Yeah,
Vim: And I think the same is true of a lot of the DNA services.
A member of my family recently did one.
Ian: which one.
Vim: It was 23andMe, which we'll go into.
Just to carry on: what was really, really interesting, what I found, as an example of one of the [00:36:00] unconscious racisms or prejudices in tech, is that my brother got, kind of, 99.9 percent South Asian, which is a huge area of land, like, that is not specific at all. Whereas his partner, who is white, was able to break hers down into, like, 20% Polish, 30% a particular part of Scotland; the detail in her analysis was unbelievable compared to ours. And to me that just highlighted how important
that kind of difference in where you come from is to how we access tech. It's just not accessible to us to have that level of detail, because that kind of technology isn't accessible to the majority of the South Asian population.
Ian: Yeah. You also have to bear in mind that, you know, [00:37:00] 23andMe is based in America, or was based in America.
Vim: Yeah.
Ian: These things all have a bearing, but it's clear that there are differences, and one person's experience of a service will drastically change based on so many different factors.
Vim: Yeah,
Ian: All those things really have an effect on your experience, and on whether you keep going.
I'm very surprised anyone uses 23andMe after they've actually read the terms and conditions.
Vim: I think, as well, it was before the news broke recently about the pharmaceutical company partnership.
Yeah, I'm not sure. I don't know if the Ts and Cs were ever read.
Ian: Yeah, I mean, I know, I get it, right? [00:38:00] Yes, I want people to read the terms and conditions, right? But I also know someone who's dyslexic, and they're very hard to read. I've read enough of them that I can point out where it sounds a bit dodgy: that doesn't sound right, what does that look like against everything else I've seen? I'm not a lawyer; I've just read enough that I kind of understand where things are going. But also, let's be clear, the Ts and Cs are not good enough. This is what GDPR is hoping to kick into line.
It needs to be describable in a way that's not buried in a load of Ts and Cs; it needs to be very clear what it's doing. "It was not very readable", yeah, that's not a defense.
Vim: No, [00:39:00] and that's not helping anyone.
Ian: No
Vim: And it's in line with accessibility as well. But actually, going back to 23andMe,
what was your take on the pharmaceutical partnership?
Ian: Well, so I looked at 23andMe a long while ago and I was considering it. I've actually got a blog post saying I was considering it. And I looked at the terms and conditions. Okay, every set of terms and conditions generally says "we have the right to change these terms at any time", but if I remember correctly, these seemed to say
that if we change the terms and conditions, then you have no way of opting out,
Vim: right
Ian: which means that if they suddenly get bought by Google or Amazon or Facebook or whoever, [00:40:00] then your data will be part of whatever they decide to do with it, and that struck me as, well, I don't know if I like this.
Vim: Yeah,
Ian: Is it worth the results? From what you said, it sounds like it's probably not worth it to find out, you know, very little
Vim: Yeah
Ian: in return for giving them everything about me.
Vim: Yeah,
Ian: So that's probably the big one. But probably also the thing about the percentages.
They got done a little while ago for false claims, and the percentages were kind of way out of whack. So they'd tell you something like, "oh, you have a 33 percent chance that you will die of [00:41:00] diabetes", right? But that 33 percent, what is the scale? Is a hundred percent
"I'm going to die tomorrow"? What does zero look like? And even if the number was right, they basically got done for claiming things and not being very clear.
Vim: Yeah,
Ian: And the results, the reports, are less detailed than they used to be, because they're not legally allowed to claim that stuff now.
Vim: I see that makes sense.
Vim: And I think they also do a thing where you can opt out of accessing that information as well. So if they pick up that you've got cancer, for example, but you've opted out of finding that information out, you kind of have that and don't know.
Ian: Hmm. Yeah, that strikes me also as [00:42:00] "someone else knows, but I don't know".
Vim: Yeah,
Ian: You know, in my Quantified Self head, it feels like the Fitbit thing: wearing a Fitbit, they know all the stuff about my run but they're not sharing that information with me. And yeah, you know what I know, but it just feels like there's an uneven exchange happening.
Vim: Yeah.
Yeah, and then the next thing you know, Facebook's advertising cancer pills to you or something, and you're like, why is this happening?
Ian: Yeah. Yeah.
Vim: Yeah, I think it's really interesting that Quantified Self is focused very much on the "I" and on the data for you. But what really strikes me, based on that democracy that we want in the future, is that at the moment very few people, and I'm being very, very generalist here,
but [00:43:00] very few people understand what all of that data is, or what's being collected about them at any given point. What do you think the future of that looks like?
Ian: Um, hmm. So, to answer the question about the worry about it all being about you: yes, there's an issue with the Quantified Self there.
The nice thing about the Quantified Self is that people basically record and talk about their own data, and they try to make claims. So for example, some people will say, "I eat this every day at this time", and then use a bit of science to look at what happens when they don't eat it. There are a lot of factors in it, but they have the data; they're doing tests on [00:44:00] themselves.
It's like being patient zero. The important thing, unlike other places, is that it's "this is what worked for me".
Vim: Yeah.
Ian: This may not work for you. You know, we're all very different, and the recognition of diversity is very important. So that's why it's about you, and you focus on yourself.
Vim: Yeah,
Ian: It's knowledge through experimenting on yourself.
How people learn or can understand this or get into this, I don't know. I mean, there have actually been some talks about it when I've gone to some of the Quantified Self meetups, or to the Quantified Self conferences.
Vim: Yeah,
Ian: That's been interesting, because, for example, everyone thinks the Quantified Self
is quite a new thing. It's not; [00:45:00] it was just called self-tracking, or tracking. A lot of people have collected lots of stuff, or they've made notes in a notebook over time. But what the Quantified Self does is give you a framework where you can try experimentation on yourself.
Why do I start at this angle? What if I do something slightly different? And then track that, and then go back and try something else. So I think there have been lots of people tracking stuff for a long time. There was a woman in Germany, in the 1980s... not the 1980s, sorry, in the 1800s.
Yeah, and she was writing things down, and it's in a museum somewhere. So she kept careful [00:46:00] track of who would pass the door, that kind of thing.
Vim: Yeah.
Ian: So she was doing this kind of tracking of external data, but she didn't have the mechanism to analyse it, or to do something different with it.
So I think you can get into it quite easily without even buying any tech at all. Oh, I remember a guy at the first Quantified Self conference I went to who was tracking the conversations that he had. Yeah, with people. And he had this notebook, and he would mark down if he spoke to someone for one to five minutes.
He'd mark down roughly how many people were there, what sex they were, what the conversation was about, like a really quick, tiny entry, and it was colour-coded. And so he's got loads and loads and loads of notebooks, [00:47:00] you know, and he was going back through them and pouring that into a spreadsheet, which he could then use to analyse stuff: when he went to certain places, this kind of conversation took place. And he has that for a long period of time.
Vim: Yeah,
Ian: So I think there are ways to start and understand the value of data, and to understand the kind of scientific method and basic analysis, really.
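[Editor's note: the notebook-to-spreadsheet tally Ian describes, counting conversations by place, group size, and length, might look something like the sketch below. The places, group sizes, and durations are invented stand-ins, not the speaker's real notebook data.]

```python
from collections import Counter
from statistics import mean

# Hypothetical entries transcribed from paper notebooks: one dict per
# conversation, recording where it happened, how many people were
# there, and how many minutes it lasted.
log = [
    {"place": "office", "people": 3, "minutes": 5},
    {"place": "cafe",   "people": 1, "minutes": 25},
    {"place": "office", "people": 2, "minutes": 10},
    {"place": "cafe",   "people": 4, "minutes": 15},
]

# Count conversations per place -- the kind of question the speaker
# answered by pouring his notebooks into a spreadsheet.
per_place = Counter(entry["place"] for entry in log)

# Average conversation length, overall and for one place.
avg_minutes = mean(entry["minutes"] for entry in log)
cafe_avg = mean(e["minutes"] for e in log if e["place"] == "cafe")

print(per_place["office"], per_place["cafe"])  # 2 2
print(avg_minutes)                             # 13.75
print(cafe_avg)                                # 20
```

The point isn't the tooling: a paper notebook plus any spreadsheet gets you the same counts and averages, which is all the self-experimentation loop needs.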
Vim: Yeah, I think you hit the nail on the head when you said people are time poor and money poor. Because when you were saying that, even then I was like, oh, that's really cool.
That's really exciting, I would definitely want to do that. And then I thought, actually, I'm not going to spend the time going through my notebooks, and then typing that into a spreadsheet, and then learning how to analyse it. I would rather just buy an app that gives me an answer at the [00:48:00] end of it.
You know.
Ian: Yeah, and I think that's what the likes of Fitbit are there to do. You know, there are loads of these little companies that will do stuff for you, and even loads of apps that will do this for you. It's a matter of choice: what you're willing to give up and what you're not.
I think the thing for me is that it's really important for people to understand that it's always a transaction, and that the transaction could be in your favour or could work against you.
Vim: Yeah, absolutely. I think that's a really good point to close the conversation on. I think those are the wise words to leave people with to think about.
Ian: Okay
Vim: A little bit harrowing.
Yeah, is there anything else you'd like to add?
Ian: Yeah, I guess. [00:49:00] There's a lot more I could say; I feel like we've only slightly cut into it. But I think it really is just about being quite conscious about the transactions that are happening around you and your data.
Vim: Yeah,
Ian: That's probably the most important thing. With that conscious decision you start to think, actually, what am I getting out of this?
Vim: Yeah,
Ian: I always have that. I use a lot of Google services because of what I get out of them in return for what Google gets out of it. I don't know what Google gets out of it, but I get quite a lot out of it.
Yeah, but it's a conscious decision. It's not like I just walked into it.
Vim: Yeah. Yeah, absolutely. And I think, you know, we've [00:50:00] seen trends happen where people have become more conscious in other areas. As a general population we're getting more conscious about the food that we're eating, we're getting more conscious about plastic, all of these different things. I think data will have its time, where more and more people will
become more conscious of what they're giving.
Ian: Yeah, I think it's about the right timing, and what you'll see is that at some point there'll be apps that will suit the people who are time poor. You know: what's this thing I'm about to put in my mouth? Not only what calories and all that kind of stuff,
but also, where is it from? Does it fit with the rule sets of my life? You know, "I don't want to eat anything that's from factory farms", for example. Those are opportunities for startups and businesses and other companies, because I think [00:51:00] that's going to become a big part of it: people are so time poor, but they want to make these decisions.
They want to do the right thing.
Vim: Yeah.
Ian: They want to recycle, say, so an app that even helps with that, but done in a kind of ethical manner. Rather than just "we're now going to take all this data and we're going to mine it for as much as we possibly can, and a yes or
no is all you're going to get out of it", is the thing.
Vim: Yeah. Absolutely. That's great. Well, thank you so much for your time.