distributedFuture-4-ethics.wav
Tim: [00:00:00] So I'm Tim Panton, and this is the Distributed Future podcast.
Vim: And I'm Vimla Appadoo, and we're here today to chat about ethics.
Tim: So, where I was: ethics in technology. As technology gets more and more into the fabric of our society, the ethics of technological products becomes more and more important. And the question is, how do you,
as a product designer or an engineer on a product, inject your ethics into it without losing your job?
Vim: Yep, completely. And I think there are a few things attached to ethics that are really important to consider, like inclusivity. When you try and make a product ethical, it's usually set by your standards of what ethics are, not a universal understanding [00:02:00] of what ethics are. So how can you try and get that universal understanding into tech?
Tim: Is there a universal understanding?
I mean, isn't it too optimistic to think you can find that? I suppose all I'm really saying is that I'd settle for my own ethics getting into something: at least somebody's ethics, rather than none at all.
Vim: Oh, yeah. Absolutely. I completely agree.
I think it's very utopian to think there is a universal understanding of ethics. But as more and more technology gets built by lots and lots of different people, I think there will need to be a standard at some point, in the same way there is for data
now.
Tim: Yeah... is there [00:03:00] a standard? I'm not sure. What were you thinking of in terms of a standard: like the quality of data?
Vim: Well, GDPR is what springs to mind. Less a standard, more compliance.
Tim: Right. Weirdly, I'm no expert on this, but GDPR actually has a small amount to say about it. It says, and I'm going to get this not quite right, but it's something like: you can't have a decision that impacts your legal status
made purely algorithmically; there has to be a human component in the decision-making process.
Vim: Oh, wow.
Tim: I'm paraphrasing, and that's not the legal term, but essentially there has to be some way that you can appeal, for example, against some judgment that's come out of an algorithm.
You have to have [00:04:00] some way of appealing against it to a human.
Vim: Yeah,
Tim: You can't have a purely automated decision-making process that has legal ramifications for your life and well-being. Which I think is a really good start, though I'm not sure it's enough.
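The safeguard Tim describes (roughly the idea behind GDPR Article 22) can be sketched as a design pattern: the automated path is simply not allowed to finalize a legally significant decision on its own. This is a hypothetical illustration; all names and structures here are invented, not taken from the regulation or any real system.

```python
# Hypothetical sketch: decisions with legal effect must pass through
# a human reviewer before they can be finalized, preserving a route
# of appeal to a person. Invented names, for illustration only.

from dataclasses import dataclass, field
from typing import List


@dataclass
class Decision:
    subject: str                  # the person affected
    outcome: str                  # e.g. "loan_refused"
    legal_effect: bool            # does it affect their legal status?
    reviewed_by_human: bool = False
    appeal_log: List[str] = field(default_factory=list)


def human_review(decision: Decision) -> Decision:
    # Placeholder: a real system would route this to a case worker
    # and record who reviewed it, so the subject can appeal to a person.
    decision.reviewed_by_human = True
    decision.appeal_log.append("queued for human reviewer")
    return decision


def finalize(decision: Decision) -> Decision:
    """Refuse to finalize a legally significant decision untouched by a human."""
    if decision.legal_effect and not decision.reviewed_by_human:
        decision = human_review(decision)
    return decision
```

The point is structural rather than behavioral: no code path exists in which an algorithm alone produces a final decision that has legal ramifications.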
Vim: That assumes the human is more ethical than the machine, which in a very dystopian world...
Yeah, yeah,
Tim: I know what you mean. Yeah, I think it's very easy to get dystopian worlds just by not really...
Vim: Exactly.
Tim: ...by putting your head down and getting dug in. I don't know if [00:05:00] this is an experience you've had, but it's really easy to get dug into the maths of something, or the technical complexity of a problem, and not really stand far enough back from it to think about what it actually means to the users.
Vim: Yeah. Yeah, that's kind of my job to stop that from happening.
Tim: And how do you do that?
Vim: Speak to the users as early, as often, and as frequently as possible. Understanding their behaviours and their needs, and making sure that's part of the technology being built.
Tim: And how do you avoid picking the wrong users?
Right? Because it works for the users you've built it for, but implicitly you're potentially excluding somebody.
Vim: Yeah, I don't know if I have an answer for that. I think it's more about trying to understand the behaviour [00:06:00] rather than the person, if that makes sense. How are they interacting with it, that type of thing, rather than who they are.
Tim: Okay, so the detailed interactions rather than the demographics. And presumably those represent a kind of group of people rather than...
Vim: Yeah. Rather than "your 20-year-old white male", it's "you're a person who interacts with it like this", in the same way that a 30-year-old woman interacts with it. It's the same human interaction.
Tim: Right. I mean, I think that's very hard to do without consciously making an effort to go and visit other people's environments. It's quite interesting to see how, [00:07:00] occasionally, I as a technology person get brought up short by realizing that there's a whole thing I didn't understand about something.
Because I thought I knew the maths of it, or knew the computing of it, but then you stumble across the reality of it, and it doesn't match what you were doing in your code. I think bridging that either takes luck, being exposed to the different environment, or a lot of work consciously including other viewpoints.
Vim: Do you find that when you're working on your own, or when you're working in a team?
Tim: I think it's worse in a team, because there's more of a drive towards groupthink. One of the nice things about working on your own is that, because you don't [00:08:00] have a team around you, you're forced to go and talk to, not random people, but more people. It's like the thing about tourism:
if you go somewhere as a tourist, you actually see much more if you go on your own than when you go with somebody.
Vim: yeah.
Tim: Because you're forced to reach out a bit more. There are other sides to it: in a diverse, mixed team you get the other benefit, other people's viewpoints, and there are huge benefits to working in a team in terms of productivity and, you know, general, almost mental, health.
It's a nicer environment. But in terms of getting diverse inputs, being forced to reach out to people does spread the net a bit wider.
Vim: That's really interesting, because my experience has been the opposite. When I work on [00:09:00] my own, I don't sense-check, or I don't get that challenge.
Whereas when I work in a team, it's constant. It's, "Well, why have you done that? Why is it that way? Where is the evidence to prove it needs to be this way?" That's when it all comes in.
Tim: Right, I think that's partly to do with how long you do it for.
Vim: Yeah,
Tim: If you put your head down and work for six weeks on a project on your own, then in theory
you could do that without interacting with anybody for those six weeks, and you'd still do it. But after a few months you actually need to talk to other people, and that does inform things. I had the exact opposite [00:10:00] experience a couple of years ago, when we were writing the text for filing a patent, and I actually couldn't sense-check it. My normal response would be to run it past a couple of people to see what they thought, and I literally couldn't, because it would invalidate the patent application.
Vim: right? Yeah
Tim: I absolutely hated not being able to ping my trusted cohort and say, "What do you think of this? Does this make sense?"
I had to write the whole document without being able to do it. I absolutely hated that.
Vim: Yeah, I guess the big question, though, is: does that sense-checking actually make things more ethical, or is it better to just do what you think is the right thing? Like you said at the very beginning, some ethics is better than none.
Tim: Yeah, I think the trick is to see if you can empower people to feel like they can do that: that they can push back against something they're being asked to do that they feel is not ethical. [00:11:00] And I think we may not really have the structures in place in tech to allow us to do that. If you look at
the other professions, they all have professional bodies which are the side channel for raising this kind of issue. Engineers have professional associations they can go to and ask, "Is this ethical behavior?" And if you're a doctor, you get struck off for unethical behavior, which effectively means you can't practise. But there's no equivalent for software.
Vim: No, that's true. And that's kind of what I was getting at with a standard or compliance: some sort of auditing of software to see if it meets a certain bar. I don't know if it would ever work, but it feels like something's missing.
Tim: I agree [00:12:00] that there is something missing, but audit and compliance, well, they serve a purpose, but they tend to fix things retrospectively.
Vim: yeah,
Tim: By then it's usually too late. Whereas I think you can often get a sense quite early on that a project has sides you don't want to be involved in. One of the nice things about working for yourself is that you can get out of a project you don't want to be part of, whereas if you're in a big organization and that's your career, doing that can be quite career-limiting.
Vim: Yeah, absolutely. Does that mean there's a bigger culture shift that needs to happen around it, so that it [00:13:00] doesn't threaten your career but promotes it?
Tim: I think so. I mean, we're starting to see, around the edges, sort of semi-professional organizations. There's one in the marketing world,
I forget what it's called, that's starting to form, and then you've got things like the Guild of Makers. But in the spaces where we really need it, things like AI, I'm not aware of anything. What we really need is, you know, the Worshipful Company of AI Practitioners or something, who shun each other if they don't behave ethically, or if their software doesn't behave ethically, whatever that means.
Vim: Yeah, that's very true. And there's the example that everyone uses, of the Microsoft Twitter [00:14:00] bot that learned off conversations on Twitter and ended up being racist, sexist and homophobic within about half an hour of going live. It's that kind of thing that makes you think about the things we're building.
Tim: Yeah, really they should have known better; that was by no means the first time. And it was deliberate: people deliberately attacked it to get that result. So I think that's less disconcerting than some of the subtler algorithmic things, where you get a 10% worse grade if you're a woman, or if you're from, you know, northern France or something.
[00:15:00] Those are scarier.
Vim: Yeah. I saw a really interesting case the other day, where a Chinese woman's last name was just two letters, but the forms all had a minimum of five letters for your last name. So she couldn't fill out any online application forms for jobs or anything, because her last name wasn't long enough.
Tim: Right right.
I saw another one of those today, so it's obviously a common thing. Somebody was saying that one of these verification questions was, you know, "What was your father's middle name?" The answer was Paul, and he types in Paul, and the thing then challenges him:
"This is too short. It should be five letters or more." Well, now it's the wrong answer, right?
Vim: yeah, that's really interesting.
Tim: But it's not just that.
[00:16:00] Vim: Is that ethics, though?
Tim: The ethics comes in when you're designing that: if somebody wasn't given the time to challenge it and say, "But some people have two-letter surnames, we should add support for that,"
or didn't get the time to review what surnames can look like. So I think a lot of it is actually giving people the time and the space to self-verify what they're doing, because most programmers actually want to make a good job of it. Some of them are actively evil, but most of them want to do a decent job; where they don't, it tends to be because they're under time pressure. So I think the ethics is actually around how you manage people: how you give them the space and the support to let them feel they can make more ethical decisions,
[00:17:00] and encourage them to do so.
Vim: Yeah, absolutely.
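The two-letter-surname and "Paul" anecdotes above boil down to the same validation bug: a minimum-length rule that encodes an assumption about names. A minimal hypothetical sketch (not taken from any real form or system):

```python
def validate_surname_broken(surname: str) -> bool:
    # The flawed rule from the anecdotes: at least five letters.
    return len(surname.strip()) >= 5


def validate_surname(surname: str) -> bool:
    # Safer rule: reject only empty/whitespace input, and make no
    # assumption about how long a family name "should" be.
    return bool(surname.strip())


# A real two-letter surname is locked out by the first rule:
assert not validate_surname_broken("Wu")
assert validate_surname("Wu")
```

The fix is trivial once someone has the time and the mandate to ask "what do real surnames look like?", which is exactly the review step Tim describes being squeezed out by time pressure.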
Tim: I don't think I know how you do that, except that all the other professions have something. Pilots are interesting because they have this blameless reporting system, where you can report near-misses without losing your job.
Vim: That's interesting.
That's really interesting
Tim: And you're actually obliged to. Obviously there is a downside, but it's minimal, and you're absolutely obliged to report, so the ethics of being a pilot is that you do report near-misses, and you won't be criticized for that.
Vim: Yeah.
Tim: And we're definitely not in that situation in software.
[00:18:00] Vim: yeah,
Tim: So I think we're not really solving this. Or do you think we're working our way there? I guess the question is: do you think it's getting worse or better?
Vim: I think there's more of an awareness. I think it's definitely become something that's on the agenda now.
And as a result of that it's getting better. But I think we need to speed up the way it's happening, because technology, as always, is growing faster than we're able to control it. So something needs to be done before something very catastrophic happens.
Tim: Yeah, though the history of this is that actually nothing gets done until after a couple of catastrophic things have happened.
Vim: exactly.
I mean, it would make sense, for once, to learn from our mistakes.
Tim: Yeah. Yeah,
[00:19:00] Vim: and now I'm being very dystopian.
Tim: Well, I suppose it depends on whether you know that they were mistakes. That's not always obvious.
Vim: Ethics is always subjective, right? Like we said, there's no universal understanding.
Tim: Yeah, but there is agreement about outcomes, I think. You can usually point to an outcome and say, "Well, that's unfair," and get broad agreement about what is fair and unfair.
Vim: yeah.
Tim: But that's subtly different, I suppose. I'd actually really like it if we can find somebody in the AI world who will answer some of these sorts of questions, because I don't feel I know enough, [00:20:00] except to know that this is difficult,
Vim: yeah
Tim: and that it needs fixing sooner rather than later.
Vim: yeah, absolutely.
Tim: Cool. Well, I think we'll come back to this; it's an important topic. We've got an interview now which touches on this area, in a slightly broader space.
Hopefully that'll make people think a little bit about the ethical consequences of their actions, at least, and how to learn from them, potentially. So with that, I think we will move on to the interview.
Vim: Great. Thank you.
Tim: Okay.
Vim: Okay, bye-bye.
Noemie: Okay. Hi, my name is Noemie Lopian. I grew up as a [00:21:00] child in Germany and came over to England at the age of 13. I'm the daughter of two Holocaust survivors, although I didn't grow up consciously with that knowledge. I studied medicine here in England. I'm married, I've got four kids, and I'm blessed with grandchildren too. And that's me in a nutshell.
Tim: I understand that you're interested in bringing some of that history to current audiences, so that they understand the risks, and the horrors that could happen again if we're not careful and diligent. So maybe you could talk a little bit about the work you're doing in that space.
Noemie: Yes, thank you. My dad really left us with a legacy. Shortly after the war he wrote a book [00:22:00] called "The Long Night", which I translated from German into English together with another gentleman who didn't speak any German, but gave me the discipline and the space to keep to the hours of translation; it took three years. I only really turned to my past after my youngest daughter was born, when I was 36 or 37, and it was a slow process, but one that was very illuminating. And I do feel that, as survivors are now getting older and fewer...
How do we preserve the memory of the Holocaust? And, as you say, Tim, how can we prevent it happening again? At the moment we see a rise in overt anti-Semitism. I don't believe that anti-Semitism was ever dead, but I feel it's very much out in the open within certain factions in our political parties.
Not only in [00:23:00] the UK: in Austria we have the far right; in Germany the same, where the AfD, the "Alternative for Germany", is a new party that has a 20 percent representation in the Bundestag and is the opposition there; the Bundestag is basically their House of Commons. And in France also many Jews have left, another 3,000 last year, because again it's not just anti-Semitic language but actual attacks and
murders. Most recently an 80-year-old lady who had actually survived the Holocaust in Vichy France was murdered by her neighbor for his convictions; that was, I think, an Islamist extremist. So we have the far right, the far left, and the new anti-Semitism of Islamist extremism. And my project is [00:24:00] really...
To educate people, particularly young people in working sectors of life, and let them form their own views, but at least give them a balanced view of our life and our people, in particular recounting the horrors which happened merely because we were born into a religion, the Jewish religion. My future projects are these.
I translated and brought out the book, "The Long Night" by Ernst Israel Bornstein. But also, to mirror that, and in particular looking at the younger generations, who use online an awful lot for their research as it's their natural library, I brought out a website in January last year called HolocaustMatters.org. It mirrors the book, but it's divided up into themes.
So it should be easy for [00:25:00] teachers and students to use. But I think that's probably not enough. What I do want to do in the future is create a package, because I know that teachers are extremely busy and the curriculums are tight, so something that they could just lift off the shelf. The other thing I'm doing, for the next Holocaust Memorial Day in January '19, is something called whiteboard animation.
That's a two-dimensional representation of "The Long Night" in bite-sized videos. Again, they can be accessed online and they'll be free to use. And I'm also looking at a project for future years, which is via VR. You might know even more about it than me, with modern technology, but where people can use it at home in a two-dimensional way, or with headsets in a three-dimensional way, so that we can actually turn the story into film animation. That would probably be a lot richer [00:26:00] than the whiteboard video and give a more in-depth dimension of the story of my dad's experiences.
He was only 18 when war broke out in '39, and he was subsequently in seven labor and concentration camps for over four and a half years. Go on, sorry.
Tim: I was just going to say, I'm interested in the idea of trying to bring this to different new media. There are a couple of really difficult challenges, and I'm fascinated to hear what you might say about those. One of which is: how do you get people interested?
The sorts of things you've talked about are formats, but how do you hook people in the first place? And then I think the other question is: how do you keep those formats moving forward? The iPhone is a different environment, and then there's the VR stuff, which I think is
[00:27:00] very exciting to be looking at. But how do you keep moving forward into new territories and keep adapting that content? Because I think that's a huge challenge.
Noemie: It is a huge challenge, you are right, and I ask myself the question all the time: why would people want to hear about this subject?
Let's face it, it's not a pleasant subject, and if it's nothing to do with them, why should they be interested or want to learn from it in the first place? I set myself that challenge, and because of that I now try to go into schools, even as young as primary schools; I've got two psychologists on board with me, which I think could really make it happen.
So that children can learn a little bit about it in a very safe pedagogic environment, maybe just through commemoration and lighting a candle, or something very simple but very meaningful and touching. Because I think this sort of [00:28:00] history can't be taught as a cold history lesson; it has to be taught as a human story, something with feeling, because we need to use both our hearts and our brains. What the Nazis were so successful in doing was dehumanizing us, not looking at us as humans.
It was a double-edged sword, and I want to say that we are very much human. We are like any other people, good and bad. It's not necessary to treat people like that just because we belong to a different faith. And now, this year, in order to attract people, I'm going to try and go into universities. At the moment
I've got Manchester University and University College London on board. And I come with a speaker who has a very different background: he is the grandson of a Nazi. I was thinking maybe people might be interested in that sensationalism, that you have somebody with a Nazi background and somebody whose parents [00:29:00] were in the Holocaust, and can see how these two different backgrounds come together and work together, and give a question-and-answer session for the students, and hope that they'll be animated and
want to know more from that.
Tim: That sounds like an interesting project. Have you thought about maybe making that available subsequently, as a YouTube video or something?
Noemie: I haven't, but I think it's a very good idea. Or rather, I will now. Thank you.
Tim: Okay. I'm interested in the
extent to which the ephemeralness of new media, you know, what we're doing here in the podcast, and YouTube, and other formats like that, makes it more difficult to make the challenge seem real. I recently bought an apartment in [00:30:00] Berlin, and one of the most effective things that I've seen is the little, I've forgotten the German word for it [Stolpersteine],
the little inch-and-a-half-square stones in the street. There are a couple outside my apartment: a block of four, commemorating a whole family who were murdered simply because of their religion. That's outside my apartment block, and it's an arresting sight, and it brings it home very
viscerally. You're conscious that these people had lives; they went to the school or whatever. It removes some of the distance. I think one of the risks, and I'm talking too much, but one of the risks I feel with some of the new media is that it brings too much distance, too much remoteness from the facts.
How do you deal with those challenges?
Noemie: I think the answer to the remoteness is that I have to become, even though I am [00:31:00] once removed, the mouthpiece of my father, and actually speak those stories and tell them as they happened, to bring it to life and bring in the human factor. Via YouTube and media they can see interviews, but I think they also need to hear exactly
what happened. Because somehow, by brandishing those words about, and especially the misuse these days of "the Holocaust" and "the Nazis" and the comparisons, which I cannot bear (I don't think anything can be compared to that, and shouldn't be), we become immune to it. And so we need to bring it right back to basics.
But of course our survivors are very elderly now, and fewer and fewer will be around in the next few years, and eventually none. I think we have to use multimedia facilities. It can no longer be one survivor speaking to one group of people and having it filmed; we have to appeal [00:32:00] to all our senses, as it were, and provide many different avenues and channels so that something will appeal to people. And I still think it is that human factor, but made readily available in bite sizes.
And use visuals as well as audio, so that people can look at it but also hear it when they're doing other stuff. I just think we have to keep moving, keep being aware of what's out there, and keep plugging into that, evolving with time.
Tim: And do you work
essentially on your own, or is there a collective group of people who work together on these things, or is it more ad hoc? I'm actually also very interested in how new media gets formed. I think sometimes it's kind of ad [00:33:00] hoc coalitions of people who temporarily come together to build a thing, a bit like Vim and me on this podcast.
We're doing this because we both want to; there's no sort of financial structure or anything around it. So I think those casual, maybe almost accidental, collisions are sometimes quite fruitful, actually.
Noemie: I think the accidental collisions are fruitful, and very passion-driven.
For each project that I do I probably have different people that I approach, because different people have different skills and different needs, and we work much better like that than in a set group. I had a wonderful young woman, Julie, who did the website for me, who'd read the book and really had a good understanding of it, not only intellectually but also emotionally. And I don't use that word lightly: I entrusted her with it because [00:34:00] I wanted to make really sure that the memory and the words are preserved with exactly the same nuance as my father wrote them.
None of it is done for money. It's really done for the commemoration and for the education, because I believe that extremist groups are so powerful in their education, which is a bit more extreme, let's call it propaganda, but it is a form of education, that the only way we can counteract it is by giving people the chance to be educated, and hoping that there are more good people than bad, and that they can form their own views when they hear the true side of the story.
So yes, I have meetings with many different people, ranging from MPs to primary school pupils. I'm also even going to go into Liverpool, into a care home for [00:35:00] adolescents, because I gather some research has been done showing that their behavior actually improves. I think they're inspired by
knowing that people can surmount a tragedy. When they look at what happened to people in the Holocaust, not that, God forbid, that should be a norm, they can see that their situation isn't the same, and maybe they can help themselves to put themselves on the right path, together with the help of others, and see that they have a chance in life, and that their life isn't quite as bad as they thought it was.
So, yeah.
Tim: That brings me to an interesting point. You were saying that you wanted to preserve the tone and the nuance of what your father had written, and the words. How do you do that as language shifts? The translation is an interesting problem in itself, but I guess being bilingual you [00:36:00] would
be able to manage that transition. I wonder how you do that for a more modern audience: do you change the words, or do you just let them speak for themselves? I think that's really tricky.
Noemie: Yes, I think you hit the nail on the head by asking me that question. It is tricky, and I think we have to talk the talk and walk the walk.
For me it's all about the meaning: that it's exactly the same meaning as in the German, and that you preserve that beautiful language. But for it to sound like proper English, we can't translate literally; you do have to translate it into good English, and that was fine for the book.
But then when you speak to young people, I think you have to convert it, or translate it, again into that sort of talk, because the power of words is very important, and you do have to talk the language of the people, [00:37:00] in the right, age-appropriate way, to get through to them. But again, I think that's helped nowadays by maybe keeping it, like you're doing, at podcast size, which is something young people are more used to, even in my generation, and also by looking into visuals, because they're very powerful and people are very used to learning and seeing things visually.
So yes, again, using all the mediums possible, including words. I was very conscious of using the "right", in inverted commas, words, and that was the battle, actually, with the gentleman who translated it with me, because I realized pretty early on that his language was very removed from my dad's language.
And so really every sentence took a long time, to make sure that I kept it to my dad's language.
Tim: I think that's [00:38:00] quite a challenge. Obviously that business of translating is a regular challenge for all of us as we try to deal with different media. I wonder whether you're also coming across this business of people no longer trusting authoritative sources, or not even necessarily knowing what authority their source has.
Noemie: Yeah, especially nowadays with fake news, or fake media. I think that's the important work that, in a way, the Holocaust Educational Trust are doing: letting people hear from survivors,
their accounts, so that they in effect become witnesses and ambassadors who can say, "We've heard that this happened to these people," and carry on passing the word down. Again, with my dad's book, just [00:39:00] this Christmas gone I got contacted by the son of a gentleman called Samuel Gilbert, who is at the top of page 74 of my dad's book. As if I needed it brought alive, because I know what happened to my dad,
I still found it incredible that there is the son of a man who lived through the same things as my dad. And I'm yearning to meet him, because I almost feel that we're like family, these two men, my dad and this man's dad, having lived through those, "horrific" is too mild a word, experiences. So, yeah.
Tim: So I think what you're saying is that witness, personal witness, or as near to that as you can get, and we see that becoming an increasing challenge, but getting that personalized witness is the way to [00:40:00] make it clear how genuine the facts are. I think that's the challenge.