Infosec - the skills shortage.
Tim: [00:00:00] So I'm Tim Panton
Vim: [00:00:04] And I'm Vimla Appadoo
Tim: [00:00:06] and this is the Distributed Future podcast. It's a new year and a new — well, actually not really new, because we're doing the same thing, because we think it's maybe working. But this one's typical of the way that we do stuff: it's the intersection of tech and society.
And I really like it when that happens — you get something that's quite technical, but actually the undercurrent is all about the societal impact. This one was on computer security, but looking particularly at the skills shortage. I mean, there aren't enough people working in computer security, and there's a big demand in that problem space — a lot of money as well.
But there don't appear to be enough people working in it. And when you look at it, it's overwhelmingly white men — like 90% white men — and, you know, there's a huge market of people who they're not employing. So a lot of the conversation was around that, and, like, how did that happen?
Why is it that other industries have managed to address this problem and fix it in the last 20 years, but somehow, you know, information security — and to a larger extent networking and stuff — hasn't? It was fascinating; a really interesting conversation about how that works and how it can be fixed.
And then of course we talked about computer security itself as well, but a lot of it was around the society stuff and what you should and shouldn't be doing. Interesting.
Vim: [00:01:43] Yeah. So what was the conclusion around why the computer security industry hasn't had more diversity?
Tim: [00:01:54] Well, because it's highly toxic — like, all of the working practices are things that you'd run a mile from if you're sensible. And the conclusion was that this is partly because it's new, and therefore we don't have the regulatory structures and the professional bodies and the sorts of expectations and norms that create a healthy working environment. I mean, that seemed to be a lot of it.
Vim: [00:02:22] Yeah, that's really interesting, because that immediately makes me think: in new industries, the default is for men to be hired. My assumption would be that a new industry — any sort of work that's rising — is more likely to be diverse, because of the knowledge we have now about building diverse teams and being inclusive and understanding how to do all of that stuff, rather than kind of perpetuating what's happened before.
Tim: [00:02:53] Yeah. I mean, computing is really weird, because historically it was a woman's job — like, all of the early computing people were women. And then somehow it got changed into being seen as a man's job, and that has stuck, in a way that — I mean, the example I used was medicine. Medicine has changed immensely in the last 20 years. Twenty years ago you almost always had a male GP, and now more than 50% of GPs are women.
And, you know, that's 20 years and it's changed.
Vim: [00:03:30] Yeah.
Tim: [00:03:30] And the same is true for surgeons. And, like, you know, it's not just at the GP level — it's almost throughout the organizations. And also things like chemical engineering: a substantial proportion of the engineers are women now.
Whereas somehow in the computing world we've completely failed to learn that, or to do whatever was necessary, which is a shame — and an embarrassment, frankly.
Vim: [00:03:58] Yeah, yeah. 100%.
Tim: [00:04:01] But I did think that the idea of, you know, structures and professional associations — just generally professionalizing what has (we didn't actually say this, but) a slightly cowboy feel to it — like, professionalizing it actually would help tremendously. But we've got the classic thing that we do in computing, which is that we've got competing bodies trying to standardize and trying to be the, you know, the agency for this, and of course none of them are winning. So what's your computer science —
Vim: [00:04:38] Computer security perspective? What's the big-hitting thing that's going to happen in the future? What do you think?
Tim: [00:04:49] So we were very depressed about that. You know, we tried to come up with some good news there, but there were a lot of bad-news stories.
And I think, again, it's because it's a new and changing field, and it's difficult to come up with habits and expectations because the goalposts keep moving. You know, you start teaching people how to do things securely, and then suddenly a new attack comes out, a new virus comes out, and the rules about how you construct passwords change.
Vim: [00:05:24] Hmm. Yeah.
Tim: [00:05:25] Or, you know, whether you should or shouldn't use SMS as a way of checking whether you've changed your password or not. It's a fast-moving field, and so it's really hard to give clear, consistent advice. Well, we were both a bit gloomy.
We had some suggestions about how it could get fixed, but it's not going to get fixed anytime soon, unfortunately.
Vim: [00:05:53] Yeah. I went to a talk recently around security, and the person giving the talk would do a live demonstration of being able to hack wifi and do, like, a security assessment of your professional email address. So he set up a fake wifi network at the conference that looked like a Starbucks login, or looked like the conference login.
And he was showing how many people had logged into that, and then flashed up all of their email addresses and passwords, because they had submitted them to this fake login page. And it had the effect he wanted: it shocked everyone in the room into realizing how easy that is today.
Tim: [00:06:38] Yeah. I mean, you know, passwords — this email-and-password thing is a challenge. Particularly with an aging population, the idea that everyone is going to remember their, like, hundred passwords or whatever, or that they're going to be organized enough to consistently use an app for password management —
it's difficult. We've all got multiple devices, and, you know — yeah. I'm sitting here essentially just reorganizing my living room, because I've had this weird thing this week: we've got a little stand at CES, and Simon, my cofounder, is over there on the stand, and we wanted to demo what we're doing.
And it's about computer security — this particular thing is about how to secure a webcam to make it safe for use. So we kind of fitted out this demo where there's a webcam in my living room, and the people on the stand at CES can view it securely if we give them access to it.
But the side effect is, I have a webcam in my living room that, like, 40 people at CES could be looking at. And I'm like, okay, I'm doing security here, but the net result is I can't use my living room.
Vim: [00:08:06] Yeah, yeah. That's great though.
Tim: [00:08:11] Yeah, no, I mean, it was fun, and we got some good feedback and interesting leads and whatever.
And it's a hot topic. I mean, you look at some of the disasters that are in the making there. But actually — and this is kind of not exactly positive — one of the things that came out of the conversation with Saskia was that there are things you probably shouldn't be doing as a product designer.
Like, you know, you should look at whether this is something you can actually build in a secure way, and if you can't, don't build it.
Vim: [00:08:45] Yeah. It still surprises me how reluctant people are to think about the ethics and privacy and security questions that come up, right? A big part of my job is just saying, "yeah, but what if this happens?
What are we going to do? How are we going to design that out?" And it's like, "oh, well, it doesn't really matter, because if that happens, it happens, and that's out of our control." And I'm like, no, no — this is in our control. We can control how we want to do this and what we want to do with it, and that's part of our responsibility as an organization.
Tim: [00:09:21] Yeah — we've made a decision to add this feature, and therefore it is our responsibility. Yeah. It is difficult. I did a thing a couple of years ago where I did, like, a 40-minute consult with a bunch of startups who were at the very early stage of an accelerator program.
They were pitching their concepts to a bunch of mentors, and I was one of them, and I was taking the security angle of, like, you know, how could this go wrong? And for some of them it was just: no, really, don't do it — this product is, you know, a working security breach.
Just stop now, rethink it. Some of them had thought about it, but most of them hadn't. Most of them came up with these horrendous things — and we were in Berlin, too; you'd think they'd have a think about it first. But no, sadly not.
Vim: [00:10:15] Interesting.
Tim: [00:10:18] So how are things in your computer security? Are you, like: back everything up and use one-time passwords, you know, a password manager and all of that?
Vim: [00:10:29] Oh, I'm awful with my security — like, I'm the laziest person there is when it comes to it. So I use the Google, like, password manager. That's where all of my passwords go. It's funny.
Tim: [00:10:48] That's not a bad choice there. I mean, the only thing you can do to improve that is you can get one of these little key fob things to lock that down as well.
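The key fob Tim mentions is a hardware security key (the FIDO2/U2F kind), which locks the account to a physical device. A simpler cousin of the same second-factor idea is TOTP, the rolling six-digit codes authenticator apps generate. Here's a minimal sketch of that scheme using the pyotp library — the account name is made up, and this illustrates the mechanism, not how Google's own two-factor setup is implemented:

```python
# pip install pyotp -- a minimal TOTP sketch, for illustration only
import pyotp

# Enrolment: the service generates a shared secret and shows it once,
# usually as a QR code that the user's authenticator app scans.
secret = pyotp.random_base32()
totp = pyotp.TOTP(secret)
print(totp.provisioning_uri(name="vim@example.com", issuer_name="ExampleApp"))

# Login: both sides derive the current 6-digit code from the secret
# plus the current 30-second window, so the code constantly rolls over.
code = totp.now()
print("current code:", code)

# The server verifies the submitted code; a stolen password alone is
# no longer enough to log in.
assert totp.verify(code)
```

A phished TOTP code can still be replayed for the rest of its 30-second window, which is part of why the hardware keys Tim is suggesting — which bind the login challenge to the real site — are the stronger option.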
Vim: [00:10:58] Right, yeah — so I do that. And then, what was really interesting the other day — it's not so much a security thing, but more the dynamic that we had — a document that we needed from Google Drive had been deleted, and we couldn't figure out how to get it back. It just blew my mind how much trust we had in the cloud to keep all of our documents safe, and stuff could still go wrong.
Tim: [00:11:24] And then, like, tracking who has access to that document is also interesting. You know, one of the common themes at the moment is people who've just shoved their database into a cloud server but not actually secured it in any sensible way. And so, you know, people have got your entire customer list — which is, you know, a valuable thing — or whatever it is that you've stuffed up into the cloud, your latest product designs or whatever. Yeah. There are so many ways to get it wrong.
Vim: [00:12:01] Oh, God. Yeah.
Tim: [00:12:03] So what else have we got coming up after this one? A little teaser for future stuff.
Vim: [00:12:11] Yes. So I've got an interesting conversation about climate change coming up. One of the attendees from the Huxley Summit that I went to — which is a conference, run with the British Science Association, about trying to solve the world's biggest problems — has emailed a few people from that who are going to be on the podcast with me, talking openly about climate change and campaigning and understanding how to build engagement across the world. And then, following that, a few talks on democracy, politics and what's going to happen in the 2020 election.
Tim: [00:12:47] Oh, wow.
Okay, cool. I'm looking forward to all of this — it sounds great. Excellent. Well, let's let people listen to Saskia, hold all of those kind of exciting plans in their minds, and, like, you know, subscribe — otherwise you'll miss it.
Saskia: [00:13:07] So, I'm Saskia, and I'm one of the directors and founders of Digital Interruption, which is a small infosec consultancy in Manchester — based out of MediaCity, actually, in Salford. We've been going for about three years now, and we do a combination of different parts of infosec.
So primarily pen testing, which is similar for most organizations that work in information security; we also do elements of compliance, we do training, and we also create security tools. We were one of the first companies to try and do a hundred percent DevSecOps in the security environment, but unfortunately we were a bit too early to the table and it didn't really take off.
So we went back to pen testing. DevSecOps is the concept of embedding security at every stage of the development process, rather than just testing at the end, which is the current model of security. And although it's something that's definitely in the pipeline, and where we're moving in terms of information security, we're not really there yet.
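For a concrete flavor of what "embedding security at every stage" can mean, here's a minimal sketch of a pre-merge security gate a Python team might run in CI — static analysis plus a dependency audit on every change, rather than one pen test at the end. The tool choices (bandit, pip-audit) and the src/ path are illustrative assumptions, not Digital Interruption's actual stack:

```python
# Toy CI security gate: fail the build if static analysis or the
# dependency audit finds problems. Assumes the bandit and pip-audit
# tools are installed (pip install bandit pip-audit).
import subprocess
import sys

CHECKS = [
    ["bandit", "-r", "src/", "-ll"],  # flag medium+ severity code issues
    ["pip-audit"],                    # known CVEs in installed dependencies
]

def main() -> int:
    for cmd in CHECKS:
        print("running:", " ".join(cmd))
        if subprocess.run(cmd).returncode != 0:
            print("security gate failed at:", cmd[0])
            return 1
    print("security gate passed")
    return 0

if __name__ == "__main__":
    sys.exit(main())
```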
So we still do the testing at the end, but unfortunately there's not enough of a customer base for us to do DevSecOps a hundred percent. I'm also involved in quite a lot of the community groups for information security. So I sit on the board of OWASP Manchester — OWASP being the Open Web Application Security Project.
I'm one of the founding board members of the InfoSec Hoppers, which is a group designed to open up pathways into infosec for underrepresented minority groups — which, in reality, for infosec is anyone who's not a white cis man. There are only 11% women in information security in the UK, and we don't have any figures for people of color, people with disabilities or other marginalized groups, because we're simply not doing the research.
I'm also one of the admins for Manchester Grey Hats, which is a monthly meetup that looks at practical advice and upskilling, doing workshops — kind of a bit more fun and hobbyist — but we have actually pivoted about four people into pen testing from that. And I also sit on the board of SkillSec, which is a group that runs four events a year where we get new entrants to the market, or people looking for work in the industry, together
with, basically, employers — though they don't always know exactly what they're looking for. So we run different events where we explain different elements of information security and go through different workshops, and then use it as an opportunity for networking as well. One of the main issues that we've found with infosec is that people just see it as one homogenous blob, but actually it's quite diverse.
There are lots of different elements to it, lots of different skills requisites for different jobs, and it takes time to unpick that if you're not a specialist in the area.
Tim: [00:15:57] So I'm going to pick up on one thing you said, which was about the diversity mix — the 11%. I have to say, every time I go to a meetup on either information security or, indeed, networking, I am absolutely shocked.
Like, you know, it astonishes me that we haven't fixed this. I took a break from it, but I did information security what, 20 years ago, a bit less maybe, and back then the mix was pretty much what it is now. But if you look at something like medicine, it's completely fixed — medicine is now pretty much 50-50; in fact, you're more likely to find a female GP than a male GP now.
And that's changed in the same 20 years in which, somehow, in tech — and specifically in networking and security — we just haven't moved. Which I don't understand; I don't understand how that can have not happened, if that sentence makes sense. Have you got any sense of what happened there, or what didn't happen?
Saskia: [00:17:05] Yeah. It's a completely unregulated industry, and everyone barely understands it. There are no clear career pathways into it, and it's historically very toxic. The bulk of people who are pen testers of your generation will be people who were home hackers, bedroom hackers — you know, they're not working to a professional code of ethics.
And that's something that has to be embedded over time within an industry. We have new entrants to the market now, which is sort of starting to level the playing field somewhat, but we have problems with getting young women and girls into STEM subjects as it is. So what we do manage to achieve in terms of comp sci we often then don't convert, or pivot, into security itself.
And the workforce — the environment to work in — is really not that pleasant. When you're the only person who looks like you in a room, you stand out, and that can be difficult. The organizations themselves are very immature, because, as you know, this is an industry that's only been going for 20, 25 years,
so the work cultures are quite immature. They're still focusing a lot on Nerf guns and pool tables rather than, you know, work-life balance and trusting their employees. When I've worked for large consultancies, the expectation of Sunday travel, people being required to be on site, over a hundred percent utilization — for people, it's absolutely crushing.
But then you add in that there's a toxic environment in the workforce you're working in a lot of the time, plus the community itself is quite toxic and constantly squabbling about this diversity issue, and it becomes very hard — very difficult — to be that person, to be the diversity hire within that environment.
And that's why we lose 52% of women in tech generally before they get to a senior or middle management position. In infosec you really just have to dig your heels in, grit your teeth and get on with it. It's part of the reason that Jay and I set up our own company, to be honest — we'd kind of had enough of it
and didn't want to work in that environment anymore. We wanted to do it differently. So we set out with Digital Interruption being the most diverse infosec company in the world, because it was 50% Black and 50% women. We've now since grown — we've taken on two men and a woman — so we're trying to keep the balances as we grow, so that we have ethnic diversity and gender diversity.
We've got two neuro-atypical people in the company as well. You know, the people are there and they do want to do the work, but it's about finding an environment where they can remain and they can thrive, and that's what I feel we're struggling with at the moment. My caveat to that is that there are a lot of security roles that are in-house, and they're not having the same level of problems that we have generally across the industry.
Tim: [00:19:49] Yeah, no, I wanted to pick up on that as well, actually — the in-house versus external consultancy thing. I mean, I said we weren't going to talk about history and here we are doing it, but one of the reasons I got out of doing information security, you know, whatever it was, 16 years ago or something, was basically that nobody was ever pleased to see you.
Like, if you're doing a consulting gig — pen testing or network review or whatever — and you show up on a site, they know that you're going to be there for three days and then you're going to give them a list of unpleasant to-dos. Nobody is ever pleased to see you, and after a few months that really does wear on you: showing up and being that guy.
And so, you know, it's not full of joy as an experience, I found. I mean, I know people who love it, and the technical challenges are just great, but the social stuff is quite awkward, actually.
Saskia: [00:20:50] Well, it's the same now. I mean, really, a lot of it is just, you know, a basic test of a web app, and it's a little bit box-ticky. The more software we create, the more software we have to test, and we're not securing it — we're just poking at it until it breaks. And I find that the model we have kind of encourages a certain kind of personality. You know, I've heard stories of some of my teams, back when I was running security teams, sat in front of devs laughing at their code, which is just not a great environment when you're the consultant on site.
And you can see that people are getting bored. Actually, a lot of these consultants just want to hack stuff — they just want to do cool stuff — and having to do, you know, a PCI DSS test for compliance purposes can get quite boring and quite dull. So I think there needs to be a balance of having interesting things to do.
The interesting things, actually, are what pushes security forward and makes us more secure. I genuinely don't believe that just breaking websites — you know, a sort of box-ticking exercise to break a web app — is helping anybody. It's like whack-a-mole, to coin a phrase that Ali, one of our consultants, uses.
But I think it is a very, very stressful job. For blue team particularly — they seem to be holding the weight of the world on their shoulders in terms of security, and they take the brunt if there's a breach. And I think red team often get quite angry and disillusioned, because they feel that things are treated as secure enough when they should be more secure. But having sat with consultants who've done a test and haven't found anything wrong on it — that's also quite frustrating. So it's an interesting balance. It's an odd career path, and it's an odd industry, and I think it just needs to work out exactly what it is that it needs to be doing.
And I don't think it's there yet.
Tim: [00:22:44] I think you touched on something there which I firmly believe, which is the idea that it's not sufficiently integrated into the rest of the business. I mean, even the blue team — well, maybe we should do a little bit of definition stuff there — but even the blue team quite often feels like an add-on.
They're in there, a couple of them in a corner, and they're not really integrated into the rest of the business. They're not seen as a profit center; they're not seen as anything other than, like, you know, somebody to kick when it goes wrong. And often the budget — there seems to be a lot of budget for, like, stuff, and less budget for people. Is any of that fair?
Saskia: [00:23:30] I think it depends on the organization. A lot of organizations are put in a situation where they have so few people that really all they can do is spend their budgets on tooling, so that they can monitor and then try and respond. People do love a red team — it's very sexy at the moment —
but it's questionable how useful it is.
Tim: [00:23:48] So we do need to define these things. My understanding was that red team and blue team came from military war gaming. Your red team is your kind of simulated attack team, and your blue team is your simulated defenders, and they're trying to protect the — what's the word? — the homestead, essentially.
Saskia: [00:24:17] They're trying to — yes, it's offense and defense of your assets. And really it can be any asset — even physical assets, effectively, we can attack and defend; putting a lock on the door, effectively, is security. There are a number of jobs within this sort of loose definition of red and blue.
So, loosely defined: security analysts, CSOs and compliance people will tend to sit in blue; pen testers and security researchers will tend to sit in red, as a rule. We have pen testers — they do a penetration test on applications, hardware, networks. And that's, in inverted commas, a "simulated attack".
It's not, really — it's highly constrained. You're time-constrained, you're budget-constrained, and if you go outside of the scope that's been agreed with the client, you're basically just hacking their stuff and breaking the law. So you have to be very clear about what it is that you're doing and what you're not doing.
A red team attack is a simulated attack where you're supposed to be bombarding it in a lot of different ways and trying to simulate an actual hack. So you might use phishing, you might use physical infiltration, you might try to drop shells — you know, there's all sorts of different things that you can do as part of a red team.
So it's similar to a pen test, but with a few more bells and whistles. And again, it's still time-constrained, and if you don't have the proper defenses in place in your organization, all it really does is attack your organization and tell you that you have weak spots. It should be used more as a test that your blue team is operating in the way that they feel they should be, and that it's keeping the organization really secure. So it tends to be only useful if you're quite a mature organization — banks, for example, will have red teams, because they have multiple assets that they want to secure.
I wouldn't really recommend a red team attack on, sort of, a normal e-commerce company — it's probably just a little bit overkill. And, like anything in information security, it's about understanding your risks. You want to protect based on the risk that you're prepared to accept, and the value of your assets — or the impact of the loss or leak of your assets — is what will drive the budget and how much you want to secure things.
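One common way of putting numbers on "the value of your assets drives the budget" is the classic annualized loss expectancy calculation. Saskia doesn't name it here, and every figure below is invented, but it shows the shape of the trade-off she's describing:

```python
# Back-of-the-envelope risk arithmetic: ALE = SLE * ARO, where SLE is
# the cost of a single incident and ARO is how many times per year you
# expect it. A control is worth buying if it cuts ALE by more than it
# costs. All numbers are made up for illustration.
def ale(single_loss: float, annual_rate: float) -> float:
    return single_loss * annual_rate

breach_cost = 200_000   # hypothetical cost of one customer-data breach
rate_without = 0.30     # guessed incidents/year with no extra controls
rate_with = 0.05        # guessed incidents/year after hardening
control_cost = 25_000   # annual cost of the extra security work

saving = ale(breach_cost, rate_without) - ale(breach_cost, rate_with)
print(f"expected annual saving: {saving:,.0f}")  # 50,000
print("worth it?", saving > control_cost)        # True
```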
Tim: [00:26:37] Yeah. I mean, there's a fascinating thing going on at the moment — I'm going to date the podcast by saying this — but there's a major currency exchange company who are currently being ransomwared, and the question going round at the moment is: how long can they survive as a business
just working on pen and paper? Like, you know, the trade-off is they have basically no working computers, but they've still got cash in the bank, they've still got the physical notes and the physical process. Can they carry on being a business?
Saskia: [00:27:13] It's a different one there, isn't it? Because normally what happens with ransomware is you just get encrypted.
So if you've got proper backups, that shouldn't be a problem — unless they've encrypted your backups too. But with this one, they've actually taken the information, they've downloaded it, and they're threatening to release it on the internet if they're not paid, which is a different thing. So there are many — even when we're looking at malware, there are many different kinds of malware,
and then when you narrow malware down to ransomware, there are different ways of dealing with ransomware. So WannaCry, for example — you know, it hit loads and loads of systems, but it was actually patchable. It could have been patched if people had put the patches in place, and the NHS was just
so, so stretched that they weren't able to do it in time, and it had an impact on the NHS. And I remember the headlines saying, you know, "NHS reduced to pen and paper", and I just thought anyone who's ever worked for the NHS would be laughing their asses off at this, because "reduced to" is a contradiction in terms.
It's constantly pen and paper. But really, the way that they dealt with that was just to shut all the systems down, and then they went to backups. And that's what we would normally suggest — that you have backups exactly for that reason. But then, as things get more sophisticated and things change, that's where we need to have the researchers in place, making sure they're a little bit ahead of the game.
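The "proper backups" point — including the aside that ransomware which also encrypts your backups defeats them — is essentially the 3-2-1 rule: several copies, on different media, at least one offline. Here's a toy sketch of the make-and-verify half of that, with hypothetical paths; a real setup would also keep a copy offline or offsite where ransomware can't reach it:

```python
# Toy backup-with-verification: archive a directory, record its hash,
# and re-check the hash before ever relying on the copy. Paths are
# hypothetical; keep at least one copy offline as well.
import hashlib
import shutil
import time
from pathlib import Path

def sha256_of(path: Path) -> str:
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def backup(src: str, dest_dir: str) -> Path:
    stamp = time.strftime("%Y%m%d-%H%M%S")
    archive = Path(shutil.make_archive(f"{dest_dir}/backup-{stamp}", "gztar", src))
    archive.with_suffix(".sha256").write_text(sha256_of(archive))
    return archive

def verify(archive: Path) -> bool:
    # An archive whose hash no longer matches has been altered --
    # possibly encrypted by the very ransomware you were defending against.
    return sha256_of(archive) == archive.with_suffix(".sha256").read_text()
```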
And insurance companies as well — you know, insurance companies will insure you against losses for certain attacks. The main issue, really, is the risk to your reputation, whether people will use you again in the future. It feels to me that if you're attacked, people kind of blame you for it.
It's, like, the extreme of victim blaming.
Tim: [00:28:54] Yeah. I mean, gosh, there's a ton to unpack there. So the insurance thing is interesting, because they might insure you the first time, but that might put up your premiums significantly for the second time. And then the other thing, which I think is super interesting — like a whole conversation in itself — is this thing about attribution. I'm starting to see insurance companies saying, well, if this is a state-sponsored actor, and therefore effectively terrorism or an act of war, then we don't cover it.
And so, if you're the insurance company and you can get good attribution that this was a state-sponsored actor, you know, then you don't have to pay up. Right, exactly. And so attribution — working out who committed the attack, of which there's, like, never any certainty
if it's done well — starts to become actually quite a big financial and legal issue. So even that is turning into, effectively, politics.
Saskia: [00:30:09] It goes against everything we advise as well. Internally, we call it the Jeff Goldblum effect. I did a talk recently for TestBash Manchester which was based on Star Wars,
and I'd been talking about threat modeling. The Jeff Goldblum effect is: it doesn't matter who's attacking you, it matters how they've attacked you. So this was the threat model of the Death Star, and then we sort of switched in the Jeff Goldblum attack — him in the flight in Independence Day —
and I'm saying, you know, Jeff Goldblum could have taken out the Death Star, but it's the wrong movie, and I think that would confuse things enormously. So, from a security perspective, you need to make sure that you are secure enough regardless of who's attacking you: whether it's a script kiddie in their bedroom or a nation state, the impact is still the same.
So I think, really, that's to do with insurance companies probably getting hit really badly by the massive increase of crime now coming from cybercrime. There has been a massive reduction in violent crime — we don't have bank robbers anymore, apart from the one that happened recently, where all these old bank robbers came out of retirement and stole some stuff for fun.
Tim: [00:31:23] Got by the number plate recognition.
Saskia: [00:31:25] Exactly. Exactly. So, you know, if you watch films of old, it's people with shotguns, you know, going into banks and swarming banks and robbing banks and getting into vaults. We don't do that anymore — it's all cyber-based. So to me, I think this is great.
I think it's much better that we don't have to deal with traumatized people or people being shot — you know, the violence is horrible and the PTSD is horrible. But what it does mean is that we've now shifted that crime into another area, and of course insurance companies are going to try and protect themselves.
They're going to protect their own assets, so if there's a way that they can not pay out, they're going to find a way to do it. And that's why we find there are so many regulatory requirements now on cyber. So, I mentioned PCI DSS before: you can't take credit card payments unless you're PCI DSS compliant,
and that's where a lot of this compliance drive has come in. We have HIPAA compliance, which is around medical records; we have the dreaded GDPR, which everyone's sick to death of hearing about, and which applies to anyone who's a European Union citizen or resident. These compliance frameworks are put in place to force companies to be more secure and to secure certain sets of data, because the impact of losing them, leaking them or, you know, having them taken is too high.
And certainly for the PCI stuff — when you steal credit card money, you're not stealing it from the individual, you're stealing it from the credit card companies, so they have a vested interest in making you keep that secure.
Tim: [00:32:59] But I suppose the other side of that — the additional side of that — is, as you were saying initially, that this is starting to become a somewhat regulated market.
Like, you know, behaviors — how the thing works, and therefore what jobs look like — are starting to become part of legal contracts, starting to become part of the law in some aspects. And so it's becoming formalized, almost from the insurance company side; that's where the pressure to formalize the constructs is coming from.
Things like PCI — those things are all demand-fed, effectively. But what I'm interested in is how much that affects the jobs of the people in the industry.
Saskia: [00:33:53] It has affected jobs to an extent. We have a lot more people working in compliance environments, and certainly in the run-up to GDPR a lot of organizations started to tighten their security, and they now have more budget for security
and will continue to think in a more secure way. I think WannaCry coming out when it did probably also moved things in that direction, because it was so well publicized — it's one of the few cyber attacks really sort of held in people's minds, whereas other cyber attacks less so. But I think we kind of need it not to just come from the compliance side of the industry itself.
I think, in terms of sorting out the issues that we have within the job market: we don't really understand what it is that we're doing. Industry doesn't understand what security's doing. Education doesn't understand what industry's doing. Industry can't work out why education's doing what education's doing.
And — this makes me unpopular — I've seen quite a lot of bootcamps recently that are suggesting you can become a pen tester in 12 weeks. It takes four years at least. You know, this is a really, really high-risk job.
Tim: [00:35:03] But does that mean there's something else those people should be doing rather than pen testing?
I mean, pen testing is an established model — it's an established job with a reasonably clear job description and whatever, so it has those advantages. But it doesn't scale, and it also takes a lot of lead-in.
Saskia: [00:35:29] It's like saying that homeopathy — you can get a certificate in it. I'm going to get killed for saying that; it's not that it has no effect. But if you think about how much software we have now compared to what we had 20 years ago — if you look back at when you were doing information security 20 years ago versus what we're doing now — you can see how unscalable it is. And when you add 5G into that, and the Internet of Things, we just have devices connected everywhere.
So those networks need to be secure as well, and if we don't find a way of scaling security properly, we're going to end up in a situation where nothing is secure. And I want to live in a world where I can use software; I want to live in a world where I don't have to worry about Cambridge Analytica stealing and weaponizing my data.
So we need to find ways of making things more scalable. There's definitely a place for pen testing, but pen testing should be a check at the end to make sure that you built something securely in the first place.

Tim: [00:36:29] I think, just to very slightly nuance that — and it's not even really disagreeing —
one of the effects of a pen test is that if the developers know there will be one, it colors the conversations you have in the sprint meeting. Like, I've been in sprint meetings — you're in the standup and somebody says, "I'm going to do this", and somebody else says, "that failed pen test last time;
it's not a good idea, it's going to cause us problems down the line." And if you know there's not going to be a pen test — like, you're throwing together a quick demo — then you'll take that risk. But if you think it's going to get pen tested, then you'll do it a bit better.
And so I think just the existence of it — the fact that it's going to get tested — does introduce a sort of back pressure through the development cycle and the day-to-day.
Saskia: [00:37:26] Number one, a lot of stuff isn't built with a pen test on the horizon. A lot of stuff is pen tested because a client requires it before buying,
so that wasn't something that was necessarily in the psyche when it was being built. But it also requires developers to understand how to develop securely, and a lot of them aren't taught to do that. Developers are incredibly good at what they do, and they're very proud of their code, and they should be — it's difficult;
I can't do it. But really, what we need to be focusing on is how we train developers to develop in a secure fashion. It's something that we've been really plugging at DI for a long time: we do secure code training and secure application development, and we have white papers on how to develop secure mobile applications, for example, because there are lots of problems with that.
And we're in a really great position in that two of our testers are also devs — well, one's an ex-dev and the other is currently a dev; he does dev work as well as testing. And I think an awful lot of blame gets put on developers for things being insecure, but unless we're imparting how to develop securely, that's just completely unreasonable.
Tim: [00:38:35] Right, right, no, I totally agree with that. And I think the more that message gets over — I mean, one of the successes in this space is OWASP. Admittedly it's maybe a little dated, in that nobody writes web apps these days, but, you know, the Top 10 and so on. The construct, I think — these are kind of really simple rules that, if you follow them, make your web app significantly more secure, and they're applicable, and they're things that a developer can actually get their head around.
I think that's a huge success, and I think we need a lot more of that. But I guess the question, I suppose, is about the mindset thing. There's a certain mindset of thinking about how things could go wrong, which is very much what you get from a red team in particular, and pen testers, and that kind of hacker mentality — versus the devs, who are thinking about how to make it work.
And there's, like, a genuine cognitive difference of focus there.
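To make the OWASP Top 10 point concrete: injection has sat at or near the top of that list for years, and the "simple rule a developer can get their head around" really is one rule — never build queries by gluing strings together. Here's a minimal sketch with Python's built-in sqlite3; the table and the hostile input are invented:

```python
# SQL injection, and the one-rule fix: parameterize, don't concatenate.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, is_admin INTEGER)")
conn.execute("INSERT INTO users VALUES ('alice', 1), ('bob', 0)")

user_input = "' OR '1'='1"  # a classic hostile input

# Vulnerable: the input is spliced into the SQL, so the attacker's
# quote characters rewrite the query and every row comes back.
rows = conn.execute(
    f"SELECT name FROM users WHERE name = '{user_input}'"
).fetchall()
print("concatenated:", rows)   # [('alice',), ('bob',)]

# Safe: the driver passes the value separately from the SQL, so the
# hostile string is treated as just a (non-matching) name.
rows = conn.execute(
    "SELECT name FROM users WHERE name = ?", (user_input,)
).fetchall()
print("parameterized:", rows)  # []
```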
Saskia: [00:39:51] So that's one of the reasons that we suggest things like threat modeling. I mean, literally, my job is to think of the worst-case scenario. I go into companies and go, "this could happen", and they're like, "ah" — to the extent that people ask me how I sleep at night, and I'm like, why?
So, I mean, I'm a data protection officer for a number of organizations, and I've been dealing with really quite scary data for many years now. I specialize in special category data — specifically medical data, but also data that relates to children, and data that's around reports of harassment, bullying and sexual assault, all the way up to rape.
So I'm the DPO for a company called Culture Shift, which makes a piece of software called Report and Support, which is a brilliant piece of software. It helps students and staff members at universities all over the country report bullying and harassment. But of course that data is very, very precious, and, you know, we are the custodians of that.
We need to take it very seriously and we need to protect that data, and so we use risk modeling constantly. Part of the risk modeling is me encouraging people to think like me: worst-case scenario, what happens if this data is leaked? So if you take Ashley Madison, for example — when the Ashley Madison data dump happened, four people committed suicide.
For anyone who doesn't know what that was: Ashley Madison was a dating app that connected people who wanted to have extramarital affairs. There were priests on there, there were prostitutes, there was all sorts of stuff — everything that you could think of that could potentially
be reputationally damaging was on there. And the data dump was horrific for a lot of people: two people in the States committed suicide, and there were two suicides in Canada that were related to it — we don't know how strongly related. So when my company, Digital Interruption, was doing some research on an adult application made by a company called SinVR,
we were able to find login details for 20,000 users. We wanted to get that fixed as soon as possible, because there was a strong implication, based on the precedent of Ashley Madison, that this was really serious. And this is what we need to encourage not just developers but organizations to be thinking about.
Yeah: what is the purpose of your application? What data are you taking? Do you need all of that data? If you don't, don't take it. I mean, these are principles you put in place which are very, very easy to learn.
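Saskia's "do you need all of that data? If you don't, don't take it" can even be made mechanical: decide per feature which fields you actually need, and strip everything else at the door. A toy sketch — all the field names are hypothetical:

```python
# Data minimization as an allowlist: whatever a form or API sends,
# only the fields this feature genuinely needs are ever stored.
NEEDED_FIELDS = {"email", "display_name"}

def minimize(submitted: dict) -> dict:
    dropped = set(submitted) - NEEDED_FIELDS
    if dropped:
        print("declining to store:", sorted(dropped))
    return {k: v for k, v in submitted.items() if k in NEEDED_FIELDS}

form = {
    "email": "a@example.com",
    "display_name": "Al",
    "date_of_birth": "2001-01-01",  # not needed for this feature
    "phone": "07700 900000",        # not needed: don't take it
}
print(minimize(form))  # {'email': 'a@example.com', 'display_name': 'Al'}
```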
Tim: [00:42:19] Yeah. I mean, that's the most amusing part of GDPR: it's making people question whether this thing of grabbing all the data,
then wondering what to do with it later, is actually cost-effective anymore — or whether it puts you at too big a downstream risk of, like, you know, it all going horribly wrong.
Saskia: [00:42:40] And that's why you put your different hats on. So, you know, you have your business hat, which is: data, data, data — I need all the data.
I might do something with the data; I don't know what I'll do with the data, but the data is there, because it literally is more valuable than oil now — it's the most valuable commodity in the world. But then, you know, you have your hacker hat, which is: ooh, what can I do with the data? And then you have your customer hat, which is: what are you going to do with my data?
Yeah, what's the implication for me? And I see all these sort of wonderful things happening, particularly with 5G. I was on a panel recently where somebody was explaining to me that they can reduce congestion in city centers, because they have connected car parking spaces. So the car parking spaces are connected to an app on your phone,
and the app on the phone tells you where there's a free car parking space, and then that's presumably reserved for a period of time until you park in it. So you just drive straight to the car park and the parking space rather than circling around town. And I'm like, huh — what could we do with this? If you could disrupt that, you could bring a city center to a standstill.
The emissions would be awful; no one would be able to get anywhere. And, you know, it is my job to think of things like that, and then it's the job of the pen testers — the researchers on the team — to see if it's possible. But being a bit nefarious, a bit mischievous — that's an easy mindset to put in place.
Tim: [00:44:02] I always have to put a lot of kind of mental effort into putting that hat on, and I'm not particularly good at it. I wonder whether you're underestimating the difficulty of getting into that mindset for other people — like, you find it natural and you clearly enjoy it, but I'm not sure whether it's that easy for other people.
Saskia: [00:44:26] I don't — I enjoy it when it's a hypothetical mischievous act like that. I don't enjoy it when it's looking at the protection of the people whose data I'm dealing with; when it comes to things like assault or medical care, then it's not funny and it's really serious. I think it comes with some of our experiences — I've had a particularly unusual life, and I've lived in some very unusual places, and I've done some very unusual things for a career,
which is probably why it's easy for me to do this — that, and I'm trained to do it. And I think for the hackers, I mean, this is something they've been doing for a long time. And just because you're a hacker doesn't mean you're nefarious; it might mean that you're just interested in seeing how you can make things do different things.
Recently, Hacker House reflashed a load of old, decommissioned voting machines from the States to get them to play Doom. I mean, there's nothing nefarious about that — unless you were trying to vote on them at the time — but, you know, they're ones that are out of circulation now. So there are different segments within the hacking community, if you like, and not all of it is nefarious. But yeah, I think it's an easy enough thing to make people think about what they're going to do and what the implications of what they're doing are.
I mean, it's just literally taking responsibility for the data that you're dealing with. To me, it's the equivalent of saying, "oh, I didn't know that my BB gun could shoot a bird if I randomly pointed it at a tree and pulled the trigger."
Tim: [00:45:50] Right, right. So what you're saying is that it's trainable, and that this kind of mindset — the idea that hackers are a breed apart — is, to a large extent, incorrect, and that it's a trainable skill.
Saskia: [00:46:04] I think it's a trainable skill. Maybe not to the extent of being able to predict what hackers will potentially do, but if we can train a five-year-old to cross the road safely, we can train an adult developer — a company of adults — to create software more securely and to think about what they're doing with data.
Definitely.
Tim: [00:46:23] I think the subtext of something you said a little while ago is the idea that maybe some things aren't worth the risk of building. Like, you know, the money you're going to get out of it isn't worth the high risk of it all going hideously wrong.
Saskia: [00:46:42] I think in a lot of cases, yes.
But then I think there are a lot of things where it is worth the risk, and a lot of those are the sort of more socially based projects. I mean, again, it's one of the things that we do at Digital Interruption: we have a discounted rate for tech-for-good, charities and small companies,
and we have a special rate for startups as well, because it's usually the smaller companies — the ones doing the innovative and interesting things, who can't afford security — that really fall foul of stuff like this, and the stuff that they're doing is often amazing. So I do think that risk is the main thing: the first thing that you should think about is, what's the cost-benefit of this?
And also not only thinking in terms of the monetary benefit.
Tim: [00:47:23] Right, right, right. Yeah.
Saskia: [00:47:27] Benefit to society is important, and we find a lot that the audience will dictate the level of security as well. I would say apps that are built for monitoring certain things in developing countries — often they won't put a security budget to it, because, well, it doesn't matter, it's not the West.
And that's where sort of inclusion and, you know, care for people's data come in again — surprise, surprise, the West is just considered more important. You know, we all get very upset when we think war is coming to the West, but no one really cares when it's not in the West, and it's the same with the way that apps are developed and secured.
The audience is really, really key for that. One of the things that has been really welcome, in my opinion, from GDPR is that it requires sort of special treatment for any data relating to children. Because, you know, children are kind of idiots, really — that's why they have parents, to keep them alive.
So ensuring that children understand what their data's being used for, or making sure that they have somebody who's consenting for them, I think is really important, because data and applications and technology are basically infiltrating every part of their lives, and it's almost impossible to stop that now.
Tim: [00:48:44] Well, yeah. And I think, particularly for children, there's the lifespan of that data. Like, if you imagine a child having a DNA test now — in 50 years' time, what are you going to be able to do with that DNA? Well, maybe clone a whole child. You know, the span of technological advances based on the data that you extract now could be quite terrifying.
And I think, you know, people need to be quite reticent about giving out long-lived data, because it is going to get used.
Saskia: [00:49:18] And all the photographs — everywhere, children just being shared everywhere. I mean, I was in Plymouth with my best friend — she has an eight-year-old daughter — and I put up a photograph of us; we had a hike to the beach,
and I made Alia turn around.
Right, right.
Tim: [00:49:37] Yeah. No, I mean, exactly — because, like, what are we going to be able to do with those faces? We're going to be able to make live-action videos from them in — well, you know, they can already. Yeah. Cool. Well, so we're starting to talk about what this podcast is kind of aiming at, which is what the future's going to look like — but it sounds pretty dreadful.
Is there any kind of positive stuff that you can see, like, five years out, that you think, you know, will be getting better?
Saskia: [00:50:10] I don't know. I think if we can get the balance right, the future could be amazing. The future could be terrifying. It looks like a dystopian future to a lot of people, because of the reliance that we have on technology — and what happens if that technology goes down? You know, it's already a big fear in hospitals: if everything is connected and that connectivity goes down, what happens?
There are people still in America who are relying on iron lungs, who die if there's a power cut, and they have to have their own backup generators — and if those don't work, there are huge problems. And they've stopped making the parts that actually keep people sealed inside these iron lungs.
So, you know, people are getting left behind by the technology that we have now. And I think the more connected we are, the more open we are to being attacked and to things being brought down by hackers — potentially, the more risk we put ourselves at. But then the benefits are also amazing. I mean, in terms of technology generally, medical technology has advanced massively.
I mean, you and I are having a podcast between two different countries right now, and there's hardly any lag time — you know, that's amazing; that's something that wouldn't have been possible in previous years. But it's what we give up at the same time. And one of the things that I do think about a lot is the imbalance of it.
So technology is great and it's helping loads of people in the West, but actually, in developing countries, is it helping that much? If you look at North Africa, for example, the mobile networks in North Africa have always been a lot better than they are in the West, because we had wired networks to begin with, which are much older and were expensive to replace.
So actually a lot of the innovations are coming from, you know, different countries and different cultures. To me, it's just impossible to say. I mean, every generation harks back to theirs, saying it was the best one,
but whilst we're suffocating, with Australia on fire and our seas all now made of plastic — I don't know. If we can't start using technology to clean things up, it probably won't matter soon anyway.
Tim: [00:52:13] I think — I don't know. One of the things that's cropped up on this podcast a couple of times, and which I think is an interesting trend, is the ability to reuse old equipment by hacking the existing firmware.
So, like, a positive hacking-for-good story is that you can take an old — like, ten-year-old — router, reflash it with OpenWrt, and it does a bunch of things that the manufacturer never intended it to be able to do, and you can extend its life. You know, the manufacturer is probably out of business now, but you can extend its life and keep using it to do new things for another five, ten years.
And so I think that sort of positive aspect of the hacking world does exist, and I'm kind of hoping that we see more of it, because it's good in a ton of ways — you know, not least because of the resource issues: you're not sending this router to the landfill.
Saskia: [00:53:18] Yeah. I mean, it's about the scalability of things as well. So one of the things that we've been doing recently is taking old mobile phones that we've bought en masse from eBay, and we've been taking the chips off them and reading the chips, and seeing all the information and data that's been left on them.
So, I mean, there's going to be a security implication for that. You know, we recommend that people don't use out-of-date hardware: once it stops being updated, the security updates won't take, so it becomes insecure. Again, it's this constant cost-benefit analysis; it's this constant risk assessment and risk analysis.
You know, even as professionals in infosec we can't agree — we're constantly arguing about it. And I think that adds to the unpleasantness of the environment and the lack of trust that people have. There isn't a black and white with this, and it's subject to so much opinion.
And that's because we're not regulated in the way that, say, medicine — since we've touched on medicine — or law are. Those codes of ethics, codes of conduct, codes of practice: there are certain ways that you have to behave. We don't have that in our industry, so people just randomly say what they wish,
and again, that sort of reduces the consistency of what we're saying. We're not consistent in what we say, and that means it's difficult to work out what the right advice is, really.
Tim: [00:54:43] Yeah. One of the other things that we keep coming back to in this podcast, which I'm absolutely fascinated by, is the non-governmental, typically charity, organizations that organize a sector for its own benefit —
kind of like, if you think about the medieval guilds, it's that sort of mindset. We came across, what was it, the British association for amateur rocketry — right, exactly — and they organize it so that if you want to build a rocket in the UK, you follow their code of conduct, and if you do, then you can get it insured.
Right. So you look at that, and you think the same thing exists in the craft space: the metalwork guilds and the leather makers' guilds have these codes of conduct, the professional ethics, backed by insurance and membership schemes and all of that.
And somehow — maybe it's just that we're not old enough as an industry — but somehow infosec is so fragmented that it doesn't seem to do that. And it really needs to, yeah.
Saskia: [00:55:57] They're trying to put it in place in the UK at the moment — the government's put funding out, and it's gone to the Institution of Engineering and Technology to put together a cyber security council — but it's going to take time.
I used to do this for a living a long time ago — I used to put together national occupational standards for different industries — and it takes time to map an industry. And at the moment, I think we need to understand what the industry is before we try and police it. And we don't understand what the industry is.
We need to have an agreement, a contract, between education and industry to say: these are the skills that are required, and when you put these skills together they create a competency, and this is what a competent professional looks like. And we haven't done that. And, sort of going back to what we were talking about at the very start of the podcast, about the skills gap —
it's really complicated and really nuanced in this industry, because maybe there isn't a skills gap from the point of view that there are enough people who want to do the work. The skills gap is that no one knows what they're supposed to do and what skills they need to do it. So we have all these new entrants to the market being told by education, by bootcamps, by, you know, CREST — all these different certification boards; there's so much out there — that this is what they need to do. So off they trot and spend a thousand pounds on a certification, and then they can't get a job at the end of it, because industry's moved on, or because that's now out of date, or because there's no specific definition even for what's required of a pen tester.
It can change from company to company, and it definitely changes from country to country. And I think we do need to understand a little bit more about what our objectives are, because we aren't objective-focused in security. Red team: pretty much "hack all the things" — "hack the planet" seems to be what we say.
And blue team: we just say "defend all the things". But that's not an objective.
Tim: [00:57:59] No, no. I mean, you know, realistically the objective should be: keep the company in business, follow the company's objectives, and the security should drop out of that. But somehow we've never connected it up at that level, I don't think.
Saskia: [00:58:17] I think we're starting to get there. And I think the more conversations we have about things like the skills gap, and the more conversations we have about career pathways, the better. In my opinion, what we're doing at the moment isn't making a difference and won't make a difference.
And starkly — that's not just anecdotal — when Digital Interruption decided last year to do a one-week work placement, we paid 500 pounds to have someone come in for a week and work-shadow on a pen test, and we got over 40 applicants. A lot of them had already been through education —
I think five of them had come from a bootcamp — and I would expect that, coming out of a bootcamp, you would be put in front of an employer and at least have an interview, not need to go for yet another work experience placement. So that made me quite sad. Now, I believe the big company in question is actually reassessing their courses, which is great —
you know, they're taking on board the outcomes and trying to get it better. But one of the things that we've been looking at is — maybe, I don't know — I think we need to work together as a community. We need to get the right people together, and we need to get the government to make an agreement with us about what the occupational standards are for this industry before we move on to the next stage. And we're not doing that,
and we really need to. We need to map the industry; we need to understand what the requirements are. You know, we keep talking about 11% women in the industry, but then some people are saying it's 20%, and then (ISC)² is saying it's 24% — but that's only if you happen to have more than 25% security in your job role, which is just moving the goalposts to make it look like we've achieved an increase in women in the industry when we actually haven't.
Tim: [00:59:58] Yeah. I mean, arguably every developer has 25% of security in their job role, and so do all the UX designers, and, like, you know, you could —
Saskia: [01:00:06] anyone who works in a call center.
Tim: [01:00:07] Right, right. So you can smear that statistic any way you want. And, yeah, that's a little bit gloomy, but maybe that's a good place to wrap up. So, cool — listen, thank you so much for your time. I hope you had some fun having the chat anyway, 'cos I did.
Saskia: [01:00:26] Absolutely.
Thank you so much for having me. Yeah.
Tim: [01:00:29] Lovely. Brilliant. Thanks. So thanks very much.
Saskia: [01:00:32] Great. Bye.