Highlights Reel feat. Beverley Roche, Sunil Saale, Nigel Hedges and Chirag Joshi
While Gar O’Hara and your regular podcast hosts take a short break, we put together some of our favourite segments from past episodes that we think deserve another listen.
The episode features Beverley Roche, who talks to Gar about how humans need to be at the centre of AI and cyber security; Sunil Saale on the balance between security and enabling employees; Nigel Hedges on the value of automation in cyber security; and Chirag Joshi on how to get cyber awareness support and buy-in from upper management.
The Get Cyber Resilient Show Episode #62 Transcript
Matt: Welcome to the Get Cyber Resilient podcast. I'm Matt Sprague, producer and editor of the show. Garrett O’Hara and your regular podcast hosts are taking a well-deserved break for a couple of weeks. So we've selected some of our favourite segments from past episodes that we think deserve another listen. This week, we've put together a series of conversations with some amazing guests around the topics of cyber awareness and the human versus machine aspects of cyber security.
The episode features Beverley Roche, who talks to Gar about how humans need to be at the centre of AI, Sunil Saale on the balance between security and enabling employees, and Nigel Hedges on the value of automation in cyber security. But first up is Chirag Joshi, director of the ISACA Sydney chapter, talking about how to get cyber awareness support and buy-in from upper management. Hope [00:01:00] you enjoy. Over to the episodes.
Garrett O’Hara: [00:01:04] What are some of the, the ways security leaders can change the outcomes that you're seeing through, well, really, what you've described as allies, um, in the business or the organisation that they work in?
Chirag Joshi: [00:01:15] Yeah. Look, I mean security and IT, I mean, thankfully we're now at a point where we don't have to say over and over that cyber security is a business problem, not a technology problem. I think most people understand that now. Uh, but it also now comes to, we have people in business who play very similar roles to what we play in cyber security. You know, part of it is also assurance. So we have some natural allies when it comes to, you know, risk practitioners, audit professionals, compliance teams, who are, you know, part of the assurance group.
Uh, but there are also allies in corporate communications, in marketing, who live and breathe in this space. They know how to target stakeholders. They know how to create catchy [00:02:00] messages. And I think we need to take their help as much as possible, but we also need to, kind of, you know, what I talk about is an approach of cheat sheets. And what I mean by a cheat sheet is, you know, can you articulate a security message in just five or six simple bullet points, which any team leader or a manager can read through before starting a meeting.
So, you know, I think that helps cascade the message better. And security champions initiatives, which are quite popular, you know, relate to having security advocates within the businesses, and that absolutely holds true. But I feel sometimes these programs fail because we haven't truly articulated what they're trying to achieve. Yes, find your security champions, but what are you running on? What is the thing that you want to get out of them? And I think having these cheat sheets, having specific targeted outcomes, helps in those areas. So there are groups within your organisations you can take assistance from.
And I think legal and HR are also key players in this game [00:03:00] because they have a stake as well in this problem. So I think that's where we need to harness the power of allies and use them to cascade our messages, because a lot of messaging already happens in organisations; really, security can just tag along on it. We don't have to create new channels all the time. All of the rules we've talked about so far are around simplicity and targeted messages. If we can create a message which is that simple, then we can take advantage of existing channels.
So that's internal allies, but then also external allies. I mentioned ISACA, which I'm a member of and, you know, a director of, but there are other organisations like the Australian Information Security Association, (ISC)², you know, others who have a lot of good content. I think those memberships help you grow and help you solve some problems which other organisations have tackled. Uh, and then there are government resources.
I think in Australia we've been fortunate that the government has put out a lot of good cyber awareness and cyber information materials, be it Stay Smart Online, Scamwatch, the eSafety Commissioner I mentioned. [00:04:00] There's so much free content out there, absolutely free and really good content. So I suggest you should not feel isolated and try not to solve this problem in isolation as a security team. I mean, there are people willing to help, and people in the organisations outside who we can absolutely leverage.
Garrett O’Hara: [00:04:19] Yep. No, great, great points. And you know, we've sort of talked about this a little bit earlier, and it's an idea that I'm a big fan of, I think you described it as just-in-time training. And, you know, there's this idea that when you've got a, call it a coachable moment or a learning opportunity, that's really the best time that somebody can be presented with some sort of educational information, and, you know, there's kind of a repetition there. Um, what are some of the practical ways that an average business can do that? I think you mentioned, you know, executive assistants earlier, but what are some other ways that a business could approach that just-in-time education?
Chirag Joshi: [00:04:58] Well, look, I think there are [00:05:00] a few practical things, you know. Like when you're trying to change your password, having a banner there, in a simple way, saying, oh, if you had a password manager you wouldn't have to remember this password, you know, click here to download one. That's a very simple example, but there are others, you know, when it comes to collaboration tools, like a lot of organisations are now leveraging Office 365 or, or Google solutions for collaboration and document sharing.
I think having a clear kind of banner or training pop-up there, which tells people, hey, did you know a lot of data breaches happen because people accidentally send emails to the wrong person or accidentally share information, you know, use this data protection labelling, mark this as confidential, or double-check the send address. I think those are the areas where we can practically start now. And if you've thought about what your key risks are and what outcomes you're trying to drive, this becomes quite a useful exercise. Uh, one of the [00:06:00] things that, you know, I've recently worked on is getting people to start using data protection labels.
So, you know, in the organisation I work at, if you use those labels, not just will the watermarks appear, but also, you know, it'll be encrypted by default, it'll have access permissions by default, and all of that is fairly effortless, because all people need to do is click on a label icon and just mark the thing as confidential. But, you know, it's not intuitive. It does not happen automatically. Uh, and that's where, you know, that kind of just-in-time training comes in; people don't need to learn about that in the induction training, right?
I mean, it's fine, we can talk about it maybe, but when they will actually need it is when they start using it. And just before an email, if a quick pop-up comes up, you know, you don't have to annoy people, but showing it the first time they're logging onto a system, or, you know, maybe refreshing it every few months, I think that helps. So basically those are the things I'm driving at. It doesn't have to be complicated. It can start with something as simple as [00:07:00] the things I've talked about.
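The just-in-time pattern Chirag describes, prompt at first use, don't nag, refresh every few months, can be sketched roughly like this (the 90-day cadence and the function name are illustrative assumptions, not anything stated in the episode):

```python
from datetime import datetime, timedelta
from typing import Optional

REFRESH_INTERVAL = timedelta(days=90)  # assumed "every few months" cadence

def should_show_tip(last_shown: Optional[datetime], now: datetime) -> bool:
    """Show a security tip on first use of a feature, then only after the
    refresh interval has lapsed, so users are nudged but not annoyed."""
    if last_shown is None:  # first time the user touches the feature
        return True
    return now - last_shown >= REFRESH_INTERVAL

now = datetime(2021, 6, 1)
print(should_show_tip(None, now))                       # first use: prompt
print(should_show_tip(now - timedelta(days=10), now))   # seen recently: skip
print(should_show_tip(now - timedelta(days=120), now))  # stale: refresh
```

The same gate works for any coachable moment (password change, first document share), with the state stored per user and per feature.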
Garrett O’Hara: [00:07:02] Yep. Phenomenal. And all of the things we've talked about so far, they're all gonna need the support of senior management, of the ExCo. They're gonna need the support of the people who are gonna buy into and sign off on a program of works for change management or awareness training. Like, the million dollar question is how do you get the buy-in, and the continued buy-in, from senior management?
Chirag Joshi: [00:07:25] Well, that's the key one, right? I mean, without that, nothing is gonna succeed. Uh, and this is where all of these rules, you know, in a way are linked. It all comes back to, remember how we talked about, you know, what happens if the organisational objectives or awareness objectives are not achieved? And I talked about the fact that, you know, at some point the funding is gonna come into question. And executive support is not just about funding. It's also about walking the talk and actually being visible and advocating security practices openly.
Uh, how do you [00:08:00] get that? You know, I think that starts with engagement at the right level, and it starts with clearly tying in with our strategic organisational objectives, and security needs to do a better job of that. We cannot talk in generalities about these types of issues. If our objective is to, hypothetically, be future-proof by leveraging best-of-breed emerging technologies, well, security needs to show how it's gonna tie into this picture, probably by, you know, changing our operating model or changing our device model.
And, you know, I think that's the kind of conversation that needs to go into security strategy and principles. So once we've created that linkage, that helps get the initial buy-in. But then we also need to, progressively through our reporting, demonstrate how our efforts and investments are shaping up. And that's where, you know, the SMART objectives and metrics will help. You know, what gets measured gets managed, and what gets managed shows [00:09:00] value. And I think that's where you start showing value. Uh, so that needs to happen.
But then the execs and the senior leadership also need to be visible, and they need to talk about security, not just in generalities. Again, I think they need to be seen doing specific things, and as I've talked about in the book, you know, have them show a security practice, maybe how easy it is to download, you know, a secure app on their phone. Uh, when people see this in action, I think that drives the right outcome. Uh, and look, in some industries there is a concept of safety moments, right? Especially, you know, in the energy industry, where people take physical safety very seriously.
Uh, through the years, and because we had people in the field in many of these areas. I think that same idea coming in, where if we start a meeting, you know, with an online safety moment, for example, I think it's a powerful message we're sending. So I think it's a two-way street: one, we need to demonstrate value as security professionals, but [00:10:00] then I think the other side is equally important. The senior leadership needs to understand how critical their role is in this picture, and they need to be seen visibly exhibiting these behaviours.
Matt: [00:10:14] Next up is Beverley Roche, who is currently the interim CISO at CyberRisk and also hosts the Cybersecurity Cafe podcast. She and Gar discuss how humans need to be at the core of AI and the implications of machines making all the decisions.
Beverley Roche: [00:10:31] AI is just absolutely brilliant. And we know that AI is gonna solve some really big problems for us, but we need to remind those that are developing all the AI that the human needs to be in the centre of it, that we need those ethics. We almost need an ethics mantra, or some sort of sign-up to [00:11:00] this, because the ethics and the guardrails are the things that are really kind of bothering me a bit. Thank you for that question.
Garrett O’Hara: [00:11:10] Yeah. Like, on the AI side of things there are some very, I think, interesting conversations starting to happen around how we deal with the, you know, the robots. And it's almost back to, you know, Arthur C. Clarke, I think it was, wasn't it, with the AI robot stuff and, you know, the rules of robots, the idea that they had to protect human life. But then you get into the, I can't remember, is it called a tram, the trolley problem? You know, the one where you have to choose whether you, you know, pull the handle to make the trolley go left or right.
And, you know, you've got a choice between killing multiple people or one person, and, you know, the, the sort of ethics of that. But, you know, parlaying that then into things like autonomous vehicles, where, how do they make the decision to maybe, you know, choose which human [00:12:00] life, potentially? It's an absolute minefield, it feels like, with so many pitfalls that could be there.
Beverley Roche: [00:12:06] Absolutely. And that's what I'm worried about. I guess that's my biggest concern: I don't want robots to make those decisions. They shouldn't be part of the decision-making process. That should be an alert for human intervention. Just as, you know, we've got all these fantastic automated tools that tell us what's going on, we want the humans to make the decisions.
The automation piece is great, but the decision-making about those big ethical issues, that's the bit we want humans to be in. Um, an alert: I need a human here, a human needs to make that decision. Hence this idea of, you know, kind of the ethics. [00:13:00] But, you know, growing up in Ireland, you've seen some pretty amazing human behaviours, right? I'm thinking you've probably got some interesting stories about what humans do and can do.
Garrett O’Hara: [00:13:15] Definitely. Um, and definitely what humans can do after 10 pints of Guinness is something to behold. [laughs] It's funny though, we're talking about AI, and, you know, before we started recording we got chatting. And one of the things you mentioned was some of the stuff that's happening around, you know, the big tech companies out of Silicon Valley and some of the commentary that's been happening around news and propaganda.
Um, and I know that there's a documentary on Netflix at the moment called The Social Dilemma, which talks to some of the AI stuff in the background, you know, that does the YouTube curation of recommendations and, you know, what shows up in your Facebook feed. And yeah, I'd be [00:14:00] very keen to get your thoughts on that, 'cause you know, we sort of started talking about it, but we didn't really get to get into it.
Beverley Roche: [00:14:05] Thank you. Well, look, the thing that's impressed me this week, that's really newsworthy, is someone who, you know, I see as having an intellectual job but who also has an awesome sense of humour, and that would be Sacha Baron Cohen. And this week he's been very vocal on the Silicon Six and propaganda. But not only just propaganda; you know, we know that the longer we stay online with some of these technologies, the more we can lose a fresh perspective, if you like.
Garrett O’Hara: [00:14:54] Mm-hmm [affirmative].
Beverley Roche: [00:14:54] And you know, I do it with Netflix. You know, I don't have to click another [00:15:00] button, or two clicks or three clicks, to watch the next episode; it's rolling into the next episode. YouTube does the same thing. They're lining up everything online so that we stay online longer, 'cause if we stay online longer, we're likely to view their ads longer. You know, there's all sorts of things going on here, and there's a great TED Talk as well.
And pull me back if I'm deviating too much, but it really is timely. It really is timely for us to start considering the influence, not only the Cambridge Analytica issue, but the manipulation and the fake news that's going on, and understanding how that influences the [00:16:00] decisions that we're making, especially around our social justice, basically. So I think that's really important.
Garrett O’Hara: [00:16:10] I agree. And I don't think you can kind of unpick it; like, everything you've talked about there, to me, relates directly to our industry as well. Like, it's got a very material effect on, I would say, security and resilience, and particularly human resilience. You know, it's one of the things we haven't really gotten to, but I know it's one of the things you're passionate about: the human side of cyber security. And, you know, there's this whole wealth of conversation I think we could have around that. When it comes specifically to cyber security, what do you think we've maybe not gotten right so far?
Beverley Roche: [00:16:48] Oh, [laughs] that's such a good question. Um, we've taken a while to put the human in the cyber [00:17:00] security mix, if you like. You know, I can certainly talk to you about what we are doing about it now, but it really took us a while. We really hadn't put humans in; we thought it was all about tech, you know, that if we get that technology piece right, our problems are solved. And, you know, we know that, look, if we've got good working brakes on our car and the windscreen wipers work and the engine is well serviced, then, you know, it's good to go.
But now there's a human behind the wheel, and we've got very little control over what's going on. And that aha moment happened really, uh, about five years ago, where we started thinking, okay, tech is starting to really mature, [00:18:00] what's next? What do we have to do next to help us solve this issue? And you know, on the human side of cyber security, we've got a lot of people working on it now. And I think our best chance of surviving a cyber attack is humans, because we're really good in a crisis. As humans, we really are.
Garrett O’Hara: [00:18:31] Mm-hmm [affirmative].
Beverley Roche: [00:18:32] You know, when we understand what we need to do. And I think it's something that's a little bit unique to Australians as well, because we live in a country with a lot of challenges. It's a beautiful place, but, you know, we have bushfires, and the way that we deal with the challenges that we have, if we can apply some of those things to cyber [00:19:00] security, I think we've got some really good things to showcase.
Matt: [00:19:05] In this next segment, Sunil Saale, head of cyber and information security at MinterEllison, speaks about the changes COVID has brought to cyber security and how this has accelerated the need for more controls and solutions to manage external digital collaboration tools.
Garrett O’Hara: [00:19:21] It definitely does feel like COVID has really been a catalyst for things that were trending anyway, right?
Sunil Saale: [00:19:26] Yes.
Garrett O’Hara: [00:19:26] I mean, working from home was on the uptick. Uh, most organizations were looking at flexible work arrangements. It was a big requirement for, you know, hiring good talent and being attractive to the top end of talent; in any organization, they tended to want some version of flexible work arrangements. And it sounds like MinterEllison was kind of moving towards that anyway. Like, mobile working, remote working, was that a trend for you guys?
Sunil Saale: [00:19:52] That, that was the trend. So we had a flexible work policy, but there was no push to work from home. It was more of a [00:20:00] policy, an additional benefit for us to work from home. There was no push in that you can only work in the office a couple of days a week, sort of thing. So it was an additional benefit of, you know, you can work from home anytime you want. But with COVID now, it's, it's not a mandate, it's more saying that, you know, you only come into the office if you need to be in the office.
And many of our staff, like at many other organizations, are now working from home because their work-life balance is, I think, much more real now than in earlier times, when you had to jump on the train and sit on the train for an hour to reach the office, and then rush home to pick up the kids and things like that. Now you can use all that time to spend more time with your kids, and also spend less time on the train with strangers, right? [laughs]
Garrett O’Hara: [00:20:49] Yep. Yeah, 100%. We've seen the Australian Bureau of Statistics, where, you know, they do these surveys around COVID and they ask Australian [00:21:00] citizens, you know, various questions about the COVID experience. And one of the questions is, post-COVID, you know, what are the things that you want to continue? Like, of the changes that have happened, what do you wanna see stay in place?
And one in four Australians kind of say, actually, working remotely, you know, working from home, just makes my life better. And I've certainly seen plenty of coverage in many of the larger newspapers in Australia talking about this thing where, for many people, exactly what you said, you know, they're spending two, three, four hours a day sitting on trains, going places to often do what we're doing right now, which is sit on a Zoom session-
Sunil Saale: [00:21:36] Exactly, yes.
Garrett O’Hara: [00:21:37] ... with somebody who's remote anyway. Um, which is quite astonishing.
Sunil Saale: [00:21:40] Yeah. And I think the other aspect of that, I mean, we were heading towards, you know, a cloud-first approach anyway, but with COVID, VPN was sort of a bottleneck for us in some ways, because all of the data was still hosted on-prem. So it highlighted that need, that sort of shortfall from our strategy perspective: if someone's working remotely, you have to enable them. We have to enable them to get access to the data they need at the time they need it.
So we can't really have this VPN as a bottleneck. So it, it sort of threw a spanner in the works; we're rewriting that strategy on, you know, how do we process data? Where do we process data? And things like that. It's more about enabling our remote workers and giving them access to the data they need at the time they need it. And the cloud-first approach, that's sort of supercharged now.
Garrett O’Hara: [00:22:36] Yeah.
Sunil Saale: [00:22:36] So every solution that we look at, the first question is, is it hosted, you know, in the cloud? And what identity and access management is supported by that platform? And can we give access to a remote worker without VPN, using conditional access and things like that?
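As an illustration only (the field names are assumptions, not MinterEllison's actual checklist), the intake questions Sunil lists could be encoded as a simple cloud-first assessment:

```python
def assess_solution(solution: dict) -> list:
    """Return the cloud-first questions a proposed solution fails."""
    failures = []
    if not solution.get("cloud_hosted"):
        failures.append("not hosted in the cloud")
    if not solution.get("supports_conditional_access"):
        failures.append("no conditional access for remote workers")
    if solution.get("requires_vpn"):
        failures.append("still depends on the VPN bottleneck")
    return failures

# A legacy on-prem tool fails all three questions; a SaaS tool with
# conditional access and no VPN dependency passes cleanly.
legacy = {"cloud_hosted": False, "supports_conditional_access": False,
          "requires_vpn": True}
saas = {"cloud_hosted": True, "supports_conditional_access": True,
        "requires_vpn": False}
print(assess_solution(legacy))
print(assess_solution(saas))  # → []
```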
Garrett O’Hara: [00:22:55] It's completely changing. I've been trying to think about how, I am [00:23:00] quite a visual person, and, you know, I think pre-COVID, in a way, and it's, you know, nothing novel, I kind of saw a perimeter. You know, even though everyone says the perimeter is kind of going away, blah, blah, blah, we've been saying that for years, but there was a kind of mental picture I had of some version of a castle and moat.
And there was a gate where things like VPNs would let stuff in, let stuff out, and your company was inside the walls. And now it feels like it's almost like a membrane, where, like, there's your company and not your company, but there's a membrane that's sort of letting things through at various different places, at worker levels and data levels, across the membrane, but doing it securely. Um, so yeah, it's such a different world, it feels like, compared to this time last year.
Sunil Saale: [00:23:39] Yep. And also, when you think about it, say we have courts using BlueJeans, and in some cases they want to use WhatsApp. So-
Garrett O’Hara: [00:23:48] Yeah.
Sunil Saale: [00:23:48] ... it's, you know, out of all the consumer apps, the collaboration apps are coming to the forefront now. So we used to use, sorry, WebEx, and we [00:24:00] are using Teams internally now, and BlueJeans, Zoom, and everything else that wasn't heavily used before. And even for our staff, if we had to meet a client, we'd go to their office.
Garrett O’Hara: [00:24:11] Yeah.
Sunil Saale: [00:24:11] Or we'd invite them to our office. So that was the norm before COVID. Now we sit in meetings pretty much all day; WebEx meetings have gone through the roof, right? So pretty much every day, everyone is sitting in meetings throughout the day, and in the evenings or late at night they're working. So when the courts push WhatsApp or BlueJeans and other collaboration apps, it becomes, from a data perspective and a security perspective, well, those are essentially consumer apps we need to assess and understand: you know, how do we keep a record of those calls? How do we keep a record of the instructions exchanged on WhatsApp?
Garrett O’Hara: [00:24:47] Mm-hmm [affirmative].
Sunil Saale: [00:24:47] It all adds up to the archiving process as well. So we're looking at different solutions, saying, how do we look at, you know, archiving WhatsApp messages? If a client messages us, you know, sends us an instruction on WhatsApp, how do [00:25:00] we keep a copy of that? And how do we keep a copy of a BlueJeans call, or, you know, if someone chats on Zoom, how do we keep a copy of that? Where do we file that? Because we've gotta maintain a copy of pretty much every conversation that we have with a client.
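A minimal sketch of the archiving problem Sunil describes: whatever the channel (email, WhatsApp, a Zoom chat, a BlueJeans call), a client instruction gets normalised into one record format so a copy is always filed. The field names here are assumptions for illustration, not any firm's actual schema:

```python
import json
from datetime import datetime, timezone

def archive_record(channel: str, client: str, content: str) -> str:
    """Serialise one client communication into a channel-agnostic record."""
    record = {
        "channel": channel,
        "client": client,
        "content": content,
        "archived_at": datetime.now(timezone.utc).isoformat(),
    }
    return json.dumps(record)

# A WhatsApp instruction and a Zoom chat line land in the same store,
# with the same retention metadata attached.
rec = archive_record("whatsapp", "Client A", "Please file the motion today.")
print(rec)
```

The hard part, as the conversation goes on to note, is getting the content out of consumer apps in the first place; once it is visible, filing it uniformly is the easy half.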
Garrett O’Hara: [00:25:16] That generates a massive amount of data, doesn't it, when you think about it? And you've raised a really interesting point around that operational use of shadow IT, because we talk about it from a security perspective quite a lot, right? I mean, it's not a great thing to be dropping potentially confidential files in, like, a Dropbox, for example. Um, but you've raised a really interesting point around the data governance side of things.
If it's end-to-end encrypted, you know, a consumer-grade communications application, what is the mechanism for doing that other than policy? Is it that you have to say, hey, look, you just can't use WhatsApp to interact with clients? And would that even work? Um, sorry, have you guys come up [00:26:00] with a solution? You've got my brain firing here now, just trying to think through what you've just said.
Sunil Saale: [00:26:05] No, no, that's spot on. I mean, that's a problem that we are working through right now. So we have a data governance issue, not, not an issue, it's a data governance group that we set up. And we are looking at all the different types of data we store, the data we receive from our clients: how do we get it? Is it just through emails? Do we get it from WhatsApp?
WhatsApp was one of the cases; one of the courts wanted to use WhatsApp. And how do we store it? And how long do we store it? And all the retention policies and those ones. So one of the core concepts that we're thinking of is: if we can't see data, we can't protect it. So from a user perspective, our lawyers are pretty good, so they understand this properly.
So as soon as a client says, "Can we exchange instructions on WhatsApp?" they send a note to our risk team and say, "Is it okay to use?" That's when we get involved and say, "Yes, that's fine [00:27:00] to use, but how do we, you know, how do we keep a copy of it? And what terms do we have with our clients?" So it becomes an issue where, you know, we are entering into consumer-grade solutions, consumer apps which could have their own privacy issues.
So it becomes a bit difficult. We can write a policy, but again, a policy is only as good as how much you can enforce it. With consumer apps, you can look at shadow IT, but a policy is only as good, you know, up to the point where we can see the data and, you know, see the information. As soon as we go into the consumer privacy space, it becomes really blurry, and it becomes quite a grey area as well. It's now privacy data.
Garrett O’Hara: [00:27:41] Yeah, 100%. And Zoom is a pretty good example of that, I think. Excuse me. Like, as COVID hit, you know, we were users of Zoom, and, you know, many organizations were, but I think it largely wasn't really ready for what happened, which was, all of a sudden, very large [00:28:00] organizations were using it for very confidential communications. And like, my personal take, it was, despite the, let's, what's a nice way to put this, the interesting marketing language that was used around some of their encryption-
Sunil Saale: [00:28:13] Mm-hmm [affirmative].
Garrett O’Hara: [00:28:13] ... like the interpretation of end-to-end encryption, for example, it was, air quotes, "interesting." Um, but I would say the sort of panic over things like Zoom bombing, I mean, that was easily fixed. You know, it was really just, it wasn't set up the way it should've been.
Sunil Saale: [00:28:31] Yes.
Garrett O’Hara: [00:28:32] Um, and I think, I mean, personally, Zoom has come a long way, but I think there are probably better options, like Deco Secure, which is a local organization. [crosstalk 00:28:40] Um, yeah, they're doing some really good stuff. Um, I've used their VC a couple of times and really liked it. Um, but I guess the point is, when it comes to Zoom, that's controllable by folks like your teams, but with something like WhatsApp, there's, you know, no [crosstalk 00:28:58] access [00:29:00] to that. It's literally a black hole.
Sunil Saale: [00:29:02] Yes. Yeah. And the issue ends up being, you know, if we have any sort of uh, communication with the client, or if the client talks to us on WhatsApp, as soon as the staff, you know, if they leave the firm, we have no record of it.
Garrett O’Hara: [00:29:18] Yup.
Sunil Saale: [00:29:20] So unless, you know, we actively ask the staff to file it somewhere to keep a record of it, we just don't have a copy of it at all. It is a tricky area.
Matt: [00:29:28] In this final segment, Gar speaks with Nigel Hedges, head of information security at CPA Australia and adjunct professor of cyber security at Deakin University, on the value of automation and the machines versus the humans in cyber security.
Garrett O’Hara: [00:29:46] I suppose, uh, a pointed question: how real are the benefits of automation? Is it, in your opinion, a real thing?
Nigel Hedges: [00:29:55] Yeah, I definitely think so. Um, I think for [00:30:00] not only small organizations, where 24/7 coverage and SOCs are a bit harder to realize financially, but also the larger organizations that have, you know, multiple SOCs, or a SOC and an internal team, and then an external team that supports that SOC in rotation across different geographies and time zones, and it's all very expensive. Um, but beyond the expense, it's just that you're still dealing with people and, you know, mistakes are made. But I believe that automation is a really powerful thing.
I think that if an account compromise happens at 3:00 AM and automation picks that up, then deactivates the account and sends a notification to that person's manager, or whatever the process is, uh, [00:31:00] and that gets picked up in the morning, I mean, that's handled in close to real time, as opposed to somebody picking it up in a 24/7 SOC, uh, which still has value, by the way. I'm certainly not suggesting that SOCs are not valuable, but I think you just get a lot better bang for buck out of automation. Um, it is also very hard to do. You know, there's platforms that are getting better at automation, um, but there's a little bit of work to do to bring it up to parity, I think.
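The 3:00 AM containment flow Nigel describes could be sketched roughly as below. This is a hypothetical illustration only: `directory` and `notifier` are stand-ins for whatever identity-provider and messaging APIs an organization actually wires its SOAR playbook into, and `respond_to_compromise` is an invented name, not a real product API.

```python
from datetime import datetime, timezone

# Hypothetical playbook step: contain a suspected account compromise,
# then hand off to a human. The directory/notifier objects are stand-ins
# for a real identity provider and messaging system.
def respond_to_compromise(alert, directory, notifier):
    user = alert["user"]
    # Contain first: deactivate the account so the attacker loses access.
    directory.disable_account(user)
    # Then notify the user's manager so a human picks it up in the morning.
    notifier.send(
        to=directory.manager_of(user),
        subject=f"Account {user} disabled after suspected compromise",
        body=(
            f"Automated containment at {datetime.now(timezone.utc).isoformat()} "
            f"for alert {alert['id']}. Please review per the IR playbook."
        ),
    )
    return {"user": user, "action": "disabled", "alert": alert["id"]}
```

The key design point is the ordering: containment happens immediately and automatically, while judgment calls (was it a false positive? should the account be restored?) are deferred to the human who reads the notification.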
Garrett O’Hara: [00:31:32] Yeah, I definitely get it. And one of the things I've seen is, I suppose, what feels like a more rational or realistic approach to the outcomes of a SOAR, for example. And I think it was Ernst & Young had an article, a little while ago now, but they talk about the reality, I suppose, of getting to a place where a SOAR is meaningful to security operations, and it isn't quick, and it does take fairly well documented and [00:32:00] well understood response playbooks. You know, you can't automate everything. And just so people listening know, you're nodding on video. [laughs]
But there's the reality of the potential for a huge amount of false positives, or things that are actually really detrimental to the business if they're automated. Um, what are your thoughts on that in terms of timeframes? And I know every company is different, every organization is different, but are there things that maybe you've seen work really well from an automation perspective, or things that maybe you were thinking about?
Nigel Hedges: [00:32:29] Yeah, for sure. Um, so, yeah, we recognized that everyone can get better at incident response. So we revised our response plans in the last 12 months and took a fresh look at the core risks that we felt, from a cyber perspective, would be major incidents, and that kicked off the development of half a dozen or so playbooks. [00:33:00] Um, and then from there, having the playbooks, you can look at, you know, what can we automate?
Garrett O’Hara: [00:33:02] Mm-hmm [affirmative].
Nigel Hedges: [00:33:03] Um, so there's been some good quick wins. Um, you know, when you think about phishing attacks and the delivery of ransomware and malware via phishing attacks and so forth. Um, most of the time, there are really good platforms out there for email security; they filter all that stuff, and that's wonderful, but there's no silver bullet. So occasionally things get through. Um, and it's what you do when that happens. So say there's 144 emails that come through on a particular day, sent through to a whole bunch of people in your organization.
And you get a report from somebody two minutes later saying that email looks suspicious, and it's reported. Uh, usually you have a person that then says, "Okay, I need to remediate [00:34:00] this." They have to talk to the infrastructure person who is in charge of Exchange and rip out those emails from mailboxes. And that takes some time. And in that timeframe, probably half a dozen people have clicked on those emails. [laughs]
Garrett O’Hara: [00:34:12] Yeah.
Nigel Hedges: [00:34:13] So, you know, looking at automation for those scenarios is something that we've looked at and implemented. So there is the ability for us to flag that as soon as possible. And then, because there is still risk associated with those things, you could potentially delete files and emails that you shouldn't, so the process has to be totally regimented and checked. Uh, and the same for dynamic firewall blocking rules.
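The regimented clean-up Nigel outlines, where a reported phishing email is automatically found and pulled from every mailbox, but with a guard against over-reach, could look something like this sketch. Everything here is illustrative: `mail_store`, `purge_reported_phish`, and the `max_purge` threshold are invented names standing in for a real mail platform's API and whatever limit an organization's process sets.

```python
# Hypothetical sketch: once one recipient reports a phish, find every copy
# and quarantine it, but escalate to a human if the match is suspiciously
# broad, since deleting the wrong mail is its own incident.
def purge_reported_phish(report, mail_store, max_purge=500):
    # Match on stable attributes of the reported message, not subject alone.
    criteria = {
        "sender": report["sender"],
        "subject": report["subject"],
        "attachment_hash": report.get("attachment_hash"),
    }
    matches = mail_store.search(criteria)
    # Regimented check: an over-broad match stops and escalates instead.
    if len(matches) > max_purge:
        return {"status": "escalated", "matched": len(matches)}
    for msg in matches:
        mail_store.quarantine(msg)  # quarantine is reversible; hard delete is not
    return {"status": "quarantined", "matched": len(matches)}
```

Quarantining rather than deleting, and capping how many messages the automation may touch unattended, are two ways to keep the "checked" part of the process in the loop.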
You know, if some threat intelligence feed that you have detects a massive spike in activity from a certain geography or a certain IP address that you wanna block from a reputational perspective, that could be dangerous. It's, [00:35:00] if you like, cracking a walnut with a sledgehammer. You know, if you block a country to solve a problem and then half of your members are from that country, you could prevent real business. So yeah, you have to weigh those up with some sort of human element of intervention.
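That human element of intervention for blocking rules could be sketched as a simple gate: narrow, low-blast-radius blocks are applied automatically, while broad ones are queued for a person to approve. This is a hypothetical illustration; `apply_block`, the `firewall` object, and the indicator shape are all invented for the example, not a real threat-feed or firewall API.

```python
# Hypothetical sketch of the walnut-vs-sledgehammer check: automate
# single-IP blocks from a threat feed, but route anything as broad as a
# country-level block to a human approval queue instead of applying it.
def apply_block(indicator, firewall, approval_queue):
    if indicator["type"] == "ip":
        # Narrow blast radius: safe to automate.
        firewall.block_ip(indicator["value"])
        return "blocked"
    # Country-wide blocks can cut off legitimate customers: require a human.
    approval_queue.append(indicator)
    return "pending_approval"
```

The design choice is that the automation never decides the contentious cases; it only decides which cases are contentious.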
Garrett O’Hara: [00:35:18] And I'm guessing, as you've kind of moved up the ladder into what is now a senior leadership role, the appreciation of the end user impact probably gets bigger as you go along as well. You know, I feel like, to your earlier comments, as you start out, you're looking at vulnerabilities or you look at actions and it's black and white, it's like remediate, remediate. But actually, as you get further into it, you realize the potential impact to productivity and availability of services, et cetera, is also part of good security.
Nigel Hedges: [00:35:50] Yeah, absolutely. Um, I went to a good session a couple of years ago now where somebody whose business was, was purely [00:36:00] on their website, their website was their business.
Garrett O’Hara: [00:36:02] Mm-hmm [affirmative].
Nigel Hedges: [00:36:03] And they were asked, you know, why they didn't do more for security, you know, to huge levels of security. And the person said that, you know, the business knows that if we put these features into the product, we'll have customers. And if we put in security to the nth degree, then it's certainly gonna slow down our ability to go to market and make money. So there's just a little bit of reality around the business model that you have to appreciate. Um, and as I said a little earlier, we're not in the business of doing security; most organizations are in some business, you know, satisfying the objective of their organization.
Garrett O’Hara: [00:36:54] Yeah, and I definitely get that. And I suppose, thinking about automation, and you [00:37:00] mentioned, you know, 24/7 SOCs, and if you have global presence, et cetera, where do you see the machines versus humans landing, maybe short term, and then as you think forward five or 10 years, whatever it might be? What is it that machines and ML and automation excel at, and then, I suppose, the human side? It'd be good to get a gauge on where you think the needle is at at the moment.
Nigel Hedges: [00:37:29] Um, I think that we've come a really long way. I'm really impressed by machine learning and artificial intelligence. I wish I knew the answers to that question; to be honest, I'd probably make a lot of money. [laughs] Uh, I don't know the answers is my short answer. Um, I think it's moving around very, very quickly. I think there'll always be a requirement to have human intervention.
Garrett O’Hara: [00:37:56] Yup.
Nigel Hedges: [00:37:56] Um, I just don't think it's possible, um, [00:38:00] just in terms of doing due diligence and having the guardrails in place to make sure that machine learning isn't used for nefarious purposes as well. So I think that'll always be there as a safeguard. But it is interesting to see, um, the developments in technologies that are using Python and different types of, um, capabilities to really streamline integration from one solution to the other, so that you get systems getting tighter and tighter.
Garrett O’Hara: [00:38:32] Mm-hmm [affirmative].
Nigel Hedges: [00:38:32] Um, and that's where I see probably the best benefit. And it really means that it's important, when investing in the security ecosystem, that you're partnering with technologies and vendors who have a very open-kimono type of approach.
Garrett O’Hara: [00:38:48] Yep.
Nigel Hedges: [00:38:49] Uh, and not to get too locked into a sort of closed, native security type of arrangement.
Garrett O’Hara: [00:38:56] 100%. I have had many conversations [00:39:00] recently where it feels like the ability to integrate has become something that's bubbling up higher and higher in terms of priority when leaders and teams are evaluating vendor platforms, security platforms, et cetera. It feels like it's changed; to your point, integration, integrability, telemetry, the richness of data have become much more important in general.
Nigel Hedges: [00:39:27] Yeah, absolutely. Um, you know, I think that 10, 15 years ago, we were still talking about firewalls being the perimeter. Uh, and now, I can't remember who coined it, but somebody said identity is the new perimeter, and that sort of stuck in my mind. Uh, it doesn't sort of matter where you're traversing; um, it's usually somebody doing something. Um, so that really drove home a point for me: a model where you have networks that people will [00:40:00] traverse across. You have endpoints where people do things, data in use. Um, you have gateways where people come through particular technologies to do something; that could be email, it could be the web, it could be SaaS platforms.
And then, tying that all together, um, you have some form of SIEM monitoring that can see all those things, um, as you do something on a computer or you traverse the network or you use a gateway. And in the middle of that is a person doing all those things. So if you tie all that together and it's strongly integrated, I think it's gonna help with your mean time to detect or respond to incidents. And it's gonna provide richer information forensically when stuff goes wrong, which means you can react a lot quicker and reduce the damage, in terms of reputational damage as well as just the cost of cleanup.
Garrett O’Hara: [00:40:56] Yeah, which is substantial. And it almost circles right back to [00:41:00] those big logos, you know, hitting the news. And I suppose that's what it's all about: all the incremental changes, the things we can do to pull systems up, to pull them together. And, you know, ideally have people like yourself less stressed about the day-to-day and the potential for, yeah, for something bad to happen.
Nigel Hedges: [00:41:20] Yeah, every day, getting greater and greater. [laughs]
Garrett O’Hara: [00:41:22] Yeah. And [laughs] we all are.
Matt: [00:41:33] And that's our show for this week. Thank you for listening to the Get Cyber Resilient podcast. Gar will be back next week with a brand new episode. And if you're keen to hear any of the conversations featured in this episode in full, check out our back catalog. But for now, stay safe and stay cyber resilient. [00:42:00]