The Get Cyber Resilient Show Episode #19
Gar’s guest this week is Dr Cate Jerram, principal researcher and lead academic on cyber security at the University of Adelaide Business School. Cate’s background in adult education led her to information systems research and the human and organisational aspects of cyber security. Cate also helps design and teach cyber security courses and programs as well as supervising PhD students. Cate and Gar talk broadly on the human aspects of cyber security and the delay in organisations understanding its importance, how cyber posturing is playing into VCs’ desire to put money into startups, and discuss Cate’s latest research.
#cybersecurity #cyberresilience #getcyberresilient
The Get Cyber Resilient Show Episode #19 Transcript
Garrett O'Hara: [00:00:00] Welcome to the Get Cyber Resilient podcast. I'm Garrett O'Hara and this week I'm excited to be joined by Dr. Cate Jerram, Head of the Adelaide Business School, who I connected with during a think tank on Cyber Resilience at the AISA conference a couple of years ago in Melbourne. Cate is the Principal Researcher and lead academic on all cybersecurity research and teaching in the Adelaide Business School out of the University of Adelaide. An Information Systems academic, Cate originally committed to a new discipline of human and organizational aspects of technology implementation primarily focused around cybersecurity back in 2008. Cate's been fighting that battle around the human and organizational aspects being at least as important as the technology aspects of cybersecurity and she's been working to show that it was a critical knowledge area for business professionals as much as it was for technology professionals.
And recently, as in her words, the world caught up, Cate has been busy designing and teaching new cybersecurity courses and programs as well as supervising PhD students. Another passion of Cate's is interdisciplinarity and inter-faculty research and teaching, and it's now happening with cybersecurity, the pioneer discipline area, leading the way. Cate's working with colleagues from various schools across the whole university to ensure that students studying cybersecurity at Adelaide have the ability to study it in its full interdisciplinary nature, rather than in silos, splintered bits and pieces, which we kind of see in our industry. So what a bio.
In this episode, we talk broadly about the human aspects of cybersecurity, the delay in business functions in organizations, air quotes, getting the importance of cyber. How cyber posturing is playing into VC's desire to put money into startups, and Cate's work at the university and her up and coming research topics. So we cover quite a lot. Cate's an incredibly knowledgeable person. I took so much from our conversation and I hope you do too. Please enjoy.
Hello everybody and welcome to the Get Cyber Resilient Podcast. I'm joined today by Dr. Cate Jerram. How's it going, Cate? Are you well today?
Dr Cate Jerram: [00:02:04] I'm doing great, thanks, Garrett. You?
Garrett O'Hara: [00:02:07] I am doing well too, yes. It's a long weekend here, so that always puts me in a good mood. I've got three days of doing nothing, although I suspect there may be a little bit of work along the way. But basically planning to, to relax this weekend. So I'm definitely uh, definitely looking forward to that.
Dr Cate Jerram: [00:02:23] Envy, envy.
Garrett O'Hara: [00:02:25] Yeah. So, Cate, can you just tell us a little bit about your, your background and sort of how you got to where you are today? I think you've got a, a pretty interesting bio and it would be great for the uh, the listeners to hear a little bit about how you got to where you are.
Dr Cate Jerram: [00:02:39] Okay. It depends how far back I go in history but academically, I started in education. Uh, because I'd worked in adult education... oh, a long time ago, the government sort of insisted that if you're going to work in adult education you should be qualified. So, like many people in adult education, I went back to do a degree. From then, I, I just loved uni so much that from adult education I went on to do a broader education honors and then, uh, launched beautifully into this wonderful education PhD and I got seduced to the dark side. Uh, my husband was a computer scientist and his professor, uh, asked him to ask me to start working with her because she wanted someone with my education and adult education understanding to help make information systems more, um, appealing and intelligible to the students who weren't int... Information Systems is a really challenging subject and IS had a huge fail rate and uh, so I turned up and said, basically, you know, "I don't think you want me. I've never studied IS in my life. That's my husband's field." And she said, "Yes, you're a teacher, I need you."
And so, that, as a teacher I had to be on such a steep learning curve that in the end I dumped my education PhD to do an Information Systems PhD. And thus I found myself an Information Systems expert without ever having done the Information Systems basics. It was really not the easy way to go about it. But from Information Systems, round about 2008, cyber security started being a really pressing need and so I realized very soon that one of the biggest problems, and this is so true of IS as well, so it was a natural realization for me, is that um, it's assumed it's the domain of the technicians and the technical people and the computer scientists. And so, um, but everyone- business people sort of complained and blamed everything on the IT guys because, you know, "It's programmed wrong and I can't use it, and oh, oh, oh."
But meanwhile the IT guys are going, "Well, what do they want? We asked them what they want. They told us this, we gave them that." And it basically... business people don't speak IT and IT people don't speak business. And that's really the domain of Information Systems, is learning both languages and interpreting.
And I could see the same thing happening in cyber security. And to me, cyber security is just too important to have the people who need it unable to understand the people who provide it and vice versa. So, that was how I got into cyber security. It was from information systems and the understanding that this has to be interdisciplinary. It can't just be dumped in IT and left there.
Garrett O'Hara: [00:05:26] Absolutely. And, and it's funny because that- those human aspects of cyber security and you know, thinking about that purely within the industry, it's quite often, you know, thinking about you know, like Bob's clicking on links and that's a bad thing. But obviously for you and then based on what you've just said that goes much deeper.
Dr Cate Jerram: [00:05:45] Oh much. Much, much, much deeper. Um, so essentially when I first started in 2008, and I was trying to- because I knew the Business School. And I was trying to say, "We need to be looking at cyber security from a business point of view." But it was all, "Go away, that belongs in IT. Go away, computing handles that." And then suddenly it was, "Cate, why aren't we teaching this?" And because I'm IS, I've been inter-faculty, interdisciplinary all of my academic career. And uh again, universities aren't structured for that, they're structured for siloism. In almost exactly the same way most businesses are, you know, if you're marketing you're not accounting. And if you're accounting, you're not IT. Um, and cyber security's suffered from that because cyber security is just shoved under the IT umbrella usually. Whereas, essentially, it belongs in um, executive, strategic worlds, leading down from there into risk management, and from there into operations. And that's totally overlooked because it's IT, and IT have to come cap in hand to beg for a budget to do what they're expected to do.
So um, you know, so from 2008 the business people were telling me it was none of our business and essentially a lot of computing was saying the same kind of thing, it was sort of like, "No, no, cyber security is about the computing. We'll put hardware and software measures in and, you know, we'll make people do what we need them to do." And of course, year by year it was shown very clearly that you can't make people. Because you try and make the hardware and software so secure that people can't screw it up, and the people just- they can be very creative and inventive about finding workarounds and being determined to make it work for them despite IT. So, yes, the human organizational factors are critical from both the business point of view and the computing, programming, IT point of view. And, fortunately, recently, the cyber security domain has started to realize that and it's starting to spill over into the IT domain.
Uh, where we're very slow is the business domain picking up on this. I mean there's still a terrible dearth of strategic leadership in cyber security. There's, um, the finance sector's CPS 234, the B-E-A-R, BEAR. They're out there sort of saying, "Okay, CEOs and Boards are responsible for cyber security breaches." Uh, but the CEOs and Boards are still saying, "Yeah, yeah, but it's IT's problem." [laughs]
Garrett O'Hara: [00:08:18] Mm-hmm [affirmative].
Dr Cate Jerram: [00:08:19] So, uh, uh, boosting it out of down there in IT, below operations, up to strategic [inaudible 00:08:27] risk management is still a huge ask. So, computing and IT are far more on board with human organizational factors than the business and organizational world is yet.
Garrett O'Hara: [00:08:41] So why do you think there is that delay in the business picking this stuff up? Given that- I can see at this point there's clear links between the importance of that kind of human aspect of cyber security...cyber security in general, I would say, um, and to your point, the risk mitigation that, again, directly leads to kind of business outcomes. Like, it's a good thing to do and they're on the hook a little bit more, I would say, than they were before. You know, like with CPS 234 and all of those other things. What's the, what's the delay, like, why is there this lag?
Dr Cate Jerram: [00:09:12] I think there are multi-factors. Multiple reasons and delays. And a lot depends on what sector, and particularly what size of business. Tragically, I think COVID has actually helped a little. But, for the most part, when I'm approaching really small and the lower sized- actually even bigger, medium-sized businesses, too many of them go, "Well, you know, I'm too small. No-one's going to be interested in me. I don't need cyber security. You know, that's a bank kind of problem," which is so untrue. Uh, the, the attacks on even the tiniest little home based businesses are accelerating, and for most SMBs or SMEs, that's small and medium businesses and medium enterprises, uh, a breach is bankrupting the business within weeks to months. So, um, there's that ignorance that this applies to me. In the larger businesses, a lot of it's still that embedded- it's not a nice word, possibly not the right word, but arrogance that you know-
Garrett O'Hara: [00:10:17] Yes.
Dr Cate Jerram: [00:10:17] You know, I'm the boss, that's not my worry, it's a stupid IT thing, let the IT people worry about that. That seems to be very endemic. Uh, a slightly better but not much more helpful view is the, "Well, I realize it's a concern" and you know, "I- we've got a good IT department, but the thing is I don't know anything about it and I don't have time to learn." So there's all those sorts of attitudes, where actually I'm embarking on a research project to really investigate not what I think those barriers are, but what the barriers really are, what peop- what people themselves think are the real reasons they're not investing. A lot of misperception definitely, we know, is in the "It's too hard. It's too expensive." A lot of it is simply too technical, so if you don't have a CIO or a really good quality IT department, let alone CySEC. You know, you can get a copy of the ISO 27001, and 002 and 005 and um, you just start reading. It's like, "Yeah, can't do it." You know, "We'll just hope nobody attacks us." You know, "I've got a firewall, that'll do."
So, a huge range. So partly, it's the cyber security IT industry's responsibility to be more communicable. Partly, it's because of all the big frameworks and standards, well with a couple of exceptions, but like in Australia the gold standard is ASD's Essential Eight. Well, Essential Eight, your average business person just sort of reads it- it's eight things and they maybe know what one of them is, you know, so immediately too hard, too techy, can't do it. Or, "Hmm, I'll throw it at IT, they probably can understand this." And then IT will look at it and also not be able to describe parameters, but also so many of those things, it's like, "How much would a piece of software cost me?" I'm sorry, you can get it- get software free, downloading from open source. Or you can pay $100 or $500 or $5,000 or $5 million, you know, how long is a piece of string?
And a lot of cyber security measures, that's all there, so, you want me to buy a firewall and an anti-malware program and an anti-virus program and- uh, I- I'm sorry, that- that's my entire year's marketing budget. So, so many factors, and a lot of them in that too hard because of money, too hard because of tech, all of those. Or the "doesn't apply to me."
Garrett O'Hara: [00:13:00] Yeah, and, and yeah, I definitely get that side of things. Do you think there's... my... is there more of an appreciation for probably, more sort of the human side of cyber security? You know, things like training and awareness and I know you've been involved in research around that. Do you think that's kind of picking up?
Dr Cate Jerram: [00:13:18] Yes and no. Um, certainly, as I said, COVID has helped. People are becoming more aware, um, and the whole remote working, there's suddenly where we're... before they were able to brush it off, you know, "we've got a good IT department," so it... "Agh... people in their homes... Agh." So they're thinking more intelligently about remote working cyber security. Um, but... it, it's sort of like, you know, quite often when you teach something you have to teach it in steps. First, you have to get some- you start sometimes teaching with some absolutes so that, once people have mastered those absolutes, they may realize, "Well, it's not really absolute. The flex is here." And so, one of the first big absolutes that had to be got through to the industry was that humans are involved, and the message was Bruce Schneier's oft-quoted statement "humans are the weakest link." Or um, PEBKAC, Problem Exists Between um, Computer and Chair. Um, Keyboard and Chair, it is.
So um... now that's not an absolute, so um, but you have to get there. Until you've got that you're not going to get the basics and move on. But once you've got that you have to understand that, well actually, rather than treating your humans as the weakest link, which to too many IT people means batten down the controls and make them good and mad so they'll work around it. Um, so you're actually better saying humans can be your weakest link but let's make them your strongest defense. So let's build a human firewall. Okay, so that's a much, much better, much safer approach. But you need to understand that humans are involved before you can get there. Uh, or that humans... humans, essentially override any technical or computing controls.
Um, so then... so first understanding humans are involved, culture, behavior and so forth, then deliberately building it. But because of that "humans are the weakest link" sort of message that was driven home for years just to get the industry to open up to human factors, as a result what we have is eons and eons, and tons and tons of awareness training programs. And some of them are good, some of them are remembered for the whole week and then not, um, but the thing is, as I said, if people are aware and have written it off, no awareness training program is going to help. The training isn't addressing the reasons why. So, yes, awareness training is the right thing when people just aren't aware that it's an issue and it affects them. So then yes, cyber awareness training, that's a pretty good start. Um, but when people are aware and are still not doing it, awareness training is just going to make them more and more resistant.
There, there's a few different theories and pedagogy that essentially amount to, uh, you shove something down people's throats too much and they will deliberately rebel and go against it. Um, so cyber awareness is great for what it's there for, absolutely inadequate and counterproductive when you need to go beyond that. And I don't see good training programs that are about really genuinely addressing the major issues and aiming for people's buy-in. But then on top of that, that's less about training sessions, it's more about deep seated education which starts with organizational culture and really needs to be, well, bottom-up but top-down. So trying to instill a cyber secure culture at ground level while everyone sees that the corporate, the C-suite [laughs] are doing what they want. Yeah, you know, you're never going to get that culture.
Garrett O'Hara: [00:17:17] Yeah, and you-
Dr Cate Jerram: [00:17:17] And too often training is just aimed at ground level, so that the executives who are the biggest targets, the whales, are not taught.
Garrett O'Hara: [00:17:25] Just keep doing-
Dr Cate Jerram: [00:17:26] So we have anti-phishing education, but not anti-whaling education.
Garrett O'Hara: [00:17:29] Yeah, I definitely get it. And I think that is one of the things that, I think the lights have switched on and people have started to understand is that um, you know, information doesn't change people's behavior, and you can know something but that doesn't really mean anything unless your behavior actually is affected and is different. And I definitely agree and take your point around the top-down culture. I had a very sort of funny thing happen in a meeting that I was in in Queensland a couple of years ago. And uh, I was talking to a security person and they were looking at essentially awareness training as a... as an option to help them, amongst other things, to fix some of the security issues, let's call them, um, in their organization. And this person described how uh, the owner of the business and their, their, sort of main person in that business had an iPad that didn't even have a... you know, a little key- um, a pin code on the screen, so completely open with their email flowing through that. If he- if that person lost it in a café or left it in a taxi somewhere, you know, that information is out there. So it's not even the... you know, don't click on the link, it's more just basic, you know, security of devices and those kind of things which, you know, sort of gets missed so often.
Um, so you've mentioned culture because I think that... that's probably the key, or the most important part of all of this, is that there, the idea of the importance of security and the behavior of security is part of that culture. Um, and we actually met- it's a few years ago now, at an AISA think tank where we were talking about cyber resilience. And um, it was a little bit of a challenging environment, if you remember it, because it was actually in the open floor. It wasn't in a room separate from the main conference, and there was lots of noise, it was, it was quite a... [laughs] quite an experience.
And we had three teams. It was a, uh, a team before a breach, a team during a breach, and a team after a breach. And both you and I were on the team after one. And I remember you made, to me, probably the most insightful point of that day, which was around how critical, like, the established culture becomes when there is a breach, because you need that trust for people to say, "Oops, I might have done this wrong thing and I need to be okay to say that without feeling like, you know, there's going to be repercussions." Um, do you mind kind of talking a little bit more about that? Obviously, this is a fairly important area.
Dr Cate Jerram: [00:19:46] This... this puts me in, um the bad basket for some people. Uh, there are sort of two schools of thinking and you'll- some are even named for it, like I know there's an organization called Zero Trust. And certainly, like the ADF... well, the military anyway, never mind cyber security, they operate in a zero trust environment. Um, need to know, that sort of stuff. Uh, and I think this is one of the original problems of that, is because our cyber security is led in Australia particularly by the military, I think to a degree in the US and other countries as well, um, military approaches that use military methods and assumption of military culture, don't translate to non-military cultures. So, there are not many organizations, SME big global, perhaps the really big ones maybe, but not many of them function well in a zero trust environment.
Um, military it's not necessarily unhealthy because it is... it's a culture you go into knowing what the culture is and it's, it's historical and there are very clear reasons for that need to know, zero trust environment. But a lot of um organizations, this is one of the things we've noticed. So, um, very small startups, they are often very vulnerable because they are in a 100% trust environment. They, they [laughs] some of these startups, you know, even if every member of the startup has computing qualifications oozing out their ears, will still say, "Oh, we'll get to cyber security when we're big enough, or when we get that far" or whatever, which is totally lethal because some of them are working on really high value projects. And of course the more valuable your data, the more of a target you are.
But, small startups of any kind, whether it's a local grocery store or, or um, a programming or gaming, or whatever, that initiative, that innovation, that entrepreneurship tends to be very, very high trust and they usually don't have policies or rules because you know, they talk about, they- everyone knows everyone, everyone trusts everyone. You know, it's almost like you can... you hand someone your wallet so they can take money out of it and duck down to the shop.
But once it starts getting bigger and you don't know everyone, and your organization doesn't have that clear culture, suddenly you have to go, "Ooh, I need a policy." And um, it's that changeover that's difficult. So, if you come into one of those really open trusting cultures and say, "Okay, here are these rules. You can do this, he can do that much but no more than that, and he, well, he's not allowed to do anything like this." That just does not work in, not just startups, but any culture, family businesses, cultures that have even a modicum of trust. Certainly not when there's high trust.
But the thing is you can, if you've got a really strong education program and everyone in that trust environment understands, then um, you are almost safer. I mean, I'm not saying don't have any controls and don't have any policies, but don't have zero trust. Have a lot more, well, the Essential Eight, one of them is that they're really tightening down on permissions. No-one has any functionality over the bare minimum they need to get the job done. Which too often gets garbled between business and IT: IT think functionality is needless so they cut it, business think they need more, and um, you know, they can't meet their KPIs because everything takes so long through the computing system. And therefore, of course, they violate security. So um, it's a trade off. You have to really, really balance, but essentially just taking military culture and trying to transplant it to an organization and hoping that people will buy in and have great social interactivity and trust everybody in a zero trust environment, that... that is just... I won't say insane, let's just say unsound thinking.
Um, so if you're in a military-like environment, zero trust... well, I'm going to say it makes tons of sense. Though I think in the military there are places where trust has to rule anyway.
Garrett O'Hara: [00:24:11] Yes.
Dr Cate Jerram: [00:24:12] Um, especially if you're trusting people with your lives. But certainly in other types of business zero trust is counterproductive. So that means you have to find this really fine balance trade off. Universities are a prime example of this. We, we have such a distributed environment. There's so many different core stakeholders and there's what the academics can get hold of, what the students can get hold of, different levels of students, what the admin people can get hold of, what research partners can get... you know, and yet we've got some massively valuable IP and other data to be protected. So, any CySEC from a university is usually pretty savvy about that because they have to have an incredible balance of trade off between where you trust and where people have freedom of access and permission and elevated rights, and where you don't trust, you just have a rigid "No. No-one can do this other than him and her." You know. So I'm not saying it's easy, I'm just saying it's necessary to think through rather than just impose a zero trust attitude.
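The "bare minimum to get the job done" idea Cate keeps returning to can be sketched in a few lines of Python. This is only an illustration of the least-privilege principle; the role names and permission names below are invented for the example, not drawn from the Essential Eight or any real system:

```python
# Hypothetical sketch of least-privilege access checks: each role is
# granted only the minimum set of permissions needed for its job, and
# anything not explicitly granted is denied by default.
ROLE_PERMISSIONS = {
    "accounts_clerk": {"read_invoices", "create_invoices"},
    "it_admin": {"read_invoices", "manage_users", "install_software"},
    "intern": {"read_invoices"},
}

def is_allowed(role: str, action: str) -> bool:
    """Deny by default: unknown roles and ungranted actions return False."""
    return action in ROLE_PERMISSIONS.get(role, set())

print(is_allowed("intern", "install_software"))    # False: never granted
print(is_allowed("it_admin", "install_software"))  # True: explicitly granted
```

The balancing act Cate describes is really about how wide each of those permission sets should be: too wide and the control is meaningless, too narrow and people can't meet their KPIs, so they find workarounds and violate security anyway.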
Garrett O'Hara: [00:25:18] Yeah, totally agree. And circling back on something you said around, probably particularly startups, the, the pressure to grow. I wonder, is that a huge issue there as well? Um, where you see this, you know, this huge, huge pressure to grow as a startup. Um, and to your points, it's kind of like "yeah, we'll get to security later because right now we need to get the next round of VC funding" and, you know, there's always a "we'll get to it later". And then by the time you actually get to it the inertia's there and you're kind of untangling code or processes that are... it's almost too late, you know, you have to approach with security by design.
Dr Cate Jerram: [00:25:58] Yeah, and that's actually... there are two answers to that and I'll backtrack, go off kilter for a moment, and say this is something the computing industry, programming industry is just waking up to. The fact that you don't tack security onto a program after you've programmed it. It's time to start programming security into every program. And they're not really there yet, but at least they're waking up to the need. So the rest, yes, any organization, it's, it's, if you start cyber secure and your cyber security builds with you, you are never going to be as viciously vulnerable as you are when you're under that huge pressure.
And I think we're getting to an environment now where banks and insurance- because at first insurance companies wouldn't even insure against cyber risk. It was like too amorphous, too, I don't know, "gosh, how do we manage that?" Now they're doing it, but it puts the insurance companies at huge risk, and um, banks of course, they have so many loans out to all sizes of SMBs and larger businesses. And those loans can get defaulted on just overnight, you know, a business- as I said earlier, at least I think I said earlier, SMBs are bankrupt within a couple of weeks of a breach if not overnight. And there goes all the bank's loan, you know, and no recoup. So as the people who control the money are waking up to the need, I think we're going to start slowly seeing some roll out of, you know, "Well, if you want a loan from us, you'd better get cyber secure." And I think, on the whole, venture capitalists seem to be even more on the ball and a little more um, ahead of things than the more conservative institutions like banks and insurance companies. So, I- I'm hoping, actually, that VCs are going to start saying, "We'll make a very hefty investment in this project when you bring your business plan back with cyber security embedded."
Garrett O'Hara: [00:27:55] Yeah.
Dr Cate Jerram: [00:27:55] But for now, yes, that pressure on startups, it's- I disagree about the pressure to grow. I'd say pressure to survive. Look at your standard statistics: most new businesses don't make their first year. Those who do are bankrupt by five. Those who make that are bankrupt by twenty, I think, or something. But um-
Garrett O'Hara: [00:28:19] Yes.
Dr Cate Jerram: [00:28:19] Yes, so it's that pressure to survive and considering cyber security as an added extra instead of an embedded core. And once people start really understanding that cyber secure- that the bad guy- it's not like... no-one would think of starting a brick and mortar business and leaving the door wide open when they went home at night. You know, they, they, they wouldn't be very happy with just turning a key and thinking that was enough. Most people in that first startup, they're still going to find ways to put steel bars on the night shutters or um, and have a safe or make sure the banking is done before they go home and there's no cash on the premises. Uh, and security cameras or a security, um on call team, security alarms, um and they don't think of that "Oh well, I'll- we'll get security after we've got a decent turnover." They just don't even put stock in the empty building until they've got security.
And people... part of it is people still don't realize how valuable their data is. That, you know, uh, you steal uh, an email list, it has more value than if you stole a truckload of electronics. So, when people start thinking of embedding cyber security in everything we do in the same way they'd invest in locks and alarms and guard dogs, um, then we're going to go far with security issues.
Garrett O'Hara: [00:29:48] And do you think it's a human thing where you can point at the stock on a shelf, for example, it's physical but you can't really point to electrons. So, I mean- well you can but you know they're so small, right? We're not going to see them, so... like is it?
Dr Cate Jerram: [00:30:02] You are so right. I think that's one of the reasons CIOs often find, when they're trying to convince the C-suite that they need a budget to do things, it's sort of like, "What do you fricking need this money for? I can't see where it's going. I can't see the outcome." Uh, usually cyber security, IT, they're viewed as cost centers, not productivity centers. Uh, and yet, yeah, so a lot of it is just not seeing the value of data. I mean like-
Garrett O'Hara: [00:30:28] Mm-hmm [affirmative].
Dr Cate Jerram: [00:30:30] When I first teach my, uh students the foundation courses, quite often um, one of the first things they're saying is, "oh my gosh, really? I'm that vulnerable? I had no idea." And these, these are intelligent people who've grown up in the electronic word... world we've created. World 2.0 pretty much.
Garrett O'Hara: [00:30:53] Yeah, and then you're right, I think we're... we're almost trained these days by some of the larger platforms out there to give away data and to be open with things and to fill out quizzes on social media, not realizing that uh, you know, when you fill out your quiz that's a, you know, air quotes, psychological test and they ask you about um, you know, what's your pet's name? And you know, because that tells you everything you need to know about, you know, who you are as a person, but actually that's so often one of the test questions when you've forgotten your password, you know. Um, so it does feel like we're... we're almost primed to give that stuff away.
And, and maybe slightly related, you were actually one of the team that was responsible for uh, developing what was called the Human Aspects of Information Security Questionnaire. Can you talk us through what that is?
Dr Cate Jerram: [00:31:39] Well, uh, 2008, pretty much when I started cyber security uh, study into the discipline of cyber security, I partnered up with um, a team from DSTG, in those days it was DSTO, uh and they were the, they were human aspects, behavioralists. And um, that, that's kind of what got me into cyber security. So we spent several years on it. We investigated phishing as well, but the main thing is, over a number of years, we called ourselves HACS, Human Aspects of Cyber Security. We were the HACS research team, you might remember the acronym. Uh, and we, because most of that team are psychologists, it was very oriented to uh, a huge study that governments and some large organizations have deployed with their employees that is designed to analyze each participant's knowledge about cyber security, their attitude to what they know in cyber security, and the behavior that follows their knowledge or attitude.
It's very well designed, because that's what that team are expert in, and I was able to balance it out with the human and organizational aspects, which they didn't know so well. So yes, it's been published. I pulled out of the team a couple of years before the final couple of iterations, which they polished and finalized even beyond where we'd got it before. So it's a very sophisticated instrument that really investigates quite deeply. It was exciting working with the team, and I could still be working with them now, but I pulled out mainly because of my time commitments, and because all the other areas of cyber security I want to research are areas they can't, because as a defense organization they have very limited parameters. So I left a colleague, Dr. Malcolm Pattinson, who took over my role as the University of Adelaide director of HACS, and I pulled out to start looking at other aspects, for instance strategic leadership, topics like that.
Garrett O'Hara: [00:33:54] Awesome. And what are your pet projects, or what are you researching at the moment that maybe we could talk about? If that's okay.
Dr Cate Jerram: [00:34:04] Okay, so I've actually kind of talked about the two main focus areas that I'm trying to develop in research right now. One is I'm working with a PhD student. I was so thrilled when she approached me. She's from the finance sector and she wanted to research specifically that absence of strategic leadership in cyber security, but particular to the finance sector. So I've been supporting her in that. I'm interested more broadly than just the finance sector, but she's diving so deep into that, and of course there's crossover.
It's interesting, it's also a topic I set both my undergraduate and postgraduate students as a research topic. Most of them avoid it and choose one of the other topics, because the reality is they nearly have heart failure when they look at it. The standard university requirement is that at least 50% of the references should be academic, and there are none. All of the references about strategic leadership in cyber security are at best government; the rest are vlogs, blogs, opinion pieces, business publications. Academia hasn't gone there yet, so there's no really substantial research. One or two pieces, but not enough to do half an essay's worth of references. So that is a high passion area for me, and I'm still mainly reading the links my PhD student has sent. She's the one doing the real nitty-gritty work, so she'll be the expert in that, eventually. But yeah, I want to go further than just the finance sector.
And the other project, currently a team, is with two of my colleagues. I was really excited to rope them in, because again I went multidisciplinary. Neither of them comes from cyber security, but they bring all sorts of strengths, knowledge, skills, and expertise that I don't have. Together we are looking at the SMB problem. We want to put together a three-year research program that really investigates what businesses themselves say: what types of security they're doing, why they do it, what the drivers were, how they afford it, why they're not doing what they know about, and why they don't know about what they don't know about.
And essentially all the barriers that stop them being cyber secure. We're especially looking into banks through partners in this, because banks are going crazy: they ask their clients to take cyber security measures, and no matter what they say, businesses don't implement them. So you can't say the businesses aren't aware. When they know it's so important that their bank wants them to do it, there have to be other barriers. So I want to research that. First, investigating why they say they're not doing it, and then really analyzing how that translates to what the real barriers are. Because quite often you think, "Oh, I'm not doing that because, you know, whatever," but if you just try to fix that stated problem, you're not going to fix it. Once you realize the problems that lie underneath it, then you can move forward. So that's our goal: first really finding out what they think their barriers are, analyzing what that means the real underlying barriers are, and then addressing that.
Garrett O'Hara: [00:37:31] That sounds fascinating, and I'm guessing you've got your plate full. I'm sure it won't be easy to figure that stuff out.
Dr Cate Jerram: [00:37:42] Well, at the moment we've got a number of cyber security programs going. And of course I'm in the business school, so I don't have any colleagues there in cyber security; in computer science, of course, we have quite a few colleagues who specialize in it. But at the moment I'm still having to design and run a lot of the courses by myself, while eagerly trying to build up other people with expertise, because the demand for what we can offer in cyber security is growing and growing. It's suddenly this burgeoning program where every time I turn around, more and more courses are wanted. So it's exciting to see that the world is waking up to the new [inaudible 00:38:26]. Especially as, and people just don't realize this, we've got half the world totally worried about not being employable any more, and yet, right around the world, there are several million vacancies in cyber security, not hundreds. And not all of them have to be computer science or programming. A lot of organizations are just looking for people with ordinary, everyday business skills who also understand, know, and can implement cyber security. So people are starting to wake up, and it means that, never mind the research, just the teaching is keeping me quite busy.
Garrett O'Hara: [00:39:05] Good to hear. Well, we've absolutely blown over the time, which is always a good sign, so what I would like to do now is just thank you for taking the time to chat with us. A fascinating conversation, and from my perspective it's really nice to hear from somebody who's working on the academic side and doing that kind of deep research into what this all means. So thank you so much for taking the time, given how busy you are with everything you're currently working on. It's very much appreciated.
Dr Cate Jerram: [00:39:36] Thanks, Garrett, it's been interesting talking to you again, and being able to hear the questions and what is interesting to others, especially others in the industry. That's of great value to me too. And it's lovely to have a chance to get those opinions that I hold so passionately out into the internet world, where hopefully other people will hear, pick up, and follow on. So thank you.
Garrett O'Hara: [00:40:07] What a wonderfully rich conversation. It is so easy to learn from someone who is literally an educator, so hopefully Cate's work continues to push us all in the right direction. Thanks again to Dr. Cate Jerram for her time and wisdom and thank you for listening to the Get Cyber Resilient Podcast. I'll look forward to catching you on the next episode.