
Digital Guardian Podcast Episode 10: Building a Security-Driven Business Culture with Dr. Jessica Barker

by Nate Lord on Monday February 5, 2018


Dr. Jessica Barker joins our hosts for an episode focused on this often-ignored yet critically important subject from the human side of cybersecurity.

Welcome to our tenth episode! Renowned consultant, speaker, and head of publication Dr. Jessica Barker joins hosts Will Gragido, Thomas Fischer, and Tim Bandos for a discussion on how to foster a security-focused business. Tune in to get tips for driving security awareness, building secure habits, and establishing a security-driven organizational culture. As always, you can listen and subscribe to our podcast via SoundCloud or iTunes to keep up with new episodes every month.

Highlights from this episode include:

  • 2:10 - Findings from Dr. Barker's research into end user behavior and risk and how security teams can shape user habits
  • 7:00 - Tips for communicating to end users and going beyond traditional security awareness training methods to change risky behavior
  • 13:45 - How to measure the effectiveness of security awareness efforts
  • 18:00 - Why many security awareness training programs fail and how to set yours up for success
  • 34:00 - The cybersecurity industry's role in promoting security awareness amongst the general public

Intro/outro music: "Groovy Baby" by Jason Shaw, licensed under CC BY 3.0 US


[0:00:08.8] WG: Welcome back to the Digital Guardian Podcast. This is episode number 10, and with us today is Dr. Jessica Barker, award-winning sociologist and cyber-psychologist based out of the UK. Joining me today are Thomas Fischer and Tim Bandos, and my name is Will Gragido and I’ll be one of your hosts. Thanks, everybody, for joining.

[0:00:28.7] TB: Thank you, Will.

[0:00:29.9] TF: Thanks, Will.

[0:00:30.7] JB: Hi, thanks.

[0:00:32.9] WG: Dr. Barker, I know you prefer Jess, and that’s fine by me. Jess, why don’t you tell us a little bit about the work you do. I’ve read up on your bio — we’ve never met in person, though Thomas knows you, and so do some of the other folks. You do some work with Digital Guardian with respect to social media and blogging and things of that nature. Why don’t you tell us what you’ve been working on for the last few years.

[0:00:52.3] JB: Sure. My background is in sociology and town planning, but I started working in cybersecurity about eight years ago when I was finishing my PhD. I was headhunted by a cybersecurity consultancy. I started out doing consultancy work in the defense space, mainly doing information security assessments of really large organizations and then also doing some awareness-raising training. Then about four and a half years ago I set up my own business, and during that time I've been working with a real variety of organizations on what is generally called awareness-raising training, but I’d like to think of it more as behavioral and cultural change training.

Talking to people about cybersecurity, what it means, what it means to them, and really focusing on how we can get the message through to people for whom security is not their day job. Really, trying to change some behaviors in organizations. I do work around communications, around policy, around understanding where the vulnerabilities lie in the human sense.

Then about six months ago I cofounded a new company called Redacted Firm where we’re really looking at how we can bring the human and the technical and the physical sides of cybersecurity together in a more meaningful way.

[0:02:10.5] WG: Excellent. It sounds like really interesting work. You spend a lot of time obviously then studying and collecting data with respect to the behavior of authorized users and insiders. Is that correct?

[0:02:22.4] JB: Yup, that's right.

[0:02:23.4] WG: Okay, excellent. What can you tell us about what you’ve seen over the years when you’re collecting that information related to insiders — what leads to either potentially negligent behavior or accidental situations that allow or encourage compromise, or those behaviors that lead to intentional compromise and exfiltration of data, for example?

[0:02:45.5] JB: Sure. In terms of the accidental insider, the non-malicious insider, I think it’s well known that that is far more prevalent than the malicious insider, someone who intends to cause harm and intentionally, for example, steals information. Of course, when you do have a malicious insider, they are a much more costly, much more damaging problem.

In terms of the accidental, non-malicious insider, I think the most interesting thing, the most important thing to understand, is how big the gap is between what we in the industry understand and talk about — the language we speak in, what we do and what we say — and the so-called average user.

For most people, for whom security isn’t their day job, we talk in this language that they don't get. We work in a technological way that is more advanced than they might be, and so we have this big gap where we’ll be telling them things like, “Oh, you need to use a password manager. You need to use two-factor authentication,” and they don't even really know what those terms mean. We’ll think we’re giving advice that is just common sense and straightforward, and we’re wondering why people don't follow it when in fact they don't even really know what we’re talking about. That's one of the biggest findings I get whenever I do pieces of research.

For example, I surveyed the UK public a couple of years ago — a thousand people in the general public — about two-factor authentication, basically just asking if they understand it and if they use it. I found that 70% of people said they didn't know what it was at all and 80% of people weren’t using it. I think that's an example of something where we think there’s this simple thing that is quite effective and people aren’t doing it. People in the industry will get really frustrated, but in fact we’re using terms that they don't even relate to, that don't make sense to them. When you're thinking about trying to change behaviors in an organization, it’s really important to think, “Am I speaking in a language that people get? Am I speaking in a language that people understand? Are they going to hear what I'm saying, or am I just talking to my peers?”

If you want to change behaviors, if you want to change culture, you really need to shape your messages so that they’re at the right level and that they are appropriate and interesting and translatable for people who don't work in security.

[0:05:08.45] TF: Would you say that we, as an industry, or the professionals in our industry, have a problem talking to people up and down, so to speak, or across?

[0:05:18.4] JB: It’s something I hear a lot. People come to me with that as an issue a lot. Certainly, how can I speak to the board? How can I influence senior-level? How can I get them to understand the problem? Then, also, people will be trying to change behaviors, change culture, maybe doing some awareness raising training and they'll be saying, “Why isn’t this getting through? Why isn’t anybody listening to it? Why are people finding this boring? Why are they not engaging with it?”

Usually it does come down to the same issue, which is that the communications haven’t been shaped for that audience. When you’re speaking to the board, it's all about speaking at the business level and how you can translate cybersecurity issues and risks into business risks that they understand. At a conference recently, somebody described their senior executives as stupid, and you think, how can you believe these people are stupid? They’re running a huge business with a lot of money, and they are obviously pretty successful at that and have been pretty successful in their careers. Just because they don't get what you're saying about security, it doesn't mean they're stupid; it just means you haven’t found the right way to translate it for them.

I think we haven't focused on that as an industry. We focus so much more on technology, and we've banged our heads against the wall when it comes to people who aren’t in security, thinking, why don’t they get it, without actually looking at why they don't get it and realizing that, actually, the problem isn’t them. Really, it’s us. It’s an issue we have in the industry, where we need to think about the human factors in a much broader sense.

[0:06:56.8] TF: If we leave out the aspects of the board and go back down to the end user, or even to the general public — because it affects the general public when we’re talking to them — what would be the best way to address this? I want to use a different term, but I can’t think of it right now. We talk about dumbing down the message, or making it simpler, or making it more relevant to people. I usually use the example of terrorism, where in the French Metro and in the London Tube they’re continuously making an announcement, “If you see a suspicious package, report it,” etc. You’re training people in a kind of physical security awareness. I don’t think we need to go to that extreme. What do you think we need to do to get a better message across? In the UK, we had that Home Office program where they were doing those commercials trying to teach people what cybersecurity was. Do you think that’s the solution, or do you think we need something else?

[0:07:58.0] JB: I think the campaign you’re talking about was called Cyber Aware, I believe.

[0:08:02.3] TF: Yeah, that is the one. Yeah, I think.

[0:08:04.4] JB: Yeah. They spent a lot of money on that, but from my point of view, I don't think it made much of an impact in a good sense, and I personally didn't think it was that effective. A lot of organizations will come to me and say, “We know we’ve got an issue around the human side. We want people in our organization to take security more seriously. We want a better culture. We don't know what to do.”

I will work with them to look at what they’ve been doing already and the problems that they have, and the issue I generally find is that people look at awareness-raising training as being the answer. They think, “Okay, we’ve got these problems, so we’ll do some awareness-raising training,” and so they might do a half-day training program once a year, or even worse they’ll just do the online training programs that people click through, and they know it's not going to change behavior, but then they’re frustrated when it doesn't. I think we need to look at the communications that we’re giving to people and think about whether they’re actually going to be effective.

I have an issue, as I said at the start: I don’t really like to call it awareness-raising training, because awareness is actually quite high already. That’s one thing I've found when I've done research both in organizations and with the general public. People are really quite aware of cybersecurity, and with some of the big attacks we’ve seen, like WannaCry, it’s all over the news. It's the top story, for example, on the BBC website. It’s the headline on all of the mainstream TV news programs. People are talking about cybersecurity like never before. What we don't generally succeed in is getting to the next stage, where we change behaviors.

I think, for me, where I see success is in breaking down the problem. As I say, I do what I call behavioral change, cultural change training. For that, it's all about explaining the how and the why: why cybersecurity matters. That has to be really relevant for the people in the room. It’s relevant to the organization, not just the threat in general. It’s not about scaring people with APTs when that isn't relevant to them. It's about looking at what is actually relevant for the organization, and even deeper than that, for the actual people in the room — talking to them about personal security as well. Then, also, demonstrating the how. I think cybersecurity can feel like a very intangible thing; people can click on a link in a phishing email and not see anything happen unless it’s ransomware. They won't connect the actual threat vectors in the attacks with what happens. I like to do lots of demos — for example, spearphishing demos — showing people, “This is what an attacker can do. This is how it actually works.” Taking it from being very theoretical to much more practical learning.

I think in that regard it's also about looking at the different ways we communicate with people — not just throwing information at them in a classroom setting where one person talks and everybody listens, but asking how you can be engaging. How can you get them to learn visually, learn by problem-solving, learn by talking, and take initiative for their own learning as well? All these different methods are ways of actually embedding information with people, so you're not just talking at them but using different parts of their brain, and getting them actually engaged and being interactive has a much bigger effect.

[0:11:41.8] TF: In that interactive kind of session, what do you find is most effective?

[0:11:46.0] JB: Partly, it depends on the people in the room, but I think demos are really effective. That's where we have found we’ve had the biggest impact in organizations. We worked with a bank not long ago, a very big bank, doing some social engineering awareness-raising work. They were really worried about everybody’s use of social media — how much information they were sharing — and they were getting a lot of spearphishing emails.

We did some OSINT on the people who were coming into the room for the training — just a few days of research into them — and then presented that all back to the people in the room: “This is the information we’ve found out about you. We’ve found out a lot of stuff.” We even got to the point of finding blueprints for somebody's house, things that people would not have expected to be publicly available. Then we showed them, “Okay, this is what we found out about you and this is how an attacker could use it. Here’s an example of a spearphishing email using the information we gathered, and if you click the link in that email, this is what then happens,” showing them both the attacker and the victim side. Obviously, nothing changes on the victim side as far as they can see, but from the attacker’s side, you see everything that can be achieved. That seemed to really change perceptions.

We also did a password cracking demo. Again, people hear all the time that they should have unique, complicated passwords, but that just seems like an annoyance to them. They don't understand why, because they’ve never had that experience. When you show password cracking, they see, “Okay, this is actually real,” and they see it’s very, very easy to break a simple password. After that training session, 60% of the group went out and requested information on the internal password manager. The bank had been trying to push the password manager for months beforehand and hadn’t had much uptake, but suddenly, when people got why they needed it, there was much more drive to have it.

[0:13:45.7] TB: That makes sense. Jess, let me ask you: after you've gone through your program with an enterprise, what is the best method to measure its success? How do you go about that after the fact with a company?

[0:13:57.3] JB: Yeah, it’s a really good question. Metrics, I think, are a big issue for everybody, and sometimes the human side of cybersecurity is dismissed because people say, “It’s hard to measure,” which of course isn't true at all. What I do is work with the organization before we do the training: “What are the changes that you want to see in place?” For example, is it around uptake of the password manager? Is it around using two-factor authentication? Is it around inquiries going into the infosec team? What are the things that you want to see change, and then how are we going to measure those? I think it's really important, when an organization is doing training, to think about why they are doing it. This can’t just be about ticking a box. If you want to change behaviors, what are the behaviors that you want to change? Okay, let's work out how we’re going to measure them.

It’s very particular to each organization. Some do like to run phishing exercises — sending out fake phishing emails and seeing who clicks on them, or how many people click on them. That's really popular, I think, in a lot of organizations. If I’m working with an organization that really wants to do that, then I think the key thing to understand is that it should be part of a wider culture. You can’t just test people. There has to be a reason for testing them, and it has to be part of quite a positive culture, because people don’t really like being tested. A lot of organizations like those phishing exercises because they provide a metric, and it can be nice to wrap them around training: do a phishing exercise before a training, awareness-raising, behavioral change campaign, then do another exercise afterwards and ask, “Okay, how has that changed?” Then you may want to keep those exercises going, and when you start to see a spike in the numbers, think about more training. I think the key thing, if you're doing any of that kind of testing, is not to punish people if they click on the link or whatever it might be. It shouldn’t really be about punishing people. It should be about understanding where you need more training and more support for them.

[0:16:04.0] TB: Yeah, I think that makes sense. I had another question for you. When you go into these organizations, do you find that people know what they need changed, or do you have to go in and really educate them on what you believe needs to be changed as a baseline? How often do you have those conversations in the beginning?

[0:16:20.6] JB: Yeah, it's more of the latter, to be honest. I generally find that people come to me knowing they want something done around the human side, and sometimes that will be as much as they know. They will just say, “You seem to have the skills we need. We want to do something with our user base. Can you help us?” That’s a really brilliant position to be in, to be honest, because then I can go in, get to understand the organization a bit, get to understand where the issues are, and then look at, “Okay, how are we going to address those?”

Other places have been more refined in what they want. They’ve had more understanding of where their issues are. They may be particularly concerned about, for example, spearphishing emails. But usually people feel like, “We need a bit of a cultural shift, but we don't quite know what we want or how to do it.” Then it's always a really interesting exercise to help the infosec team or the IT team really think about what it is they are looking for.

[0:17:18.7] WG: Jess, can you talk for a moment about what you see with respect to post-testing exercises? In my experience, security awareness and education programs have had, at least in the world that I came from, very limited success. What do you attribute those limitations to over time, especially in an age where we’re so often discussing breaches and the consequences of breaches on a global scale? Last year, the numbers in the DBIR came in en masse with respect to the number of breaches reported. A slightly lower percentage of breaches was attributed to insider activity — it was about 25% — and they don’t necessarily qualify that insider activity in the granular terminology we’re talking about today. What’s your perspective on why these programs fail, or what needs to occur to mitigate the risk associated with failure in pre-existing education and awareness programs?

[0:18:13.8] JB: Yeah, it’s a great question. People complain a lot — organizations often have issues where their training or education programs haven’t succeeded. I think one issue is seeing your awareness program as just a one-off activity. If you think it's just a once-a-year, half-day classroom session, or just a rollout of online training, doing that alone isn't going to change behaviors or change culture. If an organization does that, it's normally because they just want to tick a box and say, “Yeah, we've done our annual training,” which is fine if you just want to tick the box. If you want to actually have an impact, then it's about being much more inventive with your training, and it's also about seeing it as an ongoing process. It's not just a one-off thing that you can do once a year. It’s something you need to constantly think about, because you can raise awareness and start to change behaviors, but people will slip back into their old patterns unless you keep that activity up.

It's about thinking about how you can keep getting your message out and how you can keep having a conversation with people. This might be, for example, having cybersecurity ambassadors or champions in your organization — a network of people who are out there spreading messages about cybersecurity at their team meetings, raising the latest news and attacks in conversation with their colleagues, and also feeding back questions and issues that colleagues might have. Thinking about it in that way: how can I keep a dialogue open with the people in the organization who don't have security as their day job? How can we hear what their issues are, and how can we help them constantly remain aware of cybersecurity?

I think, on the other hand, sometimes some of the issues arise because the policy is wrong. Too often, organizations will write what they want in an ideal sense into a cybersecurity policy and then push it out, and people won’t read it, or they'll read it and there'll be things in there that aren’t realistic for them to get their day job done, and so they will find workarounds. This obviously is very frustrating for the people in the infosec team, for example, but it's an example of where you haven't had that two-way dialogue.

I would always encourage organizations, and work with organizations, to look at their policy with the people who are actually going to be using it. Talk to people in the organization; find out what is working for them and what isn’t. When they aren't following policy, why are they not following it? Can you find a way forward that works for them but also puts the organization at less risk? Because if you just try to force rules onto people, and those rules get in the way of the day job, then they’re always going to break them, because security is not going to be their number one priority. It's about having that two-way conversation. It's about trying to have a policy and approach that takes into account how people work and tries to support and enable that. It's also about making sure that awareness is raised in a way that is interesting and that people can actually get, so you’re not just telling people what not to do — which is often an issue with cybersecurity — but explaining to them why certain things are going to be bad for security and why they cause damage, and how they can still work while putting the organization and themselves at less risk.

[0:21:44.3] TF: Jess, earlier you mentioned the password story, where in the end people were asking for access to that password manager inside the organization. To follow through on that, do you think that if an organization provides tools that users can also use at home, that helps the situation, or does it not add any additional value?

[0:22:07.7] JB: Yeah, I think it’s actually a key thing: helping people, either by giving them tools that they can use at home so they can have better personal security, or by giving them the knowledge and the information they need to stay safer in a personal sense, to keep their families safer.

At another bank I’ve worked with recently, we actually ran cybersecurity-for-parents sessions. These were evening sessions where parents, guardians, and people who cared for family members were invited along, and there was a small panel of us talking about cybersecurity from our perspective, and then it was an open floor where people could ask questions about their individual circumstances. They could ask questions about keeping their kids safe or about problems that they or their family members had had with security online. When you do things like that, you become really aware that a lot of people don't have anywhere to turn. If someone is worried about their teenager and what they're doing online, and whether they’ve had a problem with someone behaving toward or contacting them inappropriately, or if someone is worried that their Gmail account has been compromised, people really don't know where to go. Unless you happen to have someone working in cybersecurity in your family, a lot of people don't know where to turn for advice. They have questions and they can't get answers to them, and so they’ll be quite worried about them.

I think when an employer has those kinds of conversations, or facilitates them with the people who work for them, it works in lots of different ways. It's a good thing to do in terms of corporate social responsibility. It's a very nice thing to do in terms of having a positive culture, making your staff feel supported, and showing that you care about them. It’s a great way of spreading cybersecurity messages out to the general public, because employees can go home and talk to their families about them. It also keeps the organization more secure, because when people get into that cybersecurity mindset in their personal lives, or when they’re given tools they can use at home, they start to take those behaviors into work as well, and they'll start to understand security in the workplace setting as well as in the home setting.

I think a lot of what we're trying to do with awareness and behavioral change training is influence the mindset. We’re trying to get people to think about security. If you can get them to think about it at home, then they’ll think about it at work as well.

[0:24:46.8] TF: Yeah. That makes me think of (ISC)². They do quite a lot of cybersecurity training for children — a group that goes into schools and runs afternoon or evening sessions where they try to explain to kids the risks they’re taking using things like Facebook and other social media networks. It’s happening now, right? We need to train people, whether it’s training the new generations of information security professionals or just training people to use systems properly, and (ISC)² is doing it. Do you think it's something that needs to be embedded completely in our school curriculum? I know (ISC)² is doing it in the UK, because we both know one of the people actively doing it. What’s your position on that? Do you think we need to go that far down, or should we stay at this level, when we’re professionals and when we enter the workforce?

[0:25:50.0] JB: I completely agree that doing that kind of educational piece in schools is really important. I love the (ISC)² approach. They use Garfield in a lot of their resources for the younger kids, which I think is a really nice way to do it — making it accessible and more interesting for younger children. I definitely think providing cybersecurity education in schools is a vital thing to do. We all know that kids are using the internet from a really young age, and they will go on to use it more and more — when they're at university and then in the workplace. It has a twofold benefit: obviously, we want to keep people safe, especially young people, so it works for that; and then, when they go on into the workplace, hopefully they will have a level of awareness and understanding that's been built in from their time at school. I think that's a really important thing to do.

A couple of issues I’ve found with that. One thing: recently, I did a session in a school with some young teenage girls, talking to them about cybersecurity, and the girls knew most of what I was talking about. They had a lot of their own stories to tell, stories that hadn't really been shared with anyone else, but because I went in there talking about cybersecurity, they felt like they could open up to me, and they felt this was a conversation where they could talk about some issues they'd had. These issues really surprised the teachers, who knew nothing about them, and I think for the girls it would have been like, “Why would we talk to you about this, when we've never had those kinds of conversations?” There were a few teachers in the session, and they were the ones who were wide-eyed at what I was saying; they were the ones writing down questions and coming up to me at the end of the session. The girls definitely found it interesting, and they liked having the opportunity to talk about some of this stuff, but it was the teachers, I think, who benefited from it the most.

There is talk of, and there are plans for, rolling out cybersecurity further into the curriculum, certainly in the UK, which I would support, but teachers need a lot of support in that as well, because they aren’t cybersecurity experts. They have a lot of questions themselves. Kids have grown up with this stuff, but the teachers haven't; a lot of them are of a generation that wouldn’t have grown up with it at all, so we need to make sure that the people delivering the education are supported.

Another issue I have: I think it’s really important to focus on schools, to look at the next generation of people coming into the workplace and also to fill the so-called cybersecurity skills gap, but at the same time we can’t forget about the people who are out of school. We can’t just focus on 10 or 15 years’ time. We need to think about the people who are adults now.

[0:28:46.3] TF: There’s definitely a generation gap, right? If we pick up from where we left off, there’s definitely a generation gap between the young kids that are coming along and the generation teaching those kids, who have absolutely no idea about security and minimal competencies, I’d say, in just technology — using computers and things like that. Then we have the professionals that come into play.

There’s the aspect of who trains the trainer, right? How do we get proper training for the teachers so they can get those kids into the right mindset and help them get to that level? Then, on the opposite side, what’s going to be the impact on information security as we know it today as these kids come through to the workforce? Because it’s going to be a completely different ball game. We saw it a couple of years ago when the millennials started to come into the workforce: they all expected complete internet access, they all expected to have mobile devices, and that had a big impact on IT teams as those millennials came into the workforce. Do you think something like that is going to happen to information security as people coming into the workforce are more aware and more likely to have the foundations? Will that impact our jobs? Would that impact what you were talking about — building that cultural change, building those correct practices into the workforce?

[0:30:21.1] JB: Yeah, I think so. I certainly hope so. I think you're right, and you've raised a couple of really interesting issues. One, as you said, is who trains the trainer, and we really need to address that when we're trying to roll out more cyber security training and teaching in schools. At the minute, I think most cyber security teaching in schools comes down to private companies going in and giving ad hoc support and lessons to kids, whereas what we need is to get teachers together, get them trained and supported, and put a support network in place for them. I see their role as being a bit like the ambassadors I spoke about in organizations. If we're expecting them to go into schools, teach about cyber security, and be ambassadors for it, they need support: who do they go to if they have a question? Who do they go to when kids raise something they don't know about, or when there's an attack they don't understand? How can they get information they can pass on in a meaningful sense, and make sure that what they pass on is accurate and detailed enough to be meaningful for the kids?

Then, on the other hand, you asked about what happens when the kids that are growing up now with a bit more awareness when they get into the workplace. Certainly, I see this from young people. I see that they have more awareness of cyber security. There seems to be this kind of myth out there that young people are less security savvy, that they’re less interested in security and privacy, but that's not what I have seen or what I have found at all. For example, I did a bit of research a year or so ago, I think, on biometrics and on what people think about biometrics replacing passwords. I found that the most resistant group was young people, that was people like 18 to 24, I think. I thought that was really interesting.

One thing that didn't occur to me at the time, which someone suggested afterwards, was that a lot of people of that age have lived through having biometric systems in their schools. They've had to provide their thumbprint to get their school dinners and so on, and they've probably felt like they don't really want the school to have that information. They've started to think about what happens if it gets lost or someone accesses it, and they see attacks in the news and think, "That could happen to my school, and I'm not really comfortable with that." They also probably don't like the authority of a school demanding that of them.

I think we will get people with more of a security mindset coming into industry, and that's going to help with the culture of a lot of organizations. But there will also be an interesting tension, much like we saw with your example of so-called millennials and the uptake of tech. We'll see younger people coming in with more awareness of security, and there's going to be a bit of tension between them and the older generations in the workplace, who have existing practices that aren't as secure. Organizations will probably have to manage having very different groups of users, communicating the messages in different ways to try to get everybody up to the same baseline.

[0:33:35.0] TF: If we go back to the train-the-trainer aspect: almost everywhere in the world nowadays (maybe apart from France, where Macron has announced significant funding for training and enhancements to the education program) governments are cutting back funding of education, and there's a lot of pushback on what money education actually has to spend on things like that. Should we, as an industry, be funding or financing train-the-trainer programs for teachers so we actually get that message out and instill proper behavior in the younger generation? Do you think that would help, or should we carry on with programs like (ISC)²'s, where groups go in and do sessions on a regular basis?

[0:34:29.0] JB: For me, that probably sits more comfortably than it just being down to private organizations. There are a few issues, and the discussion could get very political, because who is responsible is a very big question in cyber security, especially when it comes to the general public, whatever their age. It's one thing when we're thinking about security in an organization, but when we're thinking about getting these messages out to the general public or to educational institutions, who should be responsible for that? Maybe it should be the state, but as you say, that's not really realistic considering there are already a lot of issues around resources and funding.

Then should it be private companies? In one sense, I think that would be fantastic. In another, you can see how there could be issues with that. If a private company is sponsoring education in a school, is that going to be very vendor-heavy? We've all sat through conference vendor presentations where people pitch their products, and I wouldn't wish that on schools. I'm exaggerating a bit, but there may be some conflict-of-interest issues there. Also, you may reach the point where the private company is no longer able or willing to provide that support, and that leaves a gap.

I think it's probably preferable that it's those kinds of professional bodies; if they have the facility and the resources to provide that support, then it's fantastic. Realistically, given the position we're in, anybody who can provide that support is going to be helping. Whether it should be our role or not is one question, but if you can provide that support, if you have a bit of time and a bit of resource, then that can help society in general and a lot of individuals too. That's why I engage with some of the schools work that I do: I have the information and can provide some of this support, so I try to do it where I can. If everyone did that, I'm sure it would help a lot of schools and a lot of people.

[0:36:36.3] TF: That's a good point. I think a consortium or professional groups are probably better, because you get a more diverse approach to the training and a more diverse view of security than you would from one individual company or vendor.

[0:36:55.5] JB: I think you’re right.

[0:36:55.1] TF: You’re always better off looking at more diversity.

[0:36:57.2] JB: Yeah, I think so. I think, as you say, you have more diversity and probably also more continuity.

[0:37:04.3] WG: That's great. Do you have any closing thoughts before we sign off?

[0:37:09.1] JB: I guess one thing I've been thinking a lot about lately, and that we're talking a lot about in the industry, is the rise of the human side of cyber security, because for a long time we've been more technically focused. In the last year or so there's been more said about the human side, more emphasis on it, which is really good to see.

From my point of view, this industry still prioritizes technology so much more, and I think we've got quite a long way to go on the human side. One thing that's really important is thinking about how we support IT workers, how we support people working in infosec, to develop more of the human-based skills.

We talked about awareness-raising training earlier, and the research suggests, if you look at some of the SANS Securing The Human work for example, that most people in roles delivering those kinds of education programs have a technical background. So I think it's really important for us as an industry, and for organizations, to think about how we're supporting those people in better understanding human factors, so that awareness-raising and behavioral-change training can be more successful. How are we helping them understand issues around psychology, sociology, and communications so that they can have more of an impact, whether they're talking to the average person in their organization or talking to the board to try to get more resources for the infosec team?

[0:38:37.1] WG: Thanks, Jess. Tim, any last thoughts or words?

[0:38:40.3] TB: Yeah, I think it's really important, and Jess touched on a lot of really important ideas. I agree that the success of these types of awareness and education programs in cyber really depends on consistency, time, and ingraining them within a culture. That's where I've noticed most of these efforts fail. I would also agree that the effort needs to be much more global. You made an excellent point earlier as well, Thomas, about seeding that type of training within the educational system, because the world is quite different than it was when most of us started out in this space. When you have an entire generation to —

[0:39:20.2] TF: That’s an understatement.

[0:39:21.2] TB: Yeah. When you have an entire generation that's grown up with unfettered access to the internet and all the accoutrements associated with that, it behooves us to take the time to think about these things from an elementary perspective. I would agree, absolutely. This was really an excellent podcast; I think it's going to do well for a while. Thank you so much again, Jess.

[0:39:41.6] JB: Pleasure. I enjoyed it. Thank you all.

[0:39:49.1] TF: For my closing thoughts: yeah, it's the same, really. The human aspect was grossly overlooked over the past decade or so, when we started this journey into cyber security. The fact that it's coming back, and the fact that we're focusing on it so hard, really will make a difference, I think. People are starting to be much more aware. Even when you look at commercials on TV, and Jess will probably laugh at this, the latest HSBC commercial in the UK is for two-factor authentication based on voice recognition. It's corny to a certain extent, but it's subtly introducing the idea that we need to be more careful about our daily access and our daily accounts. Obviously, the human aspect is really important today.

Yeah. Thank you again, Jess. It was good to hear you. I'll probably see you around 44CON in September.

[0:40:58.1] JB: Sure.

[0:41:01.4] TF: That didn’t sound too enthusiastic.

[0:41:05.3] TB: The enthusiasm was unbridled.

[0:41:07.9] JB: I’m sorry. I’m like at the end of conference season, and you know what it’s like? I’m so excited about —

[0:41:12.1] TF: You probably can’t wait to go.

[0:41:15.9] JB: Yeah. You see, for me, it's ended now, and I've got a month where I'm not getting on a plane (I was going to say or a train, but that's not actually true). I know that by the time September comes around I'll be fully excited for conference season again, but I'm so exhausted right now.

[0:41:32.2] TF: I understand that. Again, thank you, Jess, for being on the podcast. My name is Thomas. Thank you all for listening; we'll announce episode 11 through our Twitter feeds and the usual outlets. Expect to see this podcast online very soon. Thanks.
