Building cyber awareness for NIS2
Cybersecurity awareness is no longer a checkbox exercise but a core part of risk management under NIS2. In this session, you gain practical insight into how awareness and security culture can reduce real risk, strengthen resilience and help organizations turn regulatory requirements into meaningful behavioral change.
Why awareness matters under NIS2
NIS2 marks a clear shift in cybersecurity, moving responsibility beyond IT and into leadership and the wider organization. Awareness and security culture are now central to managing cyber risk. Employees become an active line of defense when they understand threats, relevance and expected behaviors, not just technical controls.
From compliance to real risk reduction
Many organizations focus on training completion and formal compliance, but this rarely delivers resilience. The real goal is reducing risk through behavioral change. Measuring impact means looking at reporting behavior, incident detection and decision quality, not only quizzes and attendance rates.
A structured approach to security culture
Effective programs start with understanding human risk before designing interventions. By identifying whether challenges are knowledge, skill or behavior related, organizations can target the right initiatives. Continuous evaluation ensures efforts scale only when they deliver measurable impact and lasting resilience.
Transcript
Welcome everyone and thanks for joining this bright and early morning. Today, as you probably know, we will be talking about how to strengthen cybersecurity culture, specifically in the context of NIS2. We aim for today's session to be very practical, but already now, start thinking about any questions you might have for Ulrich and me, and we will try to answer them as we go along or at the end. We also expect a lot of you to be on a commute right now, which is completely perfect on the way to work. So don't write them in the chat if you're driving a car. Keep your eyes on the road. We don't want to get in any trouble. But we're here if you have any questions for us today. Today we'll cover why awareness matters for cyber resilience, what NIS2 actually requires when it comes to training, awareness and cybersecurity culture, the common pitfalls that we see all the time when working with organizations, and then we'll leave you with a how-to on building structured awareness programs. We have an anonymized client plan to show you as an example. So our goal is to help you build a stronger security culture that reduces actual risk and increases compliance with NIS2. But before we start, a quick introduction. Yes, I can go first. My name is Ulrich. I've been working at Implement for five years. I have a legal background and my specialty is regulatory compliance and security. And I tried to do a count, and I think by now I've done more than 30 NIS2 projects. So I'm fairly well versed in that area. Okay. I'm Mia. I've done fewer NIS2 projects. I have a background in the humanities, so not especially technical, but I build on that. And I work within the human risk area, where I help organizations think strategically about how to build strong cybersecurity cultures. And I think that what Mia's work and my work have in common is that we both focus on helping organizations turn cybersecurity requirements into something more practical.
And that means helping organizations achieve real behavioral change and organizational resilience. And a big part of that, both for Mia and myself, is obviously security culture and security awareness. All right. And I just went over this before, but for the new joiners, what we will cover today is, first of all, the introduction, which we're doing now. Then why we are giving attention to awareness when talking about NIS2. Makes sense to establish that, hey? Then the actual requirements. We'll go into that with the 30 NIS2 projects. And then the how, the pitfalls, and we'll leave you with implementation steps as well. So, again, if you have any questions, please feel free to write them in the chat. We'll try to cover them as we go along, or we'll make time for them at the end. All right. Let's dive into it. Thank you, Mia. And this shift is also what NIS2 is trying to achieve. So it goes very well hand in hand with the regulatory landscape that we're seeing now. And obviously, I think that that is self-explanatory, but awareness and security culture are a very big part of that shift. So before we get into the specifics about what NIS2 says about awareness, I want to set the regulatory stage a bit, so to speak. And as many of you have heard by now, NIS2 marks a clear shift in cybersecurity. It's no longer just an IT concern, but a leadership responsibility. So what is new about NIS2 is that it introduces a risk-based approach. Many of you will have heard that a million times by now. This means that organizations must focus on real threats, real vulnerabilities, and actual business impact, with management actively involved in oversight of managing these risks. And what is very important about this regarding NIS2 is that cybersecurity now explicitly goes beyond technology.
And as such, NIS2 puts a very strong emphasis on awareness, human security and security culture. And this means that organizations need to build a culture where employees understand risks, where they stay alert, and where they act as the first line of defense for the organization. That is, in essence, what NIS2 is about. So it moves cybersecurity away from technical functions to organization-wide responsibilities, with awareness and security culture at the very core. And so it's a NIS2 requirement to work within this field. But you might say, NIS2 or not, there is still a lot of value in focusing some effort on the human aspect. What we see is that strong organizations focus on three pillars in their security work: process, technology, and people. And we also see a pitfall: most organizations invest heavily in technology and have a very structured way of managing risk within their processes, but they lack a structured approach when it comes to mitigating the people aspect, the human risk aspect of it. And I've heard "but culture is such a difficult thing to work with" so many times. It might be, because people change, they're dynamic. But it's actually something that we can work with in a structured way. We'll get back to that. And the thing is, there are very few controls, technical controls, process controls, organizational controls, whatever you want to call them, that don't require that some person needs to change a behavior. Maybe they don't know how to. Maybe they don't even want to. We do also see a lot of resistance within the cybersecurity field. And that's why it makes sense to start working with this. Because, first of all, cybersecurity, to be honest, is rarely designed for optimal human behavior. There are a lot of things that we set in place, barriers, that don't feel natural for the human brain, because they make things harder.
So that is one reason why we should keep working with the human aspect of this, and, as a side note, look at the way we design cybersecurity as well. And then you've probably heard a million times that humans are the weakest link. I don't really subscribe to that, because in my opinion, humans are a dynamic risk factor, meaning that you can make people do the weirdest or the greatest things. People go to war, for crying out loud, if it makes sense to them. So people are an immense resource if you nurture them. But if they don't see the meaningfulness in doing things, if it doesn't make sense, if they're dismissed as not understanding what to do, they will start disobeying, because that's what we do as human beings. So we're either the strongest firewall or the weakest link. And third, if we invest in the human aspect, we also invest in making a bigger impact with the rest of our initiatives. Because if we are able to establish meaningful cybersecurity core narratives around what we do, then it will be way easier for people going forward to understand why a difficult legal process is needed. It's way easier to tap into that. So, NIS2 or not, it makes sense to invest in the human aspect. And if you are to remember just one thing today, it's that security awareness is part of risk management, and it's something that we can work with in a structured way. It's not something that we just do within HR, though we would like them to play a big part in this. It's not just a one-off communication campaign that we do every cyber month in October. It's not something that we do purely for compliance. Not if we want our work to have an effect, an impact, at least. It's a risk-reducing measure, just like any technical control that we would implement. And that is also exactly how NIS2 treats it. Yes. A little bit more on NIS2. Yes. So, with that said, what does NIS2 actually say about awareness?
And what is a bit strange about this is that it's a central requirement, but NIS2, as anybody who's read the text will know, is super high level. But there are a few things that we can actually extract from the text. So, the first part is management accountability. As such, under Article 20 of NIS2, leadership must approve cybersecurity risk management measures and oversee them. And here comes the kicker: they must receive cybersecurity training themselves. So, it's a direct requirement in NIS2 that management is trained in cybersecurity awareness. Then, secondly, we also have a baseline: basic cyber hygiene and awareness is required. Under Article 21 of NIS2, organizations must implement appropriate and proportionate security measures, and this explicitly includes cyber hygiene and awareness programs. So, it's part of the measures that are required under NIS2. However, NIS2, like I said before, doesn't say anything about the types of awareness to be applied, the frequency of awareness, or the specific audiences of awareness beyond management. Are we talking about everyone? Are we talking about certain roles? What are we actually talking about? Who needs training? But we are in luck, because we can actually learn a bit more from some other legal sources that help us understand what the requirements under NIS2 are. And there are many of these. We've included a few here. So, first of all, we have national guidance. In Denmark, for instance, Styrelsen for Samfundssikkerhed has issued guidelines on measures. Then, secondly, we have what is called the implementing regulation, which specifies the requirements in NIS2 for certain actors. I'll get back to that. And then we have some guidance from ENISA on this implementing regulation. And the implementing regulation, I think, is actually, of these sources, the best place to start, because it's quite specific regarding what could be involved in awareness and security culture.
But please do be aware that these, both the implementing regulation and the ENISA guidelines, only apply to certain technology providers. So, they do not apply to everybody who's in scope of NIS2, only a certain part of those. However, in my assessment, these regulatory texts actually reflect general principles that are relevant to all types of organizations, no matter if you're in scope or not. So, please don't be alarmed regarding the scope of this. Instead, focus on what inspiration you can actually get out of it. And on this slide, which I will not go into depth with, we have simply included links to the implementing regulation, the ENISA guidelines, and the guidelines from Styrelsen for Samfundssikkerhed. After the webinar, we will share the slides with you, so you can go in and click on these links yourself and read up on the sources. But, you know, like Jamie Oliver, we have actually done a little bit of pre-cooking to help you understand, at a bit higher level of detail, what is actually in these legal texts. So, I will now try to dive into some of them. And if we scan through the legal texts, we can see that quite a high number of different possible awareness initiatives are mentioned. We have everything from cybersecurity information materials and handouts, like a pamphlet, to physical sessions and events. We have attack or incident simulations; phishing simulations go into that bucket. We have virtual training, e-learning, gamified interventions, gamified nudges, and then courses and certifications for IT professionals. And I think that this list, more than anything else, just shows us that organizations must remember to think broadly when they think about awareness. So, awareness is not only e-learning for all parts of the organization. It could be anything.
And because NIS2 doesn't prescribe any specific type of awareness or security culture program that you must apply, the key takeaway from this part of the slide is that awareness, like Mia said before, must be risk-based. And that's also something we'll get back to later. Similarly, like I said before, NIS2 doesn't say anything about the frequency of training. The frequency should be based on the risk that we are trying to address with the training. So, if the risk is that we don't have any awareness in the organization at all, that calls for one kind of awareness with one frequency. But a good rule of thumb is that, if nothing else, you should at least ensure that all your training and awareness elements are run at least once a year. And I'm not saying that's always our recommendation, but at least once a year. And Mia is smiling, but we'll get back to that later. I have notes. And what is important is that we are not talking about the format; we are talking about the impact on behavior. And speaking of impact on behavior, we also want to say a few words about how to measure the success of training. Because what we see with a lot of our clients is that they tend to focus only on the outcome of the program, not its impact. So, when we measure whether our security campaign was successful, we will look at stuff like: how many people completed it, how well did they fare on the final quiz, how did they react to those fake phishing simulations that we sent them, how engaged were they during the training? And those KPIs are good, I'm not saying they aren't. But what I'm saying is that they say nothing about the impact on the risk and on our security level, only about the outcome of the training itself. And like I said before, the training is not the point; the risk reduction is the point. And therefore, our advice is, when you look at how well you fare in your training, to look at the impact as well.
So, that means looking at the observed behavioral change after the training or the security culture program has been completed. For instance, do people actually start reporting suspicious emails? Does our time to detect incidents go down? Do our risk assessments actually have a higher quality, and so forth? So, again, only the imagination sets the limit, but remember to look at the impact of what we're trying to achieve, and not only the outcome of the training. And is this easy? No, it's not. Because it's easier to have those training completion rate KPIs. I understand that. And that's probably why we see some common pitfalls when working within this area. One is doing pure compliance. It's the very least effort. It might get you the stamp, because it's fast and cheap, but obviously it will not guarantee actual resilience. An example of this could be having people sign a document saying "I will adhere to the security measures within this organization". Will it impact your resilience? Probably not. But you have a CMA, cover my ass, in place. Another pitfall that I see a lot of the time is doing generic awareness. That takes a bit more effort than having people sign a document. And honestly, it's also a good place to start if this is a completely new field for your organization, because then you need some foundation to stand upon. But the downside of doing generic awareness, for example via e-learning, is that you have to consider that every single person in the organization spends time clicking through the e-learning. The learning outcome is variable at best. And the real problem is that you don't actually know whether you're mitigating the actual risks that you have within the organization. But again, it's a good place to start if you have nothing right now. So, the recommendation, obviously, is that you do spend a bit more time on this.
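The impact measures mentioned here, reporting behavior and time to detect, can be tracked with very simple tooling. The sketch below is purely illustrative (the data, function name and numbers are our own, not from the session): it compares an employee reporting rate and a mean time-to-detect before and after a training intervention.

```python
from datetime import timedelta
from statistics import mean

# Hypothetical incident log entries: (reported_by_an_employee, detection_delay).
# All values are illustrative, not figures from the session.
before_training = [
    (False, timedelta(hours=30)),
    (True, timedelta(hours=4)),
    (False, timedelta(hours=18)),
]
after_training = [
    (True, timedelta(hours=3)),
    (True, timedelta(hours=5)),
    (False, timedelta(hours=9)),
    (True, timedelta(hours=2)),
]

def impact_metrics(incidents):
    """Return (employee reporting rate, mean time to detect in hours)."""
    report_rate = sum(1 for reported, _ in incidents if reported) / len(incidents)
    mttd_hours = mean(delay.total_seconds() / 3600 for _, delay in incidents)
    return report_rate, mttd_hours

print(impact_metrics(before_training))
print(impact_metrics(after_training))
```

Comparing the two tuples shows behavioral impact (did reporting go up, did detection get faster) rather than training outcome (completion, quiz scores), which is exactly the distinction the speakers draw.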
The downside, obviously, is that this requires your organization to invest a bit more. But the upside is that if you do the actual analytical and strategic work, you can be more confident that your investment in this area will actually result in a positive impact on your resilience. So, that's just to sketch out what we see a lot of the time when working within this area. And as we promised, here is the approach that I, at least, work with a lot when I go into organizations and have to work on creating lasting cybersecurity cultures. A lot of the organizations I work with jump straight to the design phase, design and implement. And, yeah, it's the fun part. You get to actually design all these behavioral interventions that you want to do. The problem, though, is that if we haven't analyzed what our human risks are, we might not design for full impact. At least we don't know. We are probably not mitigating our highest risks if we don't know what they are before we start designing. So, what I would urge you to do is invest some time in the first phase, which is solving the right problem before we start to solve the problem. And that requires knowing your human risks. For example, do you have a knowledge problem within your organization? Do people actually not know what to do at all? Is it a skill problem, that they don't know how? Or do people know what they need to do and how to do it, but they still don't do it? Then you probably have a behavioral problem. So, just to take an example, if people don't know that there's an increased threat from cyber, then that's a good place to start, yeah? Create that awareness. And if they don't know how to navigate this, then training is probably a good way forward, so they know what to click, how to be aware, how to report, all these kinds of things.
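The three diagnostic questions here, does the person know what to do, know how, and actually do it, form a small decision procedure. As a minimal sketch (our own illustration, not a tool presented in the session; the function name and labels are assumptions):

```python
def diagnose_human_risk(knows_what: bool, knows_how: bool, does_it: bool) -> str:
    """Map the three diagnostic questions from the session to an intervention type."""
    if not knows_what:
        # People are unaware of the threat at all: start with awareness messaging.
        return "knowledge gap -> start with awareness messaging"
    if not knows_how:
        # People know the threat exists but not how to act: train the skill.
        return "skill gap -> hands-on training"
    if not does_it:
        # People know what and how, yet still don't act: fix barriers and frictions.
        return "behavior gap -> remove frictions in processes and design"
    return "no human-risk gap identified"
```

The ordering matters: awareness comes before training, and training before behavioral interventions, mirroring the sequence Mia describes.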
But if they know the cyber threat is there, if they know how to navigate it, and they still don't do it, there are probably some barriers, some frictions, within the organization, in the processes, in how you set things up, that you need to mitigate with interventions. So, having this foundation before you go into design is quite crucial for knowing that you will mitigate the risks that are actually present within your organization. And then, obviously, the last phase is that you start building on top. Before you scale, please make sure that you can document that your intervention has the impact that you're looking for. Sometimes people don't do what you expect them to, and you might get a worse outcome. So, please, don't scale before you know that it's a good outcome. And building a cybersecurity culture is not just a one-off. It takes repetition, and it requires putting some thought into building on top of initiatives. So, you don't just do the yearly one-off going into what is phishing, what is social engineering, and all these kinds of things. You'd rather have something to build upon. So, that's usually the approach we would take when meeting an organization. And Ulrich will give an example. Yes. And actually, before that, I just want to give another real-life scenario of something that happened, where we had a client that did a security program where they wanted to raise awareness regarding incident management. And the first thing that happened after they did that was that the amount of incidents that were reported increased a lot. Yeah. And they said, what is happening? Are we getting more incidents? No. But the amount of incidents that you have to handle is increasing, because people don't understand what an incident is. That's not what I wanted. Yeah, exactly. And that was not what they were going for. But that was the first step in actually starting to work with this.
So, like Mia said, please remember to look into how people actually react to the training when you do it. And, like Mia said, for the last part before we go to the questions, and there is actually a question in the chat, which is very nice, we want to show you a specific example of some client work that we did. So, this is a roadmap, their awareness and security program roadmap for 2026, obviously anonymized. And it's fairly detailed, so with the time that we have remaining, I will not go into each and every deliverable. But there are a few things that I would like to highlight. First of all, as you can see, if you take a look at Q1, we actually spent quite a lot of time on ensuring that we solve the right problem. So, we began with a risk assessment that was approved by management. And based on the behavioral gap that we then identified, we could implement KPIs and impact measures and finalize the project plan. And I think this is a good example of how much time you can actually spend on solving the right problem. And if we had not done this, I'm not sure that the next initiatives we did would have had the same impact as they did. Secondly, and this is where the real fun begins, in Q2 we started working in two different security culture tracks. One was core initiatives, which are awareness initiatives across the business. And the second was specialized initiatives, which target specific employee groups with elevated human risks. And I think for this client, it was finance, research and development, privileged users and HR. The core initiatives pertained to secure use of AI, no surprise there. And the role-based training pertained to social engineering, because those were the human risks that were identified in the first phase. And I think there was a higher risk of these specific segments within the organization being subject to social engineering, via social media specifically.
So, that's why we wanted to go more into the social media aspect of social engineering, on top of doing, of course, secure use of AI, which was for everybody. Yeah. Just to add. Yeah, that makes sense. And along with the initiatives, obviously, we measured the impact. So, not only, like I said, the standard KPIs, how many people attended the secure use course, but how it actually affected their use of AI in their day-to-day work. And then, and this also ties very nicely into what Mia said before, at the end of the year, we will evaluate the success of the program in accordance with the set-out KPIs and impact measures, and then prepare the initiatives for 2027 based on that. So, the plan for 2027 will definitely not look like this. It will be based on what the actual impact on security and risk was. We have two minutes remaining and we have two questions in the chat. So, I will skip ahead to the final slide, Mia, if that's okay with you. Yeah. So that we can actually get to them. Should we start with the questions and then do a summary? Let's do it. So, we have two questions. The first is from Deo, who says: let's talk money. We always like talking about money. Many small and medium utility providers in the Danish water sector operate with limited budgets and under stringent financial regulations. Where should such organizations prioritize their initial investments to get the most resilience impact per krone? I think that's a very good question. Do you want to start, Mia? Yeah, I can start. I would definitely ask myself the question: what are our human risks? Not all the other risks that you have within your organization, because you probably have a lot of technical and process-related risks as well, but what are your human risks? This is probably something you know if you're a quite small organization. You usually know what people tend to do. And then I would also ask myself the question: what have we done before?
If you haven't done any awareness work within this field, I would probably start there, with some quite generic messages to go around the organization. Because it's way better to do something than nothing. So don't go into analysis paralysis. If going through the strategy and analysis phase feels like a lot, then just do something, and then start building from there once you get more confident in what you're doing and in how people respond to the initiatives that you set in place. So: know your human risks, and, depending on what you've done before, at least do something within the generic awareness field. And I would say I agree, and on top of that, if you are a small organization, the upside is that it should be fairly simple to figure out what the level of awareness in the organization is. It's not impossible to send out questionnaires or conduct interviews, given the size of the organization. Then the second question, from Joanna: along with mapping our risks within the strategy and analysis phase, how often are organizations missing the opportunities that can be built into the design phase? For example, OT data and automation usage and possibilities that can be identified while creating segregated networks. And I think, to answer your question very shortly with a few seconds remaining, that is a very, very good point. We always advise organizations to look into the business opportunities that exist when you do these types of initiatives. And very often we actually see that our clients forget that. So I think that's a very good point, and maybe a very good place to leave this. Remember to not only focus on the risks, but also on the opportunities that exist when you do these types of programs. Yeah. All right. Thank you so much for the questions, Joanna and Deo. Will you have the honor of summarizing? Yes. Thank you. So, to summarize: NIS2 raises expectations for cybersecurity governance and resilience.
Awareness is now a core risk management control, not a communication exercise. And organizations that succeed will treat awareness as a strategic program that shapes behavior. Thank you so much for listening in. We will share the slides, and please feel free to reach out to Mia or myself with any questions you have. Have a very good rest of your day, and thank you so much for today.