Our new podcast for boards, executives and cyber professionals
The Cyber Brief is a podcast for decision-makers in cyber. Through candid conversations with the industry's best, The Cyber Brief delivers executive-level insights on cyber risk, best-practice governance and emerging threats. Leaders in the field share practical insights, real-world stories and actionable advice for boards, executives and cyber professionals.
Our first episode tackles an issue that's critical for the industry but often overlooked: the psychological toll of cyber incidents on those who manage them.
In this episode partners Valeska Bloch and Sikeli Ratu speak with Peter Coroneos, the founder of Cybermindz and a pioneer in applying military-grade trauma recovery protocols to cybersecurity. Peter shares with us his real-world experience and insights on the psychological toll of cyber incidents, why it's a growing concern for operational resilience and best-practice techniques for mitigating psychosocial risk.
Episode one: The human side of cyber incidents with Peter Coroneos, Cybermindz
Resources from episode one:
Allens resources
- Why organisations must embed mental health and wellbeing support into cyber incident response planning
- AICD's guide for directors on governing through a cyber crisis
- Allens' cyber resources and contacts
External resources
- Cybermindz website
- Cybermindz research
- Peter's show recommendation, Mr Robot
- For mental health support, call Lifeline Australia on 13 11 14
Valeska: Welcome to The Cyber Brief, the podcast for decision-makers in cyber. Through candid conversations with the industry's best, we bring you executive-level insights on cyber risk, best-practice governance and emerging threats. We've advised on some of the world's most complex cyber incidents, and we know what it's like in the trenches. We're asking the experts for their unfiltered truths and best advice on what executives, boards and cyber professionals should be doing now to stay ahead.
Valeska: Hi, I'm Valeska Bloch, head of cyber at Allens, and today's episode takes us into an area that's critical for the industry, but often overlooked, the psychological toll of cyber incidents on cyber incident responders and decision makers, why this is a growing concern and what to do about it. My co-host today is Sikeli Ratu, a partner in our employment, industrial relations and safety practice. Sikeli works closely with organisations to help them manage work health and safety risks. Joining us today as our guest is Peter Coroneos, the founder of Cybermindz and a pioneer in applying military-grade trauma recovery protocols to cyber security. Peter's career spans internet policy, leadership, neuroscience and stress management, with a brief cameo as a clerk at Allens. He now advises leadership teams on best practice support for those who defend critical systems and respond to cyber attacks. Cybermindz has conducted world-leading research into burnout in the cyber sector, and I vividly remember when back in 2022 Cybermindz published the findings of one of its early studies, which found that cyber professionals were burning out faster than frontline healthcare workers during the pandemic. It was a real wake-up call for the industry, especially given the skills shortage we face. We're going to cover a lot of ground, from the psychological and physiological impacts of cyber incidents to the dangers of blame culture, to resilience techniques, to how to get buy-in from management to invest in strategies before and during incidents, to build resilience and mitigate those impacts. We learned a lot from the conversation, and hope you enjoy it too.
Thanks very much for joining us today.
Peter: No problem, and I almost ended up working at Allens. So this is an interesting circle back to when I graduated from law school, from uni at Tasmania, and did my summer clerkship at Allens, and was actually offered a job. But for whatever reason, the universe had other plans, and I pivoted into the internet sector.
Valeska: Your sliding doors moment.
Peter: It was a narrow escape.
Sikeli: I think it's a testament to the quality of our summer clerk recruitment process. Can you tell us a little bit more about the Cybermindz origin story? What inspired it? Why is its mission so important?
Peter: I really didn't anticipate the human dimension of cyber incidents until I started working more directly in cyber security. So just to backtrack: after leaving law school, I actually contemplated joining Allens, but instead decided to spend a year with the competition regulator on telecommunications and other issues. That was an area I'd done in honours law, but for whatever reason I ended up leading the internet industry association in Australia during the heady days, from before the dotcom boom right through the rise and fall. Even then, cyber security emerged as a major determinant, I think, of the whole trust paradigm around what the internet was to become. And so we did a lot of early work around policy and legislation, all predicated on the idea of building a safer, more trusted internet. In the course of doing that, we were engaging more and more directly with cyber professionals, practitioners in fact, and it became evident, particularly in the lead-up to Covid, when we were seeing attack rates going up, that the consequential fallout was really quite catastrophic at a human level. I was also leading the Asia-Pacific arm of the cyber security advisors network, and ended up becoming the global vice president. So we had a network of cyber professionals across 22 countries, and, I don't know, something came over me and I thought I'd start running meditation classes with them, just to help get them a bit clearer and out of their stress. During Covid, I actually stepped back, because we couldn't travel at all, and went and got qualified in this integrated restoration protocol, which is a military-grade intervention used for trauma, burnout, depression, anxiety, insomnia, a whole raft of mental conditions. To really sum it up, my mission is to alleviate suffering in the people that we know and love and work with on a daily basis.
And fortunately, we've got something now that we can measure, we can show the efficacy of this. We have psychologists that we work with in Cybermindz that come in. We do the research, as you said, Valeska, which did, in fact, surprise the nation, with those findings back in 2022. To be fair, you say that to a cyber professional, and they think for a nanosecond and realise, yeah, that's probably right.
Valeska: It wasn't hugely surprising to me when I saw it, but it's still really concerning, and putting it in those terms is pretty confronting. It's certainly something we see during incidents, when you're in the middle of it, and we also see the after-effects in the context of the post-incident reviews that we undertake. Even when those take place months later, you can still see the after-effects and how raw it still is as people relive it. Aside from the individual victims of a cyber incident, it really feels as though there are three key groups of individuals for whom an incident can have a really profound psychological impact. You've got the cyber defence and cyber response professionals, who are operating in a heightened state of vigilance and often feel a real sense of responsibility when a cyber incident occurs. You've got the broader cyber incident response team, who are not necessarily cyber professionals, who turn up to work thinking about their to-do list and are suddenly yanked into a very intense, around-the-clock period of responding to a crisis. And then you've got the key decision makers, who have all of that, but who, with some of the types of incidents and threat actors that are on the rise, are also increasingly receiving threats of violence and other sorts of harassment intended to induce them to pay a ransom. So slightly different stresses, I guess, for all of those groups, but still quite a significant impact. Can you talk us through, for those groups, how they experience a cyber incident? How does that play out, and how does it impact organisations operationally?
Peter: Incidents typically go through a number of phases, and depending on which phase you're in, there'll be different people involved, and the impacts on them psychologically will vary, although there are commonalities we can point to as well. In terms of the initial responders, you've got the detection first, so you have the SOC teams looking at all the incoming alerts. Even outside of a breach situation, we see a lot of alert fatigue, hypervigilance, sleep issues and burnout. In an active breach situation, all of that gets escalated, and you start to see a lot of emotional response coming in as well. You've then got an escalation process, the next stage, where things get triaged. So you're going to have your threat hunters and your incident responders coming in. At this point, you're looking at an active adversary in the network. So you've got this sense of violation, that somehow the system, the defences, have been penetrated. You don't necessarily know what they're doing yet, but you certainly know that they're in there, and then at some point you may be hit by a ransomware demand. Of all the different forms of cyber incidents, the two that I think have the greatest psychological impact, and this is from the organisations we work with, are ransomware attacks and insider attacks, insider threat.
Valeska: Why is that, Peter?
Peter: A sense of violation, primarily, a sense of moral injury. It comes from the fact that something so close to you has been impugned, and you're carrying the sense of downstream consequential impacts that could touch tens of millions of people, as we've seen. So it's a tricky situation. At that point, they're also in a heightened emotional state; they're in limbic system activation, basically in fight-or-flight mode, unless they've been trained. A lot of the training we do is to actually try and get that back under control. But you can imagine, in a high state of emotional arousal, you're not actually thinking clearly. You tend to develop tunnel vision, so you're potentially missing a lot of cues that you shouldn't be missing, simply because of the nature of the neurology that's in play. I often talk about this; I have a background in law and science, but also evolutionary neurobiology. The evolutionary construct that gives rise to the limbic system was all predicated on a physical threat actor, a saber-toothed tiger or an attacking tribe or something. And so there were always cues to signal when you were in danger, but also when you were safe, so the system could de-escalate. A lot of what we see within an incident situation, or even just the high state of alert in the threat landscape in which all cyber teams are operating now, is that there is no off switch actually being triggered. And so as a result, we see a lot of the psychological, but also physiological, effects that come from hyperactivation of the limbic system, and that can precipitate all the way through to anxiety and into full depression and a lot of those symptoms, and certainly in an incident, all the way through to trauma.
Valeska: And overlaid on that, a lack of sleep. The other thing we often hear about is just the sheer information overload, coming from the various dashboards and alerts and inquiries from different stakeholders and senior management and boards. It's a lot, and then there's the urgency of trying to analyse and make sense of it: a lot to take in, digest and then turn into something sensible.
Peter: It's suboptimal in the sense that we really want our cyber teams to be in their clarity, keeping things in perspective, having a sense of control of the situation. Even though there is no ultimate control, you want at least a sense that you've got control over the mental instruments and the training that you're seeking to bring to bear on the situation, and yet that's been degraded simply by these ancient evolutionary factors that we don't tend to have control over. Then we move into the containment stage, and we see exhaustion and fatigue setting in, and then we start to move into the resolution, the recovery and the aftermath. As you mentioned, there's a lot of debriefing, and the recriminations come, depending on the culture. There can be blame; there can definitely be a sense of shame and failure that comes from that. And then to talk about trauma specifically, because I think in these very high-intensity breach situations, the evolutionary basis of protecting our own state of mind revolves around the sense that you have a limited amount of cognitive and emotional capacity that you can bring to bear in the moment. Typically, what happens is that where that is overwhelmed, we will tend to sublimate those stressors, sublimate the impacts, to push them down into the subconscious mind, and ideally to go back and deal with them and resolve them. But if that doesn't happen, it can culminate in PTSD-like symptoms of trauma, where you lose your sense of internal psychological safety. That is taken from you, and it doesn't self-resolve. So this is where the intervention needs to occur: ideally during a breach, so that teams can be supported even as this is unfolding, but at the very least immediately post-breach, so that you can get to them and bring in this sort of trauma support.
In our case, it's very peer-informed, because we are cyber professionals ourselves, so we're coming in with the language and the culture of cyber security to actually help de-escalate. Actually healing the trauma, literally healing it, can be done. But if that isn't done, and just to work through the sequencing here, we often do see resignations, people leaving as a result, possibly leaving the organisation with unhealed trauma, and that gives rise to a lot of duty of care issues that I know you pointed to in that advisory you put out.
Sikeli: It's fascinating to hear you speak about these sorts of issues. Your points about psychological safety and the interventions you see as possible and would recommend are, I think, of great interest to employers, certainly the clients that I advise. Could you speak a little more about what those interventions look like during an incident and immediately post-incident, and whether there are any strategies or interventions that could be done prior to an incident, almost the preparedness piece?
Peter: Prevention is always better than cure, in any situation, really. In terms of building resilience, we talk about establishing within individuals and teams and leaders an internal sense of psychological safety that they can return to, that they can invoke and call upon at any time. And so our training is all about showing them how to access that. The beauty is that once you actually inculcate those techniques into their daily life and into their weekly workload, the brain will respond, using principles of neuroplasticity, to start actually developing structural capability. So this isn't just a psychological trick where we're learning how to talk ourselves out of stress. This is much more grounded in the neuroscience, which is about reducing the activation of the areas of the brain that fire during trauma, and strengthening the areas of internal emotional regulation in the anterior insular and cingulate cortex of the brain, the different parts that map to a more effortless sense of control. I've meditated most of my adult life, so I know that for me this feels like second nature now, but most people who haven't had that neurological development in this area are very easily tipped into emotional imbalance. And I think the more that we can train them, give them the skills, and actually precipitate neurological growth and change, that is absolutely the key. If you want to develop a resilient organisation, you need to have psychologically resilient individuals protecting the organisation.
Valeska: Peter, what does that look like, practically? Is it a sort of very regular practice? Is it a specific type of training? What does it actually look like?
Peter: Well, the protocol we use, the integrated restoration protocol, is the methodology that we apply. It's a 10-step deep relaxation sequence that is run through with a live facilitator. In our initial training, we take eight weeks, an hour a week, with a group, and we lead them into slower states of brainwave activity, where they're getting a deep physiological reset. They're also getting the capacity, importantly, to locate within themselves the centre of internal psychological safety that we say we all carry. And so we start with that part of you that is unbroken, that can never be damaged or attacked, and we ground you back into that. We call it the inner resource, for lack of a better word. Our premise is that we all have some internal capability, some inner core of strength; even though we may not feel that it's available to us, it's definitely there. And so the training is about orienting them back into that state. Once they're there, it enables them to detach from the emotional drivers and gives them a bit of perspective. They start to report that they're getting better sleep and making clearer decisions, because they're now out of the fray and back into a sense of higher perspective, of calm. So the practice is repetition of that technique, and over time, within a week or two, they'll start to report feeling calmer and more in control. Ideally, we run this for a year with a team. In a perfect world, you'd want to build this in as part of sustainable workplace support, because the threats are relentless, and they're not going to go away.
Valeska: And is the idea that you develop that muscle on a sort of BAU basis, so that it's then something that can be practised during an incident as well? But then in terms of—
Peter: Well, not even practised; I'd say it becomes second nature. It automatically cuts in. You don't even have to remember it. The muscle analogy is a good one because, in a sense, the parallel is like going to the gym and building strength: you're now carrying strength whether or not you need it. If you go to lift a heavy object, you don't have to remember it; it's there. I think that is the analogy, for sure.
Valeska: And so then what are some of the specific interventions during an incident that you would recommend?
Peter: So let's say they haven't done the preparatory work, they don't have any resilience at all, and they're in complete fight-or-flight mode. They're feeling all that psychological fallout: irritability, insomnia, fear, anxiety, self-doubt, self-blame, all of that comes up. We would then actually give them, even within an active breach situation, time out where they can separate from the incident for long enough to get a different perspective on it. We remove them from any sense of personal responsibility, in the sense that this is a role that they are performing. Ego function is a big part of the problem here: the ego aspect of mind is self-appropriating and seeks to make everything about itself, including failure, including 'what I could have done differently'. The key, though, is that within any professional there should be a sense of professional detachment. The more you can depersonalise this and make it about the mission rather than the individual, that everyone is part of a team, that you're all working towards the same objective, and that this is not something you created but something we are now responding to, the better. And the issue, as I said about the subconscious drivers, is that none of this works if you're just talking to people at the conscious level of mind, because I can tell you tomorrow, you know, to be more relaxed, and you'll agree that you should be more relaxed, but it's not going to make you more relaxed.
Valeska: Don't tell me to calm down.
Peter: Exactly. The point is that we have to do the work at the level beneath; it's basically the operating system level of the mind, we're going down into the OS. That's where the work has to happen, because that's where the trauma is stored, but it's also where the healing and the resilience occur. I think that's the key to this. If you haven't studied psychology or, you know, depth practices, a lot of this sounds a little bit mystical and woo-woo, but the truth is it's all grounded in neuroscience, and we can show experimentally that it works. And the key in an active breach, just to finish the point, is that you're actually giving them the healing even as the damage is occurring, so that there is no accumulation of trauma and unresolved triggers that come back and bite you a year after the incident, when you see the litigation commencing, or whatever else might be there.
Sikeli: Really fascinating, Peter, to hear you speak in that way about these sorts of issues. The point you make about the ego and avoiding a self-focused blame state is one that I think has some really interesting parallels with other work health and safety issues that have been considered previously, not from a psychosocial or psychological safety perspective, but in industries where, for decades now, we've known that if you move away from a blame culture and a blame focus, you get an improved safety culture, more people reporting safety concerns, and better quality of reporting. So I think there are a lot of businesses, a lot of organisations, for whom what you're saying would resonate at a practical level, but who probably haven't thought about extending it into this particular kind of hazard, this particular kind of risk that's exposed by cyber incidents. So it is fascinating to draw those links between the field that Cybermindz and you have been exploring, and what we know from safety culture, particularly in this country.
Valeska: And interestingly, the AICD Governing Through a Cyber Crisis guide, which is fantastic, actually lists scapegoating or blaming in the context of an incident as one of the governance red flags. So I think that's absolutely right. Of course there are lessons and insights that come out of an incident, but that blame culture can have a really negative impact more broadly.
Peter: I agree. To both of your points, I think culture has a massive bearing on the degree to which stress and trauma become embedded and entrenched, or instead are avoided, prevented and healed. And think about the context of that skills crisis, or at least shortage, as pernicious and insidious as it is in this industry, and the under-resourcing that we hear about constantly. I think (ISC)² did some research last year showing that 74% of cyber professionals believe the current state of cyber security is worse than it's been in the last five years. So you're really in this perfect-storm situation: we've got high levels of burnout, we've got attacker skill sets and geopolitical tensions playing out more than ever before, and you've got more and more of society connected to the internet, more dependent on things running safely and continuously. And yet you're not seeing the support in the organisation that you would see, to your point, around, say, aviation safety, or even nuclear facilities or medicine. Aviation would be a good parallel: you've got mandated rest periods, mandated time off, rules around alcohol consumption and all kinds of things like that. We don't see that; we're still not mature enough as an industry. But I really think the opportunity here is to learn from these other industries, and our protocol has been well established in the military; it's in 85 Veterans Affairs facilities in the US now, working with veterans coming back from combat zones, so you can imagine the parallels there are somewhat aligned. I think the other thing about cyber security, though, is that you cannot see who's attacking you, and so it's at another level again, beyond even these other high-stress occupations.
And we don't necessarily have to argue that we're more stressed than anybody else, although arguably, for the reasons I've touched on, we probably are. It's enough to understand that the consequences of failure of cyber teams are now so far-reaching, not just individually and organisationally, but even from a national security standpoint. As a nonprofit that's pioneering in this space, we'd love to see more investment, more funding, and certainly organisations committing to supporting their teams, because when you think about the cost of losing a cyber professional and attempting to replace them, so much of this is actually preventable. Our vision is really to imagine cyber teams that are not only technically well trained, but trained in a survival sort of mindset, so they can have an effective career, but also a balanced family life and good psychological and physical health. That's really what we should be aiming for collectively, because then we're all better off.
Valeska: So, Peter, and I know you've had quite a bit of success with this, particularly out of the US, but how do you get buy-in from senior management to undertake these programs and invest in these things, especially prior to an incident, given the importance of building that muscle? And even if that hasn't happened beforehand, during or following an incident, what's working?
Peter: Yeah, it is an ongoing challenge. I think the answer lies in a few different aspects. The CISOs who are in charge of teams know full well; they have a deep sense of responsibility, they love their teams and they want to help them, but they're not trained in these areas. So usually the approach comes from the CISO. I'm using the US pronunciation here; apologies to Australians, and the UK, who persist with the other one, but we'll let that go. The leadership within cyber security is acutely aware of the problem, and they are the ones making the investment, but they don't have total budget control. HR tends to be a bit of a hurdle for us, because they feel like they have ownership of the domain, and yet they don't have the specialist expertise that we have in this area. And the traditional EAP solution is underutilised; it's not measured, and it's certainly not targeted. So as a result, we think the existing methodology and framework that is in place, supposedly to support anyone in the organisation, is not fit for purpose when it comes to the cyber security professionals defending the organisation. We believe we can make a pretty powerful argument for an investment that, from an ROI standpoint, more than pays for itself just in the retention of the key people you don't want to lose. Beyond that, look at the executive liability; we've got regulation now, as you know, at the board level, more and more. This is a risk mitigation exercise, not a wellness thing. A big point I'd like to leave your listeners with is that traditionally this has been lumped into the category of wellness. We don't see it that way at all. We see it as strategic risk mitigation, operational support and strategic uplift for the organisation.
Wouldn't you want your core defenders to be operating at peak, from a risk-reduction standpoint? It makes perfect sense. So I think the way to convince organisations is to really show them the alignment between supporting the defenders and the wellbeing of the organisation, as well as the responsibility you have to your stakeholders. You've also got your GRC and regulatory compliance issues. So in a way, and I'll fail to avoid the pun, this is a no-brainer. Investment in this area really should be prioritised, and I think the more we can have these kinds of discussions and bring people in, the better; we're happy to go and do high-level briefings to boards to show them where the alignment is. The C-suite, classically, doesn't have a deep appreciation of cyber security. They may, more and more now, because of the regulation, but I think it's still not well understood, and I definitely think they don't have a deep appreciation of the human impacts that come from a major breach situation. You mentioned three or four groups at the beginning, Valeska, that were impacted, but I'd go wider than that. We've worked with major breach organisations in Australia, and the impacts reported go all the way through to the customer call centres, the help desk. We hear of death threats. We see the drop in the share price. We see fear, threats against board members. So this is a non-trivial exercise. And we've come into this area at a time when we've been through a major pandemic, globally and in Australia, which the Australian Mental Health Think Tank has described as causing a population-wide decline in mental health, and we've got a whole generation coming through in Gen Z now that are carrying anxiety; 60% have a diagnosable anxiety condition.
We are recruiting from them into cyber security, and we've got a threat landscape that has never been more serious and consequential. So I just can't see how we have any alternative but to be doing this work, given the factors we've discussed.
Valeska: So the good news is that there are strategies that can be deployed, to try and end on an optimistic note.
Peter: Leave them with a message of hope. I agree. Sorry.
Valeska: No, no, no, I think it's just such an important issue, and this has been a really fascinating conversation. Thank you, Peter. But before we wrap up, I know Sikeli's dying to know: do you have a favourite cyber book, TV show, movie or podcast series?
Peter: Yeah, I do. I've watched it about three times now: Mr Robot. In fact, we've tried to approach Rami Malek's agent in Hollywood to see if he'd become an ambassador for Cybermindz, because what I love about it is that here we've got the quintessential nerd. He has an anxiety condition; he's got all sorts of delusional stuff going on in there. A highly mission-driven individual, albeit maybe for not quite the right reasons, highly technically brilliant, but carrying this heavy psychological handicap or impediment. And because it's a series, obviously, you get deep characterisation and development, and I think that speaks to a lot of cyber people, because we do see a lot of neurodivergent individuals in cyber security. We didn't get to this, and maybe we should do a follow-up, but I've got to say there are three classes of people in cyber that are more susceptible to burnout than the norm: Gen Z, for the reasons I've mentioned; the neurodivergent, who are amazing at what they do because they are neurodivergent, but who are also more susceptible to burnout; and the women we're bringing into cyber security, who tend to score worse on emotional exhaustion. So I think, as an industry, if we're really serious about inclusion, we've got to be supporting these people. But that's the reason I love that program: it speaks so much to a big part of our narrative around supporting the people, defending the defenders that are defending society.
Valeska: That's a great one. Well, let us know how you go with Rami, and maybe we can get him on to do a follow-up with you.
Peter: We'll keep trying.
Valeska: Yeah, thanks so much, Peter.
Peter: Thanks, guys.
Sikeli: So, some really interesting points and really profound insights there from Peter. I think there's something of real value for our listeners in that. Certainly for me, it was fascinating to hear him speak about all of those issues for the last half hour or so.
Valeska: Yeah, I agree. And I think in this space we focus so much on the operational resilience of an organisation, but actually thinking about the resilience of the individuals who are both defending against and responding to these sorts of incidents is really important and fascinating. Sikeli, I'm curious. I know you're doing a lot in relation to psychosocial safety and the regulatory environment with boards and senior executives. How are you seeing the intersection with cyber?
Sikeli: Yeah, it's a really interesting point, because we're at a moment in time now, from a regulatory perspective, from a cultural perspective, and from an operational and organisational perspective, where psychosocial safety at work in all its different forms, including psychological safety at work, is a concept being spoken about at board level and discussed in the C-suite and at ELTs. It's come out of the shadows a lot more than it has historically, and there's a lot more willingness, I think, to acknowledge that these psychosocial risks in work health and safety are real risks. In just the same way as, if you're a business producing or using dangerous substances, or you're in an industry that relies on a particular sort of machinery, and you detect risks in those physical parts of your business, you would do something about it. And we see right now a genuine interest and a genuine willingness to do more about psychosocial safety. I think the key is really going to be drawing the connections from the abstract general idea of psychosocial safety into specific parts of a business's operations, like their cyber risk preparedness. I'm interested to understand from your perspective, thinking practically: what do you think that starts to look like for businesses in Australia?
Valeska: Yeah, I think it starts with incident response plans and playbooks: making sure there's a rostering system so that people get time on and off and have that scheduling, and making sure that alternates are identified in playbooks, so people aren't both having to sit in a war room for many hours a day and also do their regular day job. That way the backfill is already contemplated, and the backfill already understands what they should be doing in that context. I think it's doing more than just giving people a link to the EAP and some rostered time off; it's actually understanding the much more fundamental impact that these incidents can have. We've also seen in some situations organisations appoint a sort of chief care officer in the context of an incident, so that there is someone with a dedicated role of making sure that people are getting time out, that they're getting some sleep, that they're being fed. And that's not just happening with the core incident response team but also, as the impact ripples outwards, with the people Peter mentioned on the front line, in the contact centres and in other parts of the business, who are also experiencing the effects of cyber incidents and also being checked in on; it's not just that core response team. I think there's also the follow-up debrief: making sure there are thorough post-incident reviews. Often we get really useful, practical insights in the aftermath about the actually quite easy things that could be addressed to make everyone's life a lot easier and things less stressful as well. So there are a number of things that absolutely can be done but are so frequently overlooked.
Sikeli: Excellent. Good to end on a positive, practical note.
Valeska: Thank you.
Thanks for listening to this episode of The Cyber Brief. Check the show notes for resources from this episode, or visit allens.com.au/cyber for our latest thinking. And don't forget to follow, to keep up to date on what's ahead for cyber risk, governance and emerging threats as we interview some of the most respected voices in the industry.