Jason Pufahl 01:16
Oh, yeah, for sure. Yeah, these are legitimate, no doubt. And I think though, today, we wanted to spend a little bit of time focusing our discussion on sort of insider threat. I think, for a variety of reasons, right? One, certainly, you know, it’s a concern, when we do some incident response, it tends to be something that comes up for sure. Certainly, you know, each report spent some time alluding to it, but I also think there’s, you know, there’s regulatory compliance reasons for businesses to think about, you know, what is insider threat and how do they protect against it? So, you know, Matt, I think you probably have some definitions or thoughts on what insider threat is.
Matt Fusaro 01:51
Yeah, I think it’s a good place to start, actually talking about what a lot of these places mean when they make these reports, or when you’re looking at some type of infographic, what an insider threat actually is and, I guess, how they classify them, right? Most people think it’s some disgruntled employee, or someone that’s looking to get some data out of the organization for whatever reason, right? Most of the time, they do include those numbers, but they also include things like a negligent employee who left a password somewhere, or didn’t secure a database properly; they usually group that into insider threat as well, right? So keep in mind that those are part of all those numbers. But yeah, we’re focusing on the insider threat mostly because we saw some interesting data there for this year, a lot of costs went up. And I think a lot of organizations kind of focused on the hacker, if you will, someone attacking or using phishing or something like that to get a password, right, so that they can take that data or take that access and move laterally, get data that that person had access to, whatever it might be, right? I think it’s interesting that this year, we’re looking at 56% of the incidents that were part of these studies being because of negligence, because either a contractor or an employee did something that they shouldn’t.
Jason Pufahl 03:19
So, interesting, so greater than half was a result of negligence. And yet, the majority or greater than half of the people polled were concerned about the attacker. So in a way, I think a strong argument can be made that you might be concerned about, to some degree, the wrong thing.
Steven Maresca 03:38
And I actually have a theory on that and it has to do with some of these other reports that make the popular press. Some things that we’ve cited ourselves, for example, the general metric of an actual incident, like ransomware, that’s been reported over the last five years or so, is that 80 to 90% of those incidents have to do with credential theft. That’s accurate, in that demographic. This is broader than that. And I think it’s a cause of people conflating the two.
Jason Pufahl 04:11
So here’s a question though. The credential theft, the individual who was successfully phished, do they fall into the category of negligent employee, or into the category of a successful attack by a cyber actor?
Matt Fusaro 04:27
Yeah, for the reports that you see, that’s probably not going to be considered negligence, that’ll be credential theft, right? You can definitely make the argument for negligence there. That’s why a lot of security awareness training is targeted towards people or groups in departments that have a history of falling for phishing campaigns.
Steven Maresca 04:49
Negligence, you know, it sounds really negative. It is, right? There’s a reason for the term to have the connotation it does, but it’s inadvertent disclosure in a lot of cases, somebody emailed something they shouldn’t have to a party who shouldn’t have received it. That’s negligence. And it happens all the time. And that’s lumped up into the reportable incidents.
Matt Fusaro 05:12
Yeah, sometimes it’s just, we forgot a configuration item, and guess what, guys, this system is super complex. I’m sorry, I didn’t hit that checkbox. But that could have been, you know, a million dollar incident.
Jason Pufahl 05:24
Right, so there’s a nice segue, right, which is, what are the costs of these things? And they go up every year, for sure. I think one takeaway, the bigger the company, the more expensive it is to protect yourself against insider threats. That was a real revelation.
Matt Fusaro 05:40
Yeah, to also go along with that, some of the things that you probably would have expected: the services and financial industries, they’re the ones that get hit the hardest from a dollars per incident standpoint, mostly because they have so many compliance structures that they have to fall under, make sure that people are working within them and disclosing things properly. And they actually clean up everything; a lot of checks and balances have to happen. But yeah, the larger you are, the harder it is to contain these things, the harder it is to do the forensics on this. They may have more data available, but that just means there’s more things to do with the data.
Steven Maresca 06:15
Complexity comes at a cost. And you know, organizations that are small, that have modest revenue, but still resemble big organizations, many offices geographically spread apart, those are the expensive ones from a complexity standpoint to recover from an incident. And I think that is something to keep in mind. You know, broad numbers in these reports may be in excess of the gross operating revenue of many of those smaller orgs. But it’s still a complexity concern.
Matt Fusaro 06:46
Yeah, one of the ones that we saw, you know, anyone under 500 employees, they’re looking at about $8 million over the year to resolve insider threats. I think that’s really high. Maybe they had some outliers in the data on that one. But I guess don’t shy away from that number, either, right? I’d say maybe $5 million is a good number.
Jason Pufahl 07:06
I was gonna say even if it’s half that, it’s still plenty.
Matt Fusaro 07:08
Yeah, it just costs a lot of money to address these things. And that includes things like you brought up, Steve, the credit monitoring that would have to happen afterwards. All the technologies that even your board is probably going to say, hey, I want you to implement these things now. So it’s products, licensing, it’s people, it’s services that have to happen afterwards.
Steven Maresca 07:30
And it’s insurance premiums, because you know, that $8 million goes directly to what you’re paying a carrier in terms of coverage. I think that for a 500 headcount, $8 million for resolution of insider threat isn’t unreasonable from an insurance modeling perspective. We have a lot of customers of, you know, 1,200 headcount that are more around 4 to 5 million in terms of policy coverage. That might be appropriate, it depends on the org, right, but that’s the general ballpark we’re talking about.
Jason Pufahl 08:05
And, you know, I think the complexity of the solutions you put in place is greater for larger companies, right; your security budget is going to be greater. Hopefully, in many cases, I think the security program maturity is better. And you’re spending more money on technologies; user behavior analytics is not something that we frequently recommend to 1,500 person companies, but I think it’s perfectly appropriate for your large insurance carriers, etc. So you’re gonna spend more money there.
Matt Fusaro 08:40
Yeah, absolutely. And yeah, like you said, the smaller you are, the harder it is to implement something like user monitoring, the cost of those solutions is just high.
Steven Maresca 08:51
Or worst case, you know, they’re not staffed at all to make use of those technologies; there’s an outlay to put it in place, and then it sits unused.
Jason Pufahl 08:59
I think that goes into the cost, right. So it’s not just licensing, it is staffing and you know, getting the headcount you need to actually make use of the tools you purchased.
Matt Fusaro 09:07
Yeah, I think for smaller orgs that they probably should focus on, I guess more to the point of the problem, email is a huge issue. And even in the respondents of a lot of these reports, they constantly say email was the reason why this happened, right? Whether it’s people storing or basically using as a file share, right? Sending documents back and forth to either entities or other people that they should not have in the first place, sending passwords, all those types of things and including phishing, happen through email, that’s where most of the problems happen. So if you focus on that, and you’re a smaller org, you’ll probably help yourself out quite a bit.
Steven Maresca 09:46
Closely coupled with that was simply accessing data outside of a job role, which isn’t, you know, entirely email related, but certainly email adjacent, right? You have people who are granted access, as a member of a department, to sensitive info. They didn’t need it, but, hey, their account got compromised. It’s accessible to the attacker, therefore it’s in scope.
Matt Fusaro 10:07
Yeah. Or, you know, Tom is in accounting today, but he’s been moved, right, and nothing was ever taken away from him, he just keeps getting more permissions. And a lot of times, that’s because HR and IT aren’t really well aligned. Sometimes IT doesn’t even know these people have changed job roles, right. That’s why this happens.
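The HR/IT misalignment Matt describes is often tackled with periodic access reviews that compare what HR says someone’s role is against what IT has actually granted. A minimal sketch of that idea, where all role names, group names, and data are hypothetical, might look like:

```python
# Hypothetical access-review sketch: flag accounts whose granted groups
# exceed what their current HR job role calls for. Role and group names
# are invented for illustration.

ROLE_GROUPS = {
    "accounting": {"finance-ro", "erp-users"},
    "sales": {"crm-users"},
}

def permission_drift(hr_roster, granted):
    """Return {user: extra_groups} for users holding groups beyond their role."""
    drift = {}
    for user, role in hr_roster.items():
        allowed = ROLE_GROUPS.get(role, set())
        extra = granted.get(user, set()) - allowed
        if extra:
            drift[user] = extra
    return drift

# Tom moved from accounting to sales, but his old finance groups were
# never taken away -- exactly the scenario discussed above.
hr = {"tom": "sales", "ana": "accounting"}
groups = {"tom": {"crm-users", "finance-ro", "erp-users"}, "ana": {"finance-ro"}}
print(permission_drift(hr, groups))  # flags tom's leftover accounting groups
```

In practice the roster and group data would come from an HR system and a directory such as Active Directory, but the core check is just this set difference, run on a schedule or on every job-role change.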
Steven Maresca 10:27
So I want to shift, if it’s alright, to incident timelines. I think that one of the more interesting statistics broadly that some of these reports indicated was that a very small number, under 15% ish, of incidents were contained within 30 days.
Matt Fusaro 10:46
Yeah, that was surprising.
Steven Maresca 10:47
This is really critical. We talk a lot about the long tail of incident resolution, and this is a key reflection of that fact. It might be, you know, the first week is simply getting the data to make sense of the environment and get your arms around a threat to contain it. That’s especially true when an org doesn’t have good information, good tooling. Then the next week is, you know, getting business operations back up and functioning. And the tail end is cleanup and restoration and dealing with legal, insurance, and so on. That’s a long time.
Matt Fusaro 11:24
Yeah, once you start getting past the, like two and a half week range to the 30 days, you’re talking about spending a lot of money. And like you said, Steve, because of things like employee productivity going down, paying for consultants or experts to come in and help with those things. It can get costly real quick after 30 days.
Jason Pufahl 11:46
I don’t know that we’re going to see the timeline to resolution get that much shorter; maybe the timeline to detection? And the reason I say that is, the insurance carriers now are putting in certain requirements that businesses have to implement prior to getting policies, right. EDR, some other technologies, I think they’re designed to make them a little bit more secure. But from a responder’s standpoint, they probably give a little bit better data during that time of containment. So we might see a little bit of that containment time shortened. You’re still going to have your notification period, you’re still going to have your legal requirements at the tail end. And frankly, you’re still gonna roll into clients that don’t have great architecture documents or engineering plans to tell you what services exist, how they were built, how to restore them. So I don’t know if we’re gonna see that change a lot.
Steven Maresca 12:37
I think the data even reflects that to some degree. The increase of rapid response type technologies may be reflected in the fact that fiscal 22 saw containment costs drop relative to fiscal 20. That could be a pandemic oddity, as well, but I do have a theory that it’s related to insurance to some degree, simply because there’s been a sea change in terms of what EDR technologies are in place, and the relative ease with which systems can be isolated.
Jason Pufahl 13:11
Yeah, I’m optimistic though, that the discovery time, or maybe even the dwell time, gets shorter as businesses now implement some of these more, sort of, tried and true technologies, right. And, you know, to some degree, I think insurance took a beating because they didn’t know the risk that they were taking on. I think they understand it now, and they’re putting in requirements that, frankly, are going to improve the security for a lot of these smaller businesses.
Matt Fusaro 13:37
Yeah, a lot of the businesses that end up in these reports, they’re large, so it’s always good to keep that in mind. They’ve probably already implemented a lot of the things that insurance is now telling the smaller places you need to have for your cyber liability. So I think it’ll be interesting to watch these numbers to see if they go up or down, right, because if they continue to go up, that’s bad for the industry. I guess time will tell with that one, to see if the technology is actually helping or if it’s just shifting the dials around a little bit.
Jason Pufahl 14:12
Which is possible, right? Because you’re right, this is representative of generally larger companies that probably have a lot of these technologies in place already. So maybe we just start to see a little bit of an improvement at that sort of smaller business side of things, just in general, but not a big dial change here.
Matt Fusaro 14:26
I think a lot of this goes back to the skills gap. That gap is getting bigger, and even in large orgs you’re not going to be able to hire an entire team of experts, right; you’re gonna have a mix. I think that mix is becoming worse, if you will.
Steven Maresca 14:44
You know, I think this goes back to the relative emphasis on credential theft compared to negligence actually. The skills gap has a psychological effect where organizations are likely to look outwards for the source of a threat, and I’m not personally of the opinion that that’s the best place to look, and I think you’ve expressed something similar.
Matt Fusaro 15:12
Yeah, and while that stuff usually causes a more expensive incident, credential theft or something like that, the prevalence of negligence, I think, highly outweighs it. I think credential theft is an order of magnitude more expensive than an insider incident that comes from negligence. But if you have twice or three times as many of those negligence incidents, that’s the real problem here, right? So much of the technology is driven to stop that stuff anyway, stop the credential theft, and so much of the guidance and security hardening focuses on it. So if you do your regular cyber hygiene, I feel like you’re covering a lot of that already.
Jason Pufahl 16:00
And I would say honestly, if you take the credential theft part out, because I think that’s detectable with a lot of technology that we have in place, right, and you should be able to then sort of react and make security improvements relative to that, it’s a lot more difficult, I think, to identify and then address the legitimate insider threat: the individual who’s been compromised, the individual who’s siphoning data out in small increments, in really undetectable ways, maybe a USB key that comes out at, you know, a hundred records at a time. There’s a lot of ways that people can cause problems. People want to trust the people who work for them, so you’ve always got that general human nature bias in place. And it’s really hard to identify the real insider threat.
Matt Fusaro 16:47
Yeah, I think this once again, proves that technology helps security. But it’s ultimately a people problem.
Steven Maresca 16:55
We’ve had, you know, tabletop exercises and, you know, games around scenarios with a true criminal insider threat. They can usually only be inferred, because they’re operating within the bounds of actual user privileges. So I think you’re right in the sense that they’re not easy to discover. You have to find them after carving everything else out of the way.
Matt Fusaro 17:21
Yeah, I think we’re at the point now where we need some data to back up, I guess, the practices and theories that we’ve been working from for years now. In a lot of areas that has proven itself to be useful, but for things like this, I think we need to see some better numbers.
Jason Pufahl 17:21
And it is an institutional problem. Like those tabletops that you’re referring to, we had individuals from HR in place, we had individuals from legal; it was not an IT issue to identify the insider threat. It was a company wide problem, kind of all hands on deck in some ways. So, again, they’re interesting reports, for sure. And I think it behooves everybody from the business owner to the security practitioner to at least generally understand what risks are out there and what some of the potential costs might be. I think, frankly, understanding reports like this probably helps business owners even realign some of the spend in their organizations, on training, on technical controls, etc., because there’s a lot of great data here. And I think your point, Matt, is really well made, which is we hope we see some of these numbers change a bit, because I think our industry has gotten more mature. It would be nice to see some of the numbers drop in some ways.
Steven Maresca 18:42
I think I’ll be satisfied if the costs drop overall, even if the incident per organization count goes up. Because the act of looking will yield more incidents. I think it’s appropriate to be more, I don’t know, generous with what might be determined to be an incident because it just sheds light on things that need direction.
Jason Pufahl 19:08
Yeah. Well, you have a responsibility. I don’t know how many times I’ve said in the past that, in spite of the fact that an organization I might have worked for had more incidents, my argument always was that it’s because we do a better job now of detecting, right. And I don’t know how many companies I’ve spoken with that said, ah, we’ve never had an incident. And you simply know that’s not true; you’re just not looking. So I do think it is fine to find them; I think you actually have a responsibility to identify them and disclose them and sort of handle them in a responsible way. And in large part, that’s what these reports are identifying, right: the appropriate disclosure and appropriate management of incidents, insider threats, etc.
Matt Fusaro 19:50
Yeah, I think if you’re listening and you’re on a board or in some type of management position, instead of saying, hey, what are we doing for security, maybe look at these reports, maybe take what we’ve talked about here and use that as a way to construct that question a little better: how are we handling specific types of threats? This is what I’m seeing. Yeah, let them talk about that a little bit.
Steven Maresca 20:15
Look for your specific industry, find the trends in that industry, and then map them to your own organizational priorities.
Matt Fusaro 20:22
Yeah, they’ll be different for everyone.
Jason Pufahl 20:25
I think, Matt, that’s a good closer. So on that note, if anybody wants to talk about any of these reports in, you know, sort of more detail with us, feel free to reach out to us on Twitter and LinkedIn; we’re happy to continue the conversation. If anybody has any concerns about how to deal with some of these issues, we certainly have background in social engineering and insider threats and a variety of other things, so let us know, we’re happy to have a conversation and help in any way that we can. As always, we hope people got value out of this. Thanks guys for participating today, it was an interesting talk.
Stay vigilant, stay resilient. This has been CyberSound.