Vancord CyberSound Podcast

Episode 140

Health Data & Privacy Risks You Should Know

Wearables, fitness apps, and wellness platforms collect massive amounts of health data. But how is it actually protected?

In this episode of CyberSound, Jason Pufahl is joined by Alex Cox of Troutman Pepper Locke to break down the realities of consumer health data privacy. They discuss why most health data is not protected by HIPAA, how state privacy laws apply, and what happens when data is shared across apps and platforms.


Episode Transcript

Speaker 1  00:02

This is CyberSound, your simplified and fundamentals-focused source for all things cybersecurity.

 

Jason Pufahl  00:10

Welcome to CyberSound. I’m your host, Jason Pufahl, joined today by Alex Cox, Associate at Troutman Pepper Locke. Hey, Alex.

 

Alex Cox  00:17

Hey. How’s it going?

 

Jason Pufahl  00:18

So normally I’m joined by at least one other co-host, but today I’m doing it solo, I guess, with everybody traveling too much before the holidays. So.

 

Alex Cox  00:32

Yeah, Thanksgiving next week. It’s that time of year.

 

Jason Pufahl  00:30

It is that time of year. Hey. So we actually did a podcast not too long ago, maybe a couple of months ago, where we chatted about health and fitness and how IT professionals can carve out time and be a little bit healthier. I feel like we touched very tangentially on the idea of the data that these apps and devices collect, but we didn’t dive into it, and I left that discussion probably feeling a little unfulfilled, because in a lot of ways that’s the basis of what CyberSound is, right? How do you actually have better privacy? How do you have better security? I’ll say I’m a serial exerciser, but I don’t wear a lot of wearables. I don’t have a Whoop. I’ve got an Apple Watch that I literally wear just for the duration of whatever activity I’m doing, and then I take it off, because I just don’t care that much about the steps and things like that. But frankly, I don’t know if that necessarily strikes the greatest balance. I still have every bit of activity in Strava, and I still use something like MyFitnessPal for a general sense of what I’m eating. So I know there’s a lot of data out there, and I don’t know that I’m dissimilar from a lot of people: I generally rely on these companies to keep the data safe, but I really am interested in what my history looks like relative to health and wellness. I kind of like that stuff. So you and I chatted, and you said, hey, listen, I know about this stuff, let me give you some info. What’s HIPAA? What’s consumer health data? How do they use it? So that’s the discussion today. I’ll stop there and maybe give you a chance to weigh in on where you want to go with this.

 

Alex Cox  02:29

Sure. So, you know, I’m also hooked in. They got me. Although, for me, I’m the worst. I have friends who exercise every weekend, and I go to a buddy’s place on the weekend, and they’ve all got some sort of fitness tracker, and I’m, like, the least fit person in this fitness group, who mostly goes for the fun and the conversation, to hang out with my buddies from law school. But they’re all very serious about it. And something I think is really interesting, and you really saw this around the pandemic, with people reporting their vaccine status to someone else, is that it was the first time I got this question en masse: why is so much data that’s health data not protected by HIPAA? Because in the common vernacular, people think of HIPAA as the health law that protects your health data, but it only applies to healthcare providers, like a doctor, to payers, like insurers, and then to something called healthcare clearinghouses, which you don’t have to worry about. It doesn’t apply to these app companies. Apple isn’t a healthcare provider. This isn’t a medical device, and they work very hard to make sure it is not a medical device, because then they would have to pass a whole set of FDA requirements to get it certified as such. And I should be more specific there: health data is only subject to HIPAA when it’s processed by what we call a covered entity, which is one of these HIPAA-regulated companies, like the health insurer or the healthcare provider. So health data in anyone else’s hands, like, for example, your employer, anything they know about you: not subject to HIPAA. Anything you tell to anybody else: not protected by HIPAA. It’s all protected by this sort of patchwork of state privacy laws that are mostly copycats of the GDPR in Europe. So GDPR passed in 2016. California passed a copycat law a couple years later, in 2018, that was effective in 2020. Connecticut jumped on the bandwagon, we were one of the first states, Connecticut, Virginia, and Colorado, to pass a comprehensive consumer privacy law, which included health data, and that covered the gap, really. So now your Apple Watch health data, your Fitbit data, all that stuff, that’s all protected by whatever state consumer privacy law exists in your state, if there is one, yeah.

 

Jason Pufahl  05:10

So what does that mean, then, to protect it? Because now you’ve got Apple essentially interpreting the laws of, we’ll say, 50 states, although I know that not everybody has a comprehensive law, right? So what are they really doing with it? Are they pulling features out of devices to try to make it easier? Are they actually protecting it?

 

Alex Cox  05:31

Yeah, I mean, they’re probably protecting it pretty well. Those laws have pretty good, reasonable security requirements, and typically they also give private rights of action, meaning someone could sue you if you lost their data because of a security breach or something. And from a health perspective, that data is considered sensitive personal data, so they have to protect it accordingly. What is reasonable for a sensitive category of personal data is different than what is reasonable for, like, my date of birth, right? Date of birth: not super sensitive. Information about a health condition I might have: that would be sensitive to me. So what is reasonable to secure that is different, right? And then there are assessments that they have to do under these laws. So if I’m Apple and I want to take your health data and develop recommendations for you based on that health data, or develop inferences based on it, so I’m taking your health data and inferring from, say, your heart rate whether you may or may not have another health condition, right? That’s an inference, using the data beyond the original collection. They have to do this privacy impact assessment to figure out whether that type of activity is going to create a high risk of harm to individuals, weigh it against the benefits to consumers or society, and document that they balanced the risks. In practice, they basically say, oh, we want to do this, and they write down what they want to do, and they say, we thought really hard, and we think, yeah, great, no problem, and then they stick it in the file drawer. In practice, that’s what happens, right? But that’s the processing assessment side, not really the security side. The security side is surprisingly simple. It’s basically reasonable cybersecurity, reasonable security. They really hand-wave a lot of that away. Yeah.

 

Jason Pufahl  07:26

So there are no stringent regulatory requirements. They’re doing some security best practices, maybe, to protect it, maybe pulling some data out and updating their sharing policies, I’m sure. I have no doubt. Yeah.

 

Alex Cox  07:37

And the interesting thing there is, it’s such a patchwork of what laws apply where, right? So it really depends. I think the way to think about it, in terms of how secure your data is, has a lot less to do with the data itself. How do I frame this? If you’re trying to figure out how secure a given piece of data has to be, how secure is my health data, right? The right question is actually: who has my health data? Not: how secure is it? Because my health data is only as secure as the requirements imposed on the company or the individual that’s holding it. It doesn’t really follow the data; it follows the people. So healthcare providers, insurance companies: very secure. Apple, app companies: more secure; they’re going to be subject to these consumer privacy laws. But smaller companies that fall under the applicability thresholds for some of these laws, they’re kind of Wild West a little bit. There are some requirements, but it’s basically just whatever’s reasonable, whatever would pass the sniff test, and there’s not a lot of specifics that they have to meet.

 

Jason Pufahl  09:01

Well, your statement a minute ago about who has my data, that’s not straightforward, either. You can be an Apple user, and then, I’ll just use Strava because that’s what I use, right? The minute you want Strava to interact with Apple, you get all these prompts around, I’m going to share data with Health, and you accept all of these things. So now my Strava data is probably getting ingested by Apple, and may in turn ultimately go somewhere else. If I think there’s another app I want to integrate, it’s really complicated to know where your data is. And those are just the choices that I’m making, not even the choices that maybe Strava or Apple are making on my behalf to share data somewhere else.

 

Alex Cox  09:43

Yeah, and it’s hard to even understand that that’s happening, right? Because each time you do that, there’s a really long document, the kind I write for clients, that I don’t read, that nobody reads, right? I know that I write these for regulators, competitors, business partners, but I know that no consumer is really reading the policy that I write, which is kind of sad. There was some article, I forget who did it, maybe a year or two ago, about how, if we actually read all of the clickwrap terms that we click through on all these apps, we’d spend something like two months a year of our life sitting there reading these policies, which is just absurd, right? So I appreciate that Apple does something with their App Store where a little summary says, this is what it collects, this is how granular it is. I think that’s constructive, right? Helping people understand a little better. And the states have tried to do that, tried to impose requirements on companies: keep it easy, make sure it’s understandable. But it is hard to do that, and people don’t want to read anything. It’s hard to get people to read these things.

 

Jason Pufahl  10:59

Yeah, I don’t even know how long most of them are, because you’re always presented with them either on my phone, where it scrolls for a thousand miles, or maybe it’s a doc and you kind of have a sense. But they’re long. That’s the one takeaway I have every time.

 

Alex Cox  11:14

Yeah, and they’re long because often they’re meeting a whole slew of requirements, right? You’re trying to check 14 different boxes with this huge document. I like to compartmentalize things by regulatory universe, and I think that keeps it a little lighter weight. That way, instead of reading one document that’s 26 pages and handles global issues, and I don’t want to read a 26-page document, there’s an eight-page document for the US and a four-page document for California. That way, if I’m a California resident or a Connecticut resident, I just go to the multi-state notice that meets the Connecticut requirements, and I can see what’s happening there. But everyone has their preferences, and there are arguments pro and con for all this stuff. It’s hard for people, I think, to understand how it’s being used, how it’s being shared, what protections exist.

 

Jason Pufahl  12:10

So let’s segue a little bit into the US. I feel like, in the US, we don’t have the strongest privacy laws. Certainly, to your point, we modeled a bunch of them off the GDPR, and we don’t have a federal law.

 

Alex Cox  12:30

So I want to interrupt you there really quick, because I actually think, and this is going to sound a little crazy, and I get in this argument: my whole dad’s side of the family lives in the UK, and every time I go back and visit my cousins, I get a ton of shit as the stereotypical American, the American who’s over there getting made fun of for being American. And one thing they love to make fun of is, oh, you have no privacy, and, you know, whatever. And I happen to be a privacy lawyer, right? So I get the opportunity to explain to them: you guys are kind of wrong about that. It’s kind of a misnomer. It’s true in a couple of senses, in the sense that the GDPR spans the whole European Union, so there’s no place where it doesn’t exist, right? That’s easy for a person to see: oh, big law, covers the whole scope. But in a lot of ways, the US privacy laws, even though their scope isn’t as broad, are actually more protective in some ways than the GDPR is. The GDPR has extremely strict penalties; they can fine you 4% of your global turnover if they’re not happy with you, so big companies take it really seriously, right? But in terms of what you can and can’t do with the information, the GDPR is very procedural: as long as you do the things they want you to do, you can do more or less whatever you want. Whereas in the US, it’s much more, no, we don’t want you doing this, or you’d better give people really direct opt-outs for this stuff. It’s much more particular, and in some ways you have more protection. But in order to figure out whether you have this protection, well, who has this protection? People in this state, and it only applies to these companies. So it’s overcomplicated in the US, but I want to try to defuse that misnomer: there is a lot of privacy protection in the US. It’s just complicated.

 

Jason Pufahl  14:20

So how many states have a privacy law at this point?

 

Alex Cox  14:24

Yeah, I think it’s 19, and then we can argue whether it’s 22, because there are privacy laws that I don’t consider to be comprehensive consumer privacy laws. For example, Florida has a comprehensive consumer privacy law, but it only applies to, like, companies with a billion dollars in revenue who make a voice assistant. So do we count that? Probably not, really, right? Or Nevada: they have a law, but it’s sort of narrow in scope and only applies in a couple of narrow circumstances. Washington has a law, but it only applies to health data. So where do we draw that line? What types of laws count? 19 states have a comprehensive consumer privacy law, like a GDPR copycat.

 

Jason Pufahl  15:16

So we could say approximately half, even if we count some of those that you say are debatable, yeah.

 

Alex Cox  15:21

Population-wise, it’s the vast majority. Texas and California have the two most broadly applicable versions of this law. Actually, funnily enough, Texas has the lowest barrier to applicability, which people often don’t think of. It’s kind of counterintuitive; everyone thinks of Texas as not putting up these sorts of things for businesses to deal with, but it applies everywhere. California also has really broad applicability. And actually, Connecticut is going to join that club a little bit. And this is part of the issue: figuring out when one of these laws applies is really complicated. How many people’s data do you have? All these questions that a lot of the time the business people don’t even know about their own company. So it can be complicated.

 

Jason Pufahl  16:20

So my kind of burning question always is: what can we anticipate the data potentially being used for? I always go to this idea that, if I’m an insurer, health data, Apple Watch, Strava, whatever, feels like a gold mine of information relative to: do I have somebody who exercises? Do I have somebody who eats well? I don’t think there are any direct relationships like that today, but I feel like I’d be naive to think that there won’t be at some point, because the data is great for that.

 

Alex Cox  16:56

Well, there definitely is already. This already happens. This definitely already happens. I can’t say who and why and what, but I’ve seen it.

 

Jason Pufahl  17:05

Let me ask the question. Does it happen for an insurer, similar to the safe-driving programs where they say, throw this app in your car, we can monitor your speeds and your accelerations, and we can decide whether you’re a good driver? I totally appreciate it if an insurer says, hey, if you wear this, you could drive your premium down by demonstrating the exercise. Are there other examples where simply my choice of using an app might feed that ecosystem?

 

Alex Cox  17:32

Yeah, I mean, how do you know, right? So it depends on the state you’re in, right? But let’s say you’re in a state that doesn’t protect health data, doesn’t limit the sale of health data. Let’s just assume that, right? Because Connecticut does. In Connecticut, we’re sort of unique in that there’s a blanket prohibition on the sale of health data. So it makes it easy for people here: it’s not getting sold. It might get disclosed for some other purpose, or something like that, but it’s not getting sold. But take the really basic circumstance: a resident of some state that doesn’t have any prohibition on the sale of health data, and I’ve got a Fitbit or something like that. There’s nothing, strictly speaking, preventing that company from just continuously selling the metrics they collect about people to insurance companies, to use for whatever purpose they want. They could just have open data feeds. Now, they would have to disclose that to you. They’d be required to tell you that

 

Jason Pufahl  18:32

they’re doing that, in the privacy statement that you didn’t read. Is that where?

 

Alex Cox  18:39

Exactly. But they could say it in a way that might not sound like they’re selling it. They might say, we share information with our partners to improve their products, or something like that, something vague. And you hear that and you’re thinking, I don’t quite know what that means. And the translation is: we sell your stuff to people who use it for reasons, you know.

 

Jason Pufahl  19:01

So it’s definitely happening already. So you wear a watch. I mean, you’re wearing one now, and certainly I’ve got plenty of my data in these things. Maybe you and I can chat about AI and some of the things coming out of that space at some point, so we’ll table that for now; I don’t want to muddy this too much. But you wear one. Do you wear one because you feel like, hey, the risk just isn’t that great? Or do you wear one partly because you say, well, it’s just the world that I live in, and I can be a Luddite and do nothing, or at least embrace some of the technology? So I’m curious where you generally land with that.

 

Alex Cox  19:40

I mean, I can’t not be a little bit paranoid about this stuff, just because I see it all the time. I like the watch. The reason I use it is that I hate my phone and I want to avoid the phone as much as possible. The watch lets me leave the phone at my desk in my office and walk around my house, and it has Wi-Fi, so if I’m getting an important call, or someone needs to get a hold of me, or someone needs to ping me for some reason, I’m not completely disconnected. But I don’t have a phone in my pocket that’s tempting to be used for some reason, something that’s going to take me out of the moment. So that’s why I use the watch. It’s convenient for notifications and things like that, and it does do the health tracking. I’m not super fastidious about that. My wife and I compete on steps, like, oh, I got this many today, how many did you get? It’s a great, simple metric, but it’s not something I’m really thinking about a ton. But it is concerning. I don’t use any third-party health tracking apps. I only use Apple’s, because my thought process there is, Apple’s incentive is to sell me the widget, so I’m the customer here, right? If you buy a widget from someone and it’s supported, in some sense, by the monetization of the data you provide, just keep that in the back of your mind. With any product that you’re buying, you kind of want to be the customer, not the product.

 

Jason Pufahl  21:21

Yeah, that’s fair. And honestly, that’s part of my thinking with something like Strava. It’s a reasonably expensive app, as far as apps go, right? And pretty reputable. So there’s a lot less worry that it’s all about, you know, collecting and selling that data, right?

 

Alex Cox  21:40

I mean, then you have this random foreign company that you’ve never heard of, where you can get the product on AliExpress for, like, $12, tracking stuff. And it’s like, they’re probably monetizing that data really quickly. That’s probably their goal, right? To collect as much data as possible. That’s why they’re selling you the product at some dramatically low price, right?

 

Jason Pufahl  22:07

You’re the consumer. You’re the product. Ultimately, your data is the product.

 

Alex Cox  22:10

Exactly. Well, it’s the old concern around Gmail in the first place. I still have a Gmail account; I was a beta Gmail user back when I was a kid in high school. I clicked the beta, got into the beta in the early days. But in the back of my mind, I was always thinking, hmm, they’re just data mining my email to target ads. That’s why this is free, right? And it’s a little nerve-wracking, yeah.

 

Jason Pufahl  22:39

I mean, you really find that you either subscribe to the idea that there’s value in these apps, or you have to kind of run the other way. But there are so many ways you’re giving...

 

Alex Cox  22:49

It’s about balance, though. It’s impossible to avoid entirely, but I do think it’s possible to take a balanced approach. Like, for example, home security systems, right? They can collect all sorts of data about you. You’ve got cameras. A baby monitor: they have those, like, breathing trackers for your kid, so you can see how your kid’s breathing, right? And you wonder, okay, they’re tracking my newborn, they’re figuring out whether my newborn’s breathing rate is reasonable. And I’m reading the policy, because of course I’m going to read it, right? I’m professionally curious. And they say the right things, and I’m comforted by what they’re saying. And then I just wonder, are they actually doing that, you know, and all that kind of stuff. Because, how often...

 

Jason Pufahl  23:44

You log in, you say, okay, I’m going to do whatever I do with my app today, and you get that quick pop-up that says, hey, our privacy statement has changed, click here to look at it. You say, man, all I want to do right now is go for my run, right? So I’m going to hit okay, and there you go. You never look at that again. You never think twice about it. So what you agree with today may not be what you agree with a month from now or a year from now.

 

Alex Cox  24:08

It’s a great point. And there’s always the case where you buy a product from a company you really like, and they’re doing all the right things with your data, and they get bought, and then suddenly their new parent has a new approach, and there’s not much you can do about it. If they buy the entity, and they disclosed, and everyone does, that they can transfer the data as part of a merger or acquisition, I mean, that’s permitted by pretty much all the laws. So, kind of shit.

 

Jason Pufahl  24:40

Because, you know, I always say we do these for 15 minutes and then we invariably go longer, I’m going to try to wrap up with this question. In the model you just described, around a company that got acquired and changed policy, that’s one way. But just in general, there are two things I think about with GDPR. One is that sort of right-to-be-forgotten concept. And the other is, you go to Europe and you spend your whole time clicking cookie notifications. So there are the two things. But that idea of your ability to theoretically request that your data be removed is pretty clear there. Do we have that same sort of straightforward capability in the US? Because that’s a challenge, right? If you all of a sudden decide, I don’t want to use this anymore, and I want my data either expunged or given back, that feels really complicated.

 

Alex Cox  25:36

You have that here; it’s just that patchwork. For example, and just to show how annoying this patchwork can get: let’s say I’m a resident of Connecticut, to make it easy, right? I have a right under the Connecticut data privacy law. I submit a request to the company through whatever web form is on their website: please delete all my information, thank you very much. They say, yep, we’ve deleted all your information, thank you very much. But maybe they’re a company that’s regulated by, or they have an affiliate that’s a licensee under, some insurance statute or something. They’re exempt from the privacy law, so they’re not going to delete your data. So you have to submit a request under the insurance privacy right to the affiliate. And so there are all these annoying follow-ups that you have to work through, which is why I think you see a lot of these third-party services that get marketed online. And those can be good. Candidly, from the other side, watching companies try to comply with requests from those services, sometimes I’m like, oh, this is reasonable, yeah, you should comply with this. And other times you’re like, wow, they’ve just sent a list of names, and none of these people are our customers, so fundamentally they just disclosed a bunch of personal data to a bunch of companies that didn’t really have any association with that person. And you’re just like, is that even really protecting these people’s privacy? So I think you’ve got to pay attention to the specific company that’s offering that service, but that’s probably the only sane way to approach that requirement of, I want to get rid of my data: find some third party that seems reputable and try to sic them on getting all your data deleted. Because as an individual, it’s really difficult.

 

Jason Pufahl  27:33

Yeah. Honestly, when I asked the question, I wasn’t even thinking about the affiliate relationships, but of course that makes it more complicated, inevitably. Any closing thoughts that you have, anything you feel like, hey, I really wanted to say this but didn’t get to?

 

Alex Cox  27:50

No, I mean, health privacy is really complicated, unfortunately. But people generally do have a lot of rights with respect to their health data, whether it’s at a provider or at a company. So remember that, and if someone’s doing something shady, or you don’t feel right about it, just don’t use it, you know. And just to flag your earlier comment about the use of AI: it really adds another layer. I think what scares me the most about it is how interoperable it makes all this data, right? All this data used to be behind SQL databases and nerds with spreadsheets, so it was harder to get at. And when it’s behind an LLM, it’s not only easier to get at, it’s easier to get at wrong.

 

Jason Pufahl  28:48

So, somebody, and this is not my phrase, but I loved it, somebody said AI is essentially the democratization of IT, right? Because it takes what was really complicated at one point, really for people who had a programming background or whatever, and makes it accessible in many ways, via just normal language, to somebody else. So again, whole different topic, and I think it’s one that, if you’re open to it, I’d be happy to have you come back on for, because I sure enjoyed this conversation. And you said a lot of your time these days is AI governance programs, so I bet it is riveting. All right, let’s end here. As always, if anybody has any questions, reach out to us, and we can get Alex back on. Alex, I would love to have you back on to chat AI, because it is, yeah, it’s a concern. So, cool. I appreciate it. Thanks for the half hour.

 

Speaker 1  29:39

We’d love to hear your feedback. Feel free to get in touch at Vancord on LinkedIn. And remember: stay vigilant, stay resilient. This has been CyberSound.
