Vancord CyberSound Podcast
Episode 78

Privacy Concerns with TikTok

Numerous states have placed a formal ban on the use of the popular social media platform TikTok, leaving users questioning the platform’s safety when it comes to data and privacy.

Today, Jason, Steve, and Matt discuss the restrictions, current threat model, and perceived threats in hopes of easing some concerns. The team encourages being proactive and protecting your data before worrying about platforms accessing it.


Episode Transcript

00:01
This is CyberSound, your simplified and fundamentals-focused source for all things cybersecurity, with your hosts, Jason Pufahl, Steven Maresca and Matt Fusaro.

Jason Pufahl 00:14
Welcome to CyberSound. I’m your host, Jason Pufahl, joined, as always, by Steve Maresca and Matt Fusaro.

Matt Fusaro 00:19
Hello.

Steven Maresca 00:19
Hi.

Jason Pufahl 00:21
So, today we get to tackle the topic of the perceived privacy and security risks facing TikTok. I think we’ve all seen the myriad states that have now placed a formal ban on TikTok, primarily for government institutions. Are the blocks justified? Are the threats they’re citing legitimate? Maybe that’s a strong way to put it, because I’m sure there’s some truth to them, certainly. But why go down this path? There are a lot of social media and data-sharing networks out there like this, so what makes TikTok so unique, and why are people so concerned about it?

Steven Maresca 01:08
So, it’s not just an American phenomenon, it’s certainly occurring internationally; Belgium is one example of a country that’s enacted a similar prohibition. There are lots of states that have banned it, you know, Alabama, Florida, Georgia, Idaho, Maryland, Nebraska, and North Dakota, several others at this point. The important thing to know is that a lot of the actions being taken at the federal level are informed by data that we don’t have as people in the public sphere. So does that imply a requirement to act in your personal or corporate area of influence? Maybe, maybe not. You have to think about the threat model in particular here, and some of the actions that have been concerning.

Jason Pufahl 01:57
Before you get into the threat model, let’s talk specifically about what the bans are, right?

Steven Maresca 02:02
That’s easy. It’s a ban on the use of TikTok on devices owned by the individual or by the organization, when they’re under the organization’s influence. So, you know, state-owned devices, corporate devices.

Jason Pufahl 02:02
But not personally owned devices?

Steven Maresca 02:20
Well, there’s a blend there, because,

Matt Fusaro 02:22
Yeah, they’ll block them at their firewall.

Steven Maresca 02:23
Exactly.

Jason Pufahl 02:24
Right. So, you can’t bring them in and use the wired network or the wireless network, right. And they’re saying you can’t have a TikTok account that’s tied directly to an institution.

Steven Maresca 02:32
In essence, but I do think we need to talk about the threat model first.

Jason Pufahl 02:36
Sure.

Steven Maresca 02:36
Because here’s the thing, the concern is that the use of any device, personally owned or otherwise, in sensitive scenarios is valuable to the owners of TikTok and, by extension, the Chinese government. There’s a national defense consideration that’s usually the underpinning of these conversations. It makes sense if you’re talking about, you know, someone in the armed forces with their TikTok app open when they’re on the job, in a sensitive location, giving away some sort of behavior, giving away the density of other peers that might be in that same location using the app at the same time. That’s meaningful from a defense perspective. Outside of that sphere, does it carry the same threat? You know, if you’re a college student using TikTok to share videos and look up, you know, the behavior of celebrities? Probably not.

Matt Fusaro 03:33
Yeah, probably not. I think what they’re also worried about is how the algorithms are used, right? You know, what content’s being driven towards certain people, for whatever reason that might be, right.

Steven Maresca 03:46
Proliferation of misinformation, malicious advertisements, that sort of thing.

Matt Fusaro 03:50
Yep. Or just driving meaningless content to people for the betterment of some other group, right? There could be a hundred reasons why they do it.

Steven Maresca 04:01
Now, here’s the thing. There have been many allegations that employees of ByteDance, the parent organization of TikTok, have accessed the data of US-based TikTok users. And that’s the fear in every other country that has also banned TikTok. But, you know, the vast majority of TikTok users are willingly sharing their location, their image, their videos, and things of that nature. For those individuals, especially if they’re operating in a place that doesn’t have sensitivities attached beyond privacy, there aren’t too many considerations at play aside from the sanctity of those individual devices.

Jason Pufahl 04:38
Yeah, I mean, the thing we’ve seen cited, maybe more in opinion pieces, is, hey, they’re able to get a broader sense of the behaviors, let’s talk about this from the American perspective, right, the behaviors of Americans, and they can adjust their own policy to run counter to the way our way of life works. I think there’s a group of people who feel like there’s a legitimate threat to that American way of life. And to me, that feels a little bit far-fetched. Maybe there’s a grain of, you know, some capability there, but I’m not sure it would manifest in reality.

Matt Fusaro 05:16
Yeah, I’d say it’s maybe a data point for those organizations. But, I mean, you know, we talked about this before we started recording, how it’s interesting that it’s just TikTok that they’re concerned about, and we just had a huge information disclosure based on Discord.

Jason Pufahl 05:36
Yeah, certainly not the only company.

Matt Fusaro 05:37
Yeah, it’s not the only company. And I think that’s why a lot of people don’t like what’s going on with a country-wide ban. Why them, if we have all of these other organizations doing essentially the same exact thing, right? It’s a little hazy.

Steven Maresca 05:56
Right, I mean, Facebook, Meta, is a good example. Very well-documented similar misbehavior for almost a decade at this point. There are a lot of reasons that the FTC has, you know, consent decrees established with them. It’s a pervasive problem; wherever data and demographics are being used to make algorithmic decisions, you have these risks at play.

Matt Fusaro 06:22
Right.

Jason Pufahl 06:25
So, I mean, do we expect more states to follow? We’re certainly working with clients in states that have had, you know, the block enacted, and they’re asking us for guidance. I think, just from a transparency standpoint, we’re really not treating this substantively differently than some of the platforms that already exist. To the point of Discord and Facebook, or Meta, I don’t think we’ve recommended to anybody that they go down the path of blocking unless they were required to, right, state agencies, etc. Maybe I’ll put this to you, Steve, specifically, because I think you’ve tended to field a lot of these questions. Any reason you see your position changing on this?

Steven Maresca 07:20
I think the utility of a block is questionable in some environments. If you can block at the network level, great. You’ve just prevented TikTok traffic from devices on a corporate network, an educational network, what have you. But the possessors of those devices can shift, without even being aware of it, to 5G, 4G, whatever the local secondary access mechanism might be. Does it still give an adversary similarly useful information? Yeah, potentially. I mean, you could easily assert that that person is a member of the community or the organization in which the block was applied, and still deliver meaningful information,

Jason Pufahl 08:13
At their local, on-campus coffee shop, right?

Steven Maresca 08:16
Exactly. So I think the conversation needs to be reasonable in the sense of, what are you going to achieve with a block? What residual risk is there? Substantial, if you think a block is actually important. The thing that would make me shift to, like, an outright prohibition with a platform like this, taking it outside the scope of just TikTok, would be, you know, knowledge that it’s being used, or planned to be used, to deliver some sort of unambiguously malicious content: misinformation, disinformation, something obscene, something explicitly malicious and weaponized. Those are reasons where you block a platform like that. The trouble is, you would most likely need to do it in a really comprehensive way to avoid any such risk. And that’s a difficult pill to swallow for many people.

Matt Fusaro 09:09
Yeah, you’re better off spending your energy making sure that the information you’re trying to protect from a platform like TikTok is in the right place and protected properly. If you want to use specific, vetted devices that get to that data, devices that don’t have TikTok on them, make that so. There are ways around this, because this is going to be an ever-changing thing; blocking one of these applications is not really going to help you.

Jason Pufahl 09:42
It’s just, I mean, it’s cat and mouse. That’s what it turns into.

Steven Maresca 09:45
It’s such a permissive barrier, because it’s socially oriented, that actually effecting a block is an extraordinary challenge. If you’re in the domain of national defense, whether you’re in the supply chain or directly in a federally adjacent institution, you probably want to have a policy and an explicit technical prohibition. Because, you know, there’s actual espionage oriented around people, where they sit, how densely they’re sitting, what work they’re doing, how fast they’re moving. All of those things apply offensively. And you don’t want that stuff to be known by an adversary. But that’s true for everything we’re talking about, not just TikTok.
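To make the network-level block Steve describes concrete: in practice it usually comes down to refusing DNS lookups or connections for known TikTok domains. Below is a minimal, illustrative Python sketch of that matching logic; the domain list is an assumption for illustration (real deployments rely on vendor-maintained domain feeds), and this isn’t a description of any particular product’s implementation.

```python
# Minimal sketch of domain-suffix matching, the core of a DNS- or
# firewall-level block. The suffix list is illustrative only; real
# deployments rely on vendor-maintained domain feeds.
BLOCKED_SUFFIXES = (
    "tiktok.com",
    "tiktokcdn.com",  # assumed CDN domain, for illustration
    "tiktokv.com",    # assumed API domain, for illustration
)

def is_blocked(hostname: str) -> bool:
    """Return True if a lookup for `hostname` should be refused."""
    host = hostname.lower().rstrip(".")
    return any(
        host == suffix or host.endswith("." + suffix)
        for suffix in BLOCKED_SUFFIXES
    )

# A resolver or proxy would consult this check before answering:
for name in ("www.tiktok.com", "example.org", "v16m.tiktokcdn.com"):
    print(f"{name}: {'refuse' if is_blocked(name) else 'resolve'}")
```

Note that, as the hosts point out, the same device falls outside a check like this the moment it hops to a cellular connection; that’s the residual risk a network-level block leaves behind.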

Jason Pufahl 10:30
So the conversation is certainly not going to go away. I mean, I think it’s interesting that it’s localized a bit to that one platform. There are murmurs about others, but I do agree it’s a little bit of a broader topic around, you know, social media, which is inherently designed to share information, and then that negative reaction to taking all the data people are freely providing and doing something with it that maybe they don’t agree with.

Steven Maresca 10:58
I think the providers of platforms like this are in a relatively impossible position. Because American law, as an example, or European law, gives those business entities safe harbor for content that might be problematic; they’re not obligated to police the content, not really, and they’re not liable for it in most cases. And as soon as we shift that balance towards making them obligated to behave in a certain way, we end up in territory where, frankly, the original underpinnings of the internet, which are eroding daily anyway, are going to fall out from beneath us. And I don’t know that there’s a balance that’s easy to strike there. It’s a philosophical debate at this point.

Matt Fusaro 11:47
It is.

Jason Pufahl 11:48
Yeah, I don’t know that we want to spend too much more time on this. I feel like it’s a good segue to say: if people want to engage in a broader conversation around the utility of actually blocking these things, do you use technical controls, is it really a policy decision, or around the risk of some of the other platforms more generally, because it’s certainly not limited to TikTok, we’re certainly happy to have that conversation and explore where it goes. I don’t want to treat our opinion purely as fact, by any means, right? There’s a whole variety of ways to approach this. But to date, our guidance to our clients has been, don’t necessarily block unless you’re compelled to, but be mindful of the fact that these aren’t privacy-enabled tools.

Steven Maresca 12:35
And for the time being, if you have effort to spare, invest it elsewhere.

Jason Pufahl 12:40
That’s fair. Well, as always, we hope people got value out of the conversation, and if you weren’t familiar with what’s been going on with TikTok, that this brought some of it to light for you. If there are any questions or further discussion you’d like to have, please feel free to reach out to us. We’re happy to talk. Thank you.

12:56
We’d love to hear your feedback. Feel free to get in touch with Vancord on LinkedIn or on Twitter @VancordSecurity. And remember, stay vigilant, stay resilient. This has been CyberSound.
