Cyber Ways Podcast

Voices of Privacy with France Bélanger and Donna Wertalik

July 31, 2023 Tom Stafford, Craig Van Slyke Season 1 Episode 20

Ever thought about the digital footprints you leave while surfing the web? What about those convenient log-ins via your social media accounts - ever wondered about the risks involved? This week, we're thrilled to talk with Professors France Bélanger and Donna Wertalik of Virginia Tech's Pamplin College of Business to help us unravel these intriguing questions. They're here to discuss their groundbreaking initiative, Voices of Privacy (https://www.voicesofprivacy.com/), aimed at raising awareness about the significance of online privacy and empowering individuals to make informed decisions about their data.

Navigating the digital world can be a complex affair, with pitfalls and challenges at every turn. In our conversation with Prof. Bélanger and Prof. Wertalik, we dissect the crucial distinction between security and privacy, highlighting the understated importance of data protection. We also touch upon the increasingly blurred lines between convenience and privacy, scrutinizing the risks of logging into websites and apps with your social media accounts. And we examine the role of big corporations in safeguarding consumer data and the dire need to raise awareness about this issue.

As we dig deeper into this compelling conversation, we explore the Voices of Privacy initiative further. We discuss their treasure trove of resources, including engaging webisodes and insightful talks with privacy experts. We also preview the upcoming webisodes on children's privacy and privacy during vacation - essential, thought-provoking content that everyone should check out. So, brace yourself for an enlightening exploration of online privacy and how you can better protect your data.

Voices of Privacy website: https://www.voicesofprivacy.com/


Cyber Ways is brought to you by the Center for Information Assurance, which is housed in the College of Business at Louisiana Tech University. The podcast is made possible through a "Just Business Grant," which is funded by the University's generous donors.

https://business.latech.edu/cyberways/

Tom:

Hello everybody and welcome back to another episode of Cyber Ways, a production of the Center for Information Assurance at the Louisiana Tech University College of Business. Today we're pleased to have two distinguished guests join us, Professors France Bélanger and Donna Wertalik, both of Virginia Tech. Now, Dr Bélanger is University Distinguished Professor, R.B. Pamplin Professor, and Tom and Daisy Byrd Senior Faculty Fellow, which is going to take both sides of her business card to print, in the Department of Accounting and Information Systems at the Pamplin College of Business at Virginia Tech.

Tom:

France has published in all of our top journals, MIS Quarterly, Information Systems Research, Organization Science (one of my favorites), among many others, and she's a world-renowned expert on information privacy, ranked among the top 1% most influential authors across all disciplines, noted most recently by Woxsen University in India, which has honored her career by funding the France Bélanger Chair in Information Systems. Professor Wertalik is also with the Pamplin College of Business, where she serves as a professor of practice in marketing and as director of marketing strategy and analytics. She's an experienced marketing and advertising executive, speaker, author and consultant, focusing on strategic marketing and predictive data analytics. She's also a noted keynote speaker and panelist. For over a decade, Professor Wertalik has been focused on the growing importance of data privacy and related topics such as virtual identities and the paradox of data access and protection. Welcome.

France:

Thank you.

Donna:

Thank you so much for having us on your podcast, really appreciate it.

Craig:

Great. Donna and France are here to talk about their exciting new initiative, Voices of Privacy. It's a nice combination of really informative videos and other resources, links to articles, that sort of thing, that can inform the public about information privacy. The goal of the project is to raise awareness of information privacy issues among the general public and to educate and empower individuals to make informed choices about the information they share online.

Tom:

So tell us more about your goals. Putting Voices of Privacy together and keeping it up to date seems like a whole lot of work. What are you trying to accomplish with it?

France:

So the whole idea of Voices of Privacy came from lots of research over many years, where we've been studying what people know about information privacy, why people share so much, what's going on.

France:

And one of those key sentences you often hear, "privacy is dead," is something that we don't accept.

France:

But in order to tackle privacy for society in general, you know, we live in our academic bubble and we hear about these things, but what about your mother, your brother, your children? What do they know, and how can they control how much information is shared about them or what they do online? So the whole idea of Voices of Privacy is, let's find a way to tell everybody what they can do, to tell them why it's important, and to do this in a way that is easy to understand. We're really, really hoping that everybody in the audience who goes and checks out some of our videos gets the feeling that the explanations are easy, clear and really helpful. That's the goal: let's make sure everybody can make an informed decision, and that was something Craig mentioned in the question. This idea of an informed decision is, you can't just say, I'm going to live in the offline world only; rather, let's make decisions about what we share, with whom, when, and understand why.

Donna:

The beauty of it is really a little bit of Yin and Yang with France and I: my industry background, always being an advocate of marketing, personalization, custom messages, but being very aware and cognizant of privacy and barriers and boundaries. And it's changing, right? In the last decade, different generational cohorts view privacy very differently than when you associate it with forms of their actual offline privacy. If you told them that as you were walking, for every step you took someone started walking with you, you would be very aware of that in the physical realm, but in the digital world that occurs as well, at hyper speed, and yet you have no idea. So it's interesting to look at different generational cohorts and what their views of privacy are. But once again, what's been done in the past has been done. We were looking to create real educational value at a third-grade consumer level that can help everyone, anyone who's online, any age, any stage.

Tom:

Before I forget, you both have been mentioning the excellent resources you have available on your website, and I wanted to get the URL out there for our listeners. It's www.voicesofprivacy.com. Go there, check those videos out and see their latest work.

Craig:

Suppose that I wanted to tell somebody about Voices of Privacy. Maybe I want to tell my students in the fall, go check this out. How should they get started? What should they check out first?

France:

So what's interesting is that we really are tackling different aspects of learning about privacy. We have more educational content in our webisodes, and the goal there is to clearly explain what information privacy is, why there is a concern, and things like that. But we also have the very specific how-tos: how do I set up my phone for privacy, how do I set up my Alexa or my TikTok account, and so on. Those get a lot of attention from people who say, I did not know I could do this. So that's one side of it, knowledge: the webisodes, again, are about awareness, understanding, the why, the what's happening. And then we have these privacy talks, which are a little bit like what you're doing with your podcast. Our privacy talks are conversations with experts, and they could be really academic, major research projects, but we turn them into something understandable.

France:

What does that mean for everybody? Why is my car smart? What does it mean to have a smart car, and should I be concerned about the information I'm sharing with the car manufacturer? Those are the kinds of conversations we have as well. So where to start? I would say, and thank you, Tom, for pointing this out, I would start with the website, which has a collection of everything. Now, we do post on social media, and this is where the Yin and the Yang comes in, because social media is Donna's expertise. We sometimes post smaller segments of what we do there, and then we put everything, all our collections, on the website.

Donna:

Rome wasn't built in a day, and it's not going to change overnight. We've gotten this question a lot, where someone in the audience will say, well, my child is stuck to their phone, they do this or they do that, and they couldn't care less about any of it. And I say, well, they weren't educated. If you think about it, it starts very young, just building awareness, and that's an education of the child as well as the parent, because it starts very young.

Donna:

Online usage, even from a psychological perspective. When you look at this from an ethnographic perspective, at a normal family, most of them are staring at their phones most of the time, even at dinner, unless the family is actively saying, put the phones in the middle, let's have a conversation, and actively discussing anything and everything without being online, but also knowing, when you are online, this is what it is. So I think everyone has a choice.

Donna:

My younger daughter, particularly, is not on social.

Donna:

She's going to be a freshman in college soon, and it's the aspect of, I like authentic conversations, so why put everything out there, plus what Instagram has shown through the years in terms of psychological impact and mental health and online bullying, silent bullying and all of that. She sees all that has occurred and says, why do I want to be a part of that? Privacy, all of the apps, everything being tracked. We're not going to fix it overnight, but more and more, every single day, you see what companies are being pushed to do in terms of asking, do you want these settings, and making it a little more than a very, very small font on a website that most users ignore. It's becoming more prominent as pop-ups, where users are saying, oh, I had a choice, I didn't know that. So once we continue to educate on choices and simple settings and, quite frankly, getting content you want versus just spam, I think we'll have a richer conversation and, hopefully, more privacy.

Craig:

Right. So it does seem like inform and empower is the theme that runs through the entire project. I want to maybe throw you a little bit of a curveball, and I'm going to do that in two ways. First, France said you all don't believe that privacy is dead, but I think we'd all have to admit that it's kind of on life support for a lot of people. Tom and I just had a conversation with a number of folks the other day where some had the attitude of, what's the point? It's just too much to do all this. So how would you respond to that?

France:

Well, we hear that all the time, and my response is that it's not dead. It's about how much you want to balance, and we'll talk a little bit later, maybe, about the costs and benefits. But the idea is that you can decide where it's okay to share certain things. We have, for example, parents sharing pictures of their children and all kinds of information about them, and then Zuckerberg, when he puts a picture of his kids online, puts smiley faces over their faces so we can't recognize them. But most parents don't do that. So what are the issues? Well, the issues are that, particularly with all the software available today, we can create profiles, we can recognize people, we can even pretend to be individuals through artificial intelligence or other tools like that.

France:

We did a panel on privacy with the president of our university, and I asked two students if they would allow me to search them, and we did that. One of them I knew; the other one I had never heard of. And because I know the tricks of the trade, right, I didn't search them myself. I tasked two searchers, students who are very smart, but I only gave them a name and said, find everything you can. When they brought it all back to me, I was able to create profiles of who these students are: political affiliation, how they met their significant other, how long ago, and so on and so forth, to the point where I even found their parents' mortgage account online. And that's the scary part. What people don't realize is that little pieces of information together become a profile.

France:

That doesn't mean you have to put everything out there, and so you make decisions. Do you really want everybody to know exactly where you were at every minute of your day, when you took pictures and then posted them online? It's a choice, but most people don't even know that their pictures are recording location and date. If they know that and then decide, you know what, I'm okay with that, I want everybody to know what I did Tuesday at 3 pm, it's their choice. Let's just make sure they know that they're making that choice, and that's what we're about.
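For listeners who want to see what that hidden photo metadata looks like in practice, here is a minimal sketch, not from the episode, that inspects and then strips a photo's embedded date and location before you share it. It assumes Python with the Pillow imaging library installed, and the filenames are hypothetical.

```python
# Minimal sketch: inspect and remove the hidden date/location metadata (EXIF)
# that phones embed in photos. Assumes the Pillow library (pip install Pillow);
# "photo.jpg" and "photo_clean.jpg" are hypothetical filenames.
from PIL import Image

def show_location_and_date(path: str) -> None:
    """Print the timestamp and GPS data embedded in a photo, if any."""
    exif = Image.open(path).getexif()
    date_taken = exif.get(306)          # EXIF tag 306 = DateTime
    gps_info = exif.get_ifd(0x8825)     # EXIF tag 0x8825 = GPS sub-directory
    print("Date taken:", date_taken or "none found")
    print("GPS data:  ", dict(gps_info) if gps_info else "none found")

def strip_metadata(path: str, out_path: str) -> None:
    """Save a copy that keeps only the pixels -- no EXIF metadata at all."""
    img = Image.open(path)
    clean = Image.new(img.mode, img.size)
    clean.putdata(list(img.getdata()))  # copy pixel data only
    clean.save(out_path)

if __name__ == "__main__":
    show_location_and_date("photo.jpg")
    strip_metadata("photo.jpg", "photo_clean.jpg")
```

Most phones also let you deny the camera app access to your location, which stops the geotagging at the source rather than after the fact.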

Tom:

France, in one of your episodes the statement was made that you can't have privacy without security. I think I know where you're going with that, but could you put it in lay terms for the audience?

France:

Absolutely. So a lot of people confuse the two terms. Right, the idea is, security is about protecting the information, the access to it or the systems. And I like to use the example of a house. If you lock the door of your house, you're using security so that nobody comes in and steals what you have in your house.

France:

The privacy aspect comes in once people are allowed into your house. You have a party or something, and people come into your house, and you have, say, a nice painting or a nice sculpture, and you say to people, please don't say anything about this beautiful sculpture I have in my house. Then they go out and start telling everybody, or even posting pictures of it. They've violated your privacy. So the idea is, of course, if my data is breached, I have no privacy over that data, and we all know this from all of the breaches we've been victims of. I'm sure you all have too.

France:

I have been a victim of multiple breaches. So if my data is breached, I don't have privacy. But even if it's secure, you may not have privacy if somebody shares the data because they have access to it. So that goes back to what we were saying about deciding what you're going to share with whom, and then deciding, is this a reliable party, are they going to respect my privacy or not? Those are the kinds of decisions.

Tom:

It's almost as if deciding not to be totally private is to default to a broadcast to general society across social media. If you don't lock it down, it's available to be shared on, I think, is my sense of it.

Donna:

The first step is setting up the app. We were just having a conversation earlier about the Threads app, the brand-new app, and how it takes less than 10 seconds to set up because it integrates with Instagram. That's wonderful, but they also make it very easy for a reason. For the user not to just blow through all the settings and say, yes, accept everything I've done on Instagram, they need to pause. I actually did a test on it: I was private for the first 30 hours that I was on Threads and really didn't select anything. And guess what? My experience was bland. It wasn't enriched by any comments or anything along those lines. So I went back, but then I customized it to say, I want to see this, I may not want to follow this person that I do on Instagram. So I think it's really important that people think about the usage, especially of Threads, you know, 30 million downloads in the last 24 hours.

Donna:

People are yearning for a new platform, but which generational cohorts have adopted it and which have not is quite interesting. At the end of the day, you still have a choice. The user has control, and that has not really been publicized as much as it should be, because most of these platforms are set up to monetize. So do they want you to know you have choices? Well, now they do, now that the government is part of it. But even for them it's better, because they end up with higher-quality consumers to target.

Craig:

I totally understand everything that you're saying, but I want to play a little bit of devil's advocate here. It's the secondary data sharing that concerns me, where there's some nebulous language that says we may share this with our partners, not specifying who the partners are, what they're doing, nothing. So how do we protect against that?

France:

So that's where settings are the first step. Right, you can say, I'm not going to share. If you, for example, watch our smartphone how-to, our iPhone settings episode, there are many settings, and they're turned on by default, and even when you have a new software update you have to get back into the settings. So there are some choices there.

France:

And I'm going to take it further, Craig. It's also the data that's collected that we don't know about, and that's the tracking part. We do spend a lot of time talking about tracking and the digital footprints that we leave online, because that is the part that people don't even know about. So we're really talking about multiple levels of trying to protect yourself. Decide what you're going to share via the settings. Once it's shared, and this is the big issue, if somebody shared something online five years ago, forget it.

France:

You're not going to fix that. It's been shared, it's online. There's very little that can be done about the past. What you can do is decide what you're going to share in the future, and that's part of it. So what can you do about the secondary data? First, don't share in the first place. Be careful of your digital footprints. We talk about using a private browser if you can, because there are times you can't, because, of course, the websites are designed so that they want all your information. So there are certain things you can do to protect yourself on the elements that you know of, but what if you don't know?

Donna:

Right, right, and it really is just being present and cognizant and having that awareness, and hopefully a lot of these companies will be too. So whether it's a website where you want to say, I want to share my information with ad partners, or send me marketing, send me news, whatever, with apps it's the same thing. You know, two and a half years ago they weren't being called on the carpet.

Donna:

Now consumers are getting more and more rights within this space, and it is their data, right. So when you look at it, if consumers knew, guess what, your data has value. I think I said it before, there's a coffee shop in Texas that doesn't want your money for coffee, they want your data, and that was a few years ago. So I think it's just making consumers more aware, and I don't know, in the future people may say, hey, we'll pay this much for your data because you're an influencer in this area or that area, and that really puts the power in the consumer's hand to say, what am I getting for my data?

Donna:

So I don't know what the future is, but it's definitely more privacy-focused, with more value placed on our personal data.

Tom:

I wanted to reinforce a point. France made mention of an episode of theirs on tuning your iPhone up for privacy and security. I wanted to remind the listeners that the website is www.voicesofprivacy.com, one word, where they can find that, because I feel like the iPhone is one of the worst vectors for privacy out there, particularly if you're on AT&T, which promises to sell your data; they're fairly obvious about it. But we all use Facebook on the iPhone and we do this and that on the iPhone. I'm sure there are a lot of users who want to go learn how to lock their iPhone down. Do you have anything like that for Android?

France:

It's in the plan. Actually, we do want to do that; we haven't put that one out yet. One of the things that's happening is Donna and I do this on top of our other regular jobs, so we're trying to tackle some topics. For this summer, the two episodes that are not yet out there, one of them is about children and privacy, because it's so important and we know they're all out of school, so what do they do online in the digital world? So we have a webisode, as we call them, on that. Eventually we will get to the Android one, but, apologies, we don't have that one yet.

Donna:

For the show, Voices of Privacy, we have a litany of different topics, and what we're doing is weighing them. We're also getting feedback from listeners, but easily blending in how-tos for the settings on any phone, and making sure we cover both, is definitely the goal. I mean, it's marketing, it's the world of privacy, it happens so quickly, so we want to be relevant as well. But we also want to have that base of how-to videos that any consumer at any time can come back to, find value in, and update from there.

Craig:

This is going to seem a little bit out of the blue, but I've been wondering about automated driving systems, self driving cars. How does that relate to privacy?

France:

Without data, it's going to crash if you let it run. You say okay, just like self-park. Well, self-park requires the cameras to take measurements, and then it requires everything to just happen. So if you don't have the data, the car won't park itself.

Craig:

That's really interesting. It'll be very interesting to see how that all turns out, but I want to switch gears a little bit. Something that's been running through my mind as you all have been talking: we used to live in the city in St. Louis, you know, not the safest place in the world, but we locked our doors. I mean, it wasn't, oh, we can't absolutely keep a burglar out, so there's no point in even trying. So part of what I'm hearing is, yeah, are you going to totally protect your privacy? No, but that doesn't mean you shouldn't protect it to the extent you can. You also don't just leave your doors wide open, even though if a professional burglar wants to get into your house, they're going to get into your house. So the logic doesn't quite track sometimes.

France:

People are overloaded, they're discouraged with trying to control settings. I mean, this is what we do, and I can't keep up. And the whole debate about Threads is just fascinating, how many settings there are. It's linked to your Instagram account, and if you want to delete your account, you have to delete both. I mean, who has hundreds of pictures out there? Who wants to delete that account because suddenly they don't want the other one? Generative AI, you know, when everybody was talking about ChatGPT, there are multiple privacy aspects to this, and it's going to stay, because even though that particular software was very, very popular and maybe people will move on to a different one, the concepts behind it and the privacy issues are going to stay, and that's what we're about. So, yeah, I appreciate you mentioning the idea of, let's do what we can instead of just giving up completely.

Donna:

And you know, greg, I think that because we talk about this a little bit in our class too, and when you talk about privacy online, the students are like, I don't really care, you know this, and that Then, when you give the example of what you just did, so you gave me, you gave us a perfect prompt for this. So I say so, when you go home, you leave all your doors open, you take everything on, the all your jewels, and put it right on the table and say let's go. I said, because that's the same thing. I said, so walk with me for a second. I said, and one of the things I say is for every click, imagine how many steps you would take, because that's how far you're going on a digital trail and creating footprints, and those are some beneficial because you want to go from Facebook to this and Facebook's still with you. So you know, if they knew the entourage that was following them every single time you were online, no one would make up to them.

Tom:

I share with my students this notion: when I left the marketing profession, they were just starting to talk about this as database marketing. It's become sophisticated since then. It's not just Equifax and TransUnion parsing all of your credit card records. With AI and advanced analytics in real time, they can figure out what you're shopping for and bounce up ads that are extremely relevant to what you're looking for. I'm building a guitar right now; I'm Googling guitar necks and machine heads for the tuning, and I'm getting lots of interesting ads out of that. But the thing I remind my students about when I give them examples like that is, you're exchanging your privacy for a marketing benefit. If you don't mind them knowing, they will show you what you're looking for and they'll help you find it at a price you like. That's how good it's gotten just of late.

Donna:

A hundred percent. I mean, it's the right message at the right time to the right target, on the right platform, at the right frequency. We always used to have just the first three, but it truly, truly is that when you think about it. From a marketer's perspective, it's GIGO, garbage in, garbage out. Otherwise we're just wasting time speaking to the wrong consumers; we're never going to get the conversion, whether it's to purchase, to follow, to subscribe, whatever it may be. So the goal truly is to get quality candidates, a quality target audience.

Donna:

And, yes, there's the AIDA model, which is awareness, interest, desire and action, that big funnel concept where you're trying to reach and grab everyone. But if it's already a defined market, you know the people, the competitors, et cetera. So I honestly think marketers could learn a lot too, and this is going to benefit them. Wherever it's going to go, we just have to be very aware and present that these apps are determined to get our information and to monetize our data, and go into that from an educated and street-smart perspective, but with online smarts.

France:

If I can add something to that, as we're talking about educating, we really do think that we have to start way earlier than we're doing. We're not talking about educating university students. We need to start when they're very young, because that's when they start moving into this digital world and it's normal to do things on the phone or on mom's iPad or whatever. So, at some point, whenever they're mentally, cognitively able to understand, we need to start educating them about making these choices. But, of course, for that to happen, we need the parents to have that same willingness.

Tom:

So, as we're thinking about what people can do to protect themselves to the extent that they feel they want to opt out of sharing their data, give us a list of to dos and not to dos to keep them safe, if you would please.

France:

So I'd like to start with: educate yourself. I'm actually going to say listen to this podcast, and start looking not only at Voices of Privacy but other resources, to know what is being collected by the various devices and services and apps that you use, and then at least you'll know what is happening. And then there are certain things that you can do, and this is where, you know, we talked about security versus privacy before, and a lot of the good privacy tips are also security tips, because you need to protect your information both from a security point of view and from being reshared.

France:

So the second one I would say is share cautiously, because, and that's what we've been talking about, sharing cautiously is totally linked to making informed decisions. Do they really need to know this? Do you really need to post all of this? Does everyone need to know certain things? And, by the way, Tom talked about having private versus public accounts before. If you have everything private, but the people who are close to you and linked to you are not private, we can find information about you. So think about what you really want out there and just be very careful about what you share.

Tom:

I'm like the millennials that Craig was talking about earlier: why worry, it won't help. You mentioned Scott McNealy's famous quote about privacy being dead. It's coming on decades since he actually said that, back about the time he also said the network is the computer, both prescient predictions about the future of IT. But we've been telling ourselves privacy is dead for decades and we still haven't quite gotten the message, which is what I fear.

France:

Well, I don't want us to think privacy is dead. I think we want to say privacy is possible. It's being attacked on all fronts, and we just need to be vigilant about it. So sharing cautiously is one thing. There are all kinds of scams out there that Donna and I talk about often. We've all been victims of that text message you suddenly get. I had a text about a package that couldn't be delivered, and it happened that that day I was anxiously awaiting a package, so I actually clicked on the link. Of course, I didn't put any information in after that, because as soon as they started asking me for information, I thought, you should know this. So I just deleted it. Donna, if she wants, can share one of the stories that happened while we were together. But in case of doubt, just protect yourself, don't do it.

Donna:

Googling yourself. I always talk to students about that, and it's interesting. The more public you are, obviously, the more sites and opinions, et cetera, but it goes down to every different level. Looking at it from another standpoint too, it's not just Googling yourself for the bad, but understanding what your name turns up. Let's say you are a John Smith or a Mark Jones, whatever it may be; there are a lot of stories that could go under that name. So what does that mean, and what information is coming up about you? I always say curate toward the good, not the bad, the award winner and this and that, because it gets picked up; the internet is constantly being scraped for it.

Donna:

So I think it's just people being cognizant of where they are, what comes up about them, their settings. Once again, years ago people could check into a location and tag a friend and say, this friend is with me. All of a sudden they've given away your rights, and the public knows your location because your friend decided to share it. So I don't think privacy is dead when it really comes down to understanding both the dangers and the benefits of it. I think the other piece is the vision, the goal, trying to keep it at a third-grade reading level. I'm so excited about this, and my mind is constantly going in terms of what apps, what privacy topics, because we want to build a resource of tools that's never been done from this position, from this type of platform, and get it in front of people. So whether it's Senator Warner's conference coming up in September, which is about women and businesses and information they can't get their hands on, or going through the school systems and asking, how do we get this in front of K through 12? How do we put out some of these videos, some of this awareness, different things, very simple, almost game-like for them, so they really understand what's occurring online? I think that education happens at the beginning, and if Voices of Privacy can help, we're going to see the impact, but it's going to take time. And it's interesting, because I think we're all learning, as much as I'm immersed in this.

Donna:

France and I were together, and I got a text that said, I need to drop off the dress but I'm running a little late, very generic. So, as a good human, I was going to redirect and say, you've got the wrong person, or whatever the case, and France stopped me in my tracks and said, it's phishing. So I feel bad for anyone who gets a wrong number these days, because I'm not responding to anything, but that's what starts the phishing, the conversation: oh, you have the wrong person. Oh well, actually...

Donna:

And then it continues on and on, and they start to bait you. They do this a lot with senior citizens, obviously; they do this online constantly. I mean, we just had something in my own area and department where an email came, supposedly from my boss, and said, hey, I need to talk to you immediately, send me your phone number. I knew my boss had my number, so I immediately forwarded it and said, there's a phishing attempt going around with your name. So we have to be proactive as well, really be a part of the solution, and not just sit back and wait for answers to come.

France:

Tom, if I can go back to your question asking for a couple of tips, I want to mention two; one I'll let Donna talk about. The first one is to close accounts that are unused. We keep opening accounts left and right for everything, and you just say, oh, I'll drop it, I don't care, I'm not putting anything there. But every account has some information about you, and so if it's something you're not going to use, you should close it, just like you should delete apps you're not using. Keep only what you really need. That would be one set of tips: do we really need all these other things? And then the other thing is about linking accounts. Donna, do you want to jump in on that one?

Donna:

So it's really important. Once again, we're a time-poor society; if something's easier, we're going to do it. So you're on Facebook, you see this great ad about a trip to Italy, you click on it and use Facebook as your login. When you go over to that site, Facebook's still with you, watching and tracking everything. And there is something recently, it's called Off-Facebook Activity, but you once again have to go into your settings and set that up to see the ad partners.

Donna:

If I use Facebook as a login, Facebook comes with me. So, just in general, think about how many logins you have and where you access things from. Do you access from your Google account, your Facebook, your Instagram? Because every time you use any social platform as a login, they're coming along for the ride, and I think that has gotten really, really important. Also, since COVID, the amount of subscriptions and apps, I mean, we were all online, we were immersed; that was our world. Can you imagine the number? Nobody even looks to say, how many subscriptions am I not using? Where was my name sold during COVID, to all these other companies? There was just an enormous amount of data that was given and shared during that time that is still out there, and there was some aspect of commercialization of your data. You don't even remember what data you gave, but it's continually being resold.

Craig:

Can I give a personal one that I've started following, which is along the same lines as the syncing of accounts? If it's too convenient, I'm trying to train myself to pause and think, what are the risks here? It's so convenient to just log in with Google, log in with Facebook. Do I really want to do that? The UPS text, if a package is late, well, I can go to UPS.com and check it myself. So my personal rule is just pause for a second.

France:

And Craig, you're really touching on an important point, which is the whole effort, right? There's no easy way to privacy. There's just no way to say, it's easy, all I have to do is X. So, for example, because I use a private browser, because I don't want certain information out there, my search history and things like that, whenever I want to log into the school I have to do two-factor authentication, and it can't remember me for seven days; it can't actually remember me for one hour. So every time I log in, I have to redo this. It's a choice that I make, and it's an effort, and we understand people have limited time, limited energy.

France:

But there are certain things, like, I really like the example of spring cleaning. Right, just go and clean out the things you don't need. Once in a while, when you do a major update to your software, take five minutes to go check the settings on privacy. In most places it's pretty easy now, well, fairly easy; there is at least a set of privacy settings that you can find. There are always other, more hidden aspects, but at least there is a set of settings with the word privacy on them, so you can find them and just check them.

Donna:

It's really important, and we don't do it, but I hope it's becoming more common to be present and just take a peek, take a moment to see what you're signing into, what information you're giving. All of that really comes down to these audiences being, like I said, time-poor. That's why you've got everyone using Facebook for the login: it's easier, it's already done, we're on our phone. I mean, think how fast we make choices, think how quickly purchases are done now that they're online. You see an ad in your Instagram, you can look at it, you can purchase it right there; it's all done within less than five minutes.

Donna:

And they know when to run the ads and when to grab you, based on timing and, obviously, patterns. So, like I said before, the 10-second onboarding for Threads, they did that purposely. People do not want to go through a number of steps. If you lower the barrier, especially on a new app, the adoption rate is going to be higher, right? That's why you saw the 30 million downloads within 24 hours, and now they're up to something like 100 million. It's just knowing the consumer, understanding what to put in front of them that they're going to respond to, because it's timely, it's relevant, or it really just brings value to them.

Craig:

Absolutely. Yeah, it's a problem, it's a challenge. Let's put it that way. You know, I think people need to focus on what they can do, not what they cannot do to protect their privacy.

Tom:

What I'd like to know is where you're going in your next few forays into this. We've got the iPhone, and you say the Android one's coming up. What other things are on your horizon?

France:

So we've already recorded a set of webisodes about children and privacy, and also one we call privacy on vacation, and you can see the double sense there, because it is a big issue. Suddenly we don't care, you know, we're on vacation, we don't want the trouble of thinking about these things, we're going to use that convenience, and so on and so forth. So what happens to privacy when we're on vacation? Those are already recorded, and we have a whole set of recordings for the fall lined up.

France:

This year was our big launch, and we're keeping it up all the time. Every week, on Wednesdays, we post something: sometimes relevant articles, sometimes webisodes or privacy talks. We're going to keep that pace at least till the end of the year, and we'll see what happens after. We're still measuring the impact. We want to get feedback from our audience; we're really looking forward to more feedback on that. But it's exciting. The problem is we don't have enough time, because there are so many topics. Privacy is everywhere, data is everywhere. We could literally have a webisode a day and we wouldn't tackle everything we need to talk about for privacy. We are also going to talk with a county school district to see how we can use what we're doing to help educate the younger ones, so we are having conversations about that. And we have an amazing conference coming up that we're going to present at. Donna, I'm going to let you talk about that one.

Donna:

Yeah, which is very exciting. I've worked with Senator Warner for over a decade and have done things with young professionals in the past, so I'm very excited to have this opportunity to do an interactive session. Obviously, everything France touches is incredible, so I'm excited for us to do that. And then we're working with schools and already have an appointment with the superintendent; hopefully we'll get in front of the school board. We're also looking at potentially a monthly segment on some of this, and working with potentially some other stations and, obviously, Virginia Tech. So there is a lot to come. We want to be as relevant and timely as possible. We want to provide the how-tos, the expert talks, but also keep our finger on the pulse. That's where we want to be with topics, looking at things like children's privacy, senior citizens, different groups and their privacy. There is so, so much to look at, to uncover, to engage in, and to continue building this community.

France:

And we're really trying to think outside the box. We're thinking about how we're going to present some of what we do at that conference and how to create materials that people are really going to pay attention to, which you all know is difficult, right? We're thinking about little tip sheets that we'll share. We're also developing some quizzes so you can self-assess your knowledge about Instagram settings, and we're hoping these will help people who say, I really think I know what I'm doing. Answer this self-assessment and you'll know whether you're really knowledgeable about it or not, and if you're not, here's where you can go and learn more. So it's really about, can we make a difference? Can we get everybody in society to pay attention to their data-sharing behaviors and to understand what that means?

Donna:

We are in conversations with a publisher that I've worked with for a long time to do a beta in the spring with our content, which I'm excited about. I'll continue to have it as part of my own 600-level class. But then, yeah, we are doing some immersive work. I spent my Friday at the base of Mount Fuji, immersively, with a couple of other students, doing some unique things. So we're definitely in the metaverse right now, on a platform that gives us access to 45 different areas within VR. We're building that and learning that, but also looking at what information they need for us to have a really good immersive, in-headset experience. So that's coming up.

Tom:

Folks, we've been talking with Professors France Bélanger and Donna Wertalik from Virginia Polytechnic Institute and State University. I used to live in Blacksburg. They have Voices of Privacy online. I've given the URL several times; one last time, and it's not a shameless plug, you really need to see what they've got for you there to protect your privacy: www.voicesofprivacy.com. And we'd like to have you guys back to give us an update after you've developed some more material, because this is vital information for our audience.

France:

We'd love to. Thank you so much, Tom and Craig, for having us on your podcast. We want to make a difference, and if anybody has any questions, they can look at our website; we have our contact information and resources there. Even if somebody just has a great idea, or something that's really on their mind about privacy, we want to hear from them too.

Donna:

Absolutely. Thank you so much.

Tom:

This has been the Cyber Ways podcast, a production of the Center for Information Assurance at the Louisiana Tech College of Business, and we're glad to have you with us. Find these podcasts wherever you obtain them and share them with your friends.

Voices of Privacy
Privacy and Security
Privacy and Online Security Tips
Privacy in the Digital Age
Cybersecurity Experts Discuss Online Privacy