Cyber Ways Podcast

Cybersecurity and the herd with Dr. Merrill Warkentin

September 15, 2021 · Tom Stafford, Craig Van Slyke · Season 1, Episode 1

Whether we know it or not, our behaviors, including those related to cybersecurity, are influenced by others. Sometimes this is obvious, such as when we read reviews before reserving a hotel room, but often the effects are more subtle. In this episode of Cyber Ways, Dr. Merrill Warkentin of Mississippi State University joins us to discuss his 2020 paper, "Can Secure Behaviors be Contagious? A Two-Stage Investigation of the Influence of Herd Behavior on Security Decisions," which was published in the Journal of the Association for Information Systems. Dr. Warkentin co-authored the paper with Dr. Ali Vedadi of Middle Tennessee State University.

Vedadi, A., & Warkentin, M. (2020). Can secure behaviors be contagious? A two-stage investigation of the influence of herd behavior on security decisions. Journal of the Association for Information Systems, 21(2), 428-459. doi: 10.17705/1jais.00607.


Cyber Ways is brought to you by the Center for Information Assurance, which is housed in the College of Business at Louisiana Tech University. The podcast is made possible through a "Just Business Grant," which is funded by the University's generous donors.

https://business.latech.edu/cyberways/

Tom:

Folks, this is the Cyber Ways podcast, where we translate our academic knowledge about information security into stuff that you can use as a security professional. We think it's a unique mission, and we think you'll like it. I'm Tom Stafford, here with Craig Van Slyke.

Craig:

Tom and I are your hosts on your journey to knowledge. Cyber Ways is brought to you by the Louisiana Tech College of Business's Center for Information Assurance. The Center offers undergraduate and graduate certificate programs in cybersecurity and sponsors academic research focused on behavioral aspects of cybersecurity and information privacy.

Tom:

Welcome to Cyber Ways. Today we're happy to have Dr. Merrill Warkentin with us. Merrill is the William L. Giles Distinguished Professor and James J. Rouse Endowed Professor of Information Systems at Mississippi State University. His research goes back a long, long time; I've known him for 20 years, and it goes back further than that. His research focuses on factors that influence individual behaviors in the context of information security, privacy, and social media. He's published over 100 articles in leading journals in our field, including the ones we'd all want to be in: MIS Quarterly, JMIS, JAIS, and a number of others. In 2018, Merrill's contributions were recognized by the Association for Computing Machinery when he was named an ACM Distinguished Scientist, and that's a huge honor in our world. So welcome, Merrill.

Merrill:

Thank you, I'm really happy to be here and share some of my work with your listeners.

Craig:

So Dr. Warkentin is here to discuss his 2020 paper, "Can Secure Behaviors Be Contagious? A Two-Stage Investigation of the Influence of Herd Behavior on Security Decisions," which he co-authored with Ali Vedadi of Middle Tennessee State University.

Tom:

So Merrill, give us a big picture of the paper. What's it about? And also mention your method at a very high level, remembering that we're talking about our research for our constituents in industry who might want to put those findings to use.

Merrill:

Sure, Tom, I'm happy to talk a little bit about this work, because it's really important. We know that users are the weakest link in information system security. And we know that influencing users to engage in secure behavior is an important objective of a CIO or CISO and other organizational leaders. So what we know is that users don't always have the best information. We try through our training programs to give them information, but oftentimes they're uncertain about what's the right way to go, whether they're an employee or just a home computer user who's not sure what to do. So what we were interested in is: when people aren't sure what's the best solution, what are some of the things that guide their decisions about engaging in cybersecurity hygiene behavior? And, you know, what we noticed is that in many different contexts, people follow the herd. Let me give you an example. You're in a new town, you've never been there before, and you don't know what the restaurants are like. But you see that there's a really popular restaurant a lot of people prefer to go to. Well, they must know something you don't know, right? That's an example of herd behavior. Obviously, we get the term herd behavior from, you know, when the wolf attacks and you see the herd moving away from the threat, or when the shark attacks and you see a school of fish moving away. But in this case, what we know is that individuals who have some uncertainty about the threat, in this case an information security threat, might just follow the herd, because those people know something you don't know. So what we did is we picked a technology that a lot of people have uncertainty about, namely, password managers.
So a lot of people know that they aren't picking the best passwords; they don't know how to keep track of their passwords, they lose their passwords, they reuse their passwords. Surveys show that two-thirds of people reuse passwords, which is obviously a big threat, especially for your banks and brokerages and so forth. And so what we wanted to do is explore this technology, because it was one where people might have less information and more uncertainty. And then what we did is we created an experimental design where we had two groups, like a lot of science you've heard about: we have a control group and a treatment group. The control group and the treatment group were each exposed to a little bit of information, but the treatment group was given popularity information. Now, we all see these all the time: you see the five-star ratings at a travel website, you go to an online retailer and you see, you know, popular reviews and so forth, you see that thousands of people have downloaded something, so you think it must be popular. And that's the kind of thing we did. We told them that this particular software tool, I won't name it, but it's in the paper, that this particular password manager was highly popular, in the treatment group only. So we have this treatment group and we have this control group. Both groups were asked a few questions, and then the treatment group was given this popularity information. And then both groups used the software for one week. After one week, we went back and asked them some questions again, and we validated that they had used the tool, the password manager. And then what we found was that the group who had been given the popularity information was much more likely to want to continue using the tool.
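
The two-group comparison Merrill describes can be sketched as a simple analysis. This is an illustrative Python snippet with synthetic, made-up continuance-intention scores, not the study's actual data; the group means, sample sizes, and seven-point scale are assumptions for the sketch.

```python
import random
import statistics

random.seed(42)

# Hypothetical seven-point continuance-intention scores for the two groups.
# The control group saw no popularity information; the treatment group did.
control = [random.gauss(4.2, 1.0) for _ in range(100)]
treatment = [random.gauss(5.1, 1.0) for _ in range(100)]

def welch_t(a, b):
    """Welch's two-sample t-statistic (does not assume equal variances)."""
    va, vb = statistics.variance(a), statistics.variance(b)
    se = (va / len(a) + vb / len(b)) ** 0.5
    return (statistics.mean(b) - statistics.mean(a)) / se

print(f"mean control   = {statistics.mean(control):.2f}")
print(f"mean treatment = {statistics.mean(treatment):.2f}")
# A large positive t means the treatment group reported higher intention
# to continue using the password manager.
print(f"Welch t = {welch_t(control, treatment):.2f}")
```

The actual paper uses a more elaborate two-stage model; this only illustrates the basic logic of comparing the treatment group's continuance intention against the control's.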

Craig:

You've addressed this a little bit in your motivation for doing the research. But I wonder if we could dig into why you chose this particular question. And what I want to get at here is what I think is an underlying, often implicit assumption of a lot of security research, and that's that users know what security technologies are capable of. And I think, if I remember correctly, you challenged that assumption.

Merrill:

Yeah, that's a really good point, Craig. A lot of the research that the three of us and others are doing related to the behaviors of individuals in the security context presumes rationality: first of all, that people, you know, cognitively look at costs and benefits, or risks and rewards. And it also assumes that people have information on which to make their decisions. But the truth is that, number one, people don't always act rationally. Now, that doesn't mean they're irrational; sometimes it just means it's non-rational. It's not a deliberate decision; it may be a decision that follows minimal cognition, you know, a habit, or some other factor that we are trying to understand in some of our other research projects. And then we find that a lot of people don't have sufficient information to really make a well-informed decision. And you know, in the real world, this is probably true also. I recently booked some hotels for a trip, and it wasn't like I had all the information in hand, but I felt like I had sufficient information to make the decision. Scientists call that satisficing. You know, it's not an optimization; it's more of a satisficing kind of situation. And we know that a lot of people have a lot of uncertainty about security. All of us get asked security-related questions all the time; I don't know how often I've had somebody say, well, what would you do, what kind of passwords would you recommend, what's the best cloud storage, and things like that. So what we knew is that the assumptions of most of the research that we've done, including a lot of my own research, are really flawed. And the flaw is that the theories that we've been applying primarily require that users have full understanding of the trade-offs, that they deliberate about their decision, and that they consider the alternatives and make decisions that are rational.
And so to do this, we are looking at various other factors that influence the outcome of these decisions. I'm using air quotes here, "decisions," because sometimes they're not even explicit decisions; they're just actions people take without much decision making. And in doing so, we stumbled on this idea of psychological nudges. In some of my earlier work, we found that you can influence people by giving them, here's an example: "A recent survey of our employees concerning this policy showed that over 85% would not share their password even with another employee, regardless of the circumstances." Now, that's a treatment we had in another paper we published a little earlier, and we found that it had a significant influence on people's intention to engage in policy violation. And this kind of comes from the work of behavioral economists who have looked at the role of psychological nudges. You know, they did a test in the UK with tax paying, and when people were told that other people were paying their taxes, they were more likely to pay theirs. So we looked at the aspect of social influence. Now, social influence is when somebody you know, somebody who may be your boss, your coworker, your family member, somebody who's important to you, thinks you should do something. For example, if I tell somebody, you should use this software, they're more likely to do it, because they trust me as an authoritative voice. But what's interesting about herd behavior is that you don't have to know those people. That's just the herd. That's the popularity information. That's the thousands of people out there who are nameless and faceless, who are anonymous to me, but I know that they must know something I don't know, or they wouldn't be going that direction. And that's what's really interesting: it's different from social influence, where it's your boss or your spouse or somebody telling you you should do something.
So we began to look at that, and we decided we could create this field experiment to manipulate the popularity information, and it yielded some interesting findings.

Tom:

So essentially, you're talking about managing the popularity of security hygiene. Am I getting that right?

Merrill:

Well, yeah, the popularity of the good security hygiene, and you can also herd people away from doing the wrong things, of course. So we know that managers who are able to convey this effectively are more influential, especially if they convey that it's people like them. In some of my other research, we use social identity theory: you know, if I find that people who are similar to me are doing it, I'm much more likely to engage in those behaviors than if they're people whose peer group I don't consider myself a part of. And so if a manager were to go to the local loading dock and say, you know, all the other people on the loading dock are picking good passwords, that's more influential than citing some other group that you don't feel a part of, for example.

Craig:

So what made you settle on using an experiment?

Merrill:

Well, you know, we've done a lot of surveys, and surveys have their place. There are some advantages of surveys in terms of reaching a very large number of people; you can survey a thousand or more people and get a lot of data. But a survey typically ends with a question like, "What would you do in this circumstance?" It's what scientists call measuring behavioral intention. What we wanted to do was capture real behavior, and so we had to go out into the field to have people actually use the software. And furthermore, what we did is a two-stage experiment. We recruited these panelists for the first stage, where we collected data from them, and then we exposed the treatment group to these manipulations. And then we had them use the software for a week so that they had the experience. What's interesting is that all of them had an increase in what we call self-efficacy, their own belief that they can engage in the behavior effectively. There's a lot of research that suggests that if you have high self-efficacy, you're more likely to do something. In one of my other papers, we looked at hospital employees complying with HIPAA and other related regulations. And we found that if they observed other people, let's say you're watching another nurse properly log out of a screen before walking away, for example, you're more likely to engage in that HIPAA-compliant behavior, because you have more confidence, or self-efficacy, that you can do it properly as well. So we know that self-efficacy is a very important component of behavior. And what we wanted to do is measure their self-efficacy after really using this password manager, not just some hypothetical in a survey or reading a scenario. So again, scenarios and surveys have their place.
But we feel like our method of data collection from real users in the field, especially because it had this longitudinal component, created a more rigorous environment for testing the impact of popularity information.

Tom:

So our typical listener is going to be a manager in an important industry somewhere who's interested in applying scientific principles in the workplace. And in the paper, the notion of bounded rationality figures prominently. Help somebody from a lay perspective understand bounded rationality, and then describe the connection of the bounded rationality perspective to the herd behavior concept.

Merrill:

Well, I'll just kind of simplify it a little bit. But essentially, the previous research up until about the '50s or '60s really was based only on rationality. And then along came some key scientists who've won Nobel Prizes, saying that bounded rationality is really what humans do. And it essentially suggests this satisficing that I talked about earlier. So we know that people engage in a sort of quasi-rational approach to solving problems by applying rules of thumb; we call them heuristics. And we don't really come up with an optimal solution all the time. And the bounded part is the fact that we don't have all that information; you can only make decisions within the parameters of what you know. And yet we do pretty well with that. You know, we drive in cities we've never been in before, we make purchases we've never made before, we engage in all kinds of behavior that's, I guess, what a lay person would call good enough. If not optimal, it's good enough, and it solves the problem at hand. And there's much more that goes into the idea of bounded rationality, but simply put, we know that human beings in general can make good decisions without complete information. And that's really important, and that was why we looked at two factors in our study. One was this idea of uncertainty that I talked about earlier, and the other one was imitation. The fact is, when you feel you have some uncertainty, you're more likely to imitate others, which is a really interesting phenomenon if you think about it, and we do it in everyday life. You know, if you're on a highway and the traffic's all backed up, and you start seeing people taking some exit, you think, ah, maybe they know how to get around this traffic jam, and you start thinking about following them. So when you have uncertainty, you're more likely to engage in imitation.
And that's why a manager who can project this idea that other people are doing it can send a powerful message, even if that employee doesn't know those other people.
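
The satisficing shortcut Merrill describes can be sketched in a few lines of code. This is a toy illustration, not anything from the paper: an "optimizer" scores every option before choosing, while a "satisficer" takes the first option that clears an aspiration threshold (the heuristic Herbert Simon described). The option names and quality scores are made up.

```python
# Hypothetical choice set: (option name, quality score in [0, 1]).
options = [("A", 0.55), ("B", 0.72), ("C", 0.95), ("D", 0.80)]

def optimize(opts):
    """Full rationality: examine every option and pick the best one."""
    return max(opts, key=lambda o: o[1])[0]

def satisfice(opts, aspiration=0.7):
    """Bounded rationality: stop at the first 'good enough' option."""
    for name, quality in opts:
        if quality >= aspiration:
            return name
    return opts[-1][0]  # nothing cleared the bar; settle for the last one seen

print(optimize(options))   # best option overall, but required scoring all four
print(satisfice(options))  # good-enough option, found after only two looks
```

The satisficer needs less information and less search, which is exactly why it works when, as Merrill says, you can only decide within the parameters of what you know.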

Craig:

So I want to maybe jump ahead just a bit, and I'm sorry to maybe get us off track here. But you mentioned the idea of bringing this herd behavior to the forefront and making it visible to all the users, that other people are doing this, you know, other people are using password managers or complex passwords or whatever it might be. But didn't your studies show that there's a pretty short half-life to the effectiveness of that herd influence?

Merrill:

Yeah, that's a fair point. Once somebody has their own experience, the uncertainty is reduced, and then they're going to base their decision more on their own experiences. So in the real-world context, if you notice other people doing something and then you go ahead and do it, you will probably not be thinking about the herd after you've used it yourself; you're going to be more interested in your own experiences. Television advertising does a lot to try to convince us that other people are doing something. So you see people on TV you don't know, all, you know, laundering their clothes with the same detergent, or drinking the same beer, or some other behavior. But let's say you then tried that beer or that laundry detergent and it didn't work for you. Well, you're not gonna follow the herd anymore; you don't need the herd, you have your own experience. So it's true that a manager who uses this idea of herd behavior to get people to engage in some behavior, it has a half-life, as you mentioned; it has that impact for a period of time, but that influence is fleeting, it's only temporary. And what you have to do is herd people toward good behavior that will then be reinforced, behavior that the users, in this case the employees, feel has served their purposes, and then they're more likely to continue to use it. So that's similar to the idea of just herding people in any behavior, any activity. Once you get the herd of animals going one direction, it's pretty easy to keep them going that direction, because they're moving away from the threat. You know, I think the idea of a herd of animals is a really good analogy. Once that animal on the perimeter sees the lion and starts moving one direction, everybody else starts moving that direction; all the other gazelles or deer are moving that direction.
Clearly, there's no need to really think about turning around; it's working for you, there's no threat, you're running away from it, and you just keep going. So we didn't do an experiment where we herded people toward bad behavior; we herded them toward using a password manager, which our subjects found was a useful behavior. But like you said, self-efficacy kind of takes over after a while, and then the herd behavior, or the herd influence, is less important.

Tom:

So in the paper, one of the key experimental results was the effect of imitation on intention, and the predicted effect was really strong. Let's take that experimental effect and put it into a commonplace context: help a security practitioner figure out how to apply that imitation effect on intention.

Merrill:

Well, let's start with the idea that intention is the immediate precursor to behavior, right? So all the science in all kinds of disciplines, especially psychology, suggests that in terms of a deliberate, rational action, before you do something, your brain forms the intention to do it. So if I decide I'm gonna go ahead and take a different route to work and turn right here, I have the intention, followed by the behavior of doing it. So we know that intention has many antecedents, or factors that lead to it. And imitation is just one of these many antecedents, and we find that it's the strongest when there's uncertainty. So I'm not likely to imitate others when I feel like I know more than they do. Okay? So for example, let's say that I feel like I'm an expert at investing, and I see everyone else buying cryptocurrency. That doesn't mean I'm going to follow the herd, if I feel like I have greater knowledge than those idiots, I'm sorry, those people buying cryptocurrency. And so, you know, the imitation factor is really an interesting one, because it doesn't always work. So imitation is important if it's an environment where there may be some uncertainty. As an example, let's say that you have a professional, maybe they're an auditor, maybe they're a surgeon, maybe they're some other professional, and they find that there's some new method or technique that's come along that other people are using. You know, you might follow that herd, you might try it out, because you want to, you know, experience that procedure or technique yourself. But if you already feel like you know what you're doing, even though everyone else is doing it, you know, are you going to jump off that cliff? Also, you know, what's interesting is, what if the herd is wrong? Now, this brings up a really interesting point: the first mover is not always right.
And so we know that some people, once they start running with the herd, if you will, recognize that maybe the herd was wrong and it was a miscue. And then you see, for example, stocks getting sold off after they get run up. By the way, people in finance study herd behavior quite a bit, because it explains a little bit about bubbles and how they burst. But once you see that the herd is moving in one direction, there is sometimes a reckoning and, you know, a movement in the other direction, as people either question the wisdom of the herd, the wisdom of the crowd, or some other external factors influence that process. And that really explains a lot in terms of why some really big movements don't have many legs to them; they kind of fizzle out pretty quickly. The herd is not always right. But imitation is very powerful. Managers can use the tendency for employees to follow the herd to steer people toward good behaviors, but it's not a lasting effect.

Craig:

Well, and it seems like managers ought to be cautious about the flip side as well. And I know you didn't get into this in your paper, but you did mention this earlier: herding behaviors can be counterproductive. So in your study, you tried to encourage a sound security behavior. But you know, if we say, all right, Billy's sharing passwords, and I hear everybody's sharing passwords just to get work done more quickly, I'll share passwords too. Or everybody else is using simple passwords and using one password for everything, so why shouldn't I do that too? Seems like that could really work against security managers.

Merrill:

Yeah, we actually do talk about that in one of the paragraphs near the end, where we talk about it both at the individual level, meaning just a specific employee, and at the organizational level, meaning what companies do and other companies kind of following those herds. You know, oftentimes those initial herding instincts kick in, and everyone else follows that crowd, but then they learn that that's not the direction that is right for them. So there are these errors by first movers that we talk about that can lead to sending the wrong signal to late movers. And, you know, it's been identified for decades at the organization level, where you see your competitors all choosing a particular technology or system, whether it's, you know, cloud-based storage or some other IT, for example, and because everyone else is doing it, you do it as well. And then you find out that that's probably not the best solution. So we've seen many technologies come and go that were very popular for a brief period of time, but then, once they were put into use by a company, those companies' executives were disillusioned, for example, and quit using those technologies. So you know, the herd is not the right answer for everything. But if an IT manager knows that a particular technology is good, and in this case password managers are pretty widely recognized as being a good technology, especially for individual home users, then it's probably a good idea to think about influencing users to move in the direction of a good technology.

Tom:

So a key implication of your study, as you get into the discussion in the final portion of the paper, is that herd theory appears to cause individuals to discount their own information as uncertainty increases. Can you put this into context for the managers who are listening?

Merrill:

Yeah, so a lot of people think they know something. But when you begin to see that other people may know more than you do, then you discount your own information; you doubt yourself, to put it another way. So, you know, I'll go back to the restaurant example. You think there's probably this one restaurant that looked good, but once you get to that town and you realize nobody's going to that restaurant, maybe you discount your own information; you think other people must know more than I do. By the way, that reminds me of one of the famous sayings by Yogi Berra: "Nobody goes to that restaurant anymore. It's too crowded." So, you know, clearly imitating others goes hand in hand with discounting your own information. If I have high confidence in my information, and I have high self-efficacy, I'm probably not gonna follow the herd, because I think they're all wrong; I think I know more than the others. And that's what investors, I guess, would call the contrarian strategy. You know, when a stock is beaten down and everyone else is selling it, some people say that's the best time to buy it; you're getting it cheap. So if you believe you know more than the others, that's this concept of asymmetric information, right, where one company or one individual knows more than the other side. In warfare, that's a real key component of strategy: to have asymmetric information and know something your opponent doesn't know. In this case, the herd doesn't have any influence; the imitation is not going to be there if you don't have some uncertainty about your own information.

Tom:

So what's next, Merrill? You've done a lot of work in a lot of topics other than security, but security is what everybody knows you for these days in our circles. What's the next big idea coming down the pike in this direction of research?

Merrill:

I'm glad you asked. I think there are a lot of interesting avenues to go. One of the things that all of us in our field, as scientists looking at these behaviors in this context, know is that we need better measures. So one of the things that I'd like to mention briefly is the idea of using neurophysiological data collection. Neurophysiological data collection is the idea of using technologies that measure things that you can't really lie about or fake on a survey: things like eye movement, things like your EEG readings. I did one study where we put people in a functional magnetic resonance imaging, or fMRI, magnet, and exposed them to various treatments about security threats and security responses, and looked at what parts of the brain were activated when we exposed them to these kinds of stimuli. There's a lot of interesting research being done by a few people in our field, whom I won't mention for fear of leaving one of their names out, but essentially using some of these technologies like eye tracking, which includes initial gaze and gaze duration and all these interesting things, and then looking at hormone changes when you're exposed to a threat, you know, which can be measured. We're looking at EEG; we've done some work in that field. And the idea is that, you know, you can't really lie; you can't really manipulate your brainwaves. You know, if something pops up on the screen and your eye looks at it, that's reliable data. So that's one area: having a better understanding of these behaviors by looking at them in ways that are more valid and less likely to be flawed. Another thing we're looking at is all of these psychological processes that we are learning about from behavioral scientists and behavioral economists. So in one of my studies, with some colleagues, we are looking at the priming effect.
And priming is where certain words or concepts that you introduce into someone's brain will either subliminally or supraliminally influence their thinking. So for example, we have a little exercise that people do, where they form newspaper headlines out of a mix of words, and we can influence them to think more about threats; we can influence them to have a little higher level of threat awareness or fear. We can also influence people to have a higher degree of feeling of safety by playing this little game, and they may not realize that we've primed them, but it actually empirically changes the outcomes of the decisions they make following these priming exercises. We're also looking at the role of habit. We know that much of what we do, even if you cognitively think about something, is just following habits, in daily life as well as in security behavior. And it's difficult to change habit. But, you know, if, for example, every time you clicked on a link in an email, the seat you're sitting in gave you a little electrical shock, maybe you'd quit clicking links. Now, our ethics board won't let us do that, but we're designing some experiments to look at the influence of various things on habit. I'm also really interested in the results of some science around what's called prospect theory. And prospect theory suggests that the prospect of losing is actually more powerful than the prospect of winning. You've heard athletes sometimes say that, you know, losing hurts a lot more than winning feels good. And the idea that you might lose is actually more compelling than the goal to win. So we did a project that was funded by the National Science Foundation, where we had real employees who started with a zero balance, and then they received rewards every time they engaged in good cybersecurity behavior.
And then other groups started with a positive balance, and they lost money every time they, for example, fell for a phishing attack or engaged in other bad cybersecurity behavior. And prospect theory held: the idea of losing money was a bigger motivator than the idea of gaining some money. So these are just some of the things that we're looking at. I think what's really fun about all these things at this stage of my career is that I'm beginning to question everything that I did in the past. And that's really kind of interesting. I'm actually at a point now where I can criticize Warkentin's early work, you know, because it was all based on this rational idea that people thought through everything they do. That's somehow really rewarding for me, to criticize my earlier work. I don't know why.
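
The "losing hurts more than winning feels good" asymmetry Merrill describes has a standard formalization: the Kahneman-Tversky prospect-theory value function. Below is a minimal sketch using their commonly cited parameter estimates (alpha ≈ 0.88, lambda ≈ 2.25); the dollar amounts are arbitrary, and this illustrates the theory itself, not the design of the NSF-funded study.

```python
ALPHA = 0.88   # diminishing sensitivity to larger amounts
LAMBDA = 2.25  # loss aversion: losses loom roughly 2.25x larger than gains

def value(x):
    """Subjective value of a monetary gain (x > 0) or loss (x < 0)."""
    if x >= 0:
        return x ** ALPHA
    return -LAMBDA * ((-x) ** ALPHA)

gain, loss = value(50), value(-50)
print(f"value of +$50 = {gain:.1f}")
print(f"value of -$50 = {loss:.1f}")
print(f"|loss| / gain = {abs(loss) / gain:.2f}")  # ~2.25: losing hurts more
```

This is why starting employees with a balance they can lose is a stronger motivator than letting them earn the same amounts from zero: the same dollar movement sits on the steeper, loss side of the curve.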

Tom:

But your earlier work, back when we first crossed paths, was your work on spyware, when I did that issue of Communications of the ACM. And that seemed reasonable enough at the time. You haven't refuted any of that yet, have you?

Merrill:

Well, I wouldn't say any of it's refuted, but we are looking at the edges of how these theories apply, and we're looking at how far the theories hold. The science is still valid, you know, in some part of the range of behaviors, but we know that there's more to it than that; we know that we can't explain everything. And so that really kind of motivates me and others to push the boundaries of the science and try to understand what's really going on, because not everything is a clear, binary kind of description of the events that people engage in. And those are the fun things to look at in terms of designing an interesting scientific experiment: not the obvious, middle-of-the-road cases, but what are people doing at the edges?

Craig:

Merrill, what are three or four things that a manager could put into practice based on what they've learned about your work, and perhaps after having read your article that we're talking about today?

Merrill:

Well, I think first and foremost, it's important to recognize that employees don't always follow the training they've received. So if you think you've solved the problem by implementing some training program, question yourself; question whether or not the training is always going to be effective, because human beings are subject to a variety of these, you know, cognitive and behavioral processes, sometimes even following emotions and affect, another area of my research we haven't talked about much. When emotions are at play, when people are engaged in sort of abnormal behavior, deviant behavior, it's called, you know, where they're sort of violating the norms, it's important to recognize that not everything is straightforward. So number one, question training. Don't think that just because you've trained employees, they're going to follow those rules. And it's not that employees are bad; there are just so many other forces acting on them. And then more specifically related to herd behavior.
What our findings reveal is that you can successfully move people toward the right behaviors, at least initially; it may not last. But if you can use these techniques to talk about what others are doing, to kind of create that psychological nudge about what other people are doing, you know, what we call that popularity information, you can move people in the right direction. You can say that the majority of the other employees of this company are following these guidelines, for example. And especially in contexts where there might be some uncertainty, where they don't have their own experience yet, maybe a new technology, or maybe something like a new threat, new kinds of email attacks that you're receiving, new phishing attacks, if you can immediately introduce a nudge, you know, some kind of messaging, it doesn't have to be an email, it could be all kinds of different forms of persuasive communications. If you can influence employees to think about what other people are doing, other people who are like them, and other people who may know more than they know about the topic, the subject, the threat, whatever it is, then you can move many of them. You won't move everyone, but you can nudge your employees in the right direction to not click on those links, or whatever the behavior might be.

Craig:

Dr. Warkentin, thanks for joining us today on the Cyber Ways podcast. You and Dr. Vedadi have done some interesting, important research here and we're glad you were able to share your insights with our audience.

Merrill:

Well, thank you very much, Craig and Tom, I appreciate the opportunity to talk about my research with your audience. And I look forward to your future podcast. I think it's an exciting opportunity.

Tom:

This has been Cyber Ways, a production of the Louisiana Tech Center for Information Assurance. Join us next time, and tell your friends.

Craig:

Also, if you have any suggestions about topics or papers for future episodes, please let us know. You can email us at stafford@latech.edu, that's S-T-A-F-F-O-R-D, or vanslyke@latech.edu, that's V-A-N-S-L-Y-K-E. We'd love to hear from you. Next time we'll discuss Tom's paper on computer security complacency. That is not easy to say: computer security complacency.

Tom:

And I wanted to point out, too, that when Merrill was talking about neurocognitive research, my paper really is the result of a neurocognitive study gone bad. I had a prime, it was robust, I manipulation-checked it. I mean, this is scientific talk and everything. My results didn't come out the way I thought, and I had to go figure out why. And out of that came the biggest surprise, and that's what I wrote about. So we'll enjoy talking about that next time. Find it wherever your podcasts are acquired from.

Craig:

Alright, thank you for joining us. We'll see you all next time.

Tom:

And it is important to say that the Cyber Ways Podcast is funded through the Just Business grant program at the Louisiana Tech College of Business. And we're grateful for that.

Craig:

So join us next time on the Cyber Ways Podcast which is available on all major podcast platforms. We want you to subscribe or follow or whatever button your favorite podcast app has. Thank you very much.