
Interview with Trend Micro’s VP of Cybersecurity: Greg Young

Last month we had the chance to sit and chat with Greg Young, global Vice President for Cybersecurity at Trend Micro, a multi-national $2 billion security product company. Greg is an authority in the cybersecurity space with 34 years of experience.

See our interview with Greg or read the transcript of it below, and please leave a comment with your questions or opinions – maybe we can address them in our next one!

Greg Young – VP Cybersecurity at Trend Micro

Greg was at Gartner for 14 years, authoring 20+ Magic Quadrants, serving as Research Vice President and lead analyst for network security, threat trends, and cryptography, as well as conference chair for numerous Gartner Security Summits.

He was also a pioneer in cybersecurity for Canada, being awarded the Confederation Medal from the Governor General of Canada for his work with smart card security. He was CISO for the federal Department of Communications, headed multiple large security consulting practices, and was a commissioned officer in the military police focusing on counter-intelligence and computer and technical security.

In addition to his work with Trend Micro, Greg is currently the industry co-chair for the Canadian Forum for Digital Infrastructure Resilience (CFDIR), a pro-bono advisor and member of the CyberSecurity Working Group appointed by the government of Barbados, and a fantastic co-host on The Real Cybersecurity Podcast.

Transcript of Interview

[00:00:00] Nathan: We have Greg Young, global Vice President of Cybersecurity for Trend Micro, a multinational cybersecurity product company. He’s headed several large security consulting practices throughout his career, was the Chief Information Security Officer for the federal Department of Communications, and was with Gartner for 14 years in various capacities, including Research Vice President and lead analyst in several cybersecurity verticals. Greg, welcome.

[00:00:29] Greg: Hey Nathan, really great to talk to you today.

[00:00:32] Nathan: You as well. Um, can we start by just getting a little bit about your background and, and maybe what drew you to the field of cybersecurity to start with?

[00:00:39] Greg: Um, it’s my 34th year in cybersecurity. It’s a frightening number, but, um, it was all by accident that, um, you know, I, uh, had some computer science classes.

[00:00:50] The military found out about it, and, uh, they immediately put me into securing these new things that they had, called, uh, computers. And, uh, from there forward, I just, uh, you know, being a technology lover, I just fell in with it. And it was an important mission; you know, I’d started out wanting to be a policeman, uh, so it combined these two things together.

[00:01:08] And, uh, the rest was history, you know: uh, military, federal government, private sector. Um, and here we find ourselves today, and, uh, we still haven’t solved it all. So shame on us.

[00:01:20] Nathan: Well, I’ve appreciated your work for 34 years. That’s fantastic. I mean, you know, how is the arms race in cybersecurity different now than it was, let’s say, 30 years ago?

[00:01:31] Greg: Uh, radically different. Well, actually, there’s been one sort of, you know, circling back. It started out that, uh, you know, sort of the state-sponsored stuff was a big concern, that we were very worried, you know, that the bad Soviets were going to steal our information, and rightfully so.

[00:01:47] And that was, you know, the major threat at the time, but it really quickly evolved as the internet expanded into individuals. But now we’ve swung back again, now that it’s very well funded, you know, individuals, but also state sponsored groups are a big concern as well on one end of sort of the, you know, the bell curve.

[00:02:04] But that side of the bell curve is expanding. So it’s really changed. I think the biggest change also has been that the malware has changed. It used to be, you know, one sort of attack, one infection, and that was done, but now it’s multiple chains, reconnaissance. The complexity of the attacks, the expansion of them, and the patience of the groups is just so different than it used to be.

[00:02:22] So it’s gotten much more complex.

[00:02:25] Nathan: Have we, uh, innovated, uh, in any spaces to accommodate that? Like, what have the changes looked like? What are the focuses now when it comes to cybersecurity protection?

[00:02:36] Greg: Yeah, I think, you know, frankly, the industry didn’t do a great job in the last sort of, you know, five, six years in some ways.

[00:02:42] Uh, we had some great sort of periods of advancement; we’d get ahead of the bad guys. I think we were a bit behind in some ways. Uh, but there’s some really cool things happening right now, which I’m encouraged about personally. So one of them is gathering more telemetry, more information, more signals, and making better use of it.

[00:02:57] We’ve kind of made promises about that in the past, but haven’t done it. So there’s technologies they would call, like, XDR, or extended detection and response. What it’s doing is gathering a whole bunch of information, uh, from non-traditional sources and using that to make those sort of dots that we can connect.

[00:03:12] So instead of just sort of the front door alarm in your house, now we’re gathering other signals like, you know, heat sensors or temperature changes or noise monitors. Think of like all these other things we could have sensors about, which could give us an indication the bad guys are getting in. So that’s one, um, I think also, uh, making products work better together.

[00:03:32] This sort of vendor-silo business, that’s just been terrible for everybody involved, so: having stuff communicate better, and also individual products working better with whole portfolios. So we talk about platforms, we talk about that, and new architectures. We’ve had the same architectures for, you know, almost 30 years, and those are starting to change now.

[00:03:52] So it’s all good news, I think.

[00:03:55] Nathan: Excellent. So I think you touched on some of these, uh, trends that are happening today in cybersecurity. What are they, you know, what are they serving to help protect, um, in particular, when we’re talking about these trends?

[00:04:10] Greg: What we’re trying to solve now is very different.

[00:04:13] So it used to be kind of spot the bad thing, kill the bad thing, that whack-a-mole game. Now, I think the game has changed: we know we’re gonna find things, it’s just how fast can we find them. So lowering that, that time to detection, time to response. Uh, and I think also just knowing where to look, even just knowing what our attack surface is.

[00:04:30] What sort of machines do we have, what machines do we have that we need to protect? Those are two massive tasks that have turned out to be some of the hardest things.

[00:04:40] Nathan: So are we talking about, um, kind of corporate-level, uh, industry-level, organization-level protection, or, um, are we also talking about, um, you know, individuals or communities, you know, kind of government initiatives or things like that?

[00:04:55] Greg: Yeah, both fronts. I think most of the effort required individually right now is just for individual organizations to know what they have, uh, being able to spot stuff, get better telemetry and faster time to response. That’s kind of the, you know, the individual or local level. Uh, but then going beyond that, there’s great things happening.

[00:05:11] Like, for example, an organization called MITRE, um, with the MITRE ATT&CK framework, is helping create more repeatability, or giving us sort of a, a playbook: when we see some really advanced attacks happening, you get an earlier sort of clue about it and can say, yes, this is, you know, the East European Fancy Bear group that is doing this. Um, it makes for, you know, faster time to detect.

[00:05:32] So there’s some sort of intergroup sharing, but I think that’s probably the next generation of advancement to be made. Fantastic.

[00:05:40] Nathan: And so again, thinking about, you know, when, um, you know, whether it’s a, whether it’s a small business or a large organization, when should they be concerned about cyber security?

[00:05:51] Is there a specific time? Is it a specific type of a small business or a large organization? Uh, what are your thoughts around that?

[00:05:59] Greg: Uh, the day you start your business, you have to, unfortunately, be concerned. It used to be the case that only sort of big organizations used to get targeted, or if you had a lot of money, but, uh, folks don’t care now. Uh, you know, the baddies don’t care. Uh, one example was, we put up a, a fake factory.

[00:06:13] Uh, we even used real equipment. We didn’t just try to virtualize it and pretend, ’cause they can spot that. So we put up actual equipment that looks like a factory and analyzed who was attacking it. We made it vulnerable. Um, there were two groups: there were the targeted ones, who would say, yes, this is a factory, we know what we’re doing.

[00:06:28] But, uh, close to the majority were just drive-by people who said, hey, a Windows machine, we’re gonna try to ransom it. So what that indicates to me is that, um, you know, uh, even if you’re a non-profit organization, even if you have two machines, sadly, um, you know, you’re a potential target.

[00:06:47] Nathan: So what are the threats? And you, you mentioned ransomware. What are the threats that they should be most concerned about? And, uh, and what can they do about it?

[00:06:53] Greg: The biggest ones today, I think: number one is phishing, or, you know, folks trying to get you through email, because it turns out that over 90% of attacks have an email vector involved somewhere in them.

[00:07:04] Uh, and email is cheap. Uh, phishing is easy, and we see that every day, as you get, you know, uh, false attempts to either steal your credentials, get you to pay money to somewhere you shouldn’t, et cetera. So that’s, I think, number one. Number two, ransomware, is always a result of this: you know, if you have any vulnerability, the bad guys can do two things.

[00:07:21] They can use that vulnerability to install ransomware or other malware, or use you as a, a platform to do other kinds of attacks. So I think that’s the two big things. And if you take the three steps of, you know, uh, back your stuff up, uh, patch what you have, and then, um, you know, try to have, uh, you know, some level of other visibility,

[00:07:42] I think that’s, that’s the three steps. I think they can protect most businesses.

[00:07:46] Nathan: Interesting. I’ve also heard a lot about, uh, kind of the psychological plays that come in here, kind of the social engineering that comes with some of these threats. And so, thinking back at the individual level: I mean, a small business, or a large business, at the end of the day constitutes a bunch of individuals, um, that might be targeted.

[00:08:02] So, you know, how important is this kind of psychology behind it, and, you know, is there anything that can be done to, to prevent breaches on that front?

[00:08:14] Greg: Yeah, there’s two ways to do that. So one, in fact, it’s interesting you bring that up, because this social engineering is a whole new sort of slice of the pie that’s expanded lately, because if you can’t get through the technology, um, you know, you use the phone to fool people. And there’s some great people who do that, uh, you know; you especially see ’em on the conference circuit, and they do a great job of it.

[00:08:32] Um, but beyond that, what they’re trying to do is then get you to do something, such as install something or send some information. And that second step is where technology can help. So you can only educate people so much, I believe; I think it can be overdone as well. But beyond that, when mistakes inevitably happen, what are you gonna do about it?

[00:08:49] So this is why I’m excited about things like XDR, because we can get clues from things, saying: if you see unusual activity coming from somewhere, even if it’s not malware, that can be a clue that there’s something going on, uh, in the infrastructure.

[00:09:02] Nathan: Can you talk a bit more about XDR? You mentioned it earlier. You know, how does this work for, um, you know, from individuals to, to, uh, organizations?

[00:09:11] How, how does it work, practically speaking?

[00:09:14] Greg: Yeah. Um, you know, I think XDR is kind of for a company, not so much for an individual, but the principles stay the same. Um, historically in security, we’ve had this sort of situation where, you know, you get a few clues, but they’re really good clues.

[00:09:29] So, you know, uh, the analogy would be, uh, you know, we have a picture of somebody stealing the bar of gold. Uh, that is an indisputable sort of thing; that’s a cause for concern. But instead, it turns out that the bad guys, you know, know where the cameras are, they know where the sensors are, so they’re gonna take other measures.

[00:09:45] So what other clues can we get? Um, we can get a lot of clues, it turns out, but they’re not gonna be sort of 100%. Together, though, they will form a compelling sort of warning to us that, you know, uh, there’s stuff going on that we need to be concerned about. Uh, and that takes a lot of data, but we have the capability now to store large amounts of data.

[00:10:05] And it turns out that, for the detections we can find, um, you know, we don’t need that picture of the gold theft. We know that when we see, you know, that front door open, and it’s a person we’ve never seen before, and they’re carrying a bag, we get all these clues. And, um, you know, we can have earlier time to detection with this, uh, which is, again, crucial before damage is done.
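The weak-signals idea Greg describes can be sketched in a few lines. This is a toy illustration of the XDR principle only, not any vendor’s implementation; the signal names, weights, and threshold are all invented for the example.

```python
# Toy XDR-style correlation: no single signal is conclusive on its own,
# but a weighted combination crossing a threshold warrants investigation.
# All signal names and weights here are illustrative assumptions.
SIGNAL_WEIGHTS = {
    "unusual_login_hour": 0.3,      # the "noise monitor"
    "new_process_on_host": 0.2,     # the "temperature change"
    "large_outbound_transfer": 0.4,
    "disabled_av_agent": 0.5,
}

ALERT_THRESHOLD = 0.8  # combined score at or above this raises an alert

def correlate(signals):
    """Combine weak telemetry signals from many sources into one score."""
    score = sum(SIGNAL_WEIGHTS.get(s, 0.0) for s in signals)
    return score, score >= ALERT_THRESHOLD
```

A single odd login proves nothing on its own, but an odd login plus a large outbound transfer plus a disabled agent crosses the line together, which is the “connecting the dots” Greg refers to.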

[00:10:25] Nathan: Fantastic. Yeah. And so, combined with that, I suppose I wanted to take the question again towards, uh, kind of the psychology of the individuals and how that might play, uh, together with that, um, you know, to really tackle any kind of data breach that might happen, um, for an organization. So there’s that preemptive,

[00:10:48] you know, fast detection, and then there’s the kind of educating, uh, your employees or your company’s staff. Are there practices like this? Like, are there gold standards, or places that you know of that are kind of embracing all these angles, um, that could be, uh, either mimicked or, yeah.

[00:11:09] Greg: Yeah, absolutely.

[00:11:10] Uh, you know, the, the education part is, uh, you know, done in the best organizations. So, you know, you’ll see sort of, uh, you know, uh, phishing-education campaigns. So, uh, you know, something will be sent out to employees, uh, and then, uh, you know, instead of naming and shaming individuals, you publish a rounded total and say, hey, you know, like 30% of the people clicked on this,

[00:11:31] when, uh, you know, it was obviously from the outside and it was, you know, uh, obviously, you know, asking you to put in credentials. Um, uh, and you track that over time. And I think the people won’t necessarily change so much; maybe some of them will change a little bit, but not enough that technology can’t help.

[00:11:51] Right. There are also times when there’s an event that happens, or a special, uh, sort of event. Like, for example, you know, there was a telco outage and there was talk of refunds; immediately, there were emails sent out by, by phishing and spam operators to try to get people to claim their payback, you know, claim their refunds.

[00:12:07] Um, so again, when people are stressed, you know, educate them that there are times of stress, times of, you know, um, uh, personal, personal issues, or whatever the case is. That’s when they’re most vulnerable.
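The aggregate-not-individual reporting Greg recommends is simple to operationalize: record only whether each simulated phishing message was clicked, then publish a campaign-level percentage. A minimal sketch; the data shape here is an assumption, not from any particular awareness-training product.

```python
def campaign_click_rate(results):
    """Reduce one phishing-simulation run to a single anonymous percentage.

    `results` maps an opaque recipient id to True if that person clicked.
    Only the aggregate is published, so no individual is named and shamed.
    """
    if not results:
        return 0.0
    clicked = sum(1 for hit in results.values() if hit)
    return round(100.0 * clicked / len(results), 1)

# Example run: 3 of 10 recipients clicked the simulated lure.
run = {"u01": True, "u02": False, "u03": True, "u04": False, "u05": False,
       "u06": False, "u07": True, "u08": False, "u09": False, "u10": False}
rate = campaign_click_rate(run)  # 30.0
```

Keeping a list of these rates across campaigns gives the over-time trend he mentions tracking.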

[00:12:19] Nathan: To expand on that: are there individuals that are most vulnerable to these types of cybersecurity attacks?

[00:12:25] Greg: Who has been the most valuable? It has been executives, right? So if I’m trying to do what we call a business email compromise, or BEC attack, and I’m trying to get somebody to, you know, uh, send me an invoice payment, send me, you know, a million dollars, um, that is, you know, great sort of whale phishing, as it’s also called. Uh, you’re going to, uh, get an executive to do that.

[00:12:46] And it does happen. There have been high-profile stories within Canada, and all around the world as well, where this happens. So we have specific tools for that, even writing-analysis tools that will say, hey, look, you know, this invoice request from your CEO is fake, uh, Mr. CFO; that’s not them.

[00:13:04] Um, so that was the target, but now it’s shifted one level down. The bad guys realize that, yes, the extra scrutiny that’s put on CFOs and CEOs is there, so let’s go down to their directors. So, in fact, just recently we saw a shift: they’re going after the people who report to the people who sign the checks now.
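The “writing analysis” Greg mentions can be illustrated with a deliberately naive stylometry check: build a character-trigram profile from a sender’s known messages and compare a new message against it with cosine similarity. Real BEC-detection products use far richer models; the helper names and threshold here are invented for the sketch.

```python
import math
from collections import Counter

def char_trigrams(text):
    """Character-trigram frequency profile of a message."""
    t = text.lower()
    return Counter(t[i:i + 3] for i in range(len(t) - 2))

def cosine(a, b):
    """Cosine similarity between two frequency profiles."""
    dot = sum(a[g] * b[g] for g in set(a) & set(b))
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def looks_like_author(baseline_texts, new_text, threshold=0.5):
    """Flag whether a new message's style matches the sender's baseline."""
    baseline = Counter()
    for t in baseline_texts:
        baseline.update(char_trigrams(t))
    return cosine(baseline, char_trigrams(new_text)) >= threshold
```

An invoice request whose profile diverges sharply from the CEO’s past mail would score low and get flagged for the CFO to double-check.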

[00:13:22] Nathan: Incredible. It continues. So, yeah, to switch gears a little bit: is there a difference between data breaches and, uh, and data privacy?

[00:13:33] Greg: Um, that’s a really good question. Um, I think the similarity is that data that’s sensitive for business purposes and data that’s sensitive for personal privacy purposes end up being the same data.

[00:13:47] But we treat it differently; I think you have to treat it differently, um, because, uh, even, you know, a CEO should not have any sort of access to individuals’ private information in the company. Um, but there’s a real change, I think, in how we’re viewing privacy, and there’s some really thorny issues about that.

[00:14:09] So for data privacy now, one of our greatest concerns is lawful, um, uh, lawful intercept, or lawful disclosure. So many corporations operate sort of multinationally now. So within, you know, the US or Canada, if the authorities come and they have a warrant, you have to, you know, cough up the information legally, unless you believe it’s a, a bad warrant.

[00:14:30] But generally we cooperate with law enforcement. In some countries, though, um, you know, the local law enforcement may be the bad guys. So how do you, uh, deal with that? Uh, so that’s as much of an issue, I think, as having the information breached; in fact, it’s probably worse, because I can disclose if I’ve had a data breach of private information, but if the authorities go after it, I may be legally bound not to tell you that there was a warrant and they got this information.

[00:15:00] That’s a really, really tricky part of the business right now. Yeah.

[00:15:04] Nathan: It makes you think also a little bit about when the cybersecurity infrastructure is established for an organization or a company: are those kinds of ethical questions, in terms of, like, your privacy or your freedoms, thought of at all, either on the organizational level, when they’re establishing a cybersecurity infrastructure, or even, uh, when cybersecurity product companies are developing their technology to protect? Is that considered at all?

[00:15:33] Is that a phase of, of the development of any product?

[00:15:37] Greg: Yeah. In, in my last job, you know, with Gartner, I interacted with, you know, almost every cybersecurity company in the world, at least the larger ones, and I really divided them into two camps. There were those that definitely took the view of, hey, look, you know, we’re gathering information through the security process that could be, uh, harmful to individuals;

[00:15:56] um, we, we should take an ethical path there. And then there were those that, in the race to, you know, IPO and funding, uh, never put a moment’s thought to that. Uh, it’s all about the money. Uh, and that’s, in my personal opinion, one of the, you know, the downsides of the Bay Area sort of ethos of IT and cybersecurity: not matching the ethical concerns with the business concerns.

[00:16:21] Um, now, that being said, the biggest companies, I think, actually do a great job with that, when they’re doing, for example, you know, packet captures. Uh, well, here’s a great example. Um, with some of the, uh, information gathering you can do on the wire, you can exclude things such as, um, credit card numbers, or, uh, anything going to a health provider.

[00:16:43] So you can do a lot of, you know, security information gathering, but exclude things that are uniquely private and maybe shouldn’t be gathered, or at least give the capability of doing that, rather than just, you know, scooping up everything. So I think it’s there. Uh, I would like to see more legislation around that, for sure.
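The exclude-at-capture idea can be sketched as a filter applied before anything is written to storage. The excluded-domain list and the card-number pattern below are simplified assumptions for illustration; a real implementation would parse protocols properly and Luhn-validate candidate card numbers rather than rely on a crude regex.

```python
import re

# Hypothetical policy: traffic to these destinations is never stored at all.
EXCLUDED_DOMAINS = {"clinic.example-health.org"}

# Crude pattern for 13-16 digit card-like runs, allowing spaces or hyphens.
CARD_RE = re.compile(r"\b(?:\d[ -]?){12,15}\d\b")

def sanitize_capture(dst_domain, payload):
    """Return a storable payload with card numbers masked; None means drop."""
    if dst_domain in EXCLUDED_DOMAINS:
        return None  # uniquely private: not gathered in the first place
    return CARD_RE.sub("[REDACTED-PAN]", payload)
```

The security-relevant signal (who connected where, what was transferred) survives, while the uniquely private content either never lands on disk or lands masked.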

[00:17:00] Nathan: Are there any governments that you know of that have been trying to, uh, legislate that process? You know, GDPR, I know, is one step that Europe has been trying to take for that. But, um, do we have any other examples of that working, and kind of where it might be going?

[00:17:17] Greg: I think the compliance regimes we have now, GDPR, uh, you know, HIPAA, and, uh, uh, you know, PIPEDA, these are great compliance sort of regimes as the goal, but I think we’re often missing the technology to implement them. So you can have, you know, good rules there, but what are the enabling technologies?

[00:17:38] So this ability to exclude information, or flag it, um, or even track or tag that information, that’s always been an area that’s still emerging. So I think it would be a benefit to have, um, uh, at least maybe some standards, uh, if legislation isn’t going to tackle it: to say that, you know, in products there could be some kind of privacy seal of approval,

[00:18:02] if you have those capabilities.

[00:18:04] Nathan: Absolutely. Well, it’s comforting to hear, at least, that there has been some thought around it and they are taking action; at least some businesses are. It’s good to, it’s good to know. And the legislation, PIPEDA in Canada specifically, I mean, it’s good to know that it’s moving in that direction. But with, you know, tracking technologies and whatnot, I mean, how far away do you think we are until something like,

[00:18:24] you know, efforts like that can, can align?

[00:18:27] Greg: Uh, I think, I think, as I said at the start, the direction right now is encouraging. So with this, uh, you know, new Bill C-26, uh, you know, talking about, um, you know, mandatory disclosures, uh, for critical infrastructure providers: I think that’s fantastic.

[00:18:42] I’ve always been a fan of mandatory disclosure, not to the public necessarily, but at least to, you know, uh, security authorities. If our, you know, infrastructure players are all getting hit by something: one, they should tell somebody, you know, call the police; number two, our, our cyber defenders at the federal level can at least, um, you know, have, have communication going there, so that we don’t just expect them to spot it when we’re not helping them find it.

[00:19:09] Obviously, you know, part of the legislation is also to make sure that’s done in a, you know, a way that protects the companies’ sort of, you know, intellectual property. Uh, but, you know, I think with so much interconnected now, and it’s getting more interconnected, and we’re also getting surprising interconnections

[00:19:24] we didn’t know we had, um, that’s, uh, I, you know, I think mandatory disclosure, especially for these key organizations: yeah, it has to happen.

[00:19:34] Nathan: Fantastic. And so, you know, why is data privacy, um, so important? Um, again, I mean, it might be a bit more obvious on a, on a business level, but for individuals themselves: why should, uh, why should people care about data privacy, going beyond the obvious, you know, financial or credit card information being leaked sort of thing?

[00:19:55] Greg: I think, in this AI and machine learning

[00:19:58] sort of era that we’re entering, uh, the more information you have disclosed about you, the more potential negative impacts you could have. So I think for a slice of the population, that’s of no consequence, but, uh, you know, when we see potential small indicators for, you know, some certain health information, uh, or for, um, you know, even if you’re a journalist, uh, you know, even having some minor, uh, amounts of your information disclosed, uh, that could lead to the loss of life for a source in, you know, many cases.

[00:20:27] Um, you know, or get you denied for health coverage or auto insurance coverage or something, just based on data which may or may not be correct, or which maybe you don’t wanna share with people. It could affect your education; it could affect, um, you know, and if the wrong groups are gathering it, too, uh, it could lead to, you know, uh, you being, um, you know, exploited, or, especially for young people, uh, it could lead to, you know, some, uh, some exploitation. You know, that could, that could be just awful.

[00:20:55] So, um, we’re all responsible for our own privacy in one way. But, um, I, I think that, um, you know, we’re tracked so much now, and there’s so much information available, and we have the tools now to, to link that information together for bad purposes, not, not just good purposes. Yeah, it’s increasingly concerning.

[00:21:12] Nathan: Yeah. I mean, we were just talking about this among my colleagues, how easy it is now. When you wanna log into any kind of app, you know, if you’re using TikTok, or if you’re using, like, these beauty apps that, you know, track your face, you immediately get the “do you wanna log in using your so-and-so credentials from Facebook or from Google,” and they immediately get access to a bunch of your information that you’ve provided to Facebook, for example. And we often click on that, um, because it’s easier, right,

[00:21:41] um, to just click on that. But then they get a wealth of information there. The conversation is always, you know, how much does that matter? What are they gonna do with it? Um, is it just gonna lead to, you know, if it’s a beauty-care app, am I just gonna get better targeted ads? Uh, you know, are they gonna use my data to, you know, train some AI model, to develop some, some new tech that might actually be useful?

[00:22:03] Or, you know, what’s the rabbit hole? What’s the path that could lead to something, um, more destructive? I mean, you touched on it a little bit, but do you think that’s commonplace, or is there more risk involved?

[00:22:17] Greg: I think there’s more risk and it’s different for individuals, especially the most vulnerable.

[00:22:20] I think of somebody who may be transitioning, and that information, or, you know, older information, being out there that they may, you know, not want to have public. Uh, I also think of, uh, you know, for example, I’ve never done one of the, you know, uh, genetic tests for genealogy purposes, the DNA ones, because, uh, I am, you know, possibly disclosing information that all of my aunts, you know, my future sort of, um, you know, children, grandchildren, et cetera, have to live with the impact of.

[00:22:48] So, um, it’s, it is, uh, it is challenging. Uh, I really like some of the European rules that are, that are in place to really take a stronger hand in this. Um, but, you know, I think it’s caveat emptor right now: you know, the more information that you’re giving out, and, you know, if you’re clicking through some of these accepts, um, you know, you’re getting these apps for free for a reason. Um, and that’s because you’re, you’re, you’re paying with your privacy.

[00:23:12] Nathan: Yeah, absolutely. Talking about the genetic information, it just makes me think: how much of this cybersecurity, you know, what can be put in place, regulation-wise, to allow companies, um, you know, genetics companies, uh, like 23andMe, for example, to, to still be able to advance the science, um, that they hope to advance by gaining this data faster than, say, a research group at a university might be able to gather it? What kind of, you know, security measures could be put in place,

[00:23:45] do you think, uh, or regulation measures, to still let people feel comfortable enough to share that information, um, uh, to allow the science to grow and expand faster, um, uh, without feeling threatened by, you know, “they have all my genetic information, all my

[00:24:01] DNA.”

Greg: Yeah, I think there have to be sort of legacy mechanisms in place, that when companies are sold, or data is sold, uh, you know, the protections carry on with it.

[00:24:11] So, for example, if I, you know, if I have a, you know, a genetic company and it’s bought by another company in another jurisdiction, the approval for that sale could be contingent on, uh, you know, protecting the information of those citizens. Um, so, uh, you know, I think there has to be something more around the approval of the sale,

[00:24:30] uh, and, uh, also the end use that it’s gonna be put to. So it would be very easy for, you know, even a cord-blood company, for example, to be bought by somebody in another area, and there’s incredible amounts of information that could be gleaned there. Yes, possibly for fantastic medical benefits, but, um, you know, uh, you know, cord blood has a lot in it.

[00:24:50] So, uh, not only DNA results, but other, other possible medical information or personal information. So, yeah, it’s a really thorny field.

[00:24:59] Nathan: Yeah. Yeah, absolutely. Well, I hope we navigate it well.

[00:25:06] Greg: In that scenario, what I think we’re losing out on, actually, is the privacy front. Uh, where we’re doing well in security, with privacy we seem to be losing ground so, so quickly right now. It’s, it’s a bit discouraging.

[00:25:15] Nathan: Just the, the acceleration of the types of apps and technology that’s coming out, just AI algorithms, uh, people training different types of AI models, and all the different apps that accompany that. And then it’s kind of like the wild west right now, where all these things kind of come out and, and everybody’s just using them.

[00:25:31] Um, do you think that, uh, in the future of cybersecurity, um, there’s gonna be an extra effort on that front, uh, when it comes to, to privacy? Or, or what do you think the future holds when it comes to, um, cybersecurity in general? What kind of threats, uh, what kind of innovations, where are we focusing?

[00:25:50] Greg: I think there’s a big change coming.

[00:25:51] So I’ve talked about some of the advances we’ve seen and the positive things. Um, you know, I think we’re entering this era of platforms now. We have all these sort of disparate pieces, and they don’t work well together, but that, that I think is where the platforms will fix this. Uh, today we have, you know, APIs for how we, you know, get information from things.

[00:26:09] I think the two-way API is really the future in cybersecurity, where we're going to be able to get information but also send more information back. And the richer those are, I think the better people will be protected. That, I think, is going to be the next sort of competitive ground.

[00:26:26] Coming along after XDR and better telemetry is how we share information, sharing the telemetry we have in much richer ways. Right now it's really basic. It's almost like Morse code between some products, and one-way: I can receive a signal, but the ability to send a signal back saying, "Hey, stop that connection,"

[00:26:43] or "Reimage that endpoint," is still quite limited. So I think that's really encouraging, as is better use of AI and machine learning to help make faster, better decisions, because we're so short of qualified, skilled people in the field. On the privacy front, though, as you brought up, that's probably the area that almost needs to be a separate market. It's part of cybersecurity, but it seems like there's so little value put on it.

[00:27:08] It's just remarkable. Even when the legislation is there, there's little push for it. As you mentioned, there are a few good compliance regimes, but I think that putting greater value on privacy has to be a future advancement.

[00:27:26] Nathan: Great. And just to talk about blockchain and cryptocurrency, because it's all the buzz these days: where do you think blockchain plays a role in cybersecurity?

[00:27:37] Greg: We're in a really bad spot right now, because this has been garbage from day one. I'm an old cryptographer; I spent a lot of time with military cryptography, and then in the PKI era I worked with a lot of the technology there. I smelled a rat, and we've seen the rats now. In fact, the blockchain itself is awesome.

[00:27:58] It's a fantastic, solid technology. But it's much like using an armored car to transport checks written in crayon to people without addresses, randomly, under bridges. The stuff around the blockchain and around cryptocurrency is just the worst thing in the world. Your bank doesn't operate this way, so why would we allow things that trade other kinds of money to operate this way?

[00:28:22] So I think, for example, the blockchain is a great thing in medical and privacy contexts. I think it has been underutilized to date. I'm cautious to say that, because I don't want it to be oversold; a lot of the cryptocurrencies have been based on this, but it's all in the implementation.

[00:28:39] You have this fantastic tool, but what use do we put it to? For example, being able to control my medical records and who they get sent to, and to see a record of who accessed them afterwards. That would be extremely powerful from a privacy perspective, but the apps aren't there yet. Most of our medical apps are very rudimentary in what I can see or control, even from my own medical providers. But that, I think, is the future: being able to have that information travel with me.

[00:29:06] I know there's work going on in that area, and that's exciting. I'm excited about using blockchain for other reasons as well; just in day-to-day life, for example, even preventing art fraud. I think that's a fantastic use of the tech too, but it's the apps and the processes around it that, for some reason, people think they can take shortcuts on.

[00:29:29] Nathan: An exciting place to go, but it's got to be done well. I hear you: "I have a hammer. What can I do with it?" Well, Greg, thank you so much for your time. We really appreciate it. I don't have any other questions for you today, but this was very helpful and very useful, and we really appreciate your time today.

[00:29:51] Greg: Oh, I appreciate the questions. These were really interesting and thought-provoking questions you brought today, so this has been great. I think everybody in the industry should have these kinds of discussions more.

[00:30:01] Nathan: Fantastic. Thank you so much.

[00:30:03] Greg: Oh, thank you!