A Journey through Time and Security (Guest: Bill Burns)

This is a podcast episode titled "A Journey through Time and Security (Guest: Bill Burns)." The summary for this episode is: Our guest this episode has had a front-row seat at some of the most game-changing innovators in the Internet Era - Netscape, Netflix, and now Informatica. He has seen the Information Security industry change dramatically since he first cut his proverbial teeth on it in the nineties. Our discussion touched across 20+ years of dramatic changes in the way business is done on the internet and how we as a society think about information security - all the way from security in a college computer lab, to security at Netflix, to his plans at Informatica, and to the new privacy regulations coming out of the European Union - the General Data Protection Regulation (GDPR).

Ben: Welcome to Masters of Data, the podcast where we talk about how data affects our businesses and our lives. And we talk to the people on the front line of the data revolution. And I’m your host, Ben Newton.
Our guest today has had a front-row seat at some of the most game-changing innovations in the internet era – Netscape, Netflix, and now Informatica. Bill Burns, currently the chief trust officer at Informatica, has seen the information security industry change dramatically since he first cut his proverbial teeth on it in the '90s. So, it's no surprise that when Bill and I sat down for the interview, our discussion touched across 20-plus years of dramatic changes in the way business is done on the internet and how we as a society think about information security. So, without further ado, let's jump right in. Welcome, Bill.
Bill: Thanks for having me.
Ben: I was just really excited when I was looking at your background at some of the stuff… You've really been around the block.
Bill: [Laughs]
Ben: I was looking at some of the stuff you’ve done. You’ve been at Netflix. Of course, I’m a huge fan of Netflix.
Bill: Yep, awesome.
Ben: My kids in particular are big fans of Netflix. And even back to Netscape. So, literally at this point, you’re internet royalty going back to Netscape.
[Laughter]
Ben: And you’ve really been…like I’ve said, you’ve done a lot of things in this industry. I was really interested in how you got into security, because that doesn’t seem to be where you started. So, how did you kind of get into the security realm based on where you kind of came from?
Bill: I got into the security realm because I got caught in college. So, the story goes that my buddy and I had a computer, but we didn’t have a printer. And we needed to print our lab reports. And the only place on campus where we went to school at Michigan Tech University…the only place you could print from a Mac was in the humanities lab, the arts lab. So, we were engineering students, but we would sneak in, and we would eventually find this printer that was hooked up to all the Macs.
And it was the only place that had Macs. And so, long story short, after a couple months of printing on their printers, I was caught. And they're like, "Hey, you're not in the humanities department. How come you are printing on the Macs here?" I had written some software that basically took over control of the print lab so I could go print my reports. And eventually, someone said, "You need to go talk to the lab director," who turned out to eventually become my wife.
Ben: [Laughs]
Bill: So, with her being one of the few females on campus, getting "in trouble" and having to go talk to the lab director was a pretty good thing in the end. I was always curious. I was always taking things apart, trying to figure out how things worked. Curiosity was just one of the natural traits I had, and that's what led me to get into computers and figuring out how things work. And then eventually, not just how they work, but how they work when they're poked a little sideways, or told to do something they weren't designed to do. That turns into fuzzing and security testing.
And so while I was getting my EE degree and my business degree, I realized I didn't like analog circuitry design, and I didn't really like putting together a computer at the sort of molecular level, the physical level. I really enjoyed the networking and the communications aspect, and then literally how data flows between computers over wide area networks. And so I just got into that part of the electrical engineering and the computer science. And naturally, I fell into security. At the time, security was always homebrew. It was not something that you bought off the shelf. The very first firewalls were largely a kit. Now it's a commodity feature in products. So, that's how I got into security: you had to, because you had to figure out how to make your system secure. It wasn't a button you turned on at the time.
Ben: Yeah, it’s funny you say it that way. I don’t know how many computer science-oriented people I’ve met that came from kind of the electrical engineering of the physics department. I actually was originally in physics. And I love playing with the lasers and the different toys. But I ended up enjoying the computer science better. And then I also realized that they were out there becoming millionaires, and no physicist becomes a millionaire.
Bill: That’s right. Lasers are cool, but… You know.
Ben: At some point, you have to feed your family.
Bill: [Laughs]
Ben: No, that makes a lot of sense. And in particular, what do you think has kind of kept you in the field? So, once you really got into it, you got a taste of what it was like to really dig into the guts of things, build stuff from scratch. What kept you going in the world of security?
Bill: That’s a great question. So, at some level, you are comfortable with building your system to meet your needs or to protect you against the things that you’ve thought of and the countermeasures you’ve put in place. But then once you get into the real world, and you get people who are motivated by different aspirations, you are… Maybe it’s ideological. Maybe it’s monetary. There’s attack vectors you hadn’t thought of. And so it’s the cat and mouse game.
It’s the, “Wow do you stay ahead of the hackers? How do you stay only one step behind the hackers?” Being a protecting and educational lab versus our corporation. Very different, small/medium size versus large size company, all sorts of different threat vectors, budgets that you can apply that you can buy versus build. The challenge for me was always the, “What will they think of next? And what can I think of next?” So, it was always that sort of gamesmanship.
Ben: You know what it reminds me of? When I was in college, I think that was the pitch the National Security Agency was using when they were trying to recruit people on campus. But it's fun. I remember listening to another interview you did and how you talked about that. I like that way of thinking. It's a competition. There's some gamesmanship about it. It's serious stuff, but it's also the challenge. The puzzle, I think, is one of the words you used.
Bill: Yeah. When we were at Netscape, we hired Paul Kocher to help us build SSL. We needed to get the web browser to be a place that someone trusted enough that they would put in something like their credit card and do electronic transactions. And at the time, this was in the early to mid '90s, that was kind of a crazy idea. Like, "Can we make a web browser this trustworthy?" And one of the things Paul mentioned when we talked to him was you treat security like a game, like chess. But you don't want to design your security so that if it ever has a checkmate move, the game is over. You always want to design your systems so you at least have another move, even if it's a stalemate. That's a better solution than a checkmate.
Ben: Oh, that’s a really interesting way of thinking about it. And when you mentioned the credit cards, it is pretty amazing how things have changed so much. Because I remember when everyone was terrified of using their credit cards on line.
Bill: That’s right.
Ben: And I remember when I started doing that, I was like, "I'm more terrified of giving it to the waiter at the restaurant, because it's actually much more likely they're going to steal my credit card."
Bill: That’s right, yep. Yep. Yeah, that’s a crazy thought.
Ben: [Laughs] Well, thinking about what you did after Netscape, in particular… Like I said, I'm a big fan of Netflix, not only because my kids love watching cartoons on Netflix. And I do, too. But also because of the transition that Netflix went through. I've heard the operations story so many times about moving from send-the-DVD-in-the-mail to streaming. But I've never really heard it from the security perspective. And I'd love to hear: what was your experience, sitting in the seat that you were in from information security, watching that transition at Netflix from this very traditional physical business to a streaming business, being first in the Cloud on AWS, things like that?
Bill: Right. Yeah, by the time I started at Netflix, the company had already started the migration. And so really they were focused on, "How fast can we go? How fast can we move out of the DVD business into the streaming business? How fast can we grow that business? How fast can we grow it internationally, but also features?" And so there were several interesting patterns and anti-patterns that I learned there. And this was in the early days. So, I think one of the first days I started there, they had done an experiment where they took the "on premise" web server and put it in Amazon's cloud. And it failed spectacularly.
And you'd normally think, "Oh my gosh, someone is going to get in trouble," or there'd be a lot of bugs being filed, and there was a problem. But it was seen as an experiment, and they learned a lot of things. And there was data. And there was one room that I was in where everyone was poring over these giant screens full of data. And it was interesting; they were all trying to figure out, "What did we learn from this? What can we improve?" It wasn't seen as a failure. It was seen as, "This is an experiment. We've learned a bunch of stuff, and we'll improve later on."
Ben: Yeah.
Bill: So, early on, it was a culture that, A, appreciated experimentation, but everything had to have a hypothesis, and everything had to have data to back up what your next move was.
Ben: That was probably pretty uncommon at the time. Now the idea of experimentation and data-driven decision making seems to be de rigueur. But back then, that wasn't really what people were doing.
Bill: Right. And part of their culture deck was: we value data; we want the most informed person to have an opinion and to make a decision. Otherwise, it just devolves into who has the biggest title, or who's got the loudest voice in the room. That's not really an appropriate way to make business decisions.
Ben: Yeah. Yeah, no, absolutely. It's a very human way to make decisions, but it's not very effective.
Bill: Right.
Ben: One thing, too. I remember when that was going on, and a lot of us were watching Netflix… Everybody talked about the NoOps-versus-DevOps thing and how Netflix played into that. What was it like being on the security side of that? How were you interacting with those teams that were working that way?
Bill: Yeah, good question. So, that was where I learned a real practical use of the security metaphor of guardrails. Rather than controls, and yes and no, and sticks, and beating people over the head with rules and policies, you use guardrails, which is, "How can I codify the security policy into the system so that if you do the right thing, if you check in code with the right method, and if you follow the right routines, the security sort of protects you?" It's kind of like guardrails on a road. And if you're in a dev/test environment, the guardrails are really wide.
There's a lot of innovation, a lot of chance for the developer to make mistakes, because they're probably not dealing with credit card data, for instance, or really sensitive information. And so they're free to make more mistakes. They have wide guardrails. The closer you get to the sensitive data, the protected and regulated data, the guardrails get really tight. And so what that means is the developer has less freedom. So, maybe this is the production environment with credit card data.
Maybe it's Sarbanes-Oxley significant. And so the change controls are more stringent, the sign-offs too. So, the process feels a little slower. And everyone understands why, but it's not like there's a one-size-fits-all change control system. So, you have these varying degrees: the guardrails are wider, the controls a little bit looser, in the less controlled environment. That was one of the places where I really learned, "What does that mean, and how do you actually instantiate that?"
Ben: Yeah, I know. I love that analogy. Because when I was starting out in the DevOps area in the early 2000s, you dreaded the security people getting involved, because they restricted activity, slowed down activity, and actually broke things on a regular basis.
Bill: Right. That was the department of no as opposed to the department of how.
Ben: Right. Right. And I love the way you put that. Because it's security almost being an enabler – enabling the right type of activities as opposed to just restricting things.
Bill: Yeah. And I think part of that comes from risk, or risk management. So, risk is not binary. And the higher up in a company you go, the more comfortable you become with ambiguity and with risk. In some cases, you have less data to work with to make a decision. And so you become more comfortable with, "All right, what's the least amount of data I need to make an accurate, or good enough, decision?" And if you're new to, let's say, engineering, you're trying to get really precise with tons of data, and you end up being slower. And you make decisions more carefully, when in fact you need to really understand, "What's the worst that could happen? What's the best that could happen? And what is the environment that I'm dealing with?" Right?
So, if I'm in a dev/test environment, and I don't care about availability, I'm not playing with production data or regulated information, I can afford to let the developer be more creative and not worry about some of these other parameters. So, you make the guardrail decision based on the risk. And then you start to understand the nuances of, "Well, what's the business impact? What's the worst that could happen if this developer's code crashes, or it's open to the world for an hour while we're trying to figure out what the vulnerability was?" That doesn't matter as much as in a production system. So, what am I going to do to this developer's day-in, day-out job if I have one security policy that says no across the board?

Ben: Yeah, I think that's a great way of thinking about it. And I also love the way you talk about being able to make decisions with less data. Because, having come more traditionally from operations areas myself, there's a sense that you have all the data you need. Which, actually, usually is not true. You think you have the data, and a lot of times you don't. It seems to me like when you come from a security perspective, the idea that you don't have all the data is actually baked into the whole way of thinking.
You know that you don't have all the data, so how can you make the best decision based on the data you have at hand and try to lower the risk, like you're saying. That totally makes sense. When I was looking at your blog… I'm going to take a quote from one of your blogs, and I want to talk a little bit about it. One of the things you said is that, "As a security professional, I can attest that the lifeblood of any company is the sensitive data that they process.
Protecting this data is the charter of a company's information security team and a responsibility of all employees who work there." And I really like that picture of the lifeblood of a company. It seems like that's actually something that's changed pretty dramatically over the last few years. So, when you write something like that, tell me a little bit more about what you're thinking when you say it's the lifeblood, particularly in what you're doing now with Informatica.
Bill: Yeah. And I think I wrote that before some of the bigger breaches that we've heard about. So, the OPM breach, and Equifax, and the companies that you literally entrusted your most sensitive information to. It got lost. It got breached. And there's a myriad of reasons why. But people and other companies now make business decisions to do business with those companies with perfect hindsight, rearview, 20/20 vision, saying, "What did they do with other sensitive data?" Right? So, customers, consumers, are making informed decisions based on, "Can I trust a company with my sensitive data?" And it doesn't have to be the most sensitive data.
It can be mildly sensitive data. It could be information that I'm talking about freely. But in the aggregate, I might wonder, "Wow, if they treat my data that poorly, what other poor decisions are they making?" So, we hear about the Facebook scandal. People are chatting back and forth with their friends or clicking on survey results. And in the minutiae, all those little pieces of data and preferences are sort of innocuous. But in the aggregate, they start becoming really important. And they become very personal to people. So, I think about that quote. And at the time, I was really focused on sensitive data. Like, "What's the most sensitive data?"
But in hindsight now, it’s less sensitive data. It’s all of my personal information. And I think with GDPR coming out in May, people are saying, “There’s all sorts of attributes about myself that I may not want to have shared, or maybe I don’t explicitly want to share. And so why would someone else release that information?” So, I think there’s been a new sort of awakening of, “How do companies treat my data?” And again, it doesn’t have to be sensitive data. Maybe it’s just analytics, or it’s log information, or activity, clickstream data. All of that becomes A, valuable to companies.
They’re figuring out how to monetize that. And B, to the person or to the customer, they’re thinking, “Maybe I don’t want all of that shared without my knowledge. Or maybe if it is with my consent, I need to make an informed choice about that. And maybe I want to be more particular.” So, it really is at the time when I said that in the blog…that was the lifeblood of companies. And now we’re seeing that even the less sensitive data is still important and monetizable by companies and to individuals. They find it personal.
Ben: Yeah. And I want to come back to that. But for those who might be listening and don’t know GDPR, can you talk a little bit more about what that is?
Bill: Sure, so that's the General Data Protection Regulation. It's going into force May 25th, 2018. And I think it's been two years since the Europeans ratified this law. And it is data protection. Data privacy is how we think about it in America, but it's data protection for people in the EU, citizens in the EU. And it's their personal information, so anything personally identifiable, or attributes about that person. It's protecting that information from leaving EU soil. So, data transfers, for instance. Really, any company that does business with people in the EU has to abide by the GDPR. And the fines are significant if you don't.
Ben: Yeah. I was talking with our chief information security officer, George Gerchow, about this a couple weeks ago. And it looks like when it actually goes into effect in May, there's very likely going to be some big court case, with some company having made some serious infraction. And that makes me think of the way you contrasted those two words – protection versus privacy.
Bill: Yeah.
Ben: And it seems almost like privacy is more of a passive word, describing things, but protection seems like a more active word – making the point that they are going to actively protect people's data. And people can ask for the right to be forgotten, things like that. It's a much more active idea about data privacy than maybe we've had before. Is that…?
Bill: Certainly from an American's perspective, I agree. And I think it's kind of an awakening across the board: these rules, these regulations that the Europeans are putting in place, value privacy as an innate right of a person. And that's not the American belief, and I think people are starting to realize what that really means when they lose access to that data. I think there's not an appreciation for, "Maybe I don't want to have all of these online services for free in exchange for my private information that… I guess I don't really know what they're doing with it."
Ben: Yeah. Do you feel like some of that’s because the way we developed internet services here and the expectations we have for that, or is it really because of the way Americans think about it? Why do you think there is such a difference in public perception of privacy of data?
Bill: I’m not really sure to be honest. I think a lot of private information was used inappropriately in Europe in years gone past. And people have seen the ramification of using that information to target people. And in the American economy, we’re…in exchange for receiving ads, we are getting additional services. And as the marketers want to figure out, “How do I target that person more efficiently? I want to reach that demographic. I want more information about the demographic.
And eventually, I’d love to get information as a marketer to identify a person or a class of people.” And this is where I start to gather information and then use that to more efficiently market to people in exchange for services. So, Facebook doesn’t cost me anything to join. Back in the day, AOL was $15 a month. Someone has to pay for this. And I’m paying for it somehow. And in this day and age in the current internet generation, I pay for things by giving them access to my behavioral information.
Ben: I always wondered how much of that money that AOL charged went to making those CDs that they would mail everyone.
[Laughs]
Ben: I remember getting so many of those that I was actually making decorations with them, putting them in the microwave in the lab and getting the pretty colors, and hanging them from the ceiling.
Bill: Yeah. Well, at the time, we had to buy thousands of modems and keep them in place. So, there were racks, and racks, and buildings full of modems that we had to pay for. So, I'm sure $15 a month paid for a lot of modems, too.
Ben: Yeah. [Laughs] Well, back to what you were talking about. As you were talking, an idea that came to mind is that originally, when I read the lifeblood quote, I was thinking about it the way you described thinking about it before: that data is core to a company's business as a positive thing, as, "This is the way we're able to provide better services." But the way you were talking about it just now – particularly with things that have happened with Facebook recently, and with Google as well, and a lot of this misuse of data – that analogy takes on kind of a sinister cast.
It's like our private data is the lifeblood of their financial model. It's driving their business. And just like you were saying, there was a long period of time when I wasn't thinking about what I was clicking on Facebook. I'm in the industry. I know a lot more than the normal person about what I should do to protect my data, and I didn't spend a lot of time thinking about what data was being gathered on me. So is there really going to be a reckoning around our sense of having free services versus these companies making billions and billions of dollars off of our data, not necessarily freely given?
Bill: Right. And to take the less sinister view: a company could be a completely legitimate Cloud service provider, providing information and services to their customers. And what companies are realizing as part of this digital transformation wave is that there's all of this data about usage that is data in itself, this metadata. Now you can use that to say, "How can I give better services to my customers? I can start to mine that data and say, 'Oh, I can split up this customer segment into sub-segments and realize that if this particular customer is behaving like this other customer, but they don't have the same services, there's an opportunity to upsell. Or I can expand, or I can check in with them.'" And so now metadata becomes more interesting, more monetizable, to companies. And they realize that all the stuff they used to leave untracked on the cutting room floor is actually very valuable. Now you can start to make a business out of this behavioral data. And it's not for sinister purposes; it's for completely legitimate purposes. We just didn't have the log management. We didn't have the analytics. We didn't have any of these capabilities before.
Ben: Well, as you say that – coming from the same kind of data-driven business that you guys are doing over at Informatica – I think the positive spin to put on this is that we in the enterprise software business, which is definitely behind the curve on these kinds of things, have an opportunity to learn from that experience, like you're saying, and actually do it in a positive way. Because even as you were saying that, it comes to mind that I've spent a lot of time with customers showing data about their usage, showing them their data, and having a discussion with them like, "Hey, I think you could do this. You could do that." And I remember having this discussion, and we were looking at this particular customer and the type of searches they were running. And this one guy had for some reason searched for God.
[Laughter]
Ben: You know, capital G-O-D. And we thought… Well, I made a joke about it. I was like, “Did you find God? How did that work out for you?”
[Laughter]
Ben: And he was not amused by that. And it kind of occurred to me, "You know, I thought we would have a mutual joke there." He didn't have the idea that, "Well, that metadata is going somewhere." And so I think there's a very fine line between – I don't know what to call it – the creepy and the helpful. And really being able to walk that fine line is going to be a challenge for these kinds of data-driven businesses going forward…
Bill: Right. Yeah, I think the customers, the public certainly, expect a Cloud service provider to protect that data. But at the same time, the company has to figure out, "Well, how do I use this data? How do I mine it?" Right? So, the latest thing seems to be, "Well, let's encrypt all the data." Well, if you encrypt the data, it's really hard to use it. And something needs to access that data in order to do something with it. If it's encrypted and "safe," and no one can use it, it's really not valuable. So, there's a fine line.
Again, we talked about guardrails, and we talked about policies. There has to be a fine line, a distinction: "Well, what can I do with that data, and who's authorized to use that data?" And that's where security versus privacy come in. They sort of butt heads. But GDPR, I think, has something like 35 references to the word security in the regulation. So, although it may be a data protection, privacy regulation, there are a lot of security requirements in there. So, companies now have to figure out, "Okay, how do I have a viable business model? Take all of this data, sensitive or not – I still have to protect it. How do I stay in business, but how do I monetize this in a unique, differentiating way so that I can be competitive?"
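As a minimal sketch of the tension Bill describes – data that is "safe" when encrypted but must still be decrypted by something authorized in order to be useful – here is a hypothetical Python example using the cryptography package's Fernet API; the record contents and key handling are illustrative assumptions.

```python
# Encrypted data is safe at rest, but an authorized service still has to
# decrypt it to use it. Requires: pip install cryptography
from cryptography.fernet import Fernet

key = Fernet.generate_key()  # in practice, the key would live in a KMS/HSM,
fernet = Fernet(key)         # with access tightly scoped by policy

record = b"customer_email=alice@example.com"
token = fernet.encrypt(record)  # ciphertext only: opaque without the key

# Analytics on the raw token is impossible; an authorized service
# decrypts just-in-time, inside the guardrails discussed earlier.
assert fernet.decrypt(token) == record
```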
Ben: Yeah. No, absolutely. And I don't know what you're seeing in your sphere of influence, but I don't get the feeling that a lot of companies have really come to grips with that yet. There's probably going to be some mad chaos in the next couple of months.
Bill: Well, a lot of this is based on precedent and case law. And this is really untested. So, a lot of the lawyers I talk to are like, "We'll see. We'll see what happens." May 26th, what does that day look like? As you said, are there people staging lawsuits and investigations to figure out who's not compliant with a brand new, untested law?
Ben: That’s true. Yeah, that’s a lot like some of the anti-trust stuff that went on before with Microsoft and Google a long time ago. It was like you have to kind of establish the precedent. Well, one thing kind of…maybe one last thread about this data privacy. You and I talked a little bit before about artificial intelligence. And it seems like in particular when you’re talking about where this is going in terms of taking customers’ data and taking kind of this cross-customer metadata and looking at a group of customers and seeing what their behaviors are to help them, a lot of that is born up in ideas around artificial intelligence machine learning.
Because you really want that to be automatic, and you don’t want somebody peering through it. And also in some sense, you might think, “Oh, well, that maybe that’s safer because it’s a machine looking at.” But then maybe not. Because it goes back to that guardrail discussion, it’s like, “What do those guardrails look for in automated algorithm behind the scenes versus a human?” Have you thought much about that? Or how do you think that’s actually going to play into this whole discussion?
Bill: Yeah, so a couple of things. There's the speed at which computers operate on data, and obviously there are benefits to that. But the challenge with traditional computers versus AI is… I think we're all comfortable with automation. We're comfortable with computers automating repetitive tasks and doing them more efficiently. With traditional computers, there are humans writing the code. And so when it breaks, or even when it works, we understand what happened. And we're getting to the point now where, with artificial intelligence, we're creating algorithms that then evolve, and they form new patterns.
And in some cases, they make decisions that we don't understand. The creators of some of these AI routines don't understand the conclusions that the algorithms are coming up with. So, couple that with the speed and the pace of innovation, and very soon we're going to get to a world where the computers are literally making decisions that we don't understand. It could be the right conclusion, but we don't understand how it got there. And depending on what sphere of influence this machine, this AI algorithm, has, it could have profound impact. Right?
So, if machine learning determined how to best drive a car, do we really understand what it's optimizing for? And there are other, much more complicated, complex system-to-system interactions. So, I worry a little that we don't understand the failure modes. We don't understand the guardrails that the system is willing to stay within. That's concerning to me. Because it sounds like we're having fun innovating, and we're getting some benefit from it. But it's getting to a pace now where some scientists are telling me, "We don't understand what it did. We don't understand how it got there."
So, if we keep innovating down that road, what are we actually trying to accomplish? That's one of the areas of concern. When I was in venture capital looking at the security space, someone asked me, "What's the most impactful innovation or invention in the security space in the last five or ten years?" And people were expecting this box, or that appliance, or Cloud computing. And my reply was, "APIs." APIs were the best thing to happen to security, because now we don't have people clicking on GUIs or installing physical boxes in order to keep up with the size or the velocity of growth in a data center.
Now the security teams get to keep up and actually keep pace with the innovation of the developers of software, because everything is becoming software. So, APIs allowed the security team to keep up. Ten years ago, it was the sort of dark days of security, where everyone was on VMs, and we were still trying to sniff the network traffic and figure out what was on these boxes. So, APIs and automation now give the security team a chance, a fighting chance. We have that at our disposal. And if we can apply machine learning and AI to our systems, sort of using them for good, we can at least keep up with what the bad guys are trying to do. But in the end, if we start applying AI to both sides of the equation, and we start losing control, it's really unclear how we can influence a system that's essentially fighting itself.
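To illustrate the kind of API-driven security work Bill credits here, this is a hedged sketch – assuming boto3 and configured AWS credentials, with no claim that this is what Netflix actually ran – of an automated audit that flags security groups open to the whole internet, a check that once meant clicking through a GUI.

```python
# Hypothetical API-driven security audit: flag AWS security groups whose
# ingress rules are open to the entire internet (0.0.0.0/0).
# Assumes boto3 is installed and AWS credentials are configured.
import boto3

def world_open_rules():
    """Yield (group id, from port, to port) for world-open ingress rules."""
    ec2 = boto3.client("ec2")
    for group in ec2.describe_security_groups()["SecurityGroups"]:
        for rule in group.get("IpPermissions", []):
            if any(r.get("CidrIp") == "0.0.0.0/0" for r in rule.get("IpRanges", [])):
                yield group["GroupId"], rule.get("FromPort"), rule.get("ToPort")

if __name__ == "__main__":
    for group_id, from_port, to_port in world_open_rules():
        print(f"{group_id}: ports {from_port}-{to_port} open to the internet")
```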
Ben: And it sounds like some sort of war of robots, where they're going to decide who's going to be our overlords.
Bill: I know, yeah. Someone said… I forget who this was. But someone said recently, "All AI and ML discussions talk about AI, AI, and then it devolves into Skynet."
[Laughter]
Bill: And then the discussion is sort of over. Maybe that was Hugh Thompson at the RSA Conference. But I was like, "Yeah, we don't want to just think about the dystopian future." Right? So, I would encourage anyone out there who's working in this area to think about how we force the machines to tell us how they got to that decision, so that we are still in control and we have a way to influence the system and the network.
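One small, hypothetical illustration of "forcing the machine to tell us how it got there": an interpretable model whose learned rules can be printed and audited outright. The toy data, labels, and feature names below are invented for the example, using scikit-learn.

```python
# Hypothetical sketch of an explainable decision: a shallow decision tree
# whose learned rules can be printed and audited, unlike an opaque model.
from sklearn.tree import DecisionTreeClassifier, export_text

# Invented toy data: [requests_per_minute, failed_logins] per client
X = [[5, 0], [8, 1], [200, 30], [350, 45]]
y = [0, 0, 1, 1]  # 0 = benign traffic, 1 = suspicious

model = DecisionTreeClassifier(max_depth=2).fit(X, y)

# Every decision can be traced back to a human-readable rule:
print(export_text(model, feature_names=["requests_per_minute", "failed_logins"]))
```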
Ben: Am I remembering right? I think even GDPR has something about algorithms describing how they got to a decision. I thought I remember reading or hearing about that – or it seems like there was maybe regulation under consideration. But yeah, that seems to be coming up more. And I thought that was very interesting, and how hard that's going to be. To your point, even the people who write the code don't necessarily understand the steps the algorithm went through to actually arrive at its conclusions.
Bill: Yeah, which is both exciting and kind of a little terrifying at the same time.
Ben: [Laughs] Yes. Yes, exactly. Well, maybe just to wrap up… I love the title chief trust officer. I think that's really great, based on everything we talked about, because it really is about trust, particularly for companies in this day and age. What are the challenges you think you're going to be dealing with over the next year or two? Where's your focus?
Bill: Yeah, so as chief trust officer, I look at a couple different vectors. I have two main roles – availability: keep the Cloud up; and safety and security: keep the data safe, keep the systems' integrity intact. Those are the two main areas of focus. And Informatica is going through a transformation. We've been in the Cloud space for a while, but we've had one small offering, and we've been slowly building it, while largely selling on-prem software. Now we're taking all those products and features and turning them into software features on a platform in the Cloud. And so one of the big things I worry about is just availability: how do we build the Cloud fast enough to keep up with customer demand for Cloud services? And at the same time, because I inherently have a security background, how do I bake all the right guardrails and security controls into these new, transforming business processes and development processes that the developers and the product teams are trying to build and evolve? So, we have growing customer demand. We have an ever-increasing footprint and complexity in our Cloud. The biggest thing I worry about is how I can keep pace with the developers and their innovation on those vectors of availability and safety. There are lots of lessons learned from Netflix and from other folks, but putting them into practice and helping a company transform its culture to understand the value of those vectors, that's really a challenge. And that's actually what gets me up every day, and it's pretty exciting.
Ben: Yeah, it really is about culture, isn’t it? It’s a very cultural issue.
Bill: Exactly, yeah. Culture trumps strategy. If I can't win the hearts and minds of the developers and the executives at a cultural level on what we need to change together, then tactics and policies and all that stuff just feel like big friction. When it's not even top-down, but it's at the cultural level of the company, that's when it resonates with everyone, and it becomes much, much more synchronous. It's a much better way to work together. So, it's a real challenge to figure out, "How do I move the culture, and how do I work with the teams?" But when you're operating at that level, as opposed to the sort of arm wrestling of the security team of "no" – when you're operating at that higher level, from first principles – that's a much more fun challenge.
Ben: Yeah. Well, it sounds like you've got your work cut out for you. [Laughs]
Bill: Yeah, it’s good. Like I said, it gets me up every day. So, it’s fun.
Ben: Well, that’s great. And I think we’re just going to have to schedule an update, and you can in and tell us how that’s been going for you.
Bill: [Laughs] Great.
Ben: But no, it sounds great. And I really appreciate you taking the time to come on the podcast, Bill. I really enjoyed this conversation.
Bill: Yeah, thank you.
[Music]
Ben: Thanks, everybody, for listening to the Masters of Data podcast.
[Music]
