Transcript:
Angie Chang: Welcome to Girl Geek X Podcast, connecting you with insight from women in tech. This is Angie, founder of Girl Geek X and Women 2.0.
Sukrutha Bhadouria: This is Sukrutha, by day I’m an engineering manager.
Gretchen DeKnikker: This is Gretchen and I’ve been working in tech for over 20 years.
Rachel Jones: This is Rachel, the producer of this podcast, and we’re the team behind Girl Geek X. This podcast brings you the best of Girl Geek X events, dinners, and conferences; where we’ve been elevating women in tech for over 10 years.
Angie Chang: And today we’ll be discussing software security.
Rachel Jones: So this is a lot more technical than our more general career advice episodes. Why is this still relevant to our audience?
Angie Chang: We noticed in the news that there was a huge hack exposed, with Capital One being breached and, I think, 100 million identities stolen. These types of data breaches make the news consistently, and at the end of the day it’s always a software problem, or potentially a human problem, but it’s definitely always in the news.
Sukrutha Bhadouria: Yeah, real lives are on the line, and we’re paying online for everything that we do. Whether it’s a car service or a grocery delivery, we have all of our credit card details and bank account details set up. So it’s even more important for our information to stay secure, and it’s just one click. It’s very easy for all of our information to be stolen and replicated.
Gretchen DeKnikker: I think from a company standpoint, particularly when you’re an early stage startup, there’s this idea that you’re trading things off, and things aren’t as secure as they should be, and those are generally known things. But I think why this topic is interesting to dive into more deeply is that this happens to much bigger companies, too. And there are known vulnerabilities and patterns that are repeated from company to company that leave them wide open for something like this to happen.
Sukrutha Bhadouria: Yeah. And generally we tend to forget about making our software secure, when we need to be doing that from the very first design. It’s so easy to go in and gain access to a company’s database, and just like you said, Gretchen, it’s been happening. There are a lot of stories of it happening to bigger companies that hold more data, and those are the ones that are usually under attack, because there’s more to be gained for the effort of hacking into that database.
Gretchen DeKnikker: Yeah, and I think there’s a common saying, at least among founders: if your company hasn’t been hacked, it’s just because you’re not important enough yet, not because you don’t have vulnerabilities.
Angie Chang: At Girl Geek dinners, we hear from security engineers and security-minded companies that building with a security-first mindset is really important, and I really enjoy hearing people talk about security issues at those dinners.
Gretchen DeKnikker: It is always one of our top topics, too. People get really excited to show up for this content.
Rachel Jones: I think it is really exciting getting to see the back end of this, because as a consumer it feels kind of mysterious: the ways that your information is vulnerable, seeing all these stories come out in the news. Getting an understanding of what these attacks look like and what approaches companies are taking to protect against them, that kind of information isn’t always included in these stories. It’s definitely interesting to know.
Angie Chang: Angie Song is a staff software engineer on the sync team at Okta. Here’s what she said at the Okta Girl Geek dinner.
Angie Song: At Okta, we always ask questions about security in the beginning stages of development, because it is much more difficult to retrofit security into an existing system. A great example of this is actually the internet. In the early days of the internet, the only people who had access to the internet were researchers from trusted organizations like government agencies or universities. Because of this, a lot of the networking protocols designed during this era were built on the assumption that everyone on the internet was trustworthy and cooperative. Now that we have 4 billion users on the internet, we are suffering from the consequences of this early naivety. This is exactly why Okta is pushing zero trust. But it doesn’t matter how secure your system is if your users are not using it or, even worse, if they’re using it improperly. So let’s say your company decides to be secure and they decide to start using Okta, but at the same time they also decide to implement this password policy.
Angie Song: Your password needs to be an automatically generated 17-character password with upper case, lower case, numbers, hyphens and everything, and it needs to be changed every month. What is going to happen is people are going to start writing down their passwords on Post-it Notes and sticking them on their monitors, because they can’t remember them. So human factors matter, and security systems must be usable by non-technical, ordinary people, because they will be used by ordinary people. An average person is not going to remember a 17-character password with upper case, lower case, numbers and hyphens that changes every month. So when you’re building a security system, you have to take into account the role humans will play when they interact with your secure system.
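To make the usability problem concrete, here is a minimal sketch (an illustration, not code from the talk) that generates the kind of password this hypothetical policy would mandate. The 17-character length and the character classes come from Angie Song’s example; the alphabet and the policy check are illustrative assumptions.

```python
# Illustrative sketch: generate a password satisfying the hypothetical policy
# described above (auto-generated, 17 characters, upper case, lower case,
# digits, and hyphens). Printing one makes the memorability problem obvious.
import secrets
import string

ALPHABET = string.ascii_uppercase + string.ascii_lowercase + string.digits + "-"

def generate_policy_password(length: int = 17) -> str:
    """Return a random password containing every required character class."""
    while True:
        pw = "".join(secrets.choice(ALPHABET) for _ in range(length))
        if (any(c.isupper() for c in pw) and any(c.islower() for c in pw)
                and any(c.isdigit() for c in pw) and "-" in pw):
            return pw

print(generate_policy_password())  # e.g. 'kQ-7xPb2Zr9tW-m1C': now imagine memorizing a new one every month
```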
Angie Chang: That was a hilarious talk. You can check it out on our YouTube channel at youtube.com/girlgeekX. Look for the Okta Girl Geek dinner talk and find the Angie Song segment. She said some good things about enforcing least privilege, which is a really excellent way of thinking about it. I really like using my 1Password manager to have those incredibly long 17-character alphanumeric passwords for ultimate protection, but at the same time, you have a lot of normal people who don’t use 1Password, or say your parents forgetting their iTunes password, and as the designer of that system you need to figure out the trade-offs of requiring such long passwords or not.
Gretchen DeKnikker: I mean, I think designing security is like product design, right? You have to think about what the human’s going to do, and the human frequently isn’t going to do the rational thing, right? They’re going to do the easiest thing. So I think her point is that you need to make it as easy as possible for them to do something secure, which I think is why you use 1Password. I use 1Password, also. I finally became a convert when I got locked out of trying to pay my mortgage for the 10th month in a row and had to call them every month to get unlocked, because they had some crazy password requirement I couldn’t remember, and I’m not one to write it down on a Post-it. But after so many months of friction, it finally did drive me to 1Password.
Sukrutha Bhadouria: So I think when you’re learning how to program and design systems, being exposed to security and secure design should be a compulsory part of that, a prerequisite to calling that program complete. I really like that she talks about implementing it from the get-go, and like she says, it is definitely very, very hard to go back and update your product to be more secure later, because there are just so many different things that you could miss along the way. So you want to weave it into the design.
Angie Chang: I remember you started as a software engineer in test, and then you did a lot of testing.
Sukrutha Bhadouria: Yeah, I did a lot of security work in addition to writing test frameworks and such. What I got exposed to was performance testing and security testing as well. So that’s why it was important to be involved from the early stages of the architecture and design.
Sukrutha Bhadouria: Senior software security engineer Nicole Grinstead shared how Netflix approaches security during our 2018 Elevate conference.
Nicole Grinstead: The first thing we do is we enhance our data and make sure that we have everything that tells the full story about what action a user has taken. Then we start to take those actions and model what their normal behavior is like. Just to give you an example of a few of the things that we think are interesting: if you think about what a user typically does, their user agent is a really common thing that you can see in a log, where we can tell what kind of machine they’re coming in from, and that usually doesn’t differ. Sometimes people get new machines, sometimes they upgrade their browsers, and we have some logic to dampen those kinds of upgrades. But if all of a sudden that changes, it might be a signal or an interesting thing to look at.
Nicole Grinstead: So, as you can imagine, just generating anomalies and figuring out where things are different doesn’t necessarily give us a full picture of when something is malicious or might be going wrong. So the next step, on top of these raw anomalies that we’re generating, is to apply some business logic to be a little bit smarter about what we think is important to investigate. Just seeing raw anomalies can be interesting, but it can also be a little noisy, because as you can imagine, people do deviate from their normal behaviors sometimes. So this is the step where we try to figure out whether it’s actually risky to our business if this action is occurring.
Nicole Grinstead: So you think about accessing really sensitive financial data. That’s something that’s higher risk than maybe accessing our lunch menus. If I never access lunch menus for Netflix and then all of a sudden I do, well, yes that was anomalous, but does the security team care if somebody’s looking at lunch menus? No, we don’t care. There’s no sensitive data to be gleaned there and it’s not something that we want to spend our resources investigating. So that’s one aspect. We also kind of look at what type of user it is and if it’s a certain type of user they might be a little more or less risky. And so these are the types of things that we apply after the fact to kind of weed out the noise a little bit and see what are the really high risk things that we should be focusing on and looking at. Then that final step is where we get information from outside of just our anomaly generation and tie that up with other interesting data sources.
Nicole Grinstead: So we’re looking at not just that interesting event, but the events around it. What does the user typically do? What kind of applications did they log into right before? What types of applications did they log into right after? That type of thing. Also what organization they’re in, what type of job they do. So any extra information, any extra data that we can use to enhance that and tell the whole picture of who this user is, what they typically do, why this was a weird behavior, and whether it’s risky.
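The pipeline Nicole outlines (log user actions, model normal behavior, flag raw anomalies, then weight them with business logic about how sensitive the resource is) can be sketched in a few lines. This is a hypothetical illustration with assumed names and weights, not Netflix’s actual system; the user-agent signal and the lunch-menu versus financial-data contrast come from her talk.

```python
# Hypothetical sketch of anomaly detection plus business-logic risk weighting.
# Raw anomaly: a user agent never seen before for this user.
# Business logic: weight the anomaly by how sensitive the accessed resource is.
from dataclasses import dataclass

RESOURCE_RISK = {"financial-reports": 1.0, "lunch-menu": 0.0}  # assumed weights

@dataclass
class Event:
    user: str
    user_agent: str
    resource: str

def anomaly_score(event: Event, baseline_user_agents: set) -> float:
    """1.0 if this user agent has never been seen for the user, else 0.0."""
    return 0.0 if event.user_agent in baseline_user_agents else 1.0

def risk_score(event: Event, baseline_user_agents: set) -> float:
    """Raw anomaly weighted by the sensitivity of the resource being accessed."""
    return anomaly_score(event, baseline_user_agents) * RESOURCE_RISK.get(event.resource, 0.5)

baseline = {"Chrome 75 on MacBook"}
events = [
    Event("nicole", "Chrome 75 on MacBook", "lunch-menu"),             # normal, ignore
    Event("nicole", "curl/7.64 on unknown-host", "lunch-menu"),        # anomalous but low risk
    Event("nicole", "curl/7.64 on unknown-host", "financial-reports"), # anomalous and high risk
]
for e in events:
    print(f"{e.resource}: risk {risk_score(e, baseline):.1f}")
```

A real system would also fold in the surrounding context she mentions, such as which applications were used just before and after, and the user’s organization and role.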
Rachel Jones: So we basically just got a Netflix play by play of threat detection. Is there anything that stood out to you all?
Sukrutha Bhadouria: In 2018, a lot of companies had to make a lot of changes in how they designed their systems and how they logged user actions and user information, simply because when the GDPR regulation was enforced, it was meant to protect individual citizens of the European Union, but it also had rules around how that personal data was transferred outside of the European Union. And what Nicole talks about is logging customer behavior and being able to see details like what machine they’re coming from, which was originally meant to track usage and to help troubleshoot when customers have issues.
Sukrutha Bhadouria: But what software designers didn’t think about along the way is how much information is okay to log. And so there was a lot of rushing to the finish line, a scramble to adhere to the rules of the European Union at that point. That whole reaction really spoke to how security ends up becoming an afterthought. So when Nicole talks about how they do it at Netflix, it’s really interesting to see how they’re trying to weave it into their process. But if you don’t keep a constant eye on it, you’re always going to find something that slipped through the cracks. And so it’s really important to have regulations like these to protect people and their data.
Gretchen DeKnikker: You know, a tangential topic to this is that we’ve been collecting all of this data for so many years, and I think there are a lot of questions coming up now around whether we even need it. We just do it because we can, right? It started off as, send a bug report to Microsoft after it crashed, or even within Apple apps. But now, when you look at something like OAuth, how much information it pulls, and how unaware people are of what they’re giving up just so that they don’t have to remember a user ID and password, right?
Angie Chang: And it’s unfortunate that over time our trust in these big tech companies has eroded. Even with Apple and Google, we’re asking, wait, do they really need this data, and are they really doing something useful with it? Maybe we need our own GDPR here in the United States.
Sukrutha Bhadouria: Yeah. I know a lot of people who unplugged the voice assistant systems they have at home because they feel so uncomfortable about the fact that they’re constantly listening, just waiting for those wake words in order to respond. In other words, they’re constantly listening. So the more we learn about the data that is being collected and stored, while it does make life a lot easier, it does get more scary and dangerous.
Rachel Jones: I think that’s interesting, thinking about who these security measures serve and who this data collection serves. Because this quote from Nicole is really about how Netflix protects itself. It’s interesting how much data they collect on their users in order to protect their own system. But how does having all of that user data actually put the users at risk?
Gretchen DeKnikker: Yeah, I think that perfectly summarizes the issue that we’re looking at right now: do we actually need to collect all of this information, or are we just getting it because we can? And at what point, when you’re not using it to improve the product, what are you going to use it for? For a lot of these companies, the answer is to sell more ads, right? So it shouldn’t be a surprise that they’re a company and they’re putting their economic interest ahead of an individual. That’s the part that I’m not sure how people get surprised about.
Sukrutha Bhadouria: Oh, when I was traveling last year, I noticed that every time I had to pay for a service, they would ask me for my phone number, and it occurred to me midway through, why do you need my phone number? And they said, oh, it’s just part of our process, but that doesn’t mean they should ask for it every single time. Why do you need this information from me?
Sukrutha Bhadouria: And a lot of times we just want to connect to that locked-down wifi so we can get stuff done, or look up the next place we want to go, and they ask us for information that we don’t need to share. Just like you said, Gretchen, I think it’s a case of gathering as much information as you can just because you can, and we aren’t questioning it as consumers when we should.
Rachel Jones: We aren’t always able to question as consumers what’s being collected. Even everything that Nicole references Netflix collecting, these are not things that we click a button to knowingly opt into. It’s stuff that they just automatically know. So do we have to read every piece of fine print in the terms agreement with a magnifying glass to be able to protect ourselves, or what can we expect to be able to do?
Gretchen DeKnikker: I think the practical thing is to think about what data this company would reasonably collect on me, right? And then keep an eye out for things that go past that. If you’re doing a Facebook OAuth to save time, you are absolutely, 100% giving away a tremendous amount of information that they don’t need. Whoever it is, they don’t need it. So it’s a little about understanding how those things work. OAuth in with Twitter, and they know practically nothing about you, right? If you have a choice, or just don’t. Use 1Password and a separate email account, that sort of stuff, because where people get really upset about it, you gave it away. So I think, you know, where we started was talking about how we as individuals have a responsibility to be a little bit less human, maybe. This last quote that we have coming up is awesome because she’s talking about security within your QA environment, which I think is probably a huge vulnerability for a lot of companies. So at our dinner with Palo Alto Networks, Meghana Dwarakanath spoke about her solution to this common vulnerability.
Meghana Dwarakanath: When it comes to production environments, we are very thoughtful about protecting them, and we should be, because they hold our customer data and our reputation, and they need that protection. By the time we come to our QA environment, it kind of tapers off. Right? Why? Because you’re thinking, it’s QA, we don’t have customer data in there. Hopefully. It’s an afterthought; we really don’t think about it. But if you really think about the challenges we have and the kind of products we are testing today, we need to think about why we need to secure QA environments. Because when somebody gets into your QA environment, there is a lot more they can get out of it apart from customer data. For example, they can get insight into your system internals. They can figure out how your systems and services are talking to each other, and you’re literally helping them make a blueprint to attack your production environment.
Meghana Dwarakanath: You have proprietary code, of course, running in your QA and staging environments, so there’s a potential loss of intellectual property there. That’s just your test environment. What is the other aspect of testing? Test automation, right? And anybody who is testing a SaaS service will tell you they test against production. Every time you release, you want to make sure that production is doing okay, that all the features are doing okay. So what do you do? You run your test automation against production, which means your test automation now has credentials that can access your production environment. It probably has privileged access, because you want to see better what you’re testing, and now you’re co-located next to customer data, which is potentially a very unsafe mix. So how do you do the security? One of the ways we have been able to do this successfully here is to consider test as yet another microservice that is running in your production.
Meghana Dwarakanath: Of all those production microservices that you deploy, test is just another one of them. How do your microservices store credentials? That is exactly how your test automation will store credentials, with the same SDLC process that Citlalli talked about, where security is not an afterthought. The same thing applies to your test automation code as well. You deploy monitoring for your test automation services just like you would for your production services. And then whatever deployment automation you have, your infrastructure automation code, your test deployment goes into that very same architecture, and now you have all the added protections that your production microservices are getting.
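A minimal sketch of the idea Meghana describes: the test automation service gets its credentials the same way any production microservice does, injected at deploy time rather than hardcoded in test code. The environment variable name, the endpoint, and the URL here are illustrative assumptions, not details from the talk.

```python
# Illustrative sketch: test automation treated as just another production microservice,
# reading an injected secret instead of embedding production credentials in test code.
import os
import urllib.request

def get_credential(name: str) -> str:
    """Read a secret injected at deploy time, exactly as a production service would."""
    value = os.environ.get(name)
    if value is None:
        raise RuntimeError(f"secret {name!r} was not injected; refusing to fall back to a hardcoded value")
    return value

def run_smoke_test(base_url: str) -> int:
    """Hit a health endpoint in production using a least-privilege, injected token."""
    token = get_credential("SMOKE_TEST_API_TOKEN")  # hypothetical variable name
    req = urllib.request.Request(f"{base_url}/health",
                                 headers={"Authorization": f"Bearer {token}"})
    with urllib.request.urlopen(req) as resp:
        return resp.status

if __name__ == "__main__":
    print(run_smoke_test("https://api.example.com"))  # placeholder URL
```

Because the same deployment, monitoring, and secret-management pipeline covers the test service, rotating or revoking those credentials works exactly as it does for any other production microservice.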
Gretchen DeKnikker: I’m sure there are a lot more controls in a larger company, but in a smaller company, this is 100% a vulnerability that most people aren’t even thinking about. It’s one of those things that goes a little off the rails when a company starts scaling and the systems haven’t been put in place to prevent that sort of thing. And like she was talking about, it’s a blueprint for your backend system, too. It seems like a really good entry point to think about it at the developer level: you’ve got developers building things and taking bits and pieces of code, and whenever you create a friction point for a developer, they’re going to create a workaround to make their job easier. So make sure the security that’s built into your QA isn’t creating friction that they are going to work around.
Rachel Jones: I think it’s an interesting example of the kind of blind spots that can exist with security. There are so many vulnerabilities that you have no idea you’re opening yourself up to, even through all of these different stages of the process. I know we talked about folding in security during development, but once you think that stuff is done and you’re just testing it and getting it ready to go, why would you even think about security as much at that point? How do you prepare for these kinds of blind spots?
Angie Chang: I guess that’s why companies like Palo Alto Networks exist, to be a leading provider of security.
Sukrutha Bhadouria: Yeah, and it’s a really important point not to keep your customers’ real information around. Even when you’re trying to test your system and replicate vulnerabilities, you don’t want to use that data; you want to do it with customer-like data. So that was really important to call out as well.
Angie Chang: Yeah, that would be super embarrassing if you got an email later, like, we’re sorry, we just sent you that by accident because we were doing some testing with your data.
Sukrutha Bhadouria: Yeah.
Rachel Jones: Does anyone have final thoughts on this topic of security?
Gretchen DeKnikker: Think about humans and humans will find the fastest path to anything, whether it’s in their own best interest or not.
Sukrutha Bhadouria: Think about security throughout your development life cycle. It gets harder to make adjustments along the way if you don’t pay attention at the start.
Angie Chang: At the Palo Alto Networks Girl Geek dinner, we learned about having a security first mindset versus security as an afterthought.
Rachel Jones: Anything else to say about that? I can’t just pop that in.
Angie Chang: So if this is interesting to you, you can check out Women in Security and Privacy, which is a 501(c)(3) group helping people get into the field of security engineering. OWASP also has a lot of resources, including its Top 10 list, and you can also check out conferences like the Diana Initiative.
Rachel Jones: I think it’s challenging and also really exciting to get to do an episode like this, going beyond general advice to something more specific, women talking about the cool stuff that they’re doing at their companies. I think that’s so much of what happens at these dinners, just women sharing: this is what I’m doing, it’s cool, and here’s why. So being able to put that on the podcast, even if it’s not as universally relevant a topic as something like mentorship, still really highlights what’s great about Girl Geek.
Angie Chang: Thanks for listening to this episode of the Girl Geek X podcast. We’ll be back soon with more advice from women in tech.
Rachel Jones: This podcast is produced by me, Rachel Jones, with event recording by Eric Brown and music by Diana Chow. To learn more about Girl Geek X or buy tickets to one of our dinners, visit girlgeek.io, where you can also find full transcripts and videos from all our events.
Angie Chang: This podcast was sponsored by Okta, the leading independent provider of identity for the enterprise. The Okta Identity Cloud enables organizations to both secure and manage their extended enterprise and transform their customers [inaudible 00:28:52]. This podcast is also sponsored by Netflix. Netflix has been leading the way for digital content since 1997 and is the world’s leading internet entertainment service. This podcast is also sponsored by Palo Alto Networks, a global cybersecurity leader known for always challenging the status quo in security.