In this second episode of our DevOps Unbound streaming broadcast on TechStrong TV and DevOps.com’s sister site Digital Anarchist, Mitchell Ashley of ASG and Alan Shimel are joined by Caroline Wong, CSO at Cobalt.io; Andrew Van Der Stock, executive director at OWASP; and Dr. Grigori Melnik, CPO at Tricentis, to discuss DevSecOps and application security.
The video is immediately below, followed by the transcript of our conversation. Enjoy!
Transcript
Alan Shimel: Hey, everyone. This is Alan Shimel, and you’re watching DevOps Unbound. This is episode two of DevOps Unbound, and on our show, which airs biweekly with a monthly roundtable open to the public, we explore topics in DevOps. You know, if it’s DevOps-related, it’s fair game. Today’s topic is DevSecOps, and I realize that DevSecOps has become sort of a mouthful, intrinsically linked into AppSec now and everything we do around that.
We’re gonna explore today, really, security—whose job is it, anyway? Right? There are a lot more developers, DevOps engineers and testers than there are security people, and we increasingly rely on them to make sure our applications are secure—that they’re being developed, released and maintained as securely as possible. And we’ve got a great panel of experts to talk about this today.
I want to quickly mention, though, this is episode two; episode one aired two weeks ago, and we will be doing our monthly roundtable, so stay tuned on that.
But let me now introduce you to our panel. Joining me first is Founder and CEO of Accelerated Strategies Group, my partner and friend, Mitch Ashley. Mitchell, welcome.
Mitchell Ashley: Thank you, Alan. Great to be on with you and with this great panel.
Shimel: Yep. The rest of the panel, I’m gonna just say their name and who they work for, and if you can give a little bit of your background. Let’s first start with Andrew van der Stock. Andrew, welcome, and if you can introduce yourself.
Andrew van der Stock: Good day. My name is Andrew van der Stock, I’m the Executive Director of OWASP. I was recently appointed to that role, and I’ve been doing application security since 1998. When we first talked about application security, it was called mobile code security, because, well, it was applets and ActiveX controls that were running in browsers. So, it was a rather exciting and very different time.
Shimel: That took me back—yes, it did. Next, the one and only Caroline Wong. Caroline, you want to introduce yourself?
Caroline Wong: Hi, Alan. Thank you so much for having me on today. My name is Caroline Wong, I am the Chief Strategy Officer at Cobalt.io. We’re a company that builds security software. I’ve been in infosec now for 15 years, starting out at eBay and Zynga, some of the original DevOps companies, if you will. I also do a podcast called Humans of Infosec, and I’m a LinkedIn Learning instructor for the OWASP Top 10. Almost 10 years ago now, I published a book called Security Metrics: A Beginner’s Guide.
Shimel: Very cool. And a fantastic speaker, too, I may add. I love watching Caroline present. And then last but not least, a holdover from our first episode of DevOps Unbound, Dr. Grigori Melnik, Chief Product Officer at Tricentis. Grigori, welcome.
Grigori Melnik: Thank you, Alan. Thanks for having me back, and hello, everybody, wherever you are in the world. Yes, as Alan said, I’m the Chief Product Officer of Tricentis, the continuous testing platform and company that’s building the testing tooling for the future. Previously, I’ve been at Splunk—this is where, actually, a lot of my security mindset was forged—and also MongoDB and Microsoft. So, good to be here with this panel.
Shimel: Excellent. So, let me start things off and then we’ll be off and running. Andrew, as the Executive Director at OWASP, right, there was a time where, clearly, the OWASP member—the user, the person you existed for—was the security professional who was in charge of, you know, as you called it, mobile code security. And then, of course, appsec became all the rage, probably for at least the last 12 years, right? A lot of people say if it’s not appsec, it don’t mean a thing, when it comes to security.
How has the rise of DevOps, DevSecOps as some call it, some not, how has that kinda changed who you serve, or who does—not you, personally, obviously, but OWASP—who do you guys cater to?
van der Stock: So, that’s actually really interesting. I had a conversation with Mark Curphey, the Founder of OWASP, a little while ago as I was appointed. And the original thing the OWASP folks worked on was the developer guide—it’s a developer guide for developers. And the reality is that we missed. We stopped talking to developers. We became very, very internally focused, and we’ve converted the vast majority of the infosec folks to the fact that apps are the firewall. I’ve done almost all my pen tests through firewalls and over SSL. So, the reality is that the application has always been the firewall. We should have been talking to developers much earlier than this.
My personal opinion is that the thing OWASP has to do right now is refocus on talking to developers and having tools and information they can actually use in their toolchain. If we’re constantly just having that internal echo chamber conversation, we will fail. And it’s really important to me that we actually start doing more stuff like OWASP Dependency-Check, which can be pipelined—GitHub Actions and things like that. These are where we need to start focusing, not on PDFs. As the person who does the OWASP Top 10 and the developer guide and the ASVS, PDFs are probably what we’re famous for, but the reality is, developers don’t know of them, they don’t use them.
I really want us to get involved in Stack Exchange. The number of people who post to Stack Exchange just to copy and paste stuff—and that information is just wrong? It’d be really great for people like the volunteers and the chapters and the communities to get involved in things like Stack Exchange and other places where developers hang out.
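To make the pipeline point concrete for readers: the idea behind a dependency check is simply to fail the build when a pinned dependency matches a known-vulnerable version. The sketch below is not OWASP Dependency-Check itself—that tool matches dependencies against CVE databases—but a minimal, hypothetical Python illustration of the same idea; the package names and versions are made up.

```python
# Toy illustration of the dependency-check idea. Real tools such as
# OWASP Dependency-Check match a project's dependencies against CVE
# databases; here the "vulnerable" list is hard-coded and hypothetical.
KNOWN_VULNERABLE = {
    ("examplelib", "1.2.0"),  # made-up package and version
    ("otherlib", "0.9.1"),
}

def check_dependencies(pinned):
    """pinned: {package: version}. Returns flagged 'pkg==ver' strings."""
    return sorted(
        f"{pkg}=={ver}"
        for pkg, ver in pinned.items()
        if (pkg, ver) in KNOWN_VULNERABLE
    )

project = {"examplelib": "1.2.0", "requests": "2.31.0"}
flagged = check_dependencies(project)
if flagged:
    # In CI, this branch would exit non-zero and fail the pipeline
    print("Vulnerable dependencies found:", flagged)
```

In a real pipeline, a non-empty result would translate into a non-zero exit code that fails the CI job—security feedback delivered in the toolchain rather than in a PDF.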
Shimel: Excellent. I didn’t realize that the original focus was developers. Caroline, you’re in a funny place because you’re Chief Strategy Officer at a security tools company and of course, as you mentioned, you’re a LinkedIn—not professor, LinkedIn teacher or whatever the exact term is—around appsec and OWASP. How does that, you know, what Andrew said, how does that jibe with where you guys see the world?
Wong: Yeah, so, you know, when I started my infosec career at eBay and Zynga, in both cases, these were companies that were running online operations 24 by 7, with millions of simultaneous users daily. And so, it was always clear to us that we had to get it right when it came to talking with the developer.
A couple of years before I began my career in infosec, in 2003, the first version of the OWASP Top 10 was released. And right now, I wanna say we’re on version 6 or so. Andrew has been spearheading this project for a long time, and we’re actually looking at the next revision coming out in February 2021. If I look at the most recent version, which is 2017 RC2, and I compare that to the original OWASP Top 10 that came out in 2003, one of the things that’s very striking to me is that at least half of the stuff is the same.
So, that says to me that, on some level, some of these basic security fundamentals that appsec practitioners understand are somehow not being implemented by developers—and where is that disconnect? So, I actually think that, as an industry, appsec tends to focus on sort of the hot and sexy things, which is, like, finding cool stuff and really interesting, sophisticated breaks.
But I think that, if you were to look at, for example, the past year or so, you know, if you look at the 10 largest data breaches, I’d be willing to bet you that the majority of those occurred because of something that had to do with, like, a Security 101, basic kind of thing gone wrong. And that tells me that we’ve been really focused on technology, and actually, the hard part is getting the people and the process to work out together.
So, that’s something that I think a lot about, and I think it really speaks to the core of what we’re addressing today, which is—whose job is it? You know, a security person can find all the vulnerabilities that he or she wants. Unless that security person is effective at communicating what the security vulnerability is to a developer who can actually fix it, and convincing that individual that it’s enough of a priority to actually spend time remediating, the code will never be more secure.
So, it’s always needed to be a partnership, and I think that, you know, these days, because of things like the transition of software development methodologies from waterfall to agile to DevOps, I think that culturally, we’re beginning to see some changes.
van der Stock: I just want to highlight one of those choices that you said.
Shimel: Oh, I’m sorry—go ahead, Andrew.
van der Stock: Realistically, the thing that I see as the OWASP Top 10 leader is, it’s self-referential. The things that people pay attention to are the things in the OWASP Top 10. And, in fact, the MITRE Group are about to release the CWE Top 25 in a few weeks’ time, and it’ll look like the last one from 2011. And this is really sad, because people are actually concentrating on these.
What we really need is for developers and security people to work hand in hand, and I would like to see the security folks stop delivering .pdfs and start delivering, you know, pull requests. If you can fix the problem, you don’t have to convince anyone. If you actually start talking in developers’ terms and actually convince the product managers that this is the right thing to do, we’re gonna be a lot more successful than just saying, “Look, your baby is very ugly.”
Ashley: Andrew, bless your heart—bless your heart. [Laughter] That’s something I’ve advocated for. It’s not that, necessarily, security people have to become expert software developers, but I think we’ve sort of done this to ourselves by thinking of security as a penetration activity, right, something that we’re testing from the outside. I mean, now we’re talking about very porous software architectures and API security and Kubernetes container security. It’s a much different world than even 5 or 10 years ago.
So, what I’ve advocated for—and it’s a little bit scary at times—is for security people to learn software architecture, whether they learn how to code or not. But they need to understand what’s going on inside the M&M as well as the outer hard shell, if you will. And, you know, one way to do that is to bring software people into security groups. You don’t have to go learn it all yourself; you can learn from other people and bring in skills, as well.
But to your point, you know, it’s not an after-the-fact thing, it’s gotta be right there in the flow and the process. If it gets fixed as soon as you find it because it’s a priority—great. If it’s a day, a week, a month later—forget about it; it goes into the prioritization process and it may be months or quarters before it gets fixed.
Shimel: Grigori, we’re all looking at you, man.
Ashley: [Laughter]
Melnik: Yeah, I was ready to jump in to talk to Caroline.
Shimel: Go ahead.
Melnik: Because what she said absolutely resonated with me, and then what my other esteemed colleagues just added to it, you know, rings true. Because, you know, I always looked at security—and again, for a very, very long time, I was focusing predominantly on the developer audience, the ________ engineer and then the development teams, platforms, tools, and all of that. And, you know, designing for security is something that’s been a focus of the teams that I was part of. Starting from, you know, as early as back at the beginning of the century at Microsoft—if you remember the early days, having the disastrous years 2000, 2001 with lots of worms and breaches and —
Ashley: [Cross talk] virus —
Melnik: And then, you know, the then-CTO, Craig Mundie, had the internal initiative that later turned into the actual company-wide letter from Bill Gates, Bill G. himself. And that was the beginning of what’s known as the Trustworthy Computing initiative, which was company-wide. Again, we’re talking about tens and tens of thousands of people being trained regularly—not just once, but on a regular basis—on the fundamentals of security, on designing with security. The whole life cycle—we didn’t even talk about SDLC, software development life cycle; we talked about the secure software development life cycle and all the practices that came from it.
And it actually bore fruit, right? Because in general, you look at the history and how much more secure, you know, Windows and Microsoft have become—that’s actually, you know, the result of those early initiatives, right? So, by the same token, I think that when we’re talking about security and whose job is it anyway, we need to start from the ground up. And maybe even before the junior developer, maybe even at school, right? If you think about software engineering schools, boot camps, whatever format you are learning to code in, it’s super important to be introduced to those secure practices, to be aware of the patterns and anti-patterns, right?
Like, I think ________ and the ACM, the official ones that define the curriculum, have included, for years, the topic of software security in there, but realistically, how many computer science graduates have actually dived somewhat deeply into it, right? It’s amazing that it’s now been, what, two, three decades that we’ve talked about, you know, design-for-security notions and trustworthy computing initiatives, and we still see what Caroline said about the Top 10 breaches—you still see something as basic as a SQL injection attack, right? Cross-site scripting—all that stuff that is almost, you know, I view it as part of developer hygiene, right? It’s just something that you do fundamentally.
It’s like, these days, we don’t even have this conversation about whether developers should write their own tests or not. Well, by the same token, there should be no conversation about whether developers should think about threat modeling, or think about, you know, various mitigation strategies. Should they get involved more with the security specialists when it comes to defining or going through the release gate? And yet—and yet, when you look around, those secure practices still haven’t made it into, you know, the mainstream as much as I would’ve liked.
So, I don’t know what you guys think, if there are some other ways to—it just feels like people start looking at it only when the breach occurs and then you’re facing a massive litigation, right? It’s like—but it’s too late, right? We need to think about all this ahead of time. And, of course, everything that OWASP is doing, and other organizations advocating for security, promoting the secure mindset, this whole movement of DevSecOps—I absolutely welcome it. Because anything that we can do to make developers and everybody on the team think about security early on as opposed to as an afterthought is a good thing.
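For readers who want Melnik’s “developer hygiene” point made concrete: the SQL injection he mentions, and its standard fix, fit in a few lines. This is a minimal, self-contained Python sketch using an in-memory SQLite database; the table, column, and function names are illustrative.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin')")

def find_user_vulnerable(name):
    # BAD: attacker-controlled input is spliced into the SQL text
    query = "SELECT role FROM users WHERE name = '%s'" % name
    return conn.execute(query).fetchall()

def find_user_safe(name):
    # GOOD: the driver binds the value, so input can never change the query
    return conn.execute(
        "SELECT role FROM users WHERE name = ?", (name,)
    ).fetchall()

payload = "' OR '1'='1"
print(find_user_vulnerable(payload))  # every row comes back: injection worked
print(find_user_safe(payload))        # empty: the payload is just a string
```

The parameterized version costs nothing extra to write, which is exactly why this belongs in the “hygiene” category rather than the “sophisticated break” category.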
Ashley: I have a question for the panel. You know, these vulnerabilities have been around, as you mentioned, Cynthia and Andrew—
Shimel: Caroline, excuse me.
Ashley: – Caroline, sorry—forever. And you know, it’s not like we don’t know what they are. You know, we have really smart IDEs that will bring up the objects and what all the methods and parameters are. Isn’t there something we can do, either through AI or just on the tool side, so when we’re creating software we have fewer of those basic vulnerabilities, is that—
Shimel: Spoken like a real security person! I think that sounds too magic bullet to me, but Andrew, go ahead.
van der Stock: I want to give Caroline some time to answer this, because there are actually some things I wanna say about education underneath, from Grigori’s talk before. But fundamentally, we need to take the OWASP Top 10 to frameworks and say, “These are the things you must not have—end of story, done. You must not have these.” And then the developers don’t have to think about those other things, and then they can use the developer tools like OWASP Cornucopia, which is like the Elevation of Privilege game that Microsoft came up with.
Do you know what? The reason why we do this again and again and again is, we allow people to use insecure frameworks and they’re not given the guidance they need. If we had engineering students who didn’t know about, you know, wind loads and things like that, we’d never have a bridge that stood up. Yet, we allow our computer science and software engineering students to graduate without taking one single semester of security.
Shimel: Amen.
van der Stock: That’s why we established an education committee before, and we have to get people to actually do it. The ACM syllabus is okay, but it is focused on crypto, and we don’t need a lot of crypto folks, we need a lot of people who know how to engineer security properly. Now, I want to hand it over to Caroline.
Wong: Thank you, Andrew. And, you know, Mitch, I love your question, because I think that one of the things that security leaders must think a lot about is what combination of machines and humans to bring into their application security programs.
So, anyone who’s only using people is clearly missing out on efficiencies that can be found in machines or implemented, as Andrew was saying, with patterns and pre-written things that people should just be using—HSTS, for example. And anyone that’s only using machines is, on the other side of the coin, missing out on entire classes of vulnerabilities, you know?
So, I think that, ideally, you know, security professionals are working with development teams to use scanners in order to find low-hanging fruit and then using that information to provide context for analyzing the risk that’s posed by those issues. I think it’s absolutely a hand-in-hand sort of thing, even if one of those hands is a computer hand. And, you know, the more that we can do via automation and by simply using secure libraries and secure frameworks, that allows the sort of precious and often more expensive human technical skills to be focused on things that can’t be found by machines. Finding things like business logic bypasses, race conditions, chained exploits—those are the kinds of things that people should really be focused on. You know, not things that can be solved “simply”—in quotes, because getting folks to actually do it turns out to be a whole other challenge in and of itself—but I really do think it’s very much the mix of both.
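Wong’s “secure libraries and frameworks” point has an everyday shape worth showing: letting a library do output encoding instead of doing it by hand. A minimal Python sketch with the standard library’s html.escape (modern template engines such as Jinja2 apply this kind of escaping automatically); the function name is illustrative.

```python
import html

def render_comment(user_input):
    # html.escape turns <, >, & and quotes into entities, so user input
    # renders as text instead of executing as markup or script
    return "<p>" + html.escape(user_input) + "</p>"

payload = '<script>alert("xss")</script>'
print(render_comment(payload))
# -> <p>&lt;script&gt;alert(&quot;xss&quot;)&lt;/script&gt;</p>
```

Because the escaping is mechanical, it is exactly the kind of thing a framework should own—freeing the expensive human attention for business logic flaws.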
Shimel: Agreed.
Ashley: I love the framework idea, too, Andrew, and thank you, Caroline—sorry to mispronounce your name earlier. [Laughter]
Wong: Just think of “Sweet Caroline” like Neil Diamond and you’ll never forget it.
Ashley: There you go. What a wonderful way to remember your name. [Laughter] You know, I wonder, too, if you layer on top of putting it into the framework—security people love standards, like a NIST framework or something like that. If there was something that said, “Okay, all of our frameworks must meet this,” right, then that’s gonna get into the process and the psyche of the organization to say, “Okay, we’re only using software that meets blah,” whatever that is. It seems like that’s one more element that would help with the adoption and standardization of that. Is it practical for something like that to happen?
van der Stock: So, I’ve really been pushing for the idea of the generally accepted security principles, the equivalent to GAAP in the accounting world. There are some fundamentals that we should be trying to do.
The very, very, very, very first NIST special publication had the OWASP Top 10 in it. It was written in 1976.
Ashley: Wow.
van der Stock: It was classified.
Shimel: Really …
van der Stock: When it was—yeah. When it was declassified and I found it, I found it as I was preparing the OWASP Top 10 2017. The only thing that we don’t have in the OWASP Top 10 is race conditions. Those guys had race conditions in the first one in 1976, when there were just mainframes and they only had one processor and they did things in batch. And quite frankly, for them to think that far forward just shows how early those folks were thinking. However, I think we need to make sure that we give the developers time to think about the desire for security.
I just want to touch back on what Grigori said before, and I think this is where strategy comes into it. When Microsoft—when Bill Gates wrote that memo, they then followed it up with action—executive action. They reset the Longhorn build. Windows Server 2003 Release 2 is a very different beast under the hood than the original 2003. And what was released as Vista was very, very different from what XP was, and that then created Windows 7.
So, quite frankly—yes, we need support from the top, and we need people at the bottom to understand what this really means. But the fact that they had executive support means that something like SQL Server just doesn’t have remotely exploitable bugs today. You know, that is a really good outcome, and it’s because of executive support.
Shimel: Agreed, agreed. So, you know, we live in a world right now where, quite frankly, without modern applications that we’ve been surviving on these past months with COVID and everything, imagine how ugly it would be without them. But do we really—do we really have that executive support, maybe not to the level of Bill G. and Microsoft and the Trustworthy initiative. But Caroline, at Cobalt, you’re dealing with dozens of organizations, right? Andrew, you know, at OWASP, hundreds of organizations. Grigori, Tricentis has how many customers that you guys are providing—hundreds as well.
Do you really see that top-down support at most organizations, or was that Microsoft thing sort of catching a firefly in a jar, lightning in a bottle, right? We just don’t see that, unfortunately.
Wong: So, I’ll tell you what. A lot of times, when I think about security, I kind of go back to, like, the core principle that the only reason security matters is when you have value to protect. And these days, naturally, as a lot of value moves from the physical to the digital world, cybersecurity becomes more important.
I’d actually really like to challenge something that Mitch and Andrew put out with regards to sort of an all-encompassing security framework that says everyone’s gotta do this. Having worked on the ISO 27034 standard, I’ll actually ask if anyone on this panel has even heard of the ISO 27034 standard. Cool, so that’s great, Alan. [Laughter]
van der Stock: So, I actually tried to provide advice on that when it was coming out, and I couldn’t, because I wasn’t part of the national organization—and that’s the reason it failed.
Wong: You know, there are a lot of different things, right, and I think that the standards route is one that we’ve been trying. And I think that sometimes it works better than others. What’s really gonna work is the market route.
So, when Bill Gates wrote that memo, my understanding is that Bill did not do it for any sort of noble pursuit. He did it because he recognized that if Microsoft was going to continue building security vulnerabilities into their products, that they were gonna lose customers, that they were gonna lose market share.
And so, what’s kind of cool that I see happening when I talk to our customers—like VeriFone and MuleSoft and health care organizations and HelpSpot and financial organizations—is that, due to the proliferation of cloud and SaaS, today it’s not like you have a CIO who goes and plays golf and has steak dinners with their one big IT software vendor. You know, now, anyone in an organization with a credit card and an Internet connection can buy something, and it becomes shadow IT.
So, the whole concept behind these business relationships of trust has been transformed. And what you see is tons of software companies using tons of other software companies. So, every software company is also a buyer of other software products and services, and that business transaction requires a level of trust—where I’m seeing customers asking for things like a penetration test in order to prove that there’s some adequate level of security before a business engagement begins.
So, my perspective is actually that frameworks are cool and great, education is cool and great. And I think it’s really gotta be the market that drives the power and the adoption and the prioritization of security.
Ashley: You know, Caroline, there was also—thinking back to the Microsoft example Andrew was talking about—I think an important factor in that was Apple, because they were marketing very heavily as a secure —
Shimel: That was it, yeah. Microsoft was afraid of Apple.
Ashley: So, they have competition, you know? And I hate to use the Vista word, but you know, there was some pretty crappy software that had a lot of security vulnerabilities in Microsoft’s code, and that’s—it was the market, as you said, but it was also competition. Because that’s when Apple was doing pretty well or at least rising up in the laptop market.
Maybe there’s some group of vendors, companies that band together and say, “We’ve got better software because it’s more secure” and that now becomes more attractive. I don’t know, something like that might happen, too.
van der Stock: So, we just got involved with the—sorry, just really quickly. We just got involved with the Linux Foundation, and we just established the Open Source Security Foundation, the OpenSSF. And it’s early days yet, but I see that as actually being way more successful than, for example, ISO 27000. Not 27034, which has got very low adoption, but 27000 bakes in a risk-based approach, which says the business only needs to do that which it thinks is right and which can’t really hurt it. And unfortunately, that then becomes whack-a-mole, whereas the OpenSSF works the other way around. I think, you know, with the participating members—GitHub, Google, IBM, Red Hat, and others, and obviously OWASP—we’re gonna be able to do a lot more there than in a traditional standard-setting body.
Shimel: I wasn’t aware of that organization, Andrew. That sounds fantastic. We’re gonna look into it here and see what we can do to help get behind that—really, really good stuff.
Grigori, I know you wanted to talk and I’m gonna give you a chance before I jump in.
Melnik: Oh, no, I was just gonna kind of sum up—so, we talked about the importance of frameworks, we talked about the importance of market forces driving these secure initiatives, and now the new organizations that have been put in place. But I am still convinced that it’s the developer’s mindset that needs to shift the most in order for us to be, you know, winning in this space.
So, doing whatever it takes to have them—have us, the developers—take responsibility for security. It seems like somebody else’s job—“Oh, I have this ProdSec group that, you know, looks at the metrics and the static scans and then occasionally does some kind of penetration test and exercises or whatever, and then sends me a bunch of reports, many of which I’ll try to justify as by design.” No. It’s that, as I’m doing my job, day in and day out, I should be thinking about security, thinking about, “Okay, what are the different ways it can be breached?”—and again, making it part of my daily routine, I think, is important.
I can tell you, internally at Tricentis, I’ve been in the process of launching this Trustworthy Computing Initiative, because we have opportunity to improve in this space, big time. But I can tell you, even with some easy things—where, you know, we do things like hackathons, right? Not everybody participates in the hackathon, right? So, what we did at MongoDB, I actually loved it, and that actually did require a little bit of effort: we had our Product Security person put together this type of competition, kind of a capture-the-flag, where you would actually have this increasing set of security challenges where you would have to go and break in or identify all these different holes and then, of course, reflect on that.
And actually, that raised awareness among many, many developers. So, I’m just thinking about what we can do with, you know, the small wins, the small initiatives—not necessarily at the level of defining the major standards and pulling off the big initiatives, but these kinds of baby steps to move the needle, I think, are important. The other point —
Shimel: I got—oh, I’m sorry, go ahead, Grigori. I was just trying to get to my next place in line, but you go.
Melnik: Ah, okay. No, the other point I was going to make is that I also think that, for the future of the Internet and all of our systems becoming safer and more secure, the other dynamic or the other initiative that I’ve seen happening and getting more and more popular is this whole collaborative testing or collaborative security, you know, that companies like HackerOne are doing by engaging the white hat hackers in kind of—
Shimel: [Cross talk] and stuff.
Melnik: – inviting them. In fact, inviting them to go and launch the attack, but for the purpose of learning and raising awareness. And even, you know, if you look at the Department of Defense and the Pentagon inviting hackers to hack into the Pentagon systems—I mean, if you think about it, 20 years ago, that was unheard of, right? But what they’re doing there is making sure that you get all these diverse perspectives, that you’re not only looking through the narrow angle of your security department, but have this broader community at large participating. So, I think that there’s some hope at the end of that tunnel—with the right level of engagement with those collaborative testers, white hat testers, the overall security posture of the Internet, of the world, will be improved.
Shimel: I think bug bounties have certainly been a game changer. I don’t think anyone disputes that. But I’m gonna tell you something I’ve learned in my 20-plus years in security. And that is: security, as much as we may want to think otherwise, is not a different animal than many other disciplines within IT and in business in general—and market forces rule, right? So, security becomes important when the business decides security is important, right?
I remember, you know, having a conversation—I did a podcast with the CEOs of MongoDB and Couchbase and Rich Mogull, who I think most of us on the panel probably know. And this is 10, 12 years ago, on Network World, I was writing, and I asked those two CEOs, I said, “Does NoSQL stand for no security? Because we don’t see a lot of security in there”—this was a long time ago. And they both agreed; they said, “Alan, we will build more security into our products when our customers demand it. Right now, customers are more enthralled with having this new type of non-relational database—NoSQL, you know—that allows us to be more flexible, faster, scalable and all that. We’ll get to security.”
A couple months later, MongoDB had a pretty decent breach. Microsoft is a great example. Apple was starting to kick their butt, and the word on the street was, even though Microsoft gave you free AV, maybe, you didn’t need AV on Apple. Remember those commercials? “I don’t need AV, I run a Mac.”
And so, it’s over and over. When we make security important—you know, Grigori, you guys are the leading continuous testing provider in the world. When your customers start saying, “Grigori, we’ve got to test for security as well, because that’s part of our quality, right? That’s the Q in QA, and we need Tricentis to give us those tests”—and you go to companies like a Cobalt, or you go to the organizations that Andrew’s talking about, and start building that in there, because your customers demand it. Not because it’s the right thing to do, not because it’s a nice thing to do—because it’s demanded. Then you’ll see how quickly developers take security seriously. It’s the market.
Melnik: And it is happening, Alan, it is happening. Customers absolutely are demanding this. They want to see the evidence and they want to have the tools: everything from security testing and static analysis, but a lot smarter, augmented with ML, to dynamic analysis and actor-based models and penetration testing, all the way to compliance and a lot of other things.
And you’re absolutely right. Back in the days of MongoDB, the initial releases were not very secure because they were optimizing for the wrong thing. And this is what I fear was ________. They were optimizing for adoption: get me as many users as possible. How do you get as many users as possible? You lower the entry barrier. Now, if you lower the entry barrier, that automatically means you’re sacrificing some security, right? Just like opening up the ports beyond localhost with MongoDB in the old days, that’s what had happened. Now, all of that has been tightened. You have security at rest, you have security in transit, they’re doing all kinds of phenomenal stuff with encryption, field-level encryption, all of it, but it took like a decade to get there, right?
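As an illustration of the kind of tightening Grigori describes, a minimal MongoDB hardening sketch might look like the following `mongod.conf` fragment. This is an editorial illustration, not something discussed on the show; the file path is a placeholder, and field-level and at-rest encryption are configured separately through the drivers and storage engine.

```yaml
# mongod.conf -- illustrative hardening sketch (not from the conversation)
net:
  bindIp: 127.0.0.1            # listen on localhost only, not 0.0.0.0
  port: 27017
  tls:
    mode: requireTLS           # encryption in transit for all connections
    certificateKeyFile: /etc/ssl/mongodb.pem   # placeholder path
security:
  authorization: enabled       # require authentication, unlike old defaults
```

Early MongoDB releases shipped with no authentication and, for a time, listened on all interfaces by default, which is exactly the "lower the entry barrier" trade-off being described.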
When you are at the beginning, you’re putting something out there and trying to win that race to hit the magic number of MAUs, the monthly active users. A lot of companies and a lot of startups, unfortunately, tend to sacrifice security for that speed to market and user acquisition, and then it bites them later, right? So, again, if there is some way to raise awareness and to think about it early on, so that you don’t end up in those unpleasant situations later, I think that’s something that could change the world.
Shimel: Agreed. Panel, we’re getting near the end of our time, and we have such shy wallflowers here, no one wants to talk. [Laughter] So, I know how long it’s gonna take us to wrap up. I’m gonna give you each a closing, if you will. We’re practicing for the debates, here. We’re gonna give each of you a chance to leave the audience with the most important thing you want them to leave with. And Andrew, also—is it called the Open Source Security Foundation, OSSF?
van der Stock: Open SSF, yeah.
Shimel: Open SSF?
van der Stock: So, the Linux Foundation—yeah, the Linux Foundation announced it.
Shimel: Is there a website yet for that?
van der Stock: Go to the Linux Foundation first, and then you’ll find it. It’s one of their initiatives. I don’t know that there’s a special site for it.
Shimel: Alright, love to hear more about that. But Mitchell, do you wanna start off and we’ll—your screen’s different than mine, but on my screen, it goes Mitchell, Grigori, Andrew, and we’ll let Caroline bring it home.
Ashley: Okay, great. I’ll make mine brief. I’m actually very optimistic, and I mean this with all sincerity, listening to people like Caroline and Grigori and Andrew. They’re some pretty brilliant people thinking about this, and all of you brought different but also shared perspectives. So, I’m hopeful that we’re gonna make some progress here in the near future, so thank you for working on this problem. That’s what I have to offer.
Shimel: Thanks, Mitchell. Grigori?
Melnik: So, I wish I had actually worn a different shirt today. I have a T-shirt from Splunk and, you know, they have some clever slogans, and one of them says, “Ignorance is not bliss.” And that’s my message to the whole developer community: ignorance is not bliss. Learn security; get the security mindset embedded from the get-go, from as early as possible. Engineering managers, think about it accordingly and embed security into your teams early on.
And what Andrew said earlier, I absolutely sign up for, and I love this whole notion on the side of the security specialists, the security experts: when you’re coming in, don’t send me PDF reports of, you know, 275 pages and 722 vulnerabilities. Pull requests for the win. Pull requests win today; pull requests really, really work. Everything else is just talk. So, again, if security specialists can get more into that way of working and be more embedded and more attuned to what developers want and need, I think jointly we will definitely make a difference, so—thanks.
Shimel: Thank you, Grigori. Andrew?
van der Stock: So, many, many years ago, we wrote the OWASP Developer Guide and then I think the OWASP Top 10, and I realized telling people what not to do was exactly the wrong approach. We need to tell people how to do what they want to do better to enable secure business.
So, the Application Security Verification Standard, which I work on, is a set of tests. It’s literally written for developers, and it’s literally written to be testable. And this, I think, is really important: we need to change the mindset from a negative “you shouldn’t do that” to a positive “how about doing it this way?” Because sometimes the developers have a better idea altogether and we just don’t know. But many security folks have the mindset of “I know better than you,” when they’re not software developers; they don’t know.
We need to have a conversation together, and I want to make sure that we’re bringing developers into the conversation and not treating them as, you know, people who don’t know that much. We don’t know that much, either, and I would like people to have that bidirectional conversation in a positive, not negative, fashion.
Shimel: Fantastic. Caroline—it’s all yours.
Wong: Alright, so, the thing that I’d like to leave folks with today is an invitation to check out the 2020 State of Pen Testing Report, which I co-authored with my colleague, Vanessa Sauter. One of the topics that we dive into pretty deeply in this report is the relative capability of machines and humans to find different kinds of web app security vulnerabilities. It draws upon more than 2,500 penetration tests, the data from which exists in the Cobalt platform—and Andrew, I did connect with your colleague, Eric, so we’re gonna be donating some data to the next OWASP Top 10 project.
But in addition to that, we also surveyed more than 100 practitioners in security, development, operations, and product across a wide range of industries. And the really good news that I’d like to leave folks with today is that the majority of respondents, in fact, 78%, reported a strong relationship between the security and engineering teams, and we expect that to grow in the future.
Shimel: I sincerely hope so. Guys, you know, someone much smarter than me once told me that when you get into a meeting, you know you’ve got a really great group when everyone there is smarter than you. [Laughter] So, thank you, all. What amazing brain power we had on this one.
I want to thank you all for joining and contributing. We will try to put links in the notes for this one to a lot of the sites that were mentioned, the reports that were mentioned, et cetera. We’d love to have you back on; maybe at some point in the next month or two, we’ll do a roundtable on this topic and add another person or two. Caroline, I know I had spoken to Chenxi Wang about coming on, because I like listening to Chenxi talk about security. Anyway, we’ll open it up to the public for questions and it’ll be a great time.
But for now, guys, thank you so much. Andrew, best of luck with the new ED position there, and I know they picked the right man for this job, so make us proud, keep making us proud. Dr. Grigori, we’ll see you soon, hopefully on another DevOps Unbound. Okay, and Mitchell, as always—thanks so much. We’ll be in touch.
This is Alan Shimel. Many thanks to Tricentis for actually sponsoring DevOps Unbound, couldn’t do it without their sponsorship and help, and we’ll see you in two weeks with another episode of DevOps Unbound. Have a great day, everyone.