Organizations tend to have a very predictable approach to security: reactive. Most don’t really care about security until something bad happens. But with data regulations and requirements becoming more rigid around the globe and cybercrime only continuing to escalate, the notion of security as an “add on” or an afterthought is quickly becoming outdated. Today, an organization without adequate security measures is like a skydiver without a parachute—sure, the parachute doesn’t guarantee a 100 percent safe experience, but wouldn’t you rather have one than go without?
A big problem, however, is that the modern approach to security remains inadequate. Most companies simply layer piecemeal measures over their existing infrastructure, not realizing that this isn’t an effective way to mitigate risk. Instead of adding on bits and pieces of security such as firewalls and antivirus programs, organizations should strive to incorporate security principles into the very foundation of their IT, in both the code and the culture.
This is what Microsoft sought to do with its SD3+C principles: Secure by Design, Secure by Default, Secure in Deployment, plus Communications. While all of these are designed to impose better security measures throughout the development process, the first two arguably provide the highest security benefits, because they prevent the introduction of vulnerabilities in the first place while minimizing software’s potential exposure, its so-called “attack surface.”
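“Secure by default” simply means the safe configuration is the one you get without asking. Here is a minimal sketch of the idea (my illustration, not Microsoft’s code): a TLS helper whose caller must explicitly opt out of certificate verification, so the path of least resistance is the protected one.

```python
# Illustrative "secure by default" sketch: protection is on unless the
# caller deliberately opts out of it.
import ssl

def make_client_context(allow_insecure: bool = False) -> ssl.SSLContext:
    """Return a TLS context; certificate checks stay on unless disabled on purpose."""
    context = ssl.create_default_context()  # verifies hostnames and certs by default
    if allow_insecure:
        # Loudly dangerous escape hatch, intended for test environments only.
        context.check_hostname = False
        context.verify_mode = ssl.CERT_NONE
    return context
```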
Safety and security by design have thus far achieved their greatest maturity in the automobile industry. Modern cars have more than 100 different safety and security features, including redundancies, and are rigorously tested for verified assurance. Crash tests destroy vehicles and take time to conduct, but they ultimately teach us invaluable lessons that make them entirely worth the investment.
Baking security measures into design and development is likewise the better approach when it comes to code, but it presents a problem: Developers, by nature, don’t want to take the time to build safety nets. A major criticism of SD3+C is that it slows down innovation by forcing developers to weigh more than just “make something really cool at lightning speed.”
This reaction isn’t surprising when you consider the culture of coding, including the process by which we teach new developers. Hacker culture has long carried a certain kind of allure, romanticizing the role of the developer as a creative, clever and adventurous rogue. It elevates those resourceful individuals who can overcome software challenges in the fastest, nimblest, slickest way, not necessarily the most secure.
It’s never been the standard to teach the morality, ethics or laws of coding; instead, we teach people how to code and let them take off running. But the modern digital age has required us to put the brakes on and rein in that spirit. It’s like teaching someone to drive a Ferrari at breakneck speeds but not about the rules of the road, and then later expecting them to steer within the lanes, stop at red lights and not exceed speed limits.
Bring Order to Code Chaos
Our culture of coding inherently perpetuates chaos; now we need to introduce some degree of order, and that is a tough challenge. We need a risk-based approach to development that brings order to the chaos without hindering innovation.
How do we begin to accomplish this? The first thing to consider is changing how you approach code in the first place by making it more robust. Typically, code can’t be easily changed once it’s written: a single line may be easy to manipulate, but at several million lines you risk severing the interdependencies that make the app work, and changing any one piece could break the entire application. Code in such a brittle state is far too resource-intensive to maintain.
The solution is to build that maturity in from the start. Make your code modular enough that it can be amended in part when necessary; then you can easily bake in additional security measures and eliminate dependence on firewalls and antivirus software as your first (or only) line of defense.
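As a deliberately simple sketch of what that modularity buys you (the names here are hypothetical), consider isolating input validation behind a single module boundary:

```python
# validation.py -- hypothetical module: the one place where untrusted
# input is checked before it reaches the rest of the application.
import re

_USERNAME_RE = re.compile(r"^[A-Za-z0-9_]{3,32}$")

def validate_username(raw: str) -> str:
    """Allow-list check: reject anything outside a strict pattern."""
    if not _USERNAME_RE.fullmatch(raw):
        raise ValueError(f"invalid username: {raw!r}")
    return raw
```

Every caller imports validate_username instead of rolling its own checks, so when a rule needs hardening, the fix lands in one file rather than rippling through millions of lines.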
Second, consider the privacy law and contractual requirements of your organization, and address them directly in your code from the start. Understand that different types of privacy data need to be handled differently and require different levels of protection across technical, physical and administrative dimensions. Whether you deal with payment card industry (PCI) data or protected health information (PHI), you should be intimately familiar with the standards governing the data your organization handles.
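One way to make those handling differences explicit in code (a sketch of the idea, with hypothetical names and no claim to satisfy any particular regulation) is to attach a sensitivity classification to each field, so redaction rules travel with the data:

```python
# Hypothetical sketch: classify fields at definition time so handling
# rules follow the data wherever it goes.
from dataclasses import dataclass, field, fields
from enum import Enum

class Sensitivity(Enum):
    PUBLIC = "public"
    PCI = "pci"  # payment card data
    PHI = "phi"  # protected health information

@dataclass
class PatientRecord:
    name: str = field(metadata={"sensitivity": Sensitivity.PHI})
    card_number: str = field(metadata={"sensitivity": Sensitivity.PCI})
    country: str = field(metadata={"sensitivity": Sensitivity.PUBLIC})

def redacted(record: PatientRecord) -> dict:
    """Mask regulated fields before the record is logged or exported."""
    out = {}
    for f in fields(record):
        value = getattr(record, f.name)
        if f.metadata.get("sensitivity") in (Sensitivity.PCI, Sensitivity.PHI):
            out[f.name] = "***"
        else:
            out[f.name] = value
    return out
```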
Now, what about culture? In its simplest essence, we can think of culture as composed of three basic factors: rules, tools and fears. Where rules exist, you should apply them; where tools exist, you should use them; and as an organization, you should be “afraid” of the same things, because common fears tend to be strong motivators for groups of people. Culture is retransmitted only through awareness, training and immersion.
An important point I invite you to consider is that human beings won’t do anything without a reason, whether real or imagined. Humans are experience-based learners, not knowledge-based learners, which is why we learn our most valuable lessons from mistakes and failures. An important aspect of this is timely communication: when something has gone wrong, we need to get the right people on task to plan mitigations, learn lessons and make things better.
One way to effectively change or reshape your culture is to enforce the rules. For instance, the new General Data Protection Regulation (GDPR) mandates severe fines and consequences for breaches of personally identifiable information (PII), much as penalties were imposed on the auto industry years ago.
Another is to provide better tools and processes that essentially “force” (or, more kindly, compel) your team to use the right approach or method.
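What does a tool that “compels” look like in practice? One small example (a hypothetical sketch, not a specific product) is a pre-commit check that refuses to let obvious secrets into the repository, making the secure path the default path:

```python
# Hypothetical pre-commit hook: block commits whose staged changes
# appear to contain credentials.
import re
import subprocess
import sys

SECRET_PATTERNS = [
    re.compile(r"AKIA[0-9A-Z]{16}"),  # AWS access key ID format
    re.compile(r"-----BEGIN (RSA|EC) PRIVATE KEY-----"),
]

def staged_diff() -> str:
    """Return the diff of everything staged for commit."""
    return subprocess.run(
        ["git", "diff", "--cached"], capture_output=True, text=True, check=True
    ).stdout

def main() -> int:
    diff = staged_diff()
    for pattern in SECRET_PATTERNS:
        if pattern.search(diff):
            print(f"Refusing commit: staged changes match {pattern.pattern}")
            return 1
    return 0

if __name__ == "__main__":
    sys.exit(main())
```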
Finally, you can employ fear, uncertainty and doubt, otherwise known as “FUD.” Though it may sound a bit sinister, FUD can be a very effective way to govern—although on its own, it’s a very tenuous and fragile kind of power. This is why you balance it with rules and tools, so that the overarching objective will resonate in both the hearts and minds of your team: Keep your organization happy and healthy.
This is the reality that modern digital organizations must face: It doesn’t matter how innovative or bleeding-edge you are unless your business stays healthy! A tree that is well-rooted and cultivated to grow upright will weather the worst storms and live a long, prosperous life. Remember that the auto industry once had to retrofit critical safety measures onto early cars like the Model T, and nobody today worries that hydraulic brakes, seat belts and padded dashboards slow us down.
About the Author / Robert Hawk
Robert Hawk is Resident Security Expert at xMatters. He has extensive experience in Information Systems Security, Computer Security, Cyber Security, Information Assurance, as well as Governance, Risk, and Compliance (GRC) Management. He specializes in frameworks and standards from ISO/IEC, NIST, IEEE, IETF, ITU-T, Common Criteria, AMI-SEC, NERC, CIS, DoD, ANSI, PCI, and ISECOM. Robert is a lifelong researcher, innovator and instructor. Connect with him on LinkedIn.