In “Building Security into Code + Culture,” I described how our culture of coding is predicated on chaos, and the critical challenge for organizations is to establish the right mix of processes and tools that support a smarter, risk-based approach to development. A key aspect of this strategy is, of course, to implement a sense of order within the chaos that enhances security without hindering the speed of innovation.
I outlined a number of recommendations for approaching secure by design, such as making code more mature and baking in security measures and privacy/contractual compliance from the start. I also emphasized that organizations should tackle the rules, tools and fears pillars that help shape the culture, as each of these can be transformative in helping people use the right approach or method. Making the rules and defining the fears is relatively straightforward, and with the right degree of training and awareness your organization can adapt.
But what about tools? The right tools can do wonders to “force” people to use better, smarter, more secure methods—that is, if such tools exist. Though I have not done a deep dive into whether there are any robust development tools integrated with security software, I’m a realist and will go out on a limb to say none of them have reached maturity just yet. In light of this, it’s essential that organizations focus on standardizing the overarching process that will enforce privacy and security by design, development and deployment.
The first thing to do is set the business drivers behind the process from a legal standpoint. Gather and research the privacy regulations and compliance requirements that affect your organization. Analyze them and, using a risk-based approach, translate them into security controls requirements, then publish these as an awareness catalogue available to all developers across your organization. You should also ensure that training is available, on demand and on a just-in-time basis, for any developer on any desired control.
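To make the idea of a controls catalogue concrete, here is a minimal sketch of what one entry might look like in code. Everything here is hypothetical: the control IDs, the requirement wording, and the GDPR mapping are illustrative placeholders, not a prescribed schema.

```python
from dataclasses import dataclass

@dataclass
class SecurityControl:
    """One entry in a security controls requirements catalogue (illustrative)."""
    control_id: str        # internal identifier (hypothetical scheme)
    requirement: str       # what developers must implement
    source: str            # regulation or compliance driver behind the control
    risk_level: str        # outcome of the risk-based analysis
    training_url: str = "" # just-in-time training material, if any

# Hypothetical catalogue, keyed by control ID for quick developer lookup.
catalogue = {
    c.control_id: c
    for c in [
        SecurityControl("AC-01",
                        "Authenticate every API call that touches personal data",
                        source="GDPR Art. 32", risk_level="high"),
        SecurityControl("CR-02",
                        "Encrypt personal data at rest",
                        source="GDPR Art. 32", risk_level="high"),
    ]
}

def controls_for_source(source: str) -> list[SecurityControl]:
    """Filter the catalogue by the regulation that drives each control."""
    return [c for c in catalogue.values() if c.source == source]
```

The point of a structured catalogue, rather than a wiki page, is that it can be queried: a team scoping a feature can pull exactly the controls driven by the regulations that apply to them.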
Next, set the technology drivers by adopting secure coding practices defined by the industry, such as the Web Application Security Consortium (WASC) Threat Classification system (based on the Open Web Application Security Project [OWASP] Top 10) and the SysAdmin, Audit, Network and Security (SANS) Top 25 (based on the MITRE Common Weakness Enumeration [CWE]).
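To illustrate what these industry practices look like at the keyboard, here is a small sketch of the single most cited OWASP Top 10 weakness, injection, and its standard mitigation. The table and data are invented for the example; only the pattern matters.

```python
import sqlite3

# Toy in-memory database standing in for a real application datastore.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, email TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'alice@example.com')")

def find_user_unsafe(name: str):
    # Anti-pattern: string interpolation lets crafted input rewrite the query.
    query = f"SELECT email FROM users WHERE name = '{name}'"
    return conn.execute(query).fetchall()

def find_user_safe(name: str):
    # Parameterized query: the driver treats `name` strictly as data.
    return conn.execute(
        "SELECT email FROM users WHERE name = ?", (name,)
    ).fetchall()

# The classic "' OR '1'='1" payload dumps every row from the unsafe
# version but matches nothing in the parameterized one.
payload = "' OR '1'='1"
```

Codifying rules like "always parameterize queries" in your secure coding practices gives reviewers and static analyzers something objective to check against.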
At this point, you must ensure that all your applications and systems comply with the security controls requirements catalogue you created. These can be batched into smaller parts that are handled by developers or teams of developers, all of which use your security controls requirements catalogue and industry-standard secure coding practices. Again, you’ll want to make sure that training sessions are available on a just-in-time basis for any developers who need them!
Upon completion of each section or unit, code must be validated and verified. Peer code review is the first step—have your developers present their work and explain its logic to a different team of developers. You can also employ static code analysis to scan code for any weaknesses, defects or violations of best practices, and enter the bug-fix stage if any are detected.
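Static analyzers work by mechanically scanning source for known-weak patterns. As a toy illustration of the principle (not a substitute for a real tool), this sketch uses Python's `ast` module to flag calls to `eval`, a weakness that real analyzers such as Bandit also report:

```python
import ast

def find_eval_calls(source: str) -> list[int]:
    """Toy static check: return the line numbers of any eval() calls.
    Real analyzers apply hundreds of such rules, but the mechanism is
    the same: walk the syntax tree and match dangerous patterns."""
    findings = []
    for node in ast.walk(ast.parse(source)):
        if (isinstance(node, ast.Call)
                and isinstance(node.func, ast.Name)
                and node.func.id == "eval"):
            findings.append(node.lineno)
    return findings

# Hypothetical snippets a scan might encounter.
risky = "result = eval(user_input)\nprint(result)\n"
clean = "result = int(user_input)\nprint(result)\n"
```

Because checks like this are cheap to run, they belong in the build pipeline, so violations surface before peer review rather than after deployment.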
Then you can merge the code branches into the mainline and compile a runnable build so you can conduct live vulnerability scanning and penetration testing. Again, these should be based on the industry-standard secure coding practices adopted across your organization. At this point, you should be able to establish reasonable assurance for the release version of your application or system by publishing artifacts for review by entities such as auditors or clients.
Continuous improvement means that every new feature gets the same emphasis on validation and verification. I would also recommend repeating the entire process from start to finish on a regular basis, perhaps annually or semi-annually, to reinforce it. And of course, if there are any major changes or shifts (as there are likely to be) in your privacy regulations, compliance requirements or secure coding practices, these must be tackled at the organizational level so your process addresses them head-on.
Privacy and security are such pressing topics today in large part because of the youth of the computer software industry. If this industry were more mature, quality code would be the norm. For some reason, it's okay to create buggy, weak and inadequate software in a world in which we wouldn't tolerate buggy, weak or inadequate cars, airplanes, bridges, buildings … and the list goes on. And the reason for this is pretty obvious: People are acutely aware that cars, airplanes, bridges and buildings put their very lives on the line, whereas software "just" makes them susceptible to things such as identity theft and ransomware.
This is the message we’re sending: It’s not OK to kill people, but it’s perfectly acceptable to ruin their lives. This is not OK! It’s time for organizations to standardize the processes that drive secure-by-design, and embrace necessary and fundamental changes to how we approach development. As developers and IT professionals, the onus is on us to shape a better world driven by quality software.
About the Author / Robert Hawk
Robert Hawk is Resident Security Expert at xMatters. He has extensive experience in information systems security, computer security, cybersecurity and information assurance, as well as governance, risk and compliance (GRC) management. He specializes in frameworks and standards from ISO/IEC, NIST, IEEE, IETF, ITU-T, Common Criteria, AMI-SEC, NERC, CIS, DoD, ANSI, PCI and ISECOM. Robert is a lifelong researcher, innovator and instructor. Connect with him on LinkedIn.