When the director of the US Cybersecurity and Infrastructure Security Agency (CISA), Jen Easterly, calls out standard software development practices and the resulting products as “dangerous-by-design,” it’s probably time to take a step back and assess your own development practices. Both developers and users have “normalized the fact that technology products are released to market with dozens, hundreds, or thousands of defects,” but Easterly insists this cannot continue. Her remarks echo the National Cybersecurity Strategy released by the White House on March 1, 2023.
Easterly calls for “a new model” of technology development that emphasizes security and safety in the initial development process rather than a “ship now, fix later” approach. She hints at (and in some cases explicitly calls out) governmental regulations and requirements related to cybersecurity and software security. These comments align with a number of the strategy’s objectives and related governmental actions, such as contemplated regulations.
The White House’s National Cybersecurity Strategy explicitly calls out the “software supply chain risk mitigation objective[…] and related efforts to improve open-source software security.” In our last post we reviewed the state of SBOMs; as of today, no new bills directly addressing SBOMs have been proposed in either the House or the Senate. With the announcement of these strategies, however, standards for acceptable software development, open source software practices, and documentation will rise, whether through legislation or executive action.
Over the upcoming months and years, we are likely to see government actions that:
- Establish minimum security requirements for critical infrastructure and industries and their related third-party providers,
- Incentivize security for IoT devices, and
- Establish safe harbors for secure development, while holding companies liable for deviations from these practices and standards.
Easterly notes that regulation “is one tool, but – importantly – it’s not a panacea.” For companies selling software to the government (or into the government ecosystem more broadly), meeting minimum security standards will be a requirement in the procurement process. Companies that do not keep up will simply lose access to the government market.
Impact on Software Development
Gone are the days of moving fast and breaking things: the Biden-Harris administration is looking to work with Congress to develop legislation that will “prevent manufacturers and software publishers with market power from fully disclaiming liability by contract” and instead encourage them to engage in secure development practices. Most companies don’t have the market power to disclaim all liability in the first place, so for the majority of companies developing software, following secure development practices is already in their own best interest. What do these whispers of government action actually mean for most businesses?
A rising tide lifts all boats, and we’ll likely see the baseline for secure development practices rise throughout the entire industry. To navigate additional development and documentation requirements effectively, companies must assess the current state of their development, testing, and documentation tools and practices. In Easterly’s assessment, this means that “[t]echnology must be purposefully developed, built, and tested to significantly reduce the number of exploitable flaws before they are introduced into the market for broad use.”
In order to achieve this goal, developers need to be able to determine whether their own code introduces vulnerabilities and whether open source libraries, packages, or other code introduce vulnerabilities. With respect to the latter, this might be accomplished through “[e]ffective use of an SBOM[, which] can help an organization understand whether a given vulnerability affects software being used in their assets” (as encouraged by Easterly) or, in instances in which the company doesn’t have – or doesn’t trust – the reported SBOM information, through the use of a tool that can identify direct, transitive, and forked vulnerabilities.
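As a minimal sketch of the SBOM-driven approach, the snippet below walks the components of a small CycloneDX-style SBOM and flags any component whose name and version appear in a known-vulnerability list. The SBOM content and the advisory set here are illustrative assumptions; a real workflow would pull advisories from a database such as OSV or the NVD.

```python
# Sketch: check a CycloneDX-style SBOM against a known-vulnerability list.
# Both the SBOM and the advisory data below are hypothetical examples.

sbom = {
    "bomFormat": "CycloneDX",
    "components": [
        {"name": "requests", "version": "2.31.0"},
        {"name": "openssl", "version": "1.1.1k"},
    ],
}

# (name, version) pairs with known advisories -- illustrative, not real data.
known_vulnerable = {
    ("openssl", "1.1.1k"),
}

def find_vulnerable_components(sbom, advisories):
    """Return SBOM components whose (name, version) matches a known advisory."""
    return [
        c for c in sbom.get("components", [])
        if (c["name"], c["version"]) in advisories
    ]

if __name__ == "__main__":
    for component in find_vulnerable_components(sbom, known_vulnerable):
        print(f"vulnerable: {component['name']} {component['version']}")
```

Note that this only works when the SBOM is complete and trustworthy; a component that was vendored or forked into another package never appears in the components list and is invisible to this kind of lookup.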
Memory Safe Languages
The use of memory safe programming languages prevents the introduction of many vulnerabilities in the first place; big players like Microsoft and Google have estimated that approximately 70% of their vulnerabilities over the past two decades related to memory safety. While software analysis tools (both dynamic and static) may identify some (or possibly even most) of these vulnerabilities, using a memory safe language instead avoids introducing such vulnerabilities entirely.
Memory safe languages include Python, Rust, Go, Java, Ruby, and C#. Languages like C and C++ are less secure, as they do not prevent the introduction of memory safety vulnerabilities. With C and C++ still in the top 10 most popular languages, the prevalence of memory safety vulnerabilities is significant, and businesses that write or use components written in these languages should take additional precautions to identify and remediate any vulnerabilities.
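To make the distinction concrete: in a memory safe language like Python, an out-of-bounds write is caught by the runtime and raised as an exception, whereas the equivalent C code (`buf[8] = 'A'` on a `char buf[8]`) is undefined behavior that may silently corrupt adjacent memory, which is exactly the class of defect behind most memory safety CVEs.

```python
# An 8-byte buffer; valid indices are 0..7.
buf = bytearray(8)

try:
    buf[8] = 0x41  # one past the end -- in C this would be undefined behavior
except IndexError as exc:
    # The runtime rejects the access instead of corrupting memory.
    print(f"caught: {exc}")
```

The out-of-bounds write never happens; the bug surfaces as a deterministic, debuggable exception rather than an exploitable memory corruption.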
What happens when you build in a memory safe language, but you include cross-language dependencies, such as components that are written in C or C++? Do you even know whether these are included, or are they hidden (like the log4j dependencies hidden in Python packages)? Take a popular example: OpenSSL is a C library used in many packages and it has been a frequent major CVE offender. Developers using languages like Rust or Python often overlook the need to address (or even assess) issues in OpenSSL within their own dependencies; they assume that as long as the operating system libraries are patched, they’re safe from potential OpenSSL-related vulnerabilities. This is a flawed assumption, however, as some dependencies include compiled, forked versions of OpenSSL that create hidden vulnerabilities that are nearly impossible to detect using most tools.
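A first-pass check for this problem is simply to look for bundled native libraries inside your installed dependencies. The sketch below (with a simulated package tree, since the paths are hypothetical) flags files whose names suggest a vendored OpenSSL build; note that this naive filename matching only catches unrenamed copies, and a renamed or statically linked fork evades it, which is why content-based matching is needed in practice.

```python
import pathlib
import tempfile

# Filenames that commonly indicate a bundled OpenSSL build. A renamed or
# statically linked fork would evade this naive filename-based check.
OPENSSL_HINTS = ("libcrypto", "libssl")

def find_vendored_openssl(root):
    """Return paths under root whose names suggest a bundled OpenSSL library."""
    return [
        path
        for path in pathlib.Path(root).rglob("*")
        if path.is_file()
        and any(hint in path.name.lower() for hint in OPENSSL_HINTS)
    ]

if __name__ == "__main__":
    # Simulate an installed package that vendors a compiled OpenSSL.
    with tempfile.TemporaryDirectory() as tmp:
        libs = pathlib.Path(tmp) / "somepkg" / ".libs"
        libs.mkdir(parents=True)
        (libs / "libcrypto-3a2b.so.1.1").write_bytes(b"\x7fELF...")
        for hit in find_vendored_openssl(tmp):
            print("possible vendored OpenSSL:", hit.name)
```

Running a check like this over a virtual environment often turns up native libraries that appear in no SBOM and no lockfile, precisely because they ship inside another package's wheel.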
Our Dilligencer API scans source repositories and indexes all files, allowing us to locate any packages that vendor or fork vulnerable libraries (like OpenSSL) that other tools would miss. In particular, our fuzzy hash matching allows us to detect modified forks that other search methods don’t identify. Using our API during the development process results in a more secure development environment and enables transparency by allowing for the generation of more complete and accurate SBOMs.
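To illustrate why similarity-based matching finds forks that exact matching misses: an exact cryptographic hash changes completely after a single-byte edit, while a similarity score stays high for a lightly modified copy. The sketch below uses the standard library's `difflib.SequenceMatcher` as a stand-in for production fuzzy hashing schemes such as ssdeep or TLSH; the byte strings are illustrative, not real library contents.

```python
import difflib
import hashlib

# A known library fragment and a lightly modified fork of it (illustrative data).
original = b"OpenSSL 1.1.1 routines: ssl3_read_bytes, tls_process_record ..."
forked   = b"OpenSSL 1.1.1 routines: ssl3_read_bytes, tls_process_record_v2 ..."

# Exact hashing: any single-byte change yields a completely different digest,
# so a lookup table of known-library hashes misses the fork entirely.
exact_match = hashlib.sha256(original).hexdigest() == hashlib.sha256(forked).hexdigest()

# Similarity matching (stand-in for fuzzy hashing): the modified fork still
# scores close to 1.0 against the known original.
similarity = difflib.SequenceMatcher(None, original, forked).ratio()

print(f"exact match: {exact_match}")
print(f"similarity:  {similarity:.2f}")
```

The design point is the same at production scale: exact digests answer "is this byte-for-byte a known file?", while fuzzy matching answers "is this derived from a known file?", and only the latter surfaces patched or recompiled forks of a vulnerable library.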