Edition 27: Secure by Design is important, but requires a different kind of industry effort to achieve it
CISA's Secure by Design initiative has good intentions but an identity crisis. At this point, it may not move the needle on software security.
I remember when the OWASP Top 10 was all the rage (circa 2010). I was new to AppSec, and the list made it easy to enter a complex field. It was clear to the authors that there were more than 10 web app vulnerabilities to worry about, but they wanted to make it easy for practitioners like me to get started. Given how unprepared the industry was for AppSec vulnerabilities, such a list was necessary.

Fast forward a few years, and the OWASP Top 10 (for web) stopped being helpful. In some ways, it became harmful. While the foundation emphasized that the list is "not a standard," many companies and regulators adopted it as one. "Are you OWASP Top 10 compliant?" became a question we kept hearing. The list became a proxy for AppSec: tools started spitting out OWASP Top 10 reports, and as long as that report was clean, you didn't need to worry about AppSec. Today, it's fair to say that the OWASP Top 10 is not something good AppSec teams care about. They may use it as part of awareness training for developers, but no more. It took a long time to get here; for a few years, the OWASP Top 10 did more harm than good (after many years of doing more good than harm).
Overall, I'd argue that the OWASP Top 10 was good for our industry. When it was first published (in 2003), we needed a breakfast cereal version of AppSec: something mildly nutritious and easy to consume on the go, even if it had some added sugar. Today, there are many different OWASP Top 10s (one for Mobile, one for APIs, and one for LLMs). It's fair to conclude that such lists are helpful only in a new area of investigation (e.g., the LLM one is quite helpful right now). After a few years, these lists can no longer be used to measure the effectiveness of security programs.
With this background, I find it hard to get excited about the Secure by Design initiative (we will call it an "initiative" because I am unsure if it's a framework, a standard, a list, or something else) from CISA. A few good ideas are thrown into a list with an odd call to action to "take the pledge." To be clear, the ideas and guidelines in SBD aren't bad; it's just unclear whether it can move the needle on secure design.
Hypothesis
"All models are wrong, but some are useful" is a helpful way to think about such initiatives. Even if an initiative is not perfect, if it's useful, that's good enough. CISA's Secure by Design (SBD) is neither completely wrong nor particularly useful. The initiative attempts to mix theoretical research (a 36-page whitepaper on how to implement its three principles) with virtue signaling ("sign the pledge"). Furthermore, it misdiagnoses the problem: it assumes the lack of security in design stems from a lack of awareness. In reality, most software manufacturers do not think about security at the design stage because it's too expensive to do so, and CISA SBD does nothing to lower that cost.
Unintended consequences
If I had to describe CISA SBD in one phrase, it would be "well-intentioned." Many years ago, when I briefly studied public policy, we learned about the law of unintended consequences. To be clear, unintended does not mean it cannot be foreseen. This classic post by the Executive Director at OWASP is an example of unintended but foreseeable consequences: while the "intent" was to create an awareness document, everyone knows it as a standard. And it was absolutely predictable that it would be used in that manner.

While SBD’s intentions are altruistic, there can be unintended consequences. The probable outcomes are:
1. Given the lack of context and detail, SBD is not something an internal security team can leverage to effect real change, and hence good security teams will not actively use it.
2. Given its backing by a serious organization that has done fantastic work (CISA), it’s possible that SBD will become a “standard” that folks need to “adhere” to. This means SBD compliance will become a necessity, and we will have a cottage industry of companies that help you become compliant.
3. Finally, “the pledge” – an unenforceable, voluntary declaration – will be used for virtue signaling or as a marketing technique by companies that have security software to sell. While neither virtue signaling nor marketing fodder is evil, they probably don't move the needle on software security.
Digging deeper…
In the remainder of this post, I will attempt to dig deeper into the hypothesis in an FAQ format. This isn't typical for this blog, but we will try it anyway :)
Why don’t software manufacturers build security into design?
Software manufacturers are not a monolith. Everyone from a bored engineer building an open-source side project to a SaaS company to an enterprise software company that deploys on-prem is a "software manufacturer." Even if you limit the scope to "manufacturers that supply software to the federal government," the spread is vast and hard to generalize about. But here are a few common reasons why they don't build security into design:
1. There is no turnkey way of building security into design. Designing a product depends on what you are building, why you are building it, and many other contextual factors. Building security into design has the same complexities.
2. Defining what security by design means for a company requires an understanding of software architecture, software security, and risk management. This skill set is rare, non-scalable, and expensive.
3. This gets exponentially harder in modern software shops where multiple deployments happen every day.
Do we need initiatives like SBD? If yes, why?
An industry initiative to promote security by design is a worthy cause. The onus of security is tilted disproportionately towards the user doing the right thing ("pick strong passwords") versus the manufacturer doing the right thing ("mandate MFA for all"). In this regard, CISA gets the problem statement right. Given that security is still an afterthought in most engineering organizations, it makes sense to have a champion of Secure by Design.
What is CISA Secure by Design? Is it a standard, guideline, mandate, or something else?
It's unclear. But before we address this, I think there are three useful ways in which industry initiatives (such as the CISA SBD) can contribute:
1. Create awareness: This is especially useful for "new areas" in cybersecurity. The OWASP Top 10 for the web was helpful when web attacks were new. The OWASP Top 10 for LLMs is doing the same for LLM security. For reasonably mature areas, awareness documents don't move the needle much; the OWASP Top 10 lists for web, mobile, and APIs are no longer useful. There's one exception here: awareness documents are still helpful for newcomers to the industry. If you are a college student studying cybersecurity or a mid-career professional deciding to switch to cybersecurity, even older awareness documents are helpful. My limited point is that awareness documents in mature areas don't move the needle on industry behavior.
2. Maintain an exhaustive database: This is especially helpful in cybersecurity. MITRE's CVE program, OWASP's ASVS (though poorly maintained), and NIST's National Vulnerability Database (NVD) are fantastic resources on top of which community initiatives, products, or programs can be built (see the sketch after this list for what "building on top" can look like). For such an effort to work, the database has to be exhaustive; you cannot Top 10 your way into this category. For instance, ASVS aims to provide every type of security control you can verify. That is useful for pentesters consuming it and (more importantly) for operators and vendors to anchor their programs/products against. Double bonus if you can find the funding and talent to keep the database up to date (this is not easy, as NVD found out in the last 12 months; here's a great article by Chris H on the topic).
3. Describe a methodology: Such initiatives help you with the "how"; the goal is to help you achieve certain outcomes. For instance, STRIDE, DREAD, and PASTA help you perform threat modeling. These methodologies are obviously starting points, and the expectation is that the consumer will adapt them before use. Things get a little tricky when a methodology becomes a standard (e.g., compliance standards mandating the usage of STRIDE), but a well-defined methodology will guard against that.
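To make category #2 concrete, here's a minimal Python sketch of what "building on top of" an exhaustive database can look like: querying NIST's NVD REST API for CVEs matching a keyword. The endpoint and parameter names reflect my understanding of the public NVD API v2.0; treat them as assumptions and verify against the official docs.

```python
# A minimal sketch of building on top of an exhaustive database:
# pull CVEs matching a keyword from NIST's NVD REST API (v2.0).
# Endpoint and parameter names are assumptions based on the public
# docs at https://nvd.nist.gov/developers -- verify before relying on them.
import requests

NVD_API = "https://services.nvd.nist.gov/rest/json/cves/2.0"

def fetch_cves(keyword: str, max_results: int = 5) -> list[dict]:
    """Return short summaries of CVEs whose descriptions match `keyword`."""
    resp = requests.get(
        NVD_API,
        params={"keywordSearch": keyword, "resultsPerPage": max_results},
        timeout=30,
    )
    resp.raise_for_status()
    summaries = []
    for item in resp.json().get("vulnerabilities", []):
        cve = item["cve"]
        # Pick the first English description, if one exists.
        desc = next((d["value"] for d in cve["descriptions"] if d["lang"] == "en"), "")
        summaries.append({"id": cve["id"], "summary": desc})
    return summaries

if __name__ == "__main__":
    for cve in fetch_cves("sql injection"):
        print(cve["id"], "-", cve["summary"][:80])
```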
So, which of the above categories does CISA SBD belong to? It's surely not a database. If it were, it would not pick and choose a few items for its "bad practices" list (e.g., it calls out the lack of protection against SQL and command injection as a bad practice, but not any other form of injection). CISA routinely points out that its list is not the most exhaustive, just the most important.
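As an aside, here's what that specific bad practice looks like in code: a minimal, illustrative Python sketch (using sqlite3; the table and values are hypothetical) contrasting string-built SQL, which is injectable, with a parameterized query, which is not.

```python
# Illustrative only: the "bad practice" CISA names is building SQL
# from user input via string formatting; the fix is a parameterized
# query. The table and data below are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin')")

user_input = "nobody' OR '1'='1"  # hostile input

# Bad practice: user input is spliced directly into the query string.
unsafe = f"SELECT role FROM users WHERE name = '{user_input}'"
print(conn.execute(unsafe).fetchall())  # [('admin',)] -- leaks every row

# Safer: the driver binds the value; it is never interpreted as SQL.
safe = "SELECT role FROM users WHERE name = ?"
print(conn.execute(safe, (user_input,)).fetchall())  # [] -- no such user
```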

I think they are trying to define a fourth category: the "bare minimum" requirements needed to build security into software.
So, what's wrong with an awareness document that defines the bare minimum?
In one word: Context.
Defining a bare minimum is important for a security program. Given everyone has resource limitations (time and money), you have to pick and choose where to place your bets. That's true for security, too. The problem is this: the "bare minimum" for a software team at a medical device company differs from that at a bank. It also varies by size, user base, and much more. The problem with SBD is that it's devoid of context.
Here's an example: the first goal of the SBD Pledge is to improve MFA coverage within a year of signing. That's commendable, but it baffles me that they decided MFA is a goal yet said nothing about many other important things (e.g., protection against DoS attacks). Most reasonable people would argue that MFA is necessary; I am one of them. However, depending on the context, MFA may be the fourth or fifth most important thing compared to (say) availability, and may not make it to your "bare minimum" list. Consider this scenario:
It's hurricane season on the Atlantic coast of the US. A hurricane is about to make landfall in Florida, and getting the latest information to residents and first responders is critical. Let's say you build a weather app, and a hostile actor wants to undermine first-responder preparation and rescue efforts. Would they attack your state-of-the-art MFA implementation, or try to bring the app down because it has insufficient app-level rate limiting? Given limited security resources, should the software manufacturer optimize for resistance to bot attacks or strengthen MFA?
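To ground "app-level rate limiting" in something concrete, here's a minimal token-bucket limiter sketch in Python. The names, thresholds, and in-process state are illustrative assumptions, not a production design; a real deployment would typically enforce limits in shared infrastructure (a gateway, Redis, etc.).

```python
# A minimal, illustrative token-bucket rate limiter. Names and
# thresholds are hypothetical; real deployments keep this state in
# shared infrastructure, not in-process memory.
import time
from dataclasses import dataclass, field

@dataclass
class TokenBucket:
    rate: float      # tokens refilled per second (steady-state request rate)
    capacity: float  # maximum burst size
    tokens: float = field(init=False)
    last: float = field(init=False)

    def __post_init__(self):
        self.tokens = self.capacity
        self.last = time.monotonic()

    def allow(self) -> bool:
        """Consume one token if available; otherwise refuse the request."""
        now = time.monotonic()
        # Refill proportionally to elapsed time, capped at capacity.
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False

# One bucket per client: 10 requests/second steady state, bursts up to 20.
buckets: dict[str, TokenBucket] = {}

def handle_request(client_ip: str) -> int:
    bucket = buckets.setdefault(client_ip, TokenBucket(rate=10, capacity=20))
    return 200 if bucket.allow() else 429  # 429 Too Many Requests
```

The mechanics aren't the point; the point is that, in this scenario, a control like this may buy more risk reduction than gold-plating MFA, and no bare-minimum list can make that call for you.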
In other words, the problem with a bare minimum document is that it provides a false sense of security or incentivizes companies to meet the letter of SBD rather than the spirit.
How should we improve CISA SBD?
The problem with blog posts like this is that they can do a reasonable job (if attempted in good faith) of criticism but do little to improve things. As someone who runs a startup, my approach is to hear all feedback about our company (especially criticism) but own the "fixing" portion; you cannot outsource that. In other words, my advice on how to fix SBD shouldn't be taken too seriously, given I don't understand the details of how CISA works or what its core incentives are.
Having said that, if I were magically in charge of SBD, my goal would be to publish resources that lower the cost of building security into design at software manufacturers. Specifically, here's what I would do:
1. Recognize that the hardest part of SBD is defining what it means for each company, and help companies do that. This means building a methodology that helps security teams define security by design for themselves (AWS Well-Architected is a good example), then working with industry experts to publish a massive list of industry case studies. So, if I am a security leader at a fintech, I should have a generic guide on defining SBD for my program and access to multiple case studies on how other companies have done it.
2. Nuke the pledge. Seriously, just Cmd+Delete and double-check to make sure it's not in the Bin. The effort that goes into designing the pledge, making it palatable, evangelizing it, and tracking who joined the tribe is not worth the upside: virtue signaling and marketing fodder (quick research tells me ~73% of the companies that have signed the pledge sell cybersecurity software). Instead, use the resources to build a community of operators and vendors who can help each other with #1. Once the community is up and running, make sure it is well-staffed for the long run.
Conclusion
This was a hard post to write. As someone building a company in this space, the simplest (and maybe the most logical) thing to do would have been to go all in on SBD. Tethering your product to an industry initiative and building use cases is a great way to market a product, but at this point, it's hard to do that with SBD. I hope future versions of SBD are easier to get behind. Until then, I will continue to follow closely how the initiative shapes up, and I'd be happy to be proven wrong if it actually enables companies to build securely (in which case, I will happily eat my words :)).
That’s it for today. What do you think about CISA SBD? Will it help your organization integrate security into design? Will this improve software security? Let us know! You can drop me a message on Twitter (or whatever it is called these days), LinkedIn, or email. If you find this newsletter useful, share it with a friend or colleague or on your social media.
References
CISA Secure by Design: https://www.cisa.gov/securebydesign
Secure by Design implementation guide: https://www.cisa.gov/sites/default/files/2023-10/SecureByDesign_1025_508c.pdf
List of 200+ companies that have signed the pledge: https://www.cisa.gov/securebydesign/pledge/secure-design-pledge-signers
Chris H on Secure by Design and Secure by Default, on NVD's funding crisis, and on shift-left starting to rust
That Which Is Seen and That Which Is Not Seen, by Bastiat (English translation): https://mises.org/articles-interest/which-seen-and-which-not-seen