I think you've hit the nail on the head with the Fortinet example - unless security incidents cause serious damage to the company (revenue, stock), they will be largely ignored. And that's a rational approach - why invest in security incident prevention when you can just ignore the consequences when an incident happens?
I am a former CISO from a tier 1 automotive company. The major issue with secure by design in the automotive industry is that there is literally zero interest in cybersecurity, apart from a few German automakers. As expensive as cars are, there is also a huge push to reduce production costs, which means reducing expenses across the board - tooling, staff, and resources. Lean and mean.
Also, secure by design, while fundamental, demands a very high maturity level. Your processes have to be mature. Your management has to make it clear, on a long-term and consistent basis, that this is the way forward. This is almost exclusively seen in much more heavily regulated markets such as banking. Finally, your user base has to know that secure by design is a thing, and they have to know who to engage to do it.
There's quite a massive gap between infosec theory and reality - and an even bigger one when management won't support you. Hence why I left my CISO role and started doing woodwork and homesteading after 15 years in cyber.
I was co-founder of the Build Security In initiative that you mention while at CMU/SEI w/ DHS funding and a partnership with Gary McGraw at Cigital, but I left in frustration after 3 years because we were documenting the state of the art (what you call "best practices"), while having no impact on the state of the practice (what you call adoption/implementation).
I left cyber to go do Agile and then later DevOps and Digital transformations, but that's another story.
Years later, after I had made every mistake in the book and from that learned how to effectively change development culture, I got enticed back to cybersecurity by my BSI co-founder, Noopur Davis, who was then slated to become the new CISO at Comcast. We wanted to see what it would take to change the state of the practice inside of Comcast. I wrote the original DevSecOps Manifesto (borrowing 3 of the 5 guiding principles from BSI) and it became the heart of the program I launched at Comcast which 5 years later had transformed the culture of 300+ development teams there.
The reason I went to CMU/SEI in the first place was that I was asked to be the founding Executive Director of a poorly named research institute, The Sustainable Computing Consortium. I say poorly named because it wasn't green-earth sustainability we were talking about, it was financial sustainability. The SCC got folded into the CyLab at CMU when it launched, but the main idea of my part of the CyLab remained the same as the main point of your piece here: that to fix the problem of cybersecurity, we needed to fundamentally change the financial incentives.
However, one big difference between that and what I now believe is that there are two ways to do that. The one you lean on is external incentives (regulation, market punishment, etc.). I've focused the last decade-plus of my career on the other side of that coin, internal incentives, which comes down to lowering the cost and pain of changing internally. We kept the program going at Comcast because we were 1/4 the cost of the traditional vulnerability management spend, development teams begged to be part of the program rather than needing to be forced to comply... and it was 6x better at lowering real risk.
SbD 'Delusions' is a bit provocative and that's ok. After all, there are no silver bullets to assure complex software systems are free of safety issues. [To my colleagues pushing tech boundaries for memory safety, post quantum crypto, and SBOM ... don't give up, your innovations are welcome!]
SbD is attractive because leverage is inherent to the approach. Security advances in upstream compilers, components, frameworks, infrastructure, and operating systems can lift entire ecosystems.
Like most people, software developers take pride in their work. Highlighting SbD initiatives in the right manner helps engage and motivate developers. Let's applaud CISA, the global SbD coalition, and myriads of forerunners that have carried the SbD torch.
But back to the delusional part... real or perceived bottomless federal investment in cyber offense is another dimension of the cybersecurity challenge. The lack of commensurate funding incentives for SbD makes the current situation none too surprising.
It's also well past time to end the vehicle safety analogy. The historical approach to vehicle safety is quite distant from designing 'The Beast' for countering a targeted adversarial threat.
Good article.
This is spot on!