No Pain, No Gain - How Impunity Perpetuates Failure
It’s time to treat cybersecurity incidents and data breaches like preventable disasters, not the inevitable cost of doing business.
A few days ago I stumbled upon an old article about the Colonial Pipeline ransomware attack in 2021 - an attack that had a significant impact on parts of the real world. Gas prices spiked, panic buying ensued, and the White House declared a state of emergency. It was a cybersecurity incident I would willingly call "significant", and that's me being timid in my classification.
Yet, within days, the company paid a ransom of around 4.5 million dollars, operations resumed, and the public moved on. The attackers were later identified and some assets were seized, but no systemic change occurred. No executives were held personally liable. No regulatory overhaul forced organizations to fundamentally rethink their security posture. The incident, like countless others, faded into the background noise of cybersecurity failures.
This pattern is, unfortunately, not an exception - it is the rule. Cybersecurity incidents, no matter how severe, rarely result in meaningful, lasting consequences for the organizations involved. Unlike fields such as aviation or medicine, where failures tend to be meticulously dissected to prevent recurrence, cybersecurity operates in a sort of paradox: breaches are treated as inevitable, but accountability is optional.
In my opinion, this lack of consequences is one of the primary reasons why companies and organizations fail to improve their security postures, why the same vulnerabilities and attack vectors persist, and why attacks that should have been rendered obsolete (if not impossible) years ago continue to succeed.
Consider the emergency room of a modern hospital. When a patient dies unexpectedly, a rigorous process begins. A root cause analysis is conducted, involving clinicians, administrators, and sometimes even external reviewers. The goal is not to assign blame but to identify the systemic failures - miscommunications, protocol violations, equipment malfunctions - that contributed to the outcome.
Findings are documented, shared across the institution (or institutions), and integrated into training and policy. The result? A continuous feedback loop that aims to reduce preventable deaths over time.
Cybersecurity, by contrast, lacks this culture of responsibility and accountability, despite all the assurances and compliance frameworks that pretend to mandate it. When a breach occurs, organizations typically follow a script: contain the incident, restore operations, issue a press release, and - if required by law - notify affected parties. After a few weeks, or at worst a couple of months, nobody outside the company cares anymore and business goes on as usual.
No matter how badly the company fucked up. Yes, I'm looking at you, Fortinet, with your blatant lying and stalling tactics to keep both your customers and the CSIRT community from finding out how badly you fucked up.
Rarely is there a forensic deep dive into why the breach happened, let alone a public or even just internal reckoning with the organizational failures that enabled it. Executives may face temporary reputational damage, but they are seldom held financially or legally responsible. Shareholders may grumble, but stock prices mostly rebound. Regulators may impose fines, but these are frequently negotiated down or treated as a cost of doing business. Sales staff of vendors might have to do some sweet-talking or weather some angry calls, but at the end of the day contracts are likely going to be renewed.
This absence of real consequences, to me, sends a clear message: cybersecurity is not a core business priority. It's merely an annoying compliance checkbox, a risk to be managed rather than a failure to be eradicated.
The current economic incentives further reinforce this dynamic. In most industries, the cost of a breach, while potentially significant, is still lower than the cost of a comprehensive security overhaul. Fines, even when levied (which happens far too rarely), are often a fraction of annual revenue. Depending on which source I looked at, the average cost of a data breach sits somewhere between three and five million US dollars - a figure that pales in comparison to the billions spent on digital transformation programs or shareholder returns.
Meanwhile, the probability of a catastrophic, existence-threatening breach remains low for most organizations. The result? A sadly rational calculation from a business perspective. It is vastly cheaper to absorb the occasional breach than to invest in robust, proactive security.
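To make that calculus concrete, here is a deliberately crude back-of-the-envelope sketch. Every number in it is an assumption I made up for illustration (breach probability, breach cost, overhaul cost) - not data from any real organization or study:

```python
# Back-of-the-envelope sketch of the "rational" breach calculus.
# All figures below are purely illustrative assumptions, not real data.

annual_breach_probability = 0.15      # assumed chance of a material breach in a given year
average_breach_cost = 4_000_000       # assumed cost per breach (midpoint of the $3-5M range above)
security_overhaul_cost = 15_000_000   # assumed cost of a comprehensive, proactive security program

expected_annual_breach_loss = annual_breach_probability * average_breach_cost

print(f"Expected annual breach loss: ${expected_annual_breach_loss:,.0f}")
print(f"Cost of a real overhaul:     ${security_overhaul_cost:,.0f}")

# With numbers like these, simply absorbing the occasional breach "wins" on paper -
# which is exactly the perverse incentive described above.
```

With those (made-up) figures, the expected annual loss from breaches is a small fraction of the overhaul cost, so the spreadsheet will always tell management to do nothing.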
This economic logic is compounded by the diffusion of responsibility. Cybersecurity is kind of everyone's problem and so, often enough, no one's. IT teams blame executives for skimping on funding for security initiatives, executives blame IT for failing to communicate risks effectively, boards blame regulators for unclear guidelines... and so, in the absence of clear accountability, inertia prevails.
Speaking of regulation: in theory, it should fill this "accountability void". Laws like the General Data Protection Regulation (GDPR) and rules like the U.S. Securities and Exchange Commission's cybersecurity disclosure requirements aim to impose accountability. Yet enforcement remains inconsistent. GDPR fines, while occasionally substantial, are often reduced on appeal. The SEC's rules, while a step forward compared to how bad the situation was before, still allow organizations to frame breaches as unfortunate but unavoidable events rather than preventable failures.
More importantly, regulation alone cannot foster a culture of learning. Compliance does not equal security... something Information Security Managers seem to struggle to understand, much to my personal suffering. Organizations can meet regulatory requirements while still operating with glaring vulnerabilities. What is missing is a mechanism that forces organizations to treat breaches as opportunities for improvement rather than as PR crises to be weathered.
I would love to say that the status quo is unsustainable. But (much like the talent shortage in cybersecurity I wrote about around a month ago) the consequences of that unsustainability are arriving far too slowly - which, for the uninitiated, makes it effectively sustainable.
As long as cybersecurity incidents remain largely consequence-free, managers and decision makers will continue to treat security as an afterthought. The result is, and will continue to be, a digital landscape where the same attacks succeed again and again, where innovation in defense lags behind innovation in offense, and where the cost of failure is borne not by those responsible but by customers, employees, and society at large.
Circling back to the example of a hospital emergency room: an ER offers a model that could work. Treating every breach as a preventable death, every vulnerability as a systemic flaw, and every attack as an opportunity to learn could help to evolve cybersecurity from a generally reactive, bolt-on function to a (largely) proactive, core discipline.
Until we manage to achieve that, we will remain trapped in a cycle of breach, apology, and repetition - wondering why, despite spending billions, we keep failing in the same ways.