Demand Without Development

The cybersecurity talent shortage is not just a problem of numbers, but of structure. By systematically avoiding the hiring and training of true junior staff, the industry is reinforcing a feedback loop that shrinks its own future workforce.

Whenever I sit down with information security people in a social context, at some point the conversation inevitably lands on the same topic: the state of the industry and the lack of people.

Almost all companies that I hear about are unable to hire the people they would need for their security departments. Sometimes it's because of budget cuts or internal politics, but often enough the reason is much simpler: There simply aren't enough qualified people to fill the open positions (although "phantom" positions and the shitshow that is modern recruiting would warrant an entire, even more rant-y post by themselves ..).

And as much as I think that most of the numbers floating around are overblown, even if you cut them down by 90%, there's still a distinct talent shortage. To list some examples:

  • According to some estimates (from 2023), in the United States alone there are over 600,000 open positions in cybersecurity.
  • According to an ISC2 report, the global cyber workforce grew by only about 0.1% in 2024, despite the huge number of open positions.

At the same time, many security teams have no true entry-level members. According to the same ISC2 report, 31% of teams had no entry-level professionals at all, and 15% had no junior-level professionals either.

It's a trend I have seen intensify over the past years: fewer and fewer organizations are willing, or able, to hire entry-level staff and systematically train them up to their needs. And even when a company lists "entry-level" cybersecurity positions, a lot of them aren't really "entry-level".

Because no matter what you say, I will never accept three-plus years of experience and, ideally, advanced certifications as requirements for an "entry-level" position.

To put it drastically: this structural neglect of integrating new people into our industry creates a vicious feedback loop, a self-reinforcing cycle.

Don't get me wrong, I understand that I, as someone without any relevant authority, can easily say that people should hire more "true" juniors. I am aware of the constraints present on all levels.

I myself have talked to people who admitted that they can't afford the time necessary to train newcomers, because their teams are already stretched to the absolute limit. Or that they are not willing to take the risk and uncertainty of trying to train someone new. It's reasonable to see hiring someone with experience who can, more or less, hit the ground running as the more appealing choice.

I'm also aware of budget constraints and hiring freezes. I get that things are tough out there for decision-makers. But still, that doesn't change the fact that by refusing (for whatever reason) to cultivate new talent internally, organizations ultimately end up having to draw from a rather finite pool of seasoned pros, intensifying competition for those same experts.

While this helps me personally (given that I can pass myself off as an expert without necessarily being one), it's unsustainable. Even if it weren't for the already existing demand that surpasses supply, an "all-senior staffing model" ultimately shrinks the bench over time.

But one doesn't have to wait for the effects to materialize; the negative impact on existing teams is already noticeable. With too few personnel, security staff face chronic overload. Which, in turn, makes things worse - burned-out teams lose members (triggering yet more vacant positions) or are forced to scramble just to keep up .. which, I guess, is a good thing for consulting companies that can hire out their contractors for a lot of money.

In short (pun intended), the original shortage feeds on itself, and will eventually turn a difficult talent market into an acute(r) staffing crisis. The question is: How did we end up here and how do we get out of this situation?
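To make that feedback loop a bit more tangible: at the industry level, seniors can only come from juniors who were trained at some point. Here's a toy model - a sketch with entirely made-up parameters, not a forecast - comparing a workforce that keeps a junior pipeline running with an "all-senior" one that trains nobody:

```python
# Toy pipeline model - all parameters are invented for illustration.
# Seniors leave the field at a fixed attrition rate; juniors become
# seniors after a fixed number of training years.

def simulate(years, juniors_per_year, attrition=0.10, train_years=3, seniors=100):
    """Return the senior headcount for each simulated year."""
    pipeline = [0] * train_years              # junior cohorts still in training
    history = []
    for _ in range(years):
        seniors *= 1 - attrition              # some seniors leave the field
        seniors += pipeline.pop()             # the oldest cohort "graduates"
        pipeline.insert(0, juniors_per_year)  # a new cohort enters
        history.append(round(seniors))
    return history

print(simulate(10, juniors_per_year=12))  # pipeline offsets attrition over time
print(simulate(10, juniors_per_year=0))   # headcount just decays year after year
```

With a pipeline, the headcount recovers once the first cohorts graduate; without one, the pool simply shrinks by the attrition rate every year - the "shrinking bench" in miniature.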


I don't think that there is one single cause, and thus no single, magic fix. Although organizations and companies are slowly realizing that it's a mistake, the prevalent view at management level still seems to be that of treating cybersecurity as an urgent "fix-it" problem rather than a profession requiring long-term cultivation and investment.

Many organizations view mentoring as a distraction from immediate risks, reinforcing a short-term focus - which reflects a general kind of market myopia. Companies externalize the cost of developing talent, seemingly assuming it will magically appear out of .. thin air? Nowhere?

And that assumption persists because, while training juniors might carry some risk, not training them carries no immediate penalty. Similarly, managers are rewarded for short-term delivery, not long-term talent cultivation.

I recently read an economics textbook, and I'm using the chance to apply what I learned here (because I'm unlikely to get to at any parties): the information security skill pool suffers from a kind of "collective action problem". Everyone depends on strong defenses (a common good, ultimately helping "everyone"), but most individual organizations or companies barely invest in training.

This results in high demand with little to no mechanism to replenish the supply. Which is somewhat ironic, given the recent focus on the security of supply chains. This is less a shortage of people than a shortage of experience - an oversupply of motivated entrants and a dearth of senior capacity to absorb and mentor them.

Underlying these issues are more institutional factors: budget cycles that prioritize short-term projects, HR practices that reduce candidates to "years of experience" rather than raw aptitude, and a risk-averse culture (again, often enough very understandably) that prizes certifications over capability.

In combination, all of this creates a self-defeating trap: the very steps needed to close the gap are undermined by the same norms meant to protect organizations.


In other professions, such as medicine or aviation, it is axiomatic that newcomers learn under guidance - a model cybersecurity seems eerily resistant to, even as some voices warn that "it is imperative for decision-makers to prioritize cybersecurity talent management as a strategic necessity".

If I were philosophically inclined, I would ask: Can an industry truly claim to be interested in resilience and security if it systematically limits (or even shrinks) the ranks of its future defenders? The answer is clearly, obviously - no.

The current trajectory of insisting on immediate expertise while neglecting education resembles a broken promise to the "next generation". We preach vigilance, security, resilience and all the other marketing phrases, but at the same time take no steps to create the opportunities for growth necessary for long-term strength.

Ultimately, breaking this cycle will require a shift from short-term gratification to long-range stewardship. Organizations should - no, must - treat early-career professionals as an investment in collective security, not merely a cost. This means realistic job postings, funded internships and/or apprenticeships, and rewarding experienced staff for teaching rather than merely firefighting.

Until that change happens, until we learn to balance demand with development, we risk defending against the attacks of tomorrow with the eroded defenses and burnt out defenders of today. Which, to put it mildly, isn't exactly a great outlook. But it's a predictable outcome of choices we (as in "large parts of the industry") continue to make.