Your reporting should act as a yardstick.

Threat Intelligence reports often don't manage to adhere to basic principles of analytical work. But they should.

As much as I'm spending more and more of my professional time these days on managing people and speaking to varying audiences while going through the motions of looking like I know what I'm doing, I still like to pretend to be an analyst. I'm aware that this pretense requires a significant amount of suspension of disbelief, but please… let me have this vital rest for my sanity.

Since I work in (cyber) threat intelligence, that means reading a lot of reports that pop up in our reporting tracker, in community exchanges, on social media, or elsewhere. And often enough, these reports suffer from some of the same issues I consider highly problematic in other areas of cyber threat intelligence.

Just because you add "intelligence" as a label to something doesn't mean that it's actually intelligence. I'm a big fan of dashboards, tools, data sources and all the other fancy toys that come with working in information security. But I wish that more "intelligence analysts" working in cyber threat intelligence would spend time on educating themselves on what the concept of "intelligence" actually entails.

Don't get me wrong, I'm not a trained intelligence analyst either (although my boss keeps using my "Certificate of Intelligence Practitioner" as a smokescreen to tell other people exactly that, to my great dismay), but talking to a "Senior Threat Intelligence Expert" and realizing they have never even heard of the concept of the intelligence lifecycle… that hurts.

Similarly, quite a few "cyber threat intelligence reports" could also do with some more "intelligence analysis theory", because they really aren't intelligence products. I am not (necessarily) referring to the literary quality of the writing - although I do sometimes take issue with how a lot of reports are written, something I have talked about on this blog in the past.

I'm talking about them not fulfilling basic requirements that make an intelligence report actually useful. While most of the reports written by and for our industry won't serve as a vital tool for decision-makers in the higher echelons of national security organisations, they are still a tool with a lot of potential. The quality of a report directly influences the effectiveness of the actions taken in response to its findings - if certain standards are adhered to. Or, to tap an already well-tapped sign: IOCs are not threat intelligence.

In the following paragraphs I want to talk about what I consider important characteristics of intelligence reports, and how these characteristics help provide decision makers with a tool to navigate and overcome the challenges of an environment - information security - that the majority of them probably isn't intimately aware of.


Actionability and Timeliness

Those two are pretty much the absolute baseline for intelligence products. The intelligence you provide needs to be current and relevant to the decision-making context. On the other hand, your report may be timely, but if its insights are vague, overly complex, or lack clear recommendations, its value diminishes.

Actionability bridges the gap between analysis and real-world impact, requiring reports to not only present findings but also to articulate what should be done, by whom, and by when.

To give a typical example: a threat assessment is only useful if it not only identifies a threat actor or a vulnerability, but also tells you what to do about it now. As frustrating as it might be - because sometimes that's all you can publicly tell people - "Be scared!" is not exactly what I would consider actionable.

Relevance (aka "cut the noise")

While it's understandable that reports generated and published by commercial vendors have an inherent need to convince potential customers of the utility of the services the company is selling, I'd nonetheless argue that even freely available reports should strive for maximum relevance. Because if the free reports are of high quality, how good must the ones you have to pay for be?

So yes, highlighting your successes and how amazing your company is might sound appealing. But it dilutes the essence of a report. Ideally, all the information provided in the report is pertinent to the specific context, audience, and objectives. It should focus readers' attention on what matters most, facilitating effective action. Irrelevant information wastes time and might obscure critical details.

Specificity

The things you consider relevant for your audience should be presented in a precise, unambiguous way. It should be clear to the reader what the facts are, what your analysis is, what your conclusions are, and what you recommend as a potential course of action.

Given how much information usually ends up feeding into a finished product it's easy to overwhelm readers with excessive details. Something I'm definitely guilty of myself.

Readability (aka "please don't do the academia thing")

A former partner of mine studied philosophy, and she asked me for help with one of her homework assignments, because she couldn't figure out what the text they were given to analyze was saying. And honestly? After going through the single page of writing several times, I couldn't either.

It was as if the author decided, after completing the writing process, to open a thesaurus and proceed to mangle the text as much as possible in order to make it unreadable. That's obviously not something exclusive to academic writing, but throughout the years I have noted a certain tendency towards the "mangling" in it.

Ideally, you want your report to be easily read and understood by its intended audience. Readability ensures rapid comprehension, which in turn, you guessed it, facilitates quick and decisive action.

You can achieve readability through clear language, logical structure, and visual aids. But even with the best intentions and utmost effort, you will obviously face challenges: balancing clarity with technical precision, and ensuring that visual aids enhance rather than distract from the content. As much as I'm not a friend of self-promotion, the post I wrote a while ago about infosec writing covers a lot of this.

Brevity (aka "why many words when few words good")

While at first glance potentially similar to "relevance", there's a difference. Brevity is the quality of being concise and to the point, presenting information efficiently and without unnecessary detail. A trainer once told me that "brevity is your way of showing that you respect your audience's time and cognitive load", which is a good summary.

As with readability, there are challenges here. I myself have accidentally omitted ultimately critical details in the pursuit of brevity. Don't overdo it - sometimes an extra word (or an extra visual aid) is actually the right choice.

Imagination

At first glance, "imagination" might not be something you'd want in an intelligence report. Yet the part where you imagine different scenarios and weigh competing hypotheses - that's the borderline textbook definition of analytical work.

Nonetheless, imagination can be part of your report; specifically, the imagination of your audience. Aside from simply handing out analyzed information and recommendations, you want to encourage readers to look beyond the immediate data.

You want them to think about possibilities for their specific situation that are only visible to them. This is mostly going to be relevant for strategic-level threat intelligence.

For example, instead of simply documenting current geopolitical tensions and informing your readers in general terms, your report could map out multiple escalation pathways, illustrating how minor incidents could spiral into larger conflicts (or how diplomatic interventions could de-escalate), and how these could affect the cyber threat for the organisation the reader is responsible for.


I know that perfection is the enemy of progress, and I am aware that working in information security, especially in cyber threat intelligence, isn't always an environment conducive to adhering to all of the things mentioned above. But I wish we, as a community, would try harder.

And, having gone through some of the stuff I produced in the past few years, that includes me as well.