As I have written before, intrusion analysis is equal parts knowing the technical elements of an intrusion and being an analyst. However, most in this domain spend an inordinate amount of time studying technical details compared to honing any analytic skills.
How long has it been since you’ve taken a highly technical course? (probably within the last year or two) How about an analysis course? (probably in the last 5 years, 10 years, never?)
I read several threat intelligence reports daily. It is painfully obvious how the lack of analytic skill is harming the discipline. Many folks come from technical degree backgrounds and analyze packets and binaries well enough, but can't seem to tell the difference between inductive, deductive, and abductive reasoning. Furthermore, their managers and mentors never recognize the problem; they just send them to more technical courses.
What is the risk? Threat intelligence provides insight and context to improve decision making. The risk of bad intelligence is high. Bad decisions can easily be made from poor intelligence – potentially doing more harm than good. Good analytic practices improve analysis thereby decreasing the risk of poor intelligence. You could have the best packet analysis skills in the world, but if you cannot communicate your conclusions effectively to those who need to act on your information those skills are effectively useless in threat intelligence.
We need to do better. I started this post about a month ago and wrote down a “lesson” whenever I saw an example of poor analysis. Needless to say, I saw some of these several times. (Contrary to the recommendation of others, I will not cite/quote specific examples – I believe that would only name and shame others)
Trend – the word actually means something
How many times per week must I read about a new “trend” from threat intelligence? One or two events does not constitute a trend. Even three or more events, depending on the universe of events, may not constitute a trend. Trends are serious. True trends in adversary activity and methodologies inferred by threat intelligence should drive data collection, analytic tradecraft, and defensive decisions. Before you start throwing out the word trend just because you’ve seen something a few times, consider the millions of other events you’re not seeing and consider if they’re just an anomaly rather than a trend.
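To make the base-rate point concrete, here is a minimal sketch using a Poisson model. All of the numbers are invented for illustration: against a large enough universe of events, a handful of "matches" is often consistent with pure chance.

```python
from math import exp, factorial

def poisson_pmf(k, lam):
    """Probability of exactly k events, given an expected rate lam."""
    return (lam ** k) * exp(-lam) / factorial(k)

def prob_at_least(k, lam):
    """Probability of observing k or more events by chance alone."""
    return 1.0 - sum(poisson_pmf(i, lam) for i in range(k))

# Hypothetical numbers: 2,000,000 observed events, and a "pattern" that any
# single event matches by coincidence 1 time in 1,000,000.
expected_by_chance = 2_000_000 * (1 / 1_000_000)  # lam = 2.0

# Seeing 3 matching events sounds like a trend, but...
p = prob_at_least(3, expected_by_chance)
print(f"P(>=3 matches by chance alone) = {p:.2f}")  # about 0.32
```

With those assumed numbers there is roughly a one-in-three chance of seeing three "matching" events by luck alone – hardly a trend.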
Analysts and conclusions are like horses: sometimes you need to lead them to water
In many cases I can follow the logical progression of hypotheses and facts to the conclusion. In some cases I cannot – either because an analyst failed to include the appropriate evidence or fact, on which an assumption must now rest, or because the logical reasoning is convoluted. Ensure evidence supports your conclusions and the logical reasoning is clear. Don't assume that what is clear in your mind will be clear in mine.
You can’t be completely confident all of the time – use words of estimative probability
Do you know how often I see the effective use of estimative probability in recent threat intelligence reporting? Almost never. This is a problem. Not everything presented is irrefutable fact; good analysis will contain a proper mix of data/facts, hypotheses, and conclusions, and the confidence values of those conclusions vary. When you don't effectively apply estimative probability and variable measures of confidence, it removes value from the analysis and increases the risk of poor decision making by consumers. First, if you don't know what estimative probability is, LEARN about it. Then learn how and when to apply it properly. Importantly, also know what words/phrases to avoid (i.e. weasel words).
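As a rough illustration of how numeric confidence can map to estimative language, here is a sketch loosely based on Sherman Kent's classic scale. The exact cutoffs vary by organization; the ones below are assumptions for illustration, not a standard.

```python
# Illustrative mapping from a numeric probability to a word of estimative
# probability, loosely following Sherman Kent's scale. Cutoffs are assumed.
KENT_SCALE = [
    (0.93, "almost certain"),
    (0.75, "probable"),
    (0.40, "chances about even"),
    (0.20, "probably not"),
    (0.0,  "almost certainly not"),
]

def estimative_phrase(p: float) -> str:
    """Translate a probability in [0, 1] into an estimative phrase."""
    if not 0.0 <= p <= 1.0:
        raise ValueError("probability must be between 0 and 1")
    if p == 1.0:
        return "certain"
    for cutoff, phrase in KENT_SCALE:
        if p >= cutoff:
            return phrase
    return "almost certainly not"

print(estimative_phrase(0.8))  # probable
print(estimative_phrase(0.5))  # chances about even
```

The point is consistency: if "probable" means roughly 75% in one paragraph, it should not quietly mean 40% in the next.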
Never be afraid to include contrary evidence
Do you know how many times I saw evidence contrary to the conclusion presented in a threat intelligence report this month? Never. Practice analytic honesty. If there is exculpatory evidence, contrary evidence, or an alternative hypothesis – show it. As long as you’re following some of the other lessons here (e.g., separating fact and hypothesis, using words of estimative probability) it will strengthen your analysis and provide more value to the consumer.
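One structured way to keep contrary evidence visible is a miniature Analysis of Competing Hypotheses (ACH) matrix: score each piece of evidence against every hypothesis, and let inconsistent evidence count against a hypothesis rather than quietly disappear. A toy sketch follows; the hypotheses, evidence, and scores are all invented.

```python
# Toy Analysis of Competing Hypotheses (ACH) matrix.
# Scores: +1 consistent with the hypothesis, 0 neutral, -2 inconsistent.
# Weighting inconsistency more heavily follows the spirit of ACH, where
# disconfirming evidence matters most; all values here are invented.
evidence = {
    "malware compiled during UTC+8 business hours": {"H1: Group A": +1, "H2: Group B": 0},
    "C2 infrastructure overlaps prior Group A ops": {"H1: Group A": +1, "H2: Group B": -2},
    "tooling publicly available to anyone":         {"H1: Group A": 0,  "H2: Group B": 0},
    "victim profile never before targeted by A":    {"H1: Group A": -2, "H2: Group B": +1},
}

scores = {}
for item, ratings in evidence.items():
    for hypothesis, score in ratings.items():
        scores[hypothesis] = scores.get(hypothesis, 0) + score

for hypothesis, total in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{hypothesis}: {total:+d}")
```

The point is not the arithmetic but the discipline: the contrary rows stay on the page instead of being dropped from the report.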
Just because you’ve seen something for the first time doesn’t mean it’s the first time it happened
We all love finding something awesome and telling the world. It’s cool because we all want to know what you’ve found! But, please don’t assume it is the first time it has happened or even the first time it has been seen. Having confidence is critical, but hubris is deadly to analysis.
Don’t operate on an island
You are not alone! Don't act like it. Share and consume, enrich and enhance. Go ahead and build on the analysis of others (citing appropriately). Whatever your observation point or data sources, they're not omnipresent. I rarely see analysis reference other (obviously) related pieces. Why is that? The power of defenders lies in our community and our ability to work together against an adversary.
Be bold, but don’t be stupid
I like my analysis like I like my coffee: bold. But, there is a line between taking facts to their logical conclusion and taking facts to crazy-land. The difference is logic. Ensure your conclusions and hypotheses follow logically from the facts through induction, deduction, or abduction. If your conclusions cannot be logically traced or tested, then they’re likely living in crazy-land.
Don’t mix hypotheses, conclusions, and facts
Hypotheses, conclusions, and facts are not the same. Your intelligence reports should not treat them as such. Ensure that your readers can effectively separate the three through your use of language, formatting, etc. When the three are confused it can lead to erroneous assumptions by consumers and lead to decisions made on weak conclusions rather than facts.
Save hyperbole for the glossy promotion material
Hyperbole has its place. It doesn’t have a place in threat intelligence. Save that for the glossies. Be precise, honest, and accurate. Don’t embellish or exaggerate. Trust me when I say we have enough people running around like chickens with their heads cut off in this field.
Logical fallacies are just that, get to know them
Enough said. I’m sorry I have to say this, but please understand the differences and applicability of deductive, inductive, and abductive reasoning BEFORE writing your first threat intelligence report. Or, at the very least, have an editor who knows the difference.
Don’t create new words when existing words suffice
I’m not going to name-call here. You know who you are. There are times when words/phrases have multiple meanings. I understand that. But, aside from that….stop it.
Tell a story!
Your analysis is a story. You’re effectively documenting history – studying the past – in the hopes of making conclusions and judgments which will help the present and future. While you are documenting the activity of computers you are ultimately describing the actions caused by adversaries. Just like any story your report should have a beginning, middle, and an end.
Answer my questions
Write as if you are in a conversation. Think about what somebody else may ask of what you're saying, and address those questions in the text. Any questions left unanswered can turn into assumptions on the part of the consumer/customer. If you don't have an answer, feel free to write: no further information.
Be concise, be accurate
Practice analytic honesty and respect the time of your reader. The report you're considering may actually need to be three different reports – one describing all of the malware reverse engineering, one describing all of the network activity, and another describing the threat itself, which references the other two. Putting everything in one report does not make it more consumable; it makes it less consumable and allows analysts to muddle up various lines of analysis.
Describe diagrams, charts, and tables in both the narrative text and a caption
This is just a pet-peeve of mine, but one which I find increases the readability of threat intelligence reports. Make sure you describe your diagrams, charts, and tables in both the narrative text (as part of the story) and also in a caption. I find this necessary because as I move backwards and forwards through a report reading and re-reading, forming and re-forming logical chains, I don’t want to hunt for the description in the text every time. I also don’t want to jump to the caption in the middle of text if not necessary which breaks my concentration.
TT
Good advice – you talk about analysis training – where? Any good recommendations? Everything I find these days about threat analysis or intelligence is 80% search engines and tools, and maybe 10% about the thought process. I have a cadre of technically sharp people, but I need help with getting them to think the right way. Right now I am leaning toward a philosophy course on arguments/elementary logic…
Michael Paisley
It's 'just' critical thinking that you need to develop. A very short phrase, but actually something that takes a while to master. My recommendations are below:
Use a clear risk framework and apply analysis that reduces your uncertainty around the factors that drive the risk
Understand and broadly apply the scientific method
Know the common cognitive biases and logical fallacies that we are all guilty of suffering from, but whose impact you can reduce in your reasoning – Professor Google can give you lots of info; just use a reputable site.
Learn quantitative methods to support your analysis (I recommend "Quantitative Intelligence Analysis" by Waltz – I can't recommend this highly enough, and it will reduce error from bias or fallacy)
Learn qualitative techniques to support your analysis (search for the book by two chaps named Heuer and Pherson called "Structured Analytic Techniques for Intelligence Analysis")
Hope this helps
Mick
Ethan
I think intrusion analysts without good analysis skills do more harm than good. Not only is it important to continue analysis training, I think it's necessary. The main reason is that if you're not spending your days implementing or troubleshooting technologies, then no amount of technical classes will make you extremely proficient anyway. So if you lack the ability to correlate events and draw conclusions based on reasoning rather than technical know-how, then you will fail, because 9 times out of 10 your technical know-how is not sufficient for that on its own. Great article! Thanks Sergio!
john scroggins
awesome article… good number of take-aways. Thanks!!
Michael Cloppert
Serg,
Finally getting around to commenting on this great post. You state:
“The risk of bad intelligence is high. Bad decisions can easily be made from poor intelligence – potentially doing more harm than good.”
I think many in our industry don’t realize just how acute this risk is. In reading DNI Clapper’s 2015 “Worldwide Threat Assessment of the US Intelligence Community,” it struck me just how often industry, not intelligence community, reports were cited in the first three pages of the very first section – covering “Cyber.” I printed it out and highlighted every sentence that referenced or was based on intelligence reports produced by a company. I stood back, looked at the three pages from a distance, and it was clear: these reports substantially contributed to the assessment he provided.
Of course, this is an easy way for the IC to provide evidence without revealing sources and methods, and they can cherry-pick those reports which they have “other data” to validate, but the point is the same nevertheless: analysis in industry is important for national security, both directly as in this case, as well as indirectly by informing defenders whose collective efforts impact the economic security of the U.S. The same could probably be said for many other countries.
Again, great post! Let’s hope some nameless vendors get around to reading it sometime…
Kirk Schafer
Psychology of Intelligence Analysis. Richard J. Heuer, Jr.
“Highlights the problems with human perception and thinking.”
It’s old but provided free on the CIA website; if their PDF link is broken, Google the title for the corrected link from same site.