March 25

The story of my mistakes

Hello everyone!

Even though I consider myself a high-level cyber intelligence specialist, I still make mistakes — just like anyone else. Admitting your mistakes isn’t a bad thing; what’s bad is not fixing them. That’s why an essential part of my job is constantly asking myself, “What if I’m wrong? Have I tried every method?”

In this post, I won’t mention any organizational or project management mistakes — only analytical ones.

Mistake #1: Bias, overreliance on experience, and skipping “unlikely” tools

Let’s say I need to build a dossier on a person, and I don’t have much data. I start thinking there’s probably nothing more to find. I skip searching by nickname because in my experience, it rarely works.
But — is that a valid reason not to check?
No.
Could it work anyway and become a pivot point for new findings?
Absolutely.
Overcoming this mindset isn’t always easy. Sometimes we know about a method or tool, but it feels “too niche” or like it’ll never be useful. Then later, we realize it would’ve been perfect for a particular situation.

Mistake #2: Not updating methods and keeping up with trends

I have well-established search algorithms for individuals, companies, images, and so on — along with templates for each report.
But I still need to monitor new tools and techniques that could help uncover more than my current methods allow.
When you’ve already somewhat “made it” professionally, learning becomes harder. Reasons vary: pride, overwhelming workloads, or simply struggling to find good-quality resources.
Still, I actively fight this problem, and I’m doing pretty well.
To summarize: once you get used to certain techniques, there’s a risk of applying them even where they don’t work well.
That’s why it’s important to check for alternative approaches and try broader strategies.

Mistake #3: Bias during data verification

When I find some rare, valuable unverified data, I really want to believe it’s accurate. But that’s dangerous — it’s where tunnel vision can creep in.
That’s when we start favoring information that confirms our expectations.
I do manage to fight this well, but it’s still hard.

Also, I used to fall into the habit of trusting “authoritative” sources without cross-checking.
Authority ≠ Accuracy.

Sometimes, random comments on social media are more informative than full articles in seemingly credible media outlets.

Mistake #4: Overcomplicating things

Sometimes I want to apply something fancy — like graph analysis or machine learning.
But often, problems can be solved much more simply.
A basic example:
Let’s say the target person has a social media account, and we want to understand their inner circle.
Sure, we could analyze their friends from the same city, compare timestamps on shared photos, dig deep into cross-relations…
But sometimes, just scrolling their feed and seeing who consistently likes their posts is enough.
Not very exciting, but effective.
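As a toy illustration of that "just count the likers" idea, here is a minimal Python sketch. The accounts and like lists are invented for the example; in practice you would collect them by scrolling the feed, as described above:

```python
from collections import Counter

# Hypothetical sample: likers noted manually from the target's
# last few posts (all account names are invented).
likes_per_post = [
    ["anna", "boris", "clara"],
    ["anna", "boris", "dmitri"],
    ["anna", "boris"],
    ["anna", "erik"],
]

# Count how often each account appears across all posts.
counts = Counter(liker for post in likes_per_post for liker in post)

# Accounts that liked at least half of the posts are candidates
# for the inner circle; the threshold is an arbitrary choice here.
threshold = len(likes_per_post) / 2
inner_circle = sorted(name for name, n in counts.items() if n >= threshold)
print(inner_circle)
```

With the sample data above, only the two accounts that show up repeatedly survive the cut; one-off likers drop out.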

Other common mistakes I’ve learned to handle

Here are a few that I now manage much better:

  • Underestimating time sensitivity
    OSINT data goes stale fast. Always ask when the data was obtained, what has changed since then, and how that affects your conclusions.
  • Too much or too little info in a report
    If the task is to assess someone’s reputation, there’s no need to include all their geolocation data or every single friend from social media.
    The report structure should match the task and the client’s expectations.
  • Ignoring false positives/negatives
    Say you find two people with the same name and nearly identical birthdates — only one digit differs.
    You have to check if it’s really the same person or just a coincidence.
    Misinterpreting this can lead to false links and flawed analysis.
  • Underestimating human error
    Everyone has a social circle, and people within a circle tend to behave similarly. That shapes our assumptions about how others will behave.
    For instance, if your target surrounds themselves with people who value security and privacy, you might assume everyone in that circle is disciplined.
    Then you read a news story: a secret agent posts a photo with geotags saying “I love my job” right outside their workplace.
    It sounds fake, right? A trap? Disinformation?
    But sometimes... people just do dumb stuff.
    There’s a funny example of bodyguards for high-profile people who didn’t turn off route tracking in their apps.
    So how do you avoid this bias?
    Study diverse social groups: rural folks, scientists, homeopathy fans, alcoholics, drug users, etc. Everyone behaves differently.
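The false-positive point above — two records with the same name and birthdates that differ by a single digit — can be made concrete with a tiny check. This is only a sketch with invented names and dates; a one-digit difference is a signal to investigate, never proof either way:

```python
def digit_differences(a: str, b: str) -> int:
    """Count differing characters between two equal-length date strings."""
    if len(a) != len(b):
        # Treat a length mismatch as maximally different.
        return max(len(a), len(b))
    return sum(1 for x, y in zip(a, b) if x != y)

# Hypothetical records: identical name, birthdates one digit apart.
record_a = ("Ivan Petrov", "1985-03-12")
record_b = ("Ivan Petrov", "1985-03-13")

diff = digit_differences(record_a[1], record_b[1])
if record_a[0] == record_b[0] and diff == 1:
    # Could be a typo in one source, or two different people.
    print("Near-match: verify with independent data before linking")
```

The point is not the code but the discipline: flag near-matches explicitly and resolve them with independent evidence, instead of silently merging the records.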

Let’s wrap it up here for now :)