Military OPSEC: Coincidence or Deliberate Signal?
And How to Apply the Same Thinking in Your Organization...
Three years ago, I was taking part in a major French military exercise as a civilian contractor. Among the many participating nations was a contingent of U.S. service members.
One day at lunch, while waiting in line, I overheard a group of young soldiers speaking English. Their shoulder patches made their U.S. affiliation unmistakable. Having lived and studied in the United States, I’ve always enjoyed interacting with Americans. So, naturally, I greeted them and asked where they were from to start a casual conversation.
The response was immediate, guarded, and unambiguous: they had clearly been briefed on the risk of foreign elicitation attempts and were not going to disclose anything. The fact that I was one of the few people dressed in civilian clothing amid a sea of uniforms probably didn’t help my case either.
Recognizing what had just happened, I didn’t press the matter and went back to my smartphone. The exchange seemed trivial at the time, and I quickly forgot about it.
Until that evening…
Back at my hotel, I noticed that my LinkedIn profile had been viewed by an intelligence analyst from a U.S. National Guard unit.
I was genuinely stunned. If this wasn’t coincidence—and it absolutely could have been—it would imply several things:
The soldiers had been properly briefed on elicitation risks (entirely expected).
They were able to identify me in real time (perhaps via a discreet photo taken later, while I was at lunch).
The information was reported and processed within just a few hours.
The analyst’s deliberate visit to my LinkedIn profile from his own public account may have been a subtle message. The classic “we see you.”
If that interpretation is correct, it’s impressive.
Of course, Occam’s razor strongly suggests a far simpler explanation: random chance. Ian Fleming captured this ambiguity well: “Once is happenstance. Twice is coincidence. Three times is enemy action.” So, realistically, there was probably nothing extraordinary about it.
What Does This Mean for Organizations?
Regardless of the true explanation, the episode led me to reflect on a more practical question:
How could a comparable level of awareness and responsiveness be implemented within corporate environments, particularly during the reconnaissance phase described in the MITRE ATT&CK framework, a phase that is notoriously difficult to detect and counter?
This is not about encouraging employees to photograph every stranger who asks questions about their role. That would raise obvious legal and privacy concerns. But still, it does raise legitimate questions:
Do exposed or sensitive roles within your organization receive specific awareness training on social engineering and elicitation tactics?
Is there a low-friction mechanism for reporting unusual interactions or behaviors?
Do you have the analytical capability to validate and correlate weak signals (for example, through Threat Intelligence and OSINT practices)?
Pushing the idea further, a four-part approach emerges:
1. Targeted Awareness
High-exposure roles—executives, R&D, procurement, executive assistants—benefit from tailored briefings. The goal is to recognize patterns such as the “vendor” asking unusually detailed questions, the “journalist” probing into supply chains, or the LinkedIn contact displaying excessive curiosity about internal initiatives.
2. Frictionless Reporting Channels
The mechanism matters less than the usability: a dedicated email alias, an intranet button, a lightweight form. The key requirement is speed and simplicity. Employees must be able to flag something suspicious in seconds, without the burden of formal reporting.
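To make the idea of a low-friction channel concrete, here is a minimal sketch in Python. Everything in it is illustrative and assumed, not an existing tool: a single function that appends a free-text report, with a timestamp, to a JSONL log. The point is that nothing beyond the summary is mandatory, so filing takes seconds.

```python
import json
import time
from pathlib import Path

def flag(summary: str, log: Path = Path("elicitation_reports.jsonl")) -> dict:
    """Record a suspicious interaction with minimal friction.

    Only a free-text summary is required; the timestamp is added
    automatically. One JSON object per line keeps the log trivially
    appendable and easy to parse later.
    """
    entry = {"ts": time.time(), "summary": summary}
    with log.open("a", encoding="utf-8") as f:
        f.write(json.dumps(entry) + "\n")
    return entry
```

In practice the same principle applies whatever the mechanism (email alias, intranet button, chat bot): one required field, everything else optional.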
3. Analysis and Correlation Capability
This is where Threat Intelligence functions become critical. An isolated report rarely means anything. Multiple reports involving related teams or projects begin to form a pattern. OSINT techniques then help assess credibility: does the individual exist? Is the digital footprint coherent? Are there inconsistencies?
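The correlation step can be sketched in a few lines. This is a toy illustration, not a real triage pipeline: all field names and the threshold are assumptions. It simply counts how many independent reports touch the same team or project and flags anything above a threshold, which is the "multiple reports begin to form a pattern" logic in its simplest form.

```python
from collections import Counter
from dataclasses import dataclass

@dataclass
class Report:
    reporter: str
    team: str      # team or project the approach targeted
    channel: str   # e.g. "email", "linkedin", "in-person"
    summary: str

def correlate(reports: list[Report], threshold: int = 2) -> dict[str, int]:
    """Flag teams mentioned in at least `threshold` reports.

    A single report is usually noise; repeated interest in the same
    team starts to look like targeting and warrants analyst review.
    """
    counts = Counter(r.team for r in reports)
    return {team: n for team, n in counts.items() if n >= threshold}

reports = [
    Report("alice", "R&D", "linkedin", "contact unusually curious about roadmap"),
    Report("bob", "R&D", "email", "'vendor' probing the supplier list"),
    Report("carol", "HR", "in-person", "stranger asking about badge policy"),
]
print(correlate(reports))  # {'R&D': 2}
```

A real implementation would add time windows, fuzzy matching on names and personas, and the OSINT credibility checks mentioned above, but the core idea stays the same: weak signals only become meaningful in aggregate.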
4. Feedback Loops
Closing the loop reinforces the security culture. Even minimal responses—“Thank you, reviewed, no issue detected” or “Good catch, this warranted attention”—strengthen awareness and encourage future reporting behaviors.
What do you think?
Is this already a thing where you work? If not, what prevents it?