Administrivia: I’ve posted dates for the next ICAC on the Training page. We have a few seats left for those so inclined.
Poor Operations Security (OPSEC) leads to mission failure. We cover OPSEC in pretty good detail in the ICAC, and this is a decent recap. Because good OPSEC is going to be critical to any mission set or activity, I feel the need to cover it here. Also, keep in mind that when I say “post-SHTF” I mean it in both the prepper and resistance sense.
Cleaning supplies go under the sink. Canned goods go into the pantry. Cups and dishes go into the cupboard. All these items reside in the kitchen, but each type of product has its place. Alternatively, we’ve all heard the idiom, “Don’t put all your eggs in one basket.” This is compartmentalization.
Similarly, if pieces of information are our kitchen products then we put them, too, in their proper places. All the information resides within the ACE (or maybe within the members of the ACE team), but not every individual knows everything. In the military and intelligence communities, it’s called “need to know.” If you don’t ‘need to know’ then you don’t know (or shouldn’t know).
This is a simple safeguard against compromise. When US and Coalition Forces share mission information with host nation partners (Iraqi or Afghan), that mission information seems to find its way to other places. I’ll never forget the time I was eating breakfast in Afghanistan during the fall of 2006, and saw Fox News reporting that Operation Mountain Fury was about to begin. My jaw dropped while scrambled egg substitute spilled out onto the ground. Fox News didn’t need to know that, nor did the Taliban. Someone violated operations security.
In all fairness, the Taliban probably knew that something was up before the operation officially kicked off. The point being, mission success notwithstanding, that had Mountain Fury been of a more sensitive nature, we could have considered it ‘compromised’.
The term ‘compromise’ refers to making something vulnerable. Mountain Fury was compromised to an extent. Geraldo Rivera’s famous Iraq sand table also comes to mind. (Rivera was embedded with the 101st in 2003 and, live on air, drew a diagram in the sand outlining a military maneuver and friendly positions.) But compromise can also be leverage. A political campaign can become compromised by the discovery that its candidate is having an affair; an opposing politician could put that information to good use. NSA spying programs have been compromised through unauthorized leaks of sensitive and classified information; in that case, everyday citizens might become more aware of what and how they communicate. After all, NSA probably can’t collect information that’s never communicated electronically. Your plans and activities may be compromised as well if you don’t follow good OPSEC.
In the case of military maneuvers in times of war, a unit’s commanders may not know that their mission has been compromised. It could give the adversary an advantage, or force the friendly unit to make a costly strategic or tactical error… all because someone a) didn’t keep his mouth shut; or b) knew something that he shouldn’t have known. Compartmentalize your sensitive information. These are the keys to your operational vault.
OPSEC is famously everyone’s responsibility. 99 men can implement and observe perfect OPSEC, but all it takes is that one individual to create a charlie foxtrot. While it’s everyone’s responsibility to observe appropriate OPSEC measures, that determination and authority comes from command. Command, whether it’s a captain of the local militia or the leader of a five-man squad, dictates OPSEC measures based on the threat, perceived or otherwise. It’s of little concern for a grocery store to implement strict OPSEC measures because there is no real threat. A bank, on the other hand, implements strict OPSEC measures because there is both an historical and capable threat. Those OPSEC measures aren’t generated by each local branch, but rather by the top of the organization.
Critical Information List.
The first step in developing OPSEC measures is to identify a Critical Information List (CIL). This is going to be a very sensitive document so I wouldn’t keep an electronic copy, and I’d observe very good physical security over it. My list of most critical information contains seven topics:
- Identities. Who is in your prepper or security/defense group? Who provides support? If nothing else, I protect these identities with my life.
- Locations. Where do you and your teammates live? Where do you plan to meet up? Where do you store your supplies?
- Communications. How do you communicate? How do you plan to communicate post-SHTF?
- Operations. How do you travel? What routes do you take? In what activities are you or will you be involved? What operational materiel do you possess or is required?
- Tactics, Techniques, and Procedures (TTP). How do you do the things you do? How do you operate? What’s your Standard Operating Procedure (SOP)?
- Vulnerabilities. Where are your weakest links? How are they exploited?
- Limitations. What can’t you do? What self-imposed or external limitations have been or will be placed on you or your group?
Once we’ve gone through these topics, we have our Critical Information List. Our adversaries may not know this information yet, but any credible threat is going to try to collect it. Collection may be direct, it may be indirect, or it may be through advanced technological means. What indicators do you give as clues to this information?
Who are the people you contact most often? Where are these people located? This list of people is an indicator of who belongs to your group.
Where are you filling up with gas, both near to and far from your home? These locations will help form a pattern of life, and may enable an adversary to identify your bug out or secure location.
How do you contact the people in your group? Do you utilize burner phones, encrypted email, carrier pigeon, smoke signals, telepathy? (Telepathy is highly recommended. So hard to track these days.)
Where do you make most of your large purchases, and what do you buy? Bud’s Gun Shop, Costco, Sam’s, Anytown Tactical all come to mind as pretty good indicators.
Was someone kicked out of your group for non-compliance or failure to adapt? Does that individual have a bad case of ‘butt-hurt’? Is that individual particularly vulnerable to cooperation with your adversary?
Essential Elements of Friendly Information.
Essential Elements of Friendly Information (EEFI) are a list of our critical information posed as questions. It would make no sense to disseminate our CIL (closely held), so instead we disseminate our EEFI (open). I’ve already listed some of our EEFIs:
Who is in your prepper or security/defense group? Who provides support? Where do you and your teammates live? Where do you plan to meet up? Where do you store your supplies? How do you communicate? How do you plan to communicate post-SHTF? How do you travel? What routes do you take? In what activities are you or will you be involved? What operational materiel do you possess or is required? How do you do the things you do? How do you operate? What’s your SOP? Where are your weakest links? How are they exploited? What can’t you do? What self-imposed or external limitations have been or will be placed on you or your group?
We’ll pass around these questions as we educate our teammates about OPSEC. The EEFI list is a general guideline: these are the questions people are likely to ask that we cannot answer. Part of OPSEC is identifying the collection threat, so use the EEFI to make clear to your team exactly which information needs protecting.
Threat Analysis.
In this step, we identify potential threats to our organization: tyrannical regime forces, those willing to enforce unconstitutional law, gang and criminal elements, and any other expected threat post-SHTF. I’ll list a few appropriate steps for OPSEC threat analysis below.
1. Assess what the adversary already knows. We can’t protect what’s already out there. If it’s been communicated electronically, then I have to assume the worst: it’s in a vast network of databases just waiting to be plucked at a time when I can least afford it. I have to make a determination as to the potential consequences of any one piece of information being known. I may want to re-visit my planning, change how I’m currently operating or will operate, or do nothing and just hope for the best. If you opt for the latter, just remember that there were no do-overs for the 170 million people last century who died at the hands of their own governments.
2. Identify the most likely method and target of collection. In this case, we’ll want to break it down by threat, as follows.
- Regime. Everything and anything by method of Human Intelligence (HUMINT – includes surveillance and source operations), Open Source Intelligence (OSINT), Signals Intelligence (SIGINT), Imagery Intelligence (IMINT), Technical Intelligence (TECHINT), and Counterintelligence (CI). The regime will have at its disposal any conceivable means of collection and will use them with extreme prejudice.
- Domestic. I include gangs and other hostile groups that are mostly concerned with targets of opportunity but may also act strategically in pursuit of goals like turf and influence. The domestic threat will collect whatever is directly available through HUMINT and OSINT.
- Criminal. I include hostile individuals who are mainly concerned with targets of opportunity and will not necessarily operate toward any goals other than their own subsistence. The criminal threat will collect whatever is directly available through HUMINT and OSINT as it pertains to a specific target.
- Hackers. If the interwebs still work post-SHTF, then I expect it to be a digital playground for both foreign and non-state mischievous and nefarious actors. They will likely target public figures and civil servants through SIGINT.
- Insiders. Insiders will overwhelmingly utilize HUMINT but may be aided by external actors through SIGINT. This underscores the need to compartmentalize sensitive information.
3. Identify indicators of enemy collection. What are the signs that our sensitive information is escaping, or not being kept secret? Are our safe houses getting hit? Are our sources getting worked over? Are we experiencing mission failure? Is any one part of our compartmentalization standing out? If so, then we have to be prepared to make the determination to radically re-arrange our organization, work to identify our source(s) of exposure, or – worst case scenario reserved for organizational compromise – abandon ship, go dark, and GTFO of the AO. That’s a contingency plan that you need to make with your team. I can’t formulate your response to leakage for you.
4. Assess the education and training of our team to the collection threat. An EEFI only goes so far. Collection may not be direct. You may not even know that your critical information is being collected. Identify and correct deficiencies in OPSEC now.
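The threat-to-discipline breakdown in step 2 lends itself to a simple lookup table. Here’s a minimal Python sketch; the groupings are the ones listed above, but the structure itself is purely illustrative:

```python
# Mapping each threat class from step 2 to its most likely
# collection disciplines (per the breakdown above).
COLLECTION_THREATS = {
    "regime":   {"HUMINT", "OSINT", "SIGINT", "IMINT", "TECHINT", "CI"},
    "domestic": {"HUMINT", "OSINT"},
    "criminal": {"HUMINT", "OSINT"},
    "hackers":  {"SIGINT"},
    "insiders": {"HUMINT", "SIGINT"},
}

def threats_using(discipline: str) -> list:
    """Which threat classes are likely to collect via this discipline?"""
    return sorted(t for t, methods in COLLECTION_THREATS.items()
                  if discipline in methods)

print(threats_using("SIGINT"))  # ['hackers', 'insiders', 'regime']
```

Inverting the table this way quickly answers questions like “who could be listening?” for any given channel, which feeds directly into the education step below.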
Analysis of vulnerabilities.
In our analysis of vulnerabilities, we need to identify what our teammates know and the potential consequences or fallout if that individual is compromised. I automatically assume the worst: everything that individual knows will be garnered through coercion. Tyrannical regimes and lawless elements are notorious the world over for brutal and often effective collection methods. Understand that they want actionable intelligence – information that will drive operations right now. For most individuals, intelligence value is diminished over time. What was actionable an hour ago, a day ago, a week ago, or a month ago is now relatively useless except in forming an historical baseline. So what do your people know and how will their compromise negatively affect your organization?
Risk assessment.
Risk assessment is where we make our final determination as to which OPSEC measures we implement. What are the risks to the mission if a specific OPSEC measure isn’t followed? If a grocery store doesn’t encrypt word that a shipment of citrus fruits is coming in, there’s no risk of mission failure. They don’t need to consider that sensitive information, and their OPSEC measures for it are going to be very basic; maybe even none outside of compartmentalization (the deli and pharmacy don’t need to know). If the grocery store doesn’t protect the time and route of the closing manager as he goes to the bank to deposit cash, then they run the risk of losing a presumably sizable sum. Losing one night’s worth of cash won’t put the grocery store’s operation in jeopardy, so the closing manager won’t need an armored car. We just performed some risk assessment. You can conduct risk assessment on specific activities or specific teammates. Smart leaders will assess the risk for damn near everything (time and resources permitting).
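The grocery-store reasoning above is, informally, a likelihood-times-impact calculation. Here’s a minimal sketch; the 1–5 scale, the thresholds, and the example scores are all my assumptions for illustration, not doctrine:

```python
# Risk score = likelihood x impact, each rated 1 (negligible) to 5 (severe).
# Scale and thresholds are assumed for illustration only.
def risk_score(likelihood: int, impact: int) -> int:
    assert 1 <= likelihood <= 5 and 1 <= impact <= 5
    return likelihood * impact

def measure_required(score: int) -> str:
    if score >= 15:
        return "strict OPSEC measures"
    if score >= 6:
        return "standard OPSEC measures"
    return "basic compartmentalization only"

# The grocery-store examples from the text, scored illustratively:
citrus_shipment = risk_score(likelihood=1, impact=1)  # no one cares, no harm
bank_deposit    = risk_score(likelihood=3, impact=3)  # robbery is plausible
print(measure_required(citrus_shipment), "|", measure_required(bank_deposit))
# basic compartmentalization only | standard OPSEC measures
```

The point isn’t the arithmetic; it’s that writing scores down forces you to justify why one activity gets strict measures and another gets none.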
Contingency planning isn’t a part of OPSEC, per se, but it’s one way we can mitigate risk to our organization. In Afghanistan, (like many other soldiers) I carried a blood chit, a small paper-like document that promised a reward if I was reunited with US/CF. If I was ever captured, or was wounded and separated, my hope would be that I could give my blood chit to a local who might be more enticed by the reward to make contact for my return than he would be to sell me to the Taliban, HiG, or whatever other group that wanted a soldier to play with. In the event that a soldier doesn’t return, US/CF had a contingency plan for combat search and rescue (CSAR). Your organization, if you don’t already, needs to think along these lines. If a teammate is wounded and separated, or otherwise left behind, what should you do? I’m not trained in CSAR. I wouldn’t know exactly where to begin, other than the obvious. What should the team do? What can the team assume? What OPSEC or communications security (COMSEC) measures should be adopted, modified, or scrapped altogether as a result? In this event, I always assume the worst: everything that individual knows is now or will soon be known by the adversary.
Finally we get to OPSEC violations. No one wants to self-report. No one wants to admit that they screwed up and released sensitive or classified information. You can and will probably get into trouble in the military. That’s a major reason why most OPSEC violations go unreported. The other major reason is that members of the military and their families don’t always immediately recognize an OPSEC violation when it happens. But OPSEC violations will bring ruin to your organization, and it’s for that reason that you need a reporting mechanism. It could be as simple as a designated OPSEC officer who handles all things OPSEC, or as complex as an anonymous reporting system. Either way, OPSEC violations require immediate action to rectify your current or future operations. (Refer to Step 3 of Threat Analysis).
OPSEC is a serious matter and does require some planning and foresight. Hopefully this relatively brief article provides you a solid foundational understanding of OPSEC and how you can implement good OPSEC for your team. As always, if you have any questions, send them over or post them below.