Suspicious Activity Reports (SARs) could be one of the most promising data sources for effective threat analysis in the coming years. Under the Nationwide SAR Initiative (NSI), law enforcement agencies can submit local reports for analysis by participating agencies, and they can compare those reports against the broader national database to find matches and vet threats in their regions.
This kind of reporting and analysis is key to identifying people like those who, prior to 9/11, took flying lessons but showed little interest in learning how to land. In other words, the reporting of suspicious situations by ordinary citizens, and by police officers in the normal course of their duties, has the potential to be very helpful in discovering precursors of terrorism or other criminal activity.
The key to success is HOW agencies participate, and the technologies they use to receive, vet, manage, and share Suspicious Activity Reports. SARs are not a new idea; the public has been reporting tips and suspicious activity for as long as there have been police departments. But in the past, those tips were scribbled onto field interview cards or jotted down in an officer's notebook, then filed in a cabinet or left at the bottom of a locker and forgotten.
It sounds like a quaint history review, except that a lot of agencies are still using this approach. The key to the success of SARs is to receive them, vet them, and conduct the proper analysis, tapping the power of the large national dataset. For example, a SAR describing a white van whose occupants are photographing bridge infrastructure needs to be checked against the national database to determine whether there are other matches that might change the threat assessment.
At a minimum, this means entering SARs into a searchable database, but that is just the beginning; a number of best practices need to be applied to the management of SARs. As agencies learn about these best practices, they can compile a checklist of needs and confirm that the technology solution they plan to use will support them. Some of the available tools look slick, but it is wise to do a gap analysis to determine whether a tool actually supports the right process.
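As a rough illustration of what "a searchable database" can look like in practice, here is a minimal Python sketch that stores SAR narratives in a full-text index and queries them. It assumes a Python build whose bundled SQLite includes the FTS5 extension, and the table and field names are purely illustrative, not any product's schema.

    import sqlite3

    # Minimal sketch: keep SAR narratives in a full-text-searchable table so
    # they can be queried later. Table and column names are illustrative only.
    conn = sqlite3.connect("sars_demo.db")
    conn.execute(
        "CREATE VIRTUAL TABLE IF NOT EXISTS sar_reports "
        "USING fts5(report_id, narrative)"
    )
    conn.execute(
        "INSERT INTO sar_reports VALUES (?, ?)",
        ("SAR-0001",
         "White van observed; occupants photographing bridge infrastructure."),
    )
    conn.commit()

    # Instead of digging through paper cards, an authorized user can now
    # search every narrative in seconds.
    for report_id, narrative in conn.execute(
        "SELECT report_id, narrative FROM sar_reports "
        "WHERE sar_reports MATCH ?", ("bridge",)
    ):
        print(report_id, narrative)

A real deployment would sit behind a records management system with access controls, but the principle is the same: once a SAR is captured as structured, searchable data, it can be found again.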
Best Practices
Typically, the local agency will do some vetting of a SAR, possibly working with an analyst at a regional center. As analysts do this work, they may place phone calls to other agencies, run searches, and conduct interviews. Often an analyst will record findings as part of the vetting and workup, and sometimes the workup will need to be transitioned into a full intelligence report or product.
With all this in mind, the tools used to manage SARs must support much more than basic capture. They must be able to record the workup, document the vetting steps, and attach analyst findings. If a SAR you just vetted turns out to be relevant to a threat analysis, all of that workup data and those findings will be useful and need to remain attached to the SAR. Will the technology you are considering support those needs?
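One way to think about that requirement is in terms of the data model: the vetting steps and analyst findings have to live with the SAR itself. The sketch below is a deliberately simplified, hypothetical structure, not any vendor's schema, but it shows the kind of record that keeps the workup attached to the report.

    from dataclasses import dataclass, field
    from typing import List

    # Hypothetical, simplified data model: the vetting steps and findings
    # travel with the SAR record rather than living in an analyst's notes.
    @dataclass
    class VettingStep:
        performed_by: str
        action: str      # e.g. "Called neighboring agency", "Ran vehicle registration"
        result: str

    @dataclass
    class SuspiciousActivityReport:
        report_id: str
        narrative: str
        vetting_steps: List[VettingStep] = field(default_factory=list)
        analyst_findings: List[str] = field(default_factory=list)

        def add_step(self, performed_by: str, action: str, result: str) -> None:
            self.vetting_steps.append(VettingStep(performed_by, action, result))

    sar = SuspiciousActivityReport(
        report_id="SAR-0001",
        narrative="White van observed photographing bridge infrastructure.",
    )
    sar.add_step("Analyst on duty", "Queried regional records system", "No prior reports found")
    sar.analyst_findings.append("Recommend escalation to a full intelligence product.")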
Another area of best practices deals with efficiency. Does the solution you are considering require staff to re-key all the data in order to send a SAR to the NSI database and the FBI's eGuardian system?
If it does, reconsider your choice. Is the tool networked and integrated with other records systems for faster search, or is it essentially a standalone database? Does it let your agency automate the capture of SARs through a public portal where citizens can file reports online? Efficiency will only become more important as the "See Something, Say Something" campaign expands, as it recently did to new sectors of the economy such as hotels. Over time, the growing volume of calls and reports will demand efficient systems for intake and processing. Choosing a solution in which analysts work within a single login and avoid re-keying data, as sketched below, also helps agencies drive toward efficiency and effectiveness.
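To make the re-keying point concrete, the sketch below captures a SAR record once and pushes the same payload to every configured destination. The endpoints are placeholders; actual submissions to the NSI database or eGuardian follow those systems' own interfaces and data standards, so treat this only as an illustration of "enter it once, send it everywhere."

    import json
    import urllib.request

    # Placeholder endpoints only: real NSI and eGuardian submissions go through
    # their own interfaces and data standards. This sketch just illustrates
    # capturing the record once and sending it everywhere it needs to go.
    DESTINATIONS = [
        "https://example.local/rms/sars",       # local records system (hypothetical)
        "https://example.org/regional/sars",    # regional fusion center (hypothetical)
    ]

    def submit_everywhere(sar_record: dict) -> None:
        """Serialize a SAR once and send the same payload to each destination."""
        payload = json.dumps(sar_record).encode("utf-8")
        for url in DESTINATIONS:
            request = urllib.request.Request(
                url, data=payload, headers={"Content-Type": "application/json"}
            )
            with urllib.request.urlopen(request) as response:
                print(url, response.status)

    submit_everywhere({
        "report_id": "SAR-0001",
        "narrative": "White van observed photographing bridge infrastructure.",
    })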
SARs and Search – Finding Matches
If SARs are really to make a difference, we cannot overlook the importance of search capability against SAR data, both the local database and the national NSI database. Can the SAR tool you are considering run effective searches to find similar reports across the nation?
One type of search that is becoming increasingly important for analysts is "proximity search." It is more sophisticated than a typical keyword search: an analyst can specify, for example, that results must contain the words "van" and "bridge" within 20 words of each other in a report. That lets analysts pair concepts without relying on exact quoted phrases and without wading through a flood of false positives. Proximity search is another way to improve efficiency, since fewer false positives mean less time going through search results.
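For readers who want to see the mechanics, here is a minimal sketch of a proximity check in Python; in Lucene-style query languages the same idea is usually written as a slop query such as "van bridge"~20. The function simply reports whether two terms occur within a given number of words of each other.

    import re

    def within_proximity(text, term_a, term_b, max_distance=20):
        """Return True when term_a and term_b appear within max_distance words of each other."""
        words = re.findall(r"[a-z0-9']+", text.lower())
        positions_a = [i for i, w in enumerate(words) if w == term_a.lower()]
        positions_b = [i for i, w in enumerate(words) if w == term_b.lower()]
        return any(abs(a - b) <= max_distance for a in positions_a for b in positions_b)

    narrative = ("Caller reported a white van parked near the bridge; occupants "
                 "appeared to be photographing the support structure.")
    print(within_proximity(narrative, "van", "bridge"))     # True
    print(within_proximity(narrative, "van", "airport"))    # False

A production tool would rely on the search engine's own proximity operators rather than scanning narratives one at a time, but the principle, and the payoff in reduced false positives, is the same.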
In conclusion, SAR data is going to be incredibly valuable, but only if we manage it correctly. We cannot take a siloed approach; keeping data in standalone platforms is a step back toward the old paper field interview cards. We need to make it easy for analysts to reach conclusions and escalate a SAR into an intelligence report within the same workflow, and we need to help analysts see each other's work, learn from prior workups, and harness the power of all these observations.
Ultimately, the success of the NSI depends on the technology decisions each participating agency makes. If we all keep this in mind, we can hope to measure that success in reduced crime and thwarted terrorism. We all want to be part of this success; failure is not an option we can live with.