Updated: Feb 21
PRIVATE SECTOR PERSPECTIVE — Legislators are calling for mandatory disclosure of cybersecurity events in previously unregulated industries. On the surface, this seems like a reasonable way for defense and intelligence agencies to acquire more data on adversary activity in the civilian sector. With more data on hand, more actionable intelligence can be generated. But this is true only under certain conditions. In reality, this type of data acquisition and synthesis is quite complicated: the input data must be uniform and pristine, or the resulting intelligence will not be accurate. Legislators who expect under-resourced security teams with disparate discovery and verification protocols to produce timely, untainted, unified data show a naïve understanding of what it takes to turn raw data into viable intelligence. Pristine data obtained via required-disclosure regulation is an unreasonable expectation, and one that will yield unviable intelligence.
The alternative is surveillance, or automated, first-party collection of raw data for synthesis into actionable intelligence. When authorities acquire data directly via first-party collection methods, data integrity and signal fidelity are more likely to be intact, resulting in more viable, accurate, actionable intelligence. But Americans have a hypocritical relationship with surveillance. While it’s highly objectionable for heavily-regulated government authorities to conduct domestic surveillance for the prosocial purpose of national security, there is little objection to the largely unregulated private sector conducting granular, persistent digital surveillance for the purpose of promulgating consumerism.
In America, surveillance is situationally acceptable. Privacy is most valued when there is a perceived risk of discovery of illicit behavior and/or when culpability is present. Privacy is not valued when culpability is not a factor and any measure of law enforcement or risk of arrest is absent.
Surveillance, in some form, is fundamentally essential for cyber national security. Defense and intelligence agencies require timely insight into advanced persistent threat (APT) activity within the inviolable homeland to uphold their security missions. But even with express consent, omniscient surveillance is impossible at national scale, and even more untenable given the exponentially expanding cyber domain attack surface and automation of APT aggression. There are simply too few eyes for the scope, scale and frequency of security events.
A similar conundrum confronted the 18th-century philosopher and social reformer Jeremy Bentham, who sought to surveil and manage Britain’s booming prison inmate population. Bentham was the architect of the Panopticon, a prison structure that permits a single guard to observe thousands of inmates at once, affording visual surveillance of prisoners at a previously unattainable scale. The design was reliably effective at deterring unruly behavior in a delinquent population.
The key to prisoner compliance in Bentham’s Panopticon was not omniscient surveillance, but uncertainty. Even though prisoners plainly understood that a guard could not see every individual at all times, the uncertainty over when and where observation occurred was the deterrent that created a state of orderly compliance.
The parallels between Bentham’s Panopticon and current cyberconflict are notable. In the Panopticon, the guard’s objective was not omniscient surveillance but the assurance of persistent, sustainable, orderly incarceration. In other words, business as usual. Similarly, defense and intelligence agencies require insight into private space, not for omniscient surveillance or for law enforcement purposes, but to obtain the APT behavior data required to assure a state of persistent, sustainable national security. In other words, to assure business as usual in this dual-use domain.
Another striking similarity between Panopticon prisoners and APTs is the relative indifference of those under surveillance toward credible threats of punishment. The threat of punishment carries little weight both for those already serving a punitive sentence and for those beyond the reach of the American legal system. Despite this indifference, panoptic surveillance and its inherent uncertainty deterred noncompliant behavior without any explicit threat of further punishment.
At present, APTs are successfully exploiting the size, scale and relative anonymity afforded by the cyber domain. They know hardly anyone is watching, and they know where the watchers reside. Their tactics reliably evade federal authorities’ surveillance and data collection methods, and they take refuge in endpoints, the most intimate area of the attack surface, specifically because they know that U.S. government surveillance is absent there. The current U.S. cyber warfighting force does not now, and could never, scale to the level required for omniscient surveillance. It’s Bentham’s conundrum: There are simply too few eyes for the scope and frequency of aggression.
This raises the question: What would a panoptic construct look like in cyberspace, a domain where forensic surveillance is available but traditional visibility is not? It would certainly require visibility into domestic endpoints, the safe harbor where APTs reside free from observation. Privacy-preserving artificial intelligence and machine learning (PPAI/ML) provides part of the solution. PPAI/ML employs techniques that assure privacy at every level of analysis. Unlike anonymization, which can be reverse-engineered to assign attribution, PPAI/ML is truly indifferent to the identity and digital content of those being monitored.
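The piece does not specify which privacy-preserving techniques it has in mind; differential privacy is one widely used example of the class. A minimal sketch of the Laplace mechanism, which releases an aggregate count (say, of security events observed across endpoints) with calibrated noise so that no individual record can be confidently inferred from the output (the function names here are illustrative, not from any particular library):

```python
import math
import random

def laplace_noise(scale: float) -> float:
    """Draw one sample from a zero-mean Laplace distribution via inverse-CDF sampling."""
    u = random.random() - 0.5  # uniform on [-0.5, 0.5)
    sign = 1.0 if u >= 0 else -1.0
    return -scale * sign * math.log(1.0 - 2.0 * abs(u))

def dp_count(true_count: int, epsilon: float = 1.0, sensitivity: float = 1.0) -> float:
    """Release a count with epsilon-differential privacy (Laplace mechanism).

    Smaller epsilon means more noise and stronger privacy; sensitivity is
    how much one individual's record can change the count (1 for a count).
    """
    return true_count + laplace_noise(sensitivity / epsilon)
```

Each individual release is deliberately noisy, but aggregates across many releases remain statistically useful, which is the property that lets an analyst monitor population-level adversary behavior while staying indifferent to any single monitored party.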
Panoptic surveillance requires demonstrable proof, and persistent accountability, that law enforcement and risk of arrest are absent. This does not mean blanket immunity for criminal behavior by the party under surveillance, but public accountability and affirmation of the surveilling authority’s indifference toward criminal activity, supported by a lack of enforcement.
Assuming a panoptic construct is implemented at scale, how would defense and intelligence agencies leverage the ensuing uncertainty to assure “business as usual”? The uncertainty only has its intended effect if APTs understand that they are under surveillance. The paucity of traditional visibility inherent to the cyber domain requires easily verifiable proof of panoptic surveillance (i.e., if you can’t physically see the jailer, how can you know they’re at their post?).
Uncertainty about mission success, about detection and attribution, and about how much the United States is willing to tolerate is an underemployed deterrent in current cyber national security strategy. For APTs, uncertainty introduces a novel, destabilizing variable into the cost-benefit calculus of cyber operations.
For American companies under cyber assault, panoptic surveillance is a middle option to the right of authoritarian omniscient surveillance and to the left of ineffective, self-reported mandatory disclosure. Finding the balance between a highly desirable end (a cessation or significant de-escalation of cyberdomain hostilities) and a necessary but odious means (surveillance) is complicated. But because diminished privacy is acceptable when certain conditions are met, and because uncertainty has deterrent power even in populations indifferent to the threat of punishment, panoptic cyberdomain surveillance merits further consideration.
Gentry Lane is the CEO and founder of ANOVA Intelligence, a cyber national security software company. Ms. Lane is also a Fellow at the Potomac Institute for Policy Studies, and a Visiting Fellow at the National Security Institute at George Mason University’s Antonin Scalia Law School. She is a recognized subject matter expert on cyber conflict strategy, and she advises members of Congress, NATO, and U.S. defense and intelligence agencies.