
The Intelligence Community ‘Can’ Tackle Open-Source Data in a Hyper-Connected World

Updated: Feb 12

Chip Usher

Senior Director for Intelligence at the Special Competitive Studies Project

Kristin Wood

Former Chief of Innovation and Technology at the CIA Open Source Center

EXPERT PERSPECTIVE/OPINION — The explosion of online data, rapid advances in artificial intelligence and the proliferation of entities using and sharing them are transforming the way the world works.

According to the World Economic Forum, we are in the Fourth Industrial Revolution. And the world’s datasphere is growing. Every minute of the day, 5.9 million searches run through Google, 16.2 million text messages are sent, 167 million TikTok videos are watched, and 6 million people shop online. According to the International Data Corporation (IDC), worldwide data will grow by 61% to 175 zettabytes in 2025.

For U.S. national security to thrive in this new era, the Intelligence Community (IC) will need better access to – and understanding of – data and insights from open-source systems outside of the U.S. government purview at the speed and scope of mission demands.

Digital interconnectivity already powers much of the international economy, with thousands of companies collecting, buying, and selling massive amounts of data to better understand risk, explore opportunities and make strategic decisions. Companies such as Starbucks, Disney, and Dow Chemical all have open-source units to help them understand the market, protect their people and facilities, inform their decisions, and protect their brands.


The United States’ prime nation-state adversaries – China, Russia, and others – already recognize the value to be found in open-source systems. According to a June 2023 report from Recorded Future, “A growing ecosystem of private companies, state-owned research organizations, and universities is supporting the People’s Liberation Army’s push to leverage open source intelligence by providing research, platforms, and data.”

Where the Intelligence Community Finds Itself

The national security community’s legacy of open-source collection dates back to the Second World War, when the Foreign Broadcast Monitoring Service was first stood up, and several small units across the national security community, such as CIA’s Open Source Enterprise and NGA’s Tearline Project, continue to carry that mission forward today. These open-source units can, and do, make remarkable contributions to the mission by coupling open insights with classified data, but they do not “live” in the open-source space, and they tend to focus most of their energies on classified intelligence priorities while splitting their time online between classified and unclassified networks.

More importantly, the IC’s open-source units generally do not have access to the full spectrum of private sector data, analytics, and business expertise across the myriad disciplines required for companies to grow and function. The lack of access is driven partly by a lack of resources (the IC has ten times as many imagery analysts as open-source intelligence (OSINT) analysts, for example), partly by onerous security and counterintelligence restrictions, and partly by an outdated set of procurement rules that makes it nearly impossible for the IC to keep pace with the speed of innovation.

While the IC struggles to keep pace, the ease of accessing the world’s data means that anyone with a computer, a smartphone, internet access, and persistence can expose nation-state secrets.

Following the example set by Bellingcat founder Eliot Higgins and Institute for the Study of War founder Kim Kagan, nearly anyone can become an open-source investigator, and thousands have joined the crowd-sourcing networks that publish their findings online. While this kind of work can sometimes be uneven, it can also be exceptional, and U.S. policymakers are right to wonder why the U.S. Intelligence Community cannot act with the same speed and access the same data.

Over the last decade, the IC and various entities outside it have engaged in studies about open-source data: how to better find and analyze it, and how to better institutionalize and mainstream its use by the Intelligence Community and the U.S. government writ large.


Last year, the Special Competitive Studies Project (SCSP) outlined a spectrum of four concrete options to better leverage open source – from creating a new entity within the executive branch, to establishing a new intelligence agency, to forming a new office within the Office of the Director of National Intelligence, to mainstreaming it among all-source analysts. As policymakers deliberate on the best way to address this issue on a more enduring basis, there is another potential solution that could more immediately help the IC meet this moment.

A Public/Private Partnership for Open Source

The United States needs a more comprehensive effort with the private sector at a level that reflects the importance, scope, speed, and scale of the world’s data, and much deeper levels of collaboration, including with those who have not been traditional partners. The precedent is there, most notably with In-Q-Tel (IQT), a 501(c)(3) non-profit organization built as a government-funded, public-private partnership to help discover, fund, and bring new technologies to the Intelligence Community. In its 20 years of operations, IQT has transformed the IC’s ability to access emerging technology while offering new companies access to otherwise hard-to-reach customers. It is funded partly with U.S. government dollars and partly by revenues and returns it generates itself.

The IC and IQT could discuss expanding the latter’s remit (and funding) to include a program for processing and assessing the broader world of open-source data. This new division of In-Q-Tel – let’s call it “Os-N-Tel” for OSINT – would focus on supporting the IC’s collective mission by identifying and providing access to the world’s data and open-source tools through a consortium of companies offering the best data sources, tools, and data technology. The new division would consist of a small team with deep expertise in data-related technologies, open-source information, and privacy and civil liberties laws and regulations. Os-N-Tel could take on the task of scanning for companies that offer useful datasets or unique open-source analytic capabilities matching IC needs, and adding them to the consortium. It would vet these companies, get them on contract, and push their capabilities out to IC customers. It would establish how IC requirements would be levied, how OSINT service contracts would work, and how industry would be paid. And it would offer a single, easy-to-find point of entry for private sector vendors seeking to sell their wares to the IC.

With focus, such a unit could be stood up quickly and start delivering products within a year, probably less. The IC could further task Os-N-Tel to drive progress by collaborating with the private sector to tackle the intelligence challenges of the future, including:

  • Testbedding the use of artificial intelligence tools at scale to develop open-source assessments that are rich in data and detail. Os-N-Tel could offer a venue for IC experts to work side-by-side with private sector technologists to try out new tradecraft and develop new tools for making sense of world dataflows.

  • Standardizing IC approaches to data rights, pricing, and data pedigree, making available a sandbox for users to try out vetted new tools or technologies, including large language models and generative AI.

  • Serving as an additional mechanism for sharing information with private-sector national security decision makers that includes a discussion of privacy, civil liberties, and how U.S. adversaries already use Americans’ data.

As it makes progress, such an entity would demonstrate the incredible utility of open-source and could possibly help overcome any remaining obstacles to a more enduring, institutional solution.

This Will Be Hard…But It Can Be Done

Creating this expansive capability to draw insights from the world’s data in an era of heightened global uncertainty will not be easy. However, data, tools, technology, and insights available now in the public domain – but not in government hands – could help close critical knowledge gaps about the country’s most significant challenges; understanding and operationalizing the information is crucial to the success of the national security community and the nation.

The Cipher Brief is committed to publishing a range of perspectives on national security issues submitted by deeply experienced national security professionals. 

Opinions expressed are those of the author and do not represent the views or opinions of The Cipher Brief.

Have a perspective to share based on your experience in the national security field?  Send it to for publication consideration.

Read more expert-driven national security insights, perspective and analysis in The Cipher Brief

William “Chip” Usher is the Senior Director for Intelligence at the Special Competitive Studies Project. Prior to SCSP, Chip served 32 years in the Central Intelligence Agency, where he held a variety of executive positions. Chip is a former member of the Senior Intelligence Service and has expertise on East Asia, the Near East, and Eurasia.

During her 20-year CIA career, Kristin Wood served in the Director’s area and three Agency directorates – analysis, operations, and digital innovation – leading a wide variety of the Agency’s missions in positions of increasing authority. Among her key Agency assignments were Deputy Chief of the Innovation & Technology Group at the Open Source Center (OSC). She led OSC’s open-source IT and innovation efforts to extract meaning from big data.

