21 December 2016

How to rethink what’s ‘top secret’ for the Internet age


By Dianne Feinstein

Dianne Feinstein, a Democrat from California, is the vice chairman of the Senate Select Committee on Intelligence. 

We cannot expect to keep our nation’s secrets secure — or provide meaningful oversight for our intelligence agencies — if proper classification of our country’s secrets is as likely as a coin flip. 

Yet, according to the most recent review of classified documents by the Information Security Oversight Office (the office responsible for oversight of the government’s classification system), improper classification markings were found on half of all classified documents. The rate was as high as 70 percent in certain agencies. 

That even seasoned national security professionals frequently fail to properly classify documents suggests that the system is broken. That’s why the incoming administration must update its methods to protect classified national security information to reflect the realities of the digital era. 

Rooted in a paper-based era, the existing classification system has become so complex and distorted that it no longer serves its fundamental goals: sharing secrets with our allies and partners while safeguarding this information from adversaries who would do us harm. 

What makes proper classification so difficult? To start with, there are simply too many guides. 

The Information Security Oversight Office found more than 3,000 classification guides in existence across the U.S. government, including more than 2,000 in the Defense Department. 

Government personnel use this patchwork of guides — which include several hundred “classification markings,” or words and phrases used to denote classified material — to determine how to protect documents or emails. 

Hundreds of compartments and sub-compartments — essentially different categories or types of information — further confuse who can see what. For example, two individuals with top-secret security clearances are able to see vastly different information depending on the compartments they can access. 

The consequences of underclassifying a document are obvious: Individuals without proper clearance gain access to information they shouldn’t be allowed to see. But most of the mistakes result from overclassification — classifying documents at levels higher than necessary or for periods longer than required. 

Overclassification frequently occurs because intelligence professionals have a strong incentive to do it: overclassifying a document carries no risk or penalty, while classifying information at a lower level can cost an employee access to classified material if a mistake is made. 

But the negative effects of overclassification are real: preventing the sharing of vital information, blocking transparency needed to ensure accountability and wasting taxpayer dollars. 

Information sharing is elementary to good intelligence work; connecting the dots depends on it. The 9/11 Commission warned that overclassification could make those connections harder to draw because it keeps information from a wider audience that may not hold the highest clearance levels. 

Unnecessarily withholding information from the American people also allows the government to insulate itself from public scrutiny and even congressional and judicial oversight. Congressional overseers and the public must be given appropriate insight into government activities, as well as their legal rationale, to ensure America’s values are being upheld and our interests advanced. Proper classification policy makes such oversight possible. 

Overclassification is also extremely expensive. Security classification activities cost the executive branch almost $15 billion annually, five times what they cost in 1996. 

Director of National Intelligence James R. Clapper Jr. deserves credit for beginning to reform the system: seeking to reduce the number of people who can classify information, increase individuals' ability to make discretionary declassification decisions, create a single classification guide and eliminate classification markings that serve no meaningful purpose. 

But we should go even further. Compartments need to be further consolidated, and the system of markings and handling caveats should be significantly simplified. 

We also must do what we can to change incentives to further encourage government personnel to classify at the lowest appropriate levels and for the shortest durations. 


For example, some agencies require every document to be marked “top secret” as the default; personnel must then take separate steps to lower that classification level. Instead, the default should be changed to “unclassified,” with personnel required to act to raise the level when necessary. 

Likewise, the default duration before a document can be declassified is frequently 25 years. But only the most sensitive documents require such protection. Decreasing that to five or 10 years may be appropriate for many agencies. 

Additionally, technology can help automate classification and declassification decisions for digital records, using algorithms that are not only more reliable than human reviewers but that also free intelligence professionals to focus on providing the best information and analysis to policymakers. 

Finally, we need to empower bodies such as the National Declassification Center and the DNI’s Intelligence Transparency Council to expedite review of materials important to public debate and the historical record so they can be made available to the public sooner without jeopardizing national security interests. 

We may never fully eliminate the tendency of reviewers to overclassify information. But by working to implement these solutions, we can begin to change the dynamic. At the heart of this issue are dedicated government employees who truly want to do the right thing, and there’s an obvious natural instinct to protect secrets. But classification should shield secrets, not bury them.
