The declassification engine : what history reveals about America's top secrets / Matthew Connelly.

Summary:

"A captivating study of US state secrecy that examines how officials use it to hoard power and prevent meaningful public oversight The United States was founded on the promise of a transparent government, but time and again we have abandoned the ideals of our open republic. In recent history, we have permitted ourselves to engage in costly wars, opened ourselves to preventable attacks, and ceded unaccountable power to officials both elected and unelected. Secrecy may now be an integral policy to preserving the American way of life, but its true costs have gone unacknowledged for too long. Using the latest techniques in data science, historian Matthew Connelly analyzes the millions of state documents both accessible to the public and still under review to unearth not only what the government does not want us to know, but what it says about the very authority we bequeath to our leaders. By culling this research and carefully studying a series of pivotal moments in recent history from Pearl Harbor to drone strikes, Connelly sheds light on the drivers of state secrecy--especially consolidating power or hiding incompetence--and how the classification of documents has become untenable. What results is an astonishing study of power: of the greed that develops out of its possession, of the negligence that it protects, and of what we lose as citizens when it remains unchecked. A crucial examination of the self-defeating nature of secrecy and the dire state of our nation's archives, The Declassification Engine is a powerful reminder of the importance of preserving the past so that we may secure our future" -- Provided by publisher.

Record details

  • ISBN: 9781101871577
  • ISBN: 1101871571
  • Physical Description: xvii, 540 pages : illustrations, maps, charts, plans ; 25 cm
  • Edition: First edition.
  • Publisher: New York : Pantheon Books, a division of Penguin Random House LLC, [2023]

Content descriptions

Bibliography, etc. Note:
Includes bibliographical references and index.
Formatted Contents Note:
The radical transparency of the American republic -- Pearl Harbor -- The bomb -- Code making and code breaking -- The military-industrial complex -- Surveillance -- Weird science -- Following the money -- There is no there there -- Deleting the archive.
  • Subject: Transparency in government > United States.
  • Subject: Government information > Access control > United States.
  • Subject: Public administration > United States.
  • Subject: United States > Politics and government.

Available copies

  • 12 of 13 copies available at Missouri Evergreen.
  • 1 of 1 copy available at Crawford County.

Holds

  • 0 current holds with 13 total copies.
Location: Crawford County Library-Recklein Memorial-Cuba
Call Number / Copy Notes: 352.3 CON (Text)
Barcode: 33431000663417
Shelving Location: Adult Non-Fiction
Status: Available
Due Date: -

Excerpt

PREFACE: Should This Book Be Legal?

There I was, sitting at a massive conference table inside a multibillion-dollar foundation, staring at the wood-paneled walls. I was facing a battery of high-powered attorneys, including the former general counsel to the National Security Agency, and another who had been chief of the Major Crimes Unit at the U.S. Attorney's Office in the Southern District of New York. The foundation was paying each of them about a thousand dollars an hour to determine whether I could be prosecuted under the Espionage Act.

I am a history professor, and my only offense had been to apply for a research grant. I proposed to team up with data scientists at Columbia University to investigate the exponential growth in government secrecy. Earlier that year, in 2013, officials reported that they had classified information more than ninety-five million times over the preceding twelve months, or three times every second. Every time one of these officials decided that some transcript, or e-mail, or PowerPoint presentation was "confidential," "secret," or "top secret," it became subject to elaborate protocols to ensure safe handling. No one without a security clearance would see these records until, decades from now, other government officials decided disclosure no longer endangered national security. The cost of keeping all these secrets was growing year by year, covering everything from retinal scanners to barbed-wire fencing to personnel training programs, and already totaled well over eleven billion dollars. But so, too, were the number and size of data breaches and leaks.

At the same time, archivists were overwhelmed by the challenge of managing just the first generation of classified electronic records, dating to the 1970s. Charged with identifying and preserving the subset of public records with enduring historical significance, but with no increase in staff or any new technology, they were recommending the deletion of hundreds of thousands of State Department cables, memoranda, and reports, sight unseen. The costs in terms of democratic accountability were incalculable, and included the loss of public confidence in political institutions, the proliferation of conspiracy theories, and the increasing difficulty historians would have in reconstructing what our leaders do under the cloak of secrecy.

We wanted to assemble a database of declassified documents and use algorithms to reveal patterns and anomalies in the way bureaucrats decide what information must be kept secret and what information can be released. To what extent were these decisions balanced and rule-based, as official spokesmen have long claimed? Were they consistent with federal laws and executive orders requiring the preservation of public records, and prompt disclosure when possible? Were the exceptions so numerous as to prove the existence of unwritten rules that really served the interests of a "deep state"? Or was the whole system so dysfunctional as to be random and inexplicable, as other critics insist? We were trying to determine whether we could reverse engineer these processes and develop technology that could help identify truly sensitive information. If we assembled millions of documents in databases and harnessed the power of high-performance computing clusters, it might be possible to train algorithms to look for sensitive records requiring the closest scrutiny and accelerate the release of everything else.
The promise was to make the crucial but dysfunctional declassification process more equitable and far more efficient. We had begun to call it a "declassification engine," and if someone did not start building and testing prototypes, the exponential increase in government secrets--more and more of them consisting of data rather than paper documents--might make it impossible for public officials to meet their own legal responsibilities to maximize transparency. Even if we failed to get the government to adopt this kind of technology, testing these tools and techniques would reveal gaps and distortions in the public record, whether from official secrecy or archival destruction.

The lawyers in front of me started to discuss the worst-case scenarios, and the officers of the foundation grew visibly uncomfortable. What if my team was able to reveal the identity of covert operatives? What if we uncovered information that would help someone build a nuclear weapon? The lawyers warned that, if the foundation gave us the money, its staff might be prosecuted for aiding and abetting a criminal conspiracy. Why, the most senior program officer asked, should they help us build "a tool that is purpose-built to break the law"?

The only one who did not seem nervous was the former ACLU lawyer whom Columbia had hired to represent us. He had argued cases before the Supreme Court. He had defended people who published schematics of nuclear weapons--and won. He had shown how any successful prosecution required proving that someone had possession of actual classified information. How could the government go after scholars doing research on declassified documents?

The ex-government lawyers pointed out that we were not just academics making educated guesses about state secrets--not when we were using high-performance computers and sophisticated algorithms. True, no journalist, no historian, can absorb hundreds of thousands of documents, analyze all of the words in them, instantly recall every one, and rank each according to one or multiple criteria. But scientists and engineers can turn millions of documents into billions of data points and use machine learning--or teaching a computer to teach itself--to detect patterns and make predictions. We agree with these predictions every time we watch a movie Netflix recommends, or buy a book that Amazon suggests. If we threw enough data at the problem of parsing redacted documents--the ones in which government officials have covered up the parts they do not want us to see--couldn't these techniques "recommend" the words most likely to be hiding behind the black boxes, which presumably were hidden for good reason?

Excerpted from The Declassification Engine: What History Reveals about America's Top Secrets by Matthew Connelly. All rights reserved by the original copyright owners. Excerpts are provided for display purposes only and may not be reproduced, reprinted or distributed without the written permission of the publisher.
