White House

Beth Noveck: think about how to open up the API of government

Former White House deputy CTO for open government Beth Noveck recently gave a TED Talk on a “more open source government.” For more perspective in the vein of “open,” read CNN’s TED summary, “What if you could make anything you wanted?” Noveck’s talk is embedded below:

“…start by teaching young people that we live, not in a passive society, a read-only society, but in a writable society, where we have the power to change our communities, to change our institutions, that’s when we begin to really put ourselves on the pathway towards this open government innovation”

MIT Civic Media conference examines the successes and failures of open government in the U.S.

The 2012 Civic Media Conference featured two full days of conversations about (what else?) the future of civic media and democracy. One conversation is particularly worth calling out and sharing with the Govfresh audience: a panel assessing what’s gone wrong and what’s gone right with open government in the United States over the past three years. The discussion was moderated by Susan Crawford, currently of the Harvard Law School and Kennedy School (and formerly a special advisor at the White House) and featured Mike Norman of Wefunder.com, Mark Headd of Code for America and Chris Vein, Deputy United States Chief Technology Officer for Government Innovation in the White House Office of Science and Technology Policy. I’ve embedded the video below:

Watch live streaming video from knightfoundation at livestream.com

You can read an excellent, comprehensive liveblog of the open gov panel at the Civic Media blog.

Citizen Audit: Which federal agencies have published open government plans 2.0 online?

If you look at the White House open government dashboard, you might be forgiven for thinking that all agencies had complied with the mandate to post plans, publish “high value” datasets to Data.gov and implement them. It’s all “green” and “yellow.”

The reality is more complex. Measuring the outcomes of open government initiatives effectively is a task that will require the attention of watchdogs, inspectors general, economists and academics. (See the work of the Sunlight Foundation on building ClearSpending.org to evaluate the quality of federal spending data on USASpending.gov, for example.)

What I could do was simply assess whether agencies have updated their 2010 plans and published them online. When I visited the websites of 29 federal agencies, I found that fewer than half of them (13) had posted a “2.0” version of their plans. Nine had not updated them since 2010. Based upon that audit, there should be a lot more red on the White House’s /open dashboard.

UPDATE: As of May 7th, several agencies had updated and published their plans online. (I’ve updated the spreadsheet below.) The remaining laggards: Defense, Labor, OMB, USTR, Veterans Affairs, National Archives, NSF, NRC, ONDCP and White House OSTP.

UPDATE: As of September 7th, Labor, OMB and ONDCP still had not updated their plans. Amy Bennett, director of Open The Government, brought the lag to the public’s attention. More context follows after the dashboard.

This issue came up this past week at the “Open Government Reality Check” at InformationWeek’s Government IT Forum. Given that prompt, perhaps the administration will put more pressure on agencies to update, publish and share their plans, including sharing them with the public on social media. (NASA, Energy and USDA have led the way on the latter count.) Getting OMB to press agencies to actually implement the plans is another matter, though some agencies continue to move forward. The Department of Transportation, for instance, recently launched Safety.Data.gov.

White House officials might also look at updating their own plans: the open government plan from the White House Office of Science and Technology Policy doesn’t appear to have been updated since April 7, 2010.

If and when the White House does update this dashboard, additional metrics might be added for each agency: “Number of Freedom of Information Act (FOIA) requests made,” “FOIA requests fulfilled” and “average time to response.”

A story this week on airline passenger complaints by Michael Grabell at ProPublica, for instance, showed that a FOIA request to TSA took four years to fulfill.

Since this data is, in theory, being posted at FOIA.gov, integrating and displaying these metrics would offer both a broader and more granular insight into the administration’s open government performance. Changes in culture will naturally be harder to measure.
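Computing such metrics would not be hard once the underlying records are available. Here is a minimal Python sketch, assuming per-request records of the kind FOIA.gov could export; the field names and sample data below are hypothetical, not FOIA.gov’s actual schema:

```python
from datetime import date

# Hypothetical per-request records; FOIA.gov publishes agency report
# data, but these field names are illustrative only.
requests = [
    {"agency": "TSA", "received": date(2007, 3, 1), "closed": date(2011, 3, 4), "fulfilled": True},
    {"agency": "TSA", "received": date(2011, 6, 15), "closed": date(2011, 9, 1), "fulfilled": True},
    {"agency": "TSA", "received": date(2011, 8, 2), "closed": None, "fulfilled": False},
]

def dashboard_metrics(records):
    """Compute the three dashboard metrics suggested above."""
    made = len(records)
    fulfilled = sum(1 for r in records if r["fulfilled"])
    turnaround = [(r["closed"] - r["received"]).days for r in records if r["closed"]]
    avg_days = sum(turnaround) / len(turnaround) if turnaround else None
    return {"requests_made": made, "requests_fulfilled": fulfilled,
            "avg_days_to_response": avg_days}

print(dashboard_metrics(requests))
```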


White House announces $200 million in funding for big data research and development, hosts forum at AAAS

In 2012, making sense of big data, particularly unstructured data, through narrative and context is a strategic imperative for leaders around the world, whether they serve in Washington, run media companies or trading floors in New York City, or guide tech titans in Silicon Valley.

While big data carries the baggage of huge hype, the institutions of federal government are getting serious about its genuine promise. On Thursday morning, the Obama Administration announced a “Big Data Research and Development Initiative,” with more than $200 million in new commitments. (See the fact sheet provided by the White House Office of Science and Technology Policy at the bottom of this post.)

“In the same way that past Federal investments in information-technology R&D led to dramatic advances in supercomputing and the creation of the Internet, the initiative we are launching today promises to transform our ability to use Big Data for scientific discovery, environmental and biomedical research, education, and national security,” said Dr. John P. Holdren, Assistant to the President and Director of the White House Office of Science and Technology Policy, in a prepared statement.

The research and development effort will focus on advancing “state-of-the-art core technologies” needed for big data, harnessing said technologies “to accelerate the pace of discovery in science and engineering, strengthen our national security, and transform teaching and learning,” and “expand the workforce needed to develop and use Big Data technologies.”

In other words, the nation’s major research institutions will focus on improving available technology to collect and use big data, apply them to science and national security, and look for ways to train more data scientists.

“IBM views Big Data as organizations’ most valuable natural resource, and the ability to use technology to understand it holds enormous promise for society at large,” said David McQueeney, vice president of software, IBM Research, in a statement. “The Administration’s work to advance research and funding of big data projects, in partnership with the private sector, will help federal agencies accelerate innovations in science, engineering, education, business and government.”

While $200 million is a relatively small amount of funding, particularly in the context of the federal budget or as compared to investments that are (probably) being made by Google or other major tech players, specific support for training and subsequent application of big data within federal government is important and sorely needed. The job market for data scientists in the private sector is so hot that government may well need to build up its own internal expertise, much in the same way LivingSocial is training coders at the Hungry Academy.

“Big data is a big deal,” blogged Tom Kalil, deputy director for policy at White House OSTP, at the White House blog this morning.

We also want to challenge industry, research universities, and non-profits to join with the Administration to make the most of the opportunities created by Big Data. Clearly, the government can’t do this on its own. We need what the President calls an “all hands on deck” effort.

Some companies are already sponsoring Big Data-related competitions, and providing funding for university research. Universities are beginning to create new courses—and entire courses of study—to prepare the next generation of “data scientists.” Organizations like Data Without Borders are helping non-profits by providing pro bono data collection, analysis, and visualization. OSTP would be very interested in supporting the creation of a forum to highlight new public-private partnerships related to Big Data.

The White House is hosting a forum today in Washington to explore the challenges and opportunities of big data and discuss the investment. The event will be streamed online in a live webcast from the headquarters of the AAAS in Washington, DC. I’ll be in attendance and sharing what I learn.

“Researchers in a growing number of fields are generating extremely large and complicated data sets, commonly referred to as ‘big data,’” reads the invitation to the event from the White House Office of Science and Technology Policy. “A wealth of information may be found within these sets, with enormous potential to shed light on some of the toughest and most pressing challenges facing the nation. To capitalize on this unprecedented opportunity — to extract insights, discover new patterns and make new connections across disciplines — we need better tools to access, store, search, visualize, and analyze these data.”

Speakers:

  • John Holdren, Assistant to the President and Director, White House Office of Science and Technology Policy
  • Subra Suresh, Director, National Science Foundation
  • Francis Collins, Director, National Institutes of Health
  • William Brinkman, Director, Department of Energy Office of Science

Panel discussion:

  • Moderator: Steve Lohr, New York Times, author of “Big Data’s Impact in the World”
  • Alex Szalay, Johns Hopkins University
  • Lucila Ohno-Machado, UC San Diego
  • Daphne Koller, Stanford
  • James Manyika, McKinsey

What is big data?

Anyone planning to use data for public good, or for profit, through applied data science must first understand what big data is.

On that count, turn to my colleague Edd Dumbill, who posted a useful definition last year on the O’Reilly Radar in his introduction to the big data landscape:

Big data is data that exceeds the processing capacity of conventional database systems. The data is too big, moves too fast, or doesn’t fit the strictures of your database architectures. To gain value from this data, you must choose an alternative way to process it.

The hot IT buzzword of 2012, big data has become viable as cost-effective approaches have emerged to tame the volume, velocity and variability of massive data. Within this data lie valuable patterns and information, previously hidden because of the amount of work required to extract them. To leading corporations, such as Walmart or Google, this power has been in reach for some time, but at fantastic cost. Today’s commodity hardware, cloud architectures and open source software bring big data processing into the reach of the less well-resourced. Big data processing is eminently feasible for even the small garage startups, who can cheaply rent server time in the cloud.

Teams of data scientists are increasingly leveraging a powerful, growing set of common tools, whether they’re employed by government technologists opening cities, developers driving a revolution in healthcare or hacks and hackers defining the practice of data journalism.

To learn more about the growing ecosystem of big data tools, watch my interview with Cloudera architect Doug Cutting, embedded below. Cutting created Lucene and led the Hadoop project at Yahoo before he joined Cloudera. Apache Hadoop is an open source framework that allows distributed applications based upon the MapReduce paradigm to run on immense clusters of commodity hardware, which in turn enables the processing of massive amounts of data.
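For a concrete sense of the MapReduce paradigm itself, here is a toy, single-process sketch in Python: map emits key/value pairs, the framework groups them by key, and reduce aggregates each group. Hadoop distributes exactly this pattern across a cluster; the corpus and function names here are mine, for illustration only:

```python
from collections import defaultdict
from itertools import chain

documents = ["open government data", "big data big deal", "open data"]

def mapper(doc):
    # Map step: emit one (word, 1) pair for every word in the document.
    return [(word, 1) for word in doc.split()]

def shuffle(pairs):
    # Shuffle step: group all emitted values by their key.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reducer(key, values):
    # Reduce step: aggregate each group, here by summing the counts.
    return key, sum(values)

grouped = shuffle(chain.from_iterable(mapper(d) for d in documents))
print(dict(reducer(k, v) for k, v in grouped.items()))
# {'open': 2, 'government': 1, 'data': 3, 'big': 2, 'deal': 1}
```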

Details on the administration’s big data investments

A fact sheet released by the White House OSTP follows, verbatim:

National Science Foundation and the National Institutes of Health – Core Techniques and Technologies for Advancing Big Data Science & Engineering

“Big Data” is a new joint solicitation supported by the National Science Foundation (NSF) and the National Institutes of Health (NIH) that will advance the core scientific and technological means of managing, analyzing, visualizing, and extracting useful information from large and diverse data sets. This will accelerate scientific discovery and lead to new fields of inquiry that would otherwise not be possible. NIH is particularly interested in imaging, molecular, cellular, electrophysiological, chemical, behavioral, epidemiological, clinical, and other data sets related to health and disease.

National Science Foundation: In addition to funding the Big Data solicitation, and keeping with its focus on basic research, NSF is implementing a comprehensive, long-term strategy that includes new methods to derive knowledge from data; infrastructure to manage, curate, and serve data to communities; and new approaches to education and workforce development. Specifically, NSF is:

· Encouraging research universities to develop interdisciplinary graduate programs to prepare the next generation of data scientists and engineers;
· Funding a $10 million Expeditions in Computing project based at the University of California, Berkeley, that will integrate three powerful approaches for turning data into information – machine learning, cloud computing, and crowd sourcing;
· Providing the first round of grants to support “EarthCube” – a system that will allow geoscientists to access, analyze and share information about our planet;
· Issuing a $2 million award for a research training group to support training for undergraduates to use graphical and visualization techniques for complex data;
· Providing $1.4 million in support for a focused research group of statisticians and biologists to determine protein structures and biological pathways;
· Convening researchers across disciplines to determine how Big Data can transform teaching and learning.

Department of Defense – Data to Decisions: The Department of Defense (DoD) is “placing a big bet on big data,” investing approximately $250 million annually (with $60 million available for new research projects) across the Military Departments in a series of programs that will:

· Harness and utilize massive data in new ways and bring together sensing, perception and decision support to make truly autonomous systems that can maneuver and make decisions on their own.
· Improve situational awareness to help warfighters and analysts and provide increased support to operations. The Department is seeking a 100-fold increase in the ability of analysts to extract information from texts in any language, and a similar increase in the number of objects, activities, and events that an analyst can observe.

To accelerate innovation in Big Data that meets these and other requirements, DoD will announce a series of open prize competitions over the next several months.

In addition, the Defense Advanced Research Projects Agency (DARPA) is beginning the XDATA program, which intends to invest approximately $25 million annually for four years to develop computational techniques and software tools for analyzing large volumes of data, both semi-structured (e.g., tabular, relational, categorical, meta-data) and unstructured (e.g., text documents, message traffic). Central challenges to be addressed include:

· Developing scalable algorithms for processing imperfect data in distributed data stores; and
· Creating effective human-computer interaction tools for facilitating rapidly customizable visual reasoning for diverse missions.

The XDATA program will support open source software toolkits to enable flexible software development for users to process large volumes of data in timelines commensurate with mission workflows of targeted defense applications.

National Institutes of Health – 1000 Genomes Project Data Available on Cloud: The National Institutes of Health is announcing that the world’s largest set of data on human genetic variation – produced by the international 1000 Genomes Project – is now freely available on the Amazon Web Services (AWS) cloud. At 200 terabytes – the equivalent of 16 million file cabinets filled with text, or more than 30,000 standard DVDs – the current 1000 Genomes Project data set is a prime example of big data, where data sets become so massive that few researchers have the computing power to make best use of them. AWS is storing the 1000 Genomes Project as a publicly available data set for free, and researchers will pay only for the computing services that they use.

Department of Energy – Scientific Discovery Through Advanced Computing: The Department of Energy will provide $25 million in funding to establish the Scalable Data Management, Analysis and Visualization (SDAV) Institute. Led by the Energy Department’s Lawrence Berkeley National Laboratory, the SDAV Institute will bring together the expertise of six national laboratories and seven universities to develop new tools to help scientists manage and visualize data on the Department’s supercomputers, which will further streamline the processes that lead to discoveries made by scientists using the Department’s research facilities. The need for these new tools has grown as the simulations running on the Department’s supercomputers have increased in size and complexity.

US Geological Survey – Big Data for Earth System Science: USGS is announcing the latest awardees for grants it issues through its John Wesley Powell Center for Analysis and Synthesis. The Center catalyzes innovative thinking in Earth system science by providing scientists a place and time for in-depth analysis, state-of-the-art computing capabilities, and collaborative tools invaluable for making sense of huge data sets. These Big Data projects will improve our understanding of issues such as species response to climate change, earthquake recurrence rates, and the next generation of ecological indicators.

Further details about each department’s or agency’s commitments can be found at the following websites by 2 pm today:

NSF: http://www.nsf.gov/news/news_summ.jsp?cntn_id=123607
HHS/NIH: http://www.nih.gov/news/health/mar2012/nhgri-29.htm
DOE: http://science.energy.gov/news/
DOD: www.DefenseInnovationMarketplace.mil
DARPA: http://www.darpa.mil/NewsEvents/Releases/2012/03/29.aspx
USGS: http://powellcenter.usgs.gov

IBM infographic on big data

Big Data: The New Natural Resource

This post and headline have been updated as more information on the big data R&D initiative became available.

Apps for Energy looks to jumpstart open innovation around the Green Button

Data standards are the railway gauges of the 21st century. With more adoption of the ‘Green Button,’ are we about to see an explosion of innovation around energy data?

Today, the Obama Administration announced that nine major utilities and electricity suppliers have committed to using and extending the Green Button to enable some 15 million additional households to access data about their energy usage, creating a potential market of 27 million households for energy apps.

As with the Blue Button for healthcare data, the White House asserts that providing energy consumers with secure access to information about energy usage will increase innovation in the sector and empower citizens with more information.

“This is the kind of innovation that gets me excited,” wrote venture capitalist Fred Wilson earlier this year. “The Green Button is like OAuth for energy data. It is a simple standard that the utilities can implement on one side and web/mobile developers can implement on the other side. And the result is a ton of information sharing about energy consumption and in all likelihood energy savings that result from more informed consumers.”

The thinking here, as with Blue Button, which enables veterans (and soon all federal workers) to download their personal health data, is that broad adoption by utilities and engagement with industry will lead to new opportunities for software developers and civic entrepreneurs to serve a new market of millions of consumers who want better tools to analyze and manage their energy data.

To stimulate app creation, the U.S. Department of Energy announced an Apps for Energy challenge today. This effort is meant to “change the way you think about your utility bill data,” wrote data integration specialist Matthew Loveless at the DoE blog:

With the Energy Department’s new Apps for Energy competition, we’re challenging developers to use the Green Button data access program to bring residential and commercial utility data to life.

The Energy Department – in partnership with Pacific Gas and Electric Company, Itron, and Gridwise Alliance – is offering $100,000 in cash prizes to the software developers and designers that submit the best apps, as judged by a prestigious panel of judges selected from government, the energy industry, and the tech community.

Apps for ENERGY leverages Green Button, an initiative that gives access to energy usage data in a streamlined and easy-to-understand format (learn more about the Green Button open standard here). In addition to leveraging Green Button, app developers are encouraged to combine data from a variety of sources to present a complete picture of the customer’s energy usage.

The competition is all about creating tools and products that help consumers get the most out of their Green Button data – from apps that track personal energy savings goals to software that helps businesses optimize building energy usage. In addition, the 27 million households that will have access to Green Button data by the end of the year represent an untapped market that can serve as a catalyst for an active, energy focused developer community.
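Under the hood, Green Button data is delivered as an Atom feed wrapping ESPI XML. As a hedged sketch of what a contest entrant might start with, here is a minimal Python reader for interval usage data. The namespace and element names follow my reading of the NAESB ESPI schema, and the file name is hypothetical; treat both as assumptions rather than a definitive parser:

```python
import xml.etree.ElementTree as ET

# ESPI namespace per the NAESB schema, as I understand it (assumption).
ESPI = "{http://naesb.org/espi}"

def interval_readings(path):
    """Yield (start_epoch_seconds, value) pairs from a Green Button file."""
    tree = ET.parse(path)
    for reading in tree.iter(ESPI + "IntervalReading"):
        start = reading.find(f"{ESPI}timePeriod/{ESPI}start")
        value = reading.find(ESPI + "value")
        if start is not None and value is not None:
            # start is epoch seconds; value is energy in the feed's units
            yield int(start.text), int(value.text)

# "greenbutton.xml" is a placeholder for a file downloaded from a utility.
for start, usage in interval_readings("greenbutton.xml"):
    print(start, usage)
```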

Apps for Energy will join over one hundred other challenges on Challenge.gov next month.

HUD, Veterans Affairs and Jon Bon Jovi’s foundation launch app challenge for homeless veterans

To paraphrase President Kennedy: Ask not what your country can code for you — ask what you can code to help your country. If you’re a developer, consider empowering your fellow citizens to help the homeless veterans in your community. The Departments of Housing and Urban Development, Health and Human Services, and the Jon Bon Jovi Soul Foundation have collaborated to back a new challenge for developers to create a better way to help homeless veterans using the Internet and mobile devices.

“Last year’s 12 percent drop in Veterans homelessness shows the results of President Obama’s and the whole administration’s commitment to ending Veterans homelessness,” said Secretary of Housing and Urban Development Shaun Donovan, in a prepared statement. “I want to thank Jon Bon Jovi for being a part of that effort and for using competition and innovation to advance the cause of ending homelessness.”

The idea here is relatively straightforward: use the open innovation approach that the White House has successfully applied elsewhere in the federal government to tap into the distributed creativity of the technology community all over the country.

“This contest taps the talent and deep compassion of the Nation’s developer community,” said Secretary of Veterans Affairs Eric K. Shinseki, in a prepared statement. “We are asking them to make a free, easy-to-use Web and smartphone app that provides current information about housing, health clinics and food banks.”

While “Project REACH” stands for “Real-time Electronic Access for Caregivers and the Homeless,” it actually aspires to do something more meaningful: give mobile citizens and caregivers the information they need to help a homeless veteran where and when it’s needed.

This app “will better connect our nation’s homeless to resources that are already available to them in a manner that reaches them where they are,” said Aneesh Chopra, the first US CTO, in a conference call today with reporters. Chopra, who left the administration earlier this year, later clarified that he was serving as a volunteer and judge for the challenge.

To say that improving the current state of affairs with homeless veterans is needed would be a gross understatement. “Homelessness for anyone is a national tragedy,” said Shaun Donovan, secretary of HUD, in today’s call. “It’s never worse than for our nation’s veterans.”

The Obama administration “believes that no one who has fought for our country should ever be invisible to the American people,” said Donovan, who noted that while HUD has housed 28,000 veterans and has gotten “nearly 1 in 5 homeless veterans off our nation’s streets,” more effort is needed.

He’s right. Here’s your jarring statistic of the day: one out of every six men and women in the United States’ homeless shelters is a veteran. According to the VA, veterans are 50 percent more likely to fall into homelessness than other Americans.

The Project REACH challenge asks developers to create a mobile or Web application that will connect service providers to real-time information about resources for the homeless and others in need. “What if we had the ability, in real-time, drawing on local data, to help the homeless vet?” asked Donovan today. He wants to see information that can help them find a place to sleep, services or work placed in the palm of anyone’s hand, giving ordinary citizens the ability to help homeless veterans.

Instead of offering spare change, in other words, a citizen could try to help connect a homeless veteran with services and providers.
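What might the core of such an app look like? Here is a deliberately small Python sketch of the central lookup: given a location, return nearby providers with beds open right now. Every provider name, coordinate and count below is hypothetical; a real app would draw on live data from the more than 8,000 providers in HUD’s network:

```python
from dataclasses import dataclass
from math import hypot

@dataclass
class Provider:
    name: str
    lat: float
    lon: float
    beds_open: int  # beds available right now, updated in real time

# Illustrative data only; not a real feed.
providers = [
    Provider("Main St. Shelter", 40.35, -74.07, 2),
    Provider("Soul Kitchen Partner House", 40.30, -74.05, 0),
    Provider("Veterans Outreach Center", 40.42, -74.11, 5),
]

def open_beds_near(lat, lon, max_results=5):
    """Nearest providers with at least one open bed. A flat-earth
    distance is adequate at city scale for a sketch like this."""
    candidates = [p for p in providers if p.beds_open > 0]
    candidates.sort(key=lambda p: hypot(p.lat - lat, p.lon - lon))
    return candidates[:max_results]

for p in open_beds_near(40.34, -74.06):
    print(p.name, p.beds_open)
```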

The first five entries to meet the requirements will receive a $10,000 cash prize and the opportunity to test their app at the JBJ Soul Kitchen. The winner will receive a $25,000 prize.

“At the Soul Kitchen we’ve seen the need for a simple, user-friendly, comprehensive application that connects those in need to resources in their community,” said Jon Bon Jovi, legendary rock musician, chairman of the JBJ Soul Foundation and White House Council Member, in a prepared statement. “As we sought out a solution to resolve the disconnect, we found the VA, HUD and HHS to be of like mind. Together we can provide the information about existing services – now we need the bright minds in the developer community to create a platform to tie it all together.”

Empowering people to help one another through mobile technology when they want to do so is more about the right-time Web than real-time. And yes, that should sound familiar.

Community groups and service providers sometimes lack the right tools, too, explained W. Scott Gould, deputy secretary of veterans affairs, on the call today. The contest launched today will use the Internet and smartphones to help them. The app should use tech to show which community provider has a bed or find an employer with openings, he said.

“It’s a high tech, high compassion, low cost solution,” said Gould, that “puts the power in the hands of anyone” to use data to help veterans get the help that they need. He wrote more about using technology to help homeless veterans at the White House blog:

Project REACH (Real-Time Electronic Access for Caregivers and the Homeless) challenges applicants to make a free, easy-to-use, and broadly accessible web- and Smartphone app to provide current and up-to-date information about housing and shelter, health clinics, food banks, and other services available to the homeless. It is designed to tap the enormous talent and deep compassion of the nation’s developer community to help us deliver vital information to the people who care for the homeless.

People caring for homeless veterans will be able to use this app to look up the location and availability of shelters, free clinics, and other social services – and instantaneously be able to share this critical information with those in need.

Bon Jovi, when asked about whether homeless veterans have smartphones on today’s call, told a story about a man at the Soul Kitchen who stayed late into the evening. The staff realized that he didn’t have a place to go and turned to the Internet to try to find a place for him. Although they found that it was easy to find local shelters, said Bon Jovi, the websites didn’t inform them of hours and bed availability.

“People like me, who want to help, sometimes just don’t know, real-time, if there are beds available,” he said. “Think about the guys like me that have a computer, in the Soul Kitchen, that want to help.”

As healthcare blogger Brian Ahier noted this afternoon in sharing his post on Project REACH, this is the sort of opportunity that developers who want to make a major contribution to their communities can be proud to work upon.

Improving the ability of citizens to help homeless veterans is a canonical example of working on stuff that matters.

“We will, through our broad and deep network at HUD, make sure that, whoever wins this competition, that app and tech is available to more than 8,000 providers,” said Donovan.

If that network and Bon Jovi’s star power can help draw more attention to the challenge and any eventual services, more of the nation’s civic surplus just might get tapped, as more coders find that there’s a new form of public service available to them in the 21st century.

A Conversation About Social Media, Open Government and eDemocracy [VIDEO]

If the town square now includes public discourse online, democratic governments in the 21st century are finding that part of civic life now includes listening there. Given what we’ve seen in this young century, how governments deal with social media is now part of how they deal with civil liberties, press freedom, privacy and freedom of expression in general.

At the end of Social Media Week 2012, I moderated a discussion with Matt Lira, Lorelei Kelly and Clay Johnson at the U.S. National Archives. This conversation explored more than how social media is changing politics in Washington: we looked at its potential to help elected officials and other public servants make better policy decisions in the 21st century.

I hope you find it of interest; all three of the panelists gave thoughtful answers to the questions that I and the audience posed.

Prosecuting whistleblowers is antithetical to open government

As David Carr reported at the New York Times, the White House is using the Espionage Act to prosecute leaks to the media. Dan Kennedy explored the issue of aggressive prosecution further this morning at the Huffington Post. As both Carr and Kennedy observed, this White House has used the Espionage Act six times during this presidency. Prior to 2009, it had been used three times in total since it was passed in 1917.

Putting the questions of whether Wikileaks is open government or deserves to be on a list of the top 10 Gov 2.0 initiatives aside, let’s be clear on a critical issue: prosecuting citizens who share information about billions of dollars of government fraud, corruption or criminality undermines open government initiatives.

Open government should not and cannot risk national security, despite what proponents of radical transparency might advocate. If the release of open data leads to such outcomes, the death of open government won’t be far behind. Those that choose to risk the lives of diplomats, human rights workers and service members abroad through willful leaks of locations or cables are legitimate targets of the Espionage Act.

If open government is truly about transparency and accountability, however, whistleblowers whose actions do not meet the standard of putting lives in danger should be protected. For instance, is Thomas Drake an enemy of the state because he went public about billions of dollars that were being wasted in “financial waste, bureaucratic dysfunction, and dubious legal practices in N.S.A. counterterrorism programs”?

Last year, I talked with Drake specifically about his case; our interview is embedded below. Judge for yourself whether his actions fit the standard laid out above — and keep in mind the following details from Carr as you watch:

When his agency was about to spend hundreds of millions of dollars on a software program bought from the private sector intended to monitor digital data, he spoke with a reporter at The Baltimore Sun. He suggested an internally developed program that cost significantly less would be more effective and not violate privacy in the way the product from the vendor would. (He turned out to be right, by the way.)

He was charged with 10 felony counts that accused him of lying to investigators and obstructing justice. Last summer, the case against him collapsed, and he pleaded guilty to a single misdemeanor, of misuse of a government computer.

While the Obama administration deserves credit for federal open government initiatives, on this count the actions of its Justice Department undermine both the efforts of public servants trying to act in good faith and those of investigative journalists trying to serve the public trust, leaving the administration open to charges of hypocrisy on its open government promises and its veneration of war correspondents who have been killed abroad.

As David Carr points out, that’s problematic on several levels:

These kinds of prosecutions can have ripples well beyond the immediate proceedings. Two reporters in Washington who work on national security issues said that the rulings had created a chilly environment between journalists and people who work at the various government agencies.

During a point in history when our government has been accused of sending prisoners to secret locations where they were said to have been tortured and the C.I.A. is conducting remote-controlled wars in far-flung places, it’s not a good time to treat the people who aid in the publication of critical information as spies.

Whistleblowers that focus upon waste and corruption, where the risk is primarily to those guilty of bureaucratic incompetence, cost overruns, environmental degradation, safety hazards or rigged procurements, should be people that the White House uses its considerable power to protect, not prosecute. That’s why whistleblower and retaliation protections exist under the law.

If, as Ralph Nader said, information is the currency of democracy, perhaps our elected leaders should take action to ensure that those who risk their careers by sharing direct threats to the public interest are not made beggars.