Open government

Beware openwashing. Question secrecy. Acknowledge ideology.

You could spend a long day listing all of the organizations and individuals putting government data online, from Carl Malamud to open government activists in Brazil, Africa and Canada. As many conversations in the public domain over the past few years have demonstrated, there are many different perspectives on what purposes “open data” should serve, often informed by what advocates intend or by an organization or institution’s goals. For those interested, I highly recommend the open data seminar and its associated comments.

When and if such data includes ratings or malpractice information about hospitals or doctors, or fees for insurance companies, transparency and accountability are important byproducts, which in turn have political implications. (Watch the reaction of unions or doctors’ groups to performance or claims data going online for evidence of those conflicts.)

There are people who want to see legislatures open their data, to provide more insight into those processes, and others who want to see transit data or health data become more open, in the service of greater civic utility or patient empowerment.

Other people may support publishing more information about the business or performance of government because evidence of fraud, mismanagement or incompetence will support their arguments for shrinking the size of the state. A big tent for open government can mean that libertarians could end up supporting the same bills liberals do.

In the U.S., Govtrack.us has been making government legislative data open, despite the lack of bulk access to Thomas.gov, by “scraping.” There are many people who wish to see campaign finance data open, like the Sunlight Foundation, to show where influence and power lies in the political system. There are many members of civil society, media organizations and startups that are collecting, sharing or using open data, from OpenCorporates to OpenCongress, to Brightscope or ProPublica.
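“Scraping” of the kind Govtrack.us relies on is conceptually simple: fetch a page, parse its markup, and extract the structured records the site never published in bulk. A minimal sketch, using only Python’s standard library, is below. The HTML structure here is invented for illustration; Thomas.gov’s real pages were different, but the technique is the same.

```python
from html.parser import HTMLParser

# Toy scraper: collect the text of every <li class="bill"> element.
# The markup is hypothetical; a real scraper must be adapted to the
# actual page structure (and re-adapted whenever that structure changes,
# which is why bulk data access is preferable to scraping).
class BillParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.in_bill = False
        self.bills = []

    def handle_starttag(self, tag, attrs):
        if tag == "li" and ("class", "bill") in attrs:
            self.in_bill = True

    def handle_data(self, data):
        if self.in_bill and data.strip():
            self.bills.append(data.strip())

    def handle_endtag(self, tag):
        if tag == "li":
            self.in_bill = False

page = """<ul>
  <li class="bill">H.R. 1 - Example Act of 2013</li>
  <li class="bill">S. 42 - Sample Reform Act</li>
</ul>"""

parser = BillParser()
parser.feed(page)
print(parser.bills)
```

The fragility is the point: every markup change silently breaks a parser like this, which is one argument advocates make for governments releasing machine-readable data directly.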

Whether anyone chooses to describe those activities as a movement is up to them — but it is indisputable that three years ago, a neutral observer would have been hard-pressed to find an open government data platform. Now there are dozens at the national level. What matters more than their existence, however, is what goes onto them, and here people have to be extremely careful about giving governments credit for just putting a “portal” online.

While the raw number of open government data platforms around the globe looks set to continue to increase in 2013 at every level of government, advocates should be wary of governments claiming “open government” victories as a result.

Since Morozov sent out that tweet, he has published a book with a chapter that extends that critique, along with a series of New York Times op-eds, reviews, Slate debates, and a 16,000-word essay in The Baffler that explores the career and thinking of Tim O’Reilly (my publisher). Morozov’s essay prompted Annalee Newitz to paraphrase and link to it in a post at io9, where Tim responded in a comment.

While his style can distract and detract from his work — and his behavior on Twitter can be fairly characterized as contemptuous at times — the issues Morozov raises around technology and philosophy are important and deserve to be directly engaged by open government advocates, as John Wilbanks suggests.


That’s happening, slowly. Sunlight Foundation policy director John Wonderlich has also responded, quoting Morozov’s recommendations and reflecting on how they might map to specific uses of technology that support open government. Wilbanks himself has written one of the most effective (short) responses to date:

One of the reasons I do “open” work is that I think, in the sciences, it’s a philosophical approach that is more likely to lead to that epistemic transformation. If we have more data available about a scientific problem like climate change, or cancer, then the odds of the algorithms figuring something out that is “true” but incomprehensible to us humans go up. Sam Arbesman has written about this nicely both in his book the Half Life of Facts and in another recent Slate article.

I work for “open” not because “open” solves a specific scientific problem, but because it increases the overall probability of success in sensorism-driven science. Even if the odds of success themselves don’t change, increasing the sample size of attempts will increase the net number of successes. I have philosophical reasons for liking open as well, and those clearly cause me cognitive bias on the topic, but I deeply believe that the greatest value in open science is precisely the increased sample size of those looking.

I also tend to think there’s a truly, deeply political element to enabling access to knowledge and science. I don’t think it’s openwashing (and you should read this paper recommended by Morozov on the topic) to say that letting individuals read science can have a real political impact.

Morozov’s critique of “openwashing” isn’t specious, though it’s fair to question his depiction of the history of open source and free software and an absence of balance in his consideration of various open government efforts. Civil society and media must be extremely careful about giving governments credit for just putting a “portal” online.

On that count, Wonderlich wrote about the “missing open data policy” that every government that has stood up or will stand up an open data platform could benefit from reading:

Most newly implemented open data policies, much like the Open Government Directive, are announced alongside a package of newly released datasets, and often new data portals, like Data.gov. In a sense, these pieces have become the standard parts of the government data transparency structure. There’s a policy that says data should generally be open and usefully released, a central site for accessing it, some set of new data, and perhaps a few apps that demonstrate the data’s value.

Unfortunately, this is not the anatomy of an open government.  Instead, this is the anatomy of the popular open government data initiatives that are currently in favor. Governments have learned to say that data will be open, provide a place to find it, release some selected datasets, and point to its reuse.

This goes to the concerns of traditional advocates working for good government, as explored in an excellent research paper by Yu and Robinson on the ambiguity of open government and open data, along with the broader discussion you’ll find in civil society in the lead-up to the Open Government Partnership, where this dynamic was the subject of much concern — and not just in the Canadian or United Kingdom context. The work exploring this dynamic by Nathaniel Heller at Global Integrity is instructive.

As I’ve written before (unrepentant self-plagiarism alert), standing up open data platforms and publishing data sets regarding services is not a replacement for a Constitution that enforces a rule of law, free and fair elections, an effective judiciary, decent schools, basic regulatory bodies or civil society, particularly if the data does not relate to meaningful aspects of society.

Socrata, a venture capital-backed startup whose technology powers the open data platforms of several city, state and national governments, including those of Kenya and the United States, is also part of this ecosystem and indisputably has “skin in the game.”

That said, the insights that Kevin Merritt, the founder of Socrata, shared in a post on reinventing government are worth considering:

An open Government strategy needs to include Open Data as a component of enabling transparency and engaging citizens. However, Open Government is also about a commitment to open public meetings; releasing public information in all its forms, if not proactively at least in a timely fashion; engaging the public in decision making; and it is also a general mindset, backed up by clear policy, that citizens need to be empowered with information and a voice so they can hold their government accountable.

At the same time, a good Open Data strategy should support Open Government goals, by making structured data that relates to accountability and ethics, like spending data, contracts, staff salaries, elections, political contributions, program effectiveness, etc., available in machine- and human-readable formats.

The open data strategy advanced by the White House and 10 Downing Street has not embraced releasing all of those data types, although the Obama administration did follow through on the President’s promise to launch Ethics.gov.

The Obama administration has come under heavy criticism for the quality of its transparency efforts from watchdogs, political opponents and media. It’s fair to say that this White House has advanced an unprecedented effort to open up government information while it has a much more mixed record on transparency and accountability, particularly with respect to national security and a culture of secrecy around the surveillance state.

Open government advocates assert that the transparency that President Obama promised has not been delivered, as Charles Ornstein, a senior reporter at ProPublica, and Hagit Limor, president of the Society of Professional Journalists, wrote in the Washington Post. In fact, the current administration’s open data initiatives are one of the bright spots in its transparency record — and that’s in the context of real data quality and cultural issues that need to be addressed to match the rhetoric of the past four years.

“Government transparency is not the same as data that can be called via an API,” said Virginia Carlson, former president of the Metro Chicago Information Center. “I think the ‘New Tech’ world forgets that — open data is a political process first and foremost, and a technology problem second.”
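For concreteness, here is roughly what “data that can be called via an API” means on a Socrata-style portal: a dataset exposed as a JSON endpoint that accepts query parameters. The sketch below only builds the query URL; the dataset identifier (`abcd-1234`) and the field name (`payments`) are placeholders, while `$where` and `$limit` are real SODA query parameters.

```python
from urllib.parse import urlencode

# Sketch: constructing a query against a Socrata-style open data endpoint.
# The dataset ID and field name are invented for illustration; consult the
# portal's documentation for the actual resource IDs and column names.
base = "https://data.cityofchicago.org/resource/abcd-1234.json"
params = {
    "$where": "payments > 100000",  # server-side filter (SoQL)
    "$limit": 50,                   # cap the number of rows returned
}
url = base + "?" + urlencode(params)
print(url)
```

Fetching that URL with any HTTP client would return matching rows as JSON, which is exactly the convenience Carlson warns is not, by itself, transparency.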

If we look at what’s happening with open government in Chicago, a similar dynamic seems to have emerged, as the city methodically works to release high quality open data related to services, performance or lobbying but is more resistant to media organizations pushing for more access to data about the Mayor’s negotiations or electronic communications, the traditional targets of open government advocacy. This tension was explored quite well in an article by WBEZ on the people behind Chicago’s government 2.0 efforts.

In the United States, there is a sizable group of people that believe that data created using public funds should in turn be made available to the public — and that the Internet is a highly effective place to make such data available. Such thinking extends to open access to research or public sector code, too.

As those policy decisions are implemented, asking hard questions about data quality, use, licenses, outcomes and cost is both important and useful, particularly given that motivations and context will differ from country to country and from industry to civil society.

Who benefits and how? What existing entities are affected? Should all public data be subject to FOIA? If so, under what timelines and conditions? Should commercial entities that create or derive economic value from data pay for bulk access? What about licensing? If government goes digital, how can the poor, disabled or technically illiterate be given access and voice as well? (Answers to some of these questions are in the Sunlight Foundation’s principles of open government data, which were based on the recommendations of an earlier working group.)

In the United Kingdom, there are also concerns that the current administration’s “open data agenda” obscures a push toward privatization of public services, a dynamic that deserves more prominence in public debates and one that Morozov recently explored in the opinion pages of the New York Times. My colleague, Nat Torkington, highlighted the need for a discussion about which services should be provided by government at Radar back in 2010:

Obama and his staff, coming from the investment mindset, are building a Gov 2.0 infrastructure that creates a space for economic opportunity, informed citizens, and wider involvement in decision making so the government better reflects the community’s will. Cameron and his staff, coming from a cost mindset, are building a Gov 2.0 infrastructure that suggests it will be more about turning government-provided services over to the private sector.

Whether one agrees with the side of the argument that supports investment or the other that is looking for cost-savings — or both — is something that people of democratic societies will need to debate and decide for themselves, along with the size and role of government. The politics can’t be abstracted away.

I don’t think that many open government advocates are blind to the ideologies involved, including the goals of libertarians, nor do I think that the “open dystopia” that Newitz described at io9 is a particularly likely outcome.

That said, given the stakes, these policies deserve to be the subject of debate in every nation whose leaders are putting them forward. We’ve never had better tools for debate, discussion and collective action. Let’s use them.

What is the ROI of open government?

Putting a dollar value on clean water, stable markets, the quality of schooling or access to the judiciary is no easy task. Each of these elements of society, however, is to some extent related to and enabled by open government.

If we think about how the fundamental democratic principles established centuries ago extend today purely in terms of the abstraction of transparency, the “business value” of open government isn’t always immediately clear, at least with respect to investment or outcomes.

Transparency and accountability are core to how we think about government of, by and for the people, where a polity elects representative government. When budgets are constrained, however, city managers, mayors, controllers and chief information officers question the value of every single spending decision. (Or at least they should.)

It’s that context, of course, that’s driving good, hard questions about the business case for open government. Tim Berners-Lee, the inventor of the World Wide Web, said in 2011, at the launch of the Open Government Partnership in New York City, that increased transparency into a state’s finances and services directly relates to the willingness of businesses and other nations to invest in a country.

That’s the kind of thinking that has driven the World Bank to open up its data, to give people access to more information about where spending is happening and what those funds are spent upon. While transparency into government budgets varies immensely around the world, from frequently updated portals to paper records filed in local county offices, technology has given states new opportunities to be more accountable — or to be held accountable, by civic media and the emerging practice of data journalism.

The challenges of releasing spending data, however, are manifold, from quality assurance to the (limited) costs of publishing to making the data comprehensible to taxpayers through visualizations and calculators.

People in and outside of government are working to mitigate these issues, from using better visualization tools to adopting Web-based publishing platforms. The process of cleaning and preparing data for publication itself has returns for people inside government who need access to it. According to the McKinsey Global Institute, government workers spend, on average, 19% of their days simply looking for information.

In other words, opening government information to citizens can also mean it’s more available to government itself.

Organizing and establishing governance practices for data, even if some of it will never be published online, also has significant returns. Chicago chief data officer Brett Goldstein established probability curves for violent crime, explained John Tolva, the chief technology officer of the city of Chicago, when we talked in 2011. Since then, “we’re trying to do that elsewhere, uncovering cost savings, intervention points, and efficiencies,” he said.

“We have multiple phases for how we roll out data internally, starting with working with the business owner,” said Goldstein, in an interview. “We figure out how we’ll get it out of the transactional database. After that, we determine if it’s clean, if it’s verified, and if we can sign off on it technically.”
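The “clean, verified, signed off” phases Goldstein describes can be partly automated. Below is a hedged sketch of a pre-publication check in that spirit: confirm a dataset has its expected columns and no empty values in required fields before release. The column names are invented for illustration, not taken from any Chicago dataset.

```python
import csv
import io

# Illustrative pre-publication check: required columns present, no blank
# values in required fields. A real pipeline would also validate types,
# ranges, and referential integrity against the source system.
REQUIRED = ["ward", "service", "count"]

def validate(csv_text):
    rows = list(csv.DictReader(io.StringIO(csv_text)))
    problems = []
    if rows and set(REQUIRED) - set(rows[0]):
        missing = sorted(set(REQUIRED) - set(rows[0]))
        problems.append("missing columns: %s" % missing)
    for line_no, row in enumerate(rows, start=2):  # line 1 is the header
        for col in REQUIRED:
            if not (row.get(col) or "").strip():
                problems.append("line %d: empty %s" % (line_no, col))
    return problems

sample = "ward,service,count\n1,pothole,42\n2,,7\n"
print(validate(sample))  # flags the blank 'service' field
```

Checks like this are cheap to run on every refresh, which is how a portal keeps “clean and verified” from being a one-time claim.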

Tolva makes the business case for open data by identifying four areas that support investment, including an economic rationale.

  1. Trust
  2. Accountability of the work force
  3. Business building
  4. Urban analytics

After New York City moved to consolidate and clean its regulatory data, city officials were able to apply predictive data analytics to save lives and money. According to Mike Flowers, the chief analytics officer of NYC, the city achieved:

  • A five-fold return on the time of building inspectors looking for illegal apartments
  • An increase in the rate of detection for dangerous buildings that are highly likely to result in firefighter injury or death
  • The discovery of more than twice as many stores selling bootlegged cigarettes
  • A five-fold increase in the detection of business licenses being flipped

California’s recent budget woes coincided with unprecedented demand for government to be more efficient online. The state connected citizens to e-services with social media. Both the California unemployment office and the Department of Motor Vehicles were able to deliver better services online without additional cost.

“You can tweet @CA_EDD and get answers like how long until you get a check, where to go on the website or job fairs,” said Carolyn Lawson, the former deputy director for the technology services governance division in the eServices Office of California, in an interview. “I don’t think the creators of Twitter thought it would be a helpdesk for EDD.”

These kinds of efforts are far from the only places where there are clear returns on investment. A world away from big cities and states in the United States and urban data analytics, the World Bank found the ROI of open government through civic participation and mobile phones. Mobile participatory budgeting helped raise tax revenues in the Democratic Republic of the Congo, combining technology, civic participation and systems thinking to give citizens a voice in government.

“Beyond creating a more inclusive environment, the beauty of the project in South Kivu is that citizen participation translates into demonstrated and measurable results on mobilizing more public funds for services for the poor,” said Boris Weber, team leader for ICT4Gov at the World Bank Institute for Open Government, in an interview in Washington. “This makes a strong case when we ask ourselves where the return of investment of open government approaches is.”

This post originally appeared on LaserFiche.

Cameras in the courtroom: Will SCOTUS ever go live online?

In an age where setting up a livestream to the Web and the rest of the networked world is as easy as holding up a smartphone and making a few taps, the United States Supreme Court appears more uniformly opposed to adding cameras in the courtroom than ever.

SupremeCourt.gov provides online access to opinions, orders, dockets, court calendars, transcripts, schedules, rules, visitors’ guides, case-handling guides and press releases, and it even adopted responsive Web design in 2012.

As Adam Liptak reported for the New York Times today, despite a trend towards cameras in the rest of the legal system of the United States and in higher courts around the globe, the Supreme Court still rejects video coverage.

Moreover, the two newest members of the court, Associate Justices Sotomayor and Kagan, have shifted their positions towards opposing the addition of cameras since taking the bench.

As a result, the vast majority of Americans will only be able to listen to oral arguments, read transcripts, and learn about verdicts in the somewhat bizarre fashion that has emerged in the absence of live video for the Supreme Court.

There’s no liveblogging or tweeting from within the Supreme Court’s hearing room either, which leads to the beautiful mashup of old and new media below:


Are cameras in the courtroom better or worse for justice?

As Liptak reported, the question of adding cameras to the Supreme Court is considered by two new papers in the Brigham Young University Law Review.

IIT Chicago-Kent law professor Nancy Marder, the author of “The Conundrum of Cameras in the Courtroom,” is opposed to adding cameras, essentially arguing that the status quo in the Supreme Court and lower federal courts should remain in place:

Federal courts should post transcripts and audio recordings of court proceedings online, but stop short of permitting cameras in the courtroom. Federal judges need to consider the power of the image, the omnipresence of the camera, the spread of images via the Web, and the current lack of a “technology etiquette” that will guide the use of courtroom images on the Web. Until that etiquette develops, federal judges should take incremental steps to make courts more accessible, but should not allow cameras in federal courts, particularly in federal district courts.

As I suggested to Dan Diamond last year, it’s worth considering how courts in other nations have embraced the Web. He did exactly that, in Forbes:

For example, Australia’s High Court makes decisions available as text files, not just PDFs, and has a prominent link to daily transcripts. Canada’s Court offers webcasts and a front-page link to statistics on ten years’ worth of decisions.

And the United Kingdom’s Supreme Court website doesn’t just offer a link to live, streaming video – it even has a Twitter feed, too.

Brazil, in fact, has been broadcasting all of the judicial and administrative meetings of its Supreme Court live on television since 2002.

University of Oregon journalism professor Kyu Ho Youm went further down this line of inquiry in his new paper surveying the use of cameras in supreme courts and international human rights courts, concluding that the concerns of justices abroad have been allayed by the outcomes:

Foreign and international courts’ consistently positive experience with allowing electronic media access to courtrooms should be a useful guide for the justices of the U.S. Supreme Court. Nearly all the major assumptions, worries, and concerns that several Justices cite in opposing cameras are unlikely to be substantiated, as learned from the real-life experience of justices of the Supreme Courts of England and Canada.

Given the example of other nations, will the U.S. follow? As Liptak reported in the New York Times, Chief Justice John Roberts enumerated several ways the court has adopted technology but expressed reservations about cameras in particular.

“Cameras present all sorts of challenges that these other areas don’t,” said the chief justice, referring to making audio recordings and transcripts of hearings available. “I’m not going to go through the whole debate, it’s a fairly common one. We worry about the impact on lawyers. I worry about the impact on judges.”

His complete comments on adding cameras to the Supreme Court, from a 2011 conversation with Judge J. Harvie Wilkinson III at the Fourth Circuit Judicial Conference, are in the video excerpt embedded below.

“It would be interesting to hear what government institutions people think function better, now that they’re on television,” said the chief justice, “than if they’re not.”

Update: Justices Breyer and Kennedy recently were asked about this issue in Congress. Here’s their response, via Nancy Scola:

C-SPAN: “Justices Anthony Kennedy & Stephen Breyer discuss having television cameras in the Supreme Court. They do so in response to a question from Rep. Mike Quigley (D-IL).”

Samantha Power: OGP is President Obama’s signature governance initiative

On January 10th, 2013, the OpenGov Hub officially launched in Washington, DC.

The OpenGov Hub has similarities to incubators and accelerators, in terms of physically housing different organizations in one location, but focuses on scaling open government and building community, as opposed to scaling a startup and building a business.

Samantha Power, special assistant to President Obama and senior director for multilateral affairs and human rights in the White House, spoke about the Hub and the Open Government Partnership, which she was at the heart of starting — and about why “open government” matters to everyday citizens: improving lives and delivering results.

A video I recorded at the event, embedded below, captured her talk. After it, I’ve posted the text of her remarks, lightly edited for clarity. The emphases are mine.

“I’m jealous. It just feels cool. It feels like you’d come up with lots of ideas if you worked here. My office doesn’t feel quite like this, but we did hatch, collaboratively, the Open Government Partnership. 

I’ll just say a few things, mainly just to applaud this and to say how exciting it is.

The White House is a couple blocks in one direction, the State Department is another couple blocks in another, and there are a gazillion departments and agencies around who would really benefit from the infusion of energy and insight that you all bring to bear every day to your work.

President Obama started his first term issuing this Open Government Memorandum and it really did set the tone for the administration, and it does signal what a priority this was to him.

We are now on the verge of starting a second term and everybody in the administration is working to think through how does this manifest itself in the second term, the last term. You don’t get a chance after this next four years to do it again. We’re all very aware of that and we’re going to benefit from the ideas that you have.

Just to give you an indicator of what OGP has come to mean to the President — and this was catalyzed in a speech that he gave before the UN General Assembly. Those speeches are a kind of ‘State of the Union’ for foreign policy, and he chose to use that speech in year two of his presidency to talk about the fact that the old divisions, the old way of thinking of North and South, East and West, have been overtaken by open and closed and scales of openness, degrees of openness.

He challenged the countries there, the leaders, the peoples, to come back with ideas for how we could achieve more transparency, fight corruption, harness new technologies for innovation, and empower citizens. And that gave rise to this brainstorm, which in turn gave rise to this OpenGovHub, with this new leadership. We’re very, very excited about this next phase of OGP’s growth.

This, I think in many ways, is President Obama’s signature governance initiative, and it’s something he takes extremely seriously. In bilateral meetings with foreign heads of state he often brings this up, spontaneously, if we have failed, somehow, to get it into the talking points. It is something he’s talked to Prime Minister Cameron about in the U.K. The Indonesians of course are the co-chairs now, so it’s no longer his.

The trip to Burma, which just occurred, was a very moving trip. I got to be a part of that. It was amazing to see President Barack Hussein Obama at the home of Aung San Suu Kyi, maybe the next leader of that country, talking about open government, and the Open Government Partnership, and the Burmese coming out on that trip and committing to be part of the Open Government Partnership by 2015, and articulating each of the milestones for budget transparency, on disclosure for public officials, on civil liberties, freedom of information.

So [using] OGP, and this open government conversation, as a hook to make progress on issues that are critical at this stage of Burma’s long journey. I just wanted to convey how much this really matters to him personally.

Second, and you were talking about this earlier today, the challenge of conversion still exists, with other governments, with officials in my own government, and certainly with citizens and other groups around the world who don’t self-identify within the space. And so, I think, thinking through the ways in which platforms like this one pull together success stories and ways in which citizens have concretely benefitted, this is what it’s all about.

It’s not about the abstraction about ‘fighting corruption’ or ‘promoting transparency’ or ‘harnessing innovation’ — it’s about ‘are the kids getting the textbooks they’re supposed to get’ or does transparency provide a window into whether resources are going where they’re supposed to go and, to the degree to which that window exists, are citizens aware and benefiting from the data and that information such that they can hold their governments accountable. And then, does the government care that citizens care that those discrepancies exist?

That’s ultimately what this is about, and, I think, the more that we have concrete examples of real children, of real hospitals, real polluted water and clean water, real cost savings, in administrative budget terms, the more success we’re going to have in bringing new people into this community – and I confess, I was not one. Jeremy Weinstein used to come and knock on my door, and say, ‘What is this, open government?’ and I didn’t understand it.

Then, with a few examples, I said, ‘Oh, this is exactly what I’ve been trying to do under another rubric, you know, for a very long time.’ This creates the possibility for another kind of conversation.

Sometimes, democracy and human rights, issues like that, can put other governments on their heels. Open government creates the opportunity for conversations that otherwise sometimes don’t exist.

The last thing I’d say is, just to underscore a data point that’s been made, but in some sense, art imitates life, like this space imitates life. This space itself seems to be kind of predicated on the logic of open government — open idea sharing, information sharing, it’s great.

Our little OGP experiment, I think, is one that a lot of these groups are using. We benefited from what most of these groups and most of you have been doing, again, for a very long time, which is to recognize that we don’t know what we’re doing. We need to hear and learn from people who are out in the field. We have ideas and can be very abstract.

What the civil society partners have brought to the Open Government Partnership is just one example of what you’re bringing to people’s lives every day. You have to interface with people [to get] the ability to track whether policies are working.

Just as the partnership itself has this originality to it, of being multi-stakeholder and having civil society and governments at the table, figuring out what we’re doing, so too our criteria for whether a country is or isn’t eligible are the product of NGO data or academic frameworks; there just has to be cross-pollination.

Again, OGP is just one version of this, but I think the more that our communities are talking to one another, and certainly, speaking from the government perspective now, just sucking in the work and the insights that you all bring to bear, the better off real people are going to be in the world, and the more likely those kids are going to be to get those textbooks.

Thanks for having me.”

Civic app for finding flu shots goes viral

The 2012-2013 influenza season has been a bad one, with flu reaching epidemic levels in the United States. The data from Google Flu Trends, visualized below, tells the tale.

(Image: Google Flu Trends data, January 17, 2013)

Given the risk, mayors and public health officials around the country are using new technologies to connect residents with health care, from social media to widgets to flu shot finder maps. This week, it looks like the code for a flu shot location application created in Chicago is doing what viruses do best: go viral in cities.

Sam Roudman at TechPresident:

“In the middle of what might be the worst flu season in a decade, Boston Mayor Thomas Menino declared a public health emergency — and civic hackers found a way to help the cause. With help from Code for America volunteers, the Boston Mayor’s Office of New Urban Mechanics was able to repurpose a Chicago app that maps free vaccination locations in little more than a day, just in time for a weekend vaccination campaign at 24 locations. The app’s journey from Chicago to Boston is a model of intra-civic partnership.”

Chris Whitaker explored the origins of the app at the Code for America blog today:

Originally developed in Chicago by Tom Kompare, the flu shot app helps users find nearby clinics offering free flu shots by entering in their address or by using a GPS-enabled mobile device. It also allows users to get public-transit directions to those clinics at the click of a button.

Kompare built the app at the request of Chicago’s Department of Health. He started work on it after representatives from the department dropped by Chicago’s OpenGov Hack Night during the Google API challenge presentation in October and asked about an easy way for citizens to find out where to get a free flu shot. Within weeks, Kompare’s app was built, adopted, and hosted on the Smart Chicago Collaborative’s servers.

…Hours after Boston’s Mayor Menino had declared a public health emergency, Boston’s Brigade Captain Harlan Weber reached out to me about the use of the flu shot app.… The app was launched and ready for the public less than 36 hours after the initial email was sent.
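The core lookup the app performs, matching a user's position against a list of clinic coordinates to find the nearest free flu shot, can be sketched in a few lines. The clinic names and coordinates below are invented for illustration; a real deployment would load the city's actual open dataset of vaccination sites.

```python
import math

# Hypothetical clinic records; a real app would load these from the
# city's open data portal (e.g., a CSV of free flu-shot locations).
CLINICS = [
    {"name": "Uptown Clinic", "lat": 41.9665, "lon": -87.6533},
    {"name": "South Side Health Center", "lat": 41.7943, "lon": -87.6230},
    {"name": "West Town Clinic", "lat": 41.8955, "lon": -87.6866},
]

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometers."""
    r = 6371.0  # mean Earth radius, km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def nearest_clinic(lat, lon):
    """Return the clinic closest to the user's position (from GPS or a geocoded address)."""
    return min(CLINICS, key=lambda c: haversine_km(lat, lon, c["lat"], c["lon"]))
```

From here, an app like Kompare's would hand the chosen clinic's coordinates to a transit-directions API, which is how the one-click public-transit routing he describes works.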

Now, it looks like the code is spreading from Chicago to Philadelphia as well, according to a tweet from Philly’s chief data officer, Mark Headd.

If you still need a shot, these maps can help you learn where to find one. In the meantime, take care to keep healthy by frequently washing your hands and protecting others if you do fall ill by covering your mouth when you cough.

 

Open Government Partnership hosts regional meeting in Chile

The Open Government Partnership (OGP) has released statistics on its first 16 months since its historic launch in New York City, collected together in the infographic embedded below. This week, open government leaders are meeting in Chile to discuss the formal addition of Argentina to the partnership and the national plans that Latin American countries have pledged to implement. Álvaro Ramirez Alujas, founder of the Group of Investigation in Government, Administration and Public Policy (GIGAPP), assisted OGP with an analysis of these action plans. Alujas found that:

  • 46% are linked to commitments on public integrity
  • 27% are related to the improvement of public services
  • 14% are linked to more effectively managing public resources and
  • 12% are related to increasing accountability and corporate responsibility.

Gobierno abierto

The infographic is also available en Español:

Accountability for accountability

As I noted in my assessment of 2012 trends for Radar, last year the Economist’s assessment was that open government grew globally in scope and clout.

As we head into 2013, it’s worth reiterating a point I made last summer in a post on oversight of the Open Government Partnership:

There will be inevitable diplomatic challenges for OGP, from South Africa’s proposed secrecy law to Russia’s membership. Given that context, all of the stakeholders in the Open Government Partnership — from the government co-chairs in Brazil and the United Kingdom to the leaders of participating countries to the members of civil society that have been given a seat at the table — will need to keep pressure on other stakeholders if significant progress is going to be made on all of these fronts.

If OGP is to be judged as more than a PR opportunity for politicians and diplomats to make bold framing statements, government and civil society leaders will need to do more to hold countries accountable to the commitments required for participation: they must submit action plans after a bona fide public consultation. Moreover, they’ll need to define the metrics by which progress should be judged and be clear with citizens about the timelines for change.

African News Challenge funds data journalism and open government tech

The post-industrial future of journalism is already here. It’s just not evenly distributed yet. The same trends changing journalism and society have the potential to create significant social change throughout the African continent, as states move from conditions of information scarcity to abundance.

That reality was clear on my recent trip to Africa, where I had the opportunity to interview Justin Arenstein at length during my visit to Zanzibar. Arenstein is building the capacity of African media to practice data-driven journalism, a task that has taken on new importance amid the digital disruption that has permanently altered how we discover, read, share and participate in news.

One of the primary ways he’s been able to build that capacity is through the African News Innovation Challenge (ANIC), a variation on the Knight News Challenge in the United States.

The 2011 Knight News Challenge winners illustrated data’s ascendance in media and government, with platforms for data journalism and civic connections dominating the field.

As I wrote last September, the projects that the Knight Foundation has chosen to fund over the last two years are notable examples of working on stuff that matters: they represent collective investments in digital civic infrastructure.

The first winners of the African News Innovation Challenge, which concluded this winter, look set to extend that investment throughout the continent of Africa.

“Africa’s media face some serious challenges, and each of our winners tries to solve a real-world problem that journalists are grappling with. This includes the public’s growing concern about the manipulation and accuracy of online content, plus concerns around the security of communications and of whistleblowers or journalistic sources,” wrote Arenstein on the News Challenge blog.

While the twenty 2012 winners include investigative journalism tools and whistleblower security, there’s also a focus on citizen engagement, digitization and making public data actionable. To put it another way, the “news innovation” that’s being funded on both continents isn’t just gathering and disseminating information: it’s now generating data and putting it to work in the service of the needs of residents or the benefit of society.

“The other major theme evident in many of the 500 entries to ANIC is the realisation that the media needs better ways to engage with audiences,” wrote Arenstein. “Many of our winners try tackle this, with projects ranging from mobile apps to mobilise citizens against corruption, to improved infographics to better explain complex issues, to completely new platforms for beaming content into buses and taxis, or even using drone aircraft to get cameras to isolated communities.”

In the first half of our interview, published last year at Radar, Arenstein talked about Hacks/Hackers, and expanding the capacity of data journalism. In the second half, below, we talk about his work at African Media Initiative (AMI), the role of open source in civic media, and how an unconference model for convening people is relevant to innovation.

What have you accomplished at the AMI to date?

Justin Arenstein: The AMI has been going on for just over three years. It’s a fairly young organization, and I’ve been embedded now for about 18 months. The major deliverables and the major successes so far have been:

  1. A $1 million African News Innovation Challenge, which was modeled fairly closely on the Knight Challenge, but with a different set of intended outputs.
  2. A network of Hacks/Hackers chapters across the continent.
  3. A number of technology support or technology development initiatives. Little pilot projects, invariably newsroom-based.

The idea is that we test ideas that are allowed to fail. We fund them in newsrooms and they’re driven by newsrooms. We match them up with technologists. We try and lower the barrier for companies to start experimenting and try and minimize risk as much as possible for them. We’ve launched a couple of slightly larger funds for helping to scale some of these ideas. We’ve just started work on a social venture or a VC fund as well.

You mentioned different outputs in the News Challenge. What does that mean?

Justin Arenstein: Africa hasn’t had the five-year kind of evolutionary growth that the Knight News Challenge has had in the U.S. What the News Challenge has done in the U.S. is effectively grown an ecosystem where newsrooms started to grapple with and accepted the reality that they have to innovate. They have to experiment. Digital is core to the way that they’re not only pushing news out but to the way that they produce it and the way that they process it.

We haven’t had any of that evolution yet in Africa. When you talk about digital news with African media, they think you’re speaking about social media or a website. We’re almost right back at where the News Challenge started originally. At the moment, what we’re trying to do is raise sensitivity to the fact that there are far more efficient ways of gathering, ingesting, processing and then publishing digital content — and building tools that are specifically suited for the African environment.

There are bandwidth issues. There are issues around literacy, language use and also, in some cases, very different traditions of producing news. The output of what would be considered news in Africa might not be considered news product in some Western markets. We’re trying to develop products to deal with those gaps in the ecosystem.

What were the most promising News Challenge entrants that actually relate to those outputs?

Justin Arenstein: Some of the projects that we thought were particularly strong or apt amongst the African News Challenge finalists included more efficient or more integrated ways to manage workflow. If you look at many of the workflow software suites in the north, they’re, by African standards, completely unaffordable. As a result, there hasn’t been any systemic way that media down here produced news, which means that there’s virtually no way that they are storing and managing content for repackaging and for multi-platform publishing.

We’re looking at ways of not reinventing a CMS [content management system], but actually managing and streamlining workflow from ingesting reporting all the way to publishing.

Some of the biggest blogs in the world are running on WordPress for a CMS. Why not use that where needed?

Justin Arenstein: I think I may have misspoken by saying “content management systems.” I’m referring to managing, gathering and storing old news, the production and the writing of new content, a three or four phase editing process, and then publishing across multiple platforms. Ingesting creative design, layout, and making packages into podcasting or radio formats, and then publishing into things like Drupal or WordPress.

There have been attempts to take existing CMS systems like Drupal and turn them into broader, more ambitious workflow management tools. We haven’t seen very many successful ones. A lot of the kinds of media that we work with are effectively offline media, so these have been very lightweight applications.

The one thing that we have focused on is trying to “future-proof” it, to some extent, by building a lot of meta tagging and data management tools into these new products. That’s because we’re also trying to position a lot of the media partners we’re working with to be able to think about their businesses as data or content-driven businesses, as opposed to producing newspapers or manufacturing businesses. This seems to be working well in some early pilots we’ve been doing in Kenya.

What were your takeaways from the Tech Camp? Was a hybrid unconference a good model for the News Challenge?

Justin Arenstein: A big goal that we think we’ve achieved was to try and build a community of use. We put people together. We deliberately took them to an exotic location, far away from a town or location, where they’re effectively held hostage in a hotel. We built in as much free time as possible, with many opportunities to socialize, so that they start creating bonds. Right from the beginning, we did a “speed dating” kind of thing. There were very few presentations — in fact, there was only one PowerPoint in five days. The rest of the time, it’s actually the participants teaching each other.

We brought in some additional technology experts or facilitators, but they were handpicked largely from previous challenges to share the experience of going through a similar process and to point people to existing resources that they might not be aware of. That seems to have worked very well.

On the sidelines of the Tech Camp, we’ve seen additional collaborations happen for which people are not asking for funding. It just makes logical sense. We’ve already seen some of the initial fruits of that: three of the applicants actually partnered and merged their applications. We’ve seen a workflow editorial CMS project partner up with an ad booking and production management system, to create a more holistic suite. They’re still building as two separate teams, but they’re now sharing standards and they’re building them as modular products that could be sold as a broader product suite.

The Knight News Challenge has stimulated the creation of many open source tools. Is any of that code being re-used?

Justin Arenstein: We’ve tried to tap into quite a few of them. Some of the more recent tools are transferable. I think there was a grand realization that people weren’t able to deliver on their promises — and where they did deliver on tools, there wasn’t documentation. The code was quite messy. They weren’t really robust. Often, applications were written for specific local markets or data requirements that didn’t transfer. You actually effectively had to rebuild them. We have been able to re-purpose DocumentCloud and some other tools.

I think we’ve learned from that process. What we’re trying to do with our News Challenge is to workshop finalists quite aggressively before they put in their final proposals.

Firstly, make sure that they’re being realistic, that they’re not unnecessarily building components, or wasting money and energy on building components for their project that are not unique, not revolutionary or innovative. They should try and almost “plug and play” with what already exists in the ecosystem, and then concentrate on building the new extensions, the real kind of innovations. We’re trying to improve on the Knight model.

Secondly, once the grantees actually get money, it comes in a tranche format so they agree to an implementation plan. They get cash, in fairly small grants by Knight standards. The maximum is $100,000. In addition, they get engineering or programming support from external developers that are on our payroll, working out of our labs. We’ve got a civic lab running out of Kenya and partners, such as Google.

Thirdly, they get business mentorship support from some leading commercial business consultants. These aren’t nonprofit types. These are people who are already advising some of the largest media companies in the world.

The idea is that, through that process, we’re hopefully going to arrive at a more realistic set of projects that have either sustainable revenue models and scaling plans, from the beginning, or built-in mechanisms for assessments, reporting back and learning, if they’re designed purely as experiments.

We’re not certain if it’s going to work. It’s an experiment. On the basis of the Tech Camp that we’ve gone through, it seems to have worked very well. We’ve seen people abandon what were, we thought, overly ambitious technology plans and rather matched up or partnered with existing technologists. They will still achieve their goals but do so in a more streamlined, agile manner by re-purposing existing tech.

Editor’s Note: This interview is part of an ongoing series at the O’Reilly Radar on the people, tools and techniques driving data journalism.

PollWatchUSA enables anyone with a smartphone to act as a poll monitor

Pollwatch, a mobile application that enables crowdsourced poll monitoring, has launched a final version at pollwatch.us, just in time for Election Day 2012. The initial iteration of the app was conceived, developed and demonstrated at the hackathon at the 2012 Personal Democracy Forum in New York City. The app aggregates reports and visualizes the user-generated data at pollwatchusa.org/viz.
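The aggregation step mentioned above can be sketched simply: group incoming reports by polling site and count issue types, producing the kind of summary a map or dashboard would visualize. The site names, issue labels, and record schema below are hypothetical; Pollwatch's actual data model isn't described here.

```python
from collections import Counter, defaultdict

# Hypothetical crowdsourced reports, as they might arrive from voters' phones.
reports = [
    {"site": "PS 41", "issue": "long lines"},
    {"site": "PS 41", "issue": "long lines"},
    {"site": "PS 41", "issue": "machine broken"},
    {"site": "Brooklyn Tech", "issue": "long lines"},
]

def aggregate(reports):
    """Count reported issues per polling site, for advocates and dashboards."""
    by_site = defaultdict(Counter)
    for r in reports:
        by_site[r["site"]][r["issue"]] += 1
    return by_site

summary = aggregate(reports)
```

Grouping this way lets election advocates spot sites with clusters of identical complaints, which is what makes crowdsourced reports actionable rather than anecdotal.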

Pollwatch iPhone app

The app is the result of a collaboration between the PollWatch team, which includes Reboot and WebSava, and Common Cause/NY, along with input from TurboVote. The project also received support from the Voting Information Project and Latino Justice.

“Election Day is often hampered by inefficiency and confusion, leaving voters with little recourse. PollWatchUSA was conceived to help voters report problems in real time, by putting the tool in the palm of their hands. Through crowd sourcing, Common Cause/NY hopes to collect a broad data set to better identify the issues and help create a more effective elections administration system,” said Susan Lerner, Executive Director of Common Cause/NY, in a prepared statement.

The data for polling locations is coming from the Voting Information Project, which has acted as civic infrastructure for a number of efforts this year.

“Susan Lerner, our project co-sponsor at Common Cause, was instrumental in making sure the New York polling sites were included in that dataset (with much nudging and cajoling to the Board of Elections),” emailed Jeremy Canfield, service designer at Reboot.

Canfield explained that the project went through three iterations since June.

“We tested it out with users in two primaries, plus got some help from one of Union Square Ventures Product Feedback days,” he wrote. “We used that feedback to simplify the flow, making it as easy as possible for users to report on their voting experience. By making it easy and lightweight to report, plus sharing those reports widely, we can get better data to election advocates (chief among them, Common Cause), who can provide immediate help or work with the various boards of elections to make real time adjustments.”

Notably, Pollwatch is made to work on any smartphone, not just a single platform. The team chose to develop a mobile website, not a native app, avoiding the “shiny app syndrome” that has been problematic for some local governments. Well done, all.

To use social media in a time of need, start building networks before disasters

As is the case in every major event in the U.S., social media was part of the fabric of communications during Hurricane Sandy. Twitter was a window into what was happening in real-time. Facebook gave families and friends a way to stay in touch about safety or power. And government officials and employees, from first responders and mayors to governors to the President of the United States, put critical information into the hands of citizens that needed it.

While Hurricane Sandy cemented the utility of these networks, neither they nor their role are new. With all due respect to Gartner analyst Andrea Di Maio, his notion that people aren’t conveying “useful information” there every day — that it’s just “chatting about sport results, or favorite actors, or how to bake” — is like some weird flashback to a 2007 blog post or an ignorant cable news anchor.

Public sector, first responders and emergency management officials have recognized the utility of social media reports as a means for situational awareness before, during and after natural or man-made disasters for years now and have integrated tools into crisis response.

Officials at local, state and federal levels have confirmed to me again and again that it’s critical to build trusted networks *before* disaster strikes so that when crises occur, the quality of intelligence is improved and existing relationships with influential accounts can amplify their messages.

Media and civil society serve as infomediaries and critical filters (aka, B.S. detectors) for vetting information, something that has proved crucial with fake reports and pictures popping up. Official government accounts play a critical role for putting trusted information into the networks to share, something we saw in real-time up and down the East Coast this week.

To be frank, Di Maio’s advice that authorities shouldn’t incorporate social media into their normal course of business is precisely the opposite of the experience on the ground of organizations like the Los Angeles Fire Department, Red Cross or FEMA. Here’s Brian Humphrey, public information officer of the LAFD, on best practices for social media:

If public safety officials come across Di Maio’s advice, I hope they’ll choose instead to listen to citizens every day and look to scale the best practices of their peers for using technology for emergency response, not start during a crisis.