Gov 2.0

Beware openwashing. Question secrecy. Acknowledge ideology.

You could spend a long day listing all of the organizations and individuals putting government data online, from Carl Malamud to open government activists in Brazil, Africa or Canada. As many conversations in the public domain over the past few years have demonstrated, there are many different perspectives on what purposes “open data” should serve, often informed by what advocates intend or by an organization or institution’s goals. For those interested, I highly recommend the open data seminar and its associated comments.

When and if such data includes ratings or malpractice information about hospitals or doctors, or fees for insurance companies, transparency and accountability are important byproducts, which in turn have political implications. (Watch the reaction of unions or doctors’ groups to performance or claims data going online for those conflicts.)

There are people who want to see legislatures open their data, to provide more insight into those processes, and others who want to see transit data or health data become more open, in the service of more civic utility or patient empowerment.

Other people may support publishing more information about the business or performance of government because evidence of fraud, mismanagement or incompetence will support their arguments for shrinking the size of the state. A big tent for open government can mean that libertarians could end up supporting the same bills liberals do.

In the U.S., GovTrack.us has been making government legislative data open, despite the lack of bulk access to Thomas.gov, by “scraping.” There are many people who wish to see campaign finance data open, like the Sunlight Foundation, to show where influence and power lie in the political system. There are many members of civil society, media organizations and startups that are collecting, sharing or using open data, from OpenCorporates to OpenCongress to BrightScope and ProPublica.
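
To make that “scraping” reference concrete, here is a minimal sketch of the technique in Python. The URL and page structure below are hypothetical stand-ins, not the real THOMAS markup, and GovTrack’s actual parsers are far more involved.

```python
# Minimal scraping sketch: fetch a (hypothetical) bill index page and pull
# out bill titles. Requires the requests and beautifulsoup4 packages.
import requests
from bs4 import BeautifulSoup

def fetch_bill_titles(index_url):
    """Download one index page and extract the text of each bill link."""
    resp = requests.get(index_url, timeout=30)
    resp.raise_for_status()
    soup = BeautifulSoup(resp.text, "html.parser")
    # Assumes each bill appears as a link inside an element with class "bill".
    return [a.get_text(strip=True) for a in soup.select(".bill a")]

if __name__ == "__main__":
    for title in fetch_bill_titles("https://example.gov/bills?congress=112"):
        print(title)
```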

Whether anyone chooses to describe those activities as a movement is up to them — but it is indisputable that three years ago, a neutral observer would have been hard-pressed to find an open government data platform. Now there are dozens at the national level. What matters more than their existence, however, is what goes onto them, and there people have to be extremely careful about giving governments credit for just putting a “portal” online.

While the raw number of open government data platforms around the globe looks set to continue to increase in 2013 at every level of government, advocates should be wary of governments claiming “open government” victories as a result.

Since Morozov sent out that tweet, he’s published a book with a chapter that extends that critique, along with a series of New York Times op-eds, reviews, Slate debates, and a 16,000-word essay in The Baffler that explores the career and thinking of Tim O’Reilly (my publisher). Morozov’s essay prompted Annalee Newitz to paraphrase and link to it in a post at io9, to which Tim responded in a comment.

While his style can distract and detract from his work — and his behavior on Twitter can be fairly characterized as contemptuous at times — the issues Morozov raises around technology and philosophy are important and deserve to be directly engaged by open government advocates, as John Wilbanks suggests.

That’s happening, slowly. Sunlight Foundation policy director John Wonderlich has also responded, quoting Morozov’s recommendations to reflect on how they might apply to specific uses of technology that support open government. Wilbanks himself has written one of the most effective (short) responses to date:

One of the reasons I do “open” work is that I think, in the sciences, it’s a philosophical approach that is more likely to lead to that epistemic transformation. If we have more data available about a scientific problem like climate change, or cancer, then the odds of the algorithms figuring something out that is “true” but incomprehensible to us humans go up. Sam Arbesman has written about this nicely both in his book The Half-Life of Facts and in another recent Slate article.

I work for “open” not because “open” solves a specific scientific problem, but because it increases the overall probability of success in sensorism-driven science. Even if the odds of success themselves don’t change, increasing the sample size of attempts will increase the net number of successes. I have philosophical reasons for liking open as well, and those clearly cause me cognitive bias on the topic, but I deeply believe that the greatest value in open science is precisely the increased sample size of those looking.

I also tend to think there’s a truly, deeply political element to enabling access to knowledge and science. I don’t think it’s openwashing (and you should read this paper recommended by Morozov on the topic) to say that letting individuals read science can have a real political impact.

Morozov’s critique of “openwashing” isn’t specious, though it’s fair to question his depiction of the history of open source and free software and the absence of balance in his consideration of various open government efforts. Civil society and media must be extremely careful about giving governments credit for just putting a “portal” online.

On that count, Wonderlich wrote about the “missing open data policy” that every government that has stood up or will stand up an open data platform could benefit from reading:

Most newly implemented open data policies, much like the Open Government Directive, are announced alongside a package of newly released datasets, and often new data portals, like Data.gov. In a sense, these pieces have become the standard parts of the government data transparency structure. There’s a policy that says data should generally be open and usefully released, a central site for accessing it, some set of new data, and perhaps a few apps that demonstrate the data’s value.

Unfortunately, this is not the anatomy of an open government. Instead, this is the anatomy of the popular open government data initiatives that are currently in favor. Governments have learned to say that data will be open, provide a place to find it, release some selected datasets, and point to its reuse.

This goes to the concerns of traditional advocates working for good government, as explored in an excellent research paper by Yu and Robinson on the ambiguity of open government and open data, along with the broader discussion you’ll find in civil society in the lead-up to the Open Government Partnership, where this dynamic was the subject of much concern — and not just in the Canadian or United Kingdom context. The work exploring this dynamic by Nathaniel Heller at Global Integrity is instructive.

As I’ve written before (unrepentant self-plagiarism alert), standing up open data platforms and publishing data sets regarding services is not a replacement for a Constitution that enforces a rule of law, free and fair elections, an effective judiciary, decent schools, basic regulatory bodies or civil society, particularly if the data does not relate to meaningful aspects of society.

Socrata, a venture capital-backed startup whose technology powers the open data platforms of several city, state and national governments, including those of Kenya and the United States, is also part of this ecosystem and indisputably has “skin in the game.”

That said, the insights that Kevin Merritt, the founder of Socrata, shared in a post on reinventing government are worth considering:

An open Government strategy needs to include Open Data as a component of enabling transparency and engaging citizens. However, Open Government is also about a commitment to open public meetings; releasing public information in all its forms, if not proactively at least in a timely fashion; engaging the public in decision making; and it is also a general mindset, backed up by clear policy, that citizens need to be empowered with information and a voice so they can hold their government accountable.

At the same time, a good Open Data strategy should support Open Government goals, by making structured data that relates to accountability and ethics, like spending data, contracts, staff salaries, elections, political contributions, program effectiveness, etc., available in machine- and human-readable formats.
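
To illustrate the “machine- and human-readable” distinction Merritt draws, here is a small Python sketch that renders one spending record as both JSON and CSV. The record and its field names are invented for the example.

```python
# One hypothetical spending record, serialized two ways.
import csv
import io
import json

record = {
    "agency": "Department of Example",
    "vendor": "Acme Paving LLC",
    "contract_id": "C-2013-0042",
    "amount_usd": 125000.00,
}

# JSON: trivially parseable by programs, still legible to people.
print(json.dumps(record, indent=2))

# CSV: a header row plus a data row, friendly to spreadsheets.
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=list(record))
writer.writeheader()
writer.writerow(record)
print(buf.getvalue())
```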

The open data strategy advanced by the White House and 10 Downing Street has not embraced releasing all of those data types, although the Obama administration did follow through on the President’s promise to launch Ethics.gov.

The Obama administration has come under heavy criticism for the quality of its transparency efforts from watchdogs, political opponents and media. It’s fair to say that this White House has advanced an unprecedented effort to open up government information, while it has a much more mixed record on transparency and accountability, particularly with respect to national security and a culture of secrecy around the surveillance state.

Open government advocates assert that the transparency that President Obama promised has not been delivered, as Charles Ornstein, a senior reporter at ProPublica, and Hagit Limor, president of the Society of Professional Journalists, wrote in the Washington Post. In fact, the current administration’s open data initiatives are one of the bright spots in its transparency record — and that’s in the context of real data quality and cultural issues that need to be addressed if the record is to match the rhetoric of the past four years.

“Government transparency is not the same as data that can be called via an API,” said Virginia Carlson, former president of the Metro Chicago Information Center. “I think the ‘New Tech’ world forgets that — open data is a political process first and foremost, and a technology problem second.”

If we look at what’s happening with open government in Chicago, a similar dynamic seems to have emerged: the city methodically works to release high-quality open data related to services, performance or lobbying, but is more resistant to media organizations pushing for more access to data about the mayor’s negotiations or electronic communications, the traditional targets of open government advocacy. This tension was explored quite well in a WBEZ article on the people behind Chicago’s government 2.0 efforts.
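
Since Chicago’s portal (like many others) runs on Socrata, pulling rows from a published dataset is typically one HTTP call against Socrata’s standard /resource/&lt;dataset-id&gt;.json endpoint. A sketch follows; the dataset ID here is a placeholder, not a real Chicago dataset.

```python
# Sketch of reading from a Socrata open data portal via its JSON endpoint.
import requests

def fetch_rows(domain, dataset_id, limit=5):
    """Return up to `limit` rows of a dataset as a list of dicts."""
    url = f"https://{domain}/resource/{dataset_id}.json"
    resp = requests.get(url, params={"$limit": limit}, timeout=30)
    resp.raise_for_status()
    return resp.json()

if __name__ == "__main__":
    # "abcd-1234" is a placeholder dataset ID.
    for row in fetch_rows("data.cityofchicago.org", "abcd-1234"):
        print(row)
```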

In the United States, there is a sizable group of people that believe that data created using public funds should in turn be made available to the public — and that the Internet is a highly effective place to make such data available. Such thinking extends to open access to research or public sector code, too.

As those policy decisions are implemented, asking hard questions about data quality, use, licenses, outcomes and cost is both important and useful, particularly given that motivations and context will differ from country to country and from industry to civil society.

Who benefits and how? What existing entities are affected? Should all public data be subject to FOIA? If so, under what timelines and conditions? Should commercial entities that create or derive economic value from data pay for bulk access? What about licensing? If government goes digital, how can the poor, disabled or technically illiterate be given access and voice as well? (Answers to some of these questions are in the Sunlight Foundation’s principles of open government data, which were based on the recommendations of an earlier working group.)

In the United Kingdom, there are also concerns that the current administration’s “open data agenda” obscures a push towards privatization of public services that should be more prominent in public debates, a dynamic that Morozov recently explored in the opinion pages of the New York Times. My colleague Nat Torkington highlighted the need for a discussion about which services should be provided by government at Radar back in 2010:

Obama and his staff, coming from the investment mindset, are building a Gov 2.0 infrastructure that creates a space for economic opportunity, informed citizens, and wider involvement in decision making so the government better reflects the community’s will. Cameron and his staff, coming from a cost mindset, are building a Gov 2.0 infrastructure that suggests it will be more about turning government-provided services over to the private sector.

Whether one agrees with the side of the argument that supports investment or the other that is looking for cost-savings — or both — is something that people of democratic societies will need to debate and decide for themselves, along with the size and role of government. The politics can’t be abstracted away.

I don’t think that many open government advocates are blind to the ideologies involved, including the goals of libertarians, nor do I think that the “open dystopia” that Newitz described at io9 is a particularly likely outcome.

That said, given the stakes, these policies deserve to be the subject of debate in every nation whose leaders are putting them forward. We’ve never had better tools for debate, discussion and collective action. Let’s use them.

Cameras in the courtroom: Will SCOTUS ever go live online?

In an age where setting up a livestream to the Web and the rest of the networked world is as easy as holding up a smartphone and making a few taps, the United States Supreme Court appears more uniformly opposed to adding cameras in the courtroom than ever.

SupremeCourt.gov provides online access to opinions, orders, dockets, court calendars, transcripts, schedules, rules, visitors’ guides, case-handling guides and press releases, and the site even adopted responsive Web design in 2012.

As Adam Liptak reported for the New York Times today, despite a trend towards cameras in the rest of the legal system of the United States and in higher courts around the globe, the Supreme Court still rejects video coverage.

Moreover, the two newest members of the court, Associate Justices Sotomayor and Kagan, have shifted their positions towards opposing the addition of cameras since taking the bench.

As a result, the vast majority of Americans will only be able to listen to oral arguments, read transcripts, and learn about verdicts in the somewhat bizarre fashion that has emerged in the absence of live video for the Supreme Court.

There’s no liveblogging or tweeting from within the Supreme Court’s hearing room either, which leads to the beautiful mashup of old and new media below:

Are cameras in the courtroom better or worse for justice?

As Liptak reported, the question of adding cameras to the Supreme Court is considered by two new papers in the Brigham Young University Law Review.

IIT Chicago-Kent law professor Nancy Marder, the author of “The Conundrum of Cameras in the Courtroom,” is opposed to adding cameras, essentially arguing that the status quo in SCOTUS and the lower federal courts should remain in place:

Federal courts should post transcripts and audio recordings of court proceedings online, but stop short of permitting cameras in the courtroom. Federal judges need to consider the power of the image, the omnipresence of the camera, the spread of images via the Web, and the current lack of a “technology etiquette” that will guide the use of courtroom images on the Web. Until that etiquette develops, federal judges should take incremental steps to make courts more accessible, but should not allow cameras in federal courts, particularly in federal district courts.

As I suggested to Dan Diamond last year, it’s worth considering how courts in other nations have embraced the Web. He did exactly that, in Forbes:

For example, Australia’s High Court makes decisions available as text files, not just PDFs, and has a prominent link to daily transcripts. Canada’s Court offers webcasts and a front-page link to statistics on ten years’ worth of decisions.

And the United Kingdom’s Supreme Court website doesn’t just offer a link to live, streaming video – it even has a Twitter feed, too.

Brazil, in fact, has been broadcasting all of the judicial and administrative meetings of its Supreme Court live on television since 2002.

University of Oregon journalism professor Kyu Ho Youm went further down this line of inquiry in his new paper surveying the use of cameras in supreme courts and international human rights courts, concluding that the concerns of justices abroad have been allayed by the outcomes:

Foreign and international courts’ consistently positive experience with allowing electronic media access to courtrooms should be a useful guide for the justices of the U.S. Supreme Court. Nearly all the major assumptions, worries, and concerns that several Justices cite in opposing cameras are unlikely to be substantiated, as learned from the real-life experience of justices of the Supreme Courts of England and Canada.
Given the example of other nations, will the U.S. follow? As Liptak reported in the New York Times, Chief Justice John Roberts enumerated several ways the court has adopted technology but expressed reservations about cameras in particular.

“Cameras present all sorts of challenges that these other areas don’t,” said the chief justice, referring to making audio recordings and transcripts of hearings available. “I’m not going to go through the whole debate, it’s a fairly common one. We worry about the impact on lawyers. I worry about the impact on judges.”

His complete comments on adding cameras to the Supreme Court, from a 2011 conversation with Judge J. Harvie Wilkinson III at the Fourth Circuit Judicial Conference, are in the video excerpt embedded below.

“It would be interesting to hear what government institutions people think function better, now that they’re on television,” said the chief justice, “than if they’re not.”

Update: Justices Breyer and Kennedy were recently asked about this issue in Congress. Here’s their response, via Nancy Scola:

C-SPAN: “Justices Anthony Kennedy & Stephen Breyer discuss having television cameras in the Supreme Court. They do so in response to a question from Rep. Mike Quigley (D-IL).”

Civic app for finding flu shots goes viral

The 2012-2013 influenza season has been a bad one, with flu reaching epidemic levels in the United States. The data from Google Flu Trends, visualized below, tells the tale.

(Chart: Google Flu Trends, January 17, 2013.)

Given the risk, mayors and public health officials around the country are using new technologies to connect residents with health care, from social media to widgets to flu shot finder maps. This week, it looks like the code for a flu shot location application created in Chicago is doing what viruses do best: going viral in cities.

Sam Roudman at TechPresident:

“In the middle of what might be the worst flu season in a decade, Boston Mayor Thomas Menino declared a public health emergency — and civic hackers found a way to help the cause. With help from Code for America volunteers, the Boston Mayor’s Office of New Urban Mechanics was able to repurpose a Chicago app that maps free vaccination locations in little more than a day, just in time for a weekend vaccination campaign at 24 locations. The app’s journey from Chicago to Boston is a model of intra-civic partnership.”

Chris Whitaker explored the origins of the app at the Code for America blog today:

Originally developed in Chicago by Tom Kompare, the flu shot app helps users find nearby clinics offering free flu shots by entering their address or by using a GPS-enabled mobile device. It also allows users to get public-transit directions to those clinics at the click of a button.

Kompare started work on the app at the request of Chicago’s Department of Health, after representatives from the department dropped by Chicago’s OpenGov Hack Night during the Google API challenge presentation in October and asked about an easy way for citizens to find out where to get a free flu shot. Within weeks, Kompare’s app was built, adopted, and hosted on the Smart Chicago Collaborative’s servers.

…Hours after Boston’s Mayor Menino had declared a public health emergency, Boston’s Brigade Captain Harlan Weber reached out to me about the use of the flu shot app. … The app was launched and ready for the public less than 36 hours after the initial email was sent.
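
The core lookup such an app performs is simple enough to sketch: rank clinics by great-circle distance from the user’s location. The sketch below uses the haversine formula with made-up clinic data; it is not Kompare’s actual code.

```python
# Toy flu shot finder: sort clinics by distance from the user.
from math import asin, cos, radians, sin, sqrt

CLINICS = [  # (name, latitude, longitude) -- invented sample data
    ("Uptown Clinic", 41.9665, -87.6533),
    ("South Side Health Center", 41.7943, -87.6168),
    ("West Loop Pharmacy", 41.8827, -87.6519),
]

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometers."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 \
        + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))

def nearest_clinics(lat, lon):
    return sorted(CLINICS, key=lambda c: haversine_km(lat, lon, c[1], c[2]))

if __name__ == "__main__":
    user_lat, user_lon = 41.8781, -87.6298  # downtown Chicago
    for name, lat, lon in nearest_clinics(user_lat, user_lon):
        print(f"{name}: {haversine_km(user_lat, user_lon, lat, lon):.1f} km")
```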

Now, it looks like the code is spreading from Chicago to Philadelphia as well, according to a tweet from Philly’s chief data officer, Mark Headd.

If you still need a shot, these maps can help you learn where to find one. In the meantime, take care to keep healthy by frequently washing your hands and protecting others if you do fall ill by covering your mouth when you cough.

Open Government Partnership hosts regional meeting in Chile

The Open Government Partnership (OGP) has released statistics on its first 16 months since its historic launch in New York City, collected in the infographic embedded below. This week, open government leaders are meeting in Chile to discuss the formal addition of Argentina to the partnership and the national plans that Latin American countries have pledged to implement. Álvaro Ramirez Alujas, founder of the Group of Investigation in Government, Administration and Public Policy (GIGAPP), assisted OGP with an analysis of these action plans. Alujas found that:

  • 46% are linked to commitments on public integrity
  • 27% are related to the improvement of public services
  • 14% are linked to more effectively managing public resources and
  • 12% are related to increasing accountability and corporate responsibility.

Gobierno abierto

The infographic is also available en Español:

Accountability for accountability

As I noted in my assessment of 2012 trends for Radar, last year the Economist’s assessment was that open government grew globally in scope and clout.

As we head into 2013, it’s worth reiterating a point I made last summer in a post on oversight of the Open Government Partnership:

There will be inevitable diplomatic challenges for OGP, from South Africa’s proposed secrecy law to Russia’s membership. Given that context, all of the stakeholders in the Open Government Partnership — from the government co-chairs in Brazil and the United Kingdom to the leaders of participating countries to the members of civil society that have been given a seat at the table — will need to keep pressure on other stakeholders if significant progress is going to be made on all of these fronts.

If OGP is to be judged as more than a PR opportunity for politicians and diplomats to make bold framing statements, government and civil society leaders will need to do more to hold countries accountable to the commitments required for participation: they must submit action plans after a bona fide public consultation. Moreover, they’ll need to define the metrics by which progress should be judged and be clear with citizens about the timelines for change.

African News Challenge funds data journalism and open government tech

The post-industrial future of journalism is already here. It’s just not evenly distributed yet. The same trends changing journalism and society have the potential to create significant social change throughout the African continent, as states move from conditions of information scarcity to abundance.

That reality was clear on my recent trip to Africa, where I had the opportunity to interview Justin Arenstein at length during my visit to Zanzibar. Arenstein is building the capacity of African media to practice data-driven journalism, a task that has taken on new importance amid the digital disruption that has permanently altered how we discover, read, share and participate in news.

One of the primary ways he’s been able to build that capacity is through the African News Innovation Challenge (ANIC), a variation on the Knight News Challenge in the United States.

The 2011 Knight News Challenge winners illustrated data’s ascendance in media and government, with platforms for data journalism and civic connections dominating the field.

As I wrote last September, the projects that the Knight Foundation has chosen to fund over the last two years are notable examples of working on stuff that matters: they represent collective investments in digital civic infrastructure.

The first winners of the African News Innovation Challenge, which concluded this winter, look set to extend that investment throughout the continent of Africa.

“Africa’s media face some serious challenges, and each of our winners tries to solve a real-world problem that journalists are grappling with. This includes the public’s growing concern about the manipulation and accuracy of online content, plus concerns around the security of communications and of whistleblowers or journalistic sources,” wrote Arenstein on the News Challenge blog.

While the twenty 2012 winners include investigative journalism tools and whistleblower security, there’s also a focus on citizen engagement, digitization and making public data actionable. To put it another way, the “news innovation” that’s being funded on both continents isn’t just gathering and disseminating information: it’s now generating data and putting it to work in the service of the needs of residents or the benefit of society.

“The other major theme evident in many of the 500 entries to ANIC is the realisation that the media needs better ways to engage with audiences,” wrote Arenstein. “Many of our winners try to tackle this, with projects ranging from mobile apps to mobilise citizens against corruption, to improved infographics to better explain complex issues, to completely new platforms for beaming content into buses and taxis, or even using drone aircraft to get cameras to isolated communities.”

In the first half of our interview, published last year at Radar, Arenstein talked about Hacks/Hackers and expanding the capacity of data journalism. In the second half, below, we talk about his work at the African Media Initiative (AMI), the role of open source in civic media, and how an unconference model for convening people is relevant to innovation.

What have you accomplished at the AMI to date?

Justin Arenstein: The AMI has been going on for just over three years. It’s a fairly young organization, and I’ve been embedded now for about 18 months. The major deliverables and the major successes so far have been:

  1. A $1 million African News Innovation Challenge, which was modeled fairly closely on the Knight News Challenge, but with a different set of intended outputs.
  2. A network of Hacks/Hackers chapters across the continent.
  3. A number of technology support or technology development initiatives. Little pilot projects, invariably newsroom-based.

The idea is that we test ideas that are allowed to fail. We fund them in newsrooms and they’re driven by newsrooms. We match them up with technologists. We try and lower the barrier for companies to start experimenting and try and minimize risk as much as possible for them. We’ve launched a couple of slightly larger funds for helping to scale some of these ideas. We’ve just started work on a social venture or a VC fund as well.

You mentioned different outputs in the News Challenge. What does that mean?

Justin Arenstein: Africa hasn’t had the five-year kind of evolutionary growth that the Knight News Challenge has had in the U.S. What the News Challenge has done in the U.S. is effectively grown an ecosystem where newsrooms started to grapple with and accepted the reality that they have to innovate. They have to experiment. Digital is core to the way that they’re not only pushing news out but to the way that they produce it and the way that they process it.

We haven’t had any of that evolution yet in Africa. When you think about digital news in African media, they think you’re speaking about social media or a website. We’re almost right back at where the News Challenge started originally. At the moment, what we’re trying to do is raise sensitivity to the fact that there are far more efficient ways of gathering, ingesting, processing and then publishing digital content — and building tools that are specifically suited for the African environment.

There are bandwidth issues. There are issues around literacy, language use and also, in some cases, very different traditions of producing news. The output of what would be considered news in Africa might not be considered news product in some Western markets. We’re trying to develop products to deal with those gaps in the ecosystem.

What were the most promising News Challenge entrants that actually relate to those outputs?

Justin Arenstein: Some of the projects that we thought were particularly strong or apt amongst the African News Challenge finalists included more efficient or more integrated ways to manage workflow. If you look at many of the workflow software suites in the north, they’re, by African standards, completely unaffordable. As a result, there hasn’t been any systemic way that media down here produced news, which means that there’s virtually no way that they are storing and managing content for repackaging and for multi-platform publishing.

We’re looking at ways of not reinventing a CMS [content management system], but actually managing and streamlining workflow from ingesting reporting all the way to publishing.

Some of the biggest blogs in the world are running on WordPress for a CMS. Why not use that where needed?

Justin Arenstein: I think I may have misspoken by saying “content management systems.” I’m referring to managing, gathering and storing old news, the production and the writing of new content, a three or four phase editing process, and then publishing across multiple platforms. Ingesting creative design, layout, and making packages into podcasting or radio formats, and then publishing into things like Drupal or WordPress.

There have been attempts to take existing CMS systems like Drupal and turn them into broader, more ambitious workflow management tools. We haven’t seen very many successful ones. A lot of the kinds of media that we work with are effectively offline media, so these have been very lightweight applications.

The one thing that we have focused on is trying to “future-proof” it, to some extent, by building a lot of meta tagging and data management tools into these new products. That’s because we’re also trying to position a lot of the media partners we’re working with to be able to think about their businesses as data or content-driven businesses, as opposed to producing newspapers or manufacturing businesses. This seems to be working well in some early pilots we’ve been doing in Kenya.
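
As a rough illustration of that meta-tagging idea, here is a sketch of a content record carrying enough metadata to be repackaged across platforms. Every field name is hypothetical and not drawn from any AMI pilot.

```python
# Hypothetical meta-tagged content record for multi-platform publishing.
from dataclasses import dataclass, field

@dataclass
class Story:
    slug: str
    headline: str
    body: str
    language: str = "en"
    tags: list = field(default_factory=list)     # topics, for search and reuse
    targets: list = field(default_factory=list)  # e.g. ["web", "radio", "sms"]
    status: str = "draft"                        # draft -> edited -> published

story = Story(
    slug="county-budget-2013",
    headline="County budget doubles clinic funding",
    body="...",
    tags=["health", "budget"],
    targets=["web", "sms"],
)
print(story)
```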

What were your takeaways from the Tech Camp? Was a hybrid unconference a good model for the News Challenge?

Justin Arenstein: A big goal that we think we’ve achieved was to try and build a community of use. We put people together. We deliberately took them to an exotic location, far away from a town or location, where they’re effectively held hostage in a hotel. We built in as much free time as possible, with many opportunities to socialize, so that they start creating bonds. Right from the beginning, we did a “speed dating” kind of thing. There were very few presentations — in fact, there was only one PowerPoint in five days. The rest of the time, it’s actually the participants teaching each other.

We brought in some additional technology experts or facilitators, but they were handpicked largely from previous challenges to share the experience of going through a similar process and to point people to existing resources that they might not be aware of. That seems to have worked very well.

On the sidelines of the Tech Camp, we’ve seen additional collaborations happen for which people are not asking for funding. It just makes logical sense. We’ve already seen some of the initial fruits of that: three of the applicants actually partnered and merged their applications. We’ve seen a workflow editorial CMS project partner up with an ad booking and production management system, to create a more holistic suite. They’re still building as two separate teams, but they’re now sharing standards and they’re building them as modular products that could be sold as a broader product suite.

The Knight News Challenge has stimulated the creation of many open source tools. Is any of that code being re-used?

Justin Arenstein: We’ve tried to tap into quite a few of them. Some of the more recent tools are transferable. I think there was a grand realization that people weren’t able to deliver on their promises — and where they did deliver on tools, there wasn’t documentation. The code was quite messy. The tools weren’t really robust. Often, applications were written for specific local markets or data requirements that didn’t transfer. You effectively had to rebuild them. We have been able to re-purpose DocumentCloud and some other tools.

I think we’ve learned from that process. What we’re trying to do with our News Challenge is to workshop finalists quite aggressively before they put in their final proposals.

Firstly, make sure that they’re being realistic, that they’re not unnecessarily building components, or wasting money and energy on building components for their project that are not unique, not revolutionary or innovative. They should try and almost “plug and play” with what already exists in the ecosystem, and then concentrate on building the new extensions, the real kind of innovations. We’re trying to improve on the Knight model.

Secondly, once the grantees actually get money, it comes in a tranche format so they agree to an implementation plan. They get cash, in fairly small grants by Knight standards. The maximum is $100,000. In addition, they get engineering or programming support from external developers that are on our payroll, working out of our labs. We’ve got a civic lab running out of Kenya and partners, such as Google.

Thirdly, they get business mentorship support from some leading commercial business consultants. These aren’t nonprofit types. These are people who are already advising some of the largest media companies in the world.

The idea is that, through that process, we’re hopefully going to arrive at a more realistic set of projects that have either sustainable revenue models and scaling plans, from the beginning, or built-in mechanisms for assessments, reporting back and learning, if they’re designed purely as experiments.

We’re not certain if it’s going to work. It’s an experiment. On the basis of the Tech Camp that we’ve gone through, it seems to have worked very well. We’ve seen people abandon what we thought were overly ambitious technology plans and instead match up or partner with existing technologists. They will still achieve their goals but do so in a more streamlined, agile manner by re-purposing existing tech.

Editor’s Note: This interview is part of an ongoing series at the O’Reilly Radar on the people, tools and techniques driving data journalism.

PollWatchUSA enables anyone with a smartphone to act as a poll monitor

Pollwatch, a mobile application that enables crowdsourced poll monitoring, has launched its final version at pollwatch.us, just in time for Election Day 2012. The initial iteration of the app was conceived, developed and demonstrated at the hackathon at the 2012 Personal Democracy Forum in New York City. The app aggregates reports and visualizes the user-generated data at pollwatchusa.org/viz.

(Image: Pollwatch iPhone app.)

The app is the result of a collaboration between the PollWatch team, which includes Reboot and WebSava, and Common Cause/NY, along with input from TurboVote. The project also received support from the Voting Information Project and Latino Justice.

“Election Day is often hampered by inefficiency and confusion, leaving voters with little recourse. PollWatchUSA was conceived to help voters report problems in real time, by putting the tool in the palm of their hands. Through crowd sourcing, Common Cause/NY hopes to collect a broad data set to better identify the issues and help create a more effective elections administration system,” said Susan Lerner, Executive Director of Common Cause/NY, in a prepared statement.

The data for polling locations is coming from the Voting Information Project, which has acted as civic infrastructure for a number of efforts this year.
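
For developers, VIP data can also be queried programmatically. One common route today is the Google Civic Information API, sketched below; the endpoint reflects the current v2 API rather than whatever PollWatch used in 2012, and the key is a placeholder.

```python
# Sketch: look up polling locations for an address via the Google Civic
# Information API (v2). Requires an API key from Google.
import requests

def polling_places(address, api_key):
    resp = requests.get(
        "https://www.googleapis.com/civicinfo/v2/voterinfo",
        params={"address": address, "key": api_key},
        timeout=30,
    )
    resp.raise_for_status()
    # "pollingLocations" may be absent outside an active election window.
    return resp.json().get("pollingLocations", [])

if __name__ == "__main__":
    for place in polling_places("350 5th Ave, New York, NY", "YOUR_API_KEY"):
        print(place.get("address"))
```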

“Susan Lerner, our project co-sponsor at Common Cause, was instrumental in making sure the New York polling sites were included in that dataset (with much nudging and cajoling to the Board of Elections),” emailed Jeremy Canfield, service designer at Reboot.

Canfield explained that the project went through three iterations since June.

“We tested it out with users in two primaries, plus got some help from one of Union Square Ventures Product Feedback days,” he wrote. “We used that feedback to simplify the flow, making it as easy as possible for users to report on their voting experience. By making it easy and lightweight to report, plus sharing those reports widely, we can get better data to election advocates (chief among them, Common Cause), who can provide immediate help or work with the various boards of elections to make real time adjustments.”

Notably, Pollwatch is made to work on any smartphone, not just a single platform. The team chose to develop a mobile website, not a native app, avoiding the “shiny app syndrome” that has been problematic for some local governments. Well done, all.

To use social media in a time of need, start building networks before disasters

As is the case with every major event in the U.S., social media was part of the fabric of communications during Hurricane Sandy. Twitter was a window into what was happening in real time. Facebook gave families and friends a way to stay in touch about safety or power. And government officials and employees, from first responders and mayors to governors and the President of the United States, put critical information into the hands of citizens who needed it.

While Hurricane Sandy cemented the utility of these networks, neither they nor their role are new. With all due respect to Gartner analyst Andrea Di Maio, his notion that people aren’t conveying “useful information” there every day — that it’s just “chatting about sport results, or favorite actors, or how to bake” — is like some weird flashback to a 2007 blog post or an ignorant cable news anchor.

Public sector officials, first responders and emergency managers have recognized the utility of social media reports as a means for situational awareness before, during and after natural or man-made disasters for years now, and they have integrated these tools into crisis response.

Officials at the local, state and federal levels have confirmed to me again and again that it’s critical to build trusted networks *before* disaster strikes, so that when crises occur, the quality of intelligence is improved and existing relationships with influencers can amplify their messages.

Media and civil society serve as infomediaries and critical filters (aka B.S. detectors) for vetting information, something that has proved crucial as fake reports and pictures pop up. Official government accounts play a critical role in putting trusted information into the networks to share, something we saw in real time up and down the East Coast this week.

To be frank, Di Maio’s advice that authorities shouldn’t incorporate social media into their normal course of business is precisely the opposite of the experience on the ground of organizations like the Los Angeles Fire Department, Red Cross or FEMA. Here’s Brian Humphrey, public information officer of the LAFD, on best practices for social media:

If public safety officials come across Di Maio’s advice, I hope they’ll choose instead to listen to citizens every day and look to scale the best practices of their peers for using technology for emergency response, not start during a crisis.

TechCrunch’s “CrunchGov” grades Congress on tech, pilots legislative crowdsourcing platform

In general, connecting more citizens with their legislators and creating more resources for Congress to understand where constituents and the tech community stand on proposed legislation is a good thing. Last year’s congressional hearings on the Stop Online Piracy Act and the PROTECT IP Act made it pretty darn clear that many technologists felt it was no longer OK not to know how the Internet works. Conversely, if the tech world cares about what happens in DC, it’s no longer OK not to know how Congress works.

In that context, the launch of a policy platform by one of the biggest tech blogs on the planet could definitely be a positive development. TechCrunch contributor Greg Ferenstein writes that the effort is aimed at “helping policymakers become better listeners, and technologists to be more effective citizens.”

The problem with the initial set of tools is that they’re an incomplete picture of what’s online, at best. CrunchGov won’t satisfy the needs of tech journalists, staffers or analysts, who need deeper dives into expert opinion, policy briefings and data. (Public Knowledge, the Center for Democracy and Technology, OpenSecrets.org, the Sunlight Foundation, and the Electronic Frontier Foundation already offer those resources.)

Will “grading” Members of the House of Representatives on TechCrunch’s new Congressional leaderboard lead to them being better listeners? Color me, well, unconvinced. Will an “F” from TechCrunch result in Reps. Smith, Grassley, or Blackburn changing the bills they introduce, support or vote for or against?

Hard to know. True, it’s the sort of symbol that a political opponent could use in an election — but if Reddit’s community couldn’t defeat SOPA’s chief sponsor in a primary, will a bad grade do it? Ferenstein says the leaderboard provides “a quantified opinion” of the alignment of representatives with the consensus of the tech industry.

Update: As reported by Adrian Jeffries at The Verge, this quantified opinion is based upon TechCrunch editorial and “data and guidance from four tech lobbies”:

Engine Advocacy, which represents startups; TechNet, which represents CEOs in areas from finance and ecommerce to biotech and clean tech; the Silicon Valley Leadership Group, which represents major Silicon Valley employers; and the powerhouse conglomerate The Internet Association, which represents Amazon, Google, and Facebook, among others.

Ferenstein told Hamish McKenzie at PandoDaily that “We’re saying this is generally the view of many people who read our site.” If that’s the case, it would be useful to transparently see the data that shows how TechCrunch readers feel about proposed or passed bills — much in the same way that POPVOX or OpenCongress allow users to express support or opposition to legislation. At the moment, readers are stuck taking their word for it.

McKenzie also highlighted some problems with the rankings and the proposition of rankings themselves:

On three major issues – net neutrality, privacy, and cyber security – TechCrunch’s surveys found no consensus, which somewhat undermines the leaderboard rankings. After all, those rankings appear to be based mainly on three data points: a Congressperson’s position on SOPA, and his or her votes on the Jumpstart Our Business Startups Act and the Fairness for High-Skilled Immigrants Act. It might be true that CrunchGov takes a data-driven approach to its rankings, but when three data points out of a possible set of six are omitted, it’s fair to question just how useful the measure is.

As much as anything else, that speaks to the complicated definition of “those in the technology industry.” The industry is so broad and varied, from solo developers creating social games in their basements to hardware executives wanting to drive profits on their devices, that trying to establish consensus on political issues across a broad section of a relatively amorphous community is probably an impossible task. It also overemphasizes tech issues among the myriad of policy concerns that people working in the industry hold, some of which might seem tangential but are actually inextricably tied to the industry. What of climate change? What of taxes? What of puppies?

Also, applying grades to legislators puts TechCrunch in the same camp as the NRA, Americans For Tax Reform, and the Sierra Club in terms of assessing representatives based on narrow, and politically loaded, interests. It’s a headline-oriented approach that provides low-information people with a low-information look at a process and system that is actually very complicated.
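
McKenzie’s “three data points out of a possible six” objection is easy to make concrete: when half the issues have no consensus, a perfect grade can rest on very little evidence. Below is a toy scoring sketch with invented issues, positions and consensus values; it is not CrunchGov’s actual methodology.

```python
# Toy legislator grade: fraction of known positions matching an assumed
# industry consensus. None marks issues with no consensus or no vote.
CONSENSUS = {
    "SOPA": "no",
    "JOBS Act": "yes",
    "High-Skilled Immigrants": "yes",
    "net neutrality": None,   # no consensus found
    "privacy": None,
    "cybersecurity": None,
}

def grade(positions):
    """Score only issues with both a consensus and a recorded position."""
    scored = [(i, v) for i, v in positions.items()
              if CONSENSUS.get(i) is not None and v is not None]
    if not scored:
        return None
    matches = sum(1 for i, v in scored if v == CONSENSUS[i])
    return matches / len(scored)

# A "perfect" 1.0 here rests on only three of six issues.
print(grade({"SOPA": "no", "JOBS Act": "yes", "High-Skilled Immigrants": "yes"}))
```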

More effective citizenship through the Internet?

I’m not convinced these limited bill summaries or the leaderboard will help “technologists” become “more effective citizens,” though I plan to keep an open mind: this new policy platform is in beta, from the copy to the design to the number of bills in the legislative database and the data around them.

Helping readers to be “more effective” citizens is a bigger challenge than educating them about how legislators are graded on tech-related bills. The scope of that includes knowing who your representative and senators are, where they stand on issues, what bills have been introduced or are up for a vote, and how they voted; the new Congress.gov will connect you to many of those needs at the federal level. It might also mean following the money, communicating your support or opposition to your elected officials, registering to vote, and participating in the democratic processes of state and local government, including schools. Oh, and voting: tens of millions of American citizens will head to the polls in under two weeks.

To be fair, CrunchGov does do some of these things, linking out to the existing open government ecosystem online. Clicking “more info” shows positions representatives have taken on the tech issues CrunchGov’s editors have determined the industry has a “consensus” around, including votes, and links to their profiles on OpenCongress and Influence Explorer. Bill summaries link to maplight.org.

When it comes to the initial set of issues in the legislative database, there’s an overly heavy editorial thumb on the scale of what’s deemed important to the tech community.

For one, “cybersecurity” is a poor choice for a Silicon Valley blog. It’s a Washington word, used often in the context of national defense and wars, accompanied by fears of a “cyber Pearl Harbor.” Network security, mobile device security or Web application security are all more specific issues, and ones that startups and huge enterprises all have to deal with in their operations. The security experts I trust see Capitol Hill rhetoric taking aim at the wrong cybersecurity threats.

CrunchGov has only one bill selection for the issue — the Cyber Intelligence Sharing and Protection Act (CISPA) (H.R. 3523). The summary explains that CISPA proposes more information sharing, includes a pie chart showing that “tech-friendly legislators” are split 50/50 on it, lists endorsements and opposition, and links to three articles about the bill, including TechCrunch’s own coverage.

What’s left unclear? For one, that Rep. Darrell Issa (R-CA), an “A-lister” who TechCrunch writes “has received numerous awards and accolades from the industry,” supported CISPA. Or that organizations and advocates concerned about its implications for privacy and civil rights strongly opposed it. If you’re a technologist, legislator or citizen, honestly, you’re better off reading ProPublica’s explainer or the Center for Democracy and Technology’s CISPA resource page.

There are also framing choices that mean a number of bills aren’t listed — and that the Senate is left out entirely. Why? According to Ferenstein, “the ‘do-nothing’ Congress made it impossible to rank the Senate, because they didn’t pass enough bills related to technology policy.”

It’s true that the Senate hasn’t passed many bills — but the 51 laws that did go through the Senate in the 112th Congress include more tech policy issues than that statement might lead you to believe, from e-verify to online leak prevention. It has also moved laws that every citizen should know about, like the extension of the PATRIOT Act, given that its provisions affect the tech industry. (Yes, digital due process matters in the age of the cloud: your email isn’t as private as you might think it is.)

Putting a legislative crowdsourcing platform to re-use

Congressional leaderboard and limited legislative dashboard aside, CrunchGov is trying to crowdsource legislation using a local installation of MADISON, the software Congressman Issa’s office developed and rolled out last December during the first Congressional hackathon. MADISON was subsequently open sourced, which made the code available to TechCrunch.

It’s in this context that CrunchGov’s aspirations for technology to “democratize democracy itself” may be the most tested. The first test case will be a bill from Congressman Issa to reform government IT procurement. For this experiment to matter, the blog’s readership will need to participate, do so meaningfully, and see that their edits are given weight by bill authors in Washington. Rep. Issa’s office, which has distinguished itself in its use of the Internet to engage the public, may well do so. If proposals from the initial pilot aren’t put into bills, that may be the end of reader interest.

Will other Congressmen and staffers do the same, should their bills be posted? It’s hard to say. As with so many efforts to engage citizens online, this effort is in beta.

This post has been updated, including links to coverage from PandoDaily and The Verge.

A Twitter chat with @VotingInfo on voting, elections and tech

Today, I hosted a Twitter chat with the Voting Information Project (VIP). They partner with states to provide official election data that developers can use to create free, open source tools for voters.

I’ve embedded a Storify of our conversation below, along with a video explaining more about what they do. Of special note: VIP is partnering with Mobile Commons to let registered voters know where to vote. Just text “where” or “donde” to 877-877.