Open data

Beware openwashing. Question secrecy. Acknowledge ideology.

You could spend a long day listing all of the organizations and individuals putting government data online, from Carl Malamud to open government activists in Brazil, Africa or Canada. As many public conversations over the past few years have demonstrated, there are many different perspectives on what purposes “open data” should serve, often informed by what advocates intend or by an organization’s or institution’s goals. For those interested, I highly recommend the open data seminar and the associated comments.

When and if such data includes ratings or malpractice information about hospitals or doctors, or fees for insurance companies, transparency and accountability are important byproducts, which in turn have political implications. (Watch how unions or doctors’ groups react when performance or claims data goes online to see those conflicts.)

There are people who want to see legislatures open their data, to provide more insight into those processes, and others who want to see transit data or health data become more open, in the service of greater civic utility or patient empowerment.

Other people may support publishing more information about the business or performance of government because evidence of fraud, mismanagement or incompetence will support their arguments for shrinking the size of the state. A big tent for open government can mean that libertarians could end up supporting the same bills liberals do.

In the U.S., GovTrack.us has been making government legislative data open, despite the lack of bulk access to Thomas.gov, by “scraping.” There are many people, like the Sunlight Foundation, who wish to see campaign finance data open, to show where influence and power lie in the political system. There are many members of civil society, media organizations and startups that are collecting, sharing or using open data, from OpenCorporates to OpenCongress, to BrightScope or ProPublica.

Whether anyone chooses to describe those activities as a movement is up to them — but it is indisputable that three years ago, a neutral observer would have been hard-pressed to find an open government data platform. Now there are dozens at the national level. What matters more than their existence, however, is what goes onto them, and on that count people have to be extremely careful about giving governments credit just for putting a “portal” online.

While the raw number of open government data platforms around the globe looks set to continue to increase in 2013 at every level of government, advocates should be wary of governments claiming “open government” victories as a result.

Since Morozov sent out that tweet, he’s published a book with a chapter that extends that critique, along with a series of New York Times op-eds, reviews, Slate debates, and a 16,000-word essay in The Baffler that explores the career and thinking of Tim O’Reilly (my publisher). Morozov’s essay prompted Annalee Newitz to paraphrase and link to it in a post at io9, to which Tim responded in a comment.

While his style can distract and detract from his work — and his behavior on Twitter can be fairly characterized as contemptuous at times — the issues Morozov raises around technology and philosophy are important and deserve to be directly engaged by open government advocates, as John Wilbanks suggests.

That’s happening, slowly. Sunlight Foundation policy director John Wonderlich has also responded, quoting Morozov’s recommendations and reflecting on how they might map to specific uses of technology that support open government. Wilbanks himself has written one of the most effective (short) responses to date:

One of the reasons I do “open” work is that I think, in the sciences, it’s a philosophical approach that is more likely to lead to that epistemic transformation. If we have more data available about a scientific problem like climate change, or cancer, then the odds of the algorithms figuring something out that is “true” but incomprehensible to us humans go up. Sam Arbesman has written about this nicely, both in his book The Half-Life of Facts and in another recent Slate article.

I work for “open” not because “open” solves a specific scientific problem, but because it increases the overall probability of success in sensorism-driven science. Even if the odds of success themselves don’t change, increasing the sample size of attempts will increase the net number of successes. I have philosophical reasons for liking open as well, and those clearly cause me cognitive bias on the topic, but I deeply believe that the greatest value in open science is precisely the increased sample size of those looking.

I also tend to think there’s a truly, deeply political element to enabling access to knowledge and science. I don’t think it’s openwashing (and you should read this paper recommended by Morozov on the topic) to say that letting individuals read science can have a real political impact.

Morozov’s critique of “openwashing” isn’t specious, though it’s fair to question his depiction of the history of open source and free software, and the absence of balance in his consideration of various open government efforts. Civil society and the media must be extremely careful about giving governments credit for just putting a “portal” online.

On that count, Wonderlich wrote about the “missing open data policy,” which every government that has stood up or will stand up an open data platform could benefit from reading:

Most newly implemented open data policies, much like the Open Government Directive, are announced alongside a package of newly released datasets, and often new data portals, like Data.gov. In a sense, these pieces have become the standard parts of the government data transparency structure. There’s a policy that says data should generally be open and usefully released, a central site for accessing it, some set of new data, and perhaps a few apps that demonstrate the data’s value.

Unfortunately, this is not the anatomy of an open government. Instead, this is the anatomy of the popular open government data initiatives that are currently in favor. Governments have learned to say that data will be open, provide a place to find it, release some selected datasets, and point to its reuse.

This goes to the concerns of traditional advocates working for good government, as explored in an excellent research paper by Yu and Robinson on the ambiguity of “open government” and “open data,” along with the broader discussion in civil society in the lead-up to the Open Government Partnership, where this dynamic was the subject of much concern — and not just in the Canadian or United Kingdom context. Nathaniel Heller’s work exploring this dynamic at Global Integrity is instructive.

As I’ve written before (unrepentant self-plagiarism alert), standing up open data platforms and publishing datasets about services is not a replacement for a constitution that enforces the rule of law, free and fair elections, an effective judiciary, decent schools, basic regulatory bodies or civil society, particularly if the data does not relate to meaningful aspects of society.

Socrata, a venture capital-backed startup whose technology powers the open data platforms of several city, state and national governments, including those of Kenya and the United States, is also part of this ecosystem and indisputably has “skin in the game.”

That said, the insights that Kevin Merritt, the founder of Socrata, shared in a post on reinventing government are worth considering:

An Open Government strategy needs to include Open Data as a component of enabling transparency and engaging citizens. However, Open Government is also about a commitment to open public meetings; releasing public information in all its forms, if not proactively at least in a timely fashion; engaging the public in decision making; and it is also a general mindset, backed up by clear policy, that citizens need to be empowered with information and a voice so they can hold their government accountable.

At the same time, a good Open Data strategy should support Open Government goals by making structured data that relates to accountability and ethics, like spending data, contracts, staff salaries, elections, political contributions, program effectiveness and so on, available in machine- and human-readable formats.

The open data strategy advanced by the White House and 10 Downing Street has not embraced releasing all of those data types, although the Obama administration did follow through on the President’s promise to launch Ethics.gov.

The Obama administration has come under heavy criticism for the quality of its transparency efforts from watchdogs, political opponents and media. It’s fair to say that this White House has advanced an unprecedented effort to open up government information while it has a much more mixed record on transparency and accountability, particularly with respect to national security and a culture of secrecy around the surveillance state.

Open government advocates assert that the transparency that President Obama promised has not been delivered, as Charles Ornstein, a senior reporter at ProPublica, and Hagit Limor, president of the Society of Professional Journalists, wrote in the Washington Post. In fact, the current administration’s open data initiatives are one of the bright spots in its transparency record — and that’s in the context of real data quality and cultural issues that need to be addressed to match the rhetoric of the past four years.

“Government transparency is not the same as data that can be called via an API,” said Virginia Carlson, former president of the Metro Chicago Information Center. “I think the ‘New Tech’ world forgets that — open data is a political process first and foremost, and a technology problem second.”

If we look at what’s happening with open government in Chicago, a similar dynamic seems to have emerged: the city methodically works to release high-quality open data related to services, performance or lobbying but is more resistant to media organizations pushing for access to data about the Mayor’s negotiations or electronic communications, the traditional targets of open government advocacy. This tension was explored quite well in an article by WBEZ on the people behind Chicago’s government 2.0 efforts.

In the United States, there is a sizable group of people that believe that data created using public funds should in turn be made available to the public — and that the Internet is a highly effective place to make such data available. Such thinking extends to open access to research or public sector code, too.

As those policy decisions are implemented, asking hard questions about data quality, use, licenses, outcomes and cost is both important and useful, particularly given that motivations and context will differ from country to country and from industry to civil society.

Who benefits and how? What existing entities are affected? Should all public data be subject to FOIA? If so, under what timelines and conditions? Should commercial entities that create or derive economic value from data pay for bulk access? What about licensing? If government goes digital, how can the poor, disabled or technically illiterate be given access and voice as well? (Answers to some of these questions are in the Sunlight Foundation’s principles of open government data, which were based on the recommendations of an earlier working group.)

In the United Kingdom, there are also concerns that the current administration’s “open data agenda” obscures a push towards privatization of public services, a dynamic that should be more prominent in public debates and that Morozov recently explored in the opinion pages of the New York Times. My colleague Nat Torkington highlighted the need for a discussion about which services should be provided by government at Radar back in 2010:

Obama and his staff, coming from the investment mindset, are building a Gov 2.0 infrastructure that creates a space for economic opportunity, informed citizens, and wider involvement in decision making so the government better reflects the community’s will. Cameron and his staff, coming from a cost mindset, are building a Gov 2.0 infrastructure that suggests it will be more about turning government-provided services over to the private sector.

Whether one agrees with the side of the argument that supports investment or the side looking for cost savings — or both — is something that people in democratic societies will need to debate and decide for themselves, along with the size and role of government. The politics can’t be abstracted away.

I don’t think that many open government advocates are blind to the ideologies involved, including the goals of libertarians, nor do I think the “open dystopia” that Newitz described at io9 is a particularly likely outcome.

That said, given the stakes, these policies deserve to be the subject of debate in every nation whose leaders are putting them forward. We’ve never had better tools for debate, discussion and collective action. Let’s use them.

Civic app for finding flu shots goes viral

The 2012-2013 influenza season has been a bad one, with flu reaching epidemic levels in the United States. The data from Google Flu Trends, visualized below, tells the tale.

[Figure: Google Flu Trends data, January 17, 2013]

Given the risk, mayors and public health officials around the country are using new technologies to connect residents with health care, from social media to widgets to flu shot finder maps. This week, it looks like the code for a flu shot location application created in Chicago is doing what viruses do best: go viral in cities.

Sam Roudman at TechPresident:

“In the middle of what might be the worst flu season in a decade, Boston Mayor Thomas Menino declared a public health emergency — and civic hackers found a way to help the cause. With help from Code for America volunteers, the Boston Mayor’s Office of New Urban Mechanics was able to repurpose a Chicago app that maps free vaccination locations in little more than a day, just in time for a weekend vaccination campaign at 24 locations. The app’s journey from Chicago to Boston is a model of intra-civic partnership.”

Chris Whitaker explored the origins of the app at the Code for America blog today:

Originally developed in Chicago by Tom Kompare, the flu shot app helps users find nearby clinics offering free flu shots by entering their address or by using a GPS-enabled mobile device. It also allows users to get public-transit directions to those clinics at the click of a button.

Kompare started work on the app at the request of Chicago’s Department of Health, after representatives from the department dropped by Chicago’s OpenGov Hack Night during the Google API challenge presentation in October and asked about an easy way for citizens to find out where to get a free flu shot. Within weeks, Kompare’s app was built, adopted, and hosted on the Smart Chicago Collaborative’s servers.

…Hours after Boston’s Mayor Menino had declared a public health emergency, Boston’s Brigade Captain Harlan Weber reached out to me about the use of the flu shot app.… The app was launched and ready for the public less than 36 hours after the initial email was sent.
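
The heart of such an app is a nearest-location lookup: geocode the user’s address or read the device’s GPS fix, then rank clinic sites by distance. Here is a minimal sketch of that step in Python; the clinic list and field names are hypothetical illustrations, not taken from Kompare’s code.

```python
# A sketch of the nearest-clinic lookup at the core of a flu shot finder.
# The clinic records below are hypothetical; a real app would load them
# from a city's open data portal.
from math import asin, cos, radians, sin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometers."""
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 6371 * 2 * asin(sqrt(a))

def nearest_clinics(user_lat, user_lon, clinics, limit=3):
    """Return the clinics closest to the user's location."""
    return sorted(
        clinics,
        key=lambda c: haversine_km(user_lat, user_lon, c["lat"], c["lon"]),
    )[:limit]

clinics = [
    {"name": "Uptown Clinic", "lat": 41.966, "lon": -87.655},
    {"name": "South Side Clinic", "lat": 41.809, "lon": -87.617},
]
print(nearest_clinics(41.881, -87.623, clinics))  # a user near the Loop
```

Because the hard part is the data rather than the code, an app built this way ports to a new city as soon as that city publishes its own clinic locations, which is exactly what made the Chicago-to-Boston reuse so fast.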

Now, it looks like the code is spreading from Chicago to Philadelphia as well, according to a tweet from Philly’s chief data officer, Mark Headd.

If you still need a shot, these maps can help you learn where to find one. In the meantime, take care to stay healthy by washing your hands frequently, and protect others if you do fall ill by covering your mouth when you cough.

African News Challenge funds data journalism and open government tech

The post-industrial future of journalism is already here. It’s just not evenly distributed yet. The same trends changing journalism and society have the potential to create significant social change throughout the African continent, as states move from conditions of information scarcity to abundance.

That reality was clear on my recent trip to Africa, where I had the opportunity to interview Justin Arenstein at length during my visit to Zanzibar. Arenstein is building the capacity of African media to practice data-driven journalism, a task that has taken on new importance as digital disruption permanently alters how we discover, read, share and participate in news.

One of the primary ways he’s been able to build that capacity is through the African News Innovation Challenge (ANIC), a variant of the Knight News Challenge in the United States.

The 2011 Knight News Challenge winners illustrated data’s ascendance in media and government, with platforms for data journalism and civic connections dominating the field.

As I wrote last September, the projects that the Knight Foundation has chosen to fund over the last two years are notable examples of working on stuff that matters: they represent collective investments in digital civic infrastructure.

The first winners of the African News Innovation Challenge, which concluded this winter, look set to extend that investment throughout the continent of Africa.

“Africa’s media face some serious challenges, and each of our winners tries to solve a real-world problem that journalists are grappling with. This includes the public’s growing concern about the manipulation and accuracy of online content, plus concerns around the security of communications and of whistleblowers or journalistic sources,” wrote Arenstein on the News Challenge blog.

While the twenty 2012 winners include investigative journalism tools and whistleblower security, there’s also a focus on citizen engagement, digitization and making public data actionable. To put it another way, the “news innovation” that’s being funded on both continents isn’t just gathering and disseminating information: it’s now generating data and putting it to work in the service of the needs of residents or the benefit of society.

“The other major theme evident in many of the 500 entries to ANIC is the realisation that the media needs better ways to engage with audiences,” wrote Arenstein. “Many of our winners try to tackle this, with projects ranging from mobile apps to mobilise citizens against corruption, to improved infographics to better explain complex issues, to completely new platforms for beaming content into buses and taxis, or even using drone aircraft to get cameras to isolated communities.”

In the first half of our interview, published last year at Radar, Arenstein talked about Hacks/Hackers, and expanding the capacity of data journalism. In the second half, below, we talk about his work at African Media Initiative (AMI), the role of open source in civic media, and how an unconference model for convening people is relevant to innovation.

What have you accomplished at the AMI to date?

Justin Arenstein: The AMI has been going on for just over three years. It’s a fairly young organization, and I’ve been embedded now for about 18 months. The major deliverables and the major successes so far have been:

  1. A $1 million African News Innovation Challenge, which was modeled fairly closely on the Knight News Challenge, but with a different set of intended outputs.
  2. A network of Hacks/Hackers chapters across the continent.
  3. A number of technology support or technology development initiatives. Little pilot projects, invariably newsroom-based.

The idea is that we test ideas that are allowed to fail. We fund them in newsrooms and they’re driven by newsrooms. We match them up with technologists. We try and lower the barrier for companies to start experimenting and try and minimize risk as much as possible for them. We’ve launched a couple of slightly larger funds for helping to scale some of these ideas. We’ve just started work on a social venture or a VC fund as well.

You mentioned different outputs in the News Challenge. What does that mean?

Justin Arenstein: Africa hasn’t had the five-year kind of evolutionary growth that the Knight News Challenge has had in the U.S. What the News Challenge has done in the U.S. is effectively grown an ecosystem where newsrooms started to grapple with and accepted the reality that they have to innovate. They have to experiment. Digital is core to the way that they’re not only pushing news out but to the way that they produce it and the way that they process it.

We haven’t had any of that evolution yet in Africa. When you talk about digital news with African media, they think you’re speaking about social media or a website. We’re almost right back at where the News Challenge started originally. At the moment, what we’re trying to do is raise sensitivity to the fact that there are far more efficient ways of gathering, ingesting, processing and then publishing digital content — and building tools that are specifically suited for the African environment.

There are bandwidth issues. There are issues around literacy, language use and also, in some cases, very different traditions of producing news. The output of what would be considered news in Africa might not be considered news product in some Western markets. We’re trying to develop products to deal with those gaps in the ecosystem.

What were the most promising News Challenge entrants that actually relate to those outputs?

Justin Arenstein: Some of the projects that we thought were particularly strong or apt amongst the African News Challenge finalists included more efficient or more integrated ways to manage workflow. If you look at many of the workflow software suites in the north, they’re, by African standards, completely unaffordable. As a result, there hasn’t been any systemic way that media down here produced news, which means that there’s virtually no way that they are storing and managing content for repackaging and for multi-platform publishing.

We’re looking at ways of not reinventing a CMS [content management system], but actually managing and streamlining workflow from ingesting reporting all the way to publishing.

Some of the biggest blogs in the world are running on WordPress for a CMS. Why not use that where needed?

Justin Arenstein: I think I may have misspoken by saying “content management systems.” I’m referring to managing, gathering and storing old news, the production and the writing of new content, a three or four phase editing process, and then publishing across multiple platforms. Ingesting creative design, layout, and making packages into podcasting or radio formats, and then publishing into things like Drupal or WordPress.

There have been attempts to take existing CMS systems like Drupal and turn them into broader, more ambitious workflow management tools. We haven’t seen very many successful ones. A lot of the kinds of media that we work with are effectively offline media, so these have been very lightweight applications.

The one thing that we have focused on is trying to “future-proof” it, to some extent, by building a lot of meta tagging and data management tools into these new products. That’s because we’re also trying to position a lot of the media partners we’re working with to be able to think about their businesses as data or content-driven businesses, as opposed to producing newspapers or manufacturing businesses. This seems to be working well in some early pilots we’ve been doing in Kenya.

What were your takeaways from the Tech Camp? Was a hybrid unconference a good model for the News Challenge?

Justin Arenstein: A big goal that we think we’ve achieved was to try and build a community of use. We put people together. We deliberately took them to an exotic location, far away from a town or location, where they’re effectively held hostage in a hotel. We built in as much free time as possible, with many opportunities to socialize, so that they start creating bonds. Right from the beginning, we did a “speed dating” kind of thing. There’s been very few presentations — in fact, there was only one PowerPoint in five days. The rest of the time, it’s actually the participants teaching each other.

We brought in some additional technology experts or facilitators, but they were handpicked largely from previous challenges to share the experience of going through a similar process and to point people to existing resources that they might not be aware of. That seems to have worked very well.

On the sidelines of the Tech Camp, we’ve seen additional collaborations happen for which people are not asking for funding. It just makes logical sense. We’ve already seen some of the initial fruits of that: three of the applicants actually partnered and merged their applications. We’ve seen a workflow editorial CMS project partner up with an ad booking and production management system, to create a more holistic suite. They’re still building as two separate teams, but they’re now sharing standards and they’re building them as modular products that could be sold as a broader product suite.

The Knight News Challenge has stimulated the creation of many open source tools. Is any of that code being re-used?

Justin Arenstein: We’ve tried to tap into quite a few of them. Some of the more recent tools are transferable. I think there was a grand realization that people weren’t able to deliver on their promises — and where they did deliver on tools, there wasn’t documentation. The code was quite messy. The tools weren’t really robust. Often, applications were written for specific local markets or data requirements that didn’t transfer. You effectively had to rebuild them. We have been able to re-purpose DocumentCloud and some other tools.

I think we’ve learned from that process. What we’re trying to do with our News Challenge is to workshop finalists quite aggressively before they put in their final proposals.

Firstly, make sure that they’re being realistic, that they’re not unnecessarily building components, or wasting money and energy on building components for their project that are not unique, not revolutionary or innovative. They should try and almost “plug and play” with what already exists in the ecosystem, and then concentrate on building the new extensions, the real kind of innovations. We’re trying to improve on the Knight model.

Secondly, once the grantees actually get money, it comes in a tranche format so they agree to an implementation plan. They get cash, in fairly small grants by Knight standards. The maximum is $100,000. In addition, they get engineering or programming support from external developers that are on our payroll, working out of our labs. We’ve got a civic lab running out of Kenya and partners, such as Google.

Thirdly, they get business mentorship support from some leading commercial business consultants. These aren’t nonprofit types. These are people who are already advising some of the largest media companies in the world.

The idea is that, through that process, we’re hopefully going to arrive at a more realistic set of projects that have either sustainable revenue models and scaling plans, from the beginning, or built-in mechanisms for assessments, reporting back and learning, if they’re designed purely as experiments.

We’re not certain if it’s going to work. It’s an experiment. On the basis of the Tech Camp that we’ve gone through, it seems to have worked very well. We’ve seen people abandon what we thought were overly ambitious technology plans and instead match up or partner with existing technologists. They will still achieve their goals but do so in a more streamlined, agile manner by re-purposing existing tech.

Editor’s Note: This interview is part of an ongoing series at the O’Reilly Radar on the people, tools and techniques driving data journalism.

Beth Noveck: think about how to open up the API of government

Former White House deputy CTO for open government Beth Noveck recently gave a TED Talk on a “more open source government.” For more perspective in the vein of “open,” read CNN’s TED summary, “What if you could make anything you wanted?” Noveck’s talk is embedded below:

“…start by teaching young people that we live, not in a passive society, a read-only society, but in a writable society, where we have the power to change our communities, to change our institutions, that’s when we begin to really put ourselves on the pathway towards this open government innovation”

Are “Commons 2.0” and participatory urbanism hype or hope?

“…armed with low-cost phones and an Internet connection, people are using civic-minded apps like ForageCity to tackle everything from public safety to potholes. The question is whether the technology will have the long-term effect that some foresee, or whether the “commons 2.0” and “participatory urbanism” will become empty marketing slogans.”

-Angela Woodall, writing in the Oakland Tribune about a new mobile application from Oakland’s Youth Radio that is designed to help people redistribute extra fruit and vegetables to people in need.

[Image: ForageCity app]

[Image Credit: Susan Mernit]

Woodall asks good questions and, as it happens, posed them to me last week in a phone interview. (I’m quoted in the article.)

Here are a couple of thoughts that didn’t make it in. Mobile applications that civic developers are creating around the world — like ForageCity — are making it possible for more people to interact more easily, at less cost, wherever and whenever they wish. That does give more power to more people to connect to one another and solve problems, or at least discuss them.

The potential for such apps to connect and, crucially, to scale is particularly significant when there is a shared standard for the open government data that fuels them, as with the General Transit Feed Specification (GTFS), the transit data standard that now exists in 450 different cities. Around the U.S., cities are slowly working with one another to define more such standards — but it’s a complicated process that doesn’t happen overnight, or even over years.
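
GTFS illustrates why standards create network effects for civic apps: every participating city publishes the same files with the same schema, so a parser written once works against any city’s feed. A minimal sketch in Python, assuming a GTFS zip file downloaded from some transit agency:

```python
# A sketch of reading stop locations from a GTFS feed. stops.txt and its
# stop_id/stop_name/stop_lat/stop_lon columns are part of the GTFS spec;
# the filename "gtfs.zip" is just an assumed local download.
import csv
import io
import zipfile

def load_stops(gtfs_path):
    """Return (stop_id, stop_name, lat, lon) tuples from a GTFS feed."""
    with zipfile.ZipFile(gtfs_path) as feed:
        with feed.open("stops.txt") as f:
            reader = csv.DictReader(io.TextIOWrapper(f, encoding="utf-8-sig"))
            return [
                (row["stop_id"], row["stop_name"],
                 float(row["stop_lat"]), float(row["stop_lon"]))
                for row in reader
            ]

# The same call works unchanged for any of the 450 cities publishing GTFS.
stops = load_stops("gtfs.zip")
print(f"{len(stops)} stops loaded")
```

That portability, not any single app, is the payoff of a shared data standard.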

The question is whether the technology will have the long-term effect that Code for America founder Jen Pahlka described to Woodall. On that count, I tend to give Pahlka — and my publisher, Tim O’Reilly — the benefit of the doubt.

As I said to the reporter, the potential for civic apps is enormous — but the tools are only as good as the people who use them and adapt them. The tools can be quite good on their own — full stop — but many network effects will only take place with broad, mainstream adoption.

Smartphones can now be used for finding shelter, improving medical care and documenting riots — but the same devices are also used for gaming, pornography, celebrity gossip and shopping. While the apps used to find city services are generally not the ones used to surveil citizens, in practice the mobile device itself may be an agent of both actions.

Working out how to both protect the rights of citizens and empower them using mobile devices will be a difficult and crucial task in the years ahead.

It’s not immediately clear, at least to this observer, that state governments, Congress, regulators and law enforcement are up to the challenge, but it’s hard not to hope that they rise to it.

Jay Nath on how San Francisco is working to get its Gov 2.0 groove back

Back in January, Govfresh founder Luke Fretwell wrote about how San Francisco can “get its Gov 2.0 groove back,” offering the city government six recommendations for using technology better.

[Image Credit: Fog City Journal]

When asked for comment, San Francisco chief innovation officer Jay Nath (@Jay_Nath) responded to Fretwell’s suggestions via email. While I’ll be sharing more from Nath and SF CIO Jon Walton over at the O’Reilly Radar civic innovation channel, in the meantime I’m publishing his specific responses to those recommendations below.

Build the best mayoral website in the world

Nath: We can always improve how we communicate with our constituents. If we were to undertake an effort to redesign the Mayor’s site, we should take a holistic approach and not just focus on the Mayor’s site. The approach NYC took to invite their design community is one that I think is very smart and something that SF should consider.

Use “Built in SF” technology

Nath: We agree and launched our City Hall iZone concept, where we pilot great local technologies and services. We frequently meet with great companies like Square, Twitter, Uber and Yammer and invite each of them to work with the City. Specifically, we’re actively exploring Yammer, Zendesk, Get Satisfaction, Cozybit and 802.11s mesh, Google+ hangouts, and others. Additionally, we’re already using local tech like WordPress (which powers our innovation site), Twitter via the Open311 API, and Instagram.
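
For readers unfamiliar with it, Open311 is the open GeoReport standard that lets third-party clients, Twitter integrations included, read and file non-emergency service requests against a city’s 311 system. A brief sketch of reading requests from such an endpoint; the API root below is illustrative, not San Francisco’s actual deployment:

```python
# A sketch of querying an Open311 GeoReport v2 endpoint for service
# requests. The API root is an assumed example; real endpoints vary by city.
import json
import urllib.request

API_ROOT = "https://open311.example.gov/v2"  # illustrative endpoint

def recent_requests(service_code=None):
    """Fetch recent 311 service requests, optionally filtered by type."""
    url = f"{API_ROOT}/requests.json"
    if service_code:
        url += f"?service_code={service_code}"
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)

for request in recent_requests():
    print(request["service_name"], request["status"])
```

Because the spec is shared across cities, a client written against one Open311 deployment can, in principle, talk to any other.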

Go back to the (data) fundamentals

Nath: We have an open data roadmap to strengthen our leadership in this area. It’s in our 2012 innovation portfolio as well. Our goal is to structurally change how we share data so that our default position is one of sharing. One idea is to require that all purchased software that stores structured data have a public API. As we secure staffing for this effort, we will invite the community to help us shape the final form and execute.

Leverage the civic surplus

Nath: I would argue that we’ve done a great job in this area. Last summer, we partnered with the Gray Area Foundation for the Arts (GAFFTA) to produce the “Summer of Smart.” This series of hackathons produced over 20 prototypes, drew 500 participants, and generated 10,000 hours of civic engagement. We’ve continued our efforts this year with the City’s first unhackathon, around taxi dispatch and real-time mass communication. Our Mayor and transit director both attended the event and thanked our community for their efforts to make SF a better city.

Additionally, we launched our citizen engagement platform, ImproveSF, in a very big way in April.

Open source the infrastructure

Nath: While we can do more to increase open source software adoption, I want to recognize our efforts to date:

  • open source policy
  • SFPark Android/iPhone app
  • Enterprise Addressing System
  • SmartPDF
  • LAMP as an option for internal customers
  • Pligg (DataSF)
  • Several Drupal applications

Additionally, the idea of moving our City from the existing CMS (Vision) to WordPress is not just about open source technology. We, as a City, made the decision to utilize Vision CMS a couple of years ago, and the switching costs of migrating to WordPress currently outweigh the benefits. I will encourage the City to strongly consider WordPress, Drupal and other options when Vision no longer meets our needs.

Give citizens a dashboard

Nath: This is more than just adopting the IT Dashboard. We have to implement the governance and project management model to ensure that the data is accurate. This is something we need to do but requires time and culture change. I agree that we need to increase access to high value datasets like expenditures. This is part of our open data roadmap and will receive renewed focus in 2012.

SahelResponse.org showcases the power of open data and neocartography

On January 9th, I wondered whether 2012 would be “the year of the open map.” I started reporting on digital maps made with powerful new software and open data last winter, in the context of open government.

In the months since, I’ve seen many more maps emerge from the work of data journalists and government, including a beautiful one made with TileMill and open data from aid agencies at SahelResponse.org. You can explore the map in the embed below:

Nate Smith, who works at Development Seed, the makers of MapBox and TileMill, blogged about SahelResponse.org at PBS MediaShift.

To bring key aid agencies together and help drive international response, the SahelResponse.org data-sharing initiative maps information about the ongoing food crisis in the Sahel region of West Africa. More than 18 million people across the Sahel are at risk and in need of food assistance in the coming months, according to the United Nations. Recent drought, population movements, and conflict have created a rapidly changing emergency situation. As in any crisis, multiple agencies need to respond and ramp up their coordination, and access to data is critical for effective collaboration. In a large region like the Sahel, the band of mostly arid land below the Sahara Desert stretching across the continent, effective coordination and collaboration are critical for responding effectively.

Thanks to new technologies like TileMill, and an increased adoption of open data, it was possible to put all the key data about the crisis — from relief access routes to drought conditions and population movements — in one place, openly available and mapped to give it further context.

More than half a year later, in other words, I think the prediction that 2012 would be the year of the open map is being borne out. The adoption of OpenStreetMap by Foursquare was a notable data point, as was StreetEasy moving to OpenStreetMap from Google Maps. In response to the challenge, Google slashed its price for using the Google Maps API by 88%. In an ideal world, the new competition will result in better products and more informed citizens.

The City of Quebec launched an open data site

Up in the currently not-so-frozen north, the City of Quebec has stood up an open data directory online. There are currently 26 datasets listed, spanning a variety of data formats, from .CSV to .XML to .XLS to .KML to .SHP. (The latter two are GIS files, of interest to folks who like to make maps.)
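
For those map-minded folks, here is a short sketch of how one of the GIS files might be read. It assumes the fiona library and a hypothetical filename standing in for whatever the directory actually serves:

```python
# A sketch of inspecting a shapefile (.SHP) from an open data directory.
# Requires fiona (pip install fiona); the filename is hypothetical.
import fiona

with fiona.open("arrondissements.shp") as source:
    print(source.crs)    # the coordinate reference system of the layer
    for feature in source:
        # each record pairs a geometry with an attribute table row
        print(feature["properties"])
```

From there, the shapes can be styled in a tool like TileMill or converted to GeoJSON for web maps.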

The city published the video embedded below last night, in addition to a “démarche” (or statement) on the open data website about the project.

ExpoTI-GVQ – Projets étudiants, CÉGEP Limoilou from E-Gouv Québec on Vimeo.

Hat tip @Data_BC

UPDATE: As Richard Ackerman pointed out on Twitter, this open data site went live in February. While the video is new, the site is not.

MIT Civic Media conference examines the successes and failures of open government in the U.S.

The 2012 Civic Media Conference featured two full days of conversations about (what else?) the future of civic media and democracy. One conversation is particularly worth calling out and sharing with the Govfresh audience: a panel assessing what’s gone wrong and what’s gone right with open government in the United States over the past three years. The discussion was moderated by Susan Crawford, currently of the Harvard Law School and Kennedy School (and formerly a special advisor at the White House) and featured Mike Norman of Wefunder.com, Mark Headd of Code for America and Chris Vein, Deputy United States Chief Technology Officer for Government Innovation in the White House Office of Science and Technology Policy. I’ve embedded the video below:

[Video: live stream of the panel from knightfoundation at livestream.com]

You can read an excellent, comprehensive liveblog of the open gov panel at the Civic Media blog.

What is smart government?

Last month, I traveled to Moldova to speak at a “smart society” summit hosted by the Moldovan national e-government center and the World Bank. I talked about what I’ve been seeing and reporting on around the world and some broad principles for “smart government.” It was one of the first keynote talks I’ve ever given and, from what I gather, it went well: the Moldovan government asked me to give a reprise to their cabinet and prime minister the next day.

I’ve embedded the entirety of the morning session above, including my talk (which is about half an hour long). I was preceded by Professor Beth Noveck, the former deputy CTO for open government at the White House. If you watch the entire program, you’ll hear from:

  • Victor Bodiu, General Secretary, Government of the Republic of Moldova, National Coordinator, Governance e-Transformation Agenda
  • Dona Scola, Deputy Minister, Ministry of Information Technology and Communication
  • Andrew Stott, UK Transparency Board, former UK Government Director for Transparency and Digital Engagement
  • Arcadie Barbarosie, Executive Director, Institute of Public Policy, Moldova

Without planning on it, I managed to deliver a one-liner that morning that’s worth rephrasing and reiterating here: Smart government should not just serve citizens with smartphones.

I look forward to your thoughts and comments, for those of you who make it through the whole keynote.