San Francisco experiments with citizensourcing better ideas

As significant as the revisions to San Francisco’s open data policy may prove to be, city officials and civic startups alike emphasize that it’s the city’s people who are fundamental to sustained improvements in governance and city life.

“Open data would not exist without our community,” said Jay Nath, the city’s first chief innovation officer, this Monday at the Hatchery.

San Francisco’s approach to open innovation in the public sector — what businesses might describe as crowdsourcing, you might think of as citizensourcing for cities — involves a digital mix of hackathons, public engagement and a renewed focus on the city’s dynamic tech community, including the San Francisco Citizens Initiative for Technology and Innovation, or SF.citi.

Cities have been asking their residents how government could work better for some time, of course — and residents have been telling city governments how they could work better for much longer than that. New technologies, however, have created new horizons for participatory platforms to engage citizens, including mobile apps and social media.

Open data and civic coders also represent a “new class of civic engagement focused on solving issues, not just sharing problems,” argues Nath. “We have dozens and dozens of apps in San Francisco. I think it’s such a rich community. We haven’t awarded prizes. It’s really about sustainability and creating community. We’ve held six or seven events and more than 10,000 hours of civic engagement.”

San Francisco’s dedicated citizensourcing platform is called “ImproveSF.” The initiative had its genesis as an internal effort to let employees make government better, said Jon Walton, the city’s chief information officer. The ideas that come out of both, he said, are typically about budget savings.

The explosion of social media in the past few years has created a challenge that San Francisco officials haven’t fully surmounted yet: taking public comments digitally on Facebook or Twitter.

“We don’t try to answer and have end-to-end dialog,” said Jon Walton, San Francisco’s CIO, in an interview earlier this year. Part of that choice is driven by the city’s staffing constraints.

“What’s important is that we store, archive and make comments available to policy makers so that they can see what the public input is,” he said.

Many priorities are generated by citizen ideas submitted digitally, emphasized Walton; those ideas can then be put on a ballot, voted on by residents and become policy by public mandate.

“How do you get a more robust conversation going on with the public?” asked Walton. “In local government, what we’re trying to do is form better decisions on where we spend time and money. That means learning about other ideas and facilitating conversations.”

He pointed to the deployment of free public Wi-Fi this year as an example of how online public comments can help shape city decisions. “We had limited funds for the project,” he said. “Just $80,000. What can you do with that?”

Walton said that one of the first things they thought about doing was putting up a website to ask the public to suggest where the hotspots should be.

The city is taking that feedback into account as it plans future Wi-Fi deployments:


[Map: completed Wi-Fi sites shown in green, sites in progress in blue]

Walton said they’re working with the mayor’s office to make the next generation of ImproveSF more public-facing.

“How do we take the same idea and expose it to the public?” he asked. “Any new ‘town hall’ should really involve the public in asking: What should the business of government be? Where should sacrifices and investments be made? There’s so much energy around the annual ballot process. People haven’t really talked about expanding that. The thing that we’re focusing on is to make decision-making more interactive.”

At least some of San Francisco’s focus has gone into mobile development.

“If you look at the new social media app, we’re answering the question of ‘how do we make public meetings available to people on handhelds and tablets’?” said Walton.

“The next generation will focus on how do they not just watch a meeting but see it live, text in questions and have a dialog with policy makers about priorities, live, instead of coming in in person.”

Kundra: Closing the IT gap is the key to making government work better for the American people

Today, the first chief information officer of the United States, Vivek Kundra, shared his reflections on public service.

Kundra, whose last day of work at the White House Office of Management and Budget was last Friday, is now at the Harvard Kennedy School and Berkman Center.

I arrived at a White House that was, as the Washington Post put it, “stuck” in the “Dark Ages of technology.” In their words, “If the Obama campaign represented a sleek, new iPhone kind of future, the first day of the Obama administration looked more like the rotary-dial past.”

As my team congratulated me on the new job, they handed me a stack of documents with $27 billion worth of technology projects that were years behind schedule and millions of dollars over budget. At the time, those documents were what passed for real-time updates on the performance of IT projects. My neighbor’s ten-year-old could look up the latest stats of his favorite baseball player on his phone on the school bus, but I couldn’t get an update on how we were spending billions of taxpayer dollars while at my desk in the White House. And at the same time, the President of the United States had to fight tooth and nail to simply get a BlackBerry.

These were symptoms of a much larger problem.

The information technology gap between the public and private sectors makes the Federal Government less productive and less effective at providing basic services to its citizens. Closing this gap is the key to making government work better for the American people – the ultimate goal.

His complete thoughts are embedded below. If you’re interested in frank insight into why changing government through information technology isn’t easy, read on.

Vivek Kundra’s Reflections on Public Service 2011

The US CIO goes to the whiteboard to describe good government

Earlier this week, United States CIO Vivek Kundra turned to the White House whiteboard to talk about sunshine, savings and service. If you’re unfamiliar with Kundra, he’s the man who proposed and is now entrusted with implementing sweeping federal IT reform. One of the tools he’s been applying to the task is the so-called IT Dashboard, which helps the White House Office of Management and Budget, where he serves, track IT spending. He claims to have reduced federal IT spending by some $3 billion over the past two years through increased tracking and scrutiny. The federal CIO explains more about the results from that work, below.

UPDATE: As open data consultant Dan Morgan pointed out, however, the Government Accountability Office reported that while OMB has made improvements to its dashboard, “further work is needed by agencies and OMB to ensure data accuracy.”

…inaccuracies can be attributed to weaknesses in how agencies report data to the Dashboard, such as providing erroneous data submissions, as well as limitations in how OMB calculates the ratings. Until the selected agencies and OMB resolve these issues, ratings will continue to often be inaccurate and may not reflect current program performance. GAO is recommending that selected agencies take steps to improve the accuracy and reliability of Dashboard information and OMB improve how it rates investments relative to current performance and schedule variance. Agencies generally concurred with the recommendations; OMB did not concur with the first recommendation but concurred with the second. GAO maintains that until OMB implements both, performance may continue to be inaccurately represented on the Dashboard.
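The “schedule variance” and performance ratings GAO refers to come down to comparing a project’s plan against its actuals. As a rough illustration only (these are the standard cost and schedule variance calculations, with made-up figures, not OMB’s actual rating methodology):

```python
from datetime import date

def cost_variance_pct(planned_cost, actual_cost):
    """Percent variance against planned cost: negative means over budget."""
    return (planned_cost - actual_cost) / planned_cost * 100

def schedule_variance_days(planned_finish, projected_finish):
    """Days of slip: negative means behind schedule."""
    return (planned_finish - projected_finish).days

# A hypothetical project planned at $10M that now projects $12.5M,
# with a June 2011 finish that has slipped to September 2011:
print(round(cost_variance_pct(10_000_000, 12_500_000), 1))         # → -25.0
print(schedule_variance_days(date(2011, 6, 1), date(2011, 9, 1)))  # → -92
```

As GAO notes, ratings built on numbers like these are only as good as the data agencies submit.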

One question left unanswered: Is /good the new /open? Decide for yourself at the new “Good Government” section at WhiteHouse.gov.

IBM initiative adds Big Blue to government cloud computing market

What will government cloud computing look like coming from “Big Blue?” Today, IBM announced a community cloud for federal government customers and a municipal cloud for state and local government agencies. With the move, IBM joins a marketplace for providing government cloud computing services that has quickly grown to include Google, Amazon, Salesforce.com and Microsoft.

[Image Credit: Envis-Precisely.com]

“We’re building our federal cloud offering out of intellectual bricks and mortar developed over decades,” said Dave McQueeney, IBM’s CTO of US Federal, in an interview. The value proposition for government cloud computing that IBM offers, he said, is founded in its integrated offering, long history of government work and experience with handling some of the largest transactional websites in the world.

The technology giant whose early success was predicated upon a government contract (providing Social Security record-keeping systems in the 1930s) will be relying on that history to secure business. As McQueeney pointed out, IBM has been handling hosting for federal agencies for years and, unlike any other of the cloud computing players, has already secured FISMA High certification for that work. IBM will still have to secure FISMA certification for its cloud computing offering, which McQueeney said is underway. “Our understanding is that you have to follow the FedRAMP process,” he said, referring to the Federal Risk and Authorization Management Program (FedRAMP), an initiative aimed at making such authorization easier for cloud providers. “We have made requests for an audit,” he said.

As the drive for governments to move to the cloud gathers steam, IBM appears to have made a move to remain relevant as a technology provider. There’s still plenty of room in the marketplace, after all, and a federal CIO in Vivek Kundra who has been emphasizing the potential of government cloud computing since he joined the Office of Management and Budget. Adopting government cloud computing services is not, however, an easy transition for federal or state CIOs, given complex security, privacy and other compliance issues. That’s one reason that IBM is pitching an integrated model that allows government entities to consume cloud services to the degree to which CIOs are comfortable.

Or, to put it another way, software quality and assurance testing is the gateway drug to the cloud. That’s because putting certain kinds of workloads and public data in the cloud doesn’t pose the same headaches as others. That’s why the White House moved Recovery.gov to Amazon’s cloud, which CIO Kundra estimated will save some $750,000 to the operational budget to run the government spending tracking website. “We don’t have data that’s sensitive in nature or vital to national security here,” said Kundra in May.

“Cloud isn’t so much a thing as a place you are on a journey,” said McQueeney. “To begin, it’s about making basic information provisioning as easy and as flexible as possible. Then you start adding virtualization of storage, processing, networks, auto provisioning or self service for users. Those things tend to be the nexus of what’s available by subscription in a SaaS [Software-as-a-Service] model.”

The path most enterprises and government agencies are following is to start with private clouds, said McQueeney. In a phrase that might gain some traction in government cloud computing, he noted that “there’s an appliance for that,” a “cloud in a box” from IBM that they’re calling CloudBurst. From that perspective, enterprises have long since moved to a private cloud where poorly utilized machines are virtualized, realizing huge efficiencies for data center administrators.
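McQueeney’s point about virtualizing poorly utilized machines can be made concrete with back-of-the-envelope arithmetic. This is an illustrative sketch with hypothetical numbers, not figures from IBM or the article:

```python
import math

def hosts_needed(num_servers, avg_utilization, target_utilization):
    """Estimate how many virtualization hosts can absorb a fleet of
    underused physical servers, assuming comparable hardware and that
    load consolidates cleanly."""
    total_load = num_servers * avg_utilization
    return math.ceil(total_load / target_utilization)

# 100 physical servers idling at 10% average utilization, consolidated
# onto hosts run at a 70% utilization target:
print(hosts_needed(100, 0.10, 0.70))  # → 15
```

Even with generous headroom, the efficiency gains for data center administrators in this toy scenario are on the order of six to one.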

“We think most government agencies will continue to start with private cloud,” said McQueeney, which means CIOs “won’t have to answer hard questions about data flowing out of the enterprise.”

Agencies that need on demand resources for spikes in computing demands also stand to benefit from government cloud computing services: just ask NASA, which has already begun sending certain processing needs to Amazon’s cloud. IBM is making a play for that business, though it’s unclear yet how well it will compete. The federal community cloud that IBM is offering includes multiple levels of the software stacks including Infrastructure as a Service (IaaS) and Platform as a Service (PaaS), depending upon agency interest. At the state and local level, IBM is making a play to offer SaaS to those customers based upon its experience in the space.
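The on-demand case NASA illustrates comes down to sizing capacity against a fluctuating queue rather than provisioning for the peak. A minimal sketch of that scaling logic, with hypothetical job sizes and limits and no real provider API:

```python
def instances_for_load(pending_jobs, jobs_per_instance,
                       min_instances=1, max_instances=20):
    """Size a cloud worker pool to the current queue depth, bounded by
    a baseline floor and a budget-driven ceiling."""
    needed = -(-pending_jobs // jobs_per_instance)  # ceiling division
    return max(min_instances, min(needed, max_instances))

# Assuming each instance works through 50 jobs at a time:
print(instances_for_load(0, 50))     # quiet period → baseline of 1
print(instances_for_load(900, 50))   # spike → 18
print(instances_for_load(5000, 50))  # extreme spike → capped at 20
```

The appeal over owned hardware is that the 19 extra instances in the spike case cost nothing during the quiet periods.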

“We know from dealing with municipal governments that processes are very similar between cities and states,” said McQueeney. “There’s probably great leverage to be gained economically for them to do municipal tasks using SaaS that don’t differ from one another.” For those watching the development of such municipal software, the Civic Commons code-sharing initiative is also bidding to reduce government IT costs by avoiding redundancies between open source applications.

The interesting question, as McQueeney posed it, is what government cloud computing clients are really going to find when they start using cloud services. “Is the provider ready? Do they have capacity? Is reliability really there?” he asked. Offering a premium services model seems to be where IBM is placing its bet, given its history of government contracts. Whether that value proposition makes dollars (and sense) in the context of the other players remains to be seen, along with the potential growth of OpenStack, the open source cloud computing offering from Rackspace and other players.

Regardless, cloud computing will be one more tool that enables government to deliver e-services to citizens in a way that was simply not possible before. If you measure Gov 2.0 by how technology is used to arrive at better outcomes, the cloud is part of the conversation.

Whether state and city governments move to open source applications or cloud computing – like Los Angeles, Minnesota or now New York City – will be one of the most important government IT stories to watch in the next year. Today, IBM has added itself to that conversation.

UPDATE: CNET posted additional coverage of IBM’s government cloud initiative, including the video from IBM Labs below:

What does Gov 2.0 have to do with cloud computing?

Last week, Gartner analyst Andrea DiMaio rendered his opinion of what Gov 2.0 has to do with cloud computing. In his post, he writes that “ironically, the terms ‘cloud’ and ‘open’ do not even fit very well with each other,” with respect to auditability and compliance issues.

I’m not convinced. Specifically, consider open source cloud computing at NASA Nebula and the OpenStack collaboration with Rackspace and other industry players, or Eucalyptus. For more, read my former colleague Carl Brooks at SearchCloudComputing for extensive reporting in those areas. Or watch NASA CTO for IT Chris Kemp below:

Aside from the work that CloudAudit.org is doing to address cloud computing, after reading DiMaio’s post, I was a bit curious about how familiar he is with certain aspects of what the U.S. federal government is doing in this area. After all, Nebula is one of the pillars of NASA’s open government plan.

Beyond that relationship, the assertion that responsibility for cloud computing deployment investment resides in the Office for Citizen Engagement might come as a surprise to the CIO of GSA. McClure certainly is more than conversant with the technology and its implications — but I have a feeling Casey Coleman holds the purse strings and accountability for implementation. Watch the GSA’s RFP for email in the cloud for the outcome there.

To Adriel Hampton’s point on DiMaio’s post about cloud and Gov 2.0 having “nothing to do with one another,” I’d posit that that’s overly reductive. He’s right that cloud in and of itself doesn’t equal Gov 2.0. It’s a tool that enables it.

Moving Recovery.gov to Amazon’s cloud, for instance, is estimated to save the federal government some $750,000 over time and gives people the means to be “citizen inspector generals.” (Whether they use them is another matter.) Like other tools borne of the Web 2.0 revolution, cloud has the potential to enable more agile, lean government that delivers better outcomes for citizens, particularly with respect to cost savings, assuming those compliance concerns can be met.

The latter point is why Google Apps receiving FISMA certification was significant, and why Microsoft has been steadily working towards it for its Azure platform. As many observers know, Salesforce.com has long since signed many federal customers, including the U.S. Census.

DiMaio’s cynicism regarding last week’s Summit is interesting, although it’s not something I can spend a great deal of time addressing. Would you tell the Gov 2.0 community to stop coming together at camps, forums, hearings, seminars, expos, summits, conferences or local government convocations because an analyst told you to? That’s not a position I’m coming around to any time soon, not least as I look forward to heading to Manor, Texas next week.

In the end, cloud computing will be one more tool that enables government to deliver e-services to citizens in a way that was simply not possible before. If you measure Gov 2.0 by how technology is used to arrive at better outcomes, the cloud is part of the conversation.

[*Note Gartner’s reply in the comments regarding the resolution of the magic quadrant suit. -Ed.]

State CIOs rank cloud computing, green IT and social media as top emerging tech

According to a March 2010 survey of state chief information officers by NASCIO, Grant Thornton and TechAmerica, public IT executives in the United States are looking seriously at investing in the cloud and green IT. Half of the 40 CIOs, IT resource management officials and OMB representatives surveyed planned to invest in cloud computing. Additionally, some two-thirds of those surveyed are using social media. The report is embedded below.

2010 Tech America Federal CIO Survey Final Report

[Hat Tip: Governing People]