As NYC.gov buckles, city government pivots to the Internet to share Hurricane #Irene resources

Tens of millions of citizens in the United States are watching as Hurricane Irene churns up the East Coast. If you’re in the path of the immense storm, today is a critical day to prepare. Visit Ready.gov for relevant resources. Unfortunately for citizens in my home state, New York City is right in the path of Hurricane Irene. As many New Yorkers look for information online, however, we’re watching NYC.gov buckle under demand. For part of Friday morning, NYC.gov would not resolve. The outage is providing a real-time experiment in how a megalopolis with millions of citizens provides information during a natural disaster.

As the Village Voice reported, NYC is evacuating the most vulnerable and putting out advisories, but city websites are down. As a result, we’re watching city government being forced to pivot to the Internet and commercial websites, including social media, to get information out.

Dropbox is hosting a Hurricane #Irene Evacuation PDF (as of publication, it’s not clear whether city government uploaded the PDF). NYC chief digital officer Rachel Sterne and the official NYC.gov Twitter account have acknowledged and apologized for the outage and pointed citizens to docstoc.com for the official evacuation map:

NYC Hurricane Evacuation Map

Notably, Mayor Bloomberg’s staff has uploaded the New York City Hurricane Evacuation Zones PDF to his personal website, MikeBloomberg.com, and tweeted it out. We’re in unexplored territory here in terms of a mayor sharing information this way, but with the storm incoming, it’s hard to fault the move. [Ed: As Nick Clark Judd pointed out in his excellent post on how governments are scrambling to deliver information to citizens looking for hurricane information online, Mayor Bloomberg has posted press releases and other information to his website several times before.]

What is clear, amid growing concerns of a multi-billion dollar disaster, is that the New York City government’s website hosting strategy needs to be revisited. According to Provide Security, NYC servers are hosted in a data center in Brooklyn. Absorbing spikes in demand is precisely what cloud computing offers the private sector and, increasingly, the federal government. As hurricane clouds gather, it’s probably past time for New York government to get familiar with cloudbursting, or to move quickly to implement internal architectures that include a private cloud, through Nebula or something similar, to handle the load. In the context of disasters, surge capacity for government websites is no longer a “nice-to-have” — it’s a must-have.
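The cloudbursting idea above can be reduced to a simple routing decision: serve traffic from the city’s own data center until load exceeds its capacity (or the origin goes down), then fail over to a static mirror hosted in a public cloud. The sketch below is purely illustrative; the URLs and capacity threshold are invented, not NYC’s actual infrastructure.

```python
# Minimal sketch of a cloudbursting decision. Both URLs and the capacity
# figure are hypothetical, for illustration only.

ORIGIN = "https://www.nyc.gov/hurricane"          # city-hosted origin
CLOUD_MIRROR = "https://mirror.example-cdn.net"   # static mirror in a public cloud

def route_request(requests_per_sec, origin_capacity=500, origin_up=True):
    """Return the URL that should serve the next request."""
    if not origin_up or requests_per_sec > origin_capacity:
        return CLOUD_MIRROR   # burst: send overflow (or outage) traffic to the cloud
    return ORIGIN

print(route_request(120))                   # normal load stays on the origin
print(route_request(8000))                  # surge bursts to the mirror
print(route_request(120, origin_up=False))  # outage fails over to the mirror
```

The point of the sketch is that the failover logic itself is trivial; what matters is having the cloud mirror provisioned before the surge arrives.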

UPDATE: Civic technologist Philip Ashlock is mirroring NYC Irene data & links on Amazon Web Services (AWS). Even though NYC didn’t move critical resources to the cloud itself, a member of New York City’s technology community stepped up to help the city and citizens in a crisis. That’s Gov 2.0 in action:

Maps

NYC.gov Hurricane Evacuation Zone Finder
OASIS Map (more info)
ArcGIS Map
hurricane_map_english.pdf 

Raw Data

googleearth_hurricane_zone.kmz 
Shapefiles: OEM_HurricaneEvacCenters_001.zip
Shapefiles: OEM_HurricaneEvacZones_001.zip
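For anyone working with the raw data above, a KMZ file like googleearth_hurricane_zone.kmz is just a ZIP archive containing a KML (XML) document. The sketch below builds a tiny sample KMZ in memory and reads the placemark names back out using only the standard library; the zone names here are made up for illustration, not taken from the real file.

```python
# Reading a KMZ: unzip it, then parse the KML inside. Sample data invented.
import io
import zipfile
import xml.etree.ElementTree as ET

KML_NS = "{http://www.opengis.net/kml/2.2}"
sample_kml = (
    '<?xml version="1.0" encoding="UTF-8"?>'
    '<kml xmlns="http://www.opengis.net/kml/2.2"><Document>'
    '<Placemark><name>Zone A</name></Placemark>'
    '<Placemark><name>Zone B</name></Placemark>'
    '</Document></kml>'
)

buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as kmz:
    kmz.writestr("doc.kml", sample_kml)   # KMZ convention: doc.kml at the archive root

with zipfile.ZipFile(buf) as kmz:
    root = ET.fromstring(kmz.read("doc.kml"))

zones = [pm.findtext(KML_NS + "name") for pm in root.iter(KML_NS + "Placemark")]
print(zones)  # ['Zone A', 'Zone B']
```

The shapefile ZIPs can be opened the same way, though parsing the shapefile format itself requires a GIS library.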

Hurricane resources from the Feds

The federal government is providing information on Hurricane Irene at Hurricanes.gov and sharing news and advisories in real-time on the radio, television, mobile devices and online using social media channels. A curated list from the Federal Emergency Management Agency (@FEMA) is embedded below:

If you use Twitter, a key follow this weekend is FEMA Administrator Craig Fugate, who tweets at @CraigAtFEMA. This morning, Fugate tweeted out a link to new digital tools, including a FEMA Android app and text shortcodes. If you’re at risk, this information is for you. Shayne Adamski, senior manager for digital engagement, blogged the details:

In the new FEMA App, you’ll be able to:

  • Check off the items you have in your family’s emergency kit,
  • Enter your family emergency meeting locations,
  • Review safety tips on what to do before, during and after a disaster,
  • View a map of shelters and disaster recovery centers across the U.S., and
  • Read our latest blog posts.

When we built the app, we kept the disaster survivor in mind, making sure much of the information would be available even if cell phone service isn’t, so you’ll be able to access the important information on how to stay safe after a disaster, as well as your family emergency meeting locations.

So as Administrator Fugate said, you can download our app today in the Android market, and look for FEMA App for Blackberry version 6 devices and iPhones in the coming weeks.

FEMA Text Messages 

A new and separate service from the new app, our text message updates will allow cell phone users to receive text message updates from FEMA.

  • Text PREPARE to 43362 (4FEMA) to sign up to receive monthly disaster safety tips
  • Text SHELTER + your ZIP code to 43362 (4FEMA) to find the nearest shelter in your area (example: shelter 12345)
    (For availability of shelters and services, contact your local emergency management agency.)
  • Text DRC + your ZIP code to 43362 (4FEMA) to find the nearest disaster recovery center in your area (for example, if you lived in Annandale, Virginia with a ZIP code of 22003, you’d text DRC 22003).
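The commands above all follow a keyword-plus-ZIP format. The real service runs on FEMA’s SMS gateway; this little dispatcher is only a hypothetical sketch of how such messages might be parsed on the receiving end.

```python
# Illustrative parser for FEMA-style text commands (PREPARE, SHELTER, DRC).
# Not FEMA's actual code; it just mirrors the documented message format.

def parse_sms(body):
    """Return (command, zip_code) for a message like 'SHELTER 12345'."""
    parts = body.strip().upper().split()
    if not parts:
        return None
    command = parts[0]
    if command == "PREPARE":
        return ("PREPARE", None)            # monthly safety tips, no ZIP needed
    if command in ("SHELTER", "DRC") and len(parts) == 2 and parts[1].isdigit():
        return (command, parts[1])          # look up nearest facility by ZIP
    return None                             # unrecognized message

print(parse_sms("shelter 12345"))   # ('SHELTER', '12345')
print(parse_sms("DRC 22003"))       # ('DRC', '22003')
print(parse_sms("PREPARE"))         # ('PREPARE', None)
```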

We’re excited to provide these two new ways you can access information on your mobile device, in addition to our already existing mobile site – m.fema.gov. Stay tuned to our blog, Facebook and Twitter channels as we roll out our app to the remaining smartphone operating systems and make enhancements to our text messages program.

So download the app or text PREPARE to 44362, and then leave us a comment and let us know what you think. We encourage you to tell a family member, friend, or neighbor as well, so they can have disaster safety information always at their fingertips.

[Image Credit: NASA Earth Observatory]

Open government innovation from NASA fuels launch of OpenStack and Nebula

Earlier today, a new startup emerged from stealth at OSCON in Portland, Oregon. Nebula looks to democratize cloud computing with open source hardware.

As VentureBeat reported, by combining OpenStack with Facebook’s Open Compute Project, Nebula could bring cloud computing to everyone with a cloud appliance.

It’s going to be a while before we’ll know if this bold vision comes to pass, but it’s important to be clear: this private sector innovation and startup is the outgrowth of one of NASA’s open government initiatives, where a technology developed by the government was released to the public to innovate upon.

That outcome can be at least partially attributed to Nebula CEO Chris Kemp, the former NASA CTO for IT, who built a cloud “dream team” for Nebula’s launch from Kleiner Perkins’ basement. Nebula has the potential to bring cheaper private clouds to enterprises, small to medium-sized businesses and government, which could stand to leapfrog a generation of technology. (Putting a cloud behind an organization’s firewall could also address the security and compliance challenges that have hampered adoption of public cloud by enterprise and government users.) You can watch the announcement of Nebula at OSCON in the video below:

I talked with Kemp yesterday about OpenStack, his new startup, enterprise IT and innovation in government. “I am just unbelievably excited about all of the innovation that’s going to happen,” he said. “When I left NASA, there was an open playing field. Citrix has bet their company on a tech that emerged out of NASA. Rackspace has incorporated it as well. Dell and HP are working with OpenStack too.”

Kemp, at least for now, doesn’t appear to be looking towards acquisition as his exit strategy. “We’re building a whole new company,” he said. “It’s not going to be acquired by Dell or another large vendor. It’s too important to be lost in a big organization. The opportunity here is to build a lasting company that plays a key role in how computing unfolds.”

It’s the potential to change the world that seems to have brought a glint to Kemp’s eye. “This is why I left NASA,” he said. “I had this idea, this concept, I knew it had the potential to change the world, I knew it was time to build that. There are things you can only do inside of government, and there are things you can only do outside of government.”

In at least one sense, this outcome is about Gov 2.0 versus the beast of bureaucracy, once again. “The thing I learned at NASA is the biggest barrier to this stuff is the culture within the organization,” said Kemp. “It’s people. In a federal agency, people have been there forever and have spent tons of money on tools. What we’re doing with this appliance will disrupt a lot of that.”

Kemp also offered a suggestion to government agencies with innovators trying to make a difference. “The real shame is that you take the most risk-averse people in the world – government civil servants – and make them take the most dangerous leap, to end their careers, to be entrepreneurs. Imagine if government allowed people to take one year without pay, try to create something, and then return to public service.”

While that may be an unlikely dream, Kemp has left government himself, jumping to an endeavour that has the potential to disrupt the future of computing. “We want people to build on a platform that isn’t unnecessarily expensive or reliable,” he said. “We’re selling a little box that creates an infrastructure service and supporting it. You plug it in at the top of the rack where basically joins ‘the collective.’ It becomes part of a massive compute and storage cloud that’s compatible with Amazon and allows anyone to use a cloud that based on standards.
And they can do it with the cheapest hardware.”

Open source has been a key component of NASA’s open government work. Now one of its open source projects may become part of the work of many other people in industries unrelated to aerospace. With the launch of Nebula, an open government initiative looks set to create significant value — and jobs — in the private sector, along with driving open innovation in information technology.

Platforms for citizensourcing emerge in Egypt


As people watching the impact of social media in the events in Egypt know, Facebook, Twitter and YouTube played a role. Today, Microsoft’s director of public sector engagement, Mark Drapeau, sent word that the Redmond-based software company’s open source ideation platform, Town Hall, has been deployed at nebnymasr.org to collect ideas.

The highest profile implementation of Town Hall to date was crowdsourcing ideas for the incoming Republican majority in Congress at “America Speaking Out.”

This Town Hall instance and others show how citizensourcing platforms can be tailored to channel feedback around specific topics, as opposed to less structured platforms. As governments and citizens try to catalyze civic engagement using the Internet, creating better architectures for citizen participation will be critical. Clay Shirky’s talk about the Internet, citizenship and lessons for government agencies at the Personal Democracy Forum offered some insight on that count. Using taxonomies to aggregate ideas instead of a single list was a key takeaway.
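Shirky’s taxonomy takeaway can be made concrete: instead of ranking every submission in one global list, group ideas by topic and rank within each group, so smaller topics aren’t drowned out by the most popular category. The sketch below uses invented ideas and topics purely to illustrate the aggregation pattern.

```python
# Taxonomy-based aggregation vs. a single global list. Data is invented.
from collections import defaultdict

ideas = [
    {"topic": "education", "text": "More school funding",  "votes": 40},
    {"topic": "economy",   "text": "Small-business loans", "votes": 95},
    {"topic": "economy",   "text": "Tourism campaign",     "votes": 60},
    {"topic": "education", "text": "Teacher training",     "votes": 25},
]

# Bucket submissions by taxonomy category.
by_topic = defaultdict(list)
for idea in ideas:
    by_topic[idea["topic"]].append(idea)

# Rank within each topic rather than across the whole site, so the top
# education idea surfaces even though every economy idea out-polls it.
top_per_topic = {
    topic: max(group, key=lambda i: i["votes"])["text"]
    for topic, group in by_topic.items()
}
print(top_per_topic)
```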

To date, the Egyptian citizensourcing site has logged a few dozen questions and votes. Whether usage of the site will grow is up for debate. The network effect may be working against it. As ReadWriteWeb reported last week, Egyptians are using Google Moderator to brainstorm Egypt’s future. Wael Ghonim, the Google executive who played a role in Egypt’s recent revolution, started a Google Moderator page for Egypt entitled “Egypt 2.0, what do we need? What are our dreams?!” To date, the Moderator instance has logged 1,361,694 votes on more than 50,000 ideas submitted by nearly 40,000 users.

White House issues guidance on “technology neutral” IT acquisition

Victoria Espinel, the White House intellectual property enforcement coordinator, wrote a blog post providing guidance to federal agencies on making technology neutral IT procurement decisions.

Each year, the U.S. Government spends almost $80 billion dollars buying information technology (IT); the software, computer equipment and network devices that help the Government run efficiently. It is important that those purchases be fair, neutral and based on an objective assessment of relevant criteria. To ensure that the agencies and the public are aware of our policy, today U.S. Chief Information Officer Vivek Kundra, Administrator for Federal Procurement Policy Dan Gordon and I issued a statement to Senior Procurement Executives and Chief Information Officers reminding them to select IT based on appropriate criteria while analyzing available alternatives including proprietary, open source and mixed source technologies.

Aliya Sternstein, over at NextGov, extracted an interesting headline from the guidance: “Kundra encourages open source.” Getting to that conclusion from the memo in question, embedded below, might be a stretch, though it is notable that a document signed by the United States chief information officer specifically said that agencies should “analyze alternatives” that include open source.

One key phrase in the memo gives a bit more insight into the acquisition process: agencies should be “selecting suitable IT on a case-by-case basis to meet the particular operational needs of the agency by considering factors such as performance, cost, security, interoperability, ability to share or re-use, and availability of quality support.”

Open source software has both competitive advantages and disadvantages in those areas.

Here’s the memo from CIO.gov:

Technology Neutrality

Possibly related: “Google wins: Interior forbidden to award noncompetitive contract to Microsoft” [Federal Computer Week]

And no, this isn’t the only outlet to wonder about that link: read Nancy Scola over at techPresident on the White House reminder to be technology neutral:

(So why the memo, and why today? It’s not entirely clear yet, but a smart source points out a related news item in the space: yesterday Google won a preliminary injunction in a case where it had argued that the U.S. Department of the Interior had inappropriately geared a nearly $60 million contract for cloud-based email and collaboration software tools to fit only Microsoft’s proprietary products. Again, though, we’re indulging in a bit of speculation here, and it’s worth pointing out that Google’s relevant products aren’t themselves open-source.)

By the way, if you’d like to stay instantly up on such developments, you might try following Kundra’s new Twitter feed. He’s only tweeted three times thus far, but one was indeed a pointer to this memo. “Open source vs proprietary?,” he posted. Follow @VivekKundra here.

IBM initiative adds Big Blue to government cloud computing market

What will government cloud computing look like coming from “Big Blue”? Today, IBM announced a community cloud for federal government customers and a municipal cloud for state and local government agencies. With the move, IBM joins a marketplace for providing government cloud computing services that has quickly grown to include Google, Amazon, Salesforce.com and Microsoft.

[Image Credit: Envis-Precisely.com]

“We’re building our federal cloud offering out of intellectual bricks and mortar developed over decades,” said Dave McQueeney, IBM’s CTO of US Federal, in an interview. The value proposition for government cloud computing that IBM offers, he said, is founded in its integrated offering, long history of government work and experience with handling some of the largest transactional websites in the world.

The technology giant, whose early success was predicated upon a government contract (providing Social Security record-keeping systems in the 1930s), will be relying on that history to secure business. As McQueeney pointed out, IBM has been handling hosting for federal agencies for years and, unlike other cloud computing players, has already secured FISMA High certification for that work. IBM will still have to secure FISMA certification for its cloud computing offerings, which McQueeney said is underway. “Our understanding is that you have to follow the FedRAMP process,” he said, referring to the Federal Risk and Authorization Management Program (FedRAMP), an initiative aimed at making such authorization easier for cloud providers. “We have made requests for an audit,” he said.

As the drive for governments to move to the cloud gathers steam, IBM appears to have made a move to remain relevant as a technology provider. There’s still plenty of room in the marketplace, after all, and a federal CIO in Vivek Kundra who has been emphasizing the potential of government cloud computing since he joined the Office of Management and Budget. Adopting government cloud computing services is not, however, an easy transition for federal or state CIOs, given complex security, privacy and other compliance issues. That’s one reason that IBM is pitching an integrated model that allows government entities to consume cloud services to the degree to which CIOs are comfortable.

Or, to put it another way, software quality and assurance testing is the gateway drug to the cloud. That’s because putting certain kinds of workloads and public data in the cloud doesn’t pose the same headaches as others. That’s why the White House moved Recovery.gov to Amazon’s cloud, a move Kundra estimated will save some $750,000 in the operational budget of the government spending tracking website. “We don’t have data that’s sensitive in nature or vital to national security here,” said Kundra in May.

“Cloud isn’t so much a thing as a place you are on a journey,” said McQueeney. “To begin, it’s about making basic information provisioning as easy and as flexible as possible. Then you start adding virtualization of storage, processing, networks, auto provisioning or self service for users. Those things tend to be the nexus of what’s available by subscription in a SaaS [Software-as-a-Service] model.”

The path most enterprises and government agencies are following is to start with private clouds, said McQueeney. In a phrase that might gain some traction in government cloud computing, he noted that “there’s an appliance for that,” a “cloud in a box” from IBM that they’re calling CloudBurst. From that perspective, enterprises have long since moved to a private cloud where poorly utilized machines are virtualized, realizing huge efficiencies for data center administrators.

“We think most government agencies will continue to start with private cloud,” said McQueeney, which means CIOs “won’t have to answer hard questions about data flowing out of the enterprise.”

Agencies that need on demand resources for spikes in computing demands also stand to benefit from government cloud computing services: just ask NASA, which has already begun sending certain processing needs to Amazon’s cloud. IBM is making a play for that business, though it’s unclear yet how well it will compete. The federal community cloud that IBM is offering includes multiple levels of the software stacks including Infrastructure as a Service (IaaS) and Platform as a Service (PaaS), depending upon agency interest. At the state and local level, IBM is making a play to offer SaaS to those customers based upon its experience in the space.

“We know from dealing with municipal governments that processes are very similar between cities and states,” said McQueeney. “There’s probably great leverage to be gained economically for them to do municipal tasks using SaaS that don’t differ from one another.” For those watching the development of such municipal software, the Civic Commons code-sharing initiative is also bidding to reduce government IT costs by avoiding redundancies between open source applications.

The interesting question, as McQueeney posed it, is what government cloud computing clients are really going to find when they start using cloud services. “Is the provider ready? Do they have capacity? Is reliability really there?” he asked. Offering a premium services model seems to be where IBM is placing its bet, given its history of government contracts. Whether that value proposition makes dollars (and sense) in the context of the other players remains to be seen, along with the potential growth of OpenStack, the open source cloud computing offering from Rackspace and other players.

Regardless, cloud computing will be one more tool that enables government to deliver e-services to citizens in a way that was simply not possible before. If you measure Gov 2.0 by how technology is used to arrive at better outcomes, the cloud is part of the conversation.

Whether state and city governments move to open source applications or cloud computing – like Los Angeles, Minnesota or now New York City – will be one of the most important government IT stories to watch in the next year. Today, IBM has added itself to that conversation.

UPDATE: CNET posted additional coverage of IBM’s government cloud initiative, including the video from IBM Labs below:

State of Minnesota Moves to Microsoft’s Cloud for Collaboration

This morning, the state of Minnesota announced that it would use Microsoft’s private cloud computing technology as a platform for its collaboration software. Microsoft’s blog post reasonably describes Minnesota’s move to the cloud as an “historic first.” Given that the state’s press release, embedded below, describes it the same way, that’s not unfair. Details have yet to emerge on the security or privacy requirements that the Redmond-based software giant signed to gain the customer but, as the release notes, “the move makes Minnesota the first U.S. state to move to a large collaboration and communication suite in a private cloud environment.”

While federal, state and local government entities have used Amazon, Google Apps or Salesforce.com, today’s news at least adds Microsoft’s offerings into the conversation. The implementation will likely deploy the Windows Azure platform to deliver Microsoft’s Business Productivity Online Suite (BPOS).

“As states battle growing deficits, they are continually being asked to do more with less,” said Gopal Khanna, Minnesota’s State Chief Information Officer in a prepared statement. “Rethinking the way we manage our digital infrastructure centrally, to save locally across all units of government, is a crucial part of the solution. The private sector has utilized technological advancements like cloud computing to realize operational efficiencies for some time now. Government must follow suit.”

Not all reactions are quite as optimistic, however, particularly with respect to reduced costs. “I foresee short term gain,” tweeted researcher Simon Wardley, “large future exit costs, increased consumption, no long term reduction in IT expenditure.”

Why no long term reductions in state IT expenditures by going to Microsoft’s private cloud?

“See Jevons’ paradox,” Wardley replied. “Causes are co-evolution, long tail of demand, componentisation and increased innovation. In other words, you’ll just end up doing more. Countries & States are in competition with each other … not just firms. It’s not MSFT specific, it’s general to all clouds. The ‘cloud will save you money’ argument forgets consumption effects. You might as well argue that Moore’s law should have reduced IT expenditure. [Cloud will] reduce your costs if your workload stays the same but alas it won’t, it’ll increase for the reasons previously listed.”
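Wardley’s Jevons paradox argument can be put in numbers with a toy model: if the cloud cuts the unit cost of compute but demand is elastic, total IT spend can stay flat or even rise. The elasticity figure below is invented purely for illustration, not drawn from any real study.

```python
# Toy Jevons-paradox model: spend = demand * unit cost, where demand
# grows as unit cost falls. Elasticity of 1.5 is an invented example.

def total_spend(unit_cost, baseline_units=1000, baseline_cost=1.0, elasticity=1.5):
    """Total IT spend after demand responds to a change in unit cost."""
    demand = baseline_units * (baseline_cost / unit_cost) ** elasticity
    return demand * unit_cost

before = total_spend(unit_cost=1.0)   # pre-cloud baseline
after = total_spend(unit_cost=0.5)    # cloud halves the unit cost

# Halving the unit cost raised consumption enough that spend went *up*.
print(round(before), round(after))  # 1000 1414
```

With elasticity below 1, the same model shows spend falling, which is exactly Wardley’s caveat: the savings depend on whether your workload actually stays the same.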

State of Minnesota Signs Historic Cloud Computing Agreement With Microsoft

Senate considers update to Electronic Communications Privacy Act

Today in Washington, the Senate Judiciary Committee held a hearing on updating the Electronic Communications Privacy Act (ECPA), the landmark 1986 legislation that governs the protections citizens have when they communicate using the Internet or cellphones.

The statements of the witnesses before the Senate, from the Commerce Department, the Justice Department and others, are embedded in this post. Below, find an exclusive interview with digital privacy and security researcher Chris Soghoian, who until recently was the resident geek at the Federal Trade Commission, and some context on “Digital Due Process,” the coalition of industry and privacy advocates pushing for an ECPA update.

“From the perspective of industry and definitely the public interest groups, people shouldn’t have to consider government access as one of the issues when they embrace cloud computing,” said Soghoian. “It should be about cost, about efficiency, about green energy, about reliability, about backups, but government access shouldn’t be an issue.”

While the tech blogosphere may be focused on Twitter, Facebook and inside baseball among the venture capitalists of Silicon Valley today, the matter before Congress should be earning more attention from citizens, media and technologists alike. Over at Forbes, Kashmir Hill made the case that industry will benefit from a clearer Electronic Communications Privacy Act. Take it one step further: updates to the ECPA have the potential to improve the privacy protections for every connected citizen, cloud computing provider or government employee. As she pointed out there:

One of the most egregious ECPA issues is how it treats the protection of email. “Why should email in someone’s inbox be treated different from something in someone’s sent folder?” asked Smith [Microsoft’s general counsel]. “Why is something unread in my junk folder subjected to greater privacy than something read in my inbox? Why does an email I sent in April have fewer privacy protections than one I sent in September?”

Smith discussed security and privacy concerns with respect to cloud computing after the hearing: Get Microsoft Silverlight

It’s important to be clear: Congress is unlikely to move on updating ECPA before the mid-term elections or in the lame duck session. That said, the hearing in the Senate today and the hearing on ECPA reform and the revolution in cloud computing in the House of Representatives tomorrow will inform any legislative action in the next Congress.

Chairman Patrick Leahy was clear in his opening statement today: American innovation has outpaced digital privacy laws.

When Congress enacted ECPA in 1986, we wanted to ensure that all Americans would enjoy the same privacy protections in their online communications as they did in the offline world, while ensuring that law enforcement had access to information needed to combat crime. The result was a careful, bipartisan law designed in part to protect electronic communications from real-time monitoring or interception by the Government, as emails were being delivered and from searches when these communications were stored electronically. At the time, ECPA was a cutting-edge piece of legislation. But, the many advances in communication technologies since have outpaced the privacy protections that Congress put in place.

Today, ECPA is a law that is often hampered by conflicting privacy standards that create uncertainty and confusion for law enforcement, the business community and American consumers.

For example, the content of a single e-mail could be subject to as many as four different levels of privacy protections under ECPA, depending on where it is stored, and when it is sent. There are also no clear standards under that law for how and under what circumstances the Government can access cell phone, or other mobile location information when investigating crime or national security matters. In addition, the growing popularity of social networking sites, such as Facebook and MySpace, present new privacy challenges that were not envisioned when ECPA was passed.

Simply put, the times have changed, and so ECPA must be updated to keep up with the times. Today’s hearing is an opportunity for this Committee to begin to examine this important issue.
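Leahy’s point that a single email can fall under different standards depending on its age and status can be illustrated with a much-simplified lookup. This is a rough reading of the Stored Communications Act tiers as critics described them in 2010, not legal advice; the real statute has more categories and contested interpretations.

```python
# Simplified illustration of ECPA/SCA email tiers circa 2010. Not legal
# advice; the two-tier mapping below is a deliberate simplification.

def required_process(days_stored, opened):
    """Return the legal process the government arguably needs for an email."""
    if days_stored <= 180 and not opened:
        return "search warrant"          # unopened, in storage 180 days or less
    # Opened mail, or mail older than 180 days, could arguably be reached
    # with less than a warrant under the government's reading of the statute.
    return "subpoena or court order"

print(required_process(days_stored=30, opened=False))   # search warrant
print(required_process(days_stored=200, opened=False))  # subpoena or court order
print(required_process(days_stored=30, opened=True))    # subpoena or court order
```

Even this crude sketch shows why Smith’s questions land: the same message crosses from one tier to another merely by sitting in a folder for six months.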

“There does seem to be wide agreement that current ECPA standards are a muddled mess,” said Julian Sanchez, a research fellow at the libertarian Cato Institute and contributing editor for Reason magazine. “The fear about ‘uncertainty’ expressed by Baker is ridiculous when you consider the scholarly consensus and the evident confusion in the courts trying to apply it. In reality, DOJ finds the ambiguity convenient, since they can jurisdiction-shop for magistrates whose interpretations they find congenial.”

Jim Dempsey of the Center for Democracy and Technology made the following statement on ECPA, promoting security and protecting privacy:

Justice Brandeis famously called privacy “the most comprehensive of rights, and the right most valued by a free people.” The Fourth Amendment embodies this right, requiring a judicial warrant for most searches or seizures, and Congress has enacted numerous laws affording privacy protections going beyond those mandated by the Constitution.

In setting rules for electronic surveillance, the courts and Congress have sought to balance two critical interests: the individual’s right to privacy and the government’s need to obtain evidence to prevent and investigate crimes, respond to emergency circumstances and protect the public. More recently, as technological developments have opened vast new opportunities for communication and commerce, Congress has added a third goal: providing a sound trust framework for communications technology and affording companies the clarity and certainty they need to invest in the development of innovative new services.

Today, it is clear that the balance among these three interests – the individual’s right to privacy, the government’s need for tools to conduct investigations, and the interest of service providers in clarity and customer trust – has been lost as powerful new technologies create and store more and more information about our daily lives. The protections provided by judicial precedent and statute have failed to keep pace, and important information is falling outside the traditional warrant standard.

The personal and economic benefits of technological development should not come at the price of privacy. In the absence of judicial protections, it is time for Congress to respond, as it has in the past, to afford adequate privacy protections, while preserving law enforcement tools and providing clarity to service providers.

Dempsey’s full testimony is embedded below:
Jim Dempsey Testimony on ECPA Update

The American Civil Liberties Union also had specific recommendations for Congress on ECPA reform. “The Electronic Communications Privacy Act was written in 1986 before the Web was even invented and is in desperate need of an upgrade,” said Laura W. Murphy, Director of the ACLU Washington Legislative Office. “While Americans have embraced technology as an essential part of everyday life, they have not surrendered their fundamental right to privacy. Congress must ensure that our privacy laws reflect the technology Americans use every day.”

The testimony of the ACLU on ECPA reform is embedded below:

ACLU statement on update to ECPA

The written testimony of Microsoft general counsel Brad Smith is embedded below:

Microsoft counsel Brad Smith’s Testimony before Senate

The written testimony of the Honorable James A. Baker, Esq., Associate Deputy Attorney General, United States Department of Justice, is embedded below:

Baker Testimony on ECPA Updates

The written testimony of the Honorable Cameron F. Kerry, Esq., General Counsel of the United States Department of Commerce is embedded below:

Cameron Kerry Testimony before the Senate

The written testimony of attorney Jamil Jaffer is below:

Jamil Jaffer Testimony before the Senate Judiciary Committee

Digital Due Process

Earlier this year, I reported on the launch of DigitalDueProcess.org, a coalition pushing for an ECPA update to protect online privacy in the cloud computing age. Members of the coalition include Google, Microsoft, AT&T, AOL, Intel, the ACLU and the Electronic Frontier Foundation. The coalition’s guidance would enshrine principles for “digital due process,” online privacy and data protection in the age of cloud computing within an updated ECPA.

The coalition set up a website, DigitalDueProcess.org, containing its proposals for updating ECPA in the face of new cloud computing security and online privacy challenges. Google Public Policy released a video, embedded below, describing the concept of “digital due process”:

What does Gov 2.0 have to do with cloud computing?

Last week, Gartner analyst Andrea DiMaio rendered his opinion of what Gov 2.0 has to do with cloud computing. In his post, he writes that “ironically, the terms ‘cloud’ and ‘open’ do not even fit very well with each other,” with respect to auditability and compliance issues.

I’m not convinced. Consider, specifically, open source cloud computing at NASA Nebula and the OpenStack collaboration with Rackspace and other industry players, or Eucalyptus. For more, read my former colleague Carl Brooks’ extensive reporting in those areas at SearchCloudComputing. Or watch NASA CTO for IT Chris Kemp below:

Aside from the work that CloudAudit.org is doing to address cloud audit and compliance, after reading DiMaio’s post I was a bit curious about how familiar he is with certain aspects of what the U.S. federal government is doing in this area. After all, Nebula is one of the pillars of NASA’s open government plan.

Beyond that relationship, the assertion that responsibility for cloud computing deployment investment resides in the Office for Citizen Engagement might come as a surprise to the CIO of GSA. McClure is certainly more than conversant with the technology and its implications, but I have a feeling Casey Coleman holds the purse strings and accountability for implementation. Watch GSA’s RFP for email in the cloud for the outcome there.

To Adriel Hampton’s point on DiMaio’s post about cloud and Gov 2.0 having “nothing to do with one another,” I’d posit that that’s overly reductive. He’s right that cloud in and of itself doesn’t equal Gov 2.0; it’s a tool that enables it.

Moving Recovery.gov to Amazon’s cloud, for instance, is estimated to save the federal government some $750,000 over time and gives people the means to be “citizen inspector generals.” (Whether they use them is another matter.) Like other tools borne of the Web 2.0 revolution, cloud has the potential to enable more agile, lean government that delivers better outcomes for citizens, particularly with respect to cost savings, assuming those compliance concerns can be met.

The latter point is why Google Apps receiving FISMA certification was significant, and why Microsoft has been steadily working toward it for its Azure platform. As many observers know, Salesforce.com has long since signed up many federal customers, including the U.S. Census.

DiMaio’s cynicism regarding last week’s Summit is interesting, though it’s not something I can spend a great deal of time addressing. Would you tell the Gov 2.0 community to stop coming together at camps, forums, hearings, seminars, expos, summits, conferences or local government convocations because an analyst told you to? That’s not a position I’ll be coming around to any time soon, not least as I look forward to heading to Manor, Texas next week.

In the end, cloud computing will be one more tool that enables government to deliver e-services to citizens in a way that was simply not possible before. If you measure Gov 2.0 by how technology is used to arrive at better outcomes, the cloud is part of the conversation.

[*Note Gartner’s reply in the comments regarding the resolution of the magic quadrant suit. -Ed.]

Exploring the future of online privacy with Jules Polonetsky

How will regulations and laws that address the new challenges of online privacy evolve? What are the tradeoffs between societal benefit and individual rights? How should the opportunities inherent in data mining be balanced with harm-based standards? What are the responsibilities of governments, businesses and citizens to protect privacy?

Yesterday at the Gov 2.0 Summit in Washington, my interview with Jules Polonetsky covered all of those topics and more. Polonetsky is the Co-chair and Director of the Future of Privacy Forum, a think tank seeking to improve the state of online privacy by advancing responsible data practices. His writing and research can be found at Futureofprivacy.org.

State CIOs rank cloud computing, green IT and social media as top emerging tech

According to a March 2010 survey of state chief information officers by NASCIO, Grant Thornton and Tech America, public IT executives in the United States are looking seriously at investing in the cloud and green IT. Half of the 40 CIOs, IT resource management officials and OMB representatives surveyed planned to invest in cloud computing. Additionally, some two-thirds of those surveyed are using social media. The report is embedded below.

2010 Tech America Federal CIO Survey Final Report

[Hat Tip: Governing People]