Guest Post by Julian Carver
Julian Carver was the acting Chief Information Officer at the Canterbury Earthquake Recovery Authority (CERA) until June 2012. From March 2013 to April 2015 he led the Canterbury Spatial Data Infrastructure (SDI) Programme at Land Information New Zealand (LINZ), a $5m investment in a set of tech projects to support the $40 billion Christchurch rebuild.
I’m coaching a GovLab Academy course on Innovations in Tech-Enabled Disaster Management. Why do I think this topic is important? Technology and data play a growing and critical role in every phase of disaster response and recovery, from the immediate response through to long-term rebuilding. What is required to support this and make it work? Here’s what I learned following the devastating earthquakes of 2011 in Christchurch, New Zealand:
- Data infrastructure (the technology, data and policy) that enable both open and trusted data sharing between organisations is crucial in a disaster recovery; it supports almost every operational and policy decision.
- Once the disaster has hit, technology staff in public and private sector organisations have very high levels of permission to innovate, share data, and use technologies in new ways, but they are also incredibly busy servicing immediate needs. That makes building out core data infrastructure “on the fly” quite difficult.
- Data infrastructure is therefore a key aspect of preparedness for disasters and an essential element in risk reduction.
- Knowing how to rapidly innovate on top of existing data infrastructure, using crowdsourcing, open and trusted data sharing, and agile deployment methods really helps when a disaster strikes.
- These lessons apply in non-disaster situations – building out core data sharing infrastructure (the technology and the policies) and having people inside and outside government able to experiment and innovate on top of it is a big part of open government movements worldwide. There are huge benefits in ‘everyday’ policy and management contexts.
In this post, I’ll share some stories of the huge challenges we faced, and the way we used technology, data sharing and data infrastructures to overcome them to support the recovery of my city.
In a moment our world changed
On February 22, 2011, at 12:51pm, I was working from home, lying on my bed, reading email on my iPhone. Thirty seconds later, my city, my life, and my future had changed irrevocably. Anything not bolted down was on the floor and half of it was smashed. Computer monitors, TVs, bookshelves, food from the fridge. The power went off, then stayed off for five days.
The earthquake’s epicenter was very close to the city and very shallow, and the quake had very high horizontal and vertical acceleration. 185 people were killed by building collapses and rock falls.
Mobile calls worked for a few minutes, then failed. Texts became patchy after an hour. The only thing that was semi-reliable was Twitter over 3G. It took until 9pm that night for me to know that my 10 year old son was OK, as he was at school in the most devastated part of the city, and his teachers couldn’t get hold of us.
Technology in the immediate response
Within three hours of the quake a group of volunteers had set up a crowdsourced open source disaster response platform to provide essential maps on sources of water, fuel and food, road closures, and offers and requests for help. Within a day, an information site was set up on WordPress.com by the official agencies.
As described in much more detail in these articles, the city became reliant on cloud, social media and mobile based technologies, leveraging existing non-government data infrastructures. These were largely new and unfamiliar approaches for the city government and emergency response agencies back in 2011.
Technology in the Recovery Phase
The Canterbury Earthquake Recovery Authority (CERA) was established just weeks after the earthquake. It was a central government agency tasked with setting policy, strategy and plans for the recovery, coordinating many central and local government agencies as well as the NGO and private sector efforts in providing community wellbeing, infrastructure, and economic recovery services, and managing the central city demolitions. I joined in the first week, and as the Acting Chief Information Officer, I became responsible for the agency’s information technology, and for acquiring, managing and sharing the data needed to inform its myriad tasks across the short, medium and long term phases of the recovery.
Here’s an example. In June, less than three months after being established, CERA completed the initial process of land zoning based on detailed geotechnical investigations. This determined which land was so badly damaged that it was not cost effective to repair the houses on it. The government offered to buy around 8,000 of these properties from citizens at a fair recent valuation.
Like everything in the recovery, time frames were tight. CERA needed a convenient and trusted way to disseminate information and let people know exactly which zone their property was in. That required an interactive website, capable of taking a massive initial load of users, which could also be implemented in a very short time frame.
In stepped Trade Me, New Zealand’s equivalent of eBay.
Working with Tonkin & Taylor, the engineering firm that had done the geotechnical investigations and created the zoning maps, Trade Me built Landcheck in four days. The site was hosted from their server farms in Auckland and Wellington. In the first 24 hours of the site being live, there were 5 million page views, and 3.3 million individual property searches. Trade Me did this for free, for the people of greater Christchurch.
By September 2011, CERA had taken over the hosting of Landcheck following three further land zoning decision announcements. The functionality was migrated to the CERA website. Interactive mapping was added using open data sourced from the country’s official mapping agency, Land Information New Zealand, from a key piece of national data infrastructure, the LINZ Data Service.
In October 2011, the Department of Building and Housing, a central government agency, developed a new property classification – the ‘Technical Categories’, describing expected land performance and damage in further earthquakes, and the house foundation systems likely to be required to withstand future quakes.
The announcement generated huge interest, similar to the initial Landcheck release. This time, an Amazon cloud-hosted solution, serving mapping data directly from government data infrastructure, was deployed to take the load.
Technology in the long-term recovery: Data sharing for better decisions
In a post-disaster setting, creative solutions using existing data sources in new ways are needed to inform policy-making and to determine how resources and city services are deployed.
For example, tracking population movement is important, as knowing which houses are occupied supports prioritisation decisions on repair to power and water infrastructure, and the deployment of social support services.
In a non-disaster situation, a census or survey would be used to determine housing occupancy, but the speed at which displaced populations move makes this nearly impossible. In the case of Christchurch, houses were being red zoned or condemned; people left the city because of lost jobs or psychological trauma from the ongoing aftershocks; and many new migrants arrived to support the rebuild.
As my colleagues Stephen and Martin describe in their white paper:
“Both NZ Post and local power companies had property (address) level data. Once combined, a property with both a postal redirection and zero electricity meter reading indicated a ‘potentially vacant’ property. A heuristic technique is an approach to problem solving, learning, or discovery that employs a practical method not guaranteed to be optimal or perfect, but sufficient to achieve the immediate goals, particularly where finding an optimal solution is impossible, impractical or too time-consuming. While not definitive at an individual property level, when combined with Statistics NZ population data these simple maps and underlying data presented a clear, suburb-level view of population movement that was invaluable to CERA and agencies involved in the recovery.”
Tracking population movement quickly was challenging. But it was possible. Possible because of the existing underlying national data infrastructure, data standards, and legal frameworks. Without the open spatial data from the LINZ Data Service, and StatisticsNZ’s population geographies, it would have been difficult or impossible. Strong privacy laws and data aggregation methods meant this commercial sector data (power metering) could be combined with government data while protecting privacy and commercial confidentiality. Having a technology team that knew how to work with agility, while safely navigating the data licensing and data protection challenges as well as the technical hurdles was another element.
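The combination logic behind the ‘potentially vacant’ flag can be sketched in a few lines. This is a minimal illustration of the heuristic described in the white paper; the address IDs, suburb names, and field layouts are invented for the example, not the actual NZ Post or power-company schemas.

```python
# Sketch of the vacant-property heuristic: a property with BOTH a
# postal redirection AND a zero electricity meter reading is flagged
# 'potentially vacant', then counts are aggregated to suburb level.
from collections import Counter

# Hypothetical per-property inputs, keyed by an address identifier.
postal_redirections = {"addr-001", "addr-003", "addr-007"}
zero_meter_readings = {"addr-001", "addr-004", "addr-007"}
suburb_of = {
    "addr-001": "Bexley", "addr-003": "Bexley",
    "addr-004": "Avonside", "addr-007": "Avonside",
}

# A property must show both signals to be flagged.
potentially_vacant = postal_redirections & zero_meter_readings

# Aggregating to suburb level before sharing protects individual
# privacy and the power companies' commercial confidentiality.
by_suburb = Counter(suburb_of[a] for a in potentially_vacant)
print(sorted(by_suburb.items()))
```

Neither signal alone is reliable (a redirection might be temporary; a zero reading might be a holiday), which is why only the intersection is flagged, and why the output was only ever used at suburb-level aggregation.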
Data Infrastructure in the Rebuild
In late 2012, I left the CERA Information Services team to take up a new challenge. I moved to Land Information New Zealand to lead the Canterbury Spatial Data Infrastructure programme. Our job was to make it easier for those undertaking the $40 billion rebuild of the city to share data to coordinate their efforts.
The government agencies and private sector property developers had to demolish 1,200 commercial buildings, repair all below-ground infrastructure (wastewater, stormwater, water supply, power, and broadband), and begin the process of reconstructing new buildings, all within a small geographic area, in which people were now living and working, and all at the same time.
The Forward Works Viewer
The only way of viably doing that was to let everyone see each other’s forward construction intentions well enough in advance to avoid expensive clashes and delays. So we worked together to build the Forward Works Viewer, a tool which gave those agencies and other public and private sector users a shared online view of horizontal infrastructure repair, planned buildings, and other construction. It let them manage and view projects and their impacts spatially and over time, and detect potential clashes and opportunities for collaboration (e.g. digging up the road once rather than several times). The Forward Works Viewer drew heavily on data infrastructure including open property and road data, open geospatial standards, and open data licensing including NZGOAL, our country’s implementation of Creative Commons.
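The core of that clash detection is a spatio-temporal overlap check: two works conflict (or present a chance to collaborate) when their footprints and their time windows both intersect. The sketch below uses bounding boxes and invented project data for brevity; real works footprints are polygons, and this is not the Forward Works Viewer’s actual implementation.

```python
# Sketch of a spatio-temporal clash check: flag pairs of projects
# whose footprints AND schedules overlap.
from datetime import date
from itertools import combinations

class Project:
    def __init__(self, name, bbox, start, end):
        self.name = name              # project name
        self.bbox = bbox              # (min_x, min_y, max_x, max_y)
        self.start, self.end = start, end

def boxes_overlap(a, b):
    ax1, ay1, ax2, ay2 = a
    bx1, by1, bx2, by2 = b
    return ax1 <= bx2 and bx1 <= ax2 and ay1 <= by2 and by1 <= ay2

def dates_overlap(p, q):
    return p.start <= q.end and q.start <= p.end

projects = [
    Project("Wastewater repair", (0, 0, 10, 10),
            date(2014, 3, 1), date(2014, 5, 31)),
    Project("Fibre trenching", (5, 5, 15, 15),
            date(2014, 4, 1), date(2014, 6, 30)),
    Project("Road resurfacing", (20, 20, 30, 30),
            date(2014, 4, 1), date(2014, 4, 30)),
]

# Pairs working in the same place at the same time are either clashes
# to avoid, or chances to dig up the road once instead of twice.
clashes = [(p.name, q.name) for p, q in combinations(projects, 2)
           if boxes_overlap(p.bbox, q.bbox) and dates_overlap(p, q)]
print(clashes)  # [('Wastewater repair', 'Fibre trenching')]
```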
Even so, we had to make improvements to the underlying data. The road network data was simply road centre-lines; it didn’t tell you anything about lanes, directions, or turn restrictions. We wanted to build those into the Forward Works Viewer to assess the impact on the road network of road closures due to construction activity. Which lane is closed? Is it a total closure or just reduced capacity? This was needed for traffic modelling, to ensure as many works could happen as fast as possible without disrupting peak traffic flows. But we didn’t have an open, freely reusable, routable roading network.
So we contracted four postgraduate GIS students for two weeks to bring OpenStreetMap for the relevant area fully up to date, and make it fully routable for Christchurch. This low cost open data initiative, and the availability of open source tools built for OpenStreetMap, let us quickly integrate the data into the Forward Works viewer and provide a more precise construction impact selector.
Building the Next Generation of Data Innovators
Hackfests and crowdsourcing had proven to be successful, low-cost tools for solving public problems, so we used them again.
Residential building footprint databases didn’t exist for the satellite municipalities of Selwyn and Waimakariri, and the Christchurch dataset was incomplete. Jeremy Severinson, a LINZ employee who had been conducting postgraduate research assessing the trustworthiness of crowdsourced data, proposed the creation of new building footprint data using a competition for school students.
Environment Canterbury, the regional council, created a Web app with instructions, registration, and login for participants, who digitized building outlines from open aerial photographs. Again this used existing data infrastructure, so could be quickly and cost effectively deployed.
The first participant to achieve 75 percent or better quality/trust score was awarded the point for that building, and the participant with the greatest number of points won. The competition ran for a month and generated 18,789 building footprints, which were integrated into the relevant council databases and OpenStreetMap. The winning kids got iPads and money for their school.
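The scoring rule just described can be sketched as a simple first-past-the-threshold loop. The participant names, building IDs, and trust scores below are invented for illustration; the actual competition platform’s scoring code is not public, so this is only a reading of the rule as stated.

```python
# Sketch of the competition scoring rule: the FIRST participant whose
# digitised outline for a building reaches a 75% quality/trust score
# is awarded the point for that building; later submissions get nothing.
THRESHOLD = 0.75

# (participant, building_id, trust_score), in submission order.
submissions = [
    ("aroha", "bldg-1", 0.60),   # below threshold, no point
    ("ben",   "bldg-1", 0.80),   # first to reach 75% for bldg-1
    ("aroha", "bldg-1", 0.90),   # too late, point already awarded
    ("aroha", "bldg-2", 0.85),
    ("aroha", "bldg-3", 0.78),
]

points = {}      # participant -> points won
awarded = set()  # buildings whose point has already been claimed
for participant, building, score in submissions:
    if building not in awarded and score >= THRESHOLD:
        awarded.add(building)
        points[participant] = points.get(participant, 0) + 1

winner = max(points, key=points.get)
print(points, winner)  # {'ben': 1, 'aroha': 2} aroha
```

Awarding each building to the first acceptable submission, rather than the best one, is what made the competition fast: it rewarded participants for moving on to uncaptured buildings instead of polishing ones already claimed.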
We did this as an experiment. It worked and gave us great data. Just as importantly, we got a group of kids, who might not have considered tech careers, engaged with spatial and open data. And who knows, maybe they’ll be part of the next generation of crisis mappers and disaster recovery innovators for events in other parts of the world.
What was Learned?
These experiences over the phases of disaster response, recovery and rebuild taught me the following.
The Importance of Data Infrastructure
- Government policy and recovery decisions rely on sharing huge amounts of data between organisations, often at great pace.
- Disaster responses and recoveries are about people and things at places, so you need good spatial data infrastructure and open data policies to share non-personal data, and you need good privacy and data aggregation/data protection mechanisms to manage the use of personal data.
- You can improve data infrastructure on the fly, as you go, BUT the better data infrastructure you have before a disaster, the easier it will be.
- The fundamentals of good data infrastructure and government policy such as foundational spatial data, technical standards, and open data licensing NEED to be in place before disaster strikes. Preparedness here is as important as building standards, lifeline infrastructures, and trained emergency responders.
- As Dave Snowden says, the necessary preconditions for innovation are starvation, pressure, and perspective shift. Decision makers move from a peacetime view of “it has to be perfect and new approaches are risky” to “just get it done and use what works”.
- That means you’re under pressure, without certainty about exactly what you’re building, and with no time to plan it to the nth degree. Being experienced in using agile and iterative approaches works here. You get to deploy quickly, get feedback from real users, and improve the solution. It’s possible for government agencies to be agile, as we showed.
- If you let volunteers and young people contribute via crowdsourcing in a disaster recovery, and coordinate their efforts with the official response agencies, you can get the best of both worlds – innovation and agility, happening in harmony with safety and control.
- These lessons apply in non-disaster situations – building out core data sharing infrastructure (the technology and the policies) and having people able to experiment and innovate on top of it is a big part of open government movements worldwide. There are huge benefits in ‘everyday’ policy and management contexts.
Attribution and References
More details on the examples above can be found in this case study: http://odimpact.org/static/files/case-studies-new-zealand.pdf
More information on the CERA Spatial Data Infrastructure can be found in this whitepaper by my colleagues who led much of the GIS and data work at CERA: http://www.stratsim.co.nz/s/The-CERA-Spatial-Data-Infrastructure-SDI-whitepaper.pdf
Information on the Forward Works Viewer: http://www.linz.govt.nz/news/2014-07/online-tool-enhances-canterbury-rebuild
Information on the ‘Building Our Footprints’ competition: http://www.canterburymaps.govt.nz/buildingourfootprints/
Open Data for Resilience Initiative Policy Note and Principles: https://www.gfdrr.org/sites/default/files/publication/OpenDRI%20Policy%20Note.pdf
Code for Resilience: http://codeforresilience.org/