
Data, Oil, and the Future of Information

May 2, 2021

The Evolving Role of Data in Disaster Response

For over ten years, the phrase “data is the new oil” has been prevalent, and in numerous industries, this assertion has proven accurate. Data now fundamentally shapes decision-making processes across all levels within major private organizations, impacting areas like marketing, logistics, finance, and product development.

Despite this widespread data reliance, its application to emergency response has been surprisingly limited over the past decade. Efforts to significantly increase the volume of data utilized in disaster management have yielded comparatively small improvements.

The Shift Towards Data-Driven Emergency Management

However, a transformation is underway, largely driven by the expansion of the Internet of Things (IoT). Crisis managers on the front lines are now gaining access to the data necessary for improved decision-making throughout the entire disaster lifecycle – encompassing resilience, response, and recovery.

Current advancements represent only the initial stages of a potential revolution in disaster response during the 2020s. Future innovations promise even greater capabilities.

Future Innovations in Disaster Response Data

  • Drone Technology: Increased aerial data collection.
  • Simulated Visualizations: Enhanced understanding of potential impacts.
  • Artificial Intelligence: Predictive modeling and disaster simulations.

These emerging technologies will further empower responders with critical insights, ultimately leading to more effective and timely interventions. The potential for improved outcomes is substantial as data integration matures within the field.

A Surge in Disaster Data Availability is Transforming Emergency Response

Effective emergency response hinges on overcoming uncertainty and the relentless pressure of time. During events like wildfires or hurricanes, conditions can shift rapidly – even within seconds. Previously safe evacuation routes can become blocked, response teams can become overextended, and unforeseen issues can escalate quickly, leaving operations centers without reliable information.

Obtaining even basic data regarding unfolding disasters has historically presented significant challenges. Unlike businesses, which readily adopted data-driven practices with the digitalization of operations, emergency management agencies often lacked foundational data infrastructure. The transition from paper-based systems to machine-readable data proved slow to materialize.

Consider a flood scenario – pinpointing the precise location and movement of rising waters was, until recently, a major obstacle. Similarly, comprehensive datasets detailing tree locations and fire susceptibility were unavailable for wildfire management. Even critical infrastructure, such as power lines and cell towers, often lacked digital integration, remaining ‘invisible’ to monitoring systems.

Analysis, predictions, and simulations are rendered ineffective without robust raw data, a resource that was historically scarce in disaster response.

The anticipated Internet of Things (IoT) revolution is now gaining momentum, with an increasing number of IoT sensors being deployed across the United States and globally. These sensors continuously transmit data on temperature, atmospheric pressure, water levels, humidity, pollution, and power status, feeding into data warehouses for analysis.
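
To make that pipeline concrete, the sketch below shows one way a reading might flow from a field sensor into a warehouse table, with a threshold check along the way. The schema, field names, and alert limits are invented for illustration; real deployments define their own.

    from dataclasses import dataclass
    from datetime import datetime, timezone

    # Hypothetical reading schema; real sensor networks define their own fields.
    @dataclass
    class SensorReading:
        sensor_id: str
        kind: str          # e.g. "water_level", "air_quality", "temperature"
        value: float
        unit: str
        lat: float
        lon: float
        observed_at: datetime

    # Illustrative alert thresholds, not from any real deployment.
    THRESHOLDS = {"water_level": 3.5, "air_quality": 150.0}  # meters, AQI

    def ingest(reading: SensorReading, warehouse: list) -> None:
        """Store every reading; flag the ones that cross a threshold."""
        warehouse.append(reading)
        limit = THRESHOLDS.get(reading.kind)
        if limit is not None and reading.value >= limit:
            print(f"ALERT {reading.sensor_id}: {reading.kind}="
                  f"{reading.value}{reading.unit} at ({reading.lat}, {reading.lon})")

    warehouse: list[SensorReading] = []
    ingest(SensorReading("gauge-17", "water_level", 4.1, "m", 29.76, -95.37,
                         datetime.now(timezone.utc)), warehouse)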

In the American West, knowledge of wildfire locations was previously limited. Tom Harbour, former head of fire response at the U.S. Forest Service and now chief fire officer at Cornea, described firefighting as “100 years of tradition unimpeded by progress.”

Firefighting is fundamentally a direct, visual activity – responders can observe and feel the fires. In vast and sparsely populated areas, however, little of that firsthand observation was ever captured as data. While satellites could detect large-scale conflagrations, smoldering brush fires remained hidden from geospatial authorities. Simply knowing there was smoke over California offered limited actionable intelligence to firefighters on the ground.

Today, IoT sensors are beginning to narrow this information gap. Aaron Clark-Ginsberg, a social scientist at RAND Corporation researching community resilience, notes the increasing prevalence of affordable and easy-to-use air quality sensors. These sensors provide detailed insights into pollution levels – a crucial indicator of wildfires – and are exemplified by companies like PurpleAir, which offers a consumer-facing air quality map.

Maps are central to data utilization in disaster scenarios. Geospatial information systems (GIS) underpin most planning and response efforts, and Esri is a leading provider in this sector. Ryan Lanclos, who leads public safety solutions at Esri, highlights the significant impact of expanding water sensor networks on disaster response. “Flood sensors are constantly providing data,” he explains, and combined with a “national water model from the federal government,” GIS analysis can now predict flood impacts on communities with unprecedented accuracy.
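
Esri's actual workflow isn't described in detail here, but the general pattern – joining live gauge readings against modeled flood stages to decide which communities to warn – can be sketched in a few lines. All gauges, levels, and place names below are made up.

    # Illustrative only: joins live gauge readings with forecast flood stages
    # to rank which communities to warn first. A real GIS workflow would use
    # spatial joins against actual national water model output.

    live_gauges = {          # gauge id -> current water level (meters)
        "gauge-01": 2.9,
        "gauge-02": 4.4,
    }
    forecast_crest = {       # gauge id -> modeled crest in next 24h (meters)
        "gauge-01": 3.1,
        "gauge-02": 5.0,
    }
    flood_stage = {          # gauge id -> level at which flooding begins
        "gauge-01": 3.8,
        "gauge-02": 4.2,
    }
    communities = {"gauge-01": "Riverside", "gauge-02": "Low Bend"}

    def at_risk(gauge: str) -> bool:
        # Flag if either the current or the forecast level meets flood stage.
        return max(live_gauges[gauge], forecast_crest[gauge]) >= flood_stage[gauge]

    for gauge in live_gauges:
        if at_risk(gauge):
            print(f"Warn {communities[gauge]}: forecast {forecast_crest[gauge]} m "
                  f"vs flood stage {flood_stage[gauge]} m")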

Cory Davis, director of public safety strategy and crisis response at Verizon, emphasizes how these sensors are transforming infrastructure maintenance. “Utilities can now deploy sensors on power lines, enabling quicker response times and faster power restoration.”

A key development in recent years has been improved sensor battery life. Advances in ultra-low-power wireless chips, coupled with better batteries and energy management, allow sensors to operate for extended periods without maintenance. “We now have devices with 10-year battery lives,” Davis states, which is crucial for deployment in remote areas where power connectivity is unavailable.
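
A rough back-of-envelope calculation shows why duty cycling is what makes decade-long battery life plausible. The figures below are assumptions chosen for illustration, not Verizon's numbers:

    battery_mah = 19_000          # assumed 19 Ah lithium cell, in mAh
    sleep_ua = 15                 # assumed sleep current, microamps
    active_ma = 50                # assumed sample-and-transmit current, mA
    active_s_per_hour = 10        # assumed seconds awake per hour

    # Duty-cycle-weighted average current in milliamps.
    avg_ma = (active_ma * active_s_per_hour
              + (sleep_ua / 1000) * (3600 - active_s_per_hour)) / 3600
    years = battery_mah / avg_ma / 24 / 365
    print(f"average draw ~{avg_ma:.3f} mA -> roughly {years:.0f} years")
    # Self-discharge and cold-weather derating are ignored here, which is one
    # reason vendors quote ~10 years rather than the raw figure above.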

T-Mobile shares a similar perspective. Jay Naillon, senior director of national technology service operations strategy, notes the increasing value of storm surge data for proactive planning. “This data helps us ensure we have the right assets in place,” he says; it also gives planners across the country real-time warning.

Commercial interest is accelerating the adoption of sensors and data streams related to disasters. While governments are the primary users of flood or wildfire data, the private sector also recognizes its value. Jonathan Sury, project director at the National Center for Disaster Preparedness at Columbia University, explains that climate change-related risks impact commercial bottom lines, driving interest in sensor data for bond ratings, insurance underwriting, and other applications.

While sensors aren’t yet universally deployed, they provide emergency managers with a level of visibility previously unattainable.

Furthermore, extensive datasets derived from mobile usage are becoming increasingly available. Facebook’s Data for Good project, for example, provides data layers on connectivity patterns – identifying user movement from one location to another, indicating potential displacement. This data, along with information from telecommunications companies, assists emergency planners in tracking population shifts in real-time.
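
A toy version of that displacement signal might compare today's origin-to-destination movement counts against a historical baseline and flag large spikes. The real Data for Good layers are far more sophisticated; the regions and thresholds here are invented.

    # Toy aggregation of origin->destination movement counts to spot likely
    # displacement. Region names, the baseline, and the anomaly multiplier
    # are all made up for illustration.
    from collections import Counter

    baseline = Counter({("coastal_a", "inland_b"): 120})   # typical daily trips
    today = Counter({("coastal_a", "inland_b"): 2400,
                     ("coastal_a", "inland_c"): 900})

    for route, count in today.items():
        usual = baseline.get(route, 50)  # assumed floor for unseen routes
        if count > 5 * usual:            # arbitrary anomaly multiplier
            origin, dest = route
            print(f"Possible displacement: {origin} -> {dest} ({count} vs ~{usual})")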

The Rising Tide of Data and its Impact on AI in Disaster Response

The volume of data available has surged, transitioning from streams to overwhelming floods of information. Similar to rising floodwaters in urban centers, this data deluge necessitates a dedicated and comprehensive response. Within the business world, the challenges of big data are routinely addressed through sophisticated IT infrastructures, ranging from data warehouses to business intelligence applications.

However, processing data related to disasters presents a unique set of difficulties. Information crucial for disaster management is dispersed across numerous organizations – encompassing the private sector, governmental bodies, and nonprofit entities – resulting in significant interoperability issues. Even when data harmonization is achievable, distilling findings into actionable guidance for frontline responders remains a substantial hurdle, hindering the widespread adoption of AI, especially in proactive planning scenarios. As Verizon’s Davis observed, “many cities and federal agencies are grappling with effectively utilizing the vast amounts of data they now possess.”
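
As a sketch of what that harmonization involves, imagine three organizations reporting the same incident under different schemas. The field names below are hypothetical, but the normalization chore is representative.

    # Illustrative harmonization: three agencies report the same incident with
    # different field names and conventions. All schemas here are invented.

    def normalize(record: dict, source: str) -> dict:
        if source == "state_ems":
            return {"lat": record["latitude"], "lon": record["longitude"],
                    "severity": record["triage_level"]}
        if source == "utility":
            return {"lat": record["loc"]["y"], "lon": record["loc"]["x"],
                    "severity": {"minor": 1, "major": 3}[record["impact"]]}
        if source == "ngo":
            lat, lon = map(float, record["gps"].split(","))
            return {"lat": lat, "lon": lon, "severity": int(record["sev"])}
        raise ValueError(f"unknown source: {source}")

    print(normalize({"gps": "29.76,-95.37", "sev": "2"}, "ngo"))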

Challenges of Standardization and Interoperability

Standardization proves challenging across all levels of implementation. Globally, countries often lack seamless interoperability, although improvements are continually being made. Amir Elichai, CEO of Carbyne, a 911 call-handling platform, highlighted the “significant differences in technology and standards between countries,” explaining that protocols designed for one nation frequently require complete revisions to function in another.

Establishing communication between responders can also be problematic in international settings, as noted by Tom Cotter, director of emergency response at Project HOPE. “Certain platforms are permitted in some countries but not others, and these regulations are constantly evolving,” he stated. “I maintain a comprehensive collection of virtually every available technology communication platform.”

A senior federal emergency management official confirmed that data portability is now a key consideration in technology procurement contracts. The government is prioritizing the acquisition of commercially available software over custom-built solutions. This shift has been embraced by companies like Esri, with Lanclos emphasizing their commitment to “openness and the creation of openly shared data, either publicly or through secure, open standards.”

The Unexpected Benefits of Fragmentation

Ironically, the absence of interoperability can sometimes foster innovation. Elichai suggested that “the lack of established standards can be advantageous – you aren’t constrained by legacy systems.” In situations where standards are lacking, robust protocols can be developed based on modern data workflows.

Data Quality and the Rise of Citizen Reporting

Even with improved interoperability, data sanitation remains a critical challenge, particularly given the inherent “dirtiness” of disaster-related data. While sensor data can be verified against other datasets, there has been a notable increase in citizen-submitted information in recent years, requiring careful vetting before dissemination to responders or the public.

Bailey Farren, CEO of Perimeter, a disaster communications platform, emphasized that “citizens often possess the most accurate and real-time information, sometimes even before first responders arrive – we want to facilitate the sharing of this information with government officials.” The key lies in effectively filtering valuable insights from unhelpful or malicious contributions. Raj Kamachee, CIO of Team Rubicon, stressed the importance of verification, a core component of the organization’s infrastructure since 2017. “Increased user engagement translates to more feedback and data,” he explained. “We’re fostering a collaborative, self-service approach.”
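
One simple, purely illustrative vetting rule is to hold a citizen report until enough independent reports cluster near the same location. Neither Perimeter nor Team Rubicon necessarily works this way; the thresholds and crude distance check below are assumptions.

    # Toy vetting rule: only forward a citizen report once enough independent
    # reports cluster near the same spot. Thresholds and the distance metric
    # are assumptions, not how any real platform vets reports.
    import math

    def close(a, b, km=1.0):
        # Rough planar distance check; 1 degree of latitude is ~111 km.
        return math.hypot(a[0] - b[0], a[1] - b[1]) * 111 <= km

    def corroborated(report, others, needed=3):
        near = [o for o in others if close(report["loc"], o["loc"])]
        return len(near) + 1 >= needed

    reports = [{"loc": (34.05, -118.24)}, {"loc": (34.051, -118.241)}]
    new = {"loc": (34.0505, -118.2405)}
    print("forward to responders" if corroborated(new, reports) else "hold for review")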

AI: Hype vs. Reality in Disaster Response

Does the availability of quality data automatically translate to effective AI applications? Not necessarily, according to Sury of Columbia University. “A crucial caveat is that machine learning and big data applications are not a universal solution,” he cautioned. “While they can process vast amounts of information, they won’t dictate the precise course of action.” He added that first responders are already adept at processing information and may not require additional guidance.

Consequently, AI in disaster response is increasingly focused on planning and resilience. Sury cited OneConcern, a resiliency planning platform, as an example of how data and AI can be integrated into the disaster planning process. He also mentioned the CDC’s Social Vulnerability Index and FEMA’s risk tools, which utilize diverse data signals to generate scalar values for emergency planners to refine their contingency plans.
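
Such indices boil many signals down to a single number. The CDC’s SVI is actually built from census-variable percentile ranks, so the weighted composite below is only a schematic of the idea, with invented signals and weights.

    # Illustrative composite score from normalized signals. Signals and
    # weights are invented; this is not the CDC's actual methodology.

    signals = {                 # each already scaled to 0..1 for this toy example
        "poverty_rate": 0.62,
        "no_vehicle_households": 0.41,
        "age_over_65": 0.35,
    }
    weights = {"poverty_rate": 0.5, "no_vehicle_households": 0.3, "age_over_65": 0.2}

    score = sum(signals[k] * weights[k] for k in signals)
    print(f"vulnerability score: {score:.2f}")   # planners might rank tracts by this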

Hesitancy and Pragmatism

Despite the potential, many experts remain cautious about the power of AI. As discussed previously, data tools must be reliable and accurate in real-time, given the life-or-death stakes. Kamachee of Team Rubicon prioritizes practical utility over cutting-edge technology, stating, “We embrace high-tech solutions, but we prepare for low-tech scenarios,” emphasizing the need for agility and adaptability in disaster response.

Elichai of Carbyne observed a “sensitivity and reluctance to adopt” new technologies within the market, while acknowledging that “AI will undoubtedly provide benefits at some point.”

Naillon of T-Mobile echoed this sentiment, stating that “we don’t heavily leverage AI” in their disaster planning. Instead, the company relies on data and forecast modeling to strategically position equipment, without the need for complex Generative Adversarial Networks (GANs).

AI's Role in Post-Disaster Recovery

Outside of proactive planning, AI has proven valuable in post-disaster recovery, particularly in damage assessments. Art delaCruz, COO and president of Team Rubicon, noted that technology and AI have significantly improved the efficiency of assessing infrastructure and property damage, crucial for filing insurance claims and facilitating community recovery. As his organization frequently participates in rebuilding efforts, accurate damage triage is a vital component of their response strategy.

The Potential and Perils of AI in Disaster Management

Current applications of Artificial Intelligence are proving beneficial in bolstering resilience planning and aiding in disaster recovery efforts. While AI’s role in immediate emergency response is still developing, its potential across the entire disaster lifecycle is substantial. Excitement surrounds the increasing use of drones, yet concerns persist regarding whether AI and the data it utilizes might ultimately generate more challenges than solutions.

Drones and AI-Powered Survivor Detection

The value of drones in disaster response is readily apparent, providing crucial aerial imagery and situational awareness when direct access for responders is restricted. Team Rubicon’s Kamachee highlighted a mission in the Bahamas where drones were instrumental in locating survivors, as roadways were impassable. Images captured by the drones were processed using AI, enabling the team to efficiently identify individuals requiring evacuation. He characterized the technology and its capabilities as exceptionally promising.
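
The article does not say which model Team Rubicon used, so the sketch below only shows the shape of such a pipeline: frames come in, a detector (a placeholder here) returns candidate coordinates, and the hits become an evacuation worklist.

    # Skeleton of a drone-imagery triage loop. detect_people() is a placeholder
    # for whatever vision model a team actually uses; frames and geotags are
    # invented. The point is the shape of the pipeline, not a real detector.

    def detect_people(frame: bytes) -> list[tuple[float, float]]:
        """Stand-in for a person-detection model returning (lat, lon) hits."""
        return [(25.06, -77.34)]  # hypothetical detection

    def triage(frames: list[tuple[bytes, str]]) -> list[dict]:
        evac_sites = []
        for frame, captured_at in frames:
            for lat, lon in detect_people(frame):
                evac_sites.append({"lat": lat, "lon": lon, "seen": captured_at})
        return evac_sites

    print(triage([(b"<jpeg bytes>", "2019-09-03T14:02Z")]))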

The Importance of Speed and Remote Management

Faster data processing directly correlates with improved response effectiveness, as noted by Cotter of Project HOPE. “Speed is paramount when lives are at stake during disasters,” he stated. Furthermore, the ability to manage responses remotely reduces the need to deploy personnel to hazardous areas, maximizing resource allocation in challenging environments.

Enhanced First Responder Capabilities

Davis of Verizon anticipates wider adoption of drone technology by emergency management agencies for tasks like search and rescue and aerial photography. A proactive approach, involving deploying machines into potentially dangerous situations first, is becoming increasingly common. He believes that ongoing advancements in artificial intelligence will empower first responders to operate more effectively, efficiently, and safely.

A Double-Edged Sword: Risks and Vulnerabilities

As data streams in from sensors and drones and is processed with greater accuracy, disaster response capabilities stand to improve, perhaps even keeping pace with the escalating severity of natural disasters. However, a critical question arises: could the AI algorithms themselves introduce new problems?

Technological Risks and Potential for Sabotage

Clark-Ginsberg of RAND offered a balanced perspective, acknowledging that these solutions can also create vulnerabilities. He pointed to the potential for “technological risks leading to disaster and the facilitation of disaster through technology.” Systems are susceptible to failure, errors, and, alarmingly, deliberate sabotage that could exacerbate chaos and damage.

The Growing Threat of Cybersecurity

Bob Kerrey, a member of the 9/11 Commission and chairman of Risk & Return, emphasized the increasing importance of cybersecurity in disaster response. He noted that the concept of “zero-day” exploits was nonexistent during the 9/11 Commission’s work in 2004, but now represents a significant threat. Unlike the 9/11 attacks, which required physical presence and hijacking, modern attacks can be launched remotely by individuals operating from anywhere in the world.

A Revolutionary Tool with Potential Drawbacks

Data represents a revolution in disaster response, but it also carries the risk of creating a new set of unforeseen problems. The benefits offered by this technology may be offset by emerging challenges. Just as oil can fuel progress, it can also ignite and cause destruction – a cautionary parallel to the potential of data.

Tags: data, information, oil, digital age, data security