DeepER tool uses deep learning to better allocate emergency services

November 19, 2020

BINGHAMTON, NY -- Emergencies, by their very nature, are hard to predict. When and where the next crime, fire or vehicle accident will happen is often a matter of random chance.

What can be measured, however, is how long it takes for emergency services personnel to consider a particular incident to be resolved -- for instance, suspects apprehended, flames extinguished or damaged cars removed from the street.

New York City is among the large urban areas that maintain those kinds of statistics, and a team of researchers at Binghamton University, State University of New York has used deep-learning techniques to analyze the numbers and suggest improved public safety through re-allocation of resources.

Arti Ramesh and Anand Seetharam -- both assistant professors in the Department of Computer Science at the Thomas J. Watson College of Engineering and Applied Science -- worked with PhD students Gissella Bejarano, MS '17, and Adita Kulkarni, MS '17 (who earned her doctorate earlier this year), and master's student Xianzhi Luo to develop DeepER, an encoder-decoder sequence-to-sequence model built on recurrent neural networks (RNNs).
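The article describes DeepER only at a high level. As a rough illustration of what an encoder-decoder sequence-to-sequence RNN looks like, here is a minimal sketch -- the dimensions, input features, and (untrained, random) weights are all assumptions for demonstration, not the authors' actual architecture or parameters:

```python
import numpy as np

rng = np.random.default_rng(0)
HIDDEN = 8   # hidden-state size (assumed)
FEATS = 3    # e.g. incident count, category id, hour of day (assumed)

# Vanilla RNN cell: h_t = tanh(W_x x_t + W_h h_{t-1} + b)
def rnn_step(x, h, W_x, W_h, b):
    return np.tanh(W_x @ x + W_h @ h + b)

# Encoder: compress the observed incident sequence into a context vector.
def encode(seq, W_x, W_h, b):
    h = np.zeros(HIDDEN)
    for x in seq:
        h = rnn_step(x, h, W_x, W_h, b)
    return h

# Decoder: unroll from the context vector, feeding each predicted
# resolution time back in as the next step's input.
def decode(h, steps, W_x, W_h, b, W_out):
    preds = []
    x = np.zeros(1)
    for _ in range(steps):
        h = rnn_step(x, h, W_x, W_h, b)
        y = W_out @ h              # scalar resolution-time prediction
        preds.append(y.item())
        x = y
    return preds

# Random, untrained weights -- purely to show the data flow.
enc_Wx = rng.normal(size=(HIDDEN, FEATS)) * 0.1
enc_Wh = rng.normal(size=(HIDDEN, HIDDEN)) * 0.1
enc_b = np.zeros(HIDDEN)
dec_Wx = rng.normal(size=(HIDDEN, 1)) * 0.1
dec_Wh = rng.normal(size=(HIDDEN, HIDDEN)) * 0.1
dec_b = np.zeros(HIDDEN)
W_out = rng.normal(size=(1, HIDDEN)) * 0.1

history = rng.random((24, FEATS))   # 24 past time steps of features
context = encode(history, enc_Wx, enc_Wh, enc_b)
forecast = decode(context, 6, dec_Wx, dec_Wh, dec_b, W_out)
print(len(forecast))  # 6 predicted future resolution times
```

In practice such a model would be trained by gradient descent on historical sequences; the sketch shows only the encoder-decoder data flow the article refers to.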

The research utilized 10 years of publicly available data from New York City's five boroughs, broken down by category and subcategory of emergency, along with the time between when each incident was reported and when it was "closed."
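The "resolution time" the model learns from is simply the gap between the reported and closed timestamps. A small sketch of that computation, using hypothetical records (the field names and values here are illustrative assumptions, not the actual NYC dataset schema):

```python
from datetime import datetime

# Hypothetical incident records with reported ("created") and "closed"
# timestamps; field names are assumptions for illustration.
incidents = [
    {"borough": "BROOKLYN", "category": "Fire",
     "created": "2019-03-01 14:02", "closed": "2019-03-01 16:30"},
    {"borough": "QUEENS", "category": "Vehicle Accident",
     "created": "2019-03-01 14:10", "closed": "2019-03-01 15:00"},
]

FMT = "%Y-%m-%d %H:%M"
for inc in incidents:
    delta = (datetime.strptime(inc["closed"], FMT)
             - datetime.strptime(inc["created"], FMT))
    inc["resolution_minutes"] = delta.total_seconds() / 60

print([inc["resolution_minutes"] for inc in incidents])  # [148.0, 50.0]
```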

"Multiple events can occur at the same time, and we would expect the timetable to resolve those incidents to be longer because the personnel, resources and equipment are going to be shared across the incident sites," Seetharam said. "That is reflected in the resolution times. Then we use that to predict what's going to happen in the future."

This latest study builds on previous research looking at similar data for non-emergency events -- essentially all of the 311 calls throughout New York City.

"The differences between the two sets of data are that emergency incidents are fewer in number, and non-emergency incidents are a little more predictable," Seetharam said.

"Emergency incidents are harder to predict, such as when a fire is going to start or the nature of that fire. The resolution time would depend on how big the fire is. Non-emergency incidents are more predictable. A streetlight is not working, a repair technician is sent, and it gets fixed."

The research team believes that DeepER could be tweaked for other large cities such as Los Angeles and Chicago, or possibly a cluster of smaller cities with similar characteristics that would provide enough data to make predictions.

"You need to understand the characteristics of that particular city," Seetharam said. "For instance, Los Angeles may have fewer incidents related to structural problems during winter because they do not see snow. That could be a different set of incidents.

"The only practical difficulty would be how they collect their data and how they label their data. If similar incidents are labeled in the same way, we can train the model on these other numbers."

The research paper, "DeepER: A Deep Learning based Emergency Resolution Time Prediction System," was published in IEEE CPSCom 2020.

Binghamton University
