New data science method makes charts easier to read at a glance

October 18, 2018

New York, NY--October 18, 2018--Doctors reading EEGs in emergency rooms, first responders monitoring live sensor feeds from a disaster zone, and brokers buying and selling financial instruments all need to make informed decisions very quickly. A visually complex chart can obscure the very data it is meant to convey, complicating those decisions. When timing is critical, it is essential that a chart be easy to read and interpret.

To help decision-makers in scenarios like these, computer scientists at Columbia Engineering and Tufts University have developed a new method--"Pixel Approximate Entropy"--that measures the complexity of a data visualization and can be used to develop easier-to-read visualizations. Eugene Wu, assistant professor of computer science, and Gabriel Ryan, then a master's student and now a PhD student at Columbia, will present their paper at the IEEE VIS 2018 conference on Thursday, October 25, in Berlin, Germany.

"This is a brand new approach to working with line charts with many different potential applications," says Ryan, first author on the paper. "Our method gives visualization systems a way to measure how difficult line charts are to read, so now we can design these systems to automatically simplify or summarize charts that would be hard to read on their own."

Until now, there have been few ways beyond visual inspection to quantify the complexity of a data visualization. To solve this problem, Wu's group created Pixel Approximate Entropy, which provides a "visual complexity score" that can automatically identify difficult charts. They modified a low-dimensional entropy measure to operate on line charts, then conducted a series of user studies demonstrating that the measure can predict how well users perceive charts.
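The approach builds on the standard approximate-entropy (ApEn) statistic, which scores how irregular a 1-D sequence is. The sketch below shows that general statistic applied to a sequence such as the per-column pixel heights of a rendered line chart; the function name, parameters, and tolerance heuristic here are illustrative assumptions, not the authors' implementation.

```python
import math
import random
from statistics import pstdev

def approx_entropy(series, m=2, r=None):
    """Approximate entropy (ApEn) of a 1-D sequence.

    Higher values indicate a more irregular (harder-to-read) signal.
    m is the comparison window length; r is the match tolerance,
    defaulting to 0.2 * standard deviation, a common heuristic.
    """
    if r is None:
        r = 0.2 * pstdev(series)

    def phi(m):
        n = len(series) - m + 1
        windows = [series[i:i + m] for i in range(n)]
        total = 0.0
        for w in windows:
            # Count windows (including self) within tolerance r
            # under the Chebyshev (max-coordinate) distance.
            matches = sum(
                1 for v in windows
                if max(abs(a - b) for a, b in zip(w, v)) <= r
            )
            total += math.log(matches / n)
        return total / n

    return phi(m) - phi(m + 1)

# A flat line is perfectly regular, so its score is exactly 0.
flat = [5] * 50
print(approx_entropy(flat, m=2, r=3.0))  # 0.0

# A noisy series scores higher, flagging a harder-to-read chart.
random.seed(0)
noisy = [random.randint(0, 100) for _ in range(50)]
print(approx_entropy(noisy, m=2, r=15.0))
```

In a visualization system, a score like this could be thresholded to trigger smoothing or summarization before a chart is shown.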

"In fast-paced settings, it is important to know if the visualization is going to be so complex that the signals may be obscured," says Wu, who is also co-chair of the Data, Media, & Society Center in the Data Science Institute. "The ability to quantify complexity is the first step towards automatically doing something about this."

The team expects their system, which is open source, will be especially useful to data scientists and engineers who are developing AI-driven data science systems. By providing a method that allows the system to better understand the visualizations it is displaying, Pixel Approximate Entropy will help to drive the development of more intelligent data science systems.

"For instance, in industrial control an operator may need to observe and react to trends in readouts from a variety of system monitors over time, such as at a chemical or power plant," Ryan adds. "A system that is aware of chart complexity could adapt readouts to ensure the operator can identify important trends and reduce fatigue from trying to interpret potentially noisy signals.

Wu's group plans to extend data visualization systems to use these models so they can automatically alert users and designers when visualizations may be too complex, suggest smoothing techniques, and develop other quantitative perceptual models that can inform the design of data processing and visualization systems.

About the Study

The study is titled "At a Glance: Pixel Approximate Entropy as a Measure of Line Chart Complexity."

Authors are: Gabriel Ryan and Eugene Wu (Columbia Engineering), and Abigail Mosca and Remco Chang (Tufts University).

The study was funded by grants from the National Science Foundation (#1527765 and #1564049).



Columbia Engineering

Columbia Engineering, based in New York City, is one of the top engineering schools in the U.S. and one of the oldest in the nation. Also known as The Fu Foundation School of Engineering and Applied Science, the School expands knowledge and advances technology through the pioneering research of its more than 220 faculty, while educating undergraduate and graduate students in a collaborative environment to become leaders informed by a firm foundation in engineering. The School's faculty are at the center of the University's cross-disciplinary research, contributing to the Data Science Institute, Earth Institute, Zuckerman Mind Brain Behavior Institute, Precision Medicine Initiative, and the Columbia Nano Initiative. Guided by its strategic vision, "Columbia Engineering for Humanity," the School aims to translate ideas into innovations that foster a sustainable, healthy, secure, connected, and creative humanity.

