Computer scientists create a 'laboratory' to improve streaming video

April 21, 2020

In these days of social distancing, as millions cloister at home to binge-watch TV over the internet, Stanford researchers have unveiled an algorithm that demonstrates a significant improvement in streaming video technology.

This new algorithm, called Fugu, was developed with the help of volunteer viewers who watched a stream of video, served up by computer scientists who used machine learning to scrutinize this data flow in real time, looking for ways to reduce glitches and stalls.

In a scientific paper, the researchers describe how they created an algorithm that streams video at the highest quality the viewer's internet connection can sustain without stalling.

"In streaming, avoiding stalls depends heavily on these algorithms," says Francis Yan, a doctoral candidate in computer science and first author of the paper, which received the 2020 USENIX NSDI Community Award.

Many of the prevailing systems for streaming video are based on something called the Buffer-Based Algorithm, known as BBA, which was developed seven years ago by then-Stanford graduate student Te-Yuan Huang, along with professors Nick McKeown and Ramesh Johari.

BBA simply asks the viewer's device how much video it has in its buffer. For example, if it has less than 5 seconds stored, the algorithm sends lower quality footage to guard against interruptions. If the buffer has more than 15 seconds stored, the algorithm sends the highest quality video possible. If the number falls in between, the algorithm adjusts the quality accordingly.
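The buffer-threshold rule described above can be sketched in a few lines of Python. The 5-second and 15-second thresholds come from the article; the bitrate ladder and the linear interpolation between the thresholds are illustrative assumptions, not the published BBA algorithm.

```python
# Illustrative buffer-based rate selection in the spirit of BBA.
# Thresholds are from the article; the bitrate ladder is an assumption.

BITRATES_KBPS = [300, 750, 1500, 3000, 6000]  # assumed ladder, low -> high

LOW_RESERVOIR_S = 5.0    # below this buffer level, play it safe
HIGH_RESERVOIR_S = 15.0  # above this, send the best quality

def choose_bitrate(buffer_seconds: float) -> int:
    """Map the client's buffer occupancy to a rung on the bitrate ladder."""
    if buffer_seconds <= LOW_RESERVOIR_S:
        return BITRATES_KBPS[0]          # low buffer: lowest quality
    if buffer_seconds >= HIGH_RESERVOIR_S:
        return BITRATES_KBPS[-1]         # ample buffer: highest quality
    # In between, scale the choice linearly with buffer occupancy.
    frac = (buffer_seconds - LOW_RESERVOIR_S) / (HIGH_RESERVOIR_S - LOW_RESERVOIR_S)
    idx = round(frac * (len(BITRATES_KBPS) - 1))
    return BITRATES_KBPS[idx]

print(choose_bitrate(3.0))   # low buffer -> 300
print(choose_bitrate(10.0))  # middle -> intermediate rung
print(choose_bitrate(20.0))  # ample buffer -> 6000
```

The appeal of this scheme is that it needs no throughput estimate at all: the buffer level alone is the control signal, which is what makes BBA so simple to deploy.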

Although BBA and similar algorithms are widespread in the industry, there have been repeated attempts by researchers over the years to develop more sophisticated algorithms using machine learning -- a form of artificial intelligence in which computers teach themselves to optimize some process.

But in a modern variation of the old garbage-in-garbage-out computer adage, these machine learning algorithms generally require simulated data to learn from, rather than the real thing delivered over the real internet. Therein lies a problem.

"The internet turns out to be a much messier place than our simulations can model," said Keith Winstein, an assistant professor of computer science who supervised the project and advised Yan along with associate professor of computer science and electrical engineering Philip Levis. "What Francis found is that there can be a gulf between making one of these algorithms work in simulation versus making it work on the real internet."

To create a realistic microcosm of the TV-viewing world, Winstein's team erected an antenna atop Stanford's Packard Building to pull in free, over-the-air broadcast signals which they then compressed and streamed to volunteers who signed up to participate in the research project, known as Puffer.

Starting in late 2018, the volunteers streamed and watched TV programs via Puffer while the computer scientists monitored the data stream using five algorithms: their own machine learning algorithm, Fugu -- trained on the actual network conditions viewers were experiencing -- and four other leading contenders, including BBA.
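A hypothetical sketch of the idea behind a learned controller like Fugu: rather than looking only at the buffer, predict how long each candidate chunk would take to transmit, and pick the highest quality expected to arrive before the buffer runs dry. The stub predictor, chunk sizes, and quality labels below are all assumptions for illustration; Fugu's actual predictor is a model trained on real Puffer sessions, not this formula.

```python
# Hypothetical sketch: pick the best chunk whose predicted transfer
# time still beats the buffer drain. All numbers are illustrative.

def predict_transfer_time(chunk_bytes: int, throughput_bytes_per_s: float) -> float:
    """Stand-in for a learned transmission-time predictor."""
    return chunk_bytes / throughput_bytes_per_s

def choose_chunk(candidates, buffer_seconds, throughput_bytes_per_s):
    """candidates: list of (quality_label, size_bytes), lowest -> highest."""
    best = candidates[0]  # fall back to the smallest chunk
    for quality, size in candidates:
        eta = predict_transfer_time(size, throughput_bytes_per_s)
        if eta < buffer_seconds:      # chunk should land before a stall
            best = (quality, size)    # keep the highest safe quality
    return best[0]

ladder = [("480p", 500_000), ("720p", 1_200_000), ("1080p", 2_500_000)]
print(choose_chunk(ladder, buffer_seconds=4.0, throughput_bytes_per_s=500_000))
# -> "720p": the 1080p chunk would take ~5 s, longer than the 4 s buffer
```

The contrast with the buffer-only rule is the point: a prediction that reflects real network behavior can safely choose a higher quality than a fixed threshold would, which is where the training data from real viewers matters.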

At the start of their stream, each viewer was randomly assigned one of the five streaming algorithms and the Stanford team recorded streaming data like the average video quality, the number of stalls and the length of time the viewer tuned in.

The results disagreed with some earlier research studies that had been based on simulations or on smaller tests. When the supposedly sophisticated machine learning algorithms were tested against BBA in the real world, the simpler standard held its own. By the end of the trial, however, Fugu had outperformed the other algorithms -- including BBA -- on least interruption time, highest image resolution and most consistent video quality. What's more, those improvements appear to have the power to keep viewers tuned in: viewers watching Fugu-fed video streams lingered an average of 5-9% longer than viewers served by the other tested algorithms.

"We've found some surprising ways in which the real world differs from simulation, and how machine learning can sometimes produce misleading results. That's exciting in that it suggests a lot of interesting challenges to be solved," Winstein says.

Stanford School of Engineering
