MIT uses AI to kill video buffering

We all hate it when a video is interrupted by buffering or its resolution turns into a pixelated mess. A group of MIT researchers believes it has figured out a solution to those annoyances, which plague millions of people a day.

MIT researchers have found a way to improve video streaming by reducing buffering and pixelation. A new AI system developed at the university's Computer Science and Artificial Intelligence Laboratory, called Pensieve, uses machine learning to pick different algorithms depending on network conditions. In doing so, Pensieve has been shown to deliver a higher-quality streaming experience with less buffering than existing systems.

Instead of delivering a video to your computer in one complete piece, sites like YouTube and Netflix break it into smaller chunks and send them sequentially, relying on adaptive bitrate (ABR) algorithms to determine the resolution at which each chunk will play. This approach gives users a more consistent viewing experience while also saving bandwidth, but it creates problems of its own. If the connection is too slow, YouTube may temporarily lower the resolution, pixelating the video, to keep it playing. And because the video is sent in pieces, skipping ahead to a part that hasn't yet arrived is impossible.
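The chunk-by-chunk decision described above can be sketched in a few lines of Python. This is a minimal illustration, not YouTube's or Netflix's actual implementation: the bitrate ladder, function names, and throughput numbers are all made up for the example.

```python
# Hypothetical bitrate ladder (kbps), lowest to highest resolution.
BITRATES_KBPS = [300, 750, 1200, 2350, 4300]

def pick_bitrate(estimated_throughput_kbps: float) -> int:
    """Pick the highest bitrate the estimated throughput can sustain,
    falling back to the lowest rung when the connection is slow."""
    affordable = [b for b in BITRATES_KBPS if b <= estimated_throughput_kbps]
    return max(affordable) if affordable else BITRATES_KBPS[0]

def stream(num_chunks: int, throughput_samples: list[float]) -> list[int]:
    """Make one bitrate decision per chunk, in sequence, as chunks are sent."""
    return [pick_bitrate(throughput_samples[i]) for i in range(num_chunks)]

# A connection that slows mid-stream forces a low-bitrate (pixelated) chunk:
print(stream(4, [5000.0, 3000.0, 400.0, 2500.0]))  # → [4300, 2350, 300, 2350]
```

The third chunk drops to the bottom of the ladder when throughput collapses, which is exactly the pixelation the article describes.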

There are two main types of ABR algorithm: rate-based ones, which measure how fast the network can transmit data, and buffer-based ones, which aim to maintain a sufficient buffer of video ahead of the playback point. Current algorithms consider only one of these signals, but MIT's Pensieve uses machine learning to choose the best approach for the network conditions at hand.
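The two families of ABR algorithm can be contrasted with a short sketch. Both functions below are simplified stand-ins with made-up thresholds, shown only to highlight that each one ignores the signal the other relies on; Pensieve's contribution is a learned policy that considers both.

```python
# Hypothetical bitrate ladder (kbps), lowest to highest resolution.
BITRATES_KBPS = [300, 750, 1200, 2350, 4300]

def rate_based(throughput_kbps: float) -> int:
    """Rate-based ABR: pick the highest bitrate the measured network
    throughput can sustain. Ignores the playback buffer entirely."""
    affordable = [b for b in BITRATES_KBPS if b <= throughput_kbps]
    return max(affordable) if affordable else BITRATES_KBPS[0]

def buffer_based(buffer_seconds: float,
                 low: float = 5.0, high: float = 20.0) -> int:
    """Buffer-based ABR: map the current buffer level to a bitrate.
    Ignores throughput; thresholds are illustrative, not from Pensieve."""
    if buffer_seconds <= low:
        return BITRATES_KBPS[0]          # buffer nearly empty: play it safe
    if buffer_seconds >= high:
        return BITRATES_KBPS[-1]         # buffer full: go for top quality
    frac = (buffer_seconds - low) / (high - low)
    idx = round(frac * (len(BITRATES_KBPS) - 1))
    return BITRATES_KBPS[idx]
```

A rate-based picker keeps quality low on a fast link if it mis-measures throughput, while a buffer-based one can drain the buffer on a slow link; Pensieve instead trains a neural network on both buffer state and throughput history, which is what lets it adapt to conditions neither heuristic handles well on its own.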

In experiments over Wi-Fi and LTE, the team found that Pensieve could stream video at the same resolution with 10 to 30 percent less rebuffering than other approaches. Additionally, users rated its playback 10 to 25 percent higher in terms of 'quality of experience.' The researchers, however, tested Pensieve on only a month's worth of downloaded video, and believe its performance would be even better with the amounts of data that streaming giants like YouTube and Netflix have.

As a next project, the team will be working to test Pensieve on virtual-reality (VR) video. "The bitrates you need for 4K-quality VR can easily top hundreds of megabits per second, which today's networks simply can't support," says MIT professor Mohammad Alizadeh. "We're excited to see what systems like Pensieve can do for things like VR. This is really just the first step in seeing what we can do."

Pensieve was funded, in part, by the National Science Foundation and an innovation fellowship from Qualcomm.