U.S. Senator Warns ‘Foreign Intelligence Entities’ Could Abuse YouTube Algorithm

U.S. Democratic Senator Mark Warner of Virginia has warned that YouTube’s recommendation algorithm may be susceptible to “manipulation by bad actors, including foreign intelligence entities,” The Guardian reported.

The top-ranking Democrat on the Senate Intelligence Committee said the powerful algorithm is “optimizing for outrageous, salacious, and often fraudulent content.”

An earlier investigation by the Guardian found that the Google-owned video platform was systematically promoting divisive and conspiratorial videos that were damaging to Hillary Clinton’s campaign in the months leading up to the 2016 election.

YouTube’s recommendation algorithm is a closely guarded formula that determines which videos are promoted in the “Up next” column beside the video player. It drives the bulk of traffic to many videos on YouTube, where over a billion hours of footage are watched each day.

However, critics have for months been warning that the complex recommendation algorithm has also been developing alarming biases or tendencies, pushing disturbing content directed at children or giving enormous oxygen to conspiracy theories about mass shootings.

The algorithm’s role in the 2016 election has, until now, largely gone unexplored.

“Companies like YouTube have immense power and influence in shaping the media and content that users see,” Warner said. “I’ve been increasingly concerned that the recommendation engine algorithms behind platforms like YouTube are, at best, intrinsically flawed in optimizing for outrageous, salacious, and often fraudulent content.”

“At worst, they can be highly susceptible to gaming and manipulation by bad actors, including foreign intelligence entities,” he added.

The Guardian’s research was based on a previously unseen database of 8,000 videos recommended by the algorithm in the months leading up to the election. The database was collated at the time by Guillaume Chaslot, a former YouTube engineer who built a program to detect which videos the company recommends.
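The Guardian has not published the internals of Chaslot’s tool, so the details of how the database was gathered remain his own. Purely as an illustration of the general approach such a program could take, the sketch below performs a breadth-first walk of recommendation chains starting from a set of seed videos; the `get_up_next` lookup and the toy recommendation graph are placeholders standing in for whatever real data source a crawler would query, not Chaslot’s actual code.

```python
from collections import deque

# Toy recommendation graph standing in for real "Up next" lookups.
# A real crawler would fetch these lists from YouTube instead.
FAKE_UP_NEXT = {
    "seed1": ["a", "b"],
    "a": ["c"],
    "b": ["c", "d"],
    "c": [],
    "d": [],
}

def get_up_next(video_id):
    """Return the recommended video IDs for a video (toy data here)."""
    return FAKE_UP_NEXT.get(video_id, [])

def crawl_recommendations(seed_ids, depth=2, per_video=5):
    """Breadth-first walk of recommendation chains from seed videos.

    Returns a dict mapping each visited video ID to the recommendations
    observed for it, so the results can be tallied later.
    """
    seen = set(seed_ids)
    frontier = deque((vid, 0) for vid in seed_ids)
    observed = {}
    while frontier:
        video_id, level = frontier.popleft()
        recs = get_up_next(video_id)[:per_video]
        observed[video_id] = recs
        if level + 1 < depth:
            for rec in recs:
                if rec not in seen:
                    seen.add(rec)
                    frontier.append((rec, level + 1))
    return observed

if __name__ == "__main__":
    print(crawl_recommendations(["seed1"], depth=3))
```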

An analysis of the videos contained in the database suggests the algorithm was six times more likely to recommend videos damaging to Clinton than videos damaging to Trump, and that it also tended to amplify wild conspiracy theories about the former secretary of state.
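The methodology behind that figure has not been spelled out in detail. As a simple illustration only, a ratio like this could be derived by hand-labeling each recommended video and comparing the counts; the `damaging_to` field and the sample records below are hypothetical, not the Guardian’s data.

```python
from collections import Counter

def recommendation_bias(videos):
    """Tally videos labeled as damaging to each candidate and report the ratio."""
    counts = Counter(v.get("damaging_to") for v in videos)
    clinton, trump = counts["Clinton"], counts["Trump"]
    ratio = clinton / trump if trump else float("inf")
    return {
        "damaging_to_clinton": clinton,
        "damaging_to_trump": trump,
        "clinton_to_trump_ratio": ratio,
    }

# Made-up labels purely for demonstration.
sample = [
    {"id": "v1", "damaging_to": "Clinton"},
    {"id": "v2", "damaging_to": "Clinton"},
    {"id": "v3", "damaging_to": "Trump"},
    {"id": "v4", "damaging_to": None},
]
print(recommendation_bias(sample))  # ratio of 2.0 in this toy sample
```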