
AI Case Study

YouTube's algorithms mistakenly link the Notre Dame fire with the 9/11 attacks

One of YouTube's latest features, informational panels that appear below videos to provide context and relevant information from sources such as Wikipedia, mistakenly linked the fire at Notre Dame Cathedral in Paris with the 9/11 attacks. The failure demonstrates the limitations of algorithmic video understanding. YouTube acknowledged the error and disabled the feature for live streams related to the fire.



Internet Services Consumer

Project Overview

"A new YouTube tool for battling misinformation failed in a highly public way on Monday, wrongly linking video of the flaming collapse of the spire at Notre Dame Cathedral in Paris to the Sept. 11, 2001, terrorist attacks.

As images of the iconic spire falling played on newscasts around the world — and on the YouTube channels mirroring those newscasts — “information panels” appeared in boxes below the videos providing details about the collapses of New York’s World Trade Center after the terrorist attack, which killed thousands of people. There appeared to be few injuries in the Paris fire.

The 9/11 tragedy is a frequent subject of hoaxes, and the information panels were posted automatically, likely because of visual similarities that computer algorithms detected between the two incidents. YouTube began rolling out the information panels providing factual information about the subjects of frequent hoaxes in the past few months.
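The report suggests the panels were triggered by visual similarity detected between footage of the two events. YouTube has not published how its trigger works, but the general idea can be sketched as comparing image embeddings against references for frequently hoaxed topics and attaching a panel when similarity crosses a threshold. The embeddings, threshold, and function names below are all hypothetical, purely to illustrate how such a system could misfire on visually similar but unrelated events:

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Hypothetical embeddings: a frame from a live stream (burning spire)
# and a reference image for a frequently hoaxed topic (9/11 footage).
stream_frame = [0.8, 0.1, 0.6, 0.3]
reference_image = [0.7, 0.2, 0.5, 0.4]

THRESHOLD = 0.9  # assumed trigger threshold

# Two unrelated events with similar visuals (smoke, collapsing towers)
# can produce embeddings close enough to trigger the wrong panel.
if cosine_similarity(stream_frame, reference_image) > THRESHOLD:
    print("attach information panel for hoax-prone topic")
```

The sketch makes the failure mode concrete: similarity-based matching has no notion of when or where footage was captured, so it can confidently associate new events with superficially similar historical ones.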

The misfire underscored the ongoing limits of computerized tools for detecting and combating misinformation — as well as their potential for inadvertently fueling it. While major technology companies have hired tens of thousands of human moderators in recent years, Silicon Valley executives have said that computers are faster and more efficient at detecting problems.

But Monday’s incident shows the weaknesses of computerized systems. It comes just a month after YouTube and Facebook struggled for hours to detect and block video of a mass shooting at a New Zealand mosque that Internet users were posting and reposting.

YouTube said in a statement: “We are deeply saddened by the ongoing fire at the Notre Dame cathedral. Last year, we launched information panels with links to third-party sources like Encyclopedia Britannica and Wikipedia for subjects subject to misinformation. These panels are triggered algorithmically and our systems sometimes make the wrong call. We are disabling these panels for live streams related to the fire.”"

Reported Results

"YouTube acknowledged the failure, which BuzzFeed reported it found on three news channels on the site."




"YouTube and other technology companies have reported successes in using artificial intelligence to detect some types of common images that users upload to their platforms. These include child pornography and also, increasingly, images from extremist terrorist groups, which rely on familiar flags, logos and certain violent images, such as beheadings."


