Some viewers following live coverage of the Notre-Dame Cathedral fire on YouTube were met with a strangely out-of-place info box offering facts about the September 11 attacks.
BuzzFeed first reported the appearance of the misplaced fact-check box on at least three livestreams from major news outlets. Twitter users also took note of the mismatch.
Ironically, the feature is a tool designed to fact-check topics that generate misinformation on the platform. It adds a small info box below videos that provides third-party factual information from YouTube partners — in this case, Encyclopedia Britannica.
YouTube began rolling out the fact-checking “information panels” this year in India, and they now appear to be available in other countries.
“Users may see information from third parties, including Encyclopedia Britannica and Wikipedia, alongside videos on a small number of well-established historical and scientific topics that have often been subject to misinformation online, like the moon landing,” the company wrote in its announcement at the time.
The information boxes are clearly generated algorithmically, and today’s unfortunate slip-up suggests the tool doesn’t have much human oversight. It’s possible that imagery of a burning tower-like structure triggered the algorithm to surface the 9/11 information, but we’ve asked YouTube for more details on what specifically went wrong here.