As technology firms continue to try to bolster their defenses against fake news, YouTube reportedly plans to introduce information from Wikipedia articles to combat the spread of conspiracy theory videos on the platform.

TechCrunch reported that YouTube CEO Susan Wojcicki told a South by Southwest panel Tuesday that the streaming video host will soon launch "information cues": pop-up text boxes containing sourced information from Wikipedia that may dispel the unsourced claims of conspiracy videos.

According to Wojcicki, the informative pop-ups will appear between the video and the video title and description, giving the information prominent placement as the viewer watches the video.

The boxes are expected to be added to videos that discuss topics like "chemtrails," which are the visible trails left by aircraft that some have incorrectly claimed to be chemical or biological agents spread by the government.

Wojcicki didn’t provide additional details about the plan but suggested the solution would provide the platform with flexibility to fight back with sourced information against the spread of conspiracy theories and factually incorrect information.

The solution could be easily scaled, a necessity given the number of videos that perpetuate false information. The plan could also be expanded to include information from sources other than Wikipedia when needed.

YouTube has recently made a point of cracking down on conspiracy videos. The crackdown largely began after the mass shooting at Marjory Stoneman Douglas High School in Parkland, Florida, which left 17 students and faculty members dead.

YouTube removed a prominent conspiracy theory video claiming one of the victims of the mass shooting was a crisis actor. The video reached the top trending spot on the platform and was viewed more than 200,000 times before YouTube took action.

YouTube has delivered "strikes" to a number of channels, including Alex Jones's InfoWars, one of the platform's most prominent sources of conspiracy theory content, for spreading misinformation about the shooting.

YouTube is not the first website to try to counter false information with fact-checking. Facebook launched a similar experiment in which it added context from fact-checkers to known fake news articles. That effort has largely been considered a failure.