Fact-Check Dispatch: Devastating earthquake spawns fakes, AI-generated emergency services, and protesting Pikachu
Issue 14 of the Spotlight Fact-Check Dispatch
Spotlight is a newsletter created by Eurovision News in collaboration with Members of the European Broadcasting Union (EBU). It aims to combat misinformation and promote fact-checking efforts in Europe. The newsletter serves as a platform to showcase the fact-checking work of European public service media broadcasters.
Earthquake in Myanmar spawns fakes
The past week brought an earthquake in a notoriously difficult part of the world for the flow of information: Myanmar. In the hours after the earthquake struck, a flood of real videos, pictures and social media posts was published from Bangkok, Thailand – where the earthquake was also felt. But closer to the quake’s epicentre in Myanmar, an information vacuum opened up.
Amid a clear appetite across the internet for any material showing the scale of the disaster, a slew of AI-generated or misrepresented content was widely shared.
One such image showed a road cleaved open by the earthquake, and was investigated by Thais Porto of the Eurovision Social Newswire. The image was posted with claims that it showed damage in Chiang Mai, a city in Thailand near the Burmese border.
After tracing the picture back to an article published in 2017, which credited it to a 2011 International Business Times article about a powerful earthquake in Myanmar, Porto concludes: “IBT credited the picture to Reuters, saying it showed a road in Tarlay, in Myanmar’s Shan State, where the earthquake hit.”
Similar pictures showing large cracks on the roads in the Tarlay township were published by Getty Images in March 2011, Porto writes.
Our colleague Pascal Siggelkow from Tagesschau also looked into the numerous fakes circulating after the earthquake for ARD-faktenfinder. Yet another image showed a large crack in a road, which Siggelkow was able to trace back to the 2011 New Zealand earthquake through a reverse image search.
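Reverse image search comes up repeatedly in these investigations. One common technique behind such matching is perceptual hashing: near-identical images produce near-identical fingerprints, so an old photo resurfaces even after recompression. The sketch below is a toy illustration of an "average hash" on made-up pixel grids, assuming a simplified model – it is not the tool any of the fact-checkers actually used.

```python
# Toy sketch of perceptual ("average") hashing, one technique that lets a
# recycled photo be matched to its original despite re-encoding.
# The tiny pixel grids below are made-up sample data.

def average_hash(pixels):
    """Hash a small grayscale image (list of rows of 0-255 values):
    each bit records whether a pixel is above the image's mean."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return [1 if p > mean else 0 for p in flat]

def hamming(h1, h2):
    """Count differing bits; a small distance suggests the same image."""
    return sum(a != b for a, b in zip(h1, h2))

# A 4x4 "photo", a slightly re-encoded copy, and an unrelated image.
original = [[10, 200, 30, 220],
            [15, 190, 25, 210],
            [240, 20, 230, 10],
            [235, 25, 225, 15]]
recompressed = [[12, 198, 33, 219],
                [14, 191, 27, 208],
                [238, 22, 231, 12],
                [233, 27, 224, 16]]
unrelated = [[200, 200, 200, 200],
             [200, 200, 10, 10],
             [10, 10, 200, 200],
             [10, 10, 10, 10]]

h_orig = average_hash(original)
print(hamming(h_orig, average_hash(recompressed)))  # 0: a match
print(hamming(h_orig, average_hash(unrelated)))     # much larger: no match
```

Real search engines work on far larger hashes and indexes, but the principle is the same: small bit distance, likely the same underlying picture.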
Siggelkow also notes the proliferation of AI-generated images and videos in the aftermath of the quake.
“One video shows a half-destroyed high-rise leaning heavily to one side. Fire engines can be seen on the street. However, there is no other footage of a high-rise building that looks exactly the same and leans so heavily to one side. Furthermore, the emergency personnel in the video move very unnaturally; some appear to be standing still. A man, for example, simply walks through a vehicle,” he writes.
But not all of the fakes are so easy to spot. The ARD-faktenfinder piece outlines one video featuring distinctive temples similar to those found in Myanmar. However, “a reverse image search reveals the original video, which contains a watermark for an AI generator at the edge of the image” – a watermark that had been deliberately cropped out in later reshares.
The EBU’s Jenny Hauser similarly found a video from UNESCO World Heritage Site Bagan — showing much of the area reduced to rubble — to be AI-generated.
We also had a reminder that all is not always as it seems when Hauser analysed an EU Copernicus satellite image showing damage.
Copernicus, the European Union’s Earth observation programme, published a map of a village just northwest of Mandalay covered in red dots denoting totally destroyed buildings, and attributed the destruction to the earthquake.
Questions were initially raised by OSINT accounts on X about the confident attribution of the damage to the quake, and Hauser analysed older satellite imagery to compare the situation before and after the earthquake.
“Much of the damage may, in fact, date back to 2023. NASA’s Fire Information for Resource Management System has previously recorded major fires in Pa Du village. One on May 22, 2023 and another on June 26, 2023,” Hauser writes.
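The kind of check Hauser describes – querying historical fire detections for a place and date window – can be sketched in a few lines. The CSV layout below loosely mirrors the columns of NASA FIRMS archive exports (latitude, longitude, acq_date); the sample rows and the bounding box are made-up illustrative data, not real Pa Du detections.

```python
# Sketch: filter fire detections to a bounding box and date range,
# as one might do with a FIRMS-style CSV export. Sample data is invented.
import csv
import io
from datetime import date

SAMPLE_CSV = """latitude,longitude,acq_date
22.10,95.90,2023-05-22
22.11,95.91,2023-06-26
16.80,96.15,2023-05-22
"""

def fires_in_area(csv_text, lat_range, lon_range, start, end):
    """Return detections inside a bounding box and date window."""
    hits = []
    for row in csv.DictReader(io.StringIO(csv_text)):
        lat, lon = float(row["latitude"]), float(row["longitude"])
        day = date.fromisoformat(row["acq_date"])
        if (lat_range[0] <= lat <= lat_range[1]
                and lon_range[0] <= lon <= lon_range[1]
                and start <= day <= end):
            hits.append(row)
    return hits

# Hypothetical bounding box around a village, covering all of 2023.
matches = fires_in_area(SAMPLE_CSV, (22.0, 22.2), (95.8, 96.0),
                        date(2023, 1, 1), date(2023, 12, 31))
print([m["acq_date"] for m in matches])  # ['2023-05-22', '2023-06-26']
```

The third sample row falls outside the box and is dropped; the two in-box 2023 detections survive – the same logic, at scale, that lets a fact-checker ask whether damage predates a disaster.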
Millions of views on false visuals of Turkey unrest
The recent unrest in Turkey also threw up some examples of misrepresented and AI-generated content.
The bizarre appearance of an inflatable Pikachu at the protests was an immediate viral hit. Even though such comedic — and surreal — scenes would usually be an immediate red flag, someone dressed as the big yellow Pokémon character did indeed show up to protest. But one image that was seen millions of times was found to be AI-generated by our colleague Javier Menasalvas at VerificaRTVE.
The clues came from an AI detection tool, as well as mistakes in the lettering and spelling on text seen on the back of police uniforms and on police vehicles. The RTVE team also compared the police trucks seen in real footage from the protests with the law enforcement vehicles seen in the image.
Staying in Turkey – or should I say Serbia? – Thais Porto from the Eurovision Social Newswire probed the provenance of a clip that purported to show a large nighttime protest in Istanbul.
The footage of the huge crowd was shared by news organisations in Paraguay and Venezuela, as well as by a news aggregator account on X.
Porto was able to find older versions of the clip, including a video from Belgrade in December, verified by the Social Newswire team on the day of the protest. “A geolocation analysis of the video also shows landmarks in Belgrade’s Slavija Square,” she reports.