This article is part of the series Bots and the ballot: how artificial intelligence is reshaping elections around the world, presented by Shiny.
When Hamas attacked Israel on October 7, many people sought updates from their main source for news: social media.
But unlike previous global conflicts, where digital discourse was dominated by Facebook and X (formerly Twitter), the ongoing Middle East crisis has seen people flock to TikTok by the millions to connect with news and express opinions.
Even as the video-sharing app has grown in popularity, the inner workings of its complex, artificial intelligence-powered algorithms remain a mystery.
Individuals see only a fraction of the content posted on TikTok every day. And what they do see is highly curated by the company’s automated systems, which are designed to keep people glued to their smartphones. Using machine learning and so-called recommendation systems, these tools determine within milliseconds what content to display to each social media user.
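TikTok’s actual ranking model is proprietary and unknown. As a rough illustration of the general idea, a popularity-weighted recommender can be sketched as follows; every field name, weight, and data point here is hypothetical, not TikTok’s:

```python
# Minimal sketch of a popularity-weighted recommender.
# All signals, weights, and data are invented for illustration.

def score(post, user_topics):
    """Combine engagement signals into a single ranking score."""
    engagement = post["likes"] + 2 * post["shares"] + 3 * post["rewatches"]
    # Boost topics this user has engaged with before.
    affinity = 1.5 if post["topic"] in user_topics else 1.0
    return engagement * affinity

def recommend(posts, user_topics, k=2):
    """Return the k highest-scoring posts for this user's feed."""
    return sorted(posts, key=lambda p: score(p, user_topics), reverse=True)[:k]

posts = [
    {"id": "a", "topic": "news",  "likes": 100, "shares": 10, "rewatches": 5},
    {"id": "b", "topic": "dance", "likes": 500, "shares": 50, "rewatches": 20},
    {"id": "c", "topic": "news",  "likes": 400, "shares": 30, "rewatches": 10},
]
feed = recommend(posts, user_topics={"news"})
print([p["id"] for p in feed])  # → ['c', 'b']
```

Even this toy version shows the dynamic the article describes: what is already popular, or matches a user’s past behavior, gets amplified further.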
POLITICO set out to shed light on how TikTok’s algorithms work, and to uncover which side in the war in the Middle East – Israeli or Palestinian – was actually winning hearts and minds on the social network, whose audience is now overwhelmingly young.
This has become a hot political question after pro-Israel groups and some Western lawmakers accused TikTok, owned by Beijing-based ByteDance, of unfairly promoting pro-Palestinian content for possible political influence. TikTok denies the allegations.
The political effects of the conflict are already evident in partisan clashes in Western democracies as people choose sides in the war – and decide how to vote. US President Joe Biden’s support for Israel has been criticized by Arab-Americans, and it could ultimately cost him the November election. In the United Kingdom, populist independent candidate George Galloway harnessed pro-Palestine sentiment to win a seat in the British Parliament in March. Protests have erupted on university campuses on both sides of the Atlantic.
TikTok’s algorithms are crucial to how political content of all types reaches social media feeds. Examining the company’s algorithms is a good proxy for understanding how artificial intelligence has become a major player in determining what we see online.
POLITICO teamed up with Laura Adelson, a researcher at Northeastern University in Boston, to track pro-Palestine and pro-Israel TikTok content over the four months between October 7, 2023, and January 29, 2024.
This involved creating a list of 50 popular hashtags, such as #IStandWithIsrael or #SavePalestine, that could be directly linked to one side or the other. More apolitical hashtags like #Gaza or #Israel were used to collect data on posts that had no specific leaning.
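The hashtag-based labeling step described above can be sketched in a few lines. The hashtag sets below are illustrative excerpts, not the study’s actual 50-tag list, and the tie-breaking rule for posts carrying tags from both sides is an assumption:

```python
# Sketch of hashtag-based post labeling. The tag lists are
# illustrative excerpts, not the study's full 50-hashtag list.

PRO_ISRAEL = {"#istandwithisrael", "#standwithisrael"}
PRO_PALESTINE = {"#savepalestine", "#freepalestine"}

def label_post(hashtags):
    """Classify a post by the partisan hashtags it carries."""
    tags = {t.lower() for t in hashtags}
    if tags & PRO_ISRAEL and not tags & PRO_PALESTINE:
        return "pro-israel"
    if tags & PRO_PALESTINE and not tags & PRO_ISRAEL:
        return "pro-palestine"
    # Posts with only apolitical tags (e.g. #Gaza), no partisan
    # tags, or tags from both sides stay unlabeled.
    return "unlabeled"

print(label_post(["#IStandWithIsrael"]))        # → pro-israel
print(label_post(["#SavePalestine", "#Gaza"]))  # → pro-palestine
print(label_post(["#Gaza"]))                    # → unlabeled
```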
In total, Adelson analyzed 350,000 TikTok posts from the United States.
To make the data more digestible, she broke posts into three-day windows around specific events. These included Hamas’s initial attacks (October 7–9), Israel’s offensive on Gaza (October 27–29) and the release of the first Israeli hostages (November 24–27). To control for bias, she also included November 6–8 in the analysis, as a proxy for a period when no major events occurred.
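The windowing step can be sketched as a simple date-bucketing function. The window dates come from the article; the data and window names are invented:

```python
from datetime import date

# Bucketing posts into the study's event windows.
# Dates are from the article; names and test data are invented.

WINDOWS = {
    "hamas_attacks": (date(2023, 10, 7), date(2023, 10, 9)),
    "gaza_offensive": (date(2023, 10, 27), date(2023, 10, 29)),
    "quiet_period": (date(2023, 11, 6), date(2023, 11, 8)),
    "hostage_release": (date(2023, 11, 24), date(2023, 11, 27)),
}

def window_for(posted_on):
    """Return the event window a post falls into, or None."""
    for name, (start, end) in WINDOWS.items():
        if start <= posted_on <= end:
            return name
    return None

print(window_for(date(2023, 10, 8)))   # → hamas_attacks
print(window_for(date(2023, 11, 7)))   # → quiet_period
print(window_for(date(2023, 12, 25)))  # → None
```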
“Like other social media platforms, TikTok amplifies some content more than others,” Adelson said. “This can have a distorting effect on what people see in their feeds.”
What emerged was evidence that TikTok is grappling with its role – in real time – as one of the main global digital city squares where people gather to express their opinions and often disagree.
Over the four-month period, based on the hashtags analyzed, Adelson’s research found that roughly 20 times more pro-Palestinian content was produced than pro-Israel content. Yet this does not necessarily equate to more pro-Palestine posts appearing in the average person’s TikTok feed.
Instead, Adelson found three distinct periods when the likelihood of people seeing pro-Israel or pro-Palestine content in their TikTok feed changed markedly, no matter how much overall content each side was producing.
TikTok did not respond to specific requests for comment about the Northeastern University research. In a blog post in April, the company said it had removed more than 3.1 million videos and suspended more than 140,000 livestreams in Israel and Palestine for violating its terms of service.
There is a lot that is unknown about how these social media algorithms work. It is not clear who within the companies – engineers, policy officers or top executives – determines how they function. It is also difficult to determine when changes are made, although regulatory efforts in the European Union and the United States are trying to shed a broader light on these practices.
What follows is an example of how, when you dig into the numbers, much of what users see on social media depends heavily on complex algorithms that rarely – if ever – face outside inspection.
The TikTok posts were collected separately via Junkipedia, a repository of social media content managed by the National Conference on Citizenship, a non-profit organization. They represent the most-viewed partisan posts in each time period.
October 7 – October 27: Pro-Palestinian content dominates
In the first three and a half weeks of the conflict, views per post – the number of times content was actually served into people’s TikTok feeds – skewed toward pro-Palestinian content.
During that time, generally apolitical content such as mainstream news received the most actual views. But between pro-Israel and pro-Palestinian posts, the latter were more likely to be included in anyone’s feed, regardless of their stance on the conflict.
October 7–9: Hamas’s initial attacks against Israel
As Hamas attacked Israel, TikTok was flooded with pro-Palestinian views, many of which expressed solidarity with the Palestinian cause despite the violence of the attacks.
October 13–15: Israel warns Palestinians to leave northern Gaza
In the early days of the war, social media users posted harrowing videos of life in Gaza or demonstrations in favor of the Palestinian cause.
October 18–20: US President Joe Biden visits the Middle East
As the US President visited the region, pro-Palestine content dominated people’s feeds, based on average views per post. It included a rallying call for the wider Muslim world to support Gaza.
October 27 – December 15: Pro-Israel content takes off
In late October, without any warning, things began to change at TikTok.
Between October 27 and December 15, based on per-post view data, pro-Israel content overtook pro-Palestinian content, even though the total volume of pro-Palestine content still far exceeded that of pro-Israel content.
In short, over that seven-week period, TikTok users were, on average, much more likely to view pro-Israel content. The most likely explanation – given that overall pro-Palestinian content was still outpacing pro-Israel posts – is an adjustment to how the company’s algorithms populate people’s feeds. Adelson told POLITICO that more research is needed to replicate her results.
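The views-per-post metric behind this finding is simple arithmetic: total views divided by post count for each side. The numbers below are invented purely to mimic the pattern described, where one side produces far more posts yet the other achieves higher per-post reach:

```python
# Views-per-post comparison. All numbers are invented to mimic
# the pattern described in the article, not the study's data.

def views_per_post(posts):
    """Average views per post for one side; 0.0 for an empty side."""
    if not posts:
        return 0.0
    return sum(p["views"] for p in posts) / len(posts)

# Far more pro-Palestine posts overall, but higher per-post reach
# for pro-Israel content in this hypothetical window.
pro_palestine = [{"views": 1_000} for _ in range(200)]
pro_israel = [{"views": 5_000} for _ in range(10)]

print(views_per_post(pro_palestine))  # → 1000.0
print(views_per_post(pro_israel))     # → 5000.0
```

This is why raw post counts and per-post views can tell opposite stories about which side’s content is being amplified.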
October 27–29: Israel invades the Gaza Strip
On TikTok, influencers hit back at people who accused them of parroting Israeli government talking points, or attacked celebrities for alleged pro-Palestinian bias.
November 6–8: Quiet period
American pro-Israel groups produced viral videos that showed pro-Palestinian campaigners callously ignoring the hostages’ plight, while others advocated for the country’s law enforcement agencies.
November 15–17: Israeli soldiers enter Al Shifa hospital in Gaza City
Given America’s close ties with Israel, American social media influencers – many of whom are affiliated with the country’s evangelical churches – took issue with TikTok. Others linked the Middle East conflict to domestic American politics.
November 24–27: Hamas releases first hostages
By far the most-viewed content in this period related to the release of Israeli hostages. It included emotional reunions between family members and pro-Israel TikTok users explaining what had just happened.
November 30 – December 2: End of Hamas-Israel ceasefire
Official social media accounts made their presence felt when hostilities resumed in late November. These included the Israel Defense Forces, whose posts were collectively viewed hundreds of thousands of times.
December 15 – January 29: Both sides lose their audience
And then, after December 15, TikTok’s algorithmic approach to these posts changed again.
Gradually, as the conflict continued with no end in sight, both pro-Israel and pro-Palestinian content increasingly failed to reach TikTok users on a per-post views basis. In part, this was due to growing indifference toward the war as the world’s attention shifted elsewhere.
But the decline in views for content on both sides was steeper than the decline in TikTok posts about the war would explain, Adelson said. There may be explanations other than the company changing its content algorithms, but the changes in viewing patterns were not matched by changes in the amount of content produced over the same period.
December 15–17: Israel Defense Forces mistakenly kill three hostages
Despite the decline in views, pro-Israel posts still provided vivid first-person accounts of what life was like in the country amid the ongoing war.
January 2–4: Israel kills Hamas deputy leader Saleh al-Arouri in Beirut
Tel Aviv did not hesitate to use TikTok to get its political message across to the world, especially after a South Africa-led effort to hold Israel legally responsible for alleged genocide.
January 20–22: Israeli Prime Minister Benjamin Netanyahu says there is no two-state solution
Four months into the conflict, social media influencers tried to mobilize global support for Palestine through so-called TikTok challenges, which were replicated by multiple accounts.
January 26–29: Staff at the United Nations agency for Palestinian refugees accused of ties to Hamas
Part of the pro-Palestinian crowd-sourcing strategy was to highlight to supporters around the world the perceived hypocrisy of those supporting Israel in the conflict.
The TikTok effect
Many people – especially those over the age of 30 – dismiss the video-sharing network as a frivolity, mostly dance crazes and digital fads with nothing to do with politics.
They are wrong.
Adelson said TikTok is similar to other social media giants in that its algorithm was designed to promote what was popular. The logic: serving people what they want to see so they stay as long as possible.
That’s fine when it’s viral videos of dogs or cute babies. It’s an entirely different matter when it’s highly charged political content about a geopolitical hotspot where people are dying every day. Such events leave social networks like TikTok and their automated curation models in the unenviable position of determining what is popular – at the risk of sidelining minority opinions.
“When it comes to politics, like anything else, social media discourse prioritizes the majority,” Adelson said. “We should think very seriously about what this means.”
This article is part of the series Bots and the ballot: how artificial intelligence is reshaping elections around the world, presented by Shiny. The article was produced by POLITICO reporters and editors with complete editorial independence. Learn more about editorial content presented by external advertisers.