A study found TikTok's algorithm recommended more Republican-aligned content during the 2024 US election. Republican accounts saw more like-minded content, while Democratic accounts were shown more opposing views. This suggests a pro-Republican skew.
Summary
A study found that TikTok’s recommendation algorithm favored Republican-leaning content during the 2024 U.S. presidential race.
TikTok, with over a billion active users worldwide, has become a key source of news, particularly for younger audiences.
Using 323 simulated accounts, researchers found that Republican-leaning accounts received 11.8% more ideologically aligned content than Democratic-leaning accounts, which were exposed to more opposing viewpoints.
The bias was largely driven by negative partisanship, with more anti-Democratic content recommended.
Democrats could have a pretty powerful anti-establishment and anti-billionaire narrative in the near future, and it would probably be pretty successful.
But only if the Democrat billionaires and establishment get out of the way.
Or we could stop hoping the Democrats take the lead and force them to follow the people. However, that would likely get ugly.
It also depends on the non-fascist elements of society putting aside their differences and working together, which has historically been both a requirement and a stumbling block any time the populace wants to get out from under the oppression of the 1%.
Using a controlled experiment involving hundreds of simulated user accounts, the study found that Republican-leaning accounts received significantly more ideologically aligned content than Democratic-leaning accounts, while Democratic-leaning accounts were more frequently exposed to opposing viewpoints.
Does this mean the algorithm was designed to push a Republican agenda? Or does the algorithm know that liberals are more likely to watch videos from the opposing side than conservatives?
I don’t doubt that billion-dollar social media companies wanted Trump to win and put their fingers on the scale in whatever way they could. But I wonder how you can prove the algorithm is pushing an ideology at the expense of its users, as opposed to just pushing the ideology that gets the most views from its users.
Does this mean the algorithm was designed to push a Republican agenda? Or does the algorithm know that liberals are more likely to watch videos from the opposing side than conservatives?
Both of these things can be true.
A friend of mine likes to say that a system’s goal is what it does in practice, not its design intent.
Sure, kinda like saying, if it looks like shit and it smells like shit, it’s probably shit. Apt metaphor.
I guess I’m just wondering about the intent. Like, is it possible to prove that an algorithm was designed to have a bias, versus the bias being a natural result of what people spend their time watching? I am sure it’s the former, but how does one prove that without leaks from the inside?
You just need to keep screaming "China! China! China! They hacked our elections! We have to stop China!" and you'll get something banned eventually.
They created 323 “sock puppet” accounts—fake accounts programmed to simulate user behavior—across three politically diverse states: Texas, New York, and Georgia. Each account was assigned a political leaning: Democratic, Republican, or neutral (the control group)...
To analyze the political content of the recommended videos, the researchers downloaded the English transcripts of videos when available (22.8% of unique videos). They then used a system involving three large language models—GPT-4o, Gemini-Pro, and GPT-4—to classify each video. The language models answered questions about whether the video was political, whether it concerned the 2024 U.S. elections or major political figures, and what the ideological stance of the video was (pro-Democratic, anti-Democratic, pro-Republican, anti-Republican, or neutral). The majority vote of the three language models was used as the final classification for each question.
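To make the classification step concrete, here is a minimal Python sketch of that majority-vote scheme. Only the three model names and the five stance labels come from the quoted methodology; the classify_stance stub and the handling of three-way splits are assumptions, since the study's actual prompts and code aren't reproduced here.

from collections import Counter

# The three classifiers named in the quoted methodology.
MODELS = ["GPT-4o", "Gemini-Pro", "GPT-4"]

# The five stance labels from the quoted methodology; a real
# classify_stance implementation would return one of these.
STANCES = ["pro-Democratic", "anti-Democratic",
           "pro-Republican", "anti-Republican", "neutral"]

def classify_stance(transcript: str, model: str) -> str:
    """Ask one LLM for the video's ideological stance.

    Hypothetical stub: the study's actual prompts and API
    calls are not reproduced in the excerpt above.
    """
    raise NotImplementedError

def majority_vote(transcript: str) -> str:
    """Final label = the stance at least two of the three models agree on."""
    votes = [classify_stance(transcript, m) for m in MODELS]
    label, count = Counter(votes).most_common(1)[0]
    # Three voters over five labels can split three ways; the excerpt
    # doesn't say how the researchers handled that, so flag it here.
    return label if count >= 2 else "unresolved"

With three voters a strict majority just means two models agreeing; the researchers presumably had some rule for the rare three-way split, but the excerpt above doesn't say what it was.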
Imagine believing one super sketchy study from some dudes in the UAE, ffs. Even if this is accurate (highly doubtful), it might be because kamalacaust didn't campaign in swing states, for example. There's no real evidence that this is real and, even if it is, there's no evidence that it's due to bias at TikTok.
However, there is plenty of evidence that people are trying to ban TikTok because it reveals unfiltered information about Palestine, etc.
You quoted a bunch of stuff from the study like there's a problem with it. If you're going to attempt to dismiss the study as bullshit, you should probably try finding evidence of bullshit, rather than pointing and screaming, like we all understand whatever incredibly biased point it is you think you have.
Pre-ban/restoration, I used to watch a lot of TikTok. If anything, I saw very little right-wing content, if any, and more anti-right-wing content. Maybe I was too far left and deemed a lost cause.