Bluesky Faces Its First Big Challenge Following X’s User Exodus: Controlling the Influx of Trolls

Bluesky has rapidly increased its user base from 14-15 million to 19 million in just a few days. This growth presents not only technical challenges in maintaining platform operations but also significant issues related to moderation.

Jose García

Writer

Tech journalist. Head of new formats at Xataka and TikTok presenter. I specialize in consumer tech and video games.

Although Threads has more users, Bluesky appears to be emerging as the premier alternative to X. Recent events in the U.S. and on Elon Musk’s platform have prompted many users to leave X, leading to significant growth for Bluesky. However, this rapid expansion presents a major challenge for any social media platform: moderation.

Bluesky’s growth. In just a few days, Bluesky has grown its user base to 19 million, adding roughly one million users per day. This surge signals genuine interest in a viable alternative to X. However, with more users come more problems.

Reports. Bluesky users can report content they deem dangerous, illegal, or in violation of community guidelines. According to the platform, it received 42,000 reports in just 24 hours. That figure may seem modest next to 19 million registered users, but it’s striking in context: in all of 2023, Bluesky received a total of 360,000 reports.

To put it in perspective, Bluesky received roughly 11.7% of last year’s total reports in a single day. Naturally, this represents a significant technical challenge for the platform.
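As a quick back-of-the-envelope check of the figures above (note that the 3,000 reports/hour Bluesky quotes is a current rate, above the day’s average):

```python
daily_reports = 42_000   # record one-day total cited by Bluesky
total_2023 = 360_000     # reports received in all of 2023

# Share of 2023's total that arrived in a single day
share = daily_reports / total_2023
print(f"{share:.1%}")    # 11.7%

# Average rate over that 24-hour window (below the 3,000/hour peak)
print(daily_reports // 24, "reports/hour on average")  # 1750
```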

Bluesky said in a post: “In the past 24 hours, we have received more than 42,000 reports (an all-time high for one day). We’re receiving about 3,000 reports/hour. To put that into context, in all of 2023, we received 360k reports. We’re triaging this large queue so the most harmful content such as CSAM is removed quickly.”

Bluesky needs to address the issue. The platform, which employs 20 full-time staff members, reports: “With this significant influx of users, we’ve also seen increased spam, scam, and trolling activity.”

First measures. A straightforward first step is to require newly registered users to verify their email addresses before they can publish content. This won’t eliminate abuse entirely, but it raises the cost of creating multiple throwaway accounts with temporary email addresses.
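Conceptually, the gate is just a precondition on publishing. This is a minimal sketch, not Bluesky’s actual implementation; `User` and `can_post` are hypothetical names for illustration:

```python
from dataclasses import dataclass

# Hypothetical user record; field names are illustrative, not Bluesky's schema.
@dataclass
class User:
    handle: str
    email_verified: bool = False

def can_post(user: User) -> bool:
    """A newly registered account may publish only after confirming its email."""
    return user.email_verified

alice = User(handle="alice.bsky.social")
assert not can_post(alice)   # unverified accounts can't publish yet

alice.email_verified = True  # user clicks the confirmation link
assert can_post(alice)
```

A throwaway account created with a temporary inbox never completes the verification step, so it never gains the ability to post.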

How to avoid the problems that have plagued X. Social media platforms themselves aren’t inherently harmful, but they can become problematic when certain behaviors are tolerated or when moderation is scaled back. After acquiring Twitter in 2022, Musk cut around 80% of its workforce, and since October 2023, X has further reduced its content moderation resources by 20%. As of April 2024, the latest available data indicates that X has 1,849 moderators, or one for every 60,200 users. In contrast, Meta employs 15,000 moderators, or one for every 17,600 users.
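Running the cited staffing figures through a quick calculation shows what those ratios imply (the user totals are rough, back-of-the-envelope numbers derived from the article’s figures, not official counts):

```python
# Figures cited above
x_moderators = 1_849
x_users_per_mod = 60_200
meta_moderators = 15_000
meta_users_per_mod = 17_600

# Implied user bases covered by those ratios (rough estimates)
print(f"X: ~{x_moderators * x_users_per_mod / 1e6:.0f}M users covered")      # ~111M
print(f"Meta: ~{meta_moderators * meta_users_per_mod / 1e6:.0f}M users covered")  # ~264M

# Meta's moderation staffing is ~3.4x denser per user than X's
print(round(x_users_per_mod / meta_users_per_mod, 1))  # 3.4
```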

And X has more problems. Its owner takes an absolutist stance on “freedom of speech,” which has watered down features such as blocking, so it no longer effectively shuts out certain users. Additionally, Musk often behaves in ways that contradict his own platform’s terms of service. He’s implemented several anti-bot measures, but these have proven ineffective as well.

Bluesky wants to differentiate itself and avoid becoming “another” X. To achieve this, it provides some interesting tools.

Combating trolls. Bluesky offers a useful tool that allows users to detach their posts from quotes. If someone quotes you to mock you, harm you, or take your words out of context, you can detach the quote, preventing your original post from appearing under theirs. While this doesn’t stop screenshots from being shared, it does provide some level of control.

Another valuable feature is the user list for mass blocking. Anyone can create these lists to group together accounts that discuss specific topics or exhibit certain behaviors. For instance, if you want to avoid interacting with users who spread misinformation about chemtrails, pseudoscience, and flat Earth theories, you can compile all relevant profiles into a single list and mute or block them collectively. Additionally, these lists can be shared publicly, allowing others to benefit from the same filtering.
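The mechanics of a shared block list reduce to simple set logic. This is a conceptual sketch, not Bluesky’s actual data model or AT Protocol API; the handles and list names are invented for illustration:

```python
# Two hypothetical shared lists, each a set of account handles
chemtrail_list = {"troll1.example.bsky.social", "troll2.example.bsky.social"}
flat_earth_list = {"troll2.example.bsky.social", "troll3.example.bsky.social"}

# Subscribing to both lists merges them into one blocked set
blocked = chemtrail_list | flat_earth_list

feed = [
    {"author": "friend.example.bsky.social", "text": "Hello!"},
    {"author": "troll3.example.bsky.social", "text": "The Earth is flat."},
]

# Filtering the feed hides every post from any account on any subscribed list
visible = [post for post in feed if post["author"] not in blocked]
print([p["author"] for p in visible])  # ['friend.example.bsky.social']
```

Because the list is a single shared object, one curator’s additions immediately apply to every subscriber’s feed.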

These features help keep your feed clean and free of harmful accounts. It’s important to note that, unlike on X, blocking restricts all forms of interaction: blocked users can’t like, mention, reply to, or follow you, and their profiles and posts won’t appear in your feed.

Image | Bluesky

Related | How to Stop X (Twitter) From Using Your Posts and Messages to Train Grok, Its Artificial Intelligence Model
