Bluesky is rolling out a fresh wave of moderation updates designed to make the platform safer and more transparent for its growing community. The decentralized network has grown quickly in the past year, and the company says the latest changes are meant to bring more clarity to how it tracks violations and enforces its Community Guidelines.
These updates land in the newest version of the Bluesky app and mark a continued push to set clear expectations for how people communicate on the platform.
The social network describes the changes as the next step in building a healthier environment where users feel comfortable expressing themselves.
The company noted that many people come to Bluesky to connect, debate, discover artists, and even build new relationships. But it also acknowledged that online conversations can sometimes drift into behavior that feels harsher than what would happen in person.
The timing of the update is notable. It follows a recent incident that stirred conversation across the platform. Writer and influencer Sarah Kendzior was suspended after posting a comment that referenced a Johnny Cash lyric. She said she wanted to “shoot the author of this article just to watch him die,” which Bluesky interpreted as a threat of violence.
The phrasing riffed on Cash’s “Folsom Prison Blues” and was meant as hyperbolic commentary on an article she disliked. Even so, moderators read it literally and suspended her account. The moment sparked debate about intent, context, and how far platforms should go when interpreting language.
The new standards appear designed to help avoid similar misunderstandings while still keeping the platform safe. Bluesky is expanding the reporting categories inside the app. Until now, users could choose from six options when flagging a post. The update increases that to nine.
The goal is to help people report harmful content more accurately and to give moderators the context they need to act quickly. Users can now report things like youth harassment, bullying, or content related to eating disorders. There is also an option to flag possible human trafficking, which helps Bluesky comply with new global safety laws such as the United Kingdom’s Online Safety Act.
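To picture how the expanded flow might look in practice, here is a minimal TypeScript sketch of a reporting call. The nine category identifiers and the submitReport helper are hypothetical stand-ins, not Bluesky’s actual API; only youth harassment, bullying, eating disorders, and human trafficking are named in the announcement, and the remaining labels are assumptions.

```typescript
// Hypothetical category identifiers: only the youth-harassment/bullying,
// eating-disorders, and human-trafficking options are named in Bluesky's
// announcement; the rest are illustrative placeholders.
type ReportCategory =
  | "spam"
  | "misleading"
  | "adult-content"
  | "rude-or-harassing"
  | "youth-harassment-or-bullying"
  | "eating-disorders"
  | "human-trafficking"
  | "illegal-content"
  | "other";

interface Report {
  category: ReportCategory;
  subjectUri: string; // the post or account being flagged
  details?: string;   // optional free-text context for moderators
}

// Stand-in for whatever client call actually files the report.
async function submitReport(report: Report): Promise<void> {
  console.log(`Filed "${report.category}" report for ${report.subjectUri}`);
}

// Flagging a post under one of the new categories.
submitReport({
  category: "human-trafficking",
  subjectUri: "at://did:example/app.bsky.feed.post/xyz",
  details: "Post appears to advertise trafficking activity.",
}).catch(console.error);
```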
Alongside this change, Bluesky improved its internal moderation tools. Moderators can now track violations and enforcement actions in a single view. This removes the guesswork and makes it easier for the team to stay consistent when reviewing cases. The company stressed that it is not changing what it enforces. Instead, it is improving the way it documents and handles issues so that users have more confidence in the system.
A major part of the update is the revamped strike system. Every flagged post is now assigned a severity rating. The rating influences the type of enforcement action taken. Content that falls into the “critical risk” category could lead to an immediate permanent ban.
Lower-level issues may get temporary penalties. But repeated violations can push an account toward more serious action over time. This structure creates a clearer path for enforcement and makes expectations easier to understand.
Users involved in an enforcement case will also receive more detailed explanations. When an account gets penalized, the user will be told what guideline was violated, the severity level assigned, how many violations they have accumulated, and how close they are to the next threshold.
They will also see how long a suspension will last and when it will end. Bluesky says this helps reduce confusion and gives people a chance to correct their behavior. It also allows users to appeal actions they believe were taken in error.
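To make those mechanics concrete, here is a short TypeScript sketch of how such a strike system could work. The severity labels, the three-strike threshold, and the seven-day suspension are illustrative assumptions; only the overall shape (severity ratings, accumulated violations, escalation up to a permanent ban, and a detailed notice to the user) reflects what Bluesky has described.

```typescript
// Illustrative model of the strike system described above. Severity names,
// thresholds, and durations are assumptions, not Bluesky's published values.
type Severity = "low" | "moderate" | "high" | "critical";

type Action =
  | { kind: "warning" }
  | { kind: "suspension"; endsAt: Date }
  | { kind: "permanent-ban" };

interface AccountRecord {
  violations: number;    // accumulated strikes
  nextThreshold: number; // strikes remaining before the next escalation
}

const ESCALATION_THRESHOLD = 3; // assumed value for illustration

function enforce(severity: Severity, account: AccountRecord): Action {
  // "Critical risk" content can lead to an immediate permanent ban.
  if (severity === "critical") return { kind: "permanent-ban" };

  account.violations += 1;
  account.nextThreshold = Math.max(0, ESCALATION_THRESHOLD - account.violations);

  // Lower-level issues draw temporary penalties, but repeated violations
  // push an account toward more serious action over time.
  if (severity === "high" || account.violations >= ESCALATION_THRESHOLD) {
    const endsAt = new Date(Date.now() + 7 * 24 * 60 * 60 * 1000); // 7 days, assumed
    return { kind: "suspension", endsAt };
  }
  return { kind: "warning" };
}

// The kind of explanation users now receive: which guideline was violated,
// the severity, the strike count, distance to the next threshold, and
// when any suspension ends.
function notice(guideline: string, severity: Severity, account: AccountRecord, action: Action): string {
  const base =
    `Violation of “${guideline}” (severity: ${severity}). ` +
    `Strikes: ${account.violations}; ${account.nextThreshold} more until the next enforcement tier.`;
  switch (action.kind) {
    case "permanent-ban":
      return `${base} Your account has been permanently banned.`;
    case "suspension":
      return `${base} Your suspension ends ${action.endsAt.toDateString()}.`;
    default:
      return `${base} This is a warning.`;
  }
}
```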
These updates build on the revised Community Guidelines introduced in October. Bluesky has been tightening its policies as the platform expands, and the company says these improvements are part of a long-term plan to create a healthier network. The platform wants to avoid the kind of toxicity that has overtaken older centralized social networks where snark, hostility, and harassment have become normal.
Yet even with these improvements, some Bluesky users are still frustrated. A long-running controversy centers on a figure known for writing divisive commentary about trans issues. Critics argue that the user continues to violate Bluesky’s values, while others say that removing the account would undermine the platform’s commitment to hosting a wide range of voices. The tension flared in October, when CEO Jay Graber responded to complaints in a way many felt was dismissive, and it surfaced again as the new moderation changes were announced.
This friction points to a deeper challenge. Bluesky wants to be seen as a broad, inclusive network rather than a political refuge. Much of its early community came from people leaving Twitter after feeling alienated by the platform’s shift under Elon Musk. Many of these users assumed Bluesky would naturally align with their values. But the company wants to strike a balance between safety, openness, and the decentralized ethos at the heart of its mission.
Bluesky also faces pressure from an increasing number of laws that require social platforms to protect users, especially minors. These regulations come with steep penalties for noncompliance. Earlier this year, Bluesky temporarily blocked access in Mississippi after determining that it did not have the resources to meet the state’s age assurance law. The law could have exposed the company to fines reaching ten thousand dollars per user. That episode highlighted the difficult choices small platforms face when trying to grow while also navigating complex regulatory environments.
The Bluesky moderation changes reflect the company’s attempt to protect its users without losing the spirit of open conversation that helped it grow. The network is still young, and the team appears focused on preventing the community from drifting into the chaos and hostility seen elsewhere.
By improving reporting options, adding clearer enforcement rules, and increasing transparency, Bluesky hopes to create a stronger foundation for the kind of diverse communities it wants to attract.
At the same time, the company knows that trust is not built through policies alone. Moderation decisions shape how people feel about a platform, and even small choices can ripple across the network. That is why Bluesky is trying to show its work, reveal more details about enforcement, and give people tools to appeal.
The coming months will show whether these updates are enough to strengthen confidence among existing users while also making the platform a more welcoming place for new ones.
Bluesky’s rapid growth means the stakes are higher now. The network’s identity is still forming. These moderation shifts suggest the company wants a platform where safety, respect, and transparency guide how people interact, while still leaving room for debate and creativity.
As more users join and regulations evolve, Bluesky will need to continue refining how it moderates content. But for now, the latest update marks one of its biggest steps toward building a healthier and more accountable social space.