Hey everyone, Rabble here. I wanted to take a moment to address the recent Business Insider article about Jack Dorsey’s funding of Nostr. While the article contains inaccuracies and lacks nuance, it’s also an invitation to all of us to discuss Nostr openly.
Is the article true? First things first: we do not actually know fiatjaf’s identity, so we cannot weigh in on claims about who he is. His real identity remains a mystery to us. While we do not share fiatjaf’s beliefs, he has always been clear that Nostr was built to support all types of speech, even speech he personally disagrees with. That’s a fundamental principle of the protocol.
Why is Nos built on a protocol that was built by someone who supports fascists? Let’s clear up a major point of confusion. Merriam-Webster defines fascism as: a political philosophy, movement, or regime (such as that of the Fascisti) that exalts nation and often race above the individual and that stands for a centralized autocratic government headed by a dictatorial leader, severe economic and social regimentation, and forcible suppression of opposition.
By that definition, fascism requires a centralized autocratic government led by a dictatorial leader, enforcing severe economic and social regimentation and suppressing opposition. Nostr, by contrast, is designed to prevent centralized control altogether. It’s a decentralized network where no single entity has ultimate power.
Nostr is designed so there is no central point of control. By distributing content across multiple independent relays, Nostr’s architecture eliminates the possibility of centralized autocratic control.
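To make that concrete, here is a toy sketch of the relay idea. This is not the real Nostr wire protocol (which uses signed JSON events exchanged with relays over WebSockets); the relay names, event shape, and `publish` helper are all made up for illustration. The point is simply that a note lives on many independent stores at once, so no single relay can make it disappear.

```python
# Conceptual sketch only: each relay is modeled as an independent store,
# and a client publishes a copy of the same event to every relay it knows.
# Relay names and the event fields below are illustrative, not real.

relays = {
    "relay.example-a": [],
    "relay.example-b": [],
    "relay.example-c": [],
}

event = {
    "pubkey": "npub-example",   # hypothetical author key
    "created_at": 1700000000,
    "content": "hello nostr",
}

def publish(event, relays):
    """Send a copy of the event to every relay the client knows about."""
    for store in relays.values():
        store.append(event)

publish(event, relays)

# If one relay censors the note or goes offline, the others still have it.
del relays["relay.example-a"]
surviving = [name for name, store in relays.items() if event in store]
print(surviving)  # the note survives on the two remaining relays
```

Because readers can fetch from any relay carrying the note, shutting down or pressuring one operator does not remove the content from the network.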
Nos chose to build an app on the Nostr protocol because traditional corporate social media platforms often stifle views outside the mainstream, including those of leftist activists, the LGBTQ community, and others. This stifling takes the form of shadow banning when people use terms the platform doesn’t want to support, such as “abortion.” More recently there has been an uptick in account suspensions and shadow bans as people use reporting tools to flag accounts that support Gaza. Often the people flagging are digging through older posts to get the accounts shut down.
Nos, on the other hand, is about giving users control over their accounts, identities, and feeds. Nostr makes this possible in a way no other protocol does today. At Nos, we are committed to building an app using tools that put the user in charge, and Nostr enables this user-first approach. It also means that everybody gets to speak, including people whose views we may not like.
How does Nos reconcile being on a network that can’t ban people for their views? Unlike corporate social media such as Facebook, Instagram, and TikTok, Nostr is built on the idea of a web of trust, meaning you only see content from the people you follow. There are no algorithms pushing sensational content to keep you glued to your screen. Corporate platforms thrive on “engagement” and are optimized for eyeballs and time on site, and over the years their algorithms have learned that the most engaging content is content that induces moral outrage in the viewer.
As a result, corporate platforms feed users ever more morally outrageous content to keep them online. Nostr operates on a different principle. Nos and most other Nostr clients do not have algorithm-driven feeds; instead, content from the people you follow appears in reverse chronological order. The clients that do offer algorithmic feeds today surface the most popular content, but they are not optimizing for moral outrage.
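A follow-based, reverse-chronological feed is simple enough to sketch in a few lines. The pubkeys, timestamps, and post contents below are invented for illustration; the only “ranking signals” are whether you follow the author and when the post was created, with no engagement scoring anywhere.

```python
# Toy sketch of a follow-based, reverse-chronological feed.
# All pubkeys, timestamps, and contents are made up for illustration.

follows = {"alice", "bob"}  # the accounts this user has chosen to follow

events = [
    {"pubkey": "alice",   "created_at": 300, "content": "morning note"},
    {"pubkey": "mallory", "created_at": 400, "content": "outrage bait"},
    {"pubkey": "bob",     "created_at": 100, "content": "older post"},
    {"pubkey": "alice",   "created_at": 200, "content": "midday note"},
]

def feed(events, follows):
    """Keep only followed authors, newest first -- no engagement scoring."""
    mine = [e for e in events if e["pubkey"] in follows]
    return sorted(mine, key=lambda e: e["created_at"], reverse=True)

for e in feed(events, follows):
    print(e["created_at"], e["pubkey"], e["content"])
# 300 alice morning note
# 200 alice midday note
# 100 bob older post
```

Note that mallory’s high-outrage post never appears, no matter how “engaging” it might be, because the feed has no mechanism for injecting content from accounts you don’t follow.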
This means it is much more difficult for toxic, hateful content to go viral, because there is no behind-the-scenes mechanism amplifying content for views like the ones on YouTube and X today. You won’t find the same amplification of divisive content here that you see on those traditional platforms.
Nos offers “freedom from” unwanted content at the user level. There is no central authority shutting down or blocking particular accounts.
This is especially important for activists. At present on Mastodon and other ActivityPub servers, we are witnessing pro-Palestinian activists’ accounts being blocked from certain servers. While this happens at a smaller scale, it is still a form of shutting down dialogue and conversation.
I get it: after more than a decade of algorithm-fueled hot takes and virtue signaling on X (formerly Twitter), it might be difficult to conceive of a social media experience where dialogue exists, but the network that has evolved on top of Nostr is that space. Yes, as difficult as it sounds, Nostr allows for dialogue without central censorship.
People disagree on Nostr the way they did in Twitter’s early days, with long, text-based dialogues. Folks may walk away still disagreeing, and a small subset get nasty, but those conversations do not spiral out of control the way they do on X, or even on Mastodon and Bluesky, today.
And if things get ugly, Nos and a few other apps offer user-led moderation tools to help mitigate whatever shows up in your replies or mentions. Nos is leading efforts to enhance user-led moderation across the network.
This discussion is crucial. We have the chance to reshape the future of decentralized social media and to build a more open and inclusive digital space. The path is and will be messy. How do we balance free speech with protecting users from harmful content? What role should decentralization play in the next generation of social media platforms? I’d love to hear your thoughts and keep this conversation going.
-rabble