At Nos.social we believe that both freedom of speech and freedom of listening are required to build healthy social spaces. "Moderation" refers to a set of features that allow users to choose what they consume in those spaces.
Many users are put off by the term “moderation” because of the way it has been implemented on big social platforms. Typically it is performed by the corporation that owns the network in conjunction with the government of the country where the company operates. These two parties hold all the power over who gets to speak and who doesn’t. The rules are often unclear and unequally enforced, and there is little or no transparency or recourse for injustice.
On the other hand, places with little or no moderation tend to become very unhealthy, or comfortable only for a very specific group of people. Creating one space or set of rules where everyone can feel comfortable is not possible, which is why our vision is one where thousands of communities, governed by and for the people, are nested in a larger social media commons.
Moderation in a decentralized network therefore must adhere to the following principles:
Our vision is not one moderation strategy to rule them all, but a collection of models that each user and community can apply as they see fit.
Moderation on Nostr will likely evolve in many directions, from complex webs of trust to more traditional designated moderators policing the health of their own communities. Our short-term plan at Nos is to build the simplest tools we can that allow moderation to happen according to the principles listed above. The core user stories for this effort are:
The two biggest risks to realizing our vision of decentralized content moderation are over-fragmentation of vocabulary and legal liability. Some networks (like Mastodon) have allowed freeform content warnings and content reports, which have put a massive burden on moderators. A classification system informed by real-world experience is necessary to make moderation possible at scale and to enable features like opting in or out of certain types of content. Such a vocabulary saves moderators time and gives users more choice over what they see. Here's an example of a feature that's only possible with this type of vocabulary (from the BlueSky app):
In addition, a functioning moderation system is required for Nostr apps to abide by the Apple App Store and Google Play Store rules, and for relay owners to comply with laws in most countries around the world. If we want Nostr to work for a large part of humanity, it is paramount that we enable people and businesses to comply with these laws and guidelines where they are so inclined.
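To make the shared-vocabulary idea concrete, here is a minimal sketch in TypeScript of how a fixed set of content categories enables client-side filtering, so each user can opt in or out of what they see without anything being deleted from relays. The category names, data shapes, and defaults here are illustrative assumptions, not the finalized vocabulary:

```typescript
// A minimal sketch of vocabulary-driven, client-side filtering.
// Category names and data shapes are assumptions for illustration only.

// A small, fixed set of content categories that reporters and clients agree on.
type ContentCategory = "nudity" | "violence" | "spam" | "impersonation" | "harassment";

// A note as a client might see it after reports/labels have been aggregated.
interface LabeledNote {
  id: string;
  content: string;
  labels: ContentCategory[];
}

// Each user chooses which categories to hide; nothing is removed from the network.
interface UserPreferences {
  hiddenCategories: Set<ContentCategory>;
}

// Filtering is a pure client-side function over the user's own feed.
function visibleNotes(notes: LabeledNote[], prefs: UserPreferences): LabeledNote[] {
  return notes.filter(
    (note) => !note.labels.some((label) => prefs.hiddenCategories.has(label))
  );
}

// Example: a user who opts out of spam and nudity but leaves everything else visible.
const prefs: UserPreferences = {
  hiddenCategories: new Set<ContentCategory>(["spam", "nudity"]),
};
const feed: LabeledNote[] = [
  { id: "a1", content: "gm nostr", labels: [] },
  { id: "b2", content: "buy my coin", labels: ["spam"] },
];
console.log(visibleNotes(feed, prefs).map((n) => n.id)); // ["a1"]
```

Because the categories are a closed set rather than free text, clients, relays, and moderators can all interpret a report the same way, which is what makes features like the opt-out toggles above possible at scale.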
We’re focused on adding the necessary features for basic decentralized moderation to our app Nos, as well as building a micro-app that allows any Nostr user to engage with this moderation system. We’re working with the community to standardize on a reporting format and vocabulary (see NIP-68/69). We’ve been in touch with experts from Trust and Safety teams at Twitter and Facebook to confirm that our system is feasible. We’d love to hear your feedback on our efforts as we continue on our journey of building a global social media commons powered by Nostr.
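For readers curious what a structured report could look like on the wire, here is a rough sketch of one expressed as a Nostr event. The kind number and tag layout follow the general shape of existing Nostr report events (NIP-56 uses kind 1984) and are shown as assumptions; the exact format and vocabulary are what we’re working to standardize:

```typescript
// A hedged sketch of a content report as a Nostr event. The kind number,
// tag layout, and category value are assumptions for illustration, not the
// finalized reporting format.

interface NostrEvent {
  id: string;         // sha256 hash of the serialized event
  pubkey: string;     // reporter's public key (hex)
  created_at: number; // unix timestamp in seconds
  kind: number;
  tags: string[][];
  content: string;
  sig: string;        // signature over the event id
}

// Build an unsigned report that references the offending event and its author,
// with a category drawn from the shared vocabulary rather than free text.
function buildReport(
  reporterPubkey: string,
  reportedEventId: string,
  reportedAuthorPubkey: string,
  category: string // e.g. "spam" -- an assumed vocabulary term
): Omit<NostrEvent, "id" | "sig"> {
  return {
    pubkey: reporterPubkey,
    created_at: Math.floor(Date.now() / 1000),
    kind: 1984, // report kind as defined in NIP-56; shown here as an assumption
    tags: [
      ["e", reportedEventId, category],
      ["p", reportedAuthorPubkey],
    ],
    content: "", // optional free-text note for human moderators
  };
}

// Usage example with placeholder identifiers.
const report = buildReport(
  "<reporter-pubkey-hex>",
  "<reported-event-id-hex>",
  "<reported-author-pubkey-hex>",
  "spam"
);
console.log(JSON.stringify(report, null, 2));
```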