How do we, as a community, balance self-regulation vs mob mentality?
At its core, crypto is nigh impossible to regulate from the outside. The inherent anonymity of the system means the only source of regulation is from within.
Wisdom of the Crowd can be powerful, but Mob Mentality (or as I like to call it, Stupidity of the Crowd) can be extremely dangerous and damaging.
Trust systems built on eigenvector methods within networks (EigenTrust and its relatives) are considered fairly solid, but are still vulnerable to brigading and mobs.
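To make the eigenvector-trust idea concrete, here is a minimal sketch in the spirit of EigenTrust: each node rates its peers, the ratings form a row-normalized matrix, and power iteration extracts a global trust vector. The function name and the tiny three-node matrix are invented for illustration, not taken from any real network.

```python
# Toy eigenvector trust, EigenTrust-style. All names/data are hypothetical.

def global_trust(local_trust, iterations=50):
    """local_trust[i][j] = how much node i trusts node j; each row sums to 1.
    Power iteration finds the stationary vector t satisfying t = C^T t,
    i.e. trust flows along edges weighted by how trusted the rater is."""
    n = len(local_trust)
    t = [1.0 / n] * n  # start from uniform trust
    for _ in range(iterations):
        t = [sum(local_trust[i][j] * t[i] for i in range(n)) for j in range(n)]
    return t

# Node 1 is trusted by both neighbors, so it ends up with the most trust:
# global_trust([[0.5, 0.5, 0.0],
#               [0.25, 0.5, 0.25],
#               [0.0, 0.5, 0.5]])  ->  approximately [0.25, 0.5, 0.25]
```

The weakness the post points at is visible here too: a coordinated mob that all rate each other highly can inflate their own column of the matrix, which is exactly the brigading problem.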
I think for a crypto-backed, decentralized internet to function the way the internet does now (social media and whatnot), we need some form of internal censorship system.
And by that, I mean the ability for our own community to regulate itself, from within, in a reasonable and non-mob way.
So perhaps an example.
Imagine, if you will, a decentralized, crypto-backed YouTube, Reddit, or forum alternative: a means by which content creators can upload and publish content to the chain, and other users can upvote/downvote/thumbs-up/whatever that content to promote it.
Recommendation systems can be designed no differently from existing social media: suggest content similar to what you have already voted on, and so on.
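The "recommend content similar to what you voted on" part needs nothing chain-specific; it can be sketched as plain item-based collaborative filtering over upvote sets. Everything here (the function names, the toy vote data) is hypothetical, just illustrating the mechanic.

```python
# Toy recommender: items whose upvoter sets overlap with what you liked.

def jaccard(a, b):
    """Similarity between two sets of upvoters, from 0.0 to 1.0."""
    return len(a & b) / len(a | b) if (a or b) else 0.0

def recommend(upvoters_by_item, user_liked, top_n=3):
    """Rank items the user hasn't voted on by how much their upvoter sets
    resemble the upvoter sets of items the user already liked."""
    scores = {}
    for item, voters in upvoters_by_item.items():
        if item in user_liked:
            continue
        scores[item] = max(
            jaccard(voters, upvoters_by_item[liked]) for liked in user_liked
        )
    return sorted(scores, key=scores.get, reverse=True)[:top_n]
```

A user who liked item "a" gets shown "b" (shared upvoters) ahead of "c" (none). The hard part, as the post goes on to argue, is not this ranking logic but what happens when the content being ranked should not be on the network at all.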
But this is where a dapp's ability to replicate social media ends.
What happens if people upload illegal/offensive content?
Specifically, how does one tackle the extremely difficult task of letting a community self-regulate this content out, while maintaining decentralization?
One approach I have been batting around is some form of widely shared filter, customizable and shareable, that users can apply. There would be widely popular filters that the community promotes; these could be hosted open source and contributed to, refined, and improved.
These filters would not only act as an injectable/importable means for users to filter content, but would also be used by the host nodes/workers in the network to refuse to relay content they don't want to associate with.
In other words, if enough of the host nodes in the network had a filter on <offensive content>, uploads of that content would have difficulty propagating between nodes and could fail outright.
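The filter-at-the-node idea above can be sketched in a few lines. This is a hypothetical model, not any real dapp's API: filters are shareable blocklists of content hashes, each node subscribes to whichever filters it likes, and a node simply declines to relay anything its filters reject.

```python
# Hypothetical sketch of node-side filter enforcement in a content network.

class Filter:
    """A shareable blocklist. Here it is just a set of content hashes; a
    real community filter could carry rules, regexes, or classifiers."""
    def __init__(self, name, blocked_hashes):
        self.name = name
        self.blocked = set(blocked_hashes)

    def rejects(self, content_hash):
        return content_hash in self.blocked

class Node:
    """A host node that has subscribed to zero or more community filters."""
    def __init__(self, filters):
        self.filters = list(filters)

    def will_relay(self, content_hash):
        # A node refuses to propagate anything any of its filters rejects.
        return not any(f.rejects(content_hash) for f in self.filters)

def propagation_fraction(nodes, content_hash):
    """Fraction of the network willing to relay a given piece of content.
    When a popular filter is widely adopted, flagged uploads starve."""
    willing = sum(node.will_relay(content_hash) for node in nodes)
    return willing / len(nodes)
```

For example, if 8 of 10 nodes adopt a filter blocking a given hash, only 20% of the network will relay it, so the upload struggles to spread; unflagged content still propagates through 100% of nodes. No single party does the censoring, which is the property the post is after.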
I think any social media dapp that is currently dipping its toes into getting off the ground needs to take this complex problem extremely seriously. It is not easy to solve, and I doubt there is any silver-bullet solution.
If a social media style dapp doesn't pay attention to this, I think it's only a matter of time before it finds itself either:
A: Inundated with individuals using it as an anonymous vector to share <offensive content>, because it's the only place they can (look no further than Tor's reputation to see where that gets you).
B: Having to sacrifice some of the dapp's decentralization and partially centralize in order to implement some form of ban/removal/deletion/block/filter/censor policy, which means no longer truly being a dapp.
I'm curious to hear the community's thoughts on this puzzle. Has anyone else here spent time pondering the implications of a future unregulated, unregulatable internet?
submitted by /u/lionhart280