Decentralized Web of Trust Moderation would provide transparent, uncensorable, automated curation competitive with the corpos' algorithms.

I don't think decentralized WOTM can ever be fully p2p. The application needs to be able to decide reachability in microseconds over large graphs. That's computationally intensive; it would probably have to involve outsourcing computation to specialists.
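To make the "reachability in microseconds" claim concrete, here's a minimal sketch of the kind of query a WoT moderation layer would run constantly: is this account vouched for within a few hops of me? All names and the graph shape are hypothetical; a real deployment would face graphs far too large for this naive in-memory approach, which is the point of the argument.

```python
from collections import deque

# Hypothetical trust graph: each node maps to the set of nodes
# it directly vouches for.
trust = {
    "alice": {"bob", "carol"},
    "bob": {"dave"},
    "carol": {"dave", "eve"},
    "dave": set(),
    "eve": set(),
}

def reachable_within(graph, source, target, max_hops):
    """Breadth-first search: is `target` vouched for within
    `max_hops` of `source`?"""
    frontier = deque([(source, 0)])
    seen = {source}
    while frontier:
        node, depth = frontier.popleft()
        if node == target:
            return True
        if depth == max_hops:
            continue  # don't expand past the hop budget
        for neighbour in graph.get(node, ()):
            if neighbour not in seen:
                seen.add(neighbour)
                frontier.append((neighbour, depth + 1))
    return False

print(reachable_within(trust, "alice", "eve", 2))  # True, via carol
print(reachable_within(trust, "alice", "eve", 1))  # False
```

Each query is cheap on a toy graph, but answering millions of them per second over a billion-node graph is what pushes the work toward dedicated servers.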

If you're squeamish about agoric computing, get some gloves. It is going to be needed.


It is unimaginably difficult to outsource computational work to specialists without paying the servers.

Some see the necessity of payment as a barrier to access. This needn't be true. The costs of running a server, per user, are always low. Almost everyone can afford to cover the costs of the servers they use; for the ones who can't, WoTs provide a decent way of figuring out who's a real person and covering their costs through some public funding or donation process.

@faun obviously i disagree on the p2p-part, but i'm happy to see you write more on the subject. more of this, faun! :tealheart:

@cblgh I'll try to write this up properly tomorrow, because I think this is an important question with interesting points on both sides. But here's one of the reasons that occurs to me, just trying to remember why I think this:

@cblgh It's important for there to be some very, very large graphs: for instance, the `isn't a spammer` web, or the `unique human` web. Those ought to encompass pretty much everyone in the world, and you should want to be able to include fairly distant people in your own queries over them. You wouldn't want to try to store and run that on your own computer.

@faun i disagree on this importance though, so that's probably where we differ. particularly the "unique human web" angle: legibility of that kind falls outside my interests at the same time as it enters the interests of states and profit-extracting entities (spammers would love having access to a unique human web, for instance)

@faun i think i understand what you mean though and why you think it would be important. my understanding of that, in brief:

in global singleton networks you would want to filter out the spammers, and that would be a benefit for everyone (nobody likes unsolicited wünderpill salesmen). similarly valuable for such networks is being guaranteed to have engagements with verified authentic people.

am i off by much with the above re why you consider large graphs important?

@cblgh That's definitely part of it; it's important for making it possible for most people to message strangers. Otherwise most places would end up being pretty exclusive, and lots of people just wouldn't be able to talk.

@cblgh Another example of a necessarily large graph would be any taste web, e.g. `good music`. But in that case queries would tend to be limited to fairly near nodes, which makes the graph easier to divide up.

@cblgh It also enters the interests of anyone trying to make a UBI, or other decentralized democratic systems. I think listening to Charles Hoskinson has made me really believe that the Cardano Treasury or something like it could just step up and replace legacy states.

I'm pretty sure Proof of Work (and most instances of Proof of Stake) would never have had any utility if we'd had a decentralized means of identifying individual humans; they were pretty much only ever proxies for that.

@cblgh I notice that to get to a billion humans... there's no way that's going to fit on one machine, so I'm going to need to find something for splitting the workload anyway. So isn't it conceivable that you could just take that sharding solution and turn up n until it's p2p?
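The sharding idea above can be sketched as hash-partitioning the graph's nodes across n workers, where each worker holds only the adjacency lists for the node ids that land in its shard. This is a hypothetical illustration, not a proposed design; the shard count and the edge list are made up.

```python
import hashlib

# Turn this up toward the number of participants and the scheme
# starts to look like p2p, which is the question being raised.
N_SHARDS = 4

def shard_of(node_id: str, n_shards: int = N_SHARDS) -> int:
    """Deterministically assign a node id to one of n shards."""
    digest = hashlib.sha256(node_id.encode()).digest()
    return int.from_bytes(digest[:8], "big") % n_shards

# Partition an example edge list by the shard that owns each
# edge's source node; a distributed query fans out to whichever
# shard owns the node it needs to expand next.
edges = [("alice", "bob"), ("bob", "carol"), ("carol", "alice")]
shards = {i: [] for i in range(N_SHARDS)}
for src, dst in edges:
    shards[shard_of(src)].append((src, dst))
```

The open question the thread raises is exactly the gap between n = a few specialist servers and n = every participant's machine: hop-by-hop queries across shards add a network round trip per expansion, which is where the microsecond budget gets spent.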
