A few times in your life, you meet someone who gets the whole picture. Nikolai Mushegian was one of those people. Below is a distillation of some of his principles of design, ethics, and liberatory technology; the kind that make you ask "What would Nikolai do?", or "How was this not obvious to me before?"
Keep it dumb. Create systems that are simple and have as few dependencies as possible. [1]
Less governance, more automation. The systems you build shouldn't be middleware connecting people to your company. Create systems that automate as much of the work as possible and don't rely on human input. [2]
Ossification. Iterate on rules until you arrive at a good-enough solution, then lock the system down such that it's very hard or impossible to change. This is what makes a system reliable over the long term. Use Kelvin versioning, a quantifiable commitment to ossification. [1] [3]
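A minimal sketch of what a Kelvin-versioned release process might look like: version numbers only count down toward absolute zero, and at 0K the rules are frozen for good. The class and names below are illustrative, not taken from any of Nikolai's projects.

```python
class KelvinVersion:
    """A version counter that can only decrease, never increase."""

    def __init__(self, kelvin: int):
        if kelvin < 0:
            raise ValueError("Kelvin versions cannot go below absolute zero")
        self.kelvin = kelvin

    def release(self, new_kelvin: int) -> "KelvinVersion":
        """Cut a new release; every release must move closer to 0K."""
        if self.kelvin == 0:
            raise RuntimeError("Frozen at 0K; no further changes are allowed")
        if not (0 <= new_kelvin < self.kelvin):
            raise ValueError("A new version must be lower than the current one")
        return KelvinVersion(new_kelvin)


v = KelvinVersion(125)   # e.g. 125K
v = v.release(100)       # fine: moving toward ossification
# v = v.release(130)     # raises: versions can never go back up
```

Each release spends part of a finite budget of remaining changes, which is what makes the commitment to ossification quantifiable rather than a vague promise.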
Gradual decentralization is a myth. DNA matters. The more centralized parts a system has at launch, the harder it will be to give that power away later. Start with as many fixed rules as possible. [4]
No rent-seeking. Granting yourself system-wide privileges just because you started a system is not optimal for you, and it's destructive to others. [5]
Circular flows of value. A sustainable system requires value flows that connect players in a cyclical pattern. [6]
Credible neutrality is an advantage. Creating dumb, ossified systems is hard: they are less efficient than flexible ones and less profitable in the short term. Over the long term, however, they outcompete and outlast systems that are easy to change. [1]
Show systems as they are. Don't hide any detail about a system. Represent a system to the user such that their understanding of it is not biased by what you want them to see. [4]
Root systems in sound foundations. A resilient system must be rooted in something that can resist heavy corruption. Ensure that all levels of the stack are sound, down to the very bottom. [7]
Incentive-compatibility. Incentives are like laws that govern the behavior of players in a system. If a system design doesn't properly reflect the incentives of each player, some will inevitably exploit others. [4]
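As a toy illustration (not from the source), the second-price auction is a textbook incentive-compatible mechanism: bidding your true value is a dominant strategy, so the rules and the players' incentives point the same way instead of inviting exploitation. The helper names below are hypothetical.

```python
def second_price_auction(bids: dict[str, float]) -> tuple[str, float]:
    """Highest bidder wins but pays only the second-highest bid."""
    ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
    winner = ranked[0][0]
    price = ranked[1][1] if len(ranked) > 1 else 0.0
    return winner, price


def payoff(true_value: float, my_bid: float, others: dict[str, float]) -> float:
    """Surplus for a bidder whose true value is true_value and who bids my_bid."""
    winner, price = second_price_auction({"me": my_bid, **others})
    return true_value - price if winner == "me" else 0.0


others = {"a": 70.0, "b": 40.0}
# Bidding below your true value can only cost you the win and its surplus...
assert payoff(100.0, 100.0, others) > payoff(100.0, 60.0, others)
# ...and bidding above it risks winning at a price you don't actually want to pay.
assert payoff(50.0, 50.0, others) > payoff(50.0, 80.0, others)
```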
What can go wrong, will go wrong. Every crack in a system that you try to hide instead of fixing is guaranteed to be found and exploited. [2] [8]
"The biggest obstacles are the chains we put on each other." Decentralized systems face powerful adversaries, but the main bottleneck is the weaknesses and ignorance in each of us, and how we treat each other. [5] [9] [10]
125K @ 2023-10-29