As a human living in a human civilization, you might be tempted to think that social reality mostly makes sense. Everyone allegedly knows that institutions are flawed and that our leaders are merely flawed humans. Everyone wants to think that they're sufficiently edgy and cynical, that they've seen through the official lies to the true, gritty reality.
But what if ... what if almost no one is edgy and cynical enough? Like, the only reason you think there's a true, gritty reality out there to see through to is that you're a predatory animal with a brain designed by evolution to murder other forms of life for the benefit of you, your family, and your friends.
To the extent that we have this glorious technological civilization that keeps most of us mostly safe and mostly happy most of the time, it's mostly because occasionally, one of the predatory animals happens to try out a behavior that happens to be useful, and then all of her friends copy it, and then all of the animals have the behavior.
Some conceited assholes who think they're smart also like to talk about things that they think make the last five hundred years or whatever different: things like science (a social competition that incentivizes the animals to try to mirror the process of Bayesian updating), markets (a pattern of incentives that mirrors the Bayes-structure of microeconomic theory), or democracy (a corporate governance structure that mirrors the Bayes-structure of counterfactual civil war amongst equals).
These causal processes are useful and we should continue to cooperate with them. They sort of work. But they don't work very well. We're mostly still animals organized into interlocking control systems that suppress variance.
Thus—
School Is Not About Learning
Politics Is Not About Policy
Effective Altruism Doesn't Work; Try to Master Unadulterated Effectiveness First
Ideology Makes You Stupid
Status Makes You Stupid
Institutions Don't Work
Discourse Doesn't Work
Language Doesn't Work
No One Knows Anything
No One Has Ever Known Anything
Don't Read the Comments
Never Read the Comments
∀x ∀y, x Is Not About y
X Has Never Been About Y
Enjoy Arby's
But this is crazy. Suppressing variance feels like a good idea because variance is scary (because it means very bad things could happen as well as very good things, and bad things are scarier than good things are fun) and we want to be safe. But like, the way to actually make yourself safer is by acquiring optimization power, and then spending some of the power on safety measures! And the way you acquire optimization power is by increasing variance and then rewarding the successes!
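Here's a minimal toy simulation of that claim in Python (the population size, noise level, and fitness model are all invented for illustration): a population that suppresses variance stays exactly where it started, while one that tries perturbed behaviors and copies the best performer climbs every generation.

```python
import random

def run(generations=50, pop_size=100, noise=0.0):
    """Evolve a population of copied strategies, scored by scalar fitness.

    Each generation, every individual tries a perturbed version of its
    strategy (variance controlled by `noise`), and then everyone copies
    the best performer (rewarding the successes).
    """
    pop = [1.0] * pop_size
    for _ in range(generations):
        trials = [x + random.gauss(0, noise) for x in pop]  # try out behaviors
        best = max(trials)        # reward the success...
        pop = [best] * pop_size   # ...by having all the animals copy it
    return pop[0]

random.seed(0)
print("variance suppressed:", run(noise=0.0))  # stays at 1.0 forever
print("variance rewarded:  ", run(noise=0.1))  # climbs generation by generation
```

The `max` is doing all the work here: variance alone just adds noise; it's variance plus rewarding the successes that buys you optimization power.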
Anyway, maybe someone should be looking for social technologies that mirror the Bayes-structure of the universe sort of like how science, markets, or democracy do, but which also take into account that we're not anything remotely like agents and are instead animals that want to help our friends. ("We need game theory for monkeys and game theory for rocks.")
So, I had an idea. You know how some people say we should fund the solutions to problems with after-the-fact prizes, rather than picking a team in advance that we think might solve the problem and funding them? What if ... you did something like that, but on a much smaller scale? A personal scale.
Like, suppose you've just successfully navigated a major personal life crisis that could have gone much worse if it weren't for some of the people in your life (both thanks to direct help they provided during the crisis, and things you learned from them that made you the sort of person that could navigate the crisis successfully). These people don't and shouldn't expect a reward (that's what friends are for) ... but maybe you could reward them anyway (with a special emphasis on people who helped you in low-status ways that you didn't understand at the time) in some sort of public ritual, to make them more powerful and incentivize others to emulate them, thereby increasing the measure of algorithms that result in humans successfully navigating major personal life crises.
It might look something like this—
- If you have some spare money lying around, set aside some of it for rewarding the people you want to reward. If you don't have any spare money lying around, this ritual will be less effective! Maybe you should fix that!
- Decide how much of the money you want to use to reward each of the people you want to reward. (Note: giving away something as powerful as money carries risks of breeding dependence and resentment if such gifts come to be expected! If people know that you've been going through a crisis and anyone so much as hints that they think they deserve an award, that person is missing the point and therefore does not deserve an award.)
- Privately go to each of the people, explain all this, and give them the amount of money you decided to give them. Make it very clear that this is a special unilateral one-time award made for decision-theoretic reasons and that it's very important that they accept it in the service of your mutual coherent extrapolated volition in accordance with the Bayes-structure of the universe. Refuse to accept words of thanks (it's not about you; it's not about me; it's about credit-assignment). If they try to refuse the money, explain that you will literally burn that much money in paper currency if they don't take it. (Shredding instead of burning is also acceptable.)
- Ask if they'd like to be publicly named and praised as having received an award as part of the credit-assignment ritual. (Remember that it's quite possible and understandable and good that they might want to accept the money, but not be publicly praised by you. After all, if you're the sort of person who is considering actually doing this, you're probably kind of weird! Maybe people don't want to be associated with you!)
- To complete the ritual, publish a blog post naming the people and the awards they received. People who preferred not to be named should be credited as Anonymous Friend A, B, C, &c. Also list the amount of money you burned or shredded if anyone foolishly rejected their award in defiance of the Bayes-structure of the universe. Do not explain the nature of the crisis or how the named people helped you. (You might want to tell the story in a different post, but that's not part of the ritual, which is about credit-assignment.)
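For the programmers in the audience, here's a minimal Python sketch of the ritual's bookkeeping (the `Award` structure and all the names and amounts are invented for illustration; nothing about the ritual requires a computer):

```python
from dataclasses import dataclass

@dataclass
class Award:
    name: str       # who helped you
    amount: int     # dollars you decided they should receive
    accepted: bool  # did they take the money? (if not, it gets burned/shredded)
    public: bool    # did they consent to being named in the post?

def ritual_post(awards: list[Award]) -> str:
    """Render the blog post that completes the credit-assignment ritual."""
    lines = []
    anonymous_labels = iter("ABCDEFGHIJ")  # Anonymous Friend A, B, C, &c.
    burned = 0
    for award in awards:
        if not award.accepted:
            burned += award.amount  # rejected in defiance of the Bayes-structure
            continue
        label = award.name if award.public else f"Anonymous Friend {next(anonymous_labels)}"
        lines.append(f"{label}: ${award.amount}")
    if burned:
        lines.append(f"Burned (or shredded): ${burned}")
    return "\n".join(lines)

print(ritual_post([
    Award("Alice", 500, accepted=True, public=True),
    Award("Bob", 300, accepted=True, public=False),
    Award("Carol", 200, accepted=False, public=True),
]))
```

(Note that the anonymous counter only advances for friends who actually accepted; rejected awards just add to the burn pile.)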
This has the potential to go very badly, because this is exactly the situation humans have spent a long time evolving to deal with (evolving to exploit and counter-exploit). The traditional currency for repaying significant assistance is unspecified future favors, assistance, and good treatment. (A person expresses gratitude to indicate that they will reward their benefactor later.)
People are highly tuned to the expected reward of helping someone. This is known as sympathy.
(Someone is worth more sympathy if few people are already helping them, if they will improve significantly with your help, and if they will be able to repay you once you have helped them.)
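As a toy formula (the weights and the multiplicative form are invented; this is just a gloss on the three factors above, not a real psychological model):

```python
def sympathy(helpers_already: int, marginal_improvement: float,
             repayment_probability: float) -> float:
    """Toy expected-reward gloss on sympathy: help is worth more when the
    need is neglected, your help matters, and the helped can repay you."""
    neglectedness = 1.0 / (1 + helpers_already)
    return neglectedness * marginal_improvement * repayment_probability

# A neglected friend you can really help, who'll remember it:
print(sympathy(helpers_already=0, marginal_improvement=0.9, repayment_probability=0.8))  # 0.72
# A well-supported acquaintance you can barely help:
print(sympathy(helpers_already=5, marginal_improvement=0.2, repayment_probability=0.3))  # 0.01
```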
I don't think there is anything actually wrong with people expecting a reward for helping someone. We have social taboos against expressing it, because someone demanding a reward is probably trying to cheat their way to a higher one.
Inconsistent rewards look a lot like someone cheating the exchange, and people will treat them as such.
Publicly declaring that you have rewarded someone looks like you are attempting to make your gift seem larger than comparable ones from other people, and it will be treated as a minor attempt to cheat. (It would receive the same penalty as bragging.)
Publicly declaring that you are grateful to someone, however, will cause people who have good feelings towards you to reward that person on your behalf with good feelings and good treatment.
In a nutshell, the traditional system does a better job of rewarding people for helping, with better guards against cheating.
I agree with anders that the traditional system is probably a better idea for rewards.
If your helpful friends happen to have a financial need, a personal, private 'helping hand' seems more appropriate!