Though sometimes the obligation to answer is right, right? I guess maybe it’s that the obligation works well at some scale, but then becomes bad at some larger scale. In a conversation it’s fine; in a public debate, it sometimes seems to me that it doesn’t work.
Nathan Young
I think the motivating instances are largely:
Online debates are bad
Freedom of Information requests suck
I think I probably backfilled from there.
I do sometimes get persistent questions on Twitter, but I don’t think there is a single strong example.
Sadly you are the second person to correct me on this — @Paul Crowley was first. Oops.
The solution is not to prevent the questions, but to remove the obligation to generate an expensive answer.
Good suggestion.
Thank you, this is the kind of thing I was hoping to find.
What changes do you think the polyamory community has made?
I find this a very suspect detail, though the base rate of conspiracies is very low.
“He wasn’t concerned about safety because I asked him,” Jennifer said. “I said, ‘Aren’t you scared?’ And he said, ‘No, I ain’t scared, but if anything happens to me, it’s not suicide.’”
To be more explicit about my model, I see communities as a bit like people. And sometimes people do the hard work of changing (especially as they have incentives to) but sometimes they ignore it or blame someone else.
Similarly, communities often scapegoat something or someone, or give vague general advice.
Sure sounds good. Can you crosspost to the EA forum? Also I think Nicky’s pronouns are they/them.
It seems underrated for LessWrong to have cached high quality answers to questions like this. Also stuff on exercise, nutrition, parenting and schooling. That we don’t really have a clear set seems to point towards this being difficult or us being less competent than we’d like.
Nevertheless lots of people were hassled. That has real costs, both to them and to you.
If that were true then there are many ways you could partially do that — e.g. give people a set of tokens representing their mana at the time of the devaluation, and if at a future point you raise more money, you could give them 10x those tokens back.
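A minimal sketch of that compensation-token idea (all names, rates, and the 10x multiplier here are hypothetical, chosen only to illustrate the mechanics — the comment doesn’t specify an implementation):

```python
# Hypothetical sketch: snapshot each user's mana at devaluation time as
# "tokens", then if money is later raised, redeem tokens at a multiple
# that undoes the devaluation.

def issue_tokens(balances: dict) -> dict:
    """One token per mana held at the moment of devaluation."""
    return dict(balances)

def redeem(tokens: dict, multiplier: int = 10) -> dict:
    """If funds are later raised, pay back multiplier x tokens as mana."""
    return {user: t * multiplier for user, t in tokens.items()}

# Example: a 10x devaluation (say 100 -> 1000 mana per dollar) is undone
# by a 10x redemption of the snapshot tokens.
balances = {"alice": 5000, "bob": 1200}
tokens = issue_tokens(balances)
restored = redeem(tokens)  # alice: 50000, bob: 12000
```

The point of the sketch is just that the snapshot and the later redemption are separable steps, so the platform could commit to the first now and the second conditionally.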
I’m discussing this with Carson. I might change my mind, but I don’t know that I’ll argue with both of you at once.
Austin said they have $1.5 million in the bank, vs $1.2 million of mana issued. The only outflows right now are to the charity programme, which even with a lot of outflows is only at $200k. They also recently raised at a $40 million valuation. I am confused by the claim that they are running out of money. They have a large user base that wants to bet and will do so at larger amounts if given the opportunity. I’m not convinced there is some tiny timeline here.
But if there is, then say so: “We know that we often talked about mana eventually being worth $1 per 100 mana, but we printed too much and we’re sorry. Here are some reasons we won’t devalue in the future…”
Austin took his salary in mana, which was often referred to as an incentive for him to want mana to become valuable, presumably at that rate.
I recall comments like “we pay 250 mana in referral bonuses per user because we reckon we’d pay about $2.50”, and likewise at the in-person mana auction. I’m not saying it was an explicit contract, but there were norms.
From https://manifoldmarkets.notion.site/Charitable-donation-program-668d55f4ded147cf8cf1282a007fb005
“That being said, we will do everything we can to communicate to our users what our plans are for the future and work with anyone who has participated in our platform with the expectation of being able to donate mana earnings.”
“Everything we can” is not a couple of weeks’ notice and a lot of hassle. Am I supposed to trust this organisation with my real money in future?
Well, they have received much larger grants than they have spent, so there were ways to avoid this abrupt change:
“Manifold for Good has received grants totaling $500k from the Center for Effective Altruism (via the FTX Future Fund) to support our charitable endeavors.”
Manifold has donated $200k so far, so there is $300k left. Why not at least say, “we will change the rate at which mana can be donated when we burn through this money”? (Via https://manifoldmarkets.notion.site/Charitable-donation-program-668d55f4ded147cf8cf1282a007fb005 )
Carson:
People don’t seem to understand that Manifold could literally not exist in a year or two if they don’t find product–market fit
Carson’s response:
There was no implicit contract that 100 mana was worth $1 IMO. This was explicitly not the case given CFTC restrictions?
I’ve compiled a big set of expert opinions on AI, along with the percentages I’ve inferred from them. I expect some people will disagree with them.
I’d appreciate hearing your criticisms so I can improve the entries or fill in ones I’m missing.
https://docs.google.com/spreadsheets/d/1HH1cpD48BqNUA1TYB2KYamJwxluwiAEG24wGM2yoLJw/edit?usp=sharing