No, I definitely meant size, not need (or effectiveness or quality of goals or anything else). A larger charity can mount more effective campaigns than a smaller one. This is from the Iron Law of Institutions perspective, in which charities are blobs for sucking in money from a more or less undifferentiated pool of donations. An oversimplification, but not too much of one, I fear—there’s a reason charity is a sector in employment terms.
It is necessary at all times to distinguish whether we are talking about humans or rational agents, I think.
If you expect that larger organizations mount more effective marketing campaigns and do not attend to their own diminishing marginal utility, and that most people don’t attend to the diminishing marginal utility either, then you should look for maximum philanthropic return among smaller organizations doing very important, almost entirely neglected things that they have trouble marketing. You should not necessarily split your donation up among those smaller organizations, except insofar as, being a human, you can donate more total money if you split up your donations to get more glow.

Marketing campaign? What’s a marketing campaign?
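A toy numeric sketch of the splitting argument above, with entirely invented organizations and per-dollar returns (the names and figures are assumptions, not anything from the thread): if marginal returns are roughly flat at the scale of an individual donation, concentrating the whole budget in the highest-return organization dominates splitting it.

```python
# Toy comparison of concentrating vs. splitting a fixed donation budget.
# The per-dollar returns below are invented for illustration; the point is
# only that, if marginal returns are roughly flat at an individual donor's
# scale, splitting moves money from the best option to worse ones.

BUDGET = 1000.0  # dollars to give away

# Hypothetical constant marginal philanthropic return, in "units of good"
# per dollar, for each organization at this donation scale.
returns = {
    "small_neglected_org": 10.0,
    "another_small_org": 7.0,
    "large_well_marketed_org": 4.0,
}

def total_good(allocation):
    """Sum of good done, given dollars allocated to each organization."""
    return sum(returns[org] * dollars for org, dollars in allocation.items())

concentrated = {"small_neglected_org": BUDGET}
split_evenly = {org: BUDGET / len(returns) for org in returns}

print("concentrated:", total_good(concentrated))  # 10000.0
print("split evenly:", total_good(split_evenly))  # 7000.0
```

The one caveat the comment itself grants is the human one: if splitting makes you give a larger total budget because of the extra glow, that larger budget can outweigh the per-dollar difference.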
Voted up because swapping those tags around is funny.
Rational agents are not necessarily omniscient agents. There are cases where providing information to the market is a practical course of action.
Can’t rational agents then mostly discount your information due to publication bias? In any case where providing information is not to your benefit, you would not provide it.
Discount, but not discard. Others have their own agendas, and if theirs were directly opposed to mine, such that all our interactions were zero-sum, then I would ignore their communication. But in most cases there is some overlap in goals, or at least compatibility. In such cases communication can be useful, particularly when the information is verifiable. There will be publication bias, but that is a bias, not a completely invalidated signal.
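A minimal Bayesian sketch of "discount but not discard", with made-up numbers and the simplifying assumption that the receiver knows how many tests were run: a source that publishes only its favorable results still shifts a rational receiver's belief, just by less than a naive reading of the published results would.

```python
from math import comb

# Toy model of "discount but not discard" under publication bias.
# All numbers are invented for illustration.
#
# A source runs n independent tests of hypothesis H and publishes only the
# positive results. Each test comes out positive with probability
# p_pos_if_true when H holds and p_pos_if_false when it does not.

prior = 0.5            # receiver's prior P(H)
p_pos_if_true = 0.8
p_pos_if_false = 0.4
n_tests_run = 5        # assumed known to the receiver (a simplification)
k_published = 4        # positive results the source actually publishes

def posterior(likelihood_true, likelihood_false, prior):
    """Bayes' rule for a binary hypothesis."""
    num = likelihood_true * prior
    return num / (num + likelihood_false * (1 - prior))

# Naive receiver: treats the k published positives as if they were the
# entire body of evidence, ignoring the filtering.
naive = posterior(p_pos_if_true ** k_published,
                  p_pos_if_false ** k_published,
                  prior)

# Discounting receiver: knows n tests were run and that exactly k of them
# came out positive (the rest were silently dropped).
def binom_lik(p):
    return comb(n_tests_run, k_published) * p ** k_published * (1 - p) ** (n_tests_run - k_published)

discounted = posterior(binom_lik(p_pos_if_true), binom_lik(p_pos_if_false), prior)

print(f"prior             P(H) = {prior:.2f}")
print(f"naive update      P(H) = {naive:.2f}")       # ~0.94: over-trusts the filtered report
print(f"discounted update P(H) = {discounted:.2f}")  # ~0.84: biased signal, still informative
```

Setting k_published to 0 in the same sketch turns the source's silence into evidence against H, which is the point of the next comment: the nonprovision of the information is itself information.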
In which case the nonprovision of that info is also information.
But it wouldn’t at all resemble marketing as we know it, either way.
Although I now will treat all marketing as a specific instantiation of the clever arguer.
To amplify Eliezer’s response: What Evidence Filtered Evidence? and comments thereon.
A mechanism for making evidence that supports certain conclusions more readily available to agents whose increased confidence in those conclusions benefits me.