Alice has the option of finding a generally trusted arbiter, Carol, who she tells the plan to; then, Carol can tell the public how realistic the plan is.
Do we have those generally trusted arbiters? I note that it seems like many people who I think of as ‘generally trusted’ are trusted because of some ‘private information’, even if it’s just something like “I’ve talked to Carol and get the sense that she’s sensible.”
I don’t think there are fully general trusted arbiters, but it’s possible to bridge the gap with person X by finding person Y trusted by both you and X.
I think sufficiently universally trusted arbiters may be very hard to find, but Alice can also decline that option to keep the issue from gaining more public attention, if she believes that more attention, or attention from particular groups, would be harmful. I can imagine cases where more credible people (Carols) saying they are convinced that, e.g., "it is really easily doable" would disproportionately create more incentive for misuse than for defense, depending on which groups the information reaches, which reliability signals those groups accept, and so on.