UFO Betting: Put Up or Shut Up
Are you confident in your current ontology? Are you convinced that ultimately all UFOs are prosaic in nature?
If so, do you want some immediate free money?
I suspect that LWers are overconfident in their views on UFOs/UAP. As such, I'm willing to offer what I think many will find to be very appealing terms for a bet.
The Bet
Essentially, I wish to bet on the world and rationalists eventually experiencing significant ontological shock as it relates to the nature of some UFOs/UAP.
Offer me odds for a bet, and the maximum payout you are willing to commit to. I will pick one or more offers from the pool and immediately pay out to you. In the event that I ultimately win the bet, you will then pay out to me.
I'm looking to give out between $5k and $10k, but that depends on what kinds of offers I get; it could be more or less.
The Terms
I send you $X immediately; you pay out Odds*X if I win.
E.g., you offer 200:1 odds with a $20,000 max payout, and I send you $100 immediately (see the sketch below).
5-year time horizon starting from the date we confirm our bet.
You offer the odds and maximum payout; I will pick from the available offers to maximize my expected returns, subject to my financial constraints.
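For concreteness, a minimal sketch of this arithmetic in Python (the 3% credence in the last line is purely hypothetical, used only to show how an offer's expected value would be computed):

```python
# Minimal sketch of the bet structure described above (illustrative only).

def immediate_stake(odds: float, max_payout: float) -> float:
    """Amount the bettor sends up front to a counterparty offering
    `odds`:1 with a maximum payout of `max_payout`."""
    return max_payout / odds

def bettor_expected_profit(odds: float, max_payout: float, p_win: float) -> float:
    """The bettor's expected profit: the stake is always paid out now,
    and the max payout is collected only with probability `p_win`."""
    return p_win * max_payout - immediate_stake(odds, max_payout)

# Example from the post: 200:1 odds with a $20,000 max payout -> $100 sent now.
print(immediate_stake(200, 20_000))               # 100.0

# Hypothetical: if the bettor's P(win) were 3%, that offer would be worth
# about $500 to them in expectation.
print(bettor_expected_profit(200, 20_000, 0.03))  # 500.0
```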
Resolution Criteria
Two worlds: all-UFOs-are-ultimately-prosaic, and not-all-UFOs-are-ultimately-prosaic. I win the bet if we come to believe we likely live in the latter world. I win the bet if the UFO story ultimately gives us LWers a significant ontological shock. I win the bet if the UFO story ultimately causes the LW community to stop, melt, and catch fire. I've found it difficult to precisely nail down how to phrase this, so I hope it's clear what kind of criteria I'm trying to get at.
Examples of things where, if we come to believe at least one of them likely explains >0 UFO/UAP cases, then I win the bet:
Aliens / Extraterrestrials
Biological
Machines (Von Neumann probes, for instance)
Actual magic/spiritual/paranormal/psychic phenomena
This explicitly does NOT include merely advanced “mentalist” type things / show magic
I.e., things like ESP, astral projection, demons, god(s), angels, ghosts, remote viewing, fairies (actually anomalous, not just a new kind of bird), etc.
Basically, the kinds of things that standard atheist materialists would reject as not being real.
Time travel
I.e., future human activities (or otherwise)
Leftovers of an ancient civilization
Some other unknown non-human advanced civilization on earth
Matrix Glitches / The simulators have a sense of humor
Some other explanation I’m missing that’s of a similar level of “very weird”
Merely advanced "normal" human tech would NOT count (e.g. stealth aircraft/drones two generations ahead, advanced holograms/spoofing, etc.)
What WOULD count is if the story is significantly weird enough to cause ontological shock.
Example: a secret Manhattan-style project with beyond-next-gen physics that we had back in the '60s
Important Note: The bet resolves in my favor if we think that one of the "weird hypotheses" is likely (>50%) true, NOT that we are confident in which specific explanation is true. Essentially, the bet resolves in my favor if we agree with the statement: "Whatever these most perplexing UFO/UAP cases represent, they are likely something beyond our current paradigm."
Further Details
I hereby forfeit any “gotcha” cases.
I’m not trying to be slick or capitalize on technicalities. A world in which I win is one where the community would broadly agree that I won.
Determination of resolution in my favor is left up to you.
I reserve the right to appeal to the LW community to adjudicate resolution if I believe I am being stiffed.
I hereby commit to not abusing this right. I don't expect that I would ever have to invoke it; I suspect it would be very obvious to everyone whether I win or not.
If these terms are acceptable, please make an offer and maximum payout amount. I will select from available offers as I see fit. I would prefer to pay out in bitcoin/eth but can work with you for another method.
Cheers :D
My $150K against your $1K if you're still up for it at 150:1. PayPal to yudkowsky@gmail.com with "UFO bet" in subject or text; please include counterparty payment info if it's not "email the address which sent me that payment".
Key qualifier: This applies only to UFOs spotted before July 19th, 2023, rather than applying to eg future UFOs generated by secret AI projects which were not putatively flying around and spotted before July 19th, 2023.
ADDED: $150K is as much as I care to stake at my current wealth level, to rise to this bettor's challenge and make this point; I'm not taking on further bets except at substantially less extreme odds.
Though I disagree with @RatsWrongAboutUAP (see this tweet) and took the other side of the bet, I say a word of praise for RatsWrong about following exactly the proper procedure to make the point they wanted to make, and communicating that they really actually think we’re wrong here. Object-level disagreement, meta-level high-five.
Glad to have made this bet with you!
How does all of the recent official activity fit into your worldview here? Do you have your own speculations/explanations for why, e.g., Chuck Schumer would propose such specifically-worded legislation on this topic? Does that stuff just not factor into your worldview at all (or perhaps is weighted next to nothing against your own tweeted-about intuitions)?
Meta-level high-five for engaging with a stigmatised topic with extensive reasoning; onto the object-level disagreement: what better strategies did you have in mind re. achieving goals like positioning themselves at the top of our status hierarchy? NHI would likely be aware of cognitive biases we are not aware of, as well as those we are (e.g. the biases that cause humans to double down when prophecies fail in cults, and generally to act weirdly around incredibly slim evidence).
The highest-status authority, in the eyes of the vast majority of humans, is a deity or deities, and these highly influential, species-shaping status hierarchies are largely based on a few flimsy apparitions. (This is somewhat suspicious, if your priors for alien visitation are relatively high; mine are relatively high due to molecular panspermia.) If you had to isolate seeds for future dominant religions, UFO and UFO-adjacent cults (including Scientology and the New Age) seem like plausible candidates; UFOs are frequently cited as the primary example of an emerging myth in the modern world.
If we assume these results are the desired result, we could hypothesise NHI is using its monopoly on miracle generation to craft human-tailored memetically viral belief systems, from ancient gods to today’s saucers. Given that ancient gods DO occupy the top of our status hierarchy, beyond our corporate, cultural and political leaders, I’m not sure we can be so confident that creating disreputable UFO reports is a poor strategy; less reputable reports dominated the world in a few centuries.
Received.
Eliezer, I’m offering you double the odds at 75:1, and would put up to $50k against your corresponding amount. If the UFO phenomenon is real, p(doom) may be much lower than you think, and in this view you’d probably be happy to pay it out.
Unless, of course, those UAPs turn up, and don’t have biological organisms in them, in which case we’d have the possibility that another civilization developed AI and it went poorly.
...or it is biological and we end up in a situation like The Three-Body Problem / The Killing Star, where the saucer fiends decide to gank us because humans are kinda violent and too dangerous to keep around.
All those superintelligence-as-danger arguments apply to biological superintelligences too.
But most likely: there are no damn UFOs, and the laws of physics and their big ugly light-speed prohibition still hold.
For all of the people who like these odds, I’ve created a secondary prediction market about this bet, so you don’t have to try to send Eliezer money.
Hey Eliezer, someone sent me your tweet and the link to this specific comment without the wider context, and I thought this was an open invitation to place a wager, so I sent $1000 to your PayPal. Are you willing to accept an additional $1000 bet under the same terms? (Apologies for the hassle; I completely missed the wider context.)
I am not; $150K is as much as I care to stake at my present wealth level. And while I refunded your payment, I was charged a $44.90 fee on the original transmission which was not then refunded to me.
Oh, that's suboptimal; sending $100 to cover the fee charge (the extra is in case they take another fee for some reason).
Again, apologies for the inconvenience. (wire sent)
Received $95.51. :)
(I like your eagerness to put your money where your mouth is and appreciate you covering the costs when your offer was not accepted.)
Eliezer, I agree with you. I'm curious to know: have you considered the implications of the presence of an NHI Artificial Intelligence on Earth?
I have recently been thinking about this quite a bit. I find it interesting that significantly increased interest in AI, due to the release of ChatGPT, almost directly preceded David Grusch's whistleblower report. I am becoming more of the opinion that our advancements in AI may be related to why disclosure feels so imminent.
I made a very long, detailed post, with a TL;DR, on Reddit. I would be honored to have your opinion on the subject.
https://www.reddit.com/r/UFOs/comments/155te7s/the_most_urgent_question_we_need_answered_do_uaps/?sort=new
Willing to bet against Eliezer 100:1. Up to $50k available for grabs. If others are also interested in the bet, please contact me.
When it comes to resolution criteria, it might be useful to have a Metaculus question. Metaculus questions have a good track record of being resolved in a fair manner.
This is the most similar question that I could find that already exists.
https://www.metaculus.com/questions/7384/alien-tech-in-solar-system-before-2030/
If this were to resolve Yes, I would be more inclined to believe that factors like media control, or a significant number of people in positions of power going mad (possibly due to mind-hack content), are more likely than aliens hanging around Earth instead of being grabby. A probability of ~1% for this scenario seems reasonable to me.
I’m wondering if there are more reliable ways to verify claims about aliens or supernatural phenomena. It seems like OP is trying to solve this problem by requiring an ontological shift in the community, which can still be manipulated either intentionally or unintentionally. However, there is probably less motivation for such manipulation compared to mainstream media.
Linked question: “Will mainstream news media report that alien technology has visited our solar system before 2030?”
I would say that is far from unambiguous. If one is generous in one's interpretation of "mainstream" and the certainty described, one could say mainstream news has already reported this (I remember National Enquirer articles from the seventies...).
Don’t confuse the headline with the resolution criteria.
The resolution criteria is:
The fine print is:
Do they? My experience has been the opposite. E.g. admins resolved “[Short Fuse] How much money will be awarded to Johnny Depp in his defamation suit against his ex-wife Amber Heard?” in an absurd manner* and refused to correct it when I followed up on it.
*they resolved it to something other than the amount awarded to Depp, despite that amount being the answer to the question and the correct resolution according to the resolution criteria
The resolution criteria does have the sentence “In the event that this trial results in a monetary award for Amber Heard, including legal fees or other penalties imposed by a court, this question will resolve in the negative to the dollar amount awarded Amber Heard.”
It seems that after the judge's decision, there's $10,350,000 for Depp and $2,000,000 for Amber. To me, that sentence reads like it's reasonable to do $10,350,000 − $2,000,000 = $8,350,000.
I think the situation is simple enough we can talk directly about how it is, rather than how it might seem.
The question itself does not imply any kind of net award, and the resolution criteria do not mention any kind of net award. Further, the resolution criteria are worded in such a way that implies the question should not be resolved to a net award. So, if you are to make an argument in favour of a net award it would make sense to address why you are going against the resolution criteria and in doing so resolving to something other than the answer to the question asked.
Here are the resolution criteria, edited for improved readability:
This question will resolve to the total dollar amount awarded to Depp as a result of the ongoing jury trial.
In the event that no money is awarded or the jury does not find Heard responsible or the trial ends without a verdict this question will resolve to $0 USD.
In the event that this trial results in a monetary award for Amber Heard, including legal fees or other penalties imposed by a court, this question will resolve in the negative to the dollar amount awarded
Clause 3, which you quoted, is intended to come into effect only if clause 1 has not already come into effect (this is clear not just because it is the structure of the criteria, but also because otherwise we would reach a contradiction of resolving to both X and not-X). So, clause 3 is not meant to be and cannot be applied to the situation at hand.
Clause 3, even if it did apply to the situation at hand, makes no mention of a net award.
Clause 1, on the other hand, can be applied: following clause 1, the question would be resolved to the total dollar amount awarded to Depp (total, not less any amount), which would be appropriate because it precisely answers the actual question asked: "How much money will be awarded to Johnny Depp in his defamation suit against his ex-wife Amber Heard?".
Now, you might nonetheless think that it is more reasonable to resolve to a net amount, despite that not being an answer to the question asked, and it being a resolution not supported by the resolution criteria; but if so, it would be logical to make an argument for it not based on the resolution criteria, which do not support it. And it would make sense to address the fact that you are going against the resolution criteria and in doing so unnecessarily resolving to something other than the answer to the question asked.
That’s written nowhere in the resolution criteria and something you made up yourself.
As written both #1 and #3 apply. I think reading the phrase “resolve in the negative to the dollar” as being about subtraction is a reasonable reading.
I don’t think a headline should be seen as “the actual question”. I think it makes more sense to see the resolution criteria as the actual question.
You seem to have different intuitions of how the question should be resolved than the Metaculus team or I myself have. It generally shouldn't be surprising that different people have different intuitions.
“This question will resolve in the negative to the dollar amount awarded”
This is a clear, unambiguous statement.
If we can’t agree even on that, we have little hope of reaching any kind of satisfying conclusion here.
Further, if you’re going to accuse me of making things up (I think this is, in this case, a violation of the sensible frontpage commenting guideline “If you disagree, try getting curious about what your partner is thinking”) then I doubt it’s worth it to continue this conversation.
The thing that has me pretty confused about your confidence here is not just that there's something weird going on here, but that you expect it to be confirmed within 5 years.
Assume the counterfactual. Actual wreckage has been recovered, and assume that analysis has revealed a smoking gun.
Examples: working “antigravity” (assume it works by some unknown interaction with the mass of the planet and thus respects conservation laws)
Mass Spectrometry of the materials reveals atomic weights outside the known stable elements range
Currently impossible material properties
Electron micrographs show obvious patterning that looks like the object was assembled of cell sized nanorobots
VIN in an obvious alien language (this is weaker without other ontology breaking evidence)
One single update—the analysis of ONE crashed vehicle, by credible individuals with third party confirmation, is enough for ontology breakage.
Only way to win a bet like this is insider knowledge. Maybe the OP has actually observed something in the class of the above.
With all that said, if such evidence exists, why wasn’t it leaked or found by another government or private group and revealed? Probability seems low.
That should let you update at least slightly in favor of the thing he claims being right. That’s how betting and prediction markets work, right?
The potential for a disputed confirmation could also be a problem here.
I can imagine more congressional hearings happening on UFOs, then OP says this is a confirmation that UFOs are of alien or paranormal origin, while the other party disagrees.
I'm assuming it's due to those silly Congress UFO hearings. Not that I can speak on behalf of RatsWrong, but I assume that's his thinking.
I would give 200:1 odds for up to 50,000 of my own dollars.
My likelihood for one of the weird hypotheses you listed being true is higher than 0.5%. However, my odds are much lower that we get any significant evidence of those hypotheses being true within the next 5 years, and that UFOs/UAPs are caused by that weird hypothesis.
I think the issue is going to be disagreements about what the > 50% likelihood means. A lot of people are saying the current round of military and federal officials coming forward with their stories about the government keeping alien craft in secret facilities is significant evidence in favor of aliens. I would like a resolution criteria that is either public polling (>50% of people polled say that X hypothesis is true) or maybe a particular public figure taking a serious stance (Scott Alexander seriously claims that UFOs are shadow US government 4d vehicles extending into our visible space).
This is the best offer so far! I would love to enter into this bet with you.
I would be perfectly happy with either of those methods of resolution in the event there’s a disagreement. In that event I would be happy for you to more or less entirely dictate the specifics of that process. I commit to operating in good faith with you, and I obviously take as a given that you will do the same.
If you have any other concerns please let me know. Otherwise please provide (either publicly or privately) a means for me to pay you. We can then both confirm here that we have begun our bet.
This will be an accepted on payment kind of deal? I need probably another few days to mull it over. I’ve never committed to a bet where I could potentially have to spend $50,000 in the future. I would feel really dumb if I jumped into it
Clarification, if we agree that the likelihood of non-prosaic UFOs is >50% 4 years into the future but then at the time horizon the likelihood is back down way <50% do I pay or no? This is really unlikely, but what came to top of mind. Also, if I do have to pay in that scenario, how immediate do you want the payment?
Curious what your thoughts are now? Still mulling it over? I would like to make this bet, so please let me know if there are any further concerns you have.
I think your offer to bet did me some good. I don’t think my belief in non-prosaic UFOs is actually much lower than .5%. Either that or seeing you accept worse deals makes me want to negotiate.
If we lower my exposure to $20,000 and the odds to 1:100 I’ll accept the bet with all conditions previously stated in our comment chain and the post. I will also PM you my personal info if you accept.
My btc address: bc1q32lqjmncj07wm2nyppzzuctv4y8q53h4khn8n8
Sounds good to me! I will send $200 worth of bitcoin to this address sometime today
BTC transaction cleared today for 665,668 satoshis, equivalent of $200 usd. Bet is agreed to as above.
Correct, accepted at payment time. If you need more time to think it over, no problem.
Interesting edge case. I would ask that if you at any point became >50% within the time horizon, that you would proactively reach out in short order.
Respectfully, that sounds like the “catch” here, though I doubt you have any actual ill intentions. If it applies at any point within the period, then it could apply for something as simple as a brief miscommunication from the White House that gets resolved within 24 hours. Some overworked and underpaid headline-writer makes a critical typo, aliens suddenly seem confirmed to LWers, and then… it’s game?
I would strongly recommend that you amend that edge case interpretation to only consider the state of things at the end of the period. While there could still technically be a spike of credulity around that time, it would be quite unlikely, whereas if UFOs have actually properly been established at some point in that time period, they will remain so throughout.
A proper Bayesian currently at less than 0.5% credence for a proposition P should assign a less than 1 in 100 chance that their credence in P rises above 50% at any point in the future. This isn't a catch for someone who's well-calibrated.
In the example you give, the extent to which it seems likely that critical typos would happen and trigger this mechanism by accident is exactly the extent to which an observer of a strange headline should discount their trust in it! Evidence for unlikely events cannot be both strong and probable-to-appear, or the events would not be unlikely.
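For completeness, a minimal sketch of the standard argument behind that bound, assuming credences obey conservation of expected evidence (i.e., evolve as a martingale): with current credence $p_0 = 0.005$, horizon $T$, and $\tau$ the first time credence reaches $0.5$, optional stopping on the bounded martingale $p_t$ gives

$$p_0 = \mathbb{E}\left[p_{\tau \wedge T}\right] \ge 0.5 \cdot \Pr(\tau \le T) \quad\Longrightarrow\quad \Pr(\tau \le T) \le \frac{0.005}{0.5} = 1\%,$$

where the inequality holds because $p_{\tau \wedge T} \ge 0.5$ whenever $\tau \le T$ and $\ge 0$ otherwise.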
If the purpose of this betting is to reward those who bet on the truth, though, then allowing a spike in credulity to count for it works against that purpose, and turns it into more of a combined bet of “Odds that the true evidence available to the public and LW suggests >50% likelihood or that substantial false evidence comes out for a very short period within the longer time period”.
In his comment reply to me, OP mentioned he would be fine with a window of a month for things to settle and considered it a reasonable concern, which suggests that he is (rightly) focused more on betting about actual UFO likelihood, rather than the hybrid likelihood that includes hypothetical instances of massive short-term misinformation.
While you are correct that the probability of that misinformation should theoretically be factored in on the bettor's end, that's not what the OP really wants to bet on in the first place; as such, I don't think it was a mistake to point it out.
That's a reasonable concern. My concern is that without some principle to avoid it, that would just mean that everyone waits out the full 5 years even if it's clear I'm the winner.
I wouldn't mind giving a window of a month for things to settle before there's a duty to settle. I would still demand that if anyone's credence ever goes >50%, they still have to register that publicly (or at least to me).
That sounds reasonable enough.
I’ll take OP’s side of this bet with you if you’re still interested.
I ended up betting with Rats at 100:1 odds for only $20,000 of my own dollars. If you’re still interested in these odds we can set something up.
I am as well, I’ll DM you.
I (a different person than codyz) sent $200 to frontier64 today, finalizing a 100:1 bet we set up in private on the same terms as RatsWrongAboutUAP. If UFOs are non-prosaic, frontier64 will send me $20,000.
I agree, received the $200. Bet is on.
How come you’re trusting essentially random internet strangers to pay up significant sum of money if they lose a bet in up to 5 years?
LWers with a reputation are a far cry from random internet strangers. I made the bet terms as such to be as frictionless and minimum-downside for my counterparties as possible, to try and eliminate as many concerns as possible; I do want to make bets after all.
If I get stiffed I’d be pretty surprised, but I take that risk knowingly.
I think if your P(weird) is 3%, it might be hard for you to in-expectation make money even from someone whose P(weird) is 0.00001%. You should definitely worry about being stiffed to some extent, and both sides should expect small probabilities of other sorts of costly drama. This limits what bets people should actually agree on.
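A rough sketch of that squeeze (my own illustration, not Carl's math; the numbers and the `p_collect` parameter are hypothetical):

```python
# Sketch of how collection risk raises the odds the bettor needs (hypothetical numbers).

def breakeven_odds(p_weird: float, p_collect: float) -> float:
    """Minimum odds b (counterparty pays b * stake on a win) at which the bettor
    breaks even, given credence `p_weird` in winning and probability `p_collect`
    of actually collecting: solve p_weird * p_collect * b * stake = stake."""
    return 1.0 / (p_weird * p_collect)

print(breakeven_odds(0.03, 1.0))  # ~33:1 needed with guaranteed collection
print(breakeven_odds(0.03, 0.4))  # ~83:1 needed if you would only collect 40% of the time
```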
He doesn’t have to. The fact that he probably bet people and won would give him quite some impressive bragging rights. And I guess some would still pay—people are mostly trustworthy.
It’s not random internet strangers: elsewhere he writes, “I was only ever going to engage with people with established reputations because obviously. I reserved the right to choose who to bet with.”
I am not willing to bet about the object-level proposition, but I am willing to bet that he gets paid at least 0.4 of his winnings. In other words, if it turns out that he won the bet, then I would be willing to give you $1000 in exchange for $2500 ($1000 divided by 0.4) times whatever fraction he ends up collecting (over the ensuing 5 years, say).
An Update: I have now paid out $4864 to 9 different bettors. From this point on I will only be accepting offers with at least 150:1 odds. I would love to make more bets, so feel free to reach out with offers at any point. Thanks to everyone who has already finalized bets with me.
Would you mind sharing how much you will win if the bet goes your way and everyone pays out?
Also, I would like to see more actions like yours, so I’d like to put money into that. I want to unconditionally give you $50; if you win the bet you may (but would be under no obligation to) return this money to me. All I’d need now is an ETH wallet to send money to.
I would like this to be construed as a meta-level incentive for people to have this attitude of “put up or shut up” while offering immediate payouts; not as taking a stance on the object-level question.
Eth: 0x1E9f00B7FF9699869f6E81277909115c11399296
Btc: bc1qegk25dy4kt2hgx0s6qla8gddv09cga874dr372
So far I have paid out $6164 and I stand to make $515,000 if I win. I appreciate your incentive offer.
I'm currently on vacation; I will follow up on this in a week.
Your post led me down an interesting path. Thank you. I would love to know your thoughts on the congressional hearing.
I'm willing to pay out a maximum of $800,000 USD for a 200:1 bet of $4k. I'll pay out at 666:1 if the biologic pilots are confirmed to be supernatural in origin, e.g. demons or angels.
Will reach out
I’m a different person from the thread maker. I’ll agree to 150:1 odds and pay out $1000 if I can get some assurance of your reputability. You can see the details of my bet here.
This sounds like the opening premise of a fun TV show or film.
UFO believer makes a big bet with (for the sake of TV) one very rich person, then heads out on an epic road trip in a camper van to find the alien evidence. A reporter covers the story, and she starts travelling with him, sending updates back to her paper. Obviously they fall for each other.
They have various fun adventures where they keep encountering unconvincing evidence, or occasionally super-convincing evidence (a UFO flies by) that they comically fail to catch on camera. Meanwhile the rich person on the other side of the bet becomes a villain, sending a hench-person to cut the tires on their van, get them in trouble with the police, and generally obstruct the process.
We’re still doing this? Fine. No need to arrange any odds or payment methods or your bankroll limits, I’m willing to just commit my money:
(If space aliens are semi-consensus and I weasel out of it anyway, then you will have to settle for laughing at me online.)
While we’re on the topic: in addition to the NY Post expose on the toxic stew of fraud, quasi-embezzlement, abuse of classification, echo chambers, and misinterpreted Chinese spy balloon references (carrying water for a dictatorship), which manufactured most of the recent UFO craze, today the NYT has a profile of Avi Loeb which helps explain where a lot of this UFO noise is coming from: rich old techies subsidizing intellectual hobbies (perhaps due to deep emotional attachment to the belief there must be biological aliens like us out there, we have to be going to the stars, it can’t just be that we’re going to get to AGI and render the whole thing moot). In addition to Yuri Milner, whose role is well-known, there’s also a “Eugene Jhong” who is a ‘software engineer’ I’ve never heard of but who has put at least $1.2m into Loeb’s aliens*, and Charles Hoskinson (Ethereum) appears to be putting in millions, at least, in bankrolling Loeb’s recent oceanographic expedition to find UFO crash debris ($1.5m there alone in addition to use of his private jet).
* He has also apparently put $1.5m into DMT psychedelic research, which seems like a better topic to research…
† Inasmuch as they are the most notable public figures in our circles in terms of I’m-not-saying-it’s-aliens-but-it’s-aliens, which isn’t fooling anyone, so it seems only fair to let them pick if they are ultimately vindicated.
codyz is doubling down on the UFO claims, but as far as I can see, the case has fallen apart so completely no one even wants to discuss it and even Tyler Cowen & Robin Hanson have stopped nudge-nudge-wink-winking it for now.
So I hereby double my unilateral bet to $2,000.
I don't know what assumptions the OP has, but don't forget the simulation argument: if you think we are headed for superintelligence, then the following all become more likely:
You are in a simulation—maybe a weird one, maybe one that messes with your brain
Aliens intervene to prevent the creation of the ASI.
An ASI creates unbelievable effects on Earth.
I don’t think “we’re currently living in a simulation” or “ASI would have effects beyond imagination, at least for the median human imaginer” are such weird beliefs among this crowd that them proving true would qualify for OP to win the bet. Of course, they specifically say that if UAP are special cases in the simulation that counts, but not the mere belief in simulation.
Max bet $50k, I would be totally happy to bet at 50:1 odds.
Let us move forward!
I commit to operating in good faith with you, and I obviously take as a given that you will do the same. If you have any other concerns please let me know. Otherwise please provide (either publicly or privately) a means for me to pay you. We can then both confirm here that we have begun our bet.
I commit to paying up if I agree there’s a >0.4 probability something non-mundane happened in a UFO/UAP case, or if there’s overwhelming consensus to that effect and my probability is >0.1.
Though I guess I should warn you in advance that I expect this would require either big obvious evidence or repeatable evidence. An example of big would be an alien ship hovering at the fifty-yard line during the Super Bowl; repeatable would be some way of doing science to the aliens. Government alien-existence announcements lacking any such evidence might lead to me paying on the second clause rather than the first.
I’ll message you details.
Good enough for me
I have received $1000. The bet is on!
Glad we could make this bet!
Enticing offer. Barring a better odds and max payout offer that would eat up my budget, I would like to go forward with this. I will wait to see what offers come in first.
You’ve quadrupled my P(aliens or demons or such have been flying around Earth’s atmosphere). Thanks for this post (and this comment in particular).
I’d take a bet at 1:50 odds for $200. I’m happy to let the LW community adjudicate, or for us to talk it over. I’m currently at something like 5e-5 for there being UFOs-as-non-prosaic. So I don’t think I’d be that hard to convince.
Sure, let's bet. Reach out with a means to receive payment.
I bet RatsWrongAboutUAP $200 at 50:1 odds against us both assigning >50% odds for a non-prosaic explanation for UFOs within 5 years from today. He agreed, and I have received the money. We’ll try to adjudicate the bet ourselves, or failing that, ask the LW community, or whomever is suitable, to adjudicate matters.
Happy to make this bet!
I’ll repeat this bet, same odds same conditions same payout, if you’re still interested. My $10k to your $200 in advance.
Sure, reach out
Hi Algon. I'm interested in also taking RatsWrongAboutUAP's side of the bet, if you'd like to bet more. I'm also happy to give you the same odds as you just specified. DM me if you're interested.
I’m interested in my $250k against your $10k.
I could offer $5k against your $185k, Carl. If you’re interested, DM me. Same odds as a European Roulette, albeit with a much delayed payment.
We’ve agreed to make a 25:1 bet on this. John will put the hash of the bet amount/terms below.
Carl and I have ultimately agreed to a 29:1 bet on the combined amount. The term will expire on July 25 2028 and may be extended by no more than 2 days upon reasonable request at Carl’s sole discretion. The resolution criteria is as laid out in the main post of this thread by the user RatsWrongAboutUAP. Unless either of the parties wishes to disclose it, the total amount agreed upon will remain in confidence between the parties.
+1
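For readers curious about the hash-commitment idea mentioned above (posting a hash of the bet amount/terms while keeping the terms private), a minimal sketch follows; the terms string is made up and this is not necessarily the parties' actual procedure:

```python
# Hypothetical illustration of a hash commitment to private bet terms.
import hashlib

terms = "29:1 bet, combined amount $X, expires 2028-07-25, resolution per RatsWrongAboutUAP's post"
commitment = hashlib.sha256(terms.encode("utf-8")).hexdigest()
print(commitment)  # post this digest publicly now; reveal `terms` later to prove what was agreed
```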
I’d take the same bet on even better terms, if you’re willing. My $200k against your $5k.
Ted and I agreed on a 40:1 bet where I take RatsWrongAboutUAP’s side. The term will expire on Aug 2 2028. The resolution criteria is as laid out in the main post of this thread by the user RatsWrongAboutUAP. Unless either of the parties wishes to disclose it, the total amount agreed upon will remain in confidence between the parties.
Confirmed.
Hi Ted. I'm interested in also taking RatsWrongAboutUAP's side of the bet, if you'd like to bet more. I'm also happy to give you the same odds as you just specified. DM me if you're interested.
I responded to you via DM
To clarify, I would send the $5k now.
Carl, have you written somewhere about why you are confident that all UFOs so far are prosaic in nature? I would be interested to read/listen to your thoughts on this. (Alternatively, a link to some other source that you find gives a particularly compelling explanation is also good.)
No. Short version is that the prior for the combination of technologies and motives for aliens (and worse for magic, etc) is very low, and the evidence distribution is familiar from deep dives in multiple bogus fields (including parapsychology, imaginary social science phenomena, and others), with understandable data-generating processes so not much likelihood ratio.
Carl, I'm interested in also taking RatsWrongAboutUAP's side of the bet, if you'd like to bet more. I'm also happy to give you better odds than 150:1. DM me if you're interested.
So I could get 0.5% of the committed payout right away, but would have to avoid spending the committed value for 5 years, even though the world could change significantly in a lot of non UAP-related ways in that time frame. That’s not actually that attractive.
That’s not how I understand it literally. You don’t have to put it to the side/into some savings account. You just have to accept the risk that if you have to pay out in the unlikely case, you have to go into debt.
Yeah, for some reason people come up with this absurdly complicated mechanism for prediction bets that they don't apply to pretty much any other form of debt. I don't know why this keeps happening, but I've seen it elsewhere too.
Or take the risk that you’d feel bad by just … not paying. This is the one which should worry your counterparty, and which leads to escrow requirements.
Assuming the OP only accepts bets with accounts linked to a real world identity, or pseudonymous accounts with a very high reputation, such as gwern, I think it’s safe enough to not require an escrow.
Why would someone who’s built up a reputation in the LW/rationalist/etc. community wreck it, publicly and on-the-record, over <$50k USD?
They’d be sacrificing way more in future potential since no one will willingly work with a scoundrel.
A lot can happen in 5 years. The OP could die. The bettor could die. And who knows, maybe the evidence of aliens is just deniable enough that it doesn’t cost reputation to claim a win.
That doesn’t extinguish the record of the bet, whoever is the heir to their assets would still be responsible for settling the bet, maybe not at the full amount, but some settlement would still be necessary.
That’s already factored into the odds.
LOL! If you think an executor (or worse, an heir if the estate is already settled) is going to pay $100K to a rando based on a 5-year old less-wrong post, you have a VERY different model of humanity than I do. Even more so if the estate didn’t include any mention of it or money earmarked for it.
How do the desires of possible executors/heirs/etc. factor into this?
Clearly the bet will not auto-extinguish and auto-erase itself regardless of the future desires of anyone.
If you thought I implied that the bet must be settled in purely monetary terms, that wasn’t my intention. It’s entirely possible for the majority, or entirety, of the bet to be settled with non-monetary currencies, such as social-status, reputation, etc…
It’s just not all that likely for someone, or their successors, to insist on going down that path.
I made the same argument myself (lol) in response to lsusr regarding Eliezer’s bet with Bryan Caplan:
https://www.lesswrong.com/posts/BknXGnQSfccoQTquR/the-caplan-yudkowsky-end-of-the-world-bet-scheme-doesn-t?commentId=44YGGYcx8wZZpgiof
(hit “see in context” to see the rest of my debate with lsusr)
Somehow it feels different at 0.5% though, as compared to the relatively even odds in the Yudkowsky-Caplan bet. (It’s not like I could earn, say, USD $200k in a few weeks before a deadline, like Eliezer could earn $100). 2% is getting closer to compensating for this issue for me though.
True, but you presumably have to have the ability to pay it some way or another, and that's still resources that could have been available for something else (e.g. you could have gone into debt anyway, if something happened to warrant doing so).
I did interpret it as a 0.5% thing though, and now that the OP has stated they would be ok with 2% that makes it significantly less unattractive - Charlie Steiner’s offer, which OP provisionally accepted, seems not too far off from something I might want to copy.
However, the fact that OP is making this offer means, IMO, that they are likely to be convinced by evidence significantly less convincing than what I would be convinced by. So there's a not-unlikely possibility that 5 years from now, if I accept, we'll get into an annoying debate over whether I'm trying to shirk on payment, when I'm just not convinced by whatever the latest UFO news is that he's been convinced by. It's also possible that other LessWrongers might be convinced by such evidence that I wouldn't be convinced by—consider how there seems to be a fair amount of belief here regarding the Nimitz incident that if Fravor wasn't lying or exaggerating it must be something unusual like, if not aliens, then at least some kind of advanced technology (whereas I've pointed out that even if Fravor is honest and reasonably reliable (for a human), the evidence still looks compatible with conventional technology and normal errors/glitches).
That might be a hard-to-resolve sticking point since I don’t really consider it that unlikely that a large fraction of LessWrongers might (given Nimitz) be convinced by what I would consider to be weak evidence, and even if it was left to my discretion whether to pay, the reputational hit probably wouldn’t be worth the initial money.
BTW, I don’t consider it super unlikely that there are discoveries out there to be made that would be pretty ontologically surprising, it’s just that I mostly don’t expect them either to be behind UAPs or to be uncovered in the next 5 years (though I suppose AI developments could speed up revelations...)
I also note that some incidents do seem to me like they could possibly be deliberate hoaxes perpetrated within the government against other government employees who then, themselves sincere, spread it to the public (e.g. the current thing and maybe Bob Lazar). If I were to bet I would specifically disclaim paying out merely because such hoaxes were found to be carried out by some larger conspiracy which was also doing a lot of other stuff as well, even if sufficiently extensive to cause ontological shock—I am not comfortable betting against that at 2%. I would be OK, if I were otherwise satisfied with the bet, on paying out conditional on such a conspiracy being proven to have access to an ontologically shocking level of technology relative to the expected level of secret government tech.
A mere government hoax/psyop with no accompanying non-prosaic-UAP reality would NOT resolve in my favor; no issue from me on that.
In a world where a sizeable fraction of LW becomes convinced I might win the bet, I would expect that I then wouldn’t have to wait very long before it then became conclusive, so I wouldn’t mind just waiting that out. If in that case, we then hit time horizon constraints before it was definitive to you, then depending on the specifics I definitely would not rule out appealing to the community (or specific ‘trusted’ individuals like Scott Alexander or Eliezer). I find this scenario unlikely to come to pass. I would of course in all cases commit to operating with you in good faith.
If you wish to extend that offer, I indeed will accept 50:1 (max bet size?). If you have any other concerns please let me know.
Regarding if there is evidence convincing to you, but not to me, after the five years:
If the LW community overwhelmingly agrees (say >85%) that my refusal to accept the evidence available as of 5 years from the time of the bet as overcoming the prior against ontologically surprising things being responsible for some “UAPs” was unreasonable, then I would agree to pay. I wouldn’t accept 50% of LessWrong having that view as enough, and don’t trust the judgement of particular individuals even if I trust them to be intelligent and honest.
Evidence that arises or becomes publicly available after the 5 years doesn’t count, even if the bet was still under dispute at the time of the new evidence.
I will also operate in good faith, but don't promise not to be a stickler about the terms (see for example Bryan Caplan on his successful bet that no member nation of the EU with a population over 10 million would leave before 2020, which he won despite the UK voting to leave in 2016; Bet 10 at https://docs.google.com/document/d/1qShKedFJptpxfTHl9MBtHARAiurX-WK6ChrMgQRQz-0).
If you agree to these, in addition to what was discussed above, then I would be willing to offer $100k USD max bet for $2k USD now.
This is more than acceptable for me. Please reach out for a way for me to pay you.
This is to publicly confirm that I have received approximately $2000 USD equivalent.
Unless you dispute what timing is appropriate for the knowledge cutoff, I will consider the knowledge cutoff for the paradigm-shattering UAP-related revelations for me to send you $100k USD to be 11:59pm, June 14, 2028 UTC time.
Glad we could make this bet!
The whole idea conflates refusal to accept the bet for reasons that apply to bets in general, with refusing to accept the bet because you’re not really confident that UFOs are mundane.
If there are reasons to refuse bets in general, that apply to the LessWrong community in aggregate, something has gone horribly horribly wrong.
No one is requiring you personally to participate, and I doubt anyone here is going to judge you for reluctance to engage in bets with people from the Internet who you don’t know. Certainly I wouldn’t. But if no one took up this bet, it would have a meaningful impact on my view of the community as a whole.
It is my opinion that for the LessWrong community in aggregate, something has gone horribly horribly wrong.
At a minimum, LWers should have 1) observed that normies don’t bet like this and 2) applied Chesterton’s Fence.
It’s often hard to give an exhaustive, bulletproof, explanation of why normies act in some way that does, in fact, make sense as a way to act. Rationalists have a habit of saying “well, I don’t see a rational reason for X, so I can just discard X”. That’s what Chesterton’s Fence is about avoiding.
It’s easy to explain why people who hold beliefs for signaling purposes don’t want to bet on those beliefs. It interferes with getting status points by exposing bullshit.
As someone who’s gambled professionally, I believe the (Chesterton’s) fence around betting for normies exists because most bets are essentially scams, which is why I’m entirely okay knocking it down for LWers. Let me elaborate.
Probability is complicated and abstract. Not only that, human intuition is really bad at it. Nearly all "bets" throughout our modern history have not been the kind of skin-in-the-game prediction competition we're praising on LessWrong—they've been predatory: one person who understands probability using emotional and logical manipulation to take the money of someone who doesn't.
Society protects people with taboos. "Betting is icky" is a meme that can easily spread, and will quickly reproduce, because it's adaptive in this betting environment. [Dissertation about Bayesian reasoning, calibration, and the Kelly criterion] is NOT a meme that can easily spread, because it's far too complex and long, and thus it will not reproduce (even though it is also adaptive).
Or at least, it can't spread in the normie population, but it CAN on LessWrong, which is why, on LessWrong, most bets are not scams. They are, in fact, what the scammers falsely proclaimed their own bets to be—friendly competitions wherein two people who disagree about the future both put skin in the game.
The sportsbooks and casinos we have today are predators. From their celebrity endorsements, to the way they form their commercials, to their messaging around winning (and especially parlays), they effectively lie about what they're selling while trying to create addicts. I've engaged with many people across the betting-experience spectrum (from other winners, to big losers, to smart people who were small losers and realized they needed to quit), and it's pretty clear to me that "betting = icky" is a reasonable idea, even today. The fence around it is not Chesterton's, though. It's there to help regular people avoid a certain species of predator gunning for their capital.
We can safely knock it down on here.
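Since the comment above mentions the Kelly criterion, here is a minimal sketch of it (my own illustration with hypothetical numbers; nothing here is a claim about how anyone in this thread actually sized their bets):

```python
# Minimal Kelly criterion sketch (illustrative only).

def kelly_fraction(p_win: float, net_odds: float) -> float:
    """Growth-optimal fraction of bankroll to risk on a bet paying `net_odds`
    per unit risked when you win, given win probability `p_win`.
    Negative values mean the bet has negative edge: don't take it."""
    return p_win - (1.0 - p_win) / net_odds

# A coin you believe lands heads 55% of the time, paid at even odds:
print(kelly_fraction(0.55, 1.0))       # 0.10 -> risk 10% of bankroll

# Taking the "prosaic" side of a 150:1 bet (risk 150 units to win 1, so net
# odds of 1/150) at a credence of 0.1% that you lose:
print(kelly_fraction(0.999, 1 / 150))  # ~0.85 -> Kelly tolerates a large exposure
```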
I don’t doubt that a lot is wrong with the LW community, both in aggregate and among many individuals. I’m not sure WHAT wrongness you’re pointing out, though.
There are good reasons for exploring normie behavior and being careful of things you don’t understand (Chesterton’s fence). They mostly apply strongly when talking about activities at scale, especially if they include normies in the actor or patient list.
Wagering as a way to signal belief, to elicit evidence of different beliefs, and to move resources to the individuals who are less wrong than the market (or counterparty in a 2-party wager) is pretty well-studied, and the puzzle of why most humans don’t do it more is usually attributed to those illegible reasons, which include signaling, status, and other outside-of-wager considerations.
IMO, that’s enough understanding to tear down the fence, at least when people who choose not to participate aren’t penalized for that choice.
That seems so clear to me that I’m surprised there can be any objection. Can you restate why you think this indicates “horribly wrong”, either as a community, or as the individuals choosing to offer wagers?
I can’t give you an exhaustive list of the problems I have with betting, but some reasons:
Properly phrasing a bet is difficult, like writing a computer program that runs perfectly the first time, or phrasing a wish to a genie. I’m no good at avoiding loopholes, and there’s no shortage of rationalists who’d exploit them as long as they can get a win. And just saying “I won’t prey on any technicalities” isn’t enough without being able to read your mind and know what you consider a technicality.
Betting has social overhead. This is the “explain to your parents/wife/children why you bet this money” scenario.
Some people value money differently than I do. Some people just have glitchy HumanOS 1.0 which leads them to spend money irrationally. Some people are just overconfident. If I bet against such a person I may be an overall winner after X years, but until the X years are up, I'll have essentially lost the argument, because my opponent was willing to spend money; there must be some substance behind his argument or he wouldn't do that, right?
As others have pointed out, it's a bad idea to trust random people on the Internet to pay me money in X years. "I have a reputation" is not enough when real money is involved. And I don't have access to the sophisticated information used by financial services in the real world to determine how likely someone is to be able to pay money in the future based on past performance. And it's not unknown for a trusted person to run away with money. (That wasn't even the incident I was thinking of, but I couldn't find that one.) (Edit: does not apply, since you'd be the one paying the money.)
To get over the Chesterton's Fence bar, you're going to need more than just "well, it's been studied and people do it for irrational reasons". Social customs evolve as memes, and something that people don't do for reason X may nevertheless have persisted because it is, for reason X, beneficial.
At any rate, I haven’t seen your studies and I’m not going to trust that you’ve described them properly without some links.
Even if I did get links and read the studies, we get into epistemic learned helplessness. I wouldn’t change my mind about betting just because the studies seem convincing and I can’t find any flaw in them using solely my own knowledge. I’d like to at least hear from opponents of those studies and see how convincing they are, and see how controversial the studies are. Then I’d have to check whether they might be subject to the replication crisis. And at this point, the overhead of researching betting will itself make most bets unprofitable.
Rationalists have a habit of stringing together poorly founded estimates to get more poorly founded estimates and acting based on them. I don’t agree with this practice, but concluding that I should risk money here would imply paying attention to poorly founded estimates.
Thanks for the detail—it makes me realize I responded unclearly. I don’t understand your claim (presumably based on this offer of a wager) that “the LessWrong community in aggregate, something has gone horribly horribly wrong.”
I don’t disagree with most of your points—betting is a bit unusual (in some groups; in some it’s trivially common), there are high transaction costs, and practical considerations outweigh the information value in most cases.
I don’t intend to say (and I don’t THINK anyone is saying) you should undertake bets that make you uncomfortable. I do believe (but tend not to proselytize) that aspiring rationalists benefit a lot by using a betting mindset in considering their beliefs: putting a number to it and using the intuition pump of how you imagine feeling winning or losing a bet is quite instructive. In cases where it’s practical, actually betting reifies this intuition, and you get to experience actually changing your probability estimate and acknowledging it with an extremely-hard-to-fool-yourself-or-others signal.
I don't actually follow the Chesterton's fence argument. What is the taboo you're worried that you don't understand well enough to break (in some circumstances)? "Normies don't do this" is a rotten and decrepit enough fence that I don't think it's sufficient on its own for almost anything that's voluntarily chosen by participants and has plausibly low (not provably, of course, but it's not much of a fence to start with) externalities.
If you’re asking how I would distinguish “horribly, horribly, wrong” from “just somewhat horribly wrong” or plain “wrong”, my answer would be that there’s no real distinction and I just used that particular turn of phrase because that’s the phrase that evand used.
Sure, but “bets that make me uncomfortable” is “all rationalist bets”.
I disagree.
I should be clearer yet. I’m wondering how you distinguish “the community in aggregate has gone (just somewhat) horribly wrong” from “I don’t think this particular mechanism works for everyone, certainly not me”.
If making actual wagers makes you uncomfortable, don’t do it. If analyzing many of your beliefs in a bet-like framing (probability distribution of future experiences, with enough concreteness to be resolvable at some future point) is uncomfortable, I’d recommend giving that part of it another go, as it’s pretty generally useful as a way to avoid fuzzy thinking (and fuzzy communication, which I consider a different thing).
In any case, thanks for the discussion—I always appreciate hearing from those with different beliefs and models of how to improve our individual and shared beliefs about the world.
I would also take issue with the "mundane" part. What does that even mean? Any explanation that is good enough to cover all UFO cases, with their myriad of physics-defying feats, is in itself proof of supertechnology, which should also be under the bet.
For example, an explanation that the supposed UFOs are really experimental military aircraft would simply mean that the military possesses technology that is effectively “magic” compared to the civilian aircraft technology. If you witness a flying object that can push Mach 10 effortlessly and takes instant turns without any inertia, does it matter if this is an alien craft or human military craft? It still should belong on the list.
Do you OP have access to secret (non public) information related to the bet?
No.
Asked 6 days ago, still no answer, yet OP commented a bunch in that time. Hmmm..
I will predict that no bet with significant stakes (say, over $200 from the poster) gets made. This is a stunt, and the terms (of resolution and collection) are way too loose to be useful.
Update a few days later: an established-ish poster (6-month history, with quite a few comments and karma: @simon) has confirmed that an approx. $2000 payment was received. Something weird could still be discovered, but this raises my estimate of legitimacy from ~15% to ~70% (I had thought as high as 85% until I realized that it's a $100k agreement for simon to pay, which is very high for such things). Note that I had a quite low prior that an anonymous offer from a new poster would be legit, so this indicates a pretty big update.
further update: multiple accepted bets and confirmation of payments received. moving closer to the 85% likelihood I previously experienced, which is probably the maximum I can reach until a few months go by without any shenanigans being mentioned. I do hope there will be a summary posted, which can be updated each year as participants acknowledge their ongoing bets and mention any evidence that would change their individual or collective beliefs.
I'm afraid you are going to lose this bet. So long as people can come up with a bitcoin/ETH address or a PayPal account, there will be no issues.
Don’t be afraid! This is a bet I hope to lose (well, really, a prediction I hope is false—transaction costs keep me from betting). I wish you the best, and I really do appreciate people specifying their beliefs with precision that allows betting.
Given the site and general level of goodwill here, my estimate is maybe as high as 15% that this will result in a significant deposit in the next 2 weeks, confirmed by at least one long-term poster on the site. That’s an order of magnitude higher than I’d give anywhere else, and I’m rooting for you!
Want to bet on your prediction? I’ll give you $100 right now if you’ll commit to sending me $200 if the OP does in fact end up sending LW participants at least $200 as his side of this bet.
(The OP is a complete stranger to me.)
I considered offering a bet with this, and 2:1 against it being real is probably generous. I’d make the bet in person with someone I know, but online hassles with strangers make it not worth the amusement value for me.
Why would the terms as written dissuade people from betting?
I don’t doubt that there will be offers; I doubt a bet will be made. My best guess is the OP will fail to find a payment method that works, or will come up with a disagreement about terms that they use to justify backing out.
I look forward to seeing what happens. It’s a GREAT example of the legible, written proposal seeming (and being) great, and the practical human part being rather suspect.
Hesitant to bet while sick, but I’ll offer max bet $20k at 25:1.
Double the odds and I will accept immediately. Otherwise I might accept in the next few days, depending on whether I get more offers. I have reached out to others now, and I expect that once it’s confirmed that I really am giving out money, more offers will come in.
If you were offering, say, $100K at 5:1 odds, I would be very inclined to take it, despite the risk that e.g. next month’s X-Day finally delivers, because that would let me set in motion things that, according to me, have their own transformative potential. But I’m not sure about the value of these smaller sums.
Smaller sums are more likely to convey each party’s probabilities accurately. For example, if Elon Musk offers me $5000 to split between two possible outcomes, I will allocate it close to my beliefs, but if he offers me $5 million, I’ll allocate about $2.5 million to each, because either one is a transformative amount of money.
People are more likely to be rational with their marginal dollar because they price in the value of staying solvent. The first $100k in my bank account IS worth more than the second; hence the saying, a non-marginal bird in the hand is worth two in the bush.
Are you suggesting that you currently have a double-digit percentage that there’s clear evidence of some form of nonhuman intelligence in the next five years (which would warrant the 5:1 odds)?
Not at all. But for a credible bet, I have to have some chance of paying out my losses. On the basis of lifetime earnings so far, even $500K is really pushing it. Promising to pay millions if I lose is not credible.
This is a very interesting topic, since from my own perception there is a large market inefficiency here, given the lack of international press around the recent developments in the US Congress UAP hearings.
I’ll bet. Up to $100k of mine against $2k of yours. 50:1. (I honestly think the odds are more like 1000+:1, and would in principle be willing to go higher, but generally think people shouldn’t bet more than they’d be willing to lose, as bets above that amount could drive bad behavior. I would be happy to lose $100k on discovering aliens/time travel/new laws of physics/supernatural/etc.)
Happy to write a contract of sorts. I’m a findable figure and I’ve made public bets before (e.g., $4k wagered on AGI-fueled growth by 2043).
Given your lack of history, I would want much better odds and a lower payment on my side; for you I would probably max out at $500 and would want 200:1.
Fair. I accept. 200:1 of my $100k against your $500. How are you setting these up?
I’m happy to pay $100k if my understanding of the universe (no aliens, no supernatural, etc.) is shaken. Also happy to pay up after 5 years if evidence turns up later about activities before or in this 5-year period.
(Also, regarding history, I have a second Less Wrong account with 11 years of history: https://www.lesswrong.com/users/tedsanders)
Awesome! DM me and we can figure out payment options
$500 payment received.
I am committed to paying $100k if aliens/supernatural/non-prosaic explanations are, in the next 5 years, considered, in aggregate, to be 50%+ likely in explaining at least one UFO.
(I’ve added my $50 to RatsWrong’s side of this bet)
I am concerned for your monetary strategy (unless you’re rich). Let’s say you’re absolutely right that LW is overconfident, and that there is actually a 10% chance of aliens rather than 0.5%. So this is a good deal! 20x!
But only on the margin.
Depending on your current wealth, it may only be rational to put a few hundred dollars into this particular bet. If you make lots of these types of bets (low probability, high payoff, great expected returns) for a small fraction of your wealth each, you should expect to make money, but if you make only 3 or 4 of them, you are more likely to lose money, because you are loading all your gains into a small fraction of possible outcomes in exchange for huge payouts, and most outcomes end with you losing money.
See, for example, the St. Petersburg paradox, which has infinite expected return but very finite actual value given the limited assets of the banker and/or the player.
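To make that point concrete, here is a minimal sketch (my own illustration, not from the comment above) of how capping the banker’s bankroll collapses the “infinite” expected value of the St. Petersburg game to something modest:

```python
def st_petersburg_ev(banker_bankroll: float) -> float:
    """Expected payout of the St. Petersburg game when the banker can pay
    at most banker_bankroll. Round k (first heads on flip k) nominally pays
    2**k with probability 2**-k, so each uncapped round contributes exactly 1.
    """
    ev, k = 0.0, 1
    while 2 ** k < banker_bankroll:
        ev += 1.0
        k += 1
    # Every deeper round is capped at what the banker can actually pay.
    ev += banker_bankroll * 2 ** -(k - 1)
    return ev

print(st_petersburg_ev(1_000_000))  # ~20.9: a millionaire banker caps the "infinite" EV near $21
```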
If you think the true likelihood is 10% and are being offered odds of 50:1 on the bet, then the Kelly criterion suggests you should bet about 8% of your bankroll. For various reasons (mostly human fallibility and an asymmetry in the curve of the Kelly utility), lots of people recommend betting fractions of the Kelly amount. So someone in the position you suggest might reasonably wish to bet something like $2-5k per $100k of bankroll. That strategy, your proposed credences, and the behavior observed so far would imply a bankroll of a few hundred thousand dollars. That’s not trivial, but also far from implausible in this community.
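For anyone who wants to check those figures, a minimal sketch of the Kelly calculation (the 10% credence and 50:1 odds are the hypothetical numbers from the comment above; the function name is mine):

```python
def kelly_fraction(p_win: float, net_odds: float) -> float:
    """Kelly-optimal fraction of bankroll to stake on a bet paying net_odds-to-1."""
    return p_win - (1 - p_win) / net_odds

full_kelly = kelly_fraction(0.10, 50)   # ~0.082, i.e. about 8% of bankroll
quarter_kelly = full_kelly / 4          # a common fractional-Kelly choice
print(f"full Kelly: {full_kelly:.1%}, quarter Kelly: {quarter_kelly:.1%}")
# With a $100k bankroll, quarter- to half-Kelly works out to roughly $2k-$4k,
# consistent with the $2-5k per $100k figure above.
```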
I’d also guess that the proper accounting of the spending here is partly on the bet for positive expected value, and partly on some sort of marketing / pushing for higher credibility of their idea sort of thing. I’m not sure of the exact mechanism or goal, and this is not a confident prediction, but it has that feel to it.
Out of curiosity, is there anywhere you’ve written about your object-level view on this? The EY post thoroughly fleshes out what I would call the strong consensus on LW; is there some equivalent of that which you’ve put together?
Congratulations, you have entered into the legendarium @RatsWrongAboutUAP. I fully agree with you and think you will win big here. I have been trying to create a bet with Eliezer since 2021 on this same issue (I have receipts) but could not word the criteria as elegantly as you did. Now, what I wanted to comment on was expanding on one of the criteria.
The Breakaway Group. This example may not fall under any of the previous explicit examples: they are still human, they are not an ancient civilization, they are not time travellers. The Breakaway Group represents a rogue element using the cover of the national security apparatus and unacknowledged special access programs to avoid disclosure and oversight and to compartmentalize tech and knowledge. This is the complicated world where the nature of these programs may not be fully revealed due to the enmeshment of the military-industrial complex with the nation state, but craft and/or new technology may be revealed that still matches UFOs, because it came from reverse-engineering programs or recovered craft.
I would consider the above scenario to partially satisfy the “Secret Manhattan style project with beyond next gen physics, that we had back in the 60′s” criterion. But there are some additional assumptions built in, like the potentially illegal nature of Breakaway Group activities (operating without proper oversight, murder to maintain secrecy as per Grusch), meaning it is not like the Manhattan Project at all. Still, it would be weird enough to cause ontological shock, so it would likely satisfy the criterion all the same. Worth clarifying, though.
I am available here or on twitter @micksabox for anyone else who wants to offer bets under the same resolution criteria, if that is allowed.
Does that mean you’re willing to undercut RWAU and offer bets at substantially better-for-the-other-party odds?
RWAU says they think we’re overconfident, but without looking through the whole thread I don’t think they’ve said what their own confidence is. (Which makes sense, telling us that would be an information asymmetry that we could exploit.) They’ve accepted odds as low as 50:1, with counterparty and inflation risks, so we can infer that their confidence is presumably significantly higher than 2%. But I wouldn’t be surprised if it’s less than 50%.
But from the sounds of things your confidence is higher than 50%? So… can I interest you in a 1:1 bet (if we can avoid the counterparty and inflation risks)? :D
(Fwiw I think that “why would I take a 1:1 bet when it seems I can get much better odds than that” is totally a valid answer here.)
Happy to bet $40k at 20:1 odds ($2k); originally offered at 110:1 odds ($364). (Edited Sep 2023; previous bets confirmed at previous odds.)
USDC ERC-20 (Ethereum): (address removed for privacy, please DM if you want to trade more)
USDC Polygon: (address removed for privacy, please DM if you want to trade more)
(Edit 23 June 3:45 PT: I’m only willing to bet assuming that AGI-created tech doesn’t count for the purposes of this bet; it has to be something more supernatural than that.)
Please confirm reception of funds https://etherscan.io/tx/0x0104a0005a62af25a86d9d3573c02e0715860309b3a66e1370efec7533b41ffa
Confirm.
Happy to do another $40k at 55:1 odds if you like (another $727), and another $20k at 20:1 odds after that.
Some additional people reached out to me—just reiterating that I’m happy to do more at 20:1 odds!
What if UFOs are indeed really weird, but this does not shake the LW belief system, because it gets easily explained in retrospect: e.g. “we always knew that acausal cooperation between glitching streams in dust theory would produce Bayesian artifacts with low a priori probability that are also unprovable in the classical statistical sense”?
If/when it comes out that ufos are legitimately weird, I would be very surprised to see anything other than utter bewilderment from most of LW, I don’t expect clear resolution in my favor to be an issue.
On further edit: apparently I’m a blind idiot and didn’t see the clearly stated “5 year time horizon” despite actively looking for it. Sorry. I’ll leave this here as a monument to my obliviousness, unless you prefer to delete it.
Without some kind of time limit, a bet doesn’t seem well formed, and without a reasonably short time limit, it seems impractical.
No matter how small the chance that the bet will have to be paid, it has to be possible for it to be paid, or it’s not a bet. Some entity has to have the money and be obligated to pay it out. Arranging for a bet to be paid at any time after their death would cost more than your counterparty would get out of the deal. Trying to arrange a perpetual trust that could always pay is not only grossly impractical, but actually illegal in a lot of places. Even informally asking people to hold money is really unreliable very far out. And an amount of money that could be meaningful to future people could end up tied up forever anyway, which is weird. Even trying to be sure to have the necessary money until death could be an issue.
I’m not really motivated to play, but as an example I’m statistically likely to die in under 25 years barring some very major life extension progress. I’m old for this forum, but everybody has an expiration date, including you yourself. Locating your heirs to pay them could be hard.
Deciding the bet can get hard, too. A recognizable Less Wrong community as such probably will not last even 25 years. Nor will Metaculus or whatever else. A trustee is not going to have the same judgement as the person who originally took your bet.
That’s all on top of the more “tractable” long-term risks that you can at least value in somehow… like collapse of whatever currency the bet is denominated in, AI-or-whatever completely remaking the economy and rendering money obsolete, the Rapture, etc, etc.
… but at the same time, it doesn’t seem like there’s any particular reason to expect definitive information to show up within any adequately short time.
On edit: I bet somebody’s gonna suggest a blockchain. Those don’t necessarily have infinite lives either, and the oracle that has to tell the chain to pay out could disappear at any time. And the money is still tied up indefinitely, which is the real problem with perpetuities.
Hm, I don’t feel confident enough to place huge odds on none of these things being the answer (besides, the losses may appear deceptively smaller than they are; if you think $20,000 are a lot, try “$20,000 and having to explain to your wife why you lost $20,000 in a bet, all the while aliens may be attacking Earth”). I think the thing that really peeves me is running to “aliens” as the first exotic explanation as some do. If I witnessed something really unbelievable and seemingly breaking all laws of physics, and had plenty of evidence that it’s not just an obvious trick or forgery, then my next question would be “is someone messing with my brain?”. Because if we go into the realm of the seeming impossible, projecting hallucinations inside my brain via finely tuned electromagnetic fields or something seems still a lot more believable than aliens from outside the solar system to me.
I don’t think I have enough of a post history to participate. If I did, I’d factor into my bet that there may be less impact to be had in a world with advanced aliens, at least if those aliens could subdue an earth-originated ASI. Therefore, money might be less instrumentally valuable in that world.
It strikes me that you’re bearing a lot of risk beyond the face value of the bet. Even if we assume everyone is acting in good faith, there’s likely credit risk across 10 different people promising a $100k+ payout (because most people don’t have that much cash, and even among those who do, there’s some likelihood of falling below that level of liquidity over a 5-year period). On your side, it looks like you’re just sending people your side of the bet before resolution, so they bear zero credit risk, even though the credit risk on your end was smaller to begin with, because coming up with $5k is much more plausible (again, assuming good-faith actors). There’s also a time-value-of-money issue, where paying now makes your side more valuable. Take the bet you made with Eliezer: you gave him $1k today, and he promised $150k if the bet resolves in your favor; that’s actually about $117k in current dollars with 5-year risk-free interest rates around 5%. It might be better to bet in net-present-value terms.
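As a quick check on that figure, a minimal present-value sketch (the ~5% rate and 5-year horizon are the assumptions stated in the comment above):

```python
def present_value(future_value: float, annual_rate: float, years: int) -> float:
    """Discount a payout received `years` from now back to today's dollars."""
    return future_value / (1 + annual_rate) ** years

# A promised $150k payout in 5 years, discounted at a ~5% risk-free rate:
print(round(present_value(150_000, 0.05, 5)))  # ~117529, i.e. roughly $117k today
```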
Thank you for kickstarting an interesting discussion around this topic. I won’t bet against you; I, too, think most LessWrongers dramatically underestimate the plausibility of the extraterrestrial hypothesis.
The UK’s Project Condign concluded UAP were real and exotic, but were unknown natural atmospheric plasma phenomena (similar to the Hessdalen lights and reports of black or metallic-appearing ball lightning) generating electromagnetic fields that interact with human brains to induce psychedelic/out-of-body experiences (hence alien abduction and close encounter reports). Would this category of explanation (something that accounted for most or all weird aspects of the UFO phenomenon without being especially ontologically shocking in retrospect) count as “very weird,” in your view?
I’m happy to bet $150k against your $1k, resolves as your bet with Eliezer. DM or email me: ms at contact dot ms
I’ve made a 100:1 bet with Jehan Azad and have received $4k. If Eliezer loses his bet to RatsWrongAboutUAP, I send Jehan $400k.
I will take this bet if you’re interested. I’ll send you an email.
If you’re still looking, I’d be happy to bet at 50:1, my $5k against your $100.
Sure! Please reach out
Confirming that this is on, and I’ve received $100 worth of BTC from OP.
The explanation still has to ultimately relate back to explaining uap.
I am willing to bet 50:1 up to $20k. Would you be interested?
[EDIT] Up to $20k on my side, not up to $1M.
Yes! If you have any concerns over terms/resolution, please let me know. Otherwise, reach out with a means to receive payment.
I confirm that I have received the $400 today. Final resolution day: 11:59pm, June 16th, 2028.
Happy to make this bet!
Do you use Manifold Markets? It already has UAP-related markets you can bet on, and you can create your own.
Including one with the same terms.
This pattern matches to anonymous person on the internet offering free money, which is typically a scam. Safer to pass, I think.
Parts of it do match (free money, to be repaid years from now), parts don’t (large liability years from now if the OP is correct, preference for crypto as irrevocable money transfer, desire for public agreement and public adjudication). The trust level implied by “accepting party has final say” and “hold all the money for years” is much higher than normal, which often indicates scam. The fact that I don’t see the scam (despite knowing a bit about common ones) is some evidence that it’s not a scam. The non-specificity of terms (which payment method(s) to use, what odds they’ll take, what min/max amount to consider) could go either way.
If OP were trolling for suckers or running an overpay/refund/revoke scam, they’d scale out rather than picking just one target—offer a bet to all takers, in hopes that multiple will be duped. That doesn’t seem to be happening.
Note that it can fail to be real without being a scam. An over-simple offer that is regretted before payment is irrevocable means no bet occurs, but that’s not scammy, it’s just over-aggressive signaling in wanting to make a bet and then avoiding the pain of actually making the payment. This is where I put most of my probability weight on failure (though some to scam, of course).
Doesn’t smell like it to me, and paying up front makes scamming harder. Are you thinking “he’s scouting for marks” or “the cost is mostly in dispute headaches” or “people who join schemes this weird end up in a ditch”?
There are free money scams where someone transfers money from stolen credit cards.
One way might be to agree to pay $1000 and then “accidentally” transfer $2000 (from a stolen credit card) and then ask the person to transfer $1000 back to another bank account.
I think RatsWrongAboutUAP did offer to pay in crypto, which removes the option for these kinds of fraud. Otherwise, simply not transferring any money back even if someone overpays you is also a good heuristic.
Scamming is adversarial, so it’s normal for a scam to appear like it’s safe. But I’m not claiming my pattern match is superior to yours.
A scam could include getting financial information to get money, or a voice print for impersonation. Maybe the scammer has insider information about UFOs. Maybe it’s entrapment for breaking gambling laws. Maybe a journalist is writing a story about how evil rationalists exploit innocent people with fringe beliefs for money.
The scam probability doesn’t have to be large for it to dwarf the apparent benefits.
Rare counterexample
Is all the money gone by now? I’d be very happy to take a bet if not.
Codyz bets Algon according to the terms of this post with 150:1 odds.
Cody bets $120 USD that UFOs are prosaic against Ali’s $18,000 USD that they are not.
Cody sent the $120 USD today, 8/22/23, via Paypal. Ali to confirm receipt below.
Codyz has sent me the funds and I agree to the terms of this bet.
My understanding is that most users have been putting down money against OP. I would like to join OP’s side, with the same terms, for people who haven’t had their bets filled.
Willing to stake up to $100k in a 20:1 deal. If someone wants to put $200k down on the phenomenon described above having fully prosaic explanations, for example, I will deposit $10k into an escrow contract on Ethereum and have it sent to a predefined address in 5 years if adjudicated as such. I will do this up to $100k deposited in the contract. Funds on the counterparty side must be shown to be present, although I’m not requiring that the amount be deposited into the escrow contract as well.
I’m a different person starting a new bet like this one. I’m looking for 150:1 odds for a 3 year time frame. I suspect people are less willing to make this bet today than they were in July.
Are you still open to taking bets?
Hey, this might be a little bit “last minute” considering the stuff tomorrow, but are you willing to do transfers with regular money instead of crypto?
I’m willing to send transfers with regular money. You don’t have much post history, though, so the odds would need to be favorable, or you’d need something like real-name verification, etc.
That’s understandable. Supposing I went with some form of verification, what odds would you be feeling right now?
I’ll DM and we can discuss
I will also take this bet, on the side of /u/RatsWrongAboutUAP, with exactly the same terms, with anyone who wants it. I will pay you now; you pay me 150:1 if UFOs are supernatural within the next 5 years.
@RatsWrongAboutUAP I’m willing to risk up to $20k at 50:1 odds (i.e. If you give me $400 now, I’ll owe you $20k in 5 years if you win the bet) conditional on (1) you not being privy to any non-public information about UFOs/UAP and (2) you being okay with forfeiting any potential winnings in the unlikely event that I die before bet resolution.
Re (1): Could you state clearly whether you do or do not have non-public information pertaining to the bet?
Re (2): FYI, the odds of me dying in the next 5 years are less than 3% by SSA base rates, and my credence is even lower than that if we don’t account for global or existential catastrophic risk. The reason I ask not to owe you any money in the worlds where you win (and are still alive to collect) but I’m dead is that I wouldn’t want anyone else to become responsible for settling such a significant debt on my behalf.
If you accept, please reply here and send the money to this Bitcoin address: 3P6L17gtYbj99mF8Wi4XEXviGTq81iQBBJ
I’ll confirm receipt of the money when I get notified of your reply here. Thanks!
I do not have any non-public information about ufos/uap.
Sure
Please verify reception of funds and confirm our bet https://blockstream.info/tx/ab7173abec208a6eda17bdf1b75668bc5e6efe46356f40109125a42962bfb9e2
Received $400 worth of bitcoin. I confirm the bet.
I’d love to do this, but I would have a hard time paying out because, for reasons beyond my control and caused by other people’s irrationality, I’m on SSI (although that might change in a few years). In the US, people can’t save more than $2000 in liquid assets without losing their benefits, so I can’t stake much, and I probably wouldn’t be able to pay out, because every transaction must be justified to the government; although small purchases for entertainment would go through, I’d have a hard time defending paying $1000 or whatever on a bet. Also, I’ve tried to work around this with crypto and lost everything I put in to a scam.
I was thinking about just lying about what I could pay back, but being alienated by what seems to be the only sane and good community on the planet would be a much bigger cost. (Other people try to be sane and good, and the lesson I’ve learned is that “ethics” is what people talk about when they are about to make things worse for everyone, except for the rationalist community.)
I would also gamble at 200:1 odds for up to $5000. I have a strong conviction that all UFOs are prosaic in nature and have held this conviction for more than a decade at this point.
Acceptable offer, reach out with payment details
Why would you accept an offer from an account that has three comments on LessWrong and thus no reputation to lose?
Because in absolute terms I would only have to put up $25. At the lower end I’m willing to be more flexible. I also just want to bet with more people. I’ve received a lot of offers in comments, but only 3 people have actually reached out and finalized bets with me.
Interested in my $100-200k against your $5-10k.
This is late, but if betting is still available, I think I’d take 1:60 odds.
In addition, I am willing to reveal my identity (in private) and write an actual contract, in the interest of creating a stronger sense of commitment and seriousness, if you’d like. I am also willing to return the exact sum at the end of the 5 years if we reach an “impasse” where you believe strong evidence has been provided that I do not recognize as such (for example, if belief in a supernatural origin for UFOs becomes common in the coming years for various reasons).
I am also very interested in your justifications for this bet. Are there any historical UFO “cases” that you find compelling?
I like the odds and appreciate the offer, but these terms do not interest me.
EDIT: You can safely disregard the second paragraph of this, I misread the post initially. Still, the first applies.
In the event that you decide you’re being stiffed, how will you quantify community sentiment on the issue to try to prove that the majority of the community believes in one of your categories of anomalous claims? Will you conduct a poll of some kind? Or will you just say that you beg to differ?
Also, in the event that you’re actually someone who has assessed that they don’t want to be on LessWrong greater than 5 years from now anyway in the timeline where no substantial UFO/UAP evidence has surfaced by then, what would compel you to pay up instead of ghosting?
I said in a comment to lsusr yesterday (https://www.lesswrong.com/posts/oY9HNicqGGihymnzk/intelligence-officials-say-u-s-has-retrieved-craft-of-non?commentId=od73EXuSL6uKLFfeD) that I would update the post today to address his concerns, but honestly I’m feeling very lazy and mostly disagree that it’s unclear what I’m trying to do.
I will be picking some people and moving forward with the bets today. I will ensure with my counterparties that any individual concerns they have are addressed.
I am still open to betting with more people (and would love to do so!).
To be clear, to resolve the bet in your favor, it has to be the case that:
a) We have >50% credence in “ontological shock” as you define it
and
b) the UFOs/UAPs identified as of June 13, 2023 are meaningfully the source of such “ontological shock”, right?
(To be more explicit, I want to exclude scenarios like the following from being scored in your favor:
1. We discover novel philosophical arguments or empirical evidence that leads LessWrongers to believe we’re on balance more likely to live in a simulation than not.
2. Causally, the UFOs are a result of simulations, because everything we experience is a result of simulations, including hot-air balloons).
Anyway, happy to bet at 30:1 up to $30k of my money, assuming that I don’t have to commit the money to a third party and this is just a standard bet. (I’m willing to bet at those odds in either nominal USD or inflation-adjusted 2023 USD; if you want a different way to denominate the bet, please let me know and I’ll think about it.)
Correct, I’m not trying to collect on something like that. In the case of a simulation, I would only collect if it was also the case that the UFOs were something unique and specific (i.e., actual glitches, or something expressly put in by the programmers), as opposed to being trivially simulation-caused in the way that you mention.
Correct, you do not have to commit the money to any third party; you merely have to affirm that you will pay out in the event that I win.
At the moment 30:1 is less than I would prefer; if I do not get enough new offers in the next few days, I might take you up on this. 50:1 is currently the lowest I have accepted.
I’m sure that this time around, it’s definitely real aliens. Or, barring that, magic or time travel.
Wish to bet on it?
I’d consider that to be exploitation. In addition to that, too-easy-to-win bets make me wary of something unpredictable going wrong.
Do too-hard-to-win bets make you wary of something unpredictably going right?
Yes. If I relied on losing a bet and someone knew that, their offering me a bet (and therefore a loss) would make me wary that something would unpredictably go right, I’d win, and my reliance on losing the bet would be thwarted.
If I meet a random person who offers to give me $100 now and claims that later, if it’s not proven that they are the Lord of the Matrix, I don’t have to pay them $15,000, then most of my probability mass in “this will end badly” won’t sit on “they really are the Lord of the Matrix.” I don’t have the same set of worries here, but the worry remains.