Curated. This was a great recommendation and a very readable account of an ultimately failed attempt to find a missing person. I was drawn in by eukaryote’s description of the case and, especially, the processes of the searchers. I watched some of the linked videos and sampled some blog posts. (It has also made me feel a little rueful about my time traipsing through washes in the Colorado Desert by moonlight – though in my case it was a bit more populated).
I feel a bit of pressure to precipitate out some kind of neat rationality takeaway from the story of this search. For example, the park management’s Simulacrum II decision to distort a shared map may have led to tragedy (which I think is pretty plausible to anyone who has tried to use directions that don’t indicate the turnings you shouldn’t take).
But I don’t think there’s one neat takeaway. It’s a story of carefully building a model of the situation, figuring out which hypotheses fit that model, and dealing with the slow drip of inconclusive data. I think it’s pretty valuable to read the logs of how people actually tried to reason through the problems they faced, and as eukaryote writes, “Mahood is really good at both being methodical and explaining his reasoning for each expedition he makes, and where he thinks to look”.
One moment in Marsland’s video really stuck out to me: he is finishing a hike that matched Ewasko’s planned routes, in similar conditions. And as he talks to the camera, he’s very visibly slurring and affected by the heat. It’s got to be pretty hard to think under those conditions. I wonder whether whatever rationality I have will still work for me when I’m so depleted.
And of course, I am contractually obliged to mention that in 2018, someone applied Bayesian search theory to the case. That is, they started with a simple prior over where he could be and updated against him being on or near the paths the searchers took. Here is the posterior superimposed on a map of the search area:
And here again is eukaryote’s map; the purple dot marks the place where Ewasko’s body was eventually found:
It took me a little while scrolling back and forth to mentally map the purple dot onto the first image. In case anyone else has the same issue:
What does it mean, fundamentally, when something is NOT where it is most likely to be (like Ewasko’s body here, well outside of the most searched zone)? Or more generally, when something—permanently—is NOT the way it is most likely to be? Does it mean our assessment of the likelihoods was wrong?
Not necessarily; it could mean you’re missing relevant data or that your prior is wrong.
EDIT: @the gears to ascension I meant that it’s not necessarily the case that our assessment of the likelihoods of the data was wrong, despite our posterior being surprised by reality.
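For concreteness, the kind of update Bayesian search theory makes can be sketched in a few lines. This is a toy illustration with made-up numbers and a made-up grid, not the actual model used in 2018: start with a prior over cells, then downweight each searched cell by the probability an unsuccessful search would have missed him there.

```python
import numpy as np

# Toy sketch of a Bayesian search update (illustrative numbers only).
prior = np.full((5, 5), 1 / 25)      # uniform prior over a 5x5 grid of cells
detect_p = 0.8                       # assumed P(search finds him | he's in the searched cell)

searched = np.zeros((5, 5), dtype=bool)
searched[1:4, 1:4] = True            # the cells the search teams covered

# An unsuccessful search multiplies a cell's prior by
# P(not found | he's there) = 1 - detect_p; unsearched cells are unchanged.
likelihood = np.where(searched, 1 - detect_p, 1.0)
posterior = prior * likelihood
posterior /= posterior.sum()         # renormalize

# Probability mass drains out of the searched block toward the unsearched edges.
```

Note that even after many unsuccessful passes, a searched cell’s posterior never hits zero (unless `detect_p = 1`), which is one way a body can end up outside the hottest part of the map without the likelihood model being wrong.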