successfully bought out
*got paid to remove them as a social threat
For people who want weirder takes, I would recommend Egan's "Unstable Orbits in the Space of Lies."
To +1 the rant: my experience across the class spectrum is that many bootstrapped successful people know this but have learned not to talk about it too much, as most people don't want to hear supporting evidence for meritocracy; it would invalidate their copes.
To my younger self, I would say: you'll need to learn to ignore those who would stoke your learned helplessness to excuse their own. I was personally gaslit about important life decisions, not out of malice per se but out of this sort of choice-supportive bias, only to discover much later that jumping in on those decisions actually appeared on lists of advice older folks would give to their younger selves.
Notkilleveryoneism: why not "Omnicidal AI" instead? As in, we oppose OAI.
Thank you for writing this. A couple of shorthands I keep in my head for aspects of this:
My confidence interval ranges across the sign flip.
Due to the Waluigi effect, I don’t know if the outcomes I care about are sensitive to the dimension I’m varying my credence along.
I often feel that people don’t get how the sucking up thing works. Not only does it not matter that it is transparent; that is part of the point. There is simultaneously common knowledge of the sucking up and common knowledge that those in the inner party don’t acknowledge the sucking up; that non-acknowledgment is part of what inner-party membership consists of. People outside can accuse the insiders of nakedly sucking up, and the insiders can just politely smile at them while carrying on. Sucking up can be what deference networks look like from the outside when we don’t particularly like any of the people involved or what they are doing. But their hierarchy visibly produces their own aims, so more fools we.
The corn thresher is not inherently evil. Because it is more efficient than other types of threshers, the humans will inevitably eat corn. If this persists for long enough, the humans will be unsurprised to find they have a gut well adapted to corn.
Per Douglas Adams, the puddle concludes that the indentation in which it rests fits it so perfectly that it must have been made for it.
The means by which the Ring always serves Sauron is that any who wear it and express a desire will have the possible worlds trimmed not only in the direction of their desire but also in the direction of Sauron’s desire, in ways that they cannot see. If this persists long enough, they may find they no longer have the sense organs to see (the Mouth of Sauron is blind).
Some people seem to have more dimensions of moral care than others; it makes one wonder about the past.
These things are similar in shape.
Even a hundred million humanoid robots a year (we currently make 90 million cars a year) will be a demand shock for human labor.
https://benjamintodd.substack.com/p/how-quickly-could-robots-scale-up
No, they don’t; billionaires consume very little of their net worth.
I am very confused why the tax is 99% in this example.
The post does not include the word "auction"; the auction mechanism is a key aspect of how LVT avoids some of these downsides.
Yes, and I don’t mean to overstate a case for helplessness. Demons love convincing people that the anti-demon button doesn’t work, so that they never press it even though it is sitting right out in the open.
Unfortunately, the disanalogy is that any driver who moves their foot towards the brakes is almost instantly replaced with one who won’t.
High variance, but there’s skew. The ceiling is very high and the downside is just a bit of wasted time that likely would have been wasted anyway. The most valuable ones alert me to entirely different ways of thinking about problems I’ve been working on.
no
Both people ideally learn from existing practitioners for a session or two; ideally they also review the written material, or in the case of Focusing, also try the audiobook. Then they simply try facilitating each other. The facilitator takes brief notes to help keep track of where they are in the other person’s stack, but otherwise acts much as, e.g., Gendlin acts in the audiobook.
Probably the most powerful intervention I know of is to trade facilitation of emotional digestion and integration practices with a peer. The modality probably only matters a little, and so should be chosen for whatever is easiest to learn to facilitate. Focusing is a good start; I also like Core Transformation for going deeper once Focusing skills are solid. It’s a huge return on ~3 hours per week (90 minutes facilitating and being facilitated, across two sessions) IME.
“What causes your decisions, other than incidentals?”
“My values.”
People normally model values as upstream of decisions, causing them. In many cases, values are downstream of decisions. I’m wondering who else has talked about this concept. One of the rare cases where the LLM was not helpful.
AI developers heading to work, colorized
https://imgflip.com/i/9k1b0o