Consider these two cases:
A university graduate on a job hunt just can’t find something that matches his skills. People never get back to him, interviews go nowhere, promising opportunities peter out. Attempt after attempt leaves him frustrated and empty-handed, and as his motivation thins or his desperation thickens, he finally decides that he needs something and applies for a job he’s massively overqualified for, maybe an entry-level position in a supermarket. Then, to his horror, that job rejects him.
A woman trying to pay her bills runs into complications—her checking account is overdrawn by an unexpected payment, a new emergency crops up or someone in her life needs her help. Running out of ideas, she finally elects to sell her more valuable items. The process of losing so much is tearful, but she prepares to part with most of her library, her furniture, her entertainment and so on. Only after trying this does she discover no one wants any of it—used bookstores won’t give her more than $30 for her collection and no one will respond to her ads.
I’ve had experiences like these, even today. And I think much of the learned helplessness in my life stems from situations like these, where things turn out measurably worse than I’d thought possible. I would call this insufficient cynicism, but I think it’s a combination of overvaluing the likelihood and value of certain options and flinching away from thinking about them too closely.
Some options don’t work out, and some options are painful to consider, but options that are both can be generators of helplessness and bitterness. I graduated early with a degree in Communications, and my experiences since have made me wish dearly that I’d made different choices; looking back, what would need to be different for me to predict that this achievement would make no actual difference to my career?
Society would need to be different overall. Everybody is told that things and skills are more valuable than they actually are. If somebody believes this—as most people naturally are inclined to—they will fall into the traps you describe. Like you say, cynicism helps, as does growing up in a family that has more realistic views overall, or having other key people (e.g. teachers) who help calibrate this right. Reading LessWrong probably also helps.
I’m skeptical that reading LessWrong helps; I had the Sequences and HPMOR under my belt well before I graduated. I also received the advice that I should take something like a business degree, though maybe not often or forcefully enough to sink in. If someone had taken me to one side, shaken me by the shoulders and told me in no uncertain terms the difficulties I’d have, I want to think I’d have chosen differently.
I think cynicism often assumes an unreasonable level of hindsight, though. If we were in another branch of the multiverse where I’m an assistant librarian or working HR somewhere, it would be wrong of me to say that I’d overvalued my degree. And of course I may wind up doing one of those things, in which case it was good for me to have a degree after all, but if I had perfect foresight on graduation day I’d have done something else in the few years between getting the degree and getting the position.
Thinking more about it, I think one of the major factors in these traps is that avoiding them codes them as final actions. There’s a gun on the mantle you never touch, and when you finally pull it down in an act of desperation it winds up being jammed from so many years of disuse. Or there’s money in your mattress you never even look at for a decade, and then when you need it the most half of it’s vanished. In real life last resorts don’t really exist—a problem that no available solution will work on can at least be broken into smaller problems on which further solutions can be attempted.
Thus it was written, “The repeated failures didn’t matter, they only led into the next action in the chain—but he still needed a next action—”
I think society tells a lot of lies, or at least often lies by omission, and it is difficult to correct these lies, because they exist for a reason—there is some social mechanism that rewards the liars and punishes the truthtellers, for example by raising the status of the liars and lowering the status of the truthtellers.
If anyone asked me what to study to maximize the chance of getting a job, STEM would be the obvious answer. (Although it depends on who asks; some people do not have the necessary skills/traits. I don’t know what the right answer would be for them.) But if I gave such advice in public, I can easily imagine the backlash. Humanities are high-status, STEM is… let’s say medium-status, because it is associated with nerds but also with money… definitely lower-status than humanities. The proper way to express it is that STEM makes people merely smart, but humanities make them wise. (Wise = something like smart, but mysterious and higher-status.) Recommending STEM feels like an attempt to give nerds high status, and invites an angry response. Which is the reason why you haven’t heard such advice more often and more strongly (in general, not just on LessWrong).
The problem is that the advice “study humanities, not STEM” is actually correct for a small part of society, namely the rich. If you have so much wealth and so many connections that you will never need a job to pay your bills, but you still want to study something, because for some weird reason having a university education is considered higher-status than not having one… then humanities are definitely the right choice. You want to know something that other people at least partially understand, so that you can impress them with some smart quotations; you don’t want the inferential distance to be too large. Also, if you study something that makes it quite difficult to get a job, that’s good counter-signaling! As a rich person, you don’t want to be suspected of being someone who might need a job. The flip side is that by saying “study STEM” you advertise that neither you nor your friends are upper-class, so all pretentious people will loudly recommend humanities instead.
I have yet to meet the opinion in the wild that the humanities are better or more worthy of study than STEM, and I’m skeptical that a degree in, say, entomology has anything like the market value of one in physics. (That said, from some cursory checks I’m surprised at how much a non-applied math degree seems to affect one’s career, although I’m suspicious that the STEM label lets it get lumped in during analyses with potentially higher earners like statistics, physics and engineering.) And my courses were largely on subjects like web design, running radio stations and using AV equipment, which I wouldn’t put under the same heading as studying history or literature in any case.
More to the point, though, would this line of thinking have saved me my present headache? I knew at the time that STEM was more marketable, of course, but I still underestimated how little my own degree would do for me—I assumed that I would do something in an office setting and be a writer at night. I can’t even chalk it up to the degree itself being terrible or just a big counter-signal for the eccentric elite, because plenty of my classmates went on to at least have decent desk jobs.
Despite all of that, though, if I’d really believed that mathematics or engineering was the only path and that my stalling at precalculus level would simply have to be overcome—that it was this or nothing, in a nutshell—then I think I would’ve chosen a better direction.
Reading this, it seems your question was more about the second part of failing: when plan B (or C, or D) fails. Or how to reduce the likelihood that they do—or how to have more realistic expectations about that.
I was once running out of options and falling back to plans C and D on a big life topic. It was related to joblessness. A previously existing plan B for such a situation had become unavailable for other reasons (also big ones, but those did work out). I scrambled and got a good job offer and started to relax, but they postponed the signing for weeks and weeks—while promising it would go through. It worked out well, but there was a point where I resigned myself to accepting a much lower economic level. Not too bad, as Germany has a good safety net.
The level of the safety net depends very much on the country of course, but the general pattern seems to be that few people will worry about other people’s economic mishaps. Many more ways lead down than up, and you are left in the dark about which ways lead to the good places.
I’ve noticed two points in recent life where I’ve fallen prey to Goodhart’s law, and I’m working to improve them.
1) On my birthday a few days ago I finally dropped a goal I had for this year, to read as many books as possible. (My main strategy for this was to tell people how many books I’d read and maintain a list on Goodreads, adding each book as I finished it.) For the last three years I’ve tried to measure books per year as a way to gauge how much I was learning, but as I improved and the number increased, the gap between “reading lots of books” and “learning” widened.
2) I’ve tried many different methods for organizing tasks, and my current method is to fill cells in a spreadsheet with goals and then color them in as I complete them. Although I enjoy it, this has led to many unnecessary problems, like running out of tasks to fill the space presented and feeling a gnawing incompleteness. Worse, a large task either sits alone in its cell generating akrasia or gets split up into a column of steps that are mostly unactionable.
I’ve also had to accept the repeated evidence that majestic-looking lists of tasks will only lead to eighty or so items being completed in a day, and I’ve moved more toward making one column of immediately-actionable tasks, finishing 75% of it or more, and then creating a second and a third and so on. (This still produces the flaw of all task systems I’ve found, that a kernel of akratic items will accrete and show up over and over from list to list, but I haven’t worked out a solution to that. Even typical “eat the frog” style advice only changes the composition of that kernel, hopefully to things you don’t care so much about.)
So in both of these cases I’ve noticed that I’ve gotten sidetracked by optimizing things I don’t (want to) care about. Strategizing my day and making sure I don’t forget anything doesn’t require thinking of a number of tasks divisible into columns of 20 or even keeping track of how many things are done. Likewise, reading lots of books is orthogonal to learning lots of things, and the right volume on its own can be worth fifty or a hundred random paperbacks.
The problem I continue to run into, and the root of why Goodharting is so easy to do, is that it’s difficult to operationalize these goals any other way. The desire to learn can be broken into “true” goals GTD-style and then measured by understanding certain texts, carrying out certain actions correctly or passing certain tests, but this requires a pretty solid understanding of the subject matter. I’d like to learn more about polyominoes, for instance, but knowing what that “more” consists of (or whether, having learned about them, I’d rather have learned something else) requires its own baseline of knowledge. My next solution to this going forward is probably going to be to have smaller, “scouting” goals for understanding fields enough to develop more concrete ones.
Operationalizing productivity itself seems more difficult, but I think part of it might be resolved with a better integration of habits or schedules. One very big flaw of all my lists has been that they create a sense of accomplishment whenever they’re filled out instead of encouraging a general sort of well-being throughout the doing of the tasks—writing only feels good once I’ve written a certain amount. The moments where I’m most thankful for lists like these are ironically the times that they drop out of my awareness and I start wanting to do more in one of the streams of work I’ve entered into.
I’ve been writing things in one form or another for about 13 years, and I’ve passed the first “million bad words” every writer is said to have in them. I’ve also released around 200 rap songs under different names, which tends to come off lower-status but is no less dear to me. (I would consider 500-1000 songs to be the equivalent “bad” cutoff, though.) Since these are the things I’ve spent the longest time learning and practicing, I want to see if I can apply anything in them to learning and practicing in general. Here are my thoughts so far:
Every art is really several arts. There is no “learning to write” separate from learning characterization, imagery, the elements of style, dialogue, text formatting and so on.
Likewise, these sub-skills are not all created equal—while content, rhyme scheme, flow, swing, breath, cadence, delivery and so on may be the most relevant to other musicians critiquing your work, the final judgment of what you make can depend on things that seem at first to be unimportant or mere polish, like the technical specs of your microphone or the level of audio engineering performed on your finished product.
Attainment individuates. Rank beginners produce things that are very alike because they’re making the same handful of beginner mistakes. “Beginner mistakes” themselves are just how humans natively do things, and it’s only in overcoming them that the wider space of possible performances is opened.
Because of this, skills and techniques are generally about iterating away from whatever your earliest work looks like, one step at a time. And from wherever you’re standing, it’s possible to identify someone as being more or less skilled than you. (Someone for whom it’s hard to do this, who seems better at some things than you and worse at others in a way that’s inconclusive, is basically level with you.)
You can give more advice to one student than to all the students in the world, and most advice is written for as many of them as possible. 1-on-1 lessons with someone you want to be more like can be worth more than all the articles or books you can find, if you’re able to truly accept how many things still need improving.
To reemphasize: the person who teaches you must be someone you want to be more like. Not all advice is instruction in how to be more like the advisor, because people can recognize and speak about their own flaws more easily than they can fix them, but you should expect to wind up more like your teacher than if you hadn’t studied under them.
When it’s said that those at high skill levels are “never done learning”, it’s because they’ve reached a finer granularity in what they can improve.
Early work is typically bad even to you, which can be extremely frustrating, but after enough practice it’s possible to be impressed by your own work—especially if you’ve left it for long enough to forget the details. Despite this, impostor syndrome never really goes away; it only adapts to your new situation.
Arts produce communities of practitioners. The majority of these people will be worse at what they’re practicing than you’d expect, because they fall below the skill level usually presented publicly. At first you may not recognize that you’re one of these people, and that your work is not so great either.
Because of this, spaces practitioners use will typically sprout a policy of giving as little criticism as possible.
This is not necessarily a bad thing; criticism is not a panacea, not all of it is worth following and there’s only so much of it you can act on until you’ve internalized and digested it. Plus oftentimes people really do just need a bit of social encouragement so that they can continue forward. (This goes doubly so for anyone who makes things for free.)
There’s nothing wrong with simply wanting praise, but know that’s what you’re looking for and don’t confuse it with a desire to cleave off your flaws as fast as possible.
The more work you produce the easier criticism will be to take, because you identify less with the thing being critiqued.
The practice of creating something is really the practice of making very small moment-to-moment decisions, usually between the handful of ideas that arise in your mind about which thing to do next.
As you notice the flaws in your own work and the work of others, you’ll eventually notice flaws in the work of those you admire, including the ones who inspired you to begin in the first place. The end result of this isn’t dislike of your past heroes, but an understanding that they really were just like you are.
Beginners should aim to use or dismiss as many of their ideas as possible, because better ones will appear given space and attention. Your ideas and insights into what to do will never be worse than they are now, and future ones need room the present ones are taking up. Be less concerned about running out of ideas than with the quality of the ones you already have.
The simplest way to encourage ideas is to write down each new one as soon as possible after it enters your mind, especially the ones that are no good. This both keeps the way clear for better ones and encourages your brain to generate ideas continuously.
Doing something well and doing it well in front of an audience can be completely different skills, and sometimes your feeling that you’re good at something will completely dissolve once a small group of strangers are watching you. (When I attempted a small concert at a convention the songs I thought I could repeat by heart were suddenly so foreign to me that I had to read them off of a phone.)
Likewise, there are things which may seem trivial or “part of the package” which no amount of normal mastery will teach you, and the only path to these is to learn them separately. (The easiest example of this is freestyling, which I can’t do to save my life despite it being the first thing anyone asks for upon hearing that you make rap—to develop it, I would need to start fresh with the skillset involved, which focuses more on quick-wittedness and knowing precisely how much to extend oneself.)
Sounds correct. I was thinking how this applies to computer games:
Several subskills—technical perfection, new ideas, an interesting story, graphics, music… Different games become popular for different aspects (Tetris vs Mass Effect vs Cookie Clicker).
A frequent beginner mistake is making a game with multiple levels which feel like copies of each other. That’s because you code e.g. five or ten different interactive elements, and then you use all of them in every level. It makes the first level needlessly difficult, and every following level boring. Instead, you should introduce them gradually, so each level contains a little surprise, and perhaps you should never use all of them in the same level, but always use different subsets, so each level has a different flavor instead of merely being more difficult (see the sketch after these points).
Another beginner mistake is to focus on the algorithm and ignore the non-functional aspects. If one level has a sunset in the background, and another level uses a night sky with a moon, it makes the game nicer, even if the background does not change anything about the functionality.
Yet another mistake is to make the game insanely difficult, because as a developer you know everything about it and you have played the first level a hundred times, so even the insanely difficult parts feel easy to you. If most new players cannot complete the tutorial, your audience is effectively just you alone.
Some people may be successful and yet you don’t want to be like them, e.g. because they optimize the product to be addictive, while you aim for a different type of experience; or their approach is “explore the market, and make a clone of whatever sells best”, while you have a specific vision.
You should make a very simple game first, because you are probably never going to finish a complicated one if it’s your first attempt. I know a few people who ignored this advice, spent a lot of time designing something complex, in one case even rented a studio… but never finished anything. (Epistemic check: possible base-rate fallacy; most people never finish a complete computer game, and this might include even most of those who started small.) And the more time you have wasted trying to make a complicated game, the less likely you are to give up and start anew.
Successful game authors often recycle good ideas from their previous, less successful games.
The audience is famously toxic. Whatever game you make, some people will say horrible things about the game and about you in general. It is probably wise to ignore them. (Epistemic check: so you’re saying that you should only listen to those who liked your game? Yeah… from the profit perspective, the important thing is how many fans you have, not what their ratio to haters is. A game with 1000 fans and 10000 haters is more successful than a game with 10 fans and 1 hater.)
Being good at designing logical puzzles does not translate into being good at designing 3D shooters, and vice versa.
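Here is a minimal sketch, in Python, of the earlier point about introducing elements gradually and using different subsets per level. The element names, subset size and rotation rule are all hypothetical, just to make the contrast with the all-elements-in-every-level approach concrete:

```python
# Hypothetical mechanics standing in for the "five or ten different
# interactive elements" mentioned above.
ALL_ELEMENTS = ["spikes", "moving_platform", "key_and_door",
                "teleporter", "wind_gust"]


def naive_levels(num_levels):
    """The beginner version: every level uses every element,
    so levels can only differ in difficulty."""
    return [list(ALL_ELEMENTS) for _ in range(num_levels)]


def staged_levels(num_levels, subset_size=3):
    """Introduce one new element per level and pair it with a small,
    rotating selection of earlier ones, so no level uses everything."""
    levels = []
    for i in range(num_levels):
        unlocked = ALL_ELEMENTS[:min(i + 1, len(ALL_ELEMENTS))]
        newest, older = unlocked[-1], unlocked[:-1]
        # rotate the older elements over time so levels get different flavors
        rotated = older[i % len(older):] + older[:i % len(older)] if older else []
        levels.append([newest] + rotated[:subset_size - 1])
    return levels


if __name__ == "__main__":
    for n, elements in enumerate(staged_levels(6), start=1):
        print(f"Level {n}: {elements}")
```

The only point of the contrast between naive_levels and staged_levels is that both draw on the same handful of coded elements, but the second gives each level its own small combination instead of the full set.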
I’m impressed by how accurately this describes learning complex skills.
I’m practicing writing and I feel the way most of the points describe: as if I’m exploring a system of caves without a map, finding bits and pieces of other explorers (sometimes even meeting them), but it’s all walking a complicated 3D structure and constantly bumping into unknown unknowns. Let me illustrate it this way: about 3 years ago, when I started on this journey, I thought I would read 1-2 books about writing and I’d be good. Now, I’m standing in sub-cave system #416, taking a hard look at “creativity”/”new ideas” and chuckling at my younger self who thought that sub-cave system #18, “good sentences”, would lead him to the exit.
And even though I haven’t practiced Brazilian Jiu Jitsu since the pandemic began, I see a lot of similarities there. At first, I thought I just had to practice a move. Then I noticed that there are many small variations depending on my energy level, the opponent’s size and weight, etc. Then I noticed that I could fake moves to lure my opponent into making mistakes, but should avoid mistakes myself. Then I noticed that my opponents were better at some moves than others. Then I noticed that my own build gave me certain advantages and disadvantages. Then I noticed...
In the end, just before the lockdowns, I learned a lot about humility and began to discard all the “factual knowledge” I got from YouTube videos or books, and instead began focusing on sets of small details, exploring how they worked in different situations. Then I just practiced them over and over until I saw “the thing”.
I just listened to your “Jake the Adversary” albums. I really liked these songs: Misbegotten, H E L L O _ W O R L D, Failure Mode, Zeitnot, Thanateros.
Seems to me that you put quite differently sounding songs in the same album; was it on purpose? (Do you think it makes sense from the business perspective?)
A technical note: you seem to play with various sound effects, and sometimes it is awesome, like the ticking sound in Zeitnot, and sometimes it is annoying, like the “instant message” sound in Waystation (I would otherwise like the song, but that sound just triggers me), or the cracking sound in 960,000.
Thanks for listening to them! I have mixed feelings about most of what I make, but I think those songs are alright. My approach to making music does tend to strike a lot of different chords in a small space, but it’s more that I just feel like writing in one or two tones doesn’t really fit the ideas I have—most of my songs start as a list of concepts, and I rarely have an album’s worth of concepts that all fit together in the way other albums do.
A lot of those effects are just baked into the beats, which I had to use because I was scouring the Free Music Archive for CC-licensed material. I like the ticking in Zeitnot, too, which was the big influence in what the song ultimately became, but I agree sometimes the tracks would be better without them.
I also want to say that a couple days ago I relistened to AFAD, my Doxy album, and was pleasantly surprised at how well it holds up for me after six months of not thinking much about it.
Oh, the landscape in ink is wonderful, too!
(Though not good for exercise, which is what I use the other songs for.)
Related to half-assing things with all you’ve got, I’ve noticed that there’s often a limit to how much you can succeed at a given task. If you write an A+ paper for a class, it’s certainly still possible to improve the paper, just not in the context of the course.
The Buddhist teacher Chogyam Trungpa once put this as “You will never be decorated by your guru.” In the presence of a master, your development of your art will never blow their mind and leave them deeply impressed—and if it did, it would only be a signal that you’re in need of a different master. It cannot be another way.
Thus, we should recognize success by the context we expect it to appear in, or the person who will be measuring it, and calibrate accordingly.
Good observation! The converse holds too—we should change context for those things we want to do better than the current mechanisms can measure.
From what I remember, Selfish Reasons to Have More Kids contains only a very short mention of how to inspire good behavior in children—essentially the advice to punish consistently, especially including funny or endearing offenses. The more I think about this, the more it seems to me that discipline itself is the practice of training the same response to an increasing range of stimuli.
Progress in Vipassana, and in meditation in general, comes from engaging with fewer and fewer distractions. Progress in habits comes from decreasing the number of times that you allow some override or excuse to change your plans. Progress in security comes from shrinking the number of people you make exceptions for, maybe because you don’t want to be mean to them or there are extenuating circumstances. Progress in being a fair parent comes from providing the same consequences for the same actions, even if one of your children’s actions gets a warmer reaction from you.
This isn’t to say that discipline is the only virtue, or that you ought to seek discipline for its own sake; some excuses do rise above the threshold of acceptability. But the point of training discipline is, in a nutshell, to have consistent reactions to stimuli that invite inconsistent ones.
Consistency is a core part. The trick is to know when to make exceptions (and I don’t think I have figured that out fully yet): exceptions for legitimate excuses, but also for novel circumstances. Too much consistency doesn’t prepare kids for real life either. It also depends on how many rules you have to begin with, and how complex they are. Other aspects include how to handle differences in approach between parents and between other relevant households (e.g. godparents’).
Failure to apply a lesson is usually failure to register its relevance to your life. Don’t look for the moments where you try and retry but your effort doesn’t bear fruit, look for gaps between what you learn/know and your world-model.
This is one reason to make beliefs pay rent in expectations, and to ensure that all the nodes in your model of the world are hooked up to something. The go-to example of this is Feynman’s story of students who understood the math of refractive indices but not that water has one, leaving them unable to use what they’d learned. I read that example years back and then fell into the exact same trap, because it’s a story about science education and not general thought.
In my own experience, the most powerful moment of this so far was realizing my trans-ness, which is also my greatest “oops” moment. I’d found some internet content centered around trans people, including lists of potential signs that a person is trans and hasn’t realized yet. I knew these signs applied to some people, but I also thought the net they cast was far, far too broad because, after all, half of them even applied to me! (Looking back, I sometimes laugh at the utter lack of self-awareness present in that thought.)
I already knew people could be trans, and I don’t think more information on that front would have helped anything. The realization that changed the course of my life really was just that I counted as people too.
I think the logic behind this is that we handle many, many concepts and not all of them will apply in our lives, so the bar tends to be set pretty high for a piece of information to become relevant to us. There’s a default sense of immunity or exclusion to new concepts, even when we take them on knowing they surely must apply to something.
To handle catastrophizing, you have to first grasp that the thing you’re doing that seems to have traits in common with it probably isn’t a special edge case or a false alarm—and that the reason it hasn’t jumped out at you sooner as an example of catastrophizing is that you weren’t focusing on it. And if you read a guide to something like negotiation, the techniques involved will avail you nothing until you find reason to believe that you’re engaging in negotiations. Application rests on these kinds of understanding.