While we’re on the subject of conscientiousness: it’s considered an invariant personality trait, but I’m not buying it. The typical person may experience, on average, no change in their conscientiousness, but typical people also don’t commit to interventions that affect the workload they can take on, whether by strengthening willpower, increasing energy, changing thought patterns (see “The Motivation Hacker”), or improving organization through external aids.
I worry that you’re deluding yourself. What evidence have you examined?
I knew several extremely smart people who planned to study something else at the same time as their university course. All of them gave up these plans in a matter of weeks. There was one term where due to an organizational cock-up I had to do 25% more courses than normal. I survived, but it wasn’t pleasant. University is hard, probably the hardest thing I’ve ever done. (It was certainly much harder than any job I’ve had since).
Remember that older people are in charge of most hiring. It’s politically unacceptable to say that certain subjects are outright better than others, but they are (or at least will be treated that way in hiring), and a “funny” course like this will make it harder to get jobs than a more traditional subject would, even jobs for which this course is perfectly suited. (Explanation: even a more spread-out degree won’t give you the actual skills the job needs, but a more focused degree is proof that you can focus on a subject and learn it deeply.)
Your career will force you into specialization even if your degree isn’t specialized; after your first job or two people are much more interested in your experience than your degree. If you switch into another field after 10 years that means going back to being treated like a fresh graduate—very bad for your salary.
I thought you had something about motivation, but I can’t find it now; my experience is that doing something you believe in is not a sufficiently powerful motivation when you’re working on something unpleasant. You need to find a job you can be satisfied in first, and one that’s doing the right thing second; you won’t be productive if you’re unhappy.
So my advice is:
Do the most traditional, focused degree you can—in your case Computer Science (and do it at the most reputable university you can)
The prior odds of you successfully studying something else while at university are vanishingly low. Plan accordingly.
If you still want to follow this plan, I’d suggest you start the computer-science learning now. You’re not going to be much smarter next year, and you are going to be much busier. This should also give you some information about how much work you can do in practice.
I’m probably underweighing more conservative assessments like this, so I appreciate it.
motivation and self-delusion
I have not collected evidence that directly contradicts statistical assessments of the conscientiousness trait. Instead I’m making an inference from a collection of evidence that I can name. I don’t think I’ve given much consideration to evidence strength yet, so working through this will be a good exercise.
For example:
Historically my conscientiousness has been quite low, in part due to depression. I’ve been coming out of that depression recently and have improved in my ability to stay on task even when I’m discouraged. Oftentimes psyching myself out was the reason I hadn’t instigated behavioral change: when I fell off the wagon, I didn’t get back on. This shift toward optimism makes me feel comparatively more competent and more willing to explore my alternatives for support and skills.
Though, as a counterpoint: I am not experiencing mania, but the fact that I’ve recently acquired an optimistic temperament that has not yet been calibrated against the new action-space means I might now be overestimating my abilities instead of underestimating them.
But given that I am strongly interested in doing the things successful people do that I couldn’t do before:
Nick Winter’s assessments in his book “The Motivation Hacker” make me believe there is low-hanging fruit when it comes to motivation that I have not yet picked. I would guess the same holds for typically surveyed people, given the recency of prescriptive motivation literature like “The Procrastination Equation”.
Successful students and learners follow regular patterns of behavior that can be turned into habits. The particular sources would be the writings of Cal Newport and Scott H Young, in addition to consulting my academic advisors and the successful students themselves. Needless to say, I probably haven’t been using those patterns, which include precommitments, picking a good study environment and using it regularly, processing textbooks in a way that produces reviewable notes, and using office hours.
Twin and developmental studies might make me eat my words here, since I’m directly challenging claims about a personality trait. I’m feeling a bit of resistance to looking them up, but I should probably push through it and get it over with.
There are other conditions under which the amount of work and stress someone can take on goes up, like joining the military (yes, I’m considering it). But there are also less extreme options, like simply having good health and being more organized, or taking up a martial art or a sport. Not all of these are going to take off, and I most certainly won’t be doing all of them at once. So one obstacle I need to consider is the timeframe for orienting myself properly for success in biomedical engineering, and whether the value is greater or lower than the lost wages and other opportunity costs.
I have also experimented with nootropics, which I now believe are overrated but still a useful tool in the toolkit. Finally, I am beginning to use Anki, which might be a good way of managing larger volumes of knowledge.
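Since Anki comes up: its scheduling descends from the SM-2 family of spaced-repetition algorithms. A minimal sketch of the core idea, with illustrative constants that are not Anki’s actual parameters: intervals between reviews grow multiplicatively on each success and reset on a lapse.

```python
# Minimal SM-2-style scheduler sketch (illustrative constants, not Anki's).
def next_interval(interval_days, ease, passed):
    """Return (new_interval_days, new_ease) after one review."""
    if not passed:
        # Lapse: restart the interval and penalize the ease factor.
        return 1, max(1.3, ease - 0.2)
    # Success: the gap until the next review grows multiplicatively.
    return round(interval_days * ease), ease

interval, ease = 1, 2.5
history = []
for recalled in [True, True, True, False, True]:
    interval, ease = next_interval(interval, ease, recalled)
    history.append(interval)
print(history)  # intervals in days after each review
```

Managing a large volume of knowledge then reduces to reviewing whichever cards happen to be due each day, which is what makes the workload tolerable.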
At this point I would like to get answers to my questions about actual working conditions, hiring practices, and future work opportunities. Grabbing the experiences with the largest decision-relevant information-to-cost ratio could help me resolve whether this plan will work out, unless the evidence from existing models is already substantial enough to outweigh the potential evidence from going out and looking.
computer science and self-study; old people
There are at least two components here: the actual studying and skill acquisition, and the judgement made by the hiring practitioner.
I read in a popular PSA on Less Wrong that a handful of people have managed to get programming jobs through self-study. It still seems reckless, though. Would it be possible to define a satisficing case: the amount of practice that would take me to the profile of skills a hiring manager wants from an employee? That would help resolve the following:
whether self-study is even feasible given the target skill level and time constraints
if you control for skill level and condition on whether I have a compsci major or not, what do the probabilities of being hired look like? If a person with the major, at the skill level I expect to reach, has a largely dominating probability, then yes, I would want to reconsider.
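As a toy version of that comparison: if you can estimate the hiring probability per application with and without the degree at matched skill, the odds ratio and the expected number of applications make “largely dominating” concrete. Every number below is an invented placeholder, not data.

```python
# Toy comparison of hiring chances with vs. without a CS major at equal skill.
# Both probabilities are invented placeholders for illustration only.
p_hired_major = 0.30      # assumed chance per application with the degree
p_hired_no_major = 0.10   # assumed chance per application without it

def odds(p):
    """Convert a probability to odds."""
    return p / (1 - p)

odds_ratio = odds(p_hired_major) / odds(p_hired_no_major)
print(f"odds ratio: {odds_ratio:.2f}")

# Expected number of applications before a first offer (geometric model).
apps_major = 1 / p_hired_major
apps_no_major = 1 / p_hired_no_major
print(f"expected applications: {apps_major:.1f} vs {apps_no_major:.1f}")
```

The point of the exercise is that even rough elicited estimates from HR people or engineers would pin down whether the gap is a factor of 1.5 or a factor of 4, which changes the decision.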
I could talk to HR people or other software engineers at developer meetups, or at career fairs, to get a clearer picture of this. But if, as you claim, this is a political factor, then maybe I won’t get the evidence I need.
You need to find a job you can be satisfied in first, and one that’s doing the right thing second; you won’t be productive if you’re unhappy.
I’ll keep this in mind. It does seem safer.
Getting the actual programming skills is easy if you’re smart. Getting the evidence that will lead people to hire you is harder. Large companies tend to go by the book; you will need the qualifications, or something unusual like a personal recommendation from someone in the company. At startuppy places it’s more about fitting in with the culture and talking/coding well in the interview. If that’s the kind of job you’re after, you’ll probably be fine as a self-taught programmer, provided you can perform under interview pressure and you conform to the right stereotype. (It’s possible I’m being excessively cynical here.)
The interesting thing about your list is that neither Nick Winter, Cal Newport, nor Scott Young has a job at some company. If you take those people as your role models, are you sure you want to focus on the goal of getting a job?
I might be a bit biased, but I think it’s easier to do a startup when you can program.
That’s a good point. How mutually exclusive is the optimization path for being highly employable versus self-employing or bootstrapping? Is it just a question of efficiency of time spent or is there more to it?
How much computer science knowledge is necessary for startups, do you think? I can program and have worked on software modules and have written my own utilities, but I still have a lot to learn conceptually and I still need to survey a wider range of technologies, especially related to databases and web development in the front and back end. That’s even excluding some of the trendier hotspots like semantic web, NLP and machine learning.
There are companies that you can’t start via bootstrapping. I think a lot of expensive medical equipment design is in that class. I would also think that bio/nano tech is in that class.
I can program and have worked on software modules and have written my own utilities, but I still have a lot to learn conceptually and I still need to survey a wider range of technologies, especially related to databases and web development in the front and back end.
I have taken a semester’s worth of courses on databases, and they didn’t tell me anything useful about them. It was mostly impractical theory. The most disturbing thing was that the TA didn’t know that a prepared statement in Java protects you from SQL injection.
When it comes to databases the things you have to know are:
1) Never build queries in a way that allows SQL injection; use parameterized (prepared) statements instead of concatenating user input into SQL.
2) Create indexes where possible. It can make sense to experiment with indexes to get optimal speed.
3) There are transactions. In some settings the database commits automatically when you send it data; in others you have to commit or end the transaction yourself.
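All three points can be demonstrated in a few lines with Python’s built-in sqlite3 module (the same ideas carry over to Java’s PreparedStatement); the table and values here are made up for the demo.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")

# 1) Parameterized query: the driver passes the value as data, so a
#    malicious string is stored literally instead of executed as SQL.
evil = "x'); DROP TABLE users; --"
cur.execute("INSERT INTO users (name) VALUES (?)", (evil,))

# 2) An index can make lookups on the indexed column much faster;
#    it's worth experimenting to see which indexes actually help.
cur.execute("CREATE INDEX idx_users_name ON users (name)")

# 3) Transactions: the insert above isn't durable until we commit.
conn.commit()

cur.execute("SELECT name FROM users WHERE name = ?", (evil,))
row = cur.fetchone()
print(row)  # the malicious string comes back as plain data
```

Had the insert been built by string concatenation instead, the same input would have terminated the statement and dropped the table.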
Take a look at Nick Winter’s startup Skritter. It’s spaced-repetition software for learning Japanese kanji and Chinese characters; in contrast to Anki, it lets you draw the characters. As far as cognitive enhancement goes, I think learning kanji is in the ballpark.
How much computer science knowledge does that need? Not that much. You need to know how to use a web framework like Django. You need to know JavaScript, probably something like jQuery, plus HTML and CSS, and some framework for iPhone/Android apps.
That’s a bunch but you can learn as you go along. It also isn’t deep computer science like machine learning and NLP.
In Nick Winter’s case it’s interesting that he was an Asian studies minor. That’s where he learned that the world needs a better way to learn kanji, and that’s where he felt the pain needed to focus on the idea.
I feel similarly about the biochemistry I learned while studying bioinformatics.
If you want to produce medical technology and are already able to program, I don’t think Biological Engineering is necessarily a bad choice.
But I would recommend putting the knowledge directly into practice.
An Arduino LilyPad is cheap. Design the hardware with it and program it. Think about the kind of data you can measure and what to do with it.
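The firmware for a LilyPad would be written in the Arduino dialect of C++, but the what-to-do-with-the-data question can be prototyped in anything. A sketch, using an invented stream of sensor readings, of the sort of minimal processing (a moving average to smooth noise) you would likely start with:

```python
from collections import deque

def moving_average(samples, window=4):
    """Smooth a stream of sensor readings with a simple moving average."""
    buf = deque(maxlen=window)  # sliding window of the most recent samples
    out = []
    for s in samples:
        buf.append(s)
        out.append(sum(buf) / len(buf))
    return out

# Hypothetical raw readings from a wearable sensor (invented values):
# one noisy spike at index 3 that smoothing should dampen.
raw = [72, 75, 71, 90, 74, 73]
smoothed = moving_average(raw)
print([round(x, 1) for x in smoothed])
```

From there the interesting questions are the ones the advice points at: what you can measure at all, at what sampling rate, and what decision the processed signal should drive.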