Again, your perception of the instructors’ competencies may have been the result of a mismatch between the sort of environment the program was trying to offer and the sort of environment you were hoping for.
This actually sounds about right.
I think that I care more about job-preparedness, potential for impact, and preparing people for being able to earn-to-give or do direct EA work. I think that Robert also cares about those things, which is why I liked his weekly interview sessions, as I mentioned above.
However, I didn’t get the sense that Jonah, the instructor for the first cohort, really cared about these things quite as much. Jonah strikes me as an intelligent individual whose heart is in academia, rather than in data science or industry. This was quite problematic, because, among other reasons, it meant that even his explanations of grittier things were too focused on the big picture, and too spare on details for some people to figure out how to actually do the thing at all. It also skewed the distribution of topics taught away from things relevant to industry.
Could you please elaborate with specific examples of times when Jonah’s explanations were too abstract and not sufficiently practical?
This will be useful information for us, because we certainly want to identify areas in which our curriculum needs further improvement. My personal recollection of Jonah’s lectures is that they involved a lot of example code, visualization, back-and-forth Q&A, and interactive exploration of real datasets in lieu of presenting, say, abstract mathematical proofs.
It also skewed the distribution of topics taught away from things relevant to industry.
Along similar lines, what are some specific topics that you think were neglected in favor of more abstract but less applicable material?
I’m particularly interested in what material you thought was overemphasized in the curriculum—my impression is that all of the topics covered were very fundamental to data science as a whole. While one can express a valid preference for certain fundamental topics over others, I would be hard-pressed to say that any of the topics covered in the Signal curriculum weren’t extremely industry-relevant.
I’ve already had versions of this conversation with Robert and Jonah in person, but I’ll reiterate a few things I shared with them here, since you asked politely. Also, this conversation is becoming aversive to me, so it will become increasingly difficult for me to respond to your comments as we get farther and farther down this comment chain.
specific examples of times when Jonah’s explanations were too abstract and not sufficiently practical?
There were actually multiple times during the first couple of weeks when I (or my partner and I) would spend 4+ hours trying to fix one particular line of code, and Jonah would give big-picture answers about e.g. how linear regression worked in theory, when what I’d asked for were specific suggestions on how to fix that line of code. This eventually led me to give up on asking Jonah for help.
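To make the failure mode concrete, here is a hypothetical Python sketch of the kind of single-line fitting bug that can eat hours; it is a reconstruction for illustration only, not the actual code from the course:

```python
# Hypothetical sketch (not the actual course code): a classic one-line
# linear regression bug. scikit-learn expects a 2-D feature array of
# shape (n_samples, n_features), and passing a 1-D vector fails.
import numpy as np
from sklearn.linear_model import LinearRegression

x = np.array([1.0, 2.0, 3.0, 4.0])   # single feature, stored as a 1-D array
y = np.array([2.1, 3.9, 6.2, 8.1])

model = LinearRegression()
# model.fit(x, y)                    # raises ValueError: expected a 2-D array, got a 1-D array
model.fit(x.reshape(-1, 1), y)       # fix: reshape the feature into one column

print(model.coef_, model.intercept_)
```

A theoretical explanation of linear regression, however correct, does not resolve an error like this; the help needed is at the level of the specific line.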
what are some specific topics that you think were neglected in favor of more abstract but less applicable material?
Intermediate and advanced SQL, practice of certain social skills (e.g. handshakes, being interested in your interviewer, and other interview-relevant social skills), and possibly nonlinear models.
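To pin down what “intermediate SQL” might cover, here is a minimal sketch of a window-function query, which goes beyond basic SELECT/JOIN material; the schema and names are hypothetical, and it runs against an in-memory SQLite database (window functions require SQLite 3.25+):

```python
# Minimal sketch of "intermediate SQL": a window function ranking rows
# within groups. Table and column names are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE salaries (dept TEXT, employee TEXT, salary REAL);
    INSERT INTO salaries VALUES
        ('eng',  'a', 120), ('eng',  'b', 140),
        ('data', 'c', 110), ('data', 'd', 135);
""")

# Rank employees by salary within each department (requires SQLite 3.25+).
query = """
    SELECT dept, employee, salary,
           RANK() OVER (PARTITION BY dept ORDER BY salary DESC) AS dept_rank
    FROM salaries
"""
for row in conn.execute(query):
    print(row)
```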
Thanks for the written feedback (which adds to what I had gleaned in person).
There were actually multiple times during the first couple of weeks when I (or my partner and I) would spend 4+ hours trying to fix one particular line of code, and Jonah would give big-picture answers about e.g. how linear regression worked in theory, when what I’d asked for were specific suggestions on how to fix that line of code. This eventually led me to give up on asking Jonah for help.
I think that what happened here is me having misunderstood what you were asking for, rather than any disinclination on my part to help you with individual lines of code. I will take this feedback into account.
Intermediate and advanced SQL, practice of certain social skills (e.g. handshakes, being interested in your interviewer, and other interview-relevant social skills), and possibly nonlinear models.
This is helpful detail regarding what you were looking for. Which topics would you have preferred to have been dropped in favor of these?
I (or my partner and I) would spend 4+ hours trying to fix one particular line of code, and Jonah would give big-picture answers about e.g. how linear regression worked in theory
For context, what was your programming ability before you started the course? It seems strange to spend 4 hours getting (one line of) linear regression to work, but it also seems strange for an instructor to give a vague answer to something so basic, unless he was using the “Socratic Method”?
That’s a funny comment. It does exactly the same thing twice: “please tell us where we didn’t do well,” immediately followed by “oh, and you are COMPLETELY WRONG, because we did everything very well.”
In context, it makes a lot of sense for him to do that. He’s working for Signal now, so presumably is interested in how to improve the program, and he was a participant at the same time as Fluttershy, so he got an impression of the program as a participant.
In context, it makes a lot of sense for him to do that.
No, it doesn’t. Continuing with the charitable interpretation, wearing these two hats at the same time is… difficult. Either he, as an employee of Signal, is genuinely interested in feedback, or he, as a participant, thinks Fluttershy is all wrong and making shit up because it was perfect for andrewjho (here he, of course, commits the typical mind fallacy, but that’s a minor issue at this point).