What’s the best way to learn programming from a fundamentals-first perspective? I’ve taken / am taking a few introductory programming courses, but I keep feeling like I’ve got all sorts of gaps in my understanding of what’s going on. The professors keep throwing out new ideas and functions and tools and terms without thoroughly explaining how and why things work the way they do. If someone has a question, the approach is often, “so google it or look in the help file”. But my preferred learning style is to go back to the basics and carefully work my way up, so that I thoroughly understand what’s going on at each step along the way.
This might be counter-intuitive and impractical for self-teaching, but for me it was an assembly language course that made it ‘click’ for how things work behind the scenes. It doesn’t have to be much and you’ll probably never use it again, but the concepts will help your broader understanding.
If you can be more specific about which parts baffle you, I might be able to recommend something more useful.
Nothing in particular baffles me. I can get through the material well enough. It’s just that I prefer starting from a solid and thorough grasp of all the fundamentals and working on up from there, rather than jumping head-first into the middle of a subject and then working backwards to fill in any gaps as needed. I also prefer understanding why things work rather than just knowing that they do.
Which fundamentals do you have in mind? There are multiple levels of “fundamentals” and they fork, too.
For example, the “physical execution” fork will lead you to delving into assembly language and the basic operations that processors perform. But the “computer science” fork will lead you in a very different direction, maybe to LISP’s lambdas and ultimately to things like the Turing machine.
Whatever fundamentals are necessary to understand the things that I’m likely to come across while programming (I’m hoping to go into data science, if that makes a difference). I don’t know enough to know which particular fundamentals are needed for this, so I guess that’s actually part of the question.
Well, if you’ll be going into data science, it’s unlikely that you will care greatly about the particulars of the underlying hardware. This means the computer-science branch is more useful to you than the physical-execution one.
I am still not sure what kind of fundamentals you want. The issue is that the lowest abstraction level is trivially simple: you have memory which can store and retrieve values (numbers, basically), and you have a processing unit which understands sequences of instructions for doing logical and mathematical operations on those values. That’s it.
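To make that concrete, here’s a toy sketch of that lowest level in Python. The instruction names (“add”, “mul”) are invented for illustration, not any real machine’s instruction set — the point is just memory cells plus a unit that steps through instructions:

```python
# A toy model of the lowest abstraction level: memory that stores
# numbers, plus a processing unit that executes a sequence of
# simple instructions. (The instruction set is made up.)

def run(program, memory):
    for op, a, b, dest in program:
        if op == "add":
            memory[dest] = memory[a] + memory[b]
        elif op == "mul":
            memory[dest] = memory[a] * memory[b]
    return memory

# Compute (2 + 3) * 4 using memory cells 0..3.
mem = {0: 2, 1: 3, 2: 4, 3: 0}
run([("add", 0, 1, 3),   # mem[3] = mem[0] + mem[1]  -> 5
     ("mul", 3, 2, 3)],  # mem[3] = mem[3] * mem[2]  -> 20
    mem)
print(mem[3])  # 20
```

Everything above this level — variables, functions, objects — is layers of convention built on top of that loop.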
The interesting parts, and the ones from which understanding comes (IMHO), are somewhat higher in the abstraction hierarchy. They are often referred to as programming language paradigms.
The major paradigms are imperative (Fortran, C, Perl, etc.), functional (LISP), logical (Prolog), and object-oriented (Smalltalk, Ruby).
They are noticeably different in that writing non-trivial code in different paradigms requires you to… rearrange your mind in particular ways. The experience is often described as a *click*, an “oh, now it all makes sense” moment.
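You can see a miniature version of that shift even within one language. Here’s the same small task — summing the squares of a list — written in Python first in the imperative style, then in a functional style:

```python
# Imperative: describe the steps; build the answer by mutating state.
def sum_squares_imperative(xs):
    total = 0
    for x in xs:
        total += x * x
    return total

# Functional: describe the result as a composition of expressions,
# with no mutable state.
def sum_squares_functional(xs):
    return sum(map(lambda x: x * x, xs))

print(sum_squares_imperative([1, 2, 3]))  # 14
print(sum_squares_functional([1, 2, 3]))  # 14
```

Both compute the same thing; the difference is in how you’re asked to think about the problem, and that difference only becomes dramatic once the programs get non-trivial.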
I guess a good starting point might be: Where do I go to learn about each of the different paradigms? Again, I’d like to know the theory as well as the practice.
Google is your friend. You can start e.g. here or here.
I understand what you mean here, but in programming, it sometimes makes sense to do things this way. For example, in my introduction to programming course, I used dictionaries/hashes to write some programs. Key-value pairs are important for writing certain types of simple programs, but I didn’t really understand how they worked. Almost a year later, I took an algorithms course, learned about hash functions and hash maps, and finally understood them. It wouldn’t have made sense to refrain from using this tool until I’d learned how to implement it, and it was really rewarding to finally understand it.
I always like to learn things from the ground up too, but this way just works sometimes in programming.
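For anyone curious what that algorithms-course “aha” looks like, here’s a deliberately simplified toy hash map in Python. It is not how CPython’s dict is actually implemented (real ones use open addressing and resizing), but it shows the core idea: a hash function maps a key to a bucket, and lookups only search that one bucket:

```python
# A toy hash map: hash(key) picks a bucket, and each bucket holds
# the key-value pairs that landed there (collisions are handled by
# keeping a small list per bucket).

class ToyHashMap:
    def __init__(self, n_buckets=8):
        self.buckets = [[] for _ in range(n_buckets)]

    def _index(self, key):
        # Python's built-in hash() plays the role of the hash function.
        return hash(key) % len(self.buckets)

    def set(self, key, value):
        bucket = self.buckets[self._index(key)]
        for i, (k, _) in enumerate(bucket):
            if k == key:
                bucket[i] = (key, value)  # overwrite an existing key
                return
        bucket.append((key, value))

    def get(self, key):
        for k, v in self.buckets[self._index(key)]:
            if k == key:
                return v
        raise KeyError(key)

m = ToyHashMap()
m.set("apples", 3)
m.set("pears", 5)
print(m.get("apples"))  # 3
```

The payoff of seeing this is understanding *why* dictionary lookups are fast: you never scan the whole collection, only the one bucket the hash function points you at.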
Could you give a couple examples of specific things that you’d like to understand?
Without that, a classic that might match what you’re interested in is Structure and Interpretation of Computer Programs. It starts as an introduction to general programming concepts and ends as an introduction to writing interpreters.
I’ve been having a bit of a hard time coming up with specifics, because it’s more a general sense that I’m lacking a lot of the basics. For example, the professor will say something that obliquely references a concept he seems to expect I’m familiar with, but I have no idea what he’s referring to. So then I look it up on Wikipedia, and the article mentions 10 other basic-sounding concepts that I’ve never heard of either. Or a programming assignment uses a function that I don’t know how to use yet. So I do the obvious thing of googling for it or looking it up in the documentation. But the documentation references numerous concepts that I have only a vague idea of, so I often come away with only a hazy notion of what the function does.
After I made my original post I looked around for a while on sites like Quora. I also took a look at this reddit list. The general sense I got was that to learn programming properly you should go for a thorough computer science curriculum. Do you agree?
The suggestion was to look up university CS degree curricula and then look around for equivalent MOOCs / books / etc. to learn it on my own. So I looked up the curricula. But most of the universities I looked at said to start out with an introductory programming language course, which is what I was doing before anyway. I’ve taken intro courses in Python and R, and I ran into the problems I mentioned above. The MITx Python course that I took was better on this score, but still not as good as I would have hoped. There are loads of resources out there for learning either of those languages, but I don’t know how to find which ones fit my learning style. Maybe I should just try out each until I find one that works for me?
The book you mentioned kept coming up as well. That book was created for MIT’s Intro to CS course, but MIT itself has since replaced the original course with the Python course that I took (I took the course on edX, so probably it’s a little dumbed-down, but my sense was that it’s pretty similar to the regular course at MIT). On the other hand, looking at the book’s table of contents it looks like the book covers several topics not covered in the class.
There were also several alternative books mentioned:
How to Design Programs
Concepts, Techniques, and Models of Computer Programming
Essentials of Programming Languages
Modern Programming Languages: A Practical Introduction
Programming Language Pragmatics
Programming Languages: Application and Interpretation
Any thoughts on which is the best choice to start off with?
If you want a fundamentals-first perspective, I definitely suggest reading SICP. I think the Python course may have gone in a slightly different direction (I never looked at it) but I can’t think of how you could get more fundamentals-first than the book.
Afterward, I suggest out of your list Concepts, Techniques, and Models of Computer Programming. That answers your question of “where do I go to learn about each of the different paradigms.”
This is more background than you will strictly need to be a useful data scientist, but if you find it fun and satisfying to learn, then it will only be helpful.