For info, you can find most of the exercises in Python (done by someone other than Ng) here. They are still not that useful: I watched the course videos a couple of years ago and stopped doing the exercises very quickly.
I agree with you on both the praise and the complaints about the course. Besides it being very dated, I think the main problem was that Ng was neither clear nor consistent about the goal. The videos are mostly an informal introduction to a range of machine learning techniques, plus some in-depth discussion of broadly useful concepts and of common pitfalls for self-taught ML users. I found it delivered very well on that. But the exercises are mostly very simple implementations, which would maybe fit a more formal course. Using an already-implemented package to get hands-on experience with overfitting, regularization, etc. would be much more fitting for the course (no pun intended); see the sketch below for the kind of thing I mean. At the same time, Ng kept repeating stuff like "at the end of the course you will know more than most ML engineers", which was a very transparent lie, but gave the impression that the course wanted to impart a working knowledge of ML, which was definitely not the case.
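As a rough illustration (not from the course material, just my assumption of what such an exercise could look like), a few lines with an off-the-shelf library like scikit-learn already let you see overfitting and the effect of regularization directly:

```python
# Hypothetical sketch: watching overfitting and L2 regularization on a toy
# regression task with scikit-learn, instead of reimplementing the algorithms.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(60, 1))
y = np.sin(3 * X[:, 0]) + 0.3 * rng.normal(size=60)  # noisy target

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# A degree-15 polynomial overfits when regularization is ~0; increasing alpha
# trades a bit of training error for much better test error.
for alpha in [1e-8, 1e-3, 1.0]:
    model = make_pipeline(PolynomialFeatures(degree=15), Ridge(alpha=alpha))
    model.fit(X_train, y_train)
    train_mse = mean_squared_error(y_train, model.predict(X_train))
    test_mse = mean_squared_error(y_test, model.predict(X_test))
    print(f"alpha={alpha:g}  train MSE={train_mse:.3f}  test MSE={test_mse:.3f}")
```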
I don’t know how common this problem is with MOOCs. It seems easily fixable, but the incentives might be against it happening (being unclear about the course's goal, like aiming at students with minimal background, can be useful in attracting more people). Like johnswentworth, I had more luck with open courseware, with the caveat that sometimes very good courses build on other ones which are not available or have insufficient online material.