Is this list still being maintained and/or discussed?
I feel like the ML textbook being recommended *could* at least use an alternative in the form of http://www.deeplearningbook.org/. It takes a purely frequentist perspective (though consider that's basically the "practical" perspective at the moment, with even the Bayesian NN work being… not so Bayesian), but it's much more concise, does a good job of explaining the math, and skips over historical material that people either already know (e.g. decision trees) or that is essentially useless outside of niche applications and legacy systems (e.g. the kernel trick).
Or possibly even the fast.ai course http://course18.fast.ai/ml; granted, it's not a textbook per se, but the combination of the notes is textbook-like.
At least it would be worth creating a "Modern automatic differentiation modeling" section or something for it, if people disagree that ML can essentially be reduced to "whatever the top papers on paperswithcode have been doing over the last 4 or 5 years".
[The Deep Learning Book] takes a purely frequentist perspective
Are you sure? I haven’t read too much of it (though I read some from time to time), but it seems solidly agnostic about the debate. What do you think the book lacks that would be found in an equivalent Bayesian textbook?
Hmh, I interpret standard neural networks (which are the ones it focuses on) as frequentist, since you are essentially maximizing a likelihood without any priors and without any built-in uncertainty.
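To make that concrete, here's a minimal sketch of what I mean (assuming PyTorch, which nobody above mentioned; the toy data and the prior strength are made up for illustration): the standard training loop is just maximum likelihood, and the only way a prior sneaks in is as a regularization term, which turns it into MAP estimation rather than anything properly Bayesian.

```python
# Minimal sketch: standard NN training is maximum likelihood;
# adding an L2 penalty turns it into MAP with a Gaussian prior.
import torch
import torch.nn as nn

torch.manual_seed(0)
X = torch.randn(128, 10)            # toy inputs
y = torch.randint(0, 2, (128,))     # toy binary labels

model = nn.Sequential(nn.Linear(10, 32), nn.ReLU(), nn.Linear(32, 2))
nll = nn.CrossEntropyLoss()         # negative log-likelihood for classification

opt = torch.optim.SGD(model.parameters(), lr=0.1)
prior_precision = 1e-2              # hypothetical prior strength

for _ in range(100):
    opt.zero_grad()
    loss = nll(model(X), y)         # pure MLE: no prior, no uncertainty
    # Uncomment for MAP: an L2 penalty == a zero-mean Gaussian prior on weights.
    # loss = loss + prior_precision * sum((p ** 2).sum() for p in model.parameters())
    loss.backward()
    opt.step()
```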
There's the whole Bayesian NN world, where the focus is on being able to easily embed priors, treating every cell as a probability distribution, and obtaining a probability distribution for every output cell (which is the important part).
In practice this doesn’t differ much, since you’re essentially just adding a few more terms to every weight and bias, but it seems to be a field that’s picking up speed… then again, I might just be stuck in my own reading bubble.
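For contrast, a rough sketch of the "few more terms per weight" idea (again assuming PyTorch; the layer name and the fixed initial log-std are my own illustration, and a real variational implementation would also add a KL term to the loss): each weight and bias gets a mean and a log-std instead of a point value, and repeated forward passes give you a distribution over outputs.

```python
# Sketch of a mean-field Bayesian linear layer: two parameters
# (mean, log-std) per weight/bias instead of one point value.
import torch
import torch.nn as nn

class BayesLinear(nn.Module):
    def __init__(self, d_in, d_out):
        super().__init__()
        self.w_mu = nn.Parameter(torch.zeros(d_out, d_in))
        self.w_logstd = nn.Parameter(torch.full((d_out, d_in), -3.0))
        self.b_mu = nn.Parameter(torch.zeros(d_out))
        self.b_logstd = nn.Parameter(torch.full((d_out,), -3.0))

    def forward(self, x):
        # Sample weights as mu + sigma * eps (reparameterization trick).
        w = self.w_mu + self.w_logstd.exp() * torch.randn_like(self.w_mu)
        b = self.b_mu + self.b_logstd.exp() * torch.randn_like(self.b_mu)
        return x @ w.T + b

layer = BayesLinear(10, 2)
x = torch.randn(4, 10)
# Each call samples fresh weights, so repeated passes yield a
# distribution over outputs rather than a single point prediction.
samples = torch.stack([layer(x) for _ in range(50)])
print(samples.mean(0), samples.std(0))  # predictive mean and uncertainty
```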
I guess upon further consideration I could scratch that whole thing; I'm honestly unsure whether Bayesian/frequentist is even a relevant distinction to make anymore about modern ML/statistics.