2020 recap, Gradient Boosting, Generalized Linear Models, AdaOpt with nnetsauce and mlsauce

This article was first published on T. Moudiki's Webpage - Python, and kindly contributed to python-bloggers.

A few highlights from 2020 in this blog include:

  • The introduction of mlsauce’s AdaOpt and LSBoost
  • The introduction of Generalized Linear Models (GLMs) in nnetsauce

What are AdaOpt, LSBoost and nnetsauce’s GLMs?

  • mlsauce’s AdaOpt is a probabilistic classifier based on a mix of multivariable optimization and a nearest neighbors algorithm. This document explains AdaOpt in more detail, in English and without formulas; hopefully that makes it accessible to more people. Other resources on AdaOpt can be found through this link.

  • mlsauce’s LSBoost implements Gradient Boosting of augmented base learners (base learners = basic components in ensemble learning). In LSBoost, the base learners are penalized regression models augmented through randomized hidden nodes and activation functions. Examples in both R and Python are presented in these posts. And if anyone reading this is a Windows + R specialist, I’d love to hear from you, because I sometimes receive reports that mlsauce doesn’t work well at that intersection (Windows + R).

  • Regarding GLMs in nnetsauce, this post from November 28th offers a brief introduction to what they are. nnetsauce’s GLMs are actually nonlinear models, since the features (covariates) are transformed by randomized/quasi-randomized hidden nodes and activation functions. The current optimizers for GLM loss functions in nnetsauce are based on various gradient descent algorithms; more efficient approaches could probably be explored. This is a work in progress.
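To make the common idea behind LSBoost’s base learners and nnetsauce’s GLMs more concrete, here is a minimal, self-contained sketch (NumPy only, not the mlsauce/nnetsauce implementations; all names and hyperparameter values are illustrative): features are augmented with randomized hidden nodes passed through an activation function, a penalized (ridge) regression is fitted on the augmented features, and these augmented learners are chained in a gradient boosting loop on squared loss.

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy 1-D regression data
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(200)

def augment(X, W):
    """Augment features with randomized hidden nodes:
    an activation function applied to random projections."""
    return np.hstack([X, np.tanh(X @ W)])

def ridge_fit(H, r, lam=1e-2):
    """Penalized least squares: solve (H'H + lam*I) beta = H'r."""
    n_cols = H.shape[1]
    return np.linalg.solve(H.T @ H + lam * np.eye(n_cols), H.T @ r)

# Gradient boosting on squared loss: each base learner is a ridge
# regression fitted on residuals, using freshly randomized hidden nodes.
n_rounds, learning_rate, n_hidden = 50, 0.1, 10
pred = np.zeros_like(y)
learners = []
for _ in range(n_rounds):
    W = rng.standard_normal((X.shape[1], n_hidden))
    H = augment(X, W)
    beta = ridge_fit(H, y - pred)        # fit current residuals
    pred += learning_rate * (H @ beta)   # shrunken update
    learners.append((W, beta))

print("training MSE:", np.mean((y - pred) ** 2))
```

The randomized projection `W` is what makes each base learner nonlinear in the original covariates, which is the same reason nnetsauce’s GLMs end up being nonlinear models despite the “linear” in their name.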

In general, and not only for GLMs, the best place to catch up on nnetsauce is: https://thierrymoudiki.github.io/blog/#QuasiRandomizedNN. Under #QuasiRandomizedNN, you’ll find nnetsauce posts you might have missed. For example, this one, in which nnetsauce’s MultitaskClassifier perfectly classifies penguins (in R).

I can see that nnetsauce and mlsauce are downloaded thousands of times each month. But that’s not the most important thing to me!
If you’re using mlsauce, nnetsauce or any other tool presented in this blog, don’t hesitate to contribute, or to star the repository. That way, we can create and keep alive a cool community around these tools. That’s ultimately the most important thing to me.

Best wishes for 2021!
