
2020 recap, Gradient Boosting, Generalized Linear Models, AdaOpt with nnetsauce and mlsauce


A few highlights from 2020 on this blog include AdaOpt, LSBoost (Gradient Boosting), and nnetsauce's Generalized Linear Models (GLMs).

What are AdaOpt, LSBoost and nnetsauce’s GLMs?
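If you just want a quick feel for how these tools are used, here is a minimal sketch, not taken from the 2020 posts: mlsauce's estimators follow scikit-learn's fit/predict convention, and the class names `ms.AdaOpt` and `ms.LSBoostClassifier` with default constructor arguments are assumptions made for illustration.

```python
# Minimal sketch (assumptions: ms.AdaOpt and ms.LSBoostClassifier with
# default parameters, scikit-learn-style fit/predict) -- not code from the post.
from sklearn.datasets import load_wine
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score
import mlsauce as ms

X, y = load_wine(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=123
)

# AdaOpt: probabilistic classifier fitted by an iterative optimization scheme
clf_ada = ms.AdaOpt()
clf_ada.fit(X_train, y_train)
print("AdaOpt accuracy:", accuracy_score(y_test, clf_ada.predict(X_test)))

# LSBoost: gradient boosting of penalized least squares base learners
clf_lsb = ms.LSBoostClassifier()
clf_lsb.fit(X_train, y_train)
print("LSBoost accuracy:", accuracy_score(y_test, clf_lsb.predict(X_test)))
```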

In general, and not only for GLMs, the best place to read about nnetsauce is https://thierrymoudiki.github.io/blog/#QuasiRandomizedNN. Under #QuasiRandomizedNN, you'll find nnetsauce posts you might have missed, including one in which nnetsauce's MultitaskClassifier perfectly classifies penguins (in R).
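Since that penguins example is in R, here is a hedged Python sketch of the same idea on a built-in scikit-learn dataset: MultitaskClassifier wraps a regression base learner and exposes the usual fit/predict interface. The `obj` and `n_hidden_features` parameter names are assumptions for illustration, not code from the original post.

```python
# Minimal sketch (assumption: ns.MultitaskClassifier takes a regression base
# learner via `obj` and quasi-randomized hidden features via `n_hidden_features`).
# The original penguins example is in R; a scikit-learn dataset is used instead.
from sklearn.datasets import load_breast_cancer
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score
import nnetsauce as ns

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=123
)

# regression base learner + quasi-randomized network features
clf = ns.MultitaskClassifier(obj=LinearRegression(), n_hidden_features=5)
clf.fit(X_train, y_train)
print("MultitaskClassifier accuracy:",
      accuracy_score(y_test, clf.predict(X_test)))
```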

I can see that nnetsauce and mlsauce are downloaded thousands of times each month. But that’s not the most important thing to me!
If you’re using mlsauce, nnetsauce, or any other tool presented in this blog, do not hesitate to contribute, or to star the repositories. That way, we can create and keep alive a cool community around these tools. That’s ultimately the most important thing to me.

Best wishes for 2021!
