A new version of nnetsauce (v0.3.1)

[This article was first published on T. Moudiki's Webpage - Python, and kindly contributed to python-bloggers.]

A new version (v0.3.1) of nnetsauce is now available: a stable version on PyPI, and a development version on GitHub. Notable changes in this new version are:

  • The inclusion of an upper bound on Adaboost's error rate: crucial, because the weak learner's error rate at each iteration must be better than random guessing.
  • New quasi-randomized network models for regression and classification, with two shrinkage parameters (for model regularization).
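The condition behind the first point can be sketched concretely. In multiclass boosting (e.g. the SAMME scheme), a weak learner fitted at iteration m is only admissible if its weighted error beats random guessing among K classes, i.e. err_m < 1 - 1/K. The function name below is illustrative, not nnetsauce's actual internals.

```python
# Hedged sketch: the admissibility condition for a weak learner in
# multiclass AdaBoost (SAMME-style). A learner at each boosting
# iteration must beat random guessing among K classes:
# its weighted error err_m must satisfy err_m < 1 - 1/K.
# This helper name is illustrative, not nnetsauce's internal API.

def beats_random_guessing(err_m: float, n_classes: int) -> bool:
    """Return True if a weak learner's weighted error beats random guessing."""
    return err_m < 1.0 - 1.0 / n_classes

# Binary case: error must be below 0.5; with 3 classes, below ~0.667.
print(beats_random_guessing(0.45, 2))   # True
print(beats_random_guessing(0.60, 3))   # True
print(beats_random_guessing(0.70, 3))   # False
```

In the binary case this reduces to the familiar "error below 50%" requirement; with more classes the bar is lower, since random guessing itself is weaker.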

The full list of changes can always be found here on GitHub, and a notebook describing some of the new models (for classification) on 4 datasets can be found here (with a snippet below on a wine classification dataset).

[Image: snippet on the wine classification dataset]
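To give a flavor of the idea, here is a minimal, hedged sketch of a quasi-randomized network classifier on scikit-learn's wine dataset: the input is augmented with a fixed, randomly weighted hidden layer, and a regularized linear model is fitted on the augmented features. The two shrinkage parameters `lambda1` and `lambda2` below are illustrative stand-ins for the double regularization mentioned above; this is not nnetsauce's exact API.

```python
# Hedged sketch of a quasi-randomized network classifier on the wine
# dataset, in the spirit of nnetsauce's models (NOT its actual API):
# a fixed random hidden layer augments the input, then a regularized
# linear model is fitted on the augmented features. Two shrinkage
# parameters scale the two feature blocks differently.
import numpy as np
from sklearn.datasets import load_wine
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(42)
X, y = load_wine(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.2, random_state=42, stratify=y
)

scaler = StandardScaler().fit(X_tr)
X_tr, X_te = scaler.transform(X_tr), scaler.transform(X_te)

# Fixed random hidden layer: weights are drawn once and never trained.
n_hidden = 25
W = rng.normal(size=(X_tr.shape[1], n_hidden))

def augment(X, lambda1=0.1, lambda2=1.0):
    """Concatenate original and hidden-layer features, each block scaled
    by its own shrinkage parameter (illustrative lambda1 / lambda2)."""
    return np.hstack([X / np.sqrt(lambda1), np.tanh(X @ W) / np.sqrt(lambda2)])

clf = LogisticRegression(max_iter=1000).fit(augment(X_tr), y_tr)
print(f"test accuracy: {clf.score(augment(X_te), y_te):.3f}")
```

Scaling each block by its own factor before a single L2-penalized fit is one simple way to shrink the original and randomized features by different amounts.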

Contributions and remarks are welcome as usual; you can submit a pull request on GitHub.

Note: I am currently looking for a gig. You can hire me on Malt or send me an email: thierry dot moudiki at pm dot me. I can do descriptive statistics, data preparation, feature engineering, model calibration, training and validation, and model outputs’ interpretation. I am fluent in Python, R, SQL, Microsoft Excel, Visual Basic (among others) and French. My résumé? Here!

