Tuning Machine Learning models with GPopt’s new version
A new version of the Python package GPopt is available on PyPI. GPopt is a package for stochastic optimization based on Gaussian process regressors (which makes the "GP" in the name a little unfortunate for now, since other surrogates are supported as of this release). This type of optimization is particularly useful for tuning machine learning models' hyperparameters.
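As a concrete illustration, here is a minimal sketch (not taken from the original post) of hyperparameter tuning with GPopt: minimizing a cross-validation error for a scikit-learn classifier. The constructor arguments shown (objective_func, lower_bound, upper_bound, n_init, n_iter, seed) and the x_min/y_min attributes reflect my reading of GPopt's API and may differ slightly in the current release; the notebook linked at the end of this post shows the authoritative usage.

import numpy as np
import GPopt as gp
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

X, y = load_breast_cancer(return_X_y=True)

def svc_cv_error(params):
    # params[0] is log10(C), params[1] is log10(gamma)
    model = SVC(C=10.0 ** params[0], gamma=10.0 ** params[1])
    # GPopt minimizes, so return the negative cross-validated accuracy
    return -cross_val_score(model, X, y, cv=5).mean()

opt = gp.GPOpt(objective_func=svc_cv_error,
               lower_bound=np.array([-3.0, -5.0]),
               upper_bound=np.array([3.0, 1.0]),
               n_init=10, n_iter=50, seed=123)
opt.optimize(verbose=1)
print(opt.x_min, opt.y_min)  # assumed attribute names for the best parameters and objective value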
The main change in GPopt's v0.3.0 is that the user can now choose a different surrogate model (see this excellent book for more details on the procedure).
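Continuing the sketch above, swapping in another surrogate could look like the snippet below. The surrogate_obj keyword is an assumption on my part, not a confirmed argument name; please refer to the package documentation or the linked notebook for the exact interface.

from sklearn.ensemble import RandomForestRegressor

# 'surrogate_obj' is an assumed keyword name for the v0.3.0 surrogate option;
# svc_cv_error, gp and np are defined in the previous sketch
opt_rf = gp.GPOpt(objective_func=svc_cv_error,
                  lower_bound=np.array([-3.0, -5.0]),
                  upper_bound=np.array([3.0, 1.0]),
                  n_init=10, n_iter=50, seed=123,
                  surrogate_obj=RandomForestRegressor(n_estimators=250, random_state=123))
opt_rf.optimize(verbose=1)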
You'll find below a link to a notebook showcasing the use of GPopt for tuning Boosted Configuration Networks (BCN, version 0.7.0).