Backpropagating quasi-randomized neural networks
This article was first published on T. Moudiki's Webpage - Python, and kindly contributed to python-bloggers.
The FiniteDiffRegressor, implemented in the Python package tisthemachinelearner, blends a finite-difference-based training algorithm for the weights of (quasi-)randomized artificial neural network models with any supervised Machine Learning regression model.
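To give an intuition of what finite-difference training means here, the sketch below shows a generic central-difference approximation of a loss gradient. It only illustrates the underlying idea, not the actual FiniteDiffRegressor implementation; `loss_fn`, `weights` and the step `h` are placeholder names.

```python
import numpy as np

def finite_diff_gradient(loss_fn, weights, h=1e-6):
    """Central finite-difference approximation of the gradient of loss_fn at weights.

    Generic illustration of finite-difference training only; this is NOT the
    FiniteDiffRegressor implementation (see the technical note for details).
    """
    grad = np.zeros_like(weights)
    for i in range(weights.size):
        e_i = np.zeros_like(weights)
        e_i[i] = h
        # (L(w + h*e_i) - L(w - h*e_i)) / (2h) for each coordinate i
        grad[i] = (loss_fn(weights + e_i) - loss_fn(weights - e_i)) / (2.0 * h)
    return grad

# Example: gradient of a simple quadratic loss at w = (1, 2, 3)
w = np.array([1.0, 2.0, 3.0])
print(finite_diff_gradient(lambda v: np.sum(v**2), w))  # approx. [2., 4., 6.]
```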
More details about the implementation can be found in this technical note:
https://www.researchgate.net/publication/392923564_Backpropagating_quasi-randomized_neural_networks
On GitHub:
https://github.com/Techtonique/tisthemachinelearner
And here is a link to a notebook containing usage examples of FiniteDiffRegressor:
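As a quick, hedged sketch of how the estimator might be used: the import path and the constructor argument (a scikit-learn-compatible base regressor) shown below are assumptions based on the description above, so please refer to the notebook and the GitHub repository for the actual API.

```python
# Hypothetical usage sketch -- the exact FiniteDiffRegressor signature may differ;
# see the notebook and repository linked above for the authoritative interface.
import numpy as np
from sklearn.datasets import load_diabetes
from sklearn.linear_model import Ridge
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split

from tisthemachinelearner import FiniteDiffRegressor  # assumed import path

X, y = load_diabetes(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

# Assumption: the regressor wraps any scikit-learn-like base model.
reg = FiniteDiffRegressor(Ridge())

reg.fit(X_train, y_train)
y_pred = reg.predict(X_test)
print(np.sqrt(mean_squared_error(y_test, y_pred)))  # test set RMSE
```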