Update X and Y during init #67


Merged
2 commits merged on Oct 13, 2017

Conversation

ajnisbet (Contributor)

I often plot/monitor the optimisation at each iteration in case I want to stop early or change some parameters. I do this by adding monitoring code to the target function.
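
A hypothetical sketch of this "add monitoring code to the target function" pattern (the names here are illustrative, not part of the library):

```python
# Wrap the real objective so every evaluation is logged as it happens.
history = []

def target(x):
    value = -(x - 2.0) ** 2      # the real (possibly expensive) objective
    history.append((x, value))   # record each evaluation for monitoring
    print(f"eval {len(history)}: x={x:.3f} -> value={value:.3f}")
    return value
```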

Currently, the x and y values evaluated during the explore and init steps are not accessible (not stored on the bo object) until after both steps are complete, which can take a long time for expensive functions.

This PR updates self.X and self.Y on each iteration of the explore/init step. Most of the logic is converting append and += to their numpy equivalents. It adds a bit of complexity but doesn't change any other part of the API.

You could go all out and also update bo.gp.fit() and bo.res on each iteration, and add a callback parameter to be called after each iteration (as xgboost and some sklearn functions do) to avoid hacking the target function. But just updating X and Y is the minimum information a user needs to do that themselves.
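A minimal sketch of the append-to-numpy conversion this PR describes, with assumed names that are not the library's actual code:

```python
import numpy as np

# X and Y are kept as numpy arrays and grown on every evaluation, so
# they are readable mid-init instead of only after the step finishes.
class Optimizer:
    def __init__(self, dim):
        self.X = np.empty((0, dim))  # observed points, one row each
        self.Y = np.empty(0)         # observed target values

    def init(self, f, points):
        for x in points:
            y = f(x)
            # numpy equivalents of list.append / +=
            self.X = np.vstack((self.X, np.atleast_2d(x)))
            self.Y = np.append(self.Y, y)
            # self.X and self.Y are now up to date here, so external
            # monitoring code can inspect them between evaluations.

opt = Optimizer(dim=1)
opt.init(lambda x: -(x - 2.0) ** 2, [0.0, 1.0, 2.0])
```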

Lets you do monitoring during the explore and init stages.
fmfn (Member) commented Oct 11, 2017

Looks good. Did you try running the usage.py script?

ajnisbet (Contributor, Author)

Yup, same logs and solution.

fmfn merged commit bc56779 into bayesian-optimization:master on Oct 13, 2017