Mixed-integer differential evolution for next point suggestion #549
Conversation
Code linter optimisation with Ruff
@udicaprio I realized the coverage bot is broken, so let me fix that first before moving on with this PR.
@till-m, fine, no problem. I just realized there are some easy modifications I could make that improve the computational performance without affecting the identification, and I would like to implement them. I will be working on this today; it is quite an easy one, but I need to fine-tune some DE parameters.
Great! Could you tag me or request a review when you're done?
Codecov Report
Attention: Patch coverage is
Additional details and impacted files
@@ Coverage Diff @@
## master #549 +/- ##
==========================================
- Coverage 97.92% 97.87% -0.06%
==========================================
Files 10 10
Lines 1158 1175 +17
==========================================
+ Hits 1134 1150 +16
- Misses 24 25 +1
Computational-efficiency of the MIP identification increased
Linter improved
FYI I need to manually approve the tests since you're a first-time contributor. If I forget to run them, please tag me and I will make sure they run. Also, don't worry about small changes in coverage. We're trying to cover the code as much as makes sense, not chase a number.
Hello @till-m, thank you for the notification. It seems the code passed all the tests except the coverage one.
Generally speaking, this is great code: clear, concise, etc. I've left some minor comments, though some are just to clarify things. One big issue, however, is the assumption that any non-continuous parameter is necessarily integer-valued... I don't see any easy way around this right now, though I assume that the algorithm isn't inherently incompatible with discrete, non-int parameters.
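One hedged sketch of how discrete, non-integer parameters could still be handled by an integer optimiser (the helper names here are illustrative, not the PR's code): optimise over an integer index into the list of allowed values, then map the index back to the actual value.

```python
# Illustrative sketch (not the PR's code): a discrete, non-integer
# parameter becomes an integer index into its list of allowed values.
def index_bounds(values):
    """Integer bounds (0, len(values) - 1) for the index variable."""
    return (0, len(values) - 1)

def value_at(values, idx):
    """Recover the discrete value from the (possibly float) index DE proposes."""
    return values[int(round(idx))]

gammas = [0.01, 0.1, 1.0, 10.0]  # discrete but not integer-valued
bounds = index_bounds(gammas)    # (0, 3): a valid integer search range
best = value_at(gammas, 2.0)     # -> 1.0
```

This keeps the integer-only DE machinery unchanged while supporting arbitrary discrete sets.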
Should I make any further modifications in order to reach at least the old coverage percentage?
The line that is currently not covered is perfectly fine to not cover, imo.
Ah, also, the name of the function should probably be adjusted somewhat. I'm not sure what the best name would be -- something like
Working on this
@till-m, thank you for revising the code and your comments. I am working on them. I will push the modifications as soon as possible.
Excellent, thank you, much appreciated 🙏
Regarding the discrete != integer discussion, I know that this significantly complicates the PR and it might not be what you signed up for when you offered to implement DE optimization. If there's something I can do to make things easier for you, let me know. I'm convinced that this is an extremely valuable contribution to the library, so I'd like to make sure it gets added!
Hello @till-m, I have modified the DE search including the transformation with the
Hey @udicaprio, things have been/are a bit busy on my end, sorry about that. I think this is almost there, but I left a few small comments that should be addressed.
Thanks for all the work!
bayes_opt/acquisition.py (outdated)
ntrials = max(1, len(x_seeds) // 100)
for _ in range(ntrials):
    xinit = space.random_sample(15 * len(space.bounds), random_state=self.random_state)
    de = DifferentialEvolutionSolver(acq, bounds=space.bounds, init=xinit, rng=self.random_state)
I think we need to use polish=False to turn off gradient-based polishing?
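As a sketch of what the reviewer is suggesting (toy objective, not the PR's acquisition function): SciPy's public differential_evolution accepts polish=False, and since SciPy 1.9 an integrality mask marking which variables must be integers. With polishing disabled, the returned integer variables stay exactly integral.

```python
from scipy.optimize import differential_evolution

# Toy mixed-integer objective (not the PR's acquisition function):
# minimum at x = 0.3, n = 3, with n constrained to integer values.
def f(v):
    x, n = v
    return (x - 0.3) ** 2 + (n - 3) ** 2

res = differential_evolution(
    f,
    bounds=[(-2.0, 2.0), (0, 10)],
    integrality=[False, True],  # second variable is integer-valued
    polish=False,               # skip the gradient-based polishing step
    seed=1,
)
# res.x[1] is an exact integer (here the true optimum, 3.0)
```

The same pattern applies when f is the (negated) acquisition function and the integrality mask is derived from the parameter space.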
…continuous parameter search
…version compatibility
Hello @till-m, I have just completed the code revision following your comments. Additionally, I have added a version check on SciPy and reformatted the differential evolution input, since the random-seed argument changed in version 1.15. Can you please let the workflow run?
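The version check described above could look something like this minimal sketch (the helper name and structure are ours, not necessarily the PR's; the 1.15 threshold is the one stated in the message):

```python
def de_rng_kwargs(scipy_version: str, random_state):
    """Pick the right RNG keyword for differential_evolution.

    SciPy renamed the `seed` argument to `rng` in 1.15, as noted above;
    this helper (illustrative name) dispatches on the version string.
    """
    major, minor = (int(p) for p in scipy_version.split(".")[:2])
    keyword = "rng" if (major, minor) >= (1, 15) else "seed"
    return {keyword: random_state}

de_rng_kwargs("1.15.2", 42)  # -> {"rng": 42}
de_rng_kwargs("1.11.4", 42)  # -> {"seed": 42}
```

The returned dict can then be splatted into the differential_evolution call, keeping the call site version-agnostic.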
Looking great, I think! I have one small comment regarding the code. Other than that, the final thing I would ask you to do is to check the references within the code to l-bfgs-b. I think in many cases variable names and docstrings should change to reflect that we're not exclusively doing L-BFGS-B optimization anymore. After this, it should be good to merge.
Sidenote: Do you think we need to make sure that we test on scipy over/under
…lies kernel transformation to x_min
Hello @till-m, I applied the suggested modification. Can you please trigger the testing pipeline?
Hello @till-m, I forgot to check the linter. It should be fixed now. Can you please re-trigger the test?
Sorry, I missed that message. Rerunning now!
@udicaprio This is a spectacular enhancement to the library. Thank you for your efforts to implement this vital feature.
LGTM 🚀 will merge soon! :)
Thank you very much
Thank you! Looking forward to other contributions
Thanks for your contribution! :)
Hi,
this commit resolves #548. It implements mixed-integer differential evolution (MIP-DE) to minimise the acquisition function in the case of mixed-integer optimisation.
MIP-DE is used only when discrete variables are present in the problem.
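A minimal sketch of the dispatch this description implies (the function and flag names are illustrative, not the PR's): fall back to the existing continuous optimiser unless at least one dimension is discrete.

```python
def choose_acq_optimizer(is_discrete_dims):
    """Illustrative dispatch: use MIP-DE only when a discrete dimension exists.

    `is_discrete_dims` is a boolean flag per parameter-space dimension.
    """
    return "mip-de" if any(is_discrete_dims) else "l-bfgs-b"

choose_acq_optimizer([False, False])        # -> "l-bfgs-b"
choose_acq_optimizer([False, True, False])  # -> "mip-de"
```

This keeps fully continuous problems on the faster gradient-based path while routing mixed-integer problems to DE.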