Mixed-integer differential evolution for next point suggestion #549


Merged
13 commits merged into bayesian-optimization:master on May 7, 2025

Conversation

udicaprio
Contributor

@udicaprio udicaprio commented Mar 8, 2025

Hi,

this PR resolves #548. It implements mixed-integer differential evolution (MIP-DE) to minimise the acquisition function in the case of mixed-integer optimisation.
MIP-DE is used only when discrete variables are present in the problem.
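As context for what such a mixed-integer DE step can look like, here is a minimal sketch using SciPy's public `differential_evolution` with its `integrality` flag (available since SciPy 1.9). The objective and bounds below are illustrative stand-ins, not the library's actual internals:

```python
import numpy as np
from scipy.optimize import differential_evolution

# Toy stand-in for a (negated) acquisition function over one continuous
# and one discrete (integer) parameter; minimum at cont=0.3, disc=2.
def neg_acquisition(x):
    cont, disc = x[0], x[1]
    return (cont - 0.3) ** 2 + (disc - 2) ** 2

bounds = [(-1.0, 1.0), (0, 5)]

# `integrality` marks which dimensions must stay integer-valued
# during the DE search.
result = differential_evolution(
    neg_acquisition,
    bounds=bounds,
    integrality=[False, True],
    polish=False,  # skip gradient-based polishing of the best candidate
)
```

Here `result.x[1]` comes back as an exact integer while the continuous dimension is optimised freely.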

@udicaprio changed the title from "Update acquisition.py" to "Mixed-integer differential evolution for next point suggestion" on Mar 8, 2025
Code linter optimisation with Ruff
@till-m
Member

till-m commented Mar 9, 2025

@udicaprio I realized the coverage bot is broken, so let me fix that first before moving on with this PR

@udicaprio
Contributor Author

udicaprio commented Mar 9, 2025

@till-m, fine, no problem. I just realized there are some easy modifications I could make that improve the computational performance without affecting the identification, and I would like to implement them.

I will be working on this today; it is quite an easy one, but I need to fine-tune some DE parameters.

@till-m
Member

till-m commented Mar 9, 2025

Great! Could you tag me or request a review when you're done?
I think the coverage is fixed. Could you pull from upstream? Then, it should work for this PR as well :)


codecov bot commented Mar 9, 2025

Codecov Report

Attention: Patch coverage is 96.87500% with 1 line in your changes missing coverage. Please review.

Project coverage is 97.87%. Comparing base (1714504) to head (79b8e03).
Report is 3 commits behind head on master.

Files with missing lines | Patch % | Lines
bayes_opt/acquisition.py | 96.87% | 1 Missing ⚠️
Additional details and impacted files
@@            Coverage Diff             @@
##           master     #549      +/-   ##
==========================================
- Coverage   97.92%   97.87%   -0.06%     
==========================================
  Files          10       10              
  Lines        1158     1175      +17     
==========================================
+ Hits         1134     1150      +16     
- Misses         24       25       +1     


Computational-efficiency of the MIP identification increased
Linter improved
@till-m
Member

till-m commented Mar 10, 2025

FYI I need to manually approve the tests since you're a first-time contributor. If I forget to run them, please tag me and I will make sure they run. Also, don't worry about small changes in coverage. We're trying to cover the code as much as makes sense, not chase a number.

@udicaprio
Contributor Author

Hello @till-m, thank you for the notification. It seems the code passed all the tests except the coverage one.
Can I consider it ready for merging, or should I make further modifications to reach at least the previous coverage percentage?

Member

@till-m till-m left a comment


Generally speaking, this is great code: clear, concise, etc. I've left some minor comments, though some are just to clarify things. One big issue, however, is the assumption that any non-continuous parameter is necessarily integer-valued... I don't see any easy way around this right now, though I assume the algorithm isn't inherently incompatible with discrete, non-integer parameters.

should I make any further modifications in order to reach at least the old coverage percentage?

The line that is currently not covered is perfectly fine to not cover, imo.
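One possible workaround for the discrete-but-not-integer concern raised above (a hypothetical sketch, not this PR's implementation) is to optimise over an integer index and decode it to arbitrary discrete values inside the objective:

```python
import numpy as np
from scipy.optimize import differential_evolution

# Hypothetical discrete levels that are not integers.
choices = np.array([0.1, 0.25, 0.7, 1.5])

def objective(x):
    cont = x[0]
    # Decode the integer index into its non-integer discrete value.
    level = choices[int(round(x[1]))]
    return (cont - 0.5) ** 2 + (level - 0.7) ** 2

res = differential_evolution(
    objective,
    bounds=[(0.0, 1.0), (0, len(choices) - 1)],
    integrality=[False, True],  # second dimension is an integer index
    polish=False,
)
```

The DE search then only ever proposes valid indices, and the mapping to the actual discrete values happens inside the objective.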

@till-m
Member

till-m commented Mar 10, 2025

Ah, also, the name of the function should probably be adjusted somewhat. I'm not sure what the best name would be -- something like _structured_minimize? _deterministic_minimize (c.f. the _random_sample_minimize, though since DE is somewhat random this isn't entirely correct)? _smart_minimize? _l_bfgs_b_or_de_minimize?

@udicaprio
Contributor Author

Ah, also, the name of the function should probably be adjusted somewhat. I'm not sure what the best name would be -- something like _structured_minimize? _deterministic_minimize (c.f. the _random_sample_minimize, though since DE is somewhat random this isn't entirely correct)? _smart_minimize? _l_bfgs_b_or_de_minimize?

Working on this

@udicaprio
Contributor Author

@till-m, thank you for revising the code and your comments. I am working on them. I will push the modifications as soon as possible.

@till-m
Member

till-m commented Mar 11, 2025

Excellent, thank you, much appreciated 🙏

@till-m
Member

till-m commented Mar 13, 2025

@udicaprio

Regarding the discrete != integer discussion, I know that this significantly complicates the PR, and it might not be what you signed up for when you offered to implement DE optimization. If there's something I can do to make things easier for you, let me know. I'm convinced that this is an extremely valuable contribution to the library, so I'd like to make sure it gets added!

@udicaprio
Contributor Author

udicaprio commented Apr 5, 2025

Hello @till-m, I have modified the DE search to include the transformation via the space's kernel_transform method. Please let me know if this is fine or if any further modification is required.
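The idea can be sketched as wrapping the acquisition so that every candidate the DE solver evaluates is first mapped onto valid parameter values. The `kernel_transform` below is a toy stand-in (it just snaps the discrete dimension by rounding); the actual method lives on the library's space object and handles its full parameter structure:

```python
import numpy as np

def kernel_transform(x):
    # Toy stand-in: snap the second (discrete) dimension to the
    # nearest valid value; the real method handles the space's
    # full parameter structure.
    x = np.atleast_2d(x).astype(float)
    x[:, 1] = np.round(x[:, 1])
    return x

def acq(x):
    # Evaluate the acquisition on the transformed point, so the DE
    # search only ever scores points the surrogate can represent.
    xt = kernel_transform(x)[0]
    return (xt[0] - 0.3) ** 2 + (xt[1] - 2.0) ** 2
```

With this wrapping, raw continuous DE candidates and their snapped counterparts receive identical acquisition values, so the solver cannot converge to an invalid point.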

Member

@till-m till-m left a comment


Hey @udicaprio, things have been/are a bit busy on my end, sorry about that. I think this is almost there, but I left a few small comments that should be addressed, I think.

Thanks for all the work!

ntrials = max(1, len(x_seeds) // 100)
for _ in range(ntrials):
xinit = space.random_sample(15 * len(space.bounds), random_state=self.random_state)
de = DifferentialEvolutionSolver(acq, bounds=space.bounds, init=xinit, rng=self.random_state)
Member


I think we need to use polish=False to turn off gradient-based polishing?

@udicaprio
Contributor Author

Hello @till-m, I have just completed the code revision based on your comments. Additionally, I have added a version check on SciPy and reformatted the differential evolution input, since the random-seed argument changed in version 1.15. Can you please let the workflow run?
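A version-gated keyword selection like the one described could look as follows. This is an illustrative helper, not the PR's actual code; the underlying fact is that SciPy 1.15 renamed the differential evolution solver's random-seed argument from `seed` to `rng`:

```python
import scipy

def de_random_kwargs(random_state):
    # Pick the right keyword for the DE solver's random seed:
    # `rng` for SciPy >= 1.15, `seed` for older releases.
    major, minor = (int(p) for p in scipy.__version__.split(".")[:2])
    if (major, minor) >= (1, 15):
        return {"rng": random_state}
    return {"seed": random_state}
```

The returned dict can then be splatted into the solver call, e.g. `DifferentialEvolutionSolver(acq, bounds, **de_random_kwargs(rs))`.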

Member

@till-m till-m left a comment


Looking great, I think! I have one small comment regarding the code. Other than that, the final thing I would ask you to do is to check the references within the code to l-bfgs-b. I think in many cases variable names and docstrings should change to reflect that we're not exclusively doing L-BFGS-B optimization anymore. After this, it should be good to merge.

@till-m
Member

till-m commented Apr 28, 2025

Sidenote: Do you think we need to make sure that we test on scipy over/under 1.15.0?

@udicaprio
Contributor Author

Hello @till-m, I applied the suggested modification. Can you please trigger the testing pipeline?
About the tests for scipy over/under 1.15.0, I am not sure this is needed. I noticed the error because of a failing test after pushing to the pull request, so I guess the existing test routines inherently cover it (at least at the API level).

@udicaprio
Contributor Author

Hello @till-m, I forgot to check the linter. Should be fixed now. Can you please re-trigger the test?

@till-m
Member

till-m commented May 2, 2025

Sorry, I missed that message. Rerunning now!

@hjmjohnson

@udicaprio This is a spectacular enhancement to the library. Thank you for your efforts to implement this vital feature.

Member

@till-m till-m left a comment


LGTM 🚀 will merge soon! :)

@udicaprio
Contributor Author

@udicaprio This is a spectacular enhancement to the library. Thank you for your efforts to implement this vital feature.

Thank you very much

@udicaprio
Contributor Author

LGTM 🚀 will merge soon! :)

Thank you! Looking forward to other contributions

@till-m till-m merged commit c59cb64 into bayesian-optimization:master May 7, 2025
12 of 14 checks passed
@till-m
Member

till-m commented May 7, 2025

Thanks for your contribution! :)
