
Commit 70dcda5 (parent: d029a29)

Make docs a bit more nice: add links to classes imported from elsewhere, updated docstrings

19 files changed, +172 −79 lines

bayes_opt/__init__.py

Lines changed: 2 additions & 0 deletions

@@ -9,13 +9,15 @@
 from bayes_opt.constraint import ConstraintModel
 from bayes_opt.domain_reduction import SequentialDomainReductionTransformer
 from bayes_opt.logger import JSONLogger, ScreenLogger
+from bayes_opt.target_space import TargetSpace

 __version__ = importlib.metadata.version("bayesian-optimization")


 __all__ = [
     "acquisition",
     "BayesianOptimization",
+    "TargetSpace",
     "ConstraintModel",
     "Events",
     "ScreenLogger",

bayes_opt/acquisition.py

Lines changed: 34 additions & 1 deletion

@@ -1,4 +1,22 @@
-"""Acquisition functions for Bayesian Optimization."""
+"""Acquisition functions for Bayesian Optimization.
+
+The acquisition functions in this module can be grouped the following way:
+
+- One of the base acquisition functions
+  (:py:class:`UpperConfidenceBound<bayes_opt.acquisition.UpperConfidenceBound>`,
+  :py:class:`ProbabilityOfImprovement<bayes_opt.acquisition.ProbabilityOfImprovement>` and
+  :py:class:`ExpectedImprovement<bayes_opt.acquisition.ExpectedImprovement>`) is always dictating the basic
+  behavior of the suggestion step. They can be used alone or combined with a meta acquisition function.
+- :py:class:`GPHedge<bayes_opt.acquisition.GPHedge>` is a meta acquisition function that combines multiple
+  base acquisition functions and determines the most suitable one for a particular problem.
+- :py:class:`ConstantLiar<bayes_opt.acquisition.ConstantLiar>` is a meta acquisition function that can be
+  used for parallelized optimization and discourages sampling near a previously suggested, but not yet
+  evaluated, point.
+- :py:class:`AcquisitionFunction<bayes_opt.acquisition.AcquisitionFunction>` is the base class for all
+  acquisition functions. You can implement your own acquisition function by subclassing it. See the
+  `Acquisition Functions notebook <../acquisition.html>`__ to understand the many ways this class can be
+  modified.
+"""

 from __future__ import annotations

@@ -373,6 +391,11 @@ def decay_exploration(self) -> None:
         """Decay kappa by a constant rate.

         Adjust exploration/exploitation trade-off by reducing kappa.
+
+        Note
+        ----
+
+        This method is called automatically at the end of each ``suggest()`` call.
         """
         if self.exploration_decay is not None and (
             self.exploration_decay_delay is None or self.exploration_decay_delay <= self.i

@@ -495,6 +518,11 @@ def decay_exploration(self) -> None:
         r"""Decay xi by a constant rate.

         Adjust exploration/exploitation trade-off by reducing xi.
+
+        Note
+        ----
+
+        This method is called automatically at the end of each ``suggest()`` call.
         """
         if self.exploration_decay is not None and (
             self.exploration_decay_delay is None or self.exploration_decay_delay <= self.i

@@ -625,6 +653,11 @@ def decay_exploration(self) -> None:
         r"""Decay xi by a constant rate.

         Adjust exploration/exploitation trade-off by reducing xi.
+
+        Note
+        ----
+
+        This method is called automatically at the end of each ``suggest()`` call.
         """
         if self.exploration_decay is not None and (
             self.exploration_decay_delay is None or self.exploration_decay_delay <= self.i
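A rough sketch of how the pieces described in the new module docstring fit together (not part of the commit): a base acquisition function can drive the optimizer on its own, or several of them can be wrapped in the GPHedge meta acquisition function. The acquisition_function keyword of BayesianOptimization and the base_acquisitions argument of GPHedge are assumptions here, not taken from this diff.

from bayes_opt import BayesianOptimization, acquisition

def objective(x, y):
    # Toy objective with a single maximum at (2, 1), for illustration only.
    return -(x - 2) ** 2 - (y - 1) ** 2

# A single base acquisition function dictates the suggestion step on its own ...
ucb = acquisition.UpperConfidenceBound(kappa=2.5)

# ... or several base functions can be combined via the GPHedge meta function,
# which adaptively favours whichever one performs best on this problem.
hedge = acquisition.GPHedge(
    base_acquisitions=[
        acquisition.UpperConfidenceBound(kappa=2.5),
        acquisition.ExpectedImprovement(xi=0.01),
        acquisition.ProbabilityOfImprovement(xi=0.01),
    ]
)

optimizer = BayesianOptimization(
    f=objective,
    pbounds={"x": (-5, 5), "y": (-5, 5)},
    acquisition_function=hedge,  # assumed keyword; pass `ucb` to use UCB alone
    random_state=1,
)
optimizer.maximize(init_points=5, n_iter=25)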

bayes_opt/bayesian_optimization.py

Lines changed: 13 additions & 17 deletions

@@ -93,8 +93,9 @@ class BayesianOptimization(Observable):
         Dictionary with parameters names as keys and a tuple with minimum
         and maximum values.

-    constraint: A ConstraintModel. Note that the names of arguments of the
-        constraint function and of f need to be the same.
+    constraint: ConstraintModel.
+        Note that the names of arguments of the constraint function and of
+        f need to be the same.

     random_state: int or numpy.random.RandomState, optional(default=None)
         If the value is an integer, it is used as the seed for creating a

@@ -112,19 +113,6 @@ class BayesianOptimization(Observable):
     This behavior may be desired in high noise situations where repeatedly probing
     the same point will give different answers. In other situations, the acquisition
     may occasionally generate a duplicate point.
-
-    Methods
-    -------
-    probe()
-        Evaluates the function on the given points.
-        Can be used to guide the optimizer.
-
-    maximize()
-        Tries to find the parameters that yield the maximum value for the
-        given function.
-
-    set_bounds()
-        Allows changing the lower and upper searching bounds
     """

     def __init__(

@@ -303,12 +291,20 @@ def maximize(self, init_points=5, n_iter=25):
         Parameters
         ----------
         init_points : int, optional(default=5)
-            Number of iterations before the explorations starts the exploration
-            for the maximum.
+            Number of random points to probe before starting the optimization.

         n_iter: int, optional(default=25)
             Number of iterations where the method attempts to find the maximum
             value.
+
+        Warning
+        -------
+        The maximize loop only fits the GP when suggesting a new point to
+        probe based on the acquisition function. This means that the GP may
+        not be fitted on all points registered to the target space when the
+        method completes. If you intend to use the GP model after the
+        optimization routine, make sure to fit it manually, e.g. by calling
+        ``optimizer._gp.fit(optimizer.space.params, optimizer.space.target)``.
         """
         self._prime_subscriptions()
         self.dispatch(Events.OPTIMIZATION_START)
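The new warning on maximize() suggests refitting the GP manually before using it after the optimization loop. A minimal sketch of that pattern (the objective and bounds are made up for illustration; the refit call itself is quoted from the warning above):

from bayes_opt import BayesianOptimization

optimizer = BayesianOptimization(
    f=lambda x: -(x - 1) ** 2,   # toy objective, for illustration only
    pbounds={"x": (-3, 3)},
    random_state=1,
)
optimizer.maximize(init_points=5, n_iter=25)

# The GP may not yet be fitted on every point registered to the target space,
# so refit it explicitly before using it as a surrogate model.
optimizer._gp.fit(optimizer.space.params, optimizer.space.target)
mean, std = optimizer._gp.predict([[0.5]], return_std=True)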

bayes_opt/constraint.py

Lines changed: 9 additions & 7 deletions

@@ -33,12 +33,11 @@ class ConstraintModel:
     random_state : np.random.RandomState or int or None, default=None
         Random state to use.

-    Notes
-    -----
+    Note
+    ----
     In case of multiple constraints, this model assumes conditional
-    independence. This means that for each constraint, the probability of
-    fulfillment is the cdf of a univariate Gaussian. The overall probability
-    is a simply the product of the individual probabilities.
+    independence. This means that the overall probability of fulfillment is
+    simply the product of the individual probabilities.
     """

@@ -112,9 +111,9 @@ def fit(self, X, Y):

         Parameters
         ----------
-        X :
+        X : np.ndarray of shape (n_samples, n_features)
             Parameters of the constraint function.
-        Y :
+        Y : np.ndarray of shape (n_samples, n_constraints)
             Values of the constraint function.

@@ -146,6 +145,9 @@ def predict(self, X):
         :math:`c^{\text{up}}` the lower and upper bounds of the constraint
         respectively.

+        Note
+        ----
+
         In case of multiple constraints, we assume conditional independence.
         This means we calculate the probability of constraint fulfilment
         individually, with the joint probability given as their product.
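To make the conditional-independence note concrete: for each constraint the model predicts a Gaussian posterior, the probability of fulfilment is the mass of that Gaussian between the constraint's bounds, and the joint probability is the product of the per-constraint values. A small numeric sketch with made-up numbers (not part of the commit):

import numpy as np
from scipy.stats import norm

# Hypothetical GP posterior means/stds for two constraints at one candidate
# point, together with the lower and upper bounds of each constraint.
mean = np.array([0.2, 1.5])
std = np.array([0.3, 0.5])
lb = np.array([0.0, 1.0])
ub = np.array([1.0, 2.0])

# Per-constraint probability that the constraint value lies inside [lb, ub].
p_each = norm.cdf(ub, loc=mean, scale=std) - norm.cdf(lb, loc=mean, scale=std)

# Conditional independence: the joint probability of fulfilment is their product.
p_joint = np.prod(p_each)
print(p_each, p_joint)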

docsrc/code_docs.rst

Lines changed: 0 additions & 34 deletions
This file was deleted.

docsrc/conf.py

Lines changed: 15 additions & 2 deletions

@@ -12,6 +12,7 @@
 #
 import os
 import sys
+import time
 import shutil
 from glob import glob
 from pathlib import Path

@@ -44,7 +45,8 @@
     'IPython.sphinxext.ipython_console_highlighting',
     'sphinx.ext.mathjax',
     "sphinx.ext.napoleon",
-    'sphinx_immaterial'
+    'sphinx.ext.intersphinx',
+    'sphinx_immaterial',
 ]

 source_suffix = {

@@ -58,6 +60,16 @@
 # This pattern also affects html_static_path and html_extra_path.
 exclude_patterns = []

+# Link types to the corresponding documentation
+intersphinx_mapping = {
+    'python': ('https://docs.python.org/3', None),
+    'numpy': ('https://numpy.org/doc/stable/', None),
+    'scipy': ('https://docs.scipy.org/doc/scipy/reference/', None),
+    'sklearn': ('https://scikit-learn.org/stable/', None),
+}
+
+
+napoleon_use_rtype = False

 # -- Options for HTML output -------------------------------------------------

@@ -67,7 +79,7 @@

 html_title = "Bayesian Optimization"
 html_theme = "sphinx_immaterial"
-copyright = 'Fernando Nogueira and the bayesian-optimization developers'
+copyright = f"{time.strftime('%Y')}, Fernando Nogueira and the bayesian-optimization developers"

 # material theme options (see theme.conf for more information)
 html_theme_options = {

@@ -122,6 +134,7 @@
     "version_dropdown": True,
     "version_json": '../versions.json',
     # END: version_dropdown
+    "scope": "/",  # share preferences across subsites
     "toc_title_is_page_title": True,
     # BEGIN: social icons
     "social": [

docsrc/examples.rst

Lines changed: 0 additions & 14 deletions
This file was deleted.

docsrc/index.rst

Lines changed: 25 additions & 4 deletions

@@ -1,11 +1,32 @@
 .. toctree::
    :hidden:
-   :maxdepth: 3
-   :caption: Contents:

    Quickstart <self>
-   Example Notebooks </examples>
-   /code_docs
+
+.. toctree::
+   :hidden:
+   :maxdepth: 3
+   :caption: Example Notebooks:
+
+   Basic Tour </basic-tour>
+   Advanced Tour </advanced-tour>
+   Constrained Bayesian Optimization </constraints>
+   Sequential Domain Reduction </domain_reduction>
+   Acquisition Functions </acquisition_functions>
+   Exploration vs. Exploitation </exploitation_vs_exploration>
+   Visualization of a 1D-Optimization </visualization>
+
+.. toctree::
+   :hidden:
+   :maxdepth: 2
+   :caption: API reference:
+
+   reference/bayes_opt
+   reference/acquisition
+   reference/constraint
+   reference/domain_reduction
+   reference/target_space
+   reference/other

 .. raw:: html

docsrc/reference/acquisition.rst

Lines changed: 14 additions & 0 deletions

@@ -0,0 +1,14 @@
+:py:mod:`bayes_opt.acquisition`
+-------------------------------
+
+.. automodule:: bayes_opt.acquisition
+   :members: AcquisitionFunction
+
+.. toctree::
+   :hidden:
+
+   acquisition/UpperConfidenceBound
+   acquisition/ProbabilityOfImprovement
+   acquisition/ExpectedImprovement
+   acquisition/GPHedge
+   acquisition/ConstantLiar
docsrc/reference/acquisition/ConstantLiar.rst

Lines changed: 5 additions & 0 deletions

@@ -0,0 +1,5 @@
+:py:class:`bayes_opt.acquisition.ConstantLiar`
+----------------------------------------------
+
+.. autoclass:: bayes_opt.acquisition.ConstantLiar
+   :members:

docsrc/reference/acquisition/ExpectedImprovement.rst

Lines changed: 5 additions & 0 deletions

@@ -0,0 +1,5 @@
+:py:class:`bayes_opt.acquisition.ExpectedImprovement`
+-----------------------------------------------------
+
+.. autoclass:: bayes_opt.acquisition.ExpectedImprovement
+   :members:

docsrc/reference/acquisition/GPHedge.rst

Lines changed: 5 additions & 0 deletions

@@ -0,0 +1,5 @@
+:py:class:`bayes_opt.acquisition.GPHedge`
+-----------------------------------------
+
+.. autoclass:: bayes_opt.acquisition.GPHedge
+   :members:

docsrc/reference/acquisition/ProbabilityOfImprovement.rst

Lines changed: 5 additions & 0 deletions

@@ -0,0 +1,5 @@
+:py:class:`bayes_opt.acquisition.ProbabilityOfImprovement`
+----------------------------------------------------------
+
+.. autoclass:: bayes_opt.acquisition.ProbabilityOfImprovement
+   :members:

docsrc/reference/acquisition/UpperConfidenceBound.rst

Lines changed: 5 additions & 0 deletions

@@ -0,0 +1,5 @@
+:py:class:`bayes_opt.acquisition.UpperConfidenceBound`
+------------------------------------------------------
+
+.. autoclass:: bayes_opt.acquisition.UpperConfidenceBound
+   :members:

docsrc/reference/bayes_opt.rst

Lines changed: 5 additions & 0 deletions

@@ -0,0 +1,5 @@
+:py:class:`bayes_opt.BayesianOptimization`
+------------------------------------------
+
+.. autoclass:: bayes_opt.BayesianOptimization
+   :members:

docsrc/reference/constraint.rst

Lines changed: 7 additions & 0 deletions

@@ -0,0 +1,7 @@
+:py:class:`bayes_opt.ConstraintModel`
+------------------------------------------------
+
+See the `Constrained Optimization notebook <../constraints.html#2.-Advanced-Constrained-Optimization>`__ for a complete example.
+
+.. autoclass:: bayes_opt.constraint.ConstraintModel
+   :members:

docsrc/reference/domain_reduction.rst

Lines changed: 7 additions & 0 deletions

@@ -0,0 +1,7 @@
+:py:class:`bayes_opt.SequentialDomainReductionTransformer`
+----------------------------------------------------------
+
+See the `Sequential Domain Reduction notebook <../domain_reduction.html>`__ for a complete example.
+
+.. autoclass:: bayes_opt.SequentialDomainReductionTransformer
+   :members:

docsrc/reference/other.rst

Lines changed: 11 additions & 0 deletions

@@ -0,0 +1,11 @@
+Other
+-----
+
+.. autoclass:: bayes_opt.ScreenLogger
+   :members:
+
+.. autoclass:: bayes_opt.JSONLogger
+   :members:
+
+.. autoclass:: bayes_opt.Events
+   :members:

docsrc/reference/target_space.rst

Lines changed: 5 additions & 0 deletions

@@ -0,0 +1,5 @@
+:py:class:`bayes_opt.TargetSpace`
+---------------------------------
+
+.. autoclass:: bayes_opt.TargetSpace
+   :members:
