@@ -88,14 +88,14 @@ estimate the noise level of data. An illustration of the
 log-marginal-likelihood (LML) landscape shows that there exist two local
 maxima of LML.

-.. figure:: ../auto_examples/gaussian_process/images/sphx_glr_plot_gpr_noisy_000.png
+.. figure:: ../auto_examples/gaussian_process/images/sphx_glr_plot_gpr_noisy_001.png
    :target: ../auto_examples/gaussian_process/plot_gpr_noisy.html
    :align: center

 The first corresponds to a model with a high noise level and a
 large length scale, which explains all variations in the data by noise.

-.. figure:: ../auto_examples/gaussian_process/images/sphx_glr_plot_gpr_noisy_001.png
+.. figure:: ../auto_examples/gaussian_process/images/sphx_glr_plot_gpr_noisy_002.png
    :target: ../auto_examples/gaussian_process/plot_gpr_noisy.html
    :align: center

@@ -106,7 +106,7 @@ hyperparameters, the gradient-based optimization might also converge to the
 high-noise solution. It is thus important to repeat the optimization several
 times for different initializations.

-.. figure:: ../auto_examples/gaussian_process/images/sphx_glr_plot_gpr_noisy_002.png
+.. figure:: ../auto_examples/gaussian_process/images/sphx_glr_plot_gpr_noisy_003.png
    :target: ../auto_examples/gaussian_process/plot_gpr_noisy.html
    :align: center

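The advice above about repeating the optimization from different initializations maps directly onto scikit-learn's `n_restarts_optimizer` parameter of `GaussianProcessRegressor`. A minimal sketch (the synthetic data here is illustrative, not from the linked example):

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

# Illustrative noisy 1-D data.
rng = np.random.RandomState(0)
X = rng.uniform(0, 5, 20)[:, np.newaxis]
y = 0.5 * np.sin(3 * X[:, 0]) + rng.normal(0, 0.5, X.shape[0])

# A kernel with a tunable noise level; its LML surface can have
# several local maxima, as the figures above illustrate.
kernel = 1.0 * RBF(length_scale=1.0) + WhiteKernel(noise_level=1.0)

# n_restarts_optimizer re-runs the optimizer from randomly drawn
# initial hyperparameters, reducing the risk of ending up in a
# poor local maximum of the LML.
gpr = GaussianProcessRegressor(
    kernel=kernel, n_restarts_optimizer=9, random_state=0
).fit(X, y)

print(gpr.kernel_)  # hyperparameters after optimization
print(gpr.log_marginal_likelihood(gpr.kernel_.theta))
```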
@@ -306,11 +306,11 @@ The second figure shows the log-marginal-likelihood for different choices of
 the kernel's hyperparameters, highlighting the two choices of the
 hyperparameters used in the first figure by black dots.

-.. figure:: ../auto_examples/gaussian_process/images/sphx_glr_plot_gpc_000.png
+.. figure:: ../auto_examples/gaussian_process/images/sphx_glr_plot_gpc_001.png
    :target: ../auto_examples/gaussian_process/plot_gpc.html
    :align: center

-.. figure:: ../auto_examples/gaussian_process/images/sphx_glr_plot_gpc_001.png
+.. figure:: ../auto_examples/gaussian_process/images/sphx_glr_plot_gpc_002.png
    :target: ../auto_examples/gaussian_process/plot_gpc.html
    :align: center

@@ -493,7 +493,7 @@ kernel as covariance function have mean square derivatives of all orders, and are
 very smooth. The prior and posterior of a GP resulting from an RBF kernel are shown in
 the following figure:

-.. figure:: ../auto_examples/gaussian_process/images/sphx_glr_plot_gpr_prior_posterior_000.png
+.. figure:: ../auto_examples/gaussian_process/images/sphx_glr_plot_gpr_prior_posterior_001.png
    :target: ../auto_examples/gaussian_process/plot_gpr_prior_posterior.html
    :align: center

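The "prior of a GP" shown in that figure can be reproduced programmatically: scikit-learn's `GaussianProcessRegressor.sample_y` draws functions from the prior when called before fitting. A minimal sketch:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

X = np.linspace(0, 5, 50)[:, np.newaxis]

# An unfitted regressor samples from the GP prior; with an RBF
# kernel the drawn functions are very smooth.
gpr = GaussianProcessRegressor(kernel=RBF(length_scale=1.0))
y_samples = gpr.sample_y(X, n_samples=3, random_state=0)
# y_samples has shape (n_points, n_samples).
```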
@@ -534,7 +534,7 @@ allows adapting to the properties of the true underlying functional relation.
 The prior and posterior of a GP resulting from a Matérn kernel are shown in
 the following figure:

-.. figure:: ../auto_examples/gaussian_process/images/sphx_glr_plot_gpr_prior_posterior_004.png
+.. figure:: ../auto_examples/gaussian_process/images/sphx_glr_plot_gpr_prior_posterior_005.png
    :target: ../auto_examples/gaussian_process/plot_gpr_prior_posterior.html
    :align: center

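The Matérn kernel's adaptability comes from its smoothness parameter `nu`: small values give rough sample paths, and as `nu` grows the kernel approaches the RBF kernel. A small sketch of that behaviour using scikit-learn's `Matern` kernel:

```python
import numpy as np
from sklearn.gaussian_process.kernels import Matern, RBF

X = np.array([[0.0], [0.5]])

# nu=0.5 reduces to the absolute-exponential kernel exp(-d/l),
# whose samples are continuous but not differentiable.
k_rough = Matern(length_scale=1.0, nu=0.5)
# nu=2.5 gives twice-differentiable sample functions.
k_smooth = Matern(length_scale=1.0, nu=2.5)
k_rbf = RBF(length_scale=1.0)

# Calling a kernel on X returns the covariance matrix k(X, X);
# for nu=0.5 the off-diagonal entry equals exp(-0.5) here.
print(k_rough(X)[0, 1], k_smooth(X)[0, 1], k_rbf(X)[0, 1])
```

For this pair of points, the `nu=2.5` covariance lies much closer to the RBF value than the `nu=0.5` one does, consistent with the limiting behaviour described above.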
@@ -556,7 +556,7 @@ The kernel is given by:
 The prior and posterior of a GP resulting from a :class:`RationalQuadratic` kernel are shown in
 the following figure:

-.. figure:: ../auto_examples/gaussian_process/images/sphx_glr_plot_gpr_prior_posterior_001.png
+.. figure:: ../auto_examples/gaussian_process/images/sphx_glr_plot_gpr_prior_posterior_002.png
    :target: ../auto_examples/gaussian_process/plot_gpr_prior_posterior.html
    :align: center

@@ -574,7 +574,7 @@ The kernel is given by:
 The prior and posterior of a GP resulting from an ExpSineSquared kernel are shown in
 the following figure:

-.. figure:: ../auto_examples/gaussian_process/images/sphx_glr_plot_gpr_prior_posterior_002.png
+.. figure:: ../auto_examples/gaussian_process/images/sphx_glr_plot_gpr_prior_posterior_003.png
    :target: ../auto_examples/gaussian_process/plot_gpr_prior_posterior.html
    :align: center

@@ -594,7 +594,7 @@ is called the homogeneous linear kernel, otherwise it is inhomogeneous. The kernel
 The :class:`DotProduct` kernel is commonly combined with exponentiation. An example with exponent 2 is
 shown in the following figure:

-.. figure:: ../auto_examples/gaussian_process/images/sphx_glr_plot_gpr_prior_posterior_003.png
+.. figure:: ../auto_examples/gaussian_process/images/sphx_glr_plot_gpr_prior_posterior_004.png
    :target: ../auto_examples/gaussian_process/plot_gpr_prior_posterior.html
    :align: center

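In scikit-learn, the combination with exponentiation mentioned above is written with the `**` operator on kernel objects, which wraps the base kernel in an `Exponentiation` kernel. A minimal sketch with exponent 2:

```python
import numpy as np
from sklearn.gaussian_process.kernels import DotProduct

# Exponentiating DotProduct with exponent 2 yields the quadratic
# kernel k(x, y) = (sigma_0**2 + x . y)**2.
kernel = DotProduct(sigma_0=1.0) ** 2

X = np.array([[1.0], [2.0]])
K = kernel(X)  # 2x2 covariance matrix
# e.g. K[0, 1] = (1 + 1*2)**2 = 9
print(K)
```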