Commit 03aa748

GaelVaroquaux authored and amueller committed
DOC: minor formatting in model_evaluation.rst
1 parent 31254c2 commit 03aa748

File tree

1 file changed: +17 -16 lines changed

doc/modules/model_evaluation.rst

Lines changed: 17 additions & 16 deletions
@@ -772,8 +772,8 @@ function for more information see the :ref:`clustering_evaluation` section.
 
 .. currentmodule:: sklearn
 
-Flexible Scoring Objects
-========================
+`Scoring` objects: defining your scoring rules
+===============================================
 While the above functions provide a simple interface for most use-cases, they
 can not directly be used for model selection and evaluation using
 :class:`grid_search.GridSearchCV` and
@@ -790,35 +790,36 @@ them), you can simply provide a string as the ``scoring`` parameter. Possible
 values are:
 
 
-=================== =========================================
+=================== ===============================================
 Scoring             Function
-=================== =========================================
+=================== ===============================================
 **Classification**
-'accuracy'          sklearn.metrics.accuracy_score
-'average_precision' sklearn.metrics.average_precision_score
-'f1'                sklearn.metrics.f1_score
-'precision'         sklearn.metrics.precision_score
-'recall'            sklearn.metrics.recall_score
-'roc_auc'           sklearn.merrics.auc_score
+'accuracy'          :func:`sklearn.metrics.accuracy_score`
+'average_precision' :func:`sklearn.metrics.average_precision_score`
+'f1'                :func:`sklearn.metrics.f1_score`
+'precision'         :func:`sklearn.metrics.precision_score`
+'recall'            :func:`sklearn.metrics.recall_score`
+'roc_auc'           :func:`sklearn.metrics.auc_score`
 
 **Clustering**
-'ari'`              sklearn.metrics.adjusted_rand_score
+'ari'`              :func:`sklearn.metrics.adjusted_rand_score`
 
 **Regression**
-'mse'               sklearn.metrics.mean_squared_error
-'r2'                sklearn.metrics.r2_score
-=================== =========================================
+'mse'               :func:`sklearn.metrics.mean_squared_error`
+'r2'                :func:`sklearn.metrics.r2_score`
+=================== ===============================================
 
 .. currentmodule:: sklearn.metrics
 
-Creating Scoring Objects From Score Functions
+Creating scoring objects from score functions
 ---------------------------------------------
 If you want to use a scoring function that takes additional parameters, such as
 :func:`fbeta_score`, you need to generate an appropriate scoring object. The
 simplest way to generate a callable object for scoring is by using
 :class:`Scorer`.
 :class:`Scorer` converts score functions as above into callables
 that can be used for model evaluation.
+
 One typical use case is to wrap an existing scoring function from the library
 with non default value for its parameters such as the beta parameter for the
 :func:fbeta_score function::
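The hunk's context breaks off at the literal-block marker, just before the documentation's own example. As a hedged sketch of what such a wrapper looks like, written with ``make_scorer``, the modern scikit-learn successor of the :class:`Scorer` helper named above (the beta value is illustrative)::

    from sklearn.metrics import fbeta_score, make_scorer

    # Bake beta=2 into fbeta_score; the result is a callable that grid
    # search and cross-validation accept as their ``scoring`` argument.
    ftwo_scorer = make_scorer(fbeta_score, beta=2)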
@@ -846,7 +847,7 @@ predictions as input (``needs_threshold=False``) or needs confidence scores
 in the example above.
 
 
-Implementing Your Own Scoring Object
+Implementing your own scoring object
 ------------------------------------
 You can generate even more flexible model scores by constructing your own
 scoring object from scratch, without using the :class:`Scorer` helper class.
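The section this last hunk retitles covers scoring objects built from scratch. A minimal sketch of that contract, assuming any callable with the signature ``(estimator, X, y)`` returning a float (higher meaning better) is accepted; the dataset, model, and parameter grid below are illustrative, and the imports follow today's module layout rather than the ``grid_search`` module referenced in the diff::

    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.model_selection import GridSearchCV
    from sklearn.svm import SVC

    # A from-scratch scoring object: any callable taking (estimator, X, y)
    # and returning a single float works, without the Scorer helper class.
    def accuracy_by_hand(estimator, X, y):
        return float(np.mean(estimator.predict(X) == y))

    X, y = make_classification(random_state=0)
    grid = GridSearchCV(SVC(), {"C": [0.1, 1, 10]}, scoring=accuracy_by_hand)
    grid.fit(X, y)

    # Equivalently, scoring="accuracy" would select the builtin scorer
    # listed in the table above.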
