doc/modules/preprocessing.rst
Generating polynomial features
==============================

Often it's useful to add complexity to a model by considering nonlinear features of the input data. A simple and common method is to use polynomial features, which provide the features' higher-order and interaction terms. This is implemented in :class:`PolynomialFeatures`::

    >>> import numpy as np
    >>> from sklearn.preprocessing import PolynomialFeatures

The features of X have been transformed from :math:`(X_1, X_2)` to :math:`(1, X_1, X_2, X_1^2, X_1X_2, X_2^2)`.
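The doctest above is truncated here; a runnable sketch of the full transform might look as follows (the input matrix ``X`` is an assumption chosen so that it has two features, matching :math:`(X_1, X_2)`):

```python
import numpy as np
from sklearn.preprocessing import PolynomialFeatures

# Hypothetical input matrix (the original doctest's X is elided in this excerpt)
X = np.arange(6).reshape(3, 2)   # [[0, 1], [2, 3], [4, 5]]

# degree=2 expands each row into the columns (1, X1, X2, X1^2, X1*X2, X2^2)
poly = PolynomialFeatures(degree=2)
result = poly.fit_transform(X)
print(result)
```

For the row ``[2, 3]`` this yields ``[1, 2, 3, 4, 6, 9]``, i.e. the bias term, both features, both squares, and the interaction term.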

In some cases, only interaction terms among features are required; they can be obtained with the setting ``interaction_only=True``::

    >>> X = np.arange(9).reshape(3, 3)
    >>> X # doctest: +ELLIPSIS
    array([[0, 1, 2],
           [3, 4, 5],
           [6, 7, 8]])

The features of X have been transformed from :math:`(X_1, X_2, X_3)` to :math:`(1, X_1, X_2, X_3, X_1X_2, X_1X_3, X_2X_3, X_1X_2X_3)`.
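The transform that produces this interaction-only expansion is also elided in this excerpt; a sketch follows (``degree=3`` is an assumption inferred from the presence of the triple term :math:`X_1X_2X_3` in the output):

```python
import numpy as np
from sklearn.preprocessing import PolynomialFeatures

X = np.arange(9).reshape(3, 3)   # [[0, 1, 2], [3, 4, 5], [6, 7, 8]]

# interaction_only=True drops pure powers such as X1^2; with degree=3 the
# columns are (1, X1, X2, X3, X1*X2, X1*X3, X2*X3, X1*X2*X3)
poly = PolynomialFeatures(degree=3, interaction_only=True)
result = poly.fit_transform(X)
print(result)
```

For the row ``[3, 4, 5]`` this yields ``[1, 3, 4, 5, 12, 15, 20, 60]``: no squared or cubed single features appear, only products of distinct features.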

Note that polynomial features are used implicitly in `kernel methods <http://en.wikipedia.org/wiki/Kernel_method>`_ (e.g., :class:`sklearn.svm.SVC`, :class:`sklearn.decomposition.KernelPCA`) when using polynomial :ref:`svm_kernels`.
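To illustrate the "implicit" part, the sketch below uses :func:`sklearn.metrics.pairwise.polynomial_kernel` (chosen here only for demonstration; the SVC calls an equivalent routine internally). The kernel evaluates polynomial similarities directly from dot products, without ever materializing the expanded feature vectors:

```python
import numpy as np
from sklearn.metrics.pairwise import polynomial_kernel

X = np.array([[0., 1.], [2., 3.], [4., 5.]])

# The polynomial kernel computes (gamma * <x, y> + coef0) ** degree pairwise,
# so the expanded feature space is never built explicitly.
K = polynomial_kernel(X, X, degree=2, gamma=1.0, coef0=1.0)

# Manual check against the closed-form expression
manual = (X @ X.T + 1.0) ** 2
print(np.allclose(K, manual))  # True
```

This is why kernel methods scale with the number of samples rather than with the (potentially huge) number of polynomial features.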

See :ref:`example_linear_model_plot_polynomial_interpolation.py` for Ridge regression using the created polynomial features.