sklearn.gaussian_process.kernels.Product

class sklearn.gaussian_process.kernels.Product(k1, k2)[source]

Product-kernel k1 * k2 of two kernels k1 and k2.

The resulting kernel is defined as k_prod(X, Y) = k1(X, Y) * k2(X, Y)

New in version 0.18.

Parameters:
k1 : Kernel object

The first base-kernel of the product-kernel

k2 : Kernel object

The second base-kernel of the product-kernel
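A minimal construction sketch (the base kernels ConstantKernel and RBF are chosen here only for illustration): multiplying two kernel objects with the * operator produces the same Product kernel as calling the constructor directly.

import numpy as np
from sklearn.gaussian_process.kernels import Product, ConstantKernel, RBF

# Explicit construction of the product-kernel ...
kernel = Product(ConstantKernel(constant_value=2.0), RBF(length_scale=1.0))
# ... which is what the * operator on two kernels creates.
same_kernel = ConstantKernel(constant_value=2.0) * RBF(length_scale=1.0)

X = np.array([[0.0], [1.0], [2.0]])
# k_prod(X, X) = k1(X, X) * k2(X, X), evaluated elementwise
assert np.allclose(kernel(X), same_kernel(X))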

Attributes:
bounds

Returns the log-transformed bounds on theta.

hyperparameters

Returns a list of all hyperparameter specifications.

n_dims

Returns the number of non-fixed hyperparameters of the kernel.

theta

Returns the (flattened, log-transformed) non-fixed hyperparameters.

Methods

__call__(X[, Y, eval_gradient]) Return the kernel k(X, Y) and optionally its gradient.
clone_with_theta(theta) Returns a clone of self with given hyperparameters theta.
diag(X) Returns the diagonal of the kernel k(X, X).
get_params([deep]) Get parameters of this kernel.
is_stationary() Returns whether the kernel is stationary.
set_params(**params) Set the parameters of this kernel.
__init__(k1, k2)[source]

Initialize self. See help(type(self)) for accurate signature.

__call__(X, Y=None, eval_gradient=False)[source]

Return the kernel k(X, Y) and optionally its gradient.

Parameters:
X : array, shape (n_samples_X, n_features)

Left argument of the returned kernel k(X, Y)

Y : array, shape (n_samples_Y, n_features), (optional, default=None)

Right argument of the returned kernel k(X, Y). If None, k(X, X) is evaluated instead.

eval_gradient : bool (optional, default=False)

Determines whether the gradient with respect to the kernel hyperparameters is computed.

Returns:
K : array, shape (n_samples_X, n_samples_Y)

Kernel k(X, Y)

K_gradient : array (opt.), shape (n_samples_X, n_samples_X, n_dims)

The gradient of the kernel k(X, X) with respect to the hyperparameter of the kernel. Only returned when eval_gradient is True.
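A short usage sketch for __call__ (the ConstantKernel * RBF product is an arbitrary example): with eval_gradient=True and Y left as None, the method also returns the gradient with respect to the log-transformed hyperparameters.

import numpy as np
from sklearn.gaussian_process.kernels import ConstantKernel, RBF

kernel = ConstantKernel(2.0) * RBF(1.0)          # a Product kernel with two hyperparameters
X = np.random.RandomState(0).rand(5, 2)

K = kernel(X)                                    # k(X, X), shape (5, 5)
K, K_gradient = kernel(X, eval_gradient=True)    # gradient only available when Y is None
print(K.shape, K_gradient.shape)                 # (5, 5) (5, 5, 2)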

bounds

Returns the log-transformed bounds on theta.

Returns:
bounds : array, shape (n_dims, 2)

The log-transformed bounds on the kernel’s hyperparameters theta
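A brief sketch of the bounds property (the bounds values below are illustrative): the returned array stacks the log-transformed bounds of both base kernels, one row per non-fixed hyperparameter.

import numpy as np
from sklearn.gaussian_process.kernels import ConstantKernel, RBF

kernel = (ConstantKernel(2.0, constant_value_bounds=(1e-3, 1e3))
          * RBF(1.0, length_scale_bounds=(1e-2, 1e2)))
print(kernel.bounds.shape)       # (2, 2): one row per non-fixed hyperparameter
print(np.exp(kernel.bounds))     # the same bounds on the original (non-log) scale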

clone_with_theta(theta)[source]

Returns a clone of self with given hyperparameters theta.

Parameters:
theta : array, shape (n_dims,)

The hyperparameters
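A minimal sketch of clone_with_theta (the hyperparameter values are illustrative): theta must be given on the log scale, and the original kernel is left unchanged.

import numpy as np
from sklearn.gaussian_process.kernels import ConstantKernel, RBF

kernel = ConstantKernel(2.0) * RBF(1.0)
clone = kernel.clone_with_theta(np.log([4.0, 0.5]))   # [constant_value, length_scale] on the log scale
print(np.exp(clone.theta))                            # [4.0, 0.5]
print(np.exp(kernel.theta))                           # [2.0, 1.0]; the original kernel is untouched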

diag(X)[source]

Returns the diagonal of the kernel k(X, X).

The result of this method is identical to np.diag(self(X)); however, it can be evaluated more efficiently since only the diagonal is evaluated.

Parameters:
X : array, shape (n_samples_X, n_features)

Left argument of the returned kernel k(X, Y)

Returns:
K_diag : array, shape (n_samples_X,)

Diagonal of kernel k(X, X)
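A quick sketch (ConstantKernel * RBF chosen for illustration) showing that diag matches np.diag of the full kernel matrix:

import numpy as np
from sklearn.gaussian_process.kernels import ConstantKernel, RBF

kernel = ConstantKernel(2.0) * RBF(1.0)
X = np.random.RandomState(0).rand(4, 3)
# Same values as np.diag(kernel(X)), but computed without the full (4, 4) matrix.
assert np.allclose(kernel.diag(X), np.diag(kernel(X)))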

get_params(deep=True)[source]

Get parameters of this kernel.

Parameters:
deep : boolean, optional

If True, will return the parameters for this estimator and contained subobjects that are estimators.

Returns:
params : mapping of string to any

Parameter names mapped to their values.
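An illustrative sketch of get_params on a product-kernel (the key names in the comment assume a ConstantKernel * RBF product): with deep=True the nested parameters of k1 and k2 appear with the <component>__<parameter> prefix.

from sklearn.gaussian_process.kernels import ConstantKernel, RBF

kernel = ConstantKernel(2.0) * RBF(1.0)
params = kernel.get_params()        # deep=True by default
# Contains 'k1' and 'k2' plus nested entries such as
# 'k1__constant_value' and 'k2__length_scale'.
print(sorted(params))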

hyperparameters

Returns a list of all hyperparameter specifications.

is_stationary()[source]

Returns whether the kernel is stationary.

n_dims

Returns the number of non-fixed hyperparameters of the kernel.
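A small sketch covering the hyperparameters, is_stationary and n_dims accessors above (the ConstantKernel * RBF product is only an example, and the prefixed hyperparameter names in the comment are how a Product typically reports them):

from sklearn.gaussian_process.kernels import ConstantKernel, RBF

kernel = ConstantKernel(2.0) * RBF(1.0)
print(kernel.n_dims)                  # 2: one non-fixed hyperparameter per base kernel
print(kernel.is_stationary())         # True: both base kernels are stationary
for hp in kernel.hyperparameters:     # e.g. 'k1__constant_value', 'k2__length_scale'
    print(hp.name, hp.bounds)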

set_params(**params)[source]

Set the parameters of this kernel.

The method works on simple kernels as well as on nested kernels. The latter have parameters of the form <component>__<parameter> so that it’s possible to update each component of a nested object.

Returns:
self
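A brief sketch of a nested parameter update (the parameter names assume a ConstantKernel * RBF product):

from sklearn.gaussian_process.kernels import ConstantKernel, RBF

kernel = ConstantKernel(2.0) * RBF(1.0)
# Nested parameters are addressed as <component>__<parameter>.
kernel.set_params(k1__constant_value=3.0, k2__length_scale=0.5)
print(kernel.get_params()['k2__length_scale'])   # 0.5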
theta

Returns the (flattened, log-transformed) non-fixed hyperparameters.

Note that theta are typically the log-transformed values of the kernel’s hyperparameters as this representation of the search space is more amenable for hyperparameter search, as hyperparameters like length-scales naturally live on a log-scale.

Returns:
theta : array, shape (n_dims,)

The non-fixed, log-transformed hyperparameters of the kernel
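A final sketch of the theta property (the values are illustrative): theta concatenates the log-transformed non-fixed hyperparameters of k1 and k2, and assigning to it updates the underlying base kernels.

import numpy as np
from sklearn.gaussian_process.kernels import ConstantKernel, RBF

kernel = ConstantKernel(2.0) * RBF(1.0)
print(kernel.theta)                  # [log(2.0), log(1.0)]
print(np.exp(kernel.theta))          # [2.0, 1.0] on the original scale

kernel.theta = np.log([3.0, 0.5])    # setting theta updates both base kernels
print(kernel.k2.length_scale)        # approximately 0.5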