squared_exponential_kernel
squared_exponential_kernel(X1, X2, length=0.2, scale=1, derivatives=(-1, -1))
Evaluates the squared exponential (i.e., Gaussian density) kernel and its derivatives at any given pairs of points X1 and X2.
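For reference, the squared exponential kernel between a pair of points is commonly written as below; whether the implementation also includes the Gaussian-density normalization constant suggested by the description above is an assumption best confirmed against the source linked at the bottom of this page.

$$
K(x_1, x_2) = \mathrm{scale} \cdot \exp\!\left( -\frac{\lVert x_1 - x_2 \rVert^2}{2\,\mathrm{length}^2} \right)
$$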
Parameters:
Name | Type | Description | Default |
---|---|---|---|
X1 | (num_points,dim) numpy array | The first set of points to evaluate the kernel at. | required |
X2 | (num_points,dim) numpy array | The second set of points to evaluate the kernel at. | required |
length | float or (num_points,) numpy array | The scalar length scale of the kernel (can be a vector if the length is different for different points). | 0.2 |
scale | float | The scalar scale factor of the kernel. | 1 |
derivatives | (2,) numpy array | The indices (i,j) of the derivatives to take: d^2 K / dx1_i dx2_j. A value of -1 means no derivative. Note that these are not commutative: the derivative with respect to x1_i is not the same as the derivative with respect to x2_i (there is a minus sign difference). | (-1, -1) |
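As a minimal sketch of the derivatives argument (the point values below are made up for illustration), passing derivatives=(0,-1) requests the first derivative of the kernel with respect to the 0-th coordinate of X1, with no derivative taken in X2:

```python
import numpy as np
import gpytoolbox

# Two matching sets of 1D points, shape (num_points, 1)
X1 = np.linspace(0.0, 1.0, 50).reshape(-1, 1)
X2 = X1 + 0.1

# Plain kernel values, no derivatives
K = gpytoolbox.squared_exponential_kernel(X1, X2, length=0.5, scale=1.0)

# First derivative with respect to the 0-th coordinate of X1
dK_dx1 = gpytoolbox.squared_exponential_kernel(X1, X2, length=0.5, scale=1.0,
                                               derivatives=(0, -1))

# Differentiating with respect to X2 instead flips the sign,
# per the non-commutativity note in the table above
dK_dx2 = gpytoolbox.squared_exponential_kernel(X1, X2, length=0.5, scale=1.0,
                                               derivatives=(-1, 0))
```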
Returns:
Name | Type | Description |
---|---|---|
K | (num_points,) numpy array | The kernel evaluated at the given points. |
Examples:
```python
import numpy as np
import gpytoolbox

# We can evaluate the kernel directly
num_samples = 100
x = np.reshape(np.linspace(-1,1,num_samples),(-1,1))
y = x + 0.2
v = gpytoolbox.squared_exponential_kernel(x,y)

# But more often we'll use it to call a Gaussian process:
def ker_fun(X1,X2,derivatives=(-1,-1)):
    return gpytoolbox.squared_exponential_kernel(X1,X2,derivatives=derivatives,length=1,scale=0.1)

# Then, call a Gaussian process
x_train = np.linspace(0,1,20)
y_train = 2*x_train
x_test = np.linspace(0,1,120)
y_test_mean, y_test_cov = gpytoolbox.gaussian_process(x_train,y_train,x_test,kernel=ker_fun)
```
Source code in src/gpytoolbox/squared_exponential_kernel.py