eztaox.kernels.quasisep
Scalable kernels exploiting the quasiseparable structure of the relevant matrices to achieve O(N) scaling. This module extends the tinygp.kernels.quasisep module.
Classes

- Quasisep: An extension of the tinygp.kernels.quasisep.Quasisep kernel.
- Sum: A helper to represent the sum of two quasiseparable kernels.
- Product: A helper to represent the product of two quasiseparable kernels.
- Scale: The product of a scalar and a quasiseparable kernel.
- Exp: An extension of the tinygp.kernels.quasisep.Exp kernel, adding a power method.
- Cosine: An extension of the tinygp.kernels.quasisep.Cosine kernel, adding a power method.
- Celerite: An extension of the tinygp.kernels.quasisep.Celerite kernel, adding a power method.
- Matern32: An extension of the tinygp.kernels.quasisep.Matern32 kernel, adding a power method.
- Matern52: An extension of the tinygp.kernels.quasisep.Matern52 kernel, adding a power method.
- SHO: An extension of the tinygp.kernels.quasisep.SHO kernel, adding a power method.
- Lorentzian: The Lorentzian kernel.
- CARMA: A continuous-time autoregressive moving average (CARMA) process kernel.
- MultibandLowRank: A multiband kernel implementing a low-rank Kronecker covariance structure.
Functions

- carma_quads2poly: Expand a product of quadratic equations into a polynomial.
- carma_poly2quads: Factorize a polynomial into a product of quadratic equations.
- carma_acvf: Compute the coefficients of the autocovariance function (ACVF).
Module Contents
- class Quasisep[source]
Bases: tinygp.kernels.quasisep.Quasisep

An extension of the tinygp.kernels.quasisep.Quasisep kernel.
tinygp.kernels.quasisep.Quasisep is the base class for all kernels that can be evaluated with O(N) scaling. This extension adds a power method that returns the power spectral density (PSD) of a quasiseparable kernel at an input frequency.
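For intuition, a power method of this kind evaluates the Fourier transform of the covariance function. As a standalone illustration (not the library API, and under one particular Fourier convention), the simple exponential covariance \(k(\tau) = \sigma^2\,e^{-|\tau|/\ell}\) has the closed-form PSD \(2\,\sigma^2\,\ell / (1 + \omega^2\ell^2)\):

```python
def exp_kernel_psd(omega, scale, sigma=1.0):
    # PSD of k(tau) = sigma^2 * exp(-|tau| / scale), i.e. the Fourier
    # transform of the covariance under P(w) = \int k(tau) e^{-i w tau} dtau.
    # Standalone sketch; the library's power method may use a different
    # normalization convention.
    return 2.0 * sigma**2 * scale / (1.0 + (omega * scale) ** 2)
```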
- class Sum[source]
Bases: Quasisep, tinygp.kernels.quasisep.Sum

A helper to represent the sum of two quasiseparable kernels.
- class Product[source]
Bases: Quasisep, tinygp.kernels.quasisep.Product

A helper to represent the product of two quasiseparable kernels.
- class Scale[source]
Bases: Quasisep, tinygp.kernels.quasisep.Scale

The product of a scalar and a quasiseparable kernel.
- class Exp[source]
Bases: Quasisep, tinygp.kernels.quasisep.Exp

An extension of the tinygp.kernels.quasisep.Exp kernel, adding a power method.
- class Cosine[source]
Bases: Quasisep, tinygp.kernels.quasisep.Cosine

An extension of the tinygp.kernels.quasisep.Cosine kernel, adding a power method.
- class Celerite[source]
Bases: Quasisep, tinygp.kernels.quasisep.Celerite

An extension of the tinygp.kernels.quasisep.Celerite kernel, adding a power method.
- class Matern32[source]
Bases: Quasisep, tinygp.kernels.quasisep.Matern32

An extension of the tinygp.kernels.quasisep.Matern32 kernel, adding a power method.
- class Matern52[source]
Bases: Quasisep, tinygp.kernels.quasisep.Matern52

An extension of the tinygp.kernels.quasisep.Matern52 kernel, adding a power method.
- class SHO[source]
Bases: Quasisep, tinygp.kernels.quasisep.SHO

An extension of the tinygp.kernels.quasisep.SHO kernel, adding a power method.
- class Lorentzian[source]
Bases: Quasisep

The Lorentzian kernel.
The kernel takes the form:
\[k(\tau) = \sigma^2\,\exp(-b\,\tau)\,\cos(\omega\,\tau)\]
for \(\tau = |x_i - x_j|\) and \(b = \frac{\omega}{2\,Q}\).
- Parameters:
omega – The parameter \(\omega\).
quality – The parameter \(Q\).
sigma (optional) – The parameter \(\sigma\). Defaults to a value of 1. Specifying the explicit value here provides a slight performance boost compared to independently multiplying the kernel with a prefactor.
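The covariance above can be sketched as a plain-Python helper (hypothetical name, not the library API, which operates on JAX arrays):

```python
import math

def lorentzian_kernel(tau, omega, quality, sigma=1.0):
    # k(tau) = sigma^2 * exp(-b * |tau|) * cos(omega * |tau|),
    # with b = omega / (2 * Q). Hypothetical standalone sketch.
    b = omega / (2.0 * quality)
    tau = abs(tau)
    return sigma**2 * math.exp(-b * tau) * math.cos(omega * tau)
```

Note the symmetry k(tau) = k(-tau), as required of a stationary covariance.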
- observation_model(X: tinygp.helpers.JAXArray) tinygp.helpers.JAXArray[source]
The observation model for the process
- class CARMA(alpha: tinygp.helpers.JAXArray | numpy.typing.NDArray, beta: tinygp.helpers.JAXArray | numpy.typing.NDArray)[source]
Bases: Quasisep

A continuous-time autoregressive moving average (CARMA) process kernel.
This process has the power spectral density (PSD)
\[P(\omega) = \sigma^2\,\frac{|\sum_{q} \beta_q\,(i\,\omega)^q|^2}{|\sum_{p} \alpha_p\,(i\,\omega)^p|^2}\]
defined following Equation 1 in Kelly et al. (2014), where \(\alpha_p\) and \(\beta_0\) are set to 1. In this implementation, \(\sigma\) is absorbed into the \(\beta\) parameters; that is, \(\beta_{\mathrm{new}} = \beta\,\sigma\).
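The PSD above can be evaluated directly with a short standalone sketch (hypothetical helper, not the library API; note that alpha excludes the implicit leading coefficient \(\alpha_p = 1\) and beta has \(\sigma\) absorbed, matching the parameterization described here):

```python
def carma_psd(omega, alpha, beta):
    # alpha: AR coefficients [alpha_0, ..., alpha_{p-1}] (alpha_p = 1 implicit),
    # beta: MA coefficients [beta_0, ..., beta_q] with sigma absorbed.
    # Evaluates P(w) = |sum_q beta_q (i w)^q|^2 / |sum_p alpha_p (i w)^p|^2.
    iw = 1j * omega
    num = sum(b * iw**q for q, b in enumerate(beta))
    den = sum(a * iw**p for p, a in enumerate(list(alpha) + [1.0]))
    return abs(num) ** 2 / abs(den) ** 2
```

For a CARMA(1,0) model with alpha=[a0] and beta=[b0], this reduces to the familiar Lorentzian PSD \(b_0^2 / (a_0^2 + \omega^2)\).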
Note
To construct a stationary CARMA kernel/process, the roots of the characteristic polynomials for Equation 1 in Kelly et al. (2014) must have negative real parts. This condition is met automatically by requiring positive input parameters when instantiating the kernel with the __init__() method (for CARMA(1,0), CARMA(2,0), and CARMA(2,1) models) or with the from_quads() method.

Note
Implementation details
The logic behind this implementation is simple: find the combination of real/complex exponential kernels that matches the autocovariance function of the CARMA model (note that the ordering of the kernels also matters). This task is handled by the acvf method; the rest follows the Exp and Celerite kernels.

Given the requirement of negative roots for stationarity, the from_quads method facilitates constructing stationary higher-order CARMA models beyond CARMA(2,1). Its inputs are the coefficients of the quadratic equations factorized out of the full characteristic polynomial. poly2quads factorizes a polynomial into a product of such quadratic equations, and quads2poly performs the reverse.

One last trick is the use of _real_mask, _complex_mask, and complex_select, which are arrays of 0s and 1s implemented to avoid control flow: some intermediate quantities are always computed, but contribute only when there is a matching real or complex exponential kernel for the specific CARMA kernel.
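The mask trick can be illustrated with a tiny control-flow-free blend (plain-list sketch of the idea, not the library's internal code, which uses JAX arrays):

```python
def masked_blend(real_branch, complex_branch, real_mask):
    # Both branches are computed unconditionally; the 0/1 mask selects which
    # one actually contributes to each entry. This mirrors the branch-free
    # style that JIT-compiled JAX code requires.
    return [m * r + (1 - m) * c
            for r, c, m in zip(real_branch, complex_branch, real_mask)]
```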
- Parameters:
alpha – The parameter \(\alpha\) in the definition above, excluding \(\alpha_p\). This should be an array of length p.
beta – The product of parameters \(\beta\) and parameter \(\sigma\) in the definition above. This should be an array of length q+1, where q+1 <= p.
- classmethod from_quads(alpha_quads: tinygp.helpers.JAXArray | numpy.typing.NDArray, beta_quads: tinygp.helpers.JAXArray | numpy.typing.NDArray, beta_mult: tinygp.helpers.JAXArray | numpy.typing.NDArray) CARMA[source]
Construct a CARMA kernel using the roots of its characteristic polynomials
The roots can be parameterized as the 0th- and 1st-order coefficients of a set of quadratic equations (the 2nd-order coefficient equals 1). The product of these quadratic equations gives the characteristic polynomials of CARMA. The inputs to this method are those coefficients of the quadratic equations. See Equation 30 in Kelly et al. (2014) for more detail.
- Parameters:
alpha_quads – Coefficients of the auto-regressive (AR) quadratic equations corresponding to the \(\alpha\) parameters. This should be an array of length p.
beta_quads – Coefficients of the moving-average (MA) quadratic equations corresponding to the \(\beta\) parameters. This should be an array of length q.
beta_mult – A multiplier of the MA coefficients, equivalent to \(\beta_q\), the last entry of the \(\beta\) parameters input to the __init__() method.
- observation_model(X: tinygp.helpers.JAXArray) tinygp.helpers.JAXArray[source]
The observation model for the process
- carma_quads2poly(quads_coeffs: tinygp.helpers.JAXArray) tinygp.helpers.JAXArray[source]
Expand a product of quadratic equations into a polynomial
- Parameters:
quads_coeffs – The 0th and 1st order coefficients of the quadratic equations. The last entry is a multiplier, which corresponds to the coefficient of the highest order term in the output full polynomial.
- Returns:
Coefficients of the full polynomial. The first entry corresponds to the lowest order term.
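The expansion can be sketched in plain Python (assumed input layout per the description above: coefficient pairs for monic quadratics followed by a trailing multiplier; the actual carma_quads2poly operates on JAX arrays):

```python
def quads_to_poly(quads_coeffs):
    # Assumed layout: [c0_1, c1_1, c0_2, c1_2, ..., mult], where each pair
    # (c0, c1) encodes the monic quadratic x^2 + c1*x + c0 and the final
    # entry multiplies the whole product (so it becomes the leading coeff).
    *pairs, mult = quads_coeffs
    poly = [1.0]  # coefficients, highest order first while accumulating
    for i in range(0, len(pairs), 2):
        c0, c1 = pairs[i], pairs[i + 1]
        quad = [1.0, c1, c0]
        out = [0.0] * (len(poly) + 2)
        for j, p in enumerate(poly):        # polynomial multiplication
            for k, q in enumerate(quad):    # (discrete convolution)
                out[j + k] += p * q
        poly = out
    return [mult * c for c in poly][::-1]   # lowest order first, per the docs
```

For example, a single quadratic \(x^2 + 3x + 2\) with multiplier 1 round-trips to the coefficients [2, 3, 1] (lowest order first).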
- carma_poly2quads(poly_coeffs: tinygp.helpers.JAXArray) tinygp.helpers.JAXArray[source]
Factorize a polynomial into a product of quadratic equations
- Parameters:
poly_coeffs – Coefficients of the input characteristic polynomial. The first entry corresponds to the lowest order term.
- Returns:
The 0th and 1st order coefficients of the quadratic equations. The last entry is a multiplier, which corresponds to the coefficient of the highest order term in the full polynomial.
- carma_acvf(arroots: tinygp.helpers.JAXArray, arparam: tinygp.helpers.JAXArray, maparam: tinygp.helpers.JAXArray) tinygp.helpers.JAXArray[source]
Compute the coefficients of the autocovariance function (ACVF)
- Parameters:
arroots – The roots of the autoregressive characteristic polynomial.
arparam – \(\alpha\) parameters
maparam – \(\beta\) parameters
- Returns:
ACVF coefficients, each entry corresponds to one root.
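As an illustration of what these coefficients look like, here is a plain-Python sketch following the standard CARMA autocovariance expression in Kelly et al. (2014) (hypothetical standalone helper; the actual carma_acvf works on JAX arrays and takes the AR parameters as well):

```python
def acvf_coeffs(arroots, maparam):
    # arroots: complex roots r_k of the AR characteristic polynomial,
    # maparam: beta coefficients (sigma absorbed), lowest order first.
    # Each root r_k contributes A_k * exp(r_k * tau) to the ACVF, where A_k
    # follows the standard CARMA expression from Kelly et al. (2014).
    def beta_poly(z):
        return sum(b * z**q for q, b in enumerate(maparam))

    coeffs = []
    for k, rk in enumerate(arroots):
        num = beta_poly(rk) * beta_poly(-rk)
        den = -2.0 * rk.real
        for l, rl in enumerate(arroots):
            if l != k:
                den = den * (rl - rk) * (rl.conjugate() + rk)
        coeffs.append(num / den)
    return coeffs
```

For CAR(1) with root \(r = -a_0\) and \(\beta = [1]\), this recovers the familiar variance \(1/(2\,a_0)\).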
- _compute(alpha: tinygp.helpers.JAXArray, beta: tinygp.helpers.JAXArray, sigma: tinygp.helpers.JAXArray) tuple[tinygp.helpers.JAXArray, Ellipsis][source]
- class MultibandLowRank[source]
Bases: tinygp.kernels.quasisep.Wrapper

A multiband kernel implementing a low-rank Kronecker covariance structure.
The specific form of the cross-band Kronecker covariance matrix is given by Equation 13 of Gordon et al. (2020). The implementation is inspired by this tinygp tutorial.
- Parameters:
params – A dictionary of string and array pairs, which are used in the observation_model method to describe the cross-band covariance.
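As an illustration, the simplest (rank-1) special case of such a cross-band structure gives each band b an amplitude a_b, so the full covariance factorizes as \(k(\{t, b\}, \{t', b'\}) = a_b\,a_{b'}\,k_\mathrm{time}(t - t')\), as in the tinygp multiband tutorial (hypothetical helper names; the library builds this in O(N) via the quasiseparable machinery rather than as a dense matrix):

```python
def multiband_covariance(times, bands, amplitudes, base_kernel):
    # Dense sketch of the rank-1 cross-band structure:
    # K[i, j] = a_{band_i} * a_{band_j} * k_time(t_i - t_j).
    n = len(times)
    return [[amplitudes[bands[i]] * amplitudes[bands[j]]
             * base_kernel(times[i] - times[j])
             for j in range(n)] for i in range(n)]
```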