io.github.mandar2812.dynaml.models.gp

object AbstractGPRegressionModel

Related Docs: class AbstractGPRegressionModel | package gp

Linear Supertypes: AnyRef, Any

Value Members

  1. final def !=(arg0: Any): Boolean
     Definition Classes: AnyRef → Any
  2. final def ##(): Int
     Definition Classes: AnyRef → Any
  3. final def ==(arg0: Any): Boolean
     Definition Classes: AnyRef → Any
  4. def apply[T, I](cov: LocalScalarKernel[I], noise: LocalScalarKernel[I], basisFunc: DataPipe[I, DenseVector[Double]], basis_param_prior: MultGaussianRV)(trainingdata: T, num: Int)(implicit arg0: ClassTag[I], transform: DataPipe[T, Seq[(I, Double)]]): GPBasisFuncRegressionModel[T, I]

     Create an instance of GPBasisFuncRegressionModel for a particular data type T.

     T                  The type of the training data
     I                  The type of the input patterns in the data set of type T
     cov                The covariance function
     noise              The noise covariance function
     basisFunc          A DataPipe transforming the input features into basis function components
     basis_param_prior  A MultGaussianRV giving the prior distribution on the basis function coefficients
     trainingdata       The actual data set of type T
     transform          An implicit conversion from T to Seq[(I, Double)], represented as a DataPipe
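The basis-function variant corresponds to the standard semi-parametric GP formulation; the following is a sketch in conventional notation (the symbols h, β, b and B are illustrative, not taken from this page):

```latex
% Semi-parametric GP: explicit basis functions h(x) with Gaussian-prior
% coefficients, plus a zero-mean GP residual with covariance kernel k.
f(\mathbf{x}) = h(\mathbf{x})^{\top}\boldsymbol{\beta} + g(\mathbf{x}),
\qquad
g(\cdot) \sim \mathcal{GP}\bigl(0,\, k(\cdot,\cdot)\bigr),
\qquad
\boldsymbol{\beta} \sim \mathcal{N}(\mathbf{b},\, B)
```

Here h plays the role of basisFunc and the pair (b, B) that of basis_param_prior.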

  5. def apply[T, I](cov: LocalScalarKernel[I], noise: LocalScalarKernel[I], meanFunc: DataPipe[I, Double])(trainingdata: T, num: Int)(implicit arg0: ClassTag[I], transform: DataPipe[T, Seq[(I, Double)]]): AbstractGPRegressionModel[T, I]

     Create an instance of AbstractGPRegressionModel for a particular data type T.

     T             The type of the training data
     I             The type of the input patterns in the data set of type T
     cov           The covariance function
     noise         The noise covariance function
     meanFunc      The trend or mean function
     trainingdata  The actual data set of type T
     transform     An implicit conversion from T to Seq[(I, Double)], represented as a DataPipe
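A usage sketch for this factory method, assuming the RBFKernel and DiracKernel classes from io.github.mandar2812.dynaml.kernels and an explicitly supplied identity transform pipe (the hyper-parameter values and the toy data set are illustrative only):

```scala
import breeze.linalg.DenseVector
import io.github.mandar2812.dynaml.kernels.{DiracKernel, RBFKernel}
import io.github.mandar2812.dynaml.models.gp.AbstractGPRegressionModel
import io.github.mandar2812.dynaml.pipes.DataPipe

// Toy training set: samples of sin(2*pi*x) on [0, 1].
val data: Seq[(DenseVector[Double], Double)] =
  (1 to 50).map(i => {
    val x = i.toDouble / 50.0
    (DenseVector(x), math.sin(2.0 * math.Pi * x))
  })

// The implicit transform from T to Seq[(I, Double)]; here T is
// already the required Seq type, so an identity pipe suffices.
implicit val transform
  : DataPipe[Seq[(DenseVector[Double], Double)], Seq[(DenseVector[Double], Double)]] =
  DataPipe((s: Seq[(DenseVector[Double], Double)]) => s)

val model = AbstractGPRegressionModel(
  new RBFKernel(1.5),                        // covariance kernel
  new DiracKernel(0.09),                     // i.i.d. observation noise
  DataPipe((_: DenseVector[Double]) => 0.0)  // zero trend/mean function
)(data, data.length)
```

Note that the example depends on DynaML being on the classpath; the ClassTag[I] implicit is supplied automatically by the compiler.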

  6. def apply[M <: AbstractGPRegressionModel[Seq[(DenseVector[Double], Double)], DenseVector[Double]]](data: Seq[(DenseVector[Double], Double)], cov: LocalScalarKernel[DenseVector[Double]], noise: LocalScalarKernel[DenseVector[Double]] = new DiracKernel(1.0), order: Int = 0, ex: Int = 0, meanFunc: DataPipe[DenseVector[Double], Double] = DataPipe(_ => 0.0)): M
  7. final def asInstanceOf[T0]: T0
     Definition Classes: Any
  8. def clone(): AnyRef
     Attributes: protected[java.lang]
     Definition Classes: AnyRef
     Annotations: @throws( ... )
  9. final def eq(arg0: AnyRef): Boolean
     Definition Classes: AnyRef
  10. def equals(arg0: Any): Boolean
     Definition Classes: AnyRef → Any
  11. def finalize(): Unit
     Attributes: protected[java.lang]
     Definition Classes: AnyRef
     Annotations: @throws( classOf[java.lang.Throwable] )
  12. final def getClass(): Class[_]
     Definition Classes: AnyRef → Any
  13. def hashCode(): Int
     Definition Classes: AnyRef → Any
  14. final def isInstanceOf[T0]: Boolean
     Definition Classes: Any
  15. def logLikelihood(trainingData: PartitionedVector, kernelMatrix: PartitionedPSDMatrix): Double

     Calculate the marginal log likelihood of the training data for pre-initialized kernel and noise matrices.

     trainingData  The function values assimilated as a PartitionedVector
     kernelMatrix  The kernel matrix of the training features

  16. def logLikelihood(trainingData: DenseVector[Double], kernelMatrix: DenseMatrix[Double]): Double

     Calculate the marginal log likelihood of the training data for pre-initialized kernel and noise matrices.

     trainingData  The function values assimilated as a DenseVector
     kernelMatrix  The kernel matrix of the training features
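For a zero-mean (or mean-subtracted) Gaussian process, with y the vector of function values and K the kernel matrix (assumed here to already incorporate the noise contribution), the quantity being evaluated is, up to sign convention, the standard marginal log likelihood:

```latex
\log p(\mathbf{y} \mid X) =
  -\tfrac{1}{2}\,\mathbf{y}^{\top} K^{-1} \mathbf{y}
  \;-\; \tfrac{1}{2}\,\log \lvert K \rvert
  \;-\; \tfrac{n}{2}\,\log 2\pi
```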

  17. final def ne(arg0: AnyRef): Boolean
     Definition Classes: AnyRef
  18. final def notify(): Unit
     Definition Classes: AnyRef
  19. final def notifyAll(): Unit
     Definition Classes: AnyRef
  20. def solve(trainingLabels: PartitionedVector, trainingMean: PartitionedVector, priorMeanTest: PartitionedVector, smoothingMat: PartitionedPSDMatrix, kernelTest: PartitionedPSDMatrix, crossKernel: PartitionedMatrix): (PartitionedVector, PartitionedPSDMatrix)

     Calculate the parameters of the posterior predictive distribution for a multivariate Gaussian model.
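In conventional GP regression notation this is the Gaussian conditioning identity; a sketch, with the mapping to the parameters inferred from their names (y = trainingLabels, m = trainingMean, m_* = priorMeanTest, K = smoothingMat, K_{**} = kernelTest, K_* = crossKernel):

```latex
% Posterior predictive mean and covariance of a multivariate Gaussian,
% conditioned on the training observations.
\boldsymbol{\mu}_{*} = \mathbf{m}_{*} + K_{*}^{\top} K^{-1} \bigl(\mathbf{y} - \mathbf{m}\bigr),
\qquad
\Sigma_{*} = K_{**} - K_{*}^{\top} K^{-1} K_{*}
```

The returned pair is (μ*, Σ*).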

  21. final def synchronized[T0](arg0: ⇒ T0): T0
     Definition Classes: AnyRef
  22. def toString(): String
     Definition Classes: AnyRef → Any
  23. final def wait(): Unit
     Definition Classes: AnyRef
     Annotations: @throws( ... )
  24. final def wait(arg0: Long, arg1: Int): Unit
     Definition Classes: AnyRef
     Annotations: @throws( ... )
  25. final def wait(arg0: Long): Unit
     Definition Classes: AnyRef
     Annotations: @throws( ... )
