Trait smile.validation.Operators

Related Doc: package validation

trait Operators extends AnyRef

Model validation.

Linear Supertypes
AnyRef, Any

Value Members

  1. final def !=(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  2. final def ##(): Int

    Definition Classes
    AnyRef → Any
  3. def +(other: String): String

    Implicit information
    This member is added by an implicit conversion from Operators to any2stringadd[Operators] performed by method any2stringadd in scala.Predef.
    Definition Classes
    any2stringadd
  4. def ->[B](y: B): (Operators, B)

    Implicit information
    This member is added by an implicit conversion from Operators to ArrowAssoc[Operators] performed by method ArrowAssoc in scala.Predef.
    Definition Classes
    ArrowAssoc
    Annotations
    @inline()
  5. final def ==(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  6. final def asInstanceOf[T0]: T0

    Definition Classes
    Any
  7. def bootstrap[T <: AnyRef](x: Array[T], y: Array[Double], k: Int, measures: RegressionMeasure*)(trainer: ⇒ (Array[T], Array[Double]) ⇒ Regression[T]): Array[Double]

    Bootstrap validation on a generic regression model.

    x: data samples.
    y: response variable.
    k: k-round bootstrap estimation.
    measures: validation measures such as MSE, AbsoluteDeviation, etc.
    trainer: a code block to return a regression model trained on the given data.
    returns: measure results.

  8. def bootstrap[T <: AnyRef](x: Array[T], y: Array[Int], k: Int, measures: ClassificationMeasure*)(trainer: ⇒ (Array[T], Array[Int]) ⇒ Classifier[T]): Array[Double]

    Bootstrap validation on a generic classifier. The bootstrap is a general tool for assessing statistical accuracy. The basic idea is to randomly draw datasets with replacement from the training data, each sample the same size as the original training set. This is done many times (say k = 100), producing k bootstrap datasets. Then we refit the model to each of the bootstrap datasets and examine the behavior of the fits over the k replications.

    x: data samples.
    y: sample labels.
    k: k-round bootstrap estimation.
    measures: validation measures such as accuracy, specificity, etc.
    trainer: a code block to return a classifier trained on the given data.
    returns: measure results.
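
    For illustration, a minimal sketch of bootstrap-validating a classifier, assuming Smile 1.x's KNN.learn(x, y, k) trainer and the Accuracy measure; the toy data is a placeholder. The regression overload above is called the same way, with a Regression trainer and measures such as MSE.

      import smile.classification.KNN
      import smile.validation.{Accuracy, Operators}

      object BootstrapExample extends Operators {
        def main(args: Array[String]): Unit = {
          // Toy data; substitute a real feature matrix and label vector.
          val x = Array(Array(0.0, 0.1), Array(0.2, 0.3), Array(5.0, 5.1), Array(5.2, 5.3))
          val y = Array(0, 0, 1, 1)

          // 100 bootstrap rounds: each round trains on a resample drawn with
          // replacement and evaluates accuracy on the out-of-bag samples.
          val results = bootstrap(x, y, 100, new Accuracy) { (trainx, trainy) =>
            KNN.learn(trainx, trainy, 1) // assumed Smile 1.x trainer; any Classifier works
          }
          println(results.mkString(", "))
        }
      }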

  9. def clone(): AnyRef

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  10. def cv[T <: AnyRef](x: Array[T], y: Array[Double], k: Int, measures: RegressionMeasure*)(trainer: ⇒ (Array[T], Array[Double]) ⇒ Regression[T]): Array[Double]

    Cross validation on a generic regression model.

    x: data samples.
    y: response variable.
    k: k-fold cross validation.
    measures: validation measures such as MSE, AbsoluteDeviation, etc.
    trainer: a code block to return a regression model trained on the given data.
    returns: measure results.
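
    For illustration, a minimal sketch of 10-fold cross validation of a regression model, assuming Smile 1.x's OLS(x, y) constructor and the MSE measure; the toy data is a placeholder.

      import smile.regression.OLS
      import smile.validation.{MSE, Operators}

      object CrossValidationExample extends Operators {
        def main(args: Array[String]): Unit = {
          // Toy data; substitute a real design matrix and response vector.
          val x = Array.tabulate(20)(i => Array(i.toDouble, math.log(i + 1.0)))
          val y = Array.tabulate(20)(i => 2.0 * i + 1.0)

          // 10-fold CV: fit OLS on nine folds, compute MSE on the held-out fold,
          // and average the measure over the ten folds.
          val results = cv(x, y, 10, new MSE) { (trainx, trainy) =>
            new OLS(trainx, trainy) // assumed Smile 1.x constructor; any Regression works
          }
          println(results.mkString(", "))
        }
      }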

  11. def cv[T <: AnyRef](x: Array[T], y: Array[Int], k: Int, measures: ClassificationMeasure*)(trainer: ⇒ (Array[T], Array[Int]) ⇒ Classifier[T]): Array[Double]

    Cross validation on a generic classifier. Cross-validation is a technique for assessing how the results of a statistical analysis will generalize to an independent data set. It is mainly used in settings where the goal is prediction, and one wants to estimate how accurately a predictive model will perform in practice. One round of cross-validation involves partitioning a sample of data into complementary subsets, performing the analysis on one subset (called the training set), and validating the analysis on the other subset (called the validation set or testing set). To reduce variability, multiple rounds of cross-validation are performed using different partitions, and the validation results are averaged over the rounds.

    x: data samples.
    y: sample labels.
    k: k-fold cross validation.
    measures: validation measures such as accuracy, specificity, etc.
    trainer: a code block to return a classifier trained on the given data.
    returns: measure results.
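
    For the classification overload, a minimal sketch of 10-fold cross validation with several measures at once, again assuming Smile 1.x's KNN.learn trainer and the Accuracy and Recall measures; the toy data is a placeholder and one result is returned per measure, in order.

      import smile.classification.KNN
      import smile.validation.{Accuracy, Operators, Recall}

      object CvClassifierExample extends Operators {
        def main(args: Array[String]): Unit = {
          // Toy binary data; substitute a real feature matrix and label vector.
          val x = Array.tabulate(20)(i => Array(i.toDouble, (i % 5).toDouble))
          val y = Array.tabulate(20)(i => if (i < 10) 0 else 1)

          // 10-fold CV of a 3-NN classifier; results(0) is accuracy, results(1) is recall.
          val results = cv(x, y, 10, new Accuracy, new Recall) { (trainx, trainy) =>
            KNN.learn(trainx, trainy, 3) // assumed Smile 1.x trainer
          }
          println(results.mkString(", "))
        }
      }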

  12. def ensuring(cond: (Operators) ⇒ Boolean, msg: ⇒ Any): Operators

    Implicit information
    This member is added by an implicit conversion from Operators to Ensuring[Operators] performed by method Ensuring in scala.Predef.
    Definition Classes
    Ensuring
  13. def ensuring(cond: (Operators) ⇒ Boolean): Operators

    Implicit information
    This member is added by an implicit conversion from Operators to Ensuring[Operators] performed by method Ensuring in scala.Predef.
    Definition Classes
    Ensuring
  14. def ensuring(cond: Boolean, msg: ⇒ Any): Operators

    Implicit information
    This member is added by an implicit conversion from Operators to Ensuring[Operators] performed by method Ensuring in scala.Predef.
    Definition Classes
    Ensuring
  15. def ensuring(cond: Boolean): Operators

    Implicit information
    This member is added by an implicit conversion from Operators to Ensuring[Operators] performed by method Ensuring in scala.Predef.
    Definition Classes
    Ensuring
  16. final def eq(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  17. def equals(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  18. def finalize(): Unit

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  19. def formatted(fmtstr: String): String

    Implicit information
    This member is added by an implicit conversion from Operators to StringFormat[Operators] performed by method StringFormat in scala.Predef.
    Definition Classes
    StringFormat
    Annotations
    @inline()
  20. final def getClass(): Class[_]

    Definition Classes
    AnyRef → Any
  21. def hashCode(): Int

    Definition Classes
    AnyRef → Any
  22. final def isInstanceOf[T0]: Boolean

    Definition Classes
    Any
  23. def loocv[T <: AnyRef](x: Array[T], y: Array[Double], measures: RegressionMeasure*)(trainer: ⇒ (Array[T], Array[Double]) ⇒ Regression[T]): Array[Double]

    Leave-one-out cross validation on a generic regression model.

    x: data samples.
    y: response variable.
    measures: validation measures such as MSE, AbsoluteDeviation, etc.
    trainer: a code block to return a regression model trained on the given data.
    returns: measure results.

  24. def loocv[T <: AnyRef](x: Array[T], y: Array[Int], measures: ClassificationMeasure*)(trainer: ⇒ (Array[T], Array[Int]) ⇒ Classifier[T]): Array[Double]

    Leave-one-out cross validation on a generic classifier. LOOCV uses a single observation from the original sample as the validation data, and the remaining observations as the training data. This is repeated so that each observation in the sample is used exactly once as validation data. It is equivalent to k-fold cross-validation with k equal to the number of observations in the original sample. Leave-one-out cross-validation is usually very expensive computationally because the training process must be repeated so many times.

    x: data samples.
    y: sample labels.
    measures: validation measures such as accuracy, specificity, etc.
    trainer: a code block to return a classifier trained on the given data.
    returns: measure results.
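
    For illustration, a minimal sketch of leave-one-out cross validation of a classifier, assuming Smile 1.x's KNN.learn trainer; the toy data is a placeholder. Note there is no k argument, since the number of rounds equals the number of observations.

      import smile.classification.KNN
      import smile.validation.{Accuracy, Operators}

      object LoocvExample extends Operators {
        def main(args: Array[String]): Unit = {
          // Toy data; substitute a real feature matrix and label vector.
          val x = Array(Array(0.0, 0.1), Array(0.2, 0.3), Array(5.0, 5.1), Array(5.2, 5.3))
          val y = Array(0, 0, 1, 1)

          // One round per observation: train on n-1 samples, predict the held-out one.
          val results = loocv(x, y, new Accuracy) { (trainx, trainy) =>
            KNN.learn(trainx, trainy, 1) // assumed Smile 1.x trainer
          }
          println(results.mkString(", "))
        }
      }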

  25. final def ne(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  26. final def notify(): Unit

    Definition Classes
    AnyRef
  27. final def notifyAll(): Unit

    Definition Classes
    AnyRef
  28. final def synchronized[T0](arg0: ⇒ T0): T0

    Definition Classes
    AnyRef
  29. def test[T, C <: Classifier[T]](x: Array[T], y: Array[Int], testx: Array[T], testy: Array[Int], parTest: Boolean = true)(trainer: ⇒ (Array[T], Array[Int]) ⇒ C): C

    Test a generic classifier. The accuracy will be measured and printed on standard output.

    T: the type of training and test data.
    x: training data.
    y: training labels.
    testx: test data.
    testy: test data labels.
    parTest: test in parallel if true.
    trainer: a code block to return a classifier trained on the given data.
    returns: the trained classifier.
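
    For illustration, a minimal sketch of evaluating on a held-out test set, assuming Smile 1.x's KNN.learn trainer; the toy data is a placeholder. The accuracy is printed to standard output and the trained model is returned for further use.

      import smile.classification.KNN
      import smile.validation.Operators

      object HoldoutTestExample extends Operators {
        def main(args: Array[String]): Unit = {
          // Toy training and test splits; substitute real data.
          val x     = Array(Array(0.0, 0.1), Array(0.2, 0.3), Array(5.0, 5.1), Array(5.2, 5.3))
          val y     = Array(0, 0, 1, 1)
          val testx = Array(Array(0.1, 0.2), Array(5.1, 5.2))
          val testy = Array(0, 1)

          // Trains on (x, y), predicts testx (in parallel by default), prints
          // accuracy on (testx, testy), and returns the fitted classifier.
          val model = test(x, y, testx, testy) { (trainx, trainy) =>
            KNN.learn(trainx, trainy, 1) // assumed Smile 1.x trainer
          }
          println(model)
        }
      }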

  30. def test2[T, C <: Classifier[T]](x: Array[T], y: Array[Int], testx: Array[T], testy: Array[Int], parTest: Boolean = true)(trainer: ⇒ (Array[T], Array[Int]) ⇒ C): C

    Test a binary classifier. The accuracy, sensitivity, specificity, precision, F-1 score, F-2 score, and F-0.5 score will be measured and printed on standard output.

    T: the type of training and test data.
    x: training data.
    y: training labels.
    testx: test data.
    testy: test data labels.
    parTest: test in parallel if true.
    trainer: a code block to return a binary classifier trained on the given data.
    returns: the trained classifier.

  31. def test2soft[T, C <: SoftClassifier[T]](x: Array[T], y: Array[Int], testx: Array[T], testy: Array[Int], parTest: Boolean = true)(trainer: ⇒ (Array[T], Array[Int]) ⇒ C): C

    Test a binary soft classifier. The accuracy, sensitivity, specificity, precision, F-1 score, F-2 score, F-0.5 score, and AUC will be measured and printed on standard output.

    T: the type of training and test data.
    x: training data.
    y: training labels.
    testx: test data.
    testy: test data labels.
    parTest: test in parallel if true.
    trainer: a code block to return a binary soft classifier trained on the given data.
    returns: the trained classifier.
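
    For illustration, a minimal sketch with a soft classifier so that AUC can be computed; it assumes Smile 1.x's regularized LogisticRegression(x, y, lambda) constructor, which implements SoftClassifier. The toy data is a placeholder.

      import smile.classification.LogisticRegression
      import smile.validation.Operators

      object SoftTestExample extends Operators {
        def main(args: Array[String]): Unit = {
          // Toy binary training and test splits; substitute real data.
          val x     = Array(Array(0.0, 0.1), Array(0.3, 0.2), Array(0.9, 1.1), Array(1.2, 0.8),
                            Array(5.0, 5.1), Array(5.2, 4.9), Array(4.5, 5.5), Array(5.8, 5.2))
          val y     = Array(0, 0, 0, 0, 1, 1, 1, 1)
          val testx = Array(Array(0.5, 0.5), Array(5.5, 5.0))
          val testy = Array(0, 1)

          // Prints accuracy, sensitivity, specificity, precision, F-1/F-2/F-0.5 and AUC,
          // then returns the fitted model.
          val model = test2soft(x, y, testx, testy) { (trainx, trainy) =>
            new LogisticRegression(trainx, trainy, 0.1) // assumed Smile 1.x constructor (x, y, lambda)
          }
          println(model)
        }
      }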

  32. def toString(): String

    Definition Classes
    AnyRef → Any
  33. final def wait(): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  34. final def wait(arg0: Long, arg1: Int): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  35. final def wait(arg0: Long): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  36. def →[B](y: B): (Operators, B)

    Implicit information
    This member is added by an implicit conversion from Operators to ArrowAssoc[Operators] performed by method ArrowAssoc in scala.Predef.
    Definition Classes
    ArrowAssoc
