object TestFeatureBuilder extends Product with Serializable

Package: com.salesforce.op.test

TestFeatureBuilder is a factory for creating datasets and features for tests.

Linear Supertypes: Serializable, Serializable, Product, Equals, AnyRef, Any

Value Members

  1. final def !=(arg0: Any): Boolean
     Definition Classes: AnyRef → Any
  2. final def ##(): Int
     Definition Classes: AnyRef → Any
  3. final def ==(arg0: Any): Boolean
     Definition Classes: AnyRef → Any
  4. object DefaultFeatureNames extends Product with Serializable
  5. def apply(data: Seq[FeatureType]*)(implicit spark: SparkSession): (DataFrame, Array[Feature[_ <: FeatureType]])

    Build a dataset with an arbitrary number of features of the specified types.

    data - feature data
    spark - spark session
    returns - dataset with an arbitrary number of features of the specified types

  6. def apply[F1 <: FeatureType, F2 <: FeatureType, F3 <: FeatureType, F4 <: FeatureType, F5 <: FeatureType](data: Seq[(F1, F2, F3, F4, F5)])(implicit arg0: scala.reflect.api.JavaUniverse.TypeTag[F1], arg1: scala.reflect.api.JavaUniverse.TypeTag[F2], arg2: scala.reflect.api.JavaUniverse.TypeTag[F3], arg3: scala.reflect.api.JavaUniverse.TypeTag[F4], arg4: scala.reflect.api.JavaUniverse.TypeTag[F5], spark: SparkSession): (DataFrame, Feature[F1], Feature[F2], Feature[F3], Feature[F4], Feature[F5])

    Build a dataset with five features of the specified types.

    F1 - 1st feature type
    F2 - 2nd feature type
    F3 - 3rd feature type
    F4 - 4th feature type
    F5 - 5th feature type
    data - feature data (one tuple per row)
    spark - spark session
    returns - dataset with five features of the specified types

  7. def apply[F1 <: FeatureType, F2 <: FeatureType, F3 <: FeatureType, F4 <: FeatureType, F5 <: FeatureType](f1name: String, f2name: String, f3name: String, f4name: String, f5name: String, data: Seq[(F1, F2, F3, F4, F5)])(implicit arg0: scala.reflect.api.JavaUniverse.TypeTag[F1], arg1: scala.reflect.api.JavaUniverse.TypeTag[F2], arg2: scala.reflect.api.JavaUniverse.TypeTag[F3], arg3: scala.reflect.api.JavaUniverse.TypeTag[F4], arg4: scala.reflect.api.JavaUniverse.TypeTag[F5], spark: SparkSession): (DataFrame, Feature[F1], Feature[F2], Feature[F3], Feature[F4], Feature[F5])

    Build a dataset with five features of the specified types and names.

    F1 - 1st feature type
    F2 - 2nd feature type
    F3 - 3rd feature type
    F4 - 4th feature type
    F5 - 5th feature type
    f1name - 1st feature name
    f2name - 2nd feature name
    f3name - 3rd feature name
    f4name - 4th feature name
    f5name - 5th feature name
    data - feature data (one tuple per row)
    spark - spark session
    returns - dataset with five features of the specified types

  8. def apply[F1 <: FeatureType, F2 <: FeatureType, F3 <: FeatureType, F4 <: FeatureType](data: Seq[(F1, F2, F3, F4)])(implicit arg0: scala.reflect.api.JavaUniverse.TypeTag[F1], arg1: scala.reflect.api.JavaUniverse.TypeTag[F2], arg2: scala.reflect.api.JavaUniverse.TypeTag[F3], arg3: scala.reflect.api.JavaUniverse.TypeTag[F4], spark: SparkSession): (DataFrame, Feature[F1], Feature[F2], Feature[F3], Feature[F4])

    Build a dataset with four features of the specified types.

    F1 - 1st feature type
    F2 - 2nd feature type
    F3 - 3rd feature type
    F4 - 4th feature type
    data - feature data (one tuple per row)
    spark - spark session
    returns - dataset with four features of the specified types

  9. def apply[F1 <: FeatureType, F2 <: FeatureType, F3 <: FeatureType, F4 <: FeatureType](f1name: String, f2name: String, f3name: String, f4name: String, data: Seq[(F1, F2, F3, F4)])(implicit arg0: scala.reflect.api.JavaUniverse.TypeTag[F1], arg1: scala.reflect.api.JavaUniverse.TypeTag[F2], arg2: scala.reflect.api.JavaUniverse.TypeTag[F3], arg3: scala.reflect.api.JavaUniverse.TypeTag[F4], spark: SparkSession): (DataFrame, Feature[F1], Feature[F2], Feature[F3], Feature[F4])

    Build a dataset with four features of the specified types and names.

    F1 - 1st feature type
    F2 - 2nd feature type
    F3 - 3rd feature type
    F4 - 4th feature type
    f1name - 1st feature name
    f2name - 2nd feature name
    f3name - 3rd feature name
    f4name - 4th feature name
    data - feature data (one tuple per row)
    spark - spark session
    returns - dataset with four features of the specified types

  10. def apply[F1 <: FeatureType, F2 <: FeatureType, F3 <: FeatureType](data: Seq[(F1, F2, F3)])(implicit arg0: scala.reflect.api.JavaUniverse.TypeTag[F1], arg1: scala.reflect.api.JavaUniverse.TypeTag[F2], arg2: scala.reflect.api.JavaUniverse.TypeTag[F3], spark: SparkSession): (DataFrame, Feature[F1], Feature[F2], Feature[F3])

    Build a dataset with three features of the specified types.

    F1 - 1st feature type
    F2 - 2nd feature type
    F3 - 3rd feature type
    data - feature data (one tuple per row)
    spark - spark session
    returns - dataset with three features of the specified types

  11. def apply[F1 <: FeatureType, F2 <: FeatureType, F3 <: FeatureType](f1name: String, f2name: String, f3name: String, data: Seq[(F1, F2, F3)])(implicit arg0: scala.reflect.api.JavaUniverse.TypeTag[F1], arg1: scala.reflect.api.JavaUniverse.TypeTag[F2], arg2: scala.reflect.api.JavaUniverse.TypeTag[F3], spark: SparkSession): (DataFrame, Feature[F1], Feature[F2], Feature[F3])

    Build a dataset with three features of the specified types and names.

    F1 - 1st feature type
    F2 - 2nd feature type
    F3 - 3rd feature type
    f1name - 1st feature name
    f2name - 2nd feature name
    f3name - 3rd feature name
    data - feature data (one tuple per row)
    spark - spark session
    returns - dataset with three features of the specified types

  12. def apply[F1 <: FeatureType, F2 <: FeatureType](data: Seq[(F1, F2)])(implicit arg0: scala.reflect.api.JavaUniverse.TypeTag[F1], arg1: scala.reflect.api.JavaUniverse.TypeTag[F2], spark: SparkSession): (DataFrame, Feature[F1], Feature[F2])

    Build a dataset with two features of the specified types.

    F1 - 1st feature type
    F2 - 2nd feature type
    data - feature data (one tuple per row)
    spark - spark session
    returns - dataset with two features of the specified types
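    As a usage sketch for the tuple-based overload above: the example below assumes a local SparkSession (in TransmogrifAI tests one is usually provided by a shared test trait) and the feature value constructors from com.salesforce.op.features.types.

    ```scala
    import com.salesforce.op.features.types._
    import com.salesforce.op.test.TestFeatureBuilder
    import org.apache.spark.sql.SparkSession

    // Local session for illustration only.
    implicit val spark: SparkSession =
      SparkSession.builder().master("local[*]").appName("TestFeatureBuilderExample").getOrCreate()

    // Each tuple is one row; feature names fall back to the defaults
    // in the DefaultFeatureNames object.
    val (df, f1, f2) = TestFeatureBuilder[Real, Text](
      Seq(
        Real(1.8) -> Text("tall"),
        Real(1.6) -> Text("short")
      )
    )
    df.show() // two rows, one column per feature
    ```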

  13. def apply[F1 <: FeatureType, F2 <: FeatureType](f1name: String, f2name: String, data: Seq[(F1, F2)])(implicit arg0: scala.reflect.api.JavaUniverse.TypeTag[F1], arg1: scala.reflect.api.JavaUniverse.TypeTag[F2], spark: SparkSession): (DataFrame, Feature[F1], Feature[F2])

    Build a dataset with two features of the specified types and names.

    F1 - 1st feature type
    F2 - 2nd feature type
    f1name - 1st feature name
    f2name - 2nd feature name
    data - feature data (one tuple per row)
    spark - spark session
    returns - dataset with two features of the specified types

  14. def apply[F1 <: FeatureType](data: Seq[F1])(implicit arg0: scala.reflect.api.JavaUniverse.TypeTag[F1], spark: SparkSession): (DataFrame, Feature[F1])

    Build a dataset with one feature of the specified type.

    F1 - feature type
    data - feature data (one value per row)
    spark - spark session
    returns - dataset with one feature of the specified type

  15. def apply[F1 <: FeatureType](f1name: String, data: Seq[F1])(implicit arg0: scala.reflect.api.JavaUniverse.TypeTag[F1], spark: SparkSession): (DataFrame, Feature[F1])

    Build a dataset with one feature of the specified type and name.

    F1 - feature type
    f1name - feature name
    data - feature data (one value per row)
    spark - spark session
    returns - dataset with one feature of the specified type
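    A minimal sketch of the named single-feature overload; an implicit SparkSession is assumed to be in scope, and the RealNN value constructor is the standard TransmogrifAI feature-type factory.

    ```scala
    import com.salesforce.op.features.types._
    import com.salesforce.op.test.TestFeatureBuilder

    // One value per row; the feature carries the explicit name "label".
    val (ds, label) =
      TestFeatureBuilder[RealNN]("label", Seq(RealNN(1.0), RealNN(0.0), RealNN(1.0)))
    // 'label' is a Feature[RealNN]; 'ds' is a DataFrame with a single column.
    ```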

  16. final def asInstanceOf[T0]: T0
      Definition Classes: Any
  17. def clone(): AnyRef
      Attributes: protected[java.lang]
      Definition Classes: AnyRef
      Annotations: @throws( ... )
  18. final def eq(arg0: AnyRef): Boolean
      Definition Classes: AnyRef
  19. def equals(arg0: Any): Boolean
      Definition Classes: AnyRef → Any
  20. def finalize(): Unit
      Attributes: protected[java.lang]
      Definition Classes: AnyRef
      Annotations: @throws( classOf[java.lang.Throwable] )
  21. final def getClass(): Class[_]
      Definition Classes: AnyRef → Any
  22. final def isInstanceOf[T0]: Boolean
      Definition Classes: Any
  23. final def ne(arg0: AnyRef): Boolean
      Definition Classes: AnyRef
  24. final def notify(): Unit
      Definition Classes: AnyRef
  25. final def notifyAll(): Unit
      Definition Classes: AnyRef
  26. def random(numOfRows: Int = 10)(vectors: ⇒ Seq[OPVector] = ..., textLists: ⇒ Seq[TextList] = ..., dateLists: ⇒ Seq[DateList] = ..., dateTimeLists: ⇒ Seq[DateList] = ..., geoLocations: ⇒ Seq[Geolocation] = ..., base64Maps: ⇒ Seq[Base64Map] = ..., binaryMaps: ⇒ Seq[BinaryMap] = ..., comboBoxMaps: ⇒ Seq[ComboBoxMap] = ..., currencyMaps: ⇒ Seq[CurrencyMap] = ..., dateMaps: ⇒ Seq[DateMap] = ..., dateTimeMaps: ⇒ Seq[DateTimeMap] = ..., emailMaps: ⇒ Seq[EmailMap] = ..., idMaps: ⇒ Seq[IDMap] = ..., integralMaps: ⇒ Seq[IntegralMap] = ..., multiPickListMaps: ⇒ Seq[MultiPickListMap] = ..., percentMaps: ⇒ Seq[PercentMap] = ..., phoneMaps: ⇒ Seq[PhoneMap] = ..., pickListMaps: ⇒ Seq[PickListMap] = ..., realMaps: ⇒ Seq[RealMap] = ..., textAreaMaps: ⇒ Seq[TextAreaMap] = ..., textMaps: ⇒ Seq[TextMap] = ..., urlMaps: ⇒ Seq[URLMap] = ..., countryMaps: ⇒ Seq[CountryMap] = ..., stateMaps: ⇒ Seq[StateMap] = ..., cityMaps: ⇒ Seq[CityMap] = ..., postalCodeMaps: ⇒ Seq[PostalCodeMap] = ..., streetMaps: ⇒ Seq[StreetMap] = ..., nameStats: ⇒ Seq[NameStats] = ..., geoLocationMaps: ⇒ Seq[GeolocationMap] = ..., binaries: ⇒ Seq[Binary] = RandomBinary(0.5).limit(numOfRows), currencies: ⇒ Seq[Currency] = ..., dates: ⇒ Seq[Date] = ..., dateTimes: ⇒ Seq[DateTime] = ..., integrals: ⇒ Seq[Integral] = ..., percents: ⇒ Seq[Percent] = ..., reals: ⇒ Seq[Real] = ..., realNNs: ⇒ Seq[RealNN] = ..., multiPickLists: ⇒ Seq[MultiPickList] = ..., base64s: ⇒ Seq[Base64] = ..., comboBoxes: ⇒ Seq[ComboBox] = ..., emails: ⇒ Seq[Email] = ..., ids: ⇒ Seq[ID] = RandomText.ids.limit(numOfRows), phones: ⇒ Seq[Phone] = RandomText.phones.limit(numOfRows), pickLists: ⇒ Seq[PickList] = ..., texts: ⇒ Seq[Text] = ..., textAreas: ⇒ Seq[TextArea] = ..., urls: ⇒ Seq[URL] = RandomText.urls.limit(numOfRows), countries: ⇒ Seq[Country] = ..., states: ⇒ Seq[State] = RandomText.states.limit(numOfRows), cities: ⇒ Seq[City] = RandomText.cities.limit(numOfRows), postalCodes: ⇒ Seq[PostalCode] = ..., streets: ⇒ Seq[Street] = 
RandomText.streets.limit(numOfRows))(implicit spark: SparkSession): (DataFrame, Array[Feature[_ <: FeatureType]])

    Build a dataset with random features of the specified size. Every generator
    argument in the second parameter list has a default, so only numOfRows needs
    to be provided.

    numOfRows - number of rows to generate (must be positive)
    spark - spark session
    returns - dataset with random features of the specified size
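    Since all generator arguments are defaulted, a minimal call only picks the row count (sketch; assumes an implicit SparkSession in scope):

    ```scala
    import com.salesforce.op.test.TestFeatureBuilder

    // Generates 25 rows of random values for every supported feature type,
    // returning the DataFrame and the array of corresponding features.
    val (randomDf, randomFeatures) = TestFeatureBuilder.random(numOfRows = 25)()
    randomDf.show(5)
    ```

    Individual generators can be overridden by name in the second parameter list while the rest keep their defaults.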

  27. final def synchronized[T0](arg0: ⇒ T0): T0
      Definition Classes: AnyRef
  28. final def wait(): Unit
      Definition Classes: AnyRef
      Annotations: @throws( ... )
  29. final def wait(arg0: Long, arg1: Int): Unit
      Definition Classes: AnyRef
      Annotations: @throws( ... )
  30. final def wait(arg0: Long): Unit
      Definition Classes: AnyRef
      Annotations: @throws( ... )
