frameless
package frameless
Type Members
- trait CatalystAverageable[In, Out] extends AnyRef
When averaging, Spark doesn't change these types:
- BigDecimal -> BigDecimal
- Double -> Double
But it converts these types:
- Int -> Double
- Short -> Double
- Long -> Double
- Annotations
- @implicitNotFound()
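The averaging conversions above can be encoded as a typeclass whose implicit instances are the only admitted (In, Out) pairs. A minimal sketch, with illustrative names rather than frameless' actual implementation:

```scala
// Sketch of a CatalystAverageable-style typeclass: an implicit instance
// exists only for the (In, Out) pairs Spark's avg actually produces.
// Names here are illustrative, not frameless' real code.
trait Averageable[In, Out]

object Averageable {
  private def instance[In, Out]: Averageable[In, Out] = new Averageable[In, Out] {}

  // Types avg leaves unchanged
  implicit val bigDecimalAvg: Averageable[BigDecimal, BigDecimal] = instance
  implicit val doubleAvg: Averageable[Double, Double] = instance

  // Types avg widens to Double
  implicit val intAvg: Averageable[Int, Double] = instance
  implicit val shortAvg: Averageable[Short, Double] = instance
  implicit val longAvg: Averageable[Long, Double] = instance
}

// The result type of an average is now checked at compile time:
def avgOutput[In, Out](xs: Seq[In])(implicit ev: Averageable[In, Out]): Unit = ()

avgOutput[Int, Double](Seq(1, 2, 3)) // compiles: Int averages to Double
// avgOutput[Int, Int](Seq(1, 2, 3)) // would not compile: no such instance
```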
- trait CatalystBitShift[In, Out] extends AnyRef
Spark does not always return Int on shift operations.
- Annotations
- @implicitNotFound()
- trait CatalystBitwise[A] extends CatalystNumeric[A]
Types that can be bitwise ORed, ANDed, or XORed by Catalyst. Note that Catalyst requires both columns in a bitwise operation to have the same type, so in some cases casting is necessary.
- Annotations
- @implicitNotFound()
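The casting note above can be seen in value terms: before ORing an Int with a Long, the Int side is widened so both operands share a type. A small sketch:

```scala
// Catalyst requires both sides of a bitwise operation between columns
// to have the same type; mirroring that at the value level, widen
// Int -> Long before the OR.
val intBits: Int = 0x0f
val longBits: Long = 0xf0L
val ored: Long = intBits.toLong | longBits // explicit cast, then bitwise OR
assert(ored == 0xffL)
```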
- trait CatalystCast[A, B] extends AnyRef
- trait CatalystCollection[C[_]] extends AnyRef
- Annotations
- @implicitNotFound()
- trait CatalystDivisible[In, Out] extends AnyRef
Spark divides everything as Double, except BigDecimals, which are divided into another BigDecimal, benefiting from added precision.
- Annotations
- @implicitNotFound()
- trait CatalystIsin[A] extends AnyRef
Types for which membership can be checked with isin.
- Annotations
- @implicitNotFound()
- trait CatalystNaN[A] extends AnyRef
Spark performs NaN checks only for these types.
- Annotations
- @implicitNotFound()
- trait CatalystNullable[A] extends AnyRef
- Annotations
- @implicitNotFound()
- trait CatalystNumeric[A] extends AnyRef
Types that can be added, subtracted and multiplied by Catalyst.
- Annotations
- @implicitNotFound()
- trait CatalystNumericWithJavaBigDecimal[In, Out] extends AnyRef
Spark does not always return the same type as the input; abs, for example.
- Annotations
- @implicitNotFound()
- trait CatalystOrdered[A] extends AnyRef
Types that can be ordered/compared by Catalyst.
- Annotations
- @implicitNotFound()
- trait CatalystPivotable[A] extends AnyRef
- Annotations
- @implicitNotFound()
- trait CatalystRound[In, Out] extends AnyRef
Spark does not always return Long on round.
- Annotations
- @implicitNotFound()
- trait CatalystSummable[In, Out] extends AnyRef
When summing, Spark doesn't change these types:
- Long -> Long
- BigDecimal -> BigDecimal
- Double -> Double
For other types there are conversions:
- Int -> Long
- Short -> Long
- Annotations
- @implicitNotFound()
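The Int -> Long widening above mirrors what one would do by hand to avoid overflow. A plain-Scala sketch:

```scala
// Summing Ints into a Long accumulator, as Spark's sum does for Int
// columns, so the total cannot overflow the 32-bit range.
val xs: Seq[Int] = Seq(Int.MaxValue, 1)
val total: Long = xs.foldLeft(0L)(_ + _) // each Int widens into the Long sum
assert(total == 2147483648L) // one past Int.MaxValue, representable in Long
```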
- trait CatalystVariance[A] extends AnyRef
Spark's variance and stddev functions always return Double
- Annotations
- @implicitNotFound()
- trait Injection[A, B] extends Serializable
An Injection[A, B] is a reversible function from A to B.
Must obey forAll { a: A => invert(apply(a)) == a }.
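A minimal sketch of the law with a hand-rolled instance; the trait shape follows the signature implied above, and the Suit type is purely illustrative:

```scala
// A reversible mapping between A and B; the round-trip law
// invert(apply(a)) == a must hold for every a.
trait Injection[A, B] extends Serializable {
  def apply(a: A): B
  def invert(b: B): A
}

// Example: store a small ADT as an Int
sealed trait Suit
case object Hearts extends Suit
case object Spades extends Suit

val suitToInt: Injection[Suit, Int] = new Injection[Suit, Int] {
  def apply(a: Suit): Int = a match {
    case Hearts => 0
    case Spades => 1
  }
  def invert(b: Int): Suit = b match {
    case 0 => Hearts
    case 1 => Spades
  }
}

// The law holds for every value:
assert(suitToInt.invert(suitToInt(Hearts)) == Hearts)
assert(suitToInt.invert(suitToInt(Spades)) == Spades)
```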
- trait NotCatalystNullable[A] extends AnyRef
- Annotations
- @implicitNotFound()
- case class SQLDate(days: Int) extends Product with Serializable
Type for the internal Spark representation of SQL date. If the spark.sql.functions were typed, [date_add][1] would for instance be defined as def date_add(d: SQLDate, i: Int): SQLDate.
[1]: https://spark.apache.org/docs/2.0.2/api/java/org/apache/spark/sql/functions.html#add_months(org.apache.spark.sql.Column,%20int)
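The days field counts days since the Unix epoch (1970-01-01), matching Spark's internal DateType representation. A quick sanity check using java.time:

```scala
import java.time.LocalDate
import java.time.temporal.ChronoUnit

// SQLDate wraps days since the Unix epoch as an Int (shape as documented above).
case class SQLDate(days: Int)

val epoch = LocalDate.of(1970, 1, 1)
val date  = LocalDate.of(2020, 1, 1)
val sqlDate = SQLDate(ChronoUnit.DAYS.between(epoch, date).toInt)
assert(sqlDate.days == 18262) // 2020-01-01 is 18262 days after the epoch
```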
- case class SQLTimestamp(us: Long) extends Product with Serializable
Type for the Spark internal representation of a timestamp. If the spark.sql.functions were typed, [current_timestamp][1] would for instance be defined as def current_timestamp(): SQLTimestamp.
[1]: https://spark.apache.org/docs/1.6.2/api/java/org/apache/spark/sql/functions.html#current_timestamp()
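The us field counts microseconds since the Unix epoch, matching the resolution of Spark's internal TimestampType. Converting from epoch milliseconds:

```scala
// SQLTimestamp wraps microseconds since the Unix epoch as a Long
// (shape as documented above).
case class SQLTimestamp(us: Long)

val epochMillis = 1577836800000L // 2020-01-01T00:00:00Z in milliseconds
val ts = SQLTimestamp(epochMillis * 1000L) // milliseconds -> microseconds
assert(ts.us == 1577836800000000L)
```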
Value Members
- object CatalystAverageable
- object CatalystBitShift
- object CatalystBitwise
- object CatalystCast
- object CatalystCollection
- object CatalystDivisible
- object CatalystIsin
- object CatalystNaN
- object CatalystNullable
- object CatalystNumeric
- object CatalystNumericWithJavaBigDecimal
- object CatalystOrdered
- object CatalystPivotable
- object CatalystRound
- object CatalystSummable
- object CatalystVariance
- object Injection extends Serializable
- object NotCatalystNullable