frameless

AbstractTypedColumn

abstract class AbstractTypedColumn[T, U] extends UntypedExpression[T]

Generic representation of a typed column. A typed column can either be a TypedAggregate or a frameless.TypedColumn.

Documentation marked "apache/spark" is thanks to apache/spark Contributors at https://github.com/apache/spark, licensed under Apache v2.0 available at http://www.apache.org/licenses/LICENSE-2.0

T

phantom type representing the dataset on which this column is selected. When T = A with B the selection is on either A or B.

U

type of the column

Self Type
AbstractTypedColumn[T, U]
Linear Supertypes
UntypedExpression[T], AnyRef, Any

Instance Constructors

  1. new AbstractTypedColumn(expr: Expression)(implicit uencoder: TypedEncoder[U])

Type Members

  1. trait Mapper[X] extends AnyRef

    A helper class that simplifies working with Optional fields.

    val x: TypedColumn[Option[Int]] = _
    x.opt.map(_ * 2) // This only compiles if the type of x is Option[X] (in this example X is of type Int)
    Note

    Known issue: map() will NOT work when the applied function is a udf(). It will compile and then throw a runtime error.
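    A minimal sketch of the intended usage, assuming a hypothetical case class Event(id: Int, score: Option[Int]) and an implicit SparkSession in scope (neither is part of this page):

    ```scala
    import frameless.TypedDataset
    import frameless.syntax._

    case class Event(id: Int, score: Option[Int])

    val events = TypedDataset.create(Seq(Event(1, Some(10)), Event(2, None)))

    // opt.map compiles here only because 'score is an Option[Int];
    // rows where score is None remain None in the result.
    val doubled = events.select(events('score).opt.map(_ * 2))
    ```

    Per the note above, passing a udf() to map here would compile but fail at runtime.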

  2. abstract type ThisType[A, B] <: AbstractTypedColumn[A, B]

Abstract Value Members

  1. abstract def lit[U1](c: U1)(implicit arg0: TypedEncoder[U1]): ThisType[T, U1]

    Creates a typed column of either TypedColumn or TypedAggregate.

  2. abstract def typed[W, U1](c: Column)(implicit arg0: TypedEncoder[U1]): ThisType[W, U1]

    Creates a typed column of either TypedColumn or TypedAggregate.

Concrete Value Members

  1. final def !=(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  2. final def ##: Int
    Definition Classes
    AnyRef → Any
  3. def %(u: U)(implicit n: CatalystNumeric[U]): ThisType[T, U]

    Modulo (a.k.a. remainder) expression.

    apache/spark

  4. def %[TT, W](other: ThisType[TT, U])(implicit n: CatalystNumeric[U], w: With.Aux[T, TT, W]): ThisType[W, U]

    Modulo (a.k.a. remainder) expression.

    apache/spark

  5. def &[TT, W](other: ThisType[TT, U])(implicit n: CatalystBitwise[U], w: With.Aux[T, TT, W]): ThisType[W, U]

    Bitwise AND this expression and another expression.

    df.select(df.col('colA) & (df.col('colB)))
    other

    another column of the same type apache/spark

  6. def &(u: U)(implicit n: CatalystBitwise[U]): ThisType[T, U]

    Bitwise AND this expression and a constant (of the same type).

    df.select(df.col('colA).cast[Int] & -1)
    u

    a constant of the same type apache/spark

  7. def &&[TT, W](other: ThisType[TT, Boolean])(implicit w: With.Aux[T, TT, W]): ThisType[W, Boolean]

    Boolean AND.

    df.filter ( df.col('a) === 1 && df.col('b) > 5)
  8. def *(u: U)(implicit n: CatalystNumeric[U]): ThisType[T, U]

    Multiplication of this expression and a constant.

    // The following multiplies a person's height by 2.
    people.select( people.col('height) * 2 )

    apache/spark

  9. def *[TT, W](other: ThisType[TT, U])(implicit n: CatalystNumeric[U], w: With.Aux[T, TT, W], t: ClassTag[U]): ThisType[W, U]

    Multiplication of this expression and another expression.

    // The following multiplies a person's height by their weight.
    people.select( people.col('height) * people.col('weight) )

    apache/spark

  10. def +(u: U)(implicit n: CatalystNumeric[U]): ThisType[T, U]

    Sum of this expression (column) with a constant.

    // The following adds 2 to a person's height.
    people.select( people('height) + 2 )
    u

    a constant of the same type apache/spark

  11. def +[TT, W](other: ThisType[TT, U])(implicit n: CatalystNumeric[U], w: With.Aux[T, TT, W]): ThisType[W, U]

    Sum of this expression and another expression.

    // The following selects the sum of a person's height and weight.
    people.select( people.col('height) + people.col('weight) )

    apache/spark

  12. def -(u: U)(implicit n: CatalystNumeric[U]): ThisType[T, U]

    Subtraction. Subtracts a constant from this expression.

    // The following subtracts 1 from a person's height.
    people.select( people('height) - 1 )
    u

    a constant of the same type apache/spark

  13. def -[TT, W](other: ThisType[TT, U])(implicit n: CatalystNumeric[U], w: With.Aux[T, TT, W]): ThisType[W, U]

    Subtraction. Subtract the other expression from this expression.

    // The following selects the difference between people's height and their weight.
    people.select( people.col('height) - people.col('weight) )

    apache/spark

  14. def /(u: U)(implicit n: CatalystNumeric[U]): ThisType[T, Double]

    Division of this expression by a constant.

    // The following divides a person's height by 2.
    people.select( people('height) / 2 )
    u

    a constant of the same type apache/spark

  15. def /[Out, TT, W](other: ThisType[TT, U])(implicit n: CatalystDivisible[U, Out], e: TypedEncoder[Out], w: With.Aux[T, TT, W]): ThisType[W, Out]

    Division of this expression by another expression.

    // The following divides a person's height by their weight.
    people.select( people('height) / people('weight) )
    other

    another column of the same type apache/spark

  16. def <(u: U)(implicit i0: CatalystOrdered[U]): ThisType[T, Boolean]

    Less than.

    // The following selects people younger than 21.
    df.select( df('age) < 21 )
    u

    a constant of the same type apache/spark

  17. def <[TT, W](other: ThisType[TT, U])(implicit i0: CatalystOrdered[U], w: With.Aux[T, TT, W]): ThisType[W, Boolean]

    Less than.

    // The following selects people younger than the maxAge column.
    df.select( df('age) < df('maxAge) )
    other

    another column of the same type apache/spark

  18. def <=(u: U)(implicit i0: CatalystOrdered[U]): ThisType[T, Boolean]

    Less than or equal to.

    // The following selects people aged 21 or younger.
    df.select( df('age) <= 21 )
    u

    a constant of the same type apache/spark

  19. def <=[TT, W](other: ThisType[TT, U])(implicit i0: CatalystOrdered[U], w: With.Aux[T, TT, W]): ThisType[W, Boolean]

    Less than or equal to.

    // The following selects people no older than the maxAge column.
    df.select( df('age) <= df('maxAge) )
    other

    another column of the same type apache/spark

  20. def =!=(u: U): ThisType[T, Boolean]

    Inequality test.

    df.filter( df.col('a) =!= "a" )

    apache/spark

  21. def =!=[TT, W](other: ThisType[TT, U])(implicit w: With.Aux[T, TT, W]): ThisType[W, Boolean]

    Inequality test.

    df.filter( df.col('a) =!= df.col('b) )

    apache/spark

  22. final def ==(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  23. def ===[TT, W](other: ThisType[TT, U])(implicit w: With.Aux[T, TT, W]): ThisType[W, Boolean]

    Equality test.

    df.filter( df.col('a) === df.col('b) )

    apache/spark

  24. def ===(u: U): ThisType[T, Boolean]

    Equality test.

    df.filter( df.col('a) === 1 )

    apache/spark

  25. def >(u: U)(implicit i0: CatalystOrdered[U]): ThisType[T, Boolean]

    Greater than.

    // The following selects people older than 21.
    df.select( df('age) > 21 )
    u

    a constant of the same type apache/spark

  26. def >[TT, W](other: ThisType[TT, U])(implicit i0: CatalystOrdered[U], w: With.Aux[T, TT, W]): ThisType[W, Boolean]

    Greater than.

    // The following selects people older than the maxAge column.
    df.select( df('age) > df('maxAge) )
    other

    another column of the same type apache/spark

  27. def >=(u: U)(implicit i0: CatalystOrdered[U]): ThisType[T, Boolean]

    Greater than or equal.

    // The following selects people aged 21 or older.
    df.select( df('age) >= 21 )
    u

    a constant of the same type apache/spark

  28. def >=[TT, W](other: ThisType[TT, U])(implicit i0: CatalystOrdered[U], w: With.Aux[T, TT, W]): ThisType[W, Boolean]

    Greater than or equal.

    // The following selects people at least as old as the maxAge column.
    df.select( df('age) >= df('maxAge) )
    other

    another column of the same type apache/spark

  29. def ^[TT, W](other: ThisType[TT, U])(implicit n: CatalystBitwise[U], w: With.Aux[T, TT, W]): ThisType[W, U]

    Bitwise XOR this expression and another expression.

    df.select(df.col('colA) ^ (df.col('colB)))
    other

    another column of the same type apache/spark

  30. def ^(u: U)(implicit n: CatalystBitwise[U]): ThisType[T, U]

    Bitwise XOR this expression and a constant (of the same type).

    df.select(df.col('colA).cast[Long] ^ 1L)
    u

    a constant of the same type apache/spark

  31. def and[TT, W](other: ThisType[TT, Boolean])(implicit w: With.Aux[T, TT, W]): ThisType[W, Boolean]

    Boolean AND.

    df.filter ( (df.col('a) === 1).and(df.col('b) > 5) )
  32. final def asInstanceOf[T0]: T0
    Definition Classes
    Any
  33. def asc(implicit catalystOrdered: CatalystOrdered[U]): SortedTypedColumn[T, U]

    Returns an ascending ordering used in sorting.

    apache/spark

  34. def between[TT1, TT2, W1, W2](lowerBound: ThisType[TT1, U], upperBound: ThisType[TT2, U])(implicit i0: CatalystOrdered[U], w0: With.Aux[T, TT1, W1], w1: With.Aux[TT2, W1, W2]): ThisType[W2, Boolean]

    True if the current column is between the lower bound and upper bound, inclusive.

    lowerBound

    another column of the same type

    upperBound

    another column of the same type apache/spark

  35. def between(lowerBound: U, upperBound: U)(implicit i0: CatalystOrdered[U]): ThisType[T, Boolean]

    True if the current column is between the lower bound and upper bound, inclusive.

    lowerBound

    a constant of the same type

    upperBound

    a constant of the same type apache/spark
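    A hedged sketch of both between overloads, assuming a hypothetical dataset df with age, minAge, and maxAge columns (column names are illustrative, not from this page):

    ```scala
    // Constant bounds: true for people aged between 18 and 65, inclusive.
    df.select( df('age).between(18, 65) )

    // Column bounds: true where age lies between the minAge and maxAge columns.
    df.select( df('age).between(df('minAge), df('maxAge)) )
    ```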

  36. def bitwiseAND[TT, W](other: ThisType[TT, U])(implicit n: CatalystBitwise[U], w: With.Aux[T, TT, W]): ThisType[W, U]

    Bitwise AND this expression and another expression.

    df.select(df.col('colA) bitwiseAND (df.col('colB)))
  37. def bitwiseAND(u: U)(implicit n: CatalystBitwise[U]): ThisType[T, U]

    Bitwise AND this expression and a constant of the same type.

    df.select(df.col('colA).cast[Int] bitwiseAND -1)
    u

    a constant of the same type apache/spark

  38. def bitwiseOR[TT, W](other: ThisType[TT, U])(implicit n: CatalystBitwise[U], w: With.Aux[T, TT, W]): ThisType[W, U]

    Bitwise OR this expression and another expression.

    df.select(df.col('colA) bitwiseOR (df.col('colB)))
    other

    another column of the same type apache/spark

  39. def bitwiseOR(u: U)(implicit n: CatalystBitwise[U]): ThisType[T, U]

    Bitwise OR this expression and a constant of the same type.

    df.select(df.col('colA).cast[Long] bitwiseOR 1L)
    u

    a constant of the same type apache/spark

  40. def bitwiseXOR[TT, W](other: ThisType[TT, U])(implicit n: CatalystBitwise[U], w: With.Aux[T, TT, W]): ThisType[W, U]

    Bitwise XOR this expression and another expression.

    df.select(df.col('colA) bitwiseXOR (df.col('colB)))
    other

    another column of the same type apache/spark

  41. def bitwiseXOR(u: U)(implicit n: CatalystBitwise[U]): ThisType[T, U]

    Bitwise XOR this expression and a constant of the same type.

    df.select(df.col('colA).cast[Long] bitwiseXOR 1L)
    u

    a constant of the same type apache/spark

  42. def cast[A](implicit arg0: TypedEncoder[A], c: CatalystCast[U, A]): ThisType[T, A]

    Casts the column to a different type.

    df.select(df('a).cast[Int])
  43. def clone(): AnyRef
    Attributes
    protected[lang]
    Definition Classes
    AnyRef
    Annotations
    @throws(classOf[java.lang.CloneNotSupportedException]) @native()
  44. def contains[TT, W](other: ThisType[TT, U])(implicit ev: =:=[U, String], w: With.Aux[T, TT, W]): ThisType[W, Boolean]

    String contains.

    df.filter ( df.col('a).contains(df.col('b)) )
    other

    a column whose values are used as the string being tested against. apache/spark

  45. def contains(other: String)(implicit ev: =:=[U, String]): ThisType[T, Boolean]

    String contains another string literal.

    df.filter ( df.col('a).contains("foo") )
    other

    a string that is being tested against. apache/spark

  46. def desc(implicit catalystOrdered: CatalystOrdered[U]): SortedTypedColumn[T, U]

    Returns a descending ordering used in sorting.

    apache/spark
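    The orderings returned by asc and desc plug into a dataset's sorting methods. A sketch, assuming a hypothetical people dataset with an age column (illustrative names, not from this page):

    ```scala
    // Youngest first.
    people.orderBy(people('age).asc)

    // Oldest first.
    people.orderBy(people('age).desc)
    ```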

  47. def divide[Out, TT, W](other: ThisType[TT, U])(implicit arg0: TypedEncoder[Out], n: CatalystDivisible[U, Out], w: With.Aux[T, TT, W]): ThisType[W, Out]

    Division of this expression by another expression.

    // The following divides a person's height by their weight.
    people.select( people('height) / people('weight) )
    other

    another column of the same type apache/spark

  48. def endsWith[TT, W](other: ThisType[TT, U])(implicit ev: =:=[U, String], w: With.Aux[T, TT, W]): ThisType[W, Boolean]

    String ends with.

    df.filter ( df.col('a).endsWith(df.col('b)) )
    other

    a column whose values are used as the suffix being tested against. apache/spark

  49. def endsWith(other: String)(implicit ev: =:=[U, String]): ThisType[T, Boolean]

    String ends with another string literal.

    df.filter ( df.col('a).endsWith("foo") )
    other

    a suffix that is being tested against. apache/spark

  50. final def eq(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  51. def equals(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef → Any
  52. val expr: Expression
  53. def field[V](symbol: Lt[Symbol])(implicit i0: Exists[U, (symbol)#T, V], i1: TypedEncoder[V]): ThisType[T, V]

    Returns a nested column matching the field symbol.

    V

    the type of the nested field

    symbol

    the field symbol
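    A sketch of selecting a nested field, with a hypothetical nested case class (not part of this page):

    ```scala
    case class Address(city: String, zip: String)
    case class Person(name: String, address: Address)

    // Assumes ds: TypedDataset[Person].
    // Selects the nested city field as a column of type String.
    ds.select(ds('address).field('city))
    ```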

  54. def finalize(): Unit
    Attributes
    protected[lang]
    Definition Classes
    AnyRef
    Annotations
    @throws(classOf[java.lang.Throwable])
  55. final def getClass(): Class[_ <: AnyRef]
    Definition Classes
    AnyRef → Any
    Annotations
    @native()
  56. def getOrElse[Out](default: Out)(implicit arg0: TypedEncoder[Out], i0: =:=[U, Option[Out]]): ThisType[T, Out]

    Convert an Optional column by providing a default value.

    df( df('opt).getOrElse(defaultConstant) )
  57. def getOrElse[TT, W, Out](default: ThisType[TT, Out])(implicit i0: =:=[U, Option[Out]], i1: With.Aux[T, TT, W]): ThisType[W, Out]

    Convert an Optional column by providing a default value.

    df( df('opt).getOrElse(df('defaultValue)) )
  58. def hashCode(): Int
    Definition Classes
    AnyRef → Any
    Annotations
    @native()
  59. final def isInstanceOf[T0]: Boolean
    Definition Classes
    Any
  60. def isNaN(implicit n: CatalystNaN[U]): ThisType[T, Boolean]

    True if the current expression is a fractional number and is NaN.

    apache/spark

  61. def isNone(implicit i0: <:<[U, Option[_]]): ThisType[T, Boolean]

    True if the current expression is an Option and it's None.

    apache/spark

  62. def isNotNone(implicit i0: <:<[U, Option[_]]): ThisType[T, Boolean]

    True if the current expression is an Option and it's not None.

    apache/spark
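    A sketch combining the two predicates, assuming a hypothetical dataset df with an Optional column 'score (illustrative, not from this page):

    ```scala
    // Rows where the optional score is missing.
    df.filter( df('score).isNone )

    // Rows where the optional score is present.
    df.filter( df('score).isNotNone )
    ```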

  63. def isin(values: U*)(implicit e: CatalystIsin[U]): ThisType[T, Boolean]

    Returns true if the value of this column is contained in the arguments.

    // The following selects people with age 15, 20, or 30.
    df.select( df('age).isin(15, 20, 30) )
    values

    are constants of the same type apache/spark

  64. def like(literal: String)(implicit ev: =:=[U, String]): ThisType[T, Boolean]

    SQL like expression. Returns a boolean column based on a SQL LIKE match.

    val ds = TypedDataset.create(X2("foo", "bar") :: Nil)
    // true
    ds.select(ds('a).like("foo"))
    
    // Selected column has value "bar"
    ds.select(when(ds('a).like("f"), ds('a)).otherwise(ds('b)))

    apache/spark

  65. def minus[TT, W](other: ThisType[TT, U])(implicit n: CatalystNumeric[U], w: With.Aux[T, TT, W]): ThisType[W, U]

    Subtraction. Subtract the other expression from this expression.

    // The following selects the difference between people's height and their weight.
    people.select( people.col('height) minus people.col('weight) )

    apache/spark

  66. def mod[Out, TT, W](other: ThisType[TT, U])(implicit arg0: TypedEncoder[Out], n: CatalystNumeric[U], w: With.Aux[T, TT, W]): ThisType[W, Out]

    Modulo (a.k.a. remainder) expression.

    apache/spark
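    A sketch of mod, assuming hypothetical integer columns 'a and 'b on a dataset df (illustrative names, not from this page):

    ```scala
    // Remainder of dividing column a by column b.
    df.select( df('a).mod(df('b)) )
    ```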

  67. def multiply[TT, W](other: ThisType[TT, U])(implicit n: CatalystNumeric[U], w: With.Aux[T, TT, W], t: ClassTag[U]): ThisType[W, U]

    Multiplication of this expression and another expression.

    // The following multiplies a person's height by their weight.
    people.select( people.col('height) multiply people.col('weight) )

    apache/spark

  68. final def ne(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  69. final def notify(): Unit
    Definition Classes
    AnyRef
    Annotations
    @native()
  70. final def notifyAll(): Unit
    Definition Classes
    AnyRef
    Annotations
    @native()
  71. def opt[X](implicit x: <:<[U, Option[X]]): Mapper[X]

    Makes it easier to work with Optional columns. It returns an instance of Mapper[X], where X is the type of the unwrapped Optional. E.g., in the case of Option[Long], X is of type Long.

    val x: TypedColumn[Option[Int]] = _
    x.opt.map(_*2)
  72. def or[TT, W](other: ThisType[TT, Boolean])(implicit w: With.Aux[T, TT, W]): ThisType[W, Boolean]

    Boolean OR.

    df.filter ( (df.col('a) === 1).or(df.col('b) > 5) )
  73. def plus[TT, W](other: ThisType[TT, U])(implicit n: CatalystNumeric[U], w: With.Aux[T, TT, W]): ThisType[W, U]

    Sum of this expression and another expression.

    // The following selects the sum of a person's height and weight.
    people.select( people.col('height) plus people.col('weight) )

    apache/spark

  74. def rlike(literal: String)(implicit ev: =:=[U, String]): ThisType[T, Boolean]

    SQL RLIKE expression (LIKE with Regex). Returns a boolean column based on a regex match.

    val ds = TypedDataset.create(X1("foo") :: Nil)
    // true
    ds.select(ds('a).rlike("foo"))
    
    // true
    ds.select(ds('a).rlike(".*"))

    apache/spark

  75. def startsWith[TT, W](other: ThisType[TT, U])(implicit ev: =:=[U, String], w: With.Aux[T, TT, W]): ThisType[W, Boolean]

    String starts with.

    df.filter ( df.col('a).startsWith(df.col('b)) )
    other

    a column whose values are used as the prefix being tested against. apache/spark

  76. def startsWith(other: String)(implicit ev: =:=[U, String]): ThisType[T, Boolean]

    String starts with another string literal.

    df.filter ( df.col('a).startsWith("foo") )
    other

    a prefix that is being tested against. apache/spark

  77. def substr[TT1, TT2, W1, W2](startPos: ThisType[TT1, Int], len: ThisType[TT2, Int])(implicit ev: =:=[U, String], w1: With.Aux[T, TT1, W1], w2: With.Aux[W1, TT2, W2]): ThisType[W2, String]

    An expression that returns a substring.

    df.select(df('a).substr(df('b), df('c)))
    startPos

    expression for the starting position

    len

    expression for the length of the substring

  78. def substr(startPos: Int, len: Int)(implicit ev: =:=[U, String]): ThisType[T, String]

    An expression that returns a substring.

    df.select(df('a).substr(0, 5))
    startPos

    starting position

    len

    length of the substring

  79. final def synchronized[T0](arg0: => T0): T0
    Definition Classes
    AnyRef
  80. def toString(): String
    Definition Classes
    UntypedExpression → AnyRef → Any
  81. def typed[W, U1](e: Expression)(implicit arg0: TypedEncoder[U1]): ThisType[W, U1]

    Creates a typed column of either TypedColumn or TypedAggregate from an expression.

    Attributes
    protected
  82. implicit val uencoder: TypedEncoder[U]
  83. def unary_!(implicit i0: <:<[U, Boolean]): ThisType[T, Boolean]

    Inversion of boolean expression, i.e. NOT.

    // Select rows that are not active (isActive === false)
    df.filter( !df('isActive) )

    apache/spark

  84. def unary_-(implicit n: CatalystNumeric[U]): ThisType[T, U]

    Unary minus, i.e. negate the expression.

    // Select the amount column and negates all values.
    df.select( -df('amount) )

    apache/spark

  85. def untyped: Column

    Falls back to an untyped Column.
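    untyped is the escape hatch back to a plain Spark Column, e.g. to call a Spark function frameless does not wrap. A hedged sketch, assuming a dataset df with a hypothetical 'name column:

    ```scala
    import org.apache.spark.sql.{functions => F}

    // Drops the type information; from here on Spark no longer checks the column's type.
    val raw: org.apache.spark.sql.Column = df('name).untyped
    val upper = F.upper(raw)
    ```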

  86. final def wait(): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws(classOf[java.lang.InterruptedException])
  87. final def wait(arg0: Long, arg1: Int): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws(classOf[java.lang.InterruptedException])
  88. final def wait(arg0: Long): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws(classOf[java.lang.InterruptedException]) @native()
  89. def |[TT, W](other: ThisType[TT, U])(implicit n: CatalystBitwise[U], w: With.Aux[T, TT, W]): ThisType[W, U]

    Bitwise OR this expression and another expression.

    df.select(df.col('colA) | (df.col('colB)))
    other

    another column of the same type apache/spark

  90. def |(u: U)(implicit n: CatalystBitwise[U]): ThisType[T, U]

    Bitwise OR this expression and a constant (of the same type).

    df.select(df.col('colA).cast[Long] | 1L)
    u

    a constant of the same type apache/spark

  91. def ||[TT, W](other: ThisType[TT, Boolean])(implicit w: With.Aux[T, TT, W]): ThisType[W, Boolean]

    Boolean OR.

    df.filter ( df.col('a) === 1 || df.col('b) > 5)

Inherited from UntypedExpression[T]

Inherited from AnyRef

Inherited from Any

Ungrouped