org.apache.spark.sql.execution

Filter

case class Filter(condition: Expression, child: SparkPlan) extends SparkPlan with UnaryNode with Product with Serializable

Linear Supertypes
UnaryNode, SparkPlan, Serializable, Serializable, Logging, QueryPlan[SparkPlan], TreeNode[SparkPlan], Product, Equals, AnyRef, Any

Instance Constructors

  1. new Filter(condition: Expression, child: SparkPlan)
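
    A minimal usage sketch, assuming a Spark 1.x SQLContext named sqlContext is in scope: the predicate of a DataFrame filter typically appears in the physical plan as a Filter node, which can be located via the TreeNode collection API.

      import org.apache.spark.sql.execution.{Filter, SparkPlan}

      // Assumed to exist: a SQLContext named sqlContext.
      val df = sqlContext.range(0, 100).filter("id > 10")

      // Physical plan of the query; the predicate usually shows up as a Filter node.
      val plan: SparkPlan = df.queryExecution.executedPlan

      // Collect every Filter operator in the plan tree (TreeNode.collect).
      val filters = plan.collect { case f: Filter => f }
      filters.foreach(f => println(s"Filter condition: ${f.condition}"))
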

Value Members

  1. final def !=(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  2. final def !=(arg0: Any): Boolean

    Definition Classes
    Any
  3. final def ##(): Int

    Definition Classes
    AnyRef → Any
  4. final def ==(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  5. final def ==(arg0: Any): Boolean

    Definition Classes
    Any
  6. def apply(number: Int): SparkPlan

    Definition Classes
    TreeNode
  7. def argString: String

    Definition Classes
    TreeNode
  8. def asCode: String

    Definition Classes
    TreeNode
  9. final def asInstanceOf[T0]: T0

    Definition Classes
    Any
  10. def canProcessSafeRows: Boolean

    Specifies whether this operator is capable of processing Java-object-based Rows (i.e. rows that are not UnsafeRows).

    Definition Classes
    Filter → SparkPlan
  11. def canProcessUnsafeRows: Boolean

    Specifies whether this operator is capable of processing UnsafeRows.

    Definition Classes
    Filter → SparkPlan
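
    For illustration only, a hypothetical pass-through operator might declare its row-format capabilities as below. This is a rough sketch: the name PassThrough is made up, and whether user code can extend these classes depends on package visibility in your Spark build.

      import org.apache.spark.rdd.RDD
      import org.apache.spark.sql.catalyst.InternalRow
      import org.apache.spark.sql.catalyst.expressions.Attribute
      import org.apache.spark.sql.execution.{SparkPlan, UnaryNode}

      // Hypothetical operator: accepts both safe (Java-object-based) and unsafe rows,
      // and emits whatever row format its child emits.
      case class PassThrough(child: SparkPlan) extends UnaryNode {
        override def output: Seq[Attribute] = child.output

        override def canProcessSafeRows: Boolean = true
        override def canProcessUnsafeRows: Boolean = true
        override def outputsUnsafeRows: Boolean = child.outputsUnsafeRows

        protected override def doExecute(): RDD[InternalRow] = child.execute()
      }
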
  12. val child: SparkPlan

    Definition Classes
    Filter → UnaryNode
  13. def children: Seq[SparkPlan]

    Definition Classes
    UnaryNode → TreeNode
  14. def clone(): AnyRef

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  15. def collect[B](pf: PartialFunction[SparkPlan, B]): Seq[B]

    Definition Classes
    TreeNode
  16. def collectFirst[B](pf: PartialFunction[SparkPlan, B]): Option[B]

    Definition Classes
    TreeNode
  17. val condition: Expression

  18. lazy val containsChild: Set[TreeNode[_]]

    Definition Classes
    TreeNode
  19. def doExecute(): RDD[InternalRow]

    Overridden by concrete implementations of SparkPlan. Produces the result of the query as an RDD[InternalRow].

    Attributes
    protected
    Definition Classes
    Filter → SparkPlan
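
    A rough sketch of a filtering doExecute in the spirit of this operator (SimpleFilter is a made-up name; newPredicate is the protected helper on SparkPlan that compiles an Expression against an input schema):

      import org.apache.spark.rdd.RDD
      import org.apache.spark.sql.catalyst.InternalRow
      import org.apache.spark.sql.catalyst.expressions.{Attribute, Expression}
      import org.apache.spark.sql.execution.{SparkPlan, UnaryNode}

      // Hypothetical filtering operator: evaluates `condition` against every input row.
      case class SimpleFilter(condition: Expression, child: SparkPlan) extends UnaryNode {
        override def output: Seq[Attribute] = child.output

        protected override def doExecute(): RDD[InternalRow] =
          child.execute().mapPartitions { iter =>
            // Compile the boolean expression once per partition, then filter with it.
            val predicate = newPredicate(condition, child.output)
            iter.filter(predicate)
          }
      }
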
  20. def doPrepare(): Unit

    Overridden by concrete implementations of SparkPlan. It is guaranteed to run before any execute of SparkPlan. This is helpful if we want to set up some state before executing the query, e.g., BroadcastHashJoin uses it to broadcast asynchronously.

    Note: the prepare method has already walked down the tree, so the implementation doesn't need to call children's prepare methods.

    Attributes
    protected
    Definition Classes
    SparkPlan
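
    As a purely illustrative sketch (EagerCollect is a made-up operator), doPrepare can start asynchronous work that doExecute later waits on:

      import scala.concurrent.{Await, Future}
      import scala.concurrent.ExecutionContext.Implicits.global
      import scala.concurrent.duration.Duration
      import org.apache.spark.rdd.RDD
      import org.apache.spark.sql.catalyst.InternalRow
      import org.apache.spark.sql.catalyst.expressions.Attribute
      import org.apache.spark.sql.execution.{SparkPlan, UnaryNode}

      // Hypothetical operator that collects its child's rows asynchronously.
      case class EagerCollect(child: SparkPlan) extends UnaryNode {
        override def output: Seq[Attribute] = child.output

        // Lazily started; forcing it in doPrepare kicks the work off before execute().
        @transient private lazy val collected: Future[Array[InternalRow]] =
          Future { child.executeCollect() }

        protected override def doPrepare(): Unit = { collected }

        protected override def doExecute(): RDD[InternalRow] = {
          val rows = Await.result(collected, Duration.Inf)
          sparkContext.parallelize(rows.toSeq)
        }
      }
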
  21. final def eq(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  22. final def execute(): RDD[InternalRow]

    Returns the result of this query as an RDD[InternalRow] by delegating to doExecute after adding query plan information to created RDDs for visualization. Concrete implementations of SparkPlan should override doExecute instead.

    Definition Classes
    SparkPlan
  23. def executeCollect(): Array[InternalRow]

    Runs this query returning the result as an array.

    Definition Classes
    SparkPlan
  24. def executeCollectPublic(): Array[Row]

    Runs this query returning the result as an array, using external Row format.

    Definition Classes
    SparkPlan
  25. def executeTake(n: Int): Array[InternalRow]

    Runs this query returning the first n rows as an array.

    This is modeled after RDD.take but never runs any job locally on the driver.

    Definition Classes
    SparkPlan
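
    A minimal sketch of the different result shapes these execution methods produce, assuming a SQLContext named sqlContext is in scope:

      // Assumed to exist: a SQLContext named sqlContext.
      val plan = sqlContext.range(0, 100).filter("id % 2 = 0").queryExecution.executedPlan

      val asRdd    = plan.execute()              // RDD[InternalRow], evaluated lazily
      val allRows  = plan.executeCollect()       // Array[InternalRow], internal row format
      val asRows   = plan.executeCollectPublic() // Array[Row], external (public) row format
      val firstTen = plan.executeTake(10)        // Array[InternalRow], at most 10 rows

      println(s"count=${asRdd.count()} collected=${allRows.length} take=${firstTen.length}")
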
  26. def expressions: Seq[Expression]

    Definition Classes
    QueryPlan
  27. def fastEquals(other: TreeNode[_]): Boolean

    Definition Classes
    TreeNode
  28. def finalize(): Unit

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  29. def find(f: (SparkPlan) ⇒ Boolean): Option[SparkPlan]

    Definition Classes
    TreeNode
  30. def flatMap[A](f: (SparkPlan) ⇒ TraversableOnce[A]): Seq[A]

    Definition Classes
    TreeNode
  31. def foreach(f: (SparkPlan) ⇒ Unit): Unit

    Definition Classes
    TreeNode
  32. def foreachUp(f: (SparkPlan) ⇒ Unit): Unit

    Definition Classes
    TreeNode
  33. def generateTreeString(depth: Int, lastChildren: Seq[Boolean], builder: StringBuilder): StringBuilder

    Attributes
    protected
    Definition Classes
    TreeNode
  34. final def getClass(): Class[_]

    Definition Classes
    AnyRef → Any
  35. def getNodeNumbered(number: MutableInt): SparkPlan

    Attributes
    protected
    Definition Classes
    TreeNode
  36. def inputSet: AttributeSet

    Definition Classes
    QueryPlan
  37. final def isInstanceOf[T0]: Boolean

    Definition Classes
    Any
  38. def isTraceEnabled(): Boolean

    Attributes
    protected
    Definition Classes
    Logging
  39. def jsonFields: List[(String, JValue)]

    Attributes
    protected
    Definition Classes
    TreeNode
  40. def log: Logger

    Attributes
    protected
    Definition Classes
    Logging
  41. def logDebug(msg: ⇒ String, throwable: Throwable): Unit

    Attributes
    protected
    Definition Classes
    Logging
  42. def logDebug(msg: ⇒ String): Unit

    Attributes
    protected
    Definition Classes
    Logging
  43. def logError(msg: ⇒ String, throwable: Throwable): Unit

    Attributes
    protected
    Definition Classes
    Logging
  44. def logError(msg: ⇒ String): Unit

    Attributes
    protected
    Definition Classes
    Logging
  45. def logInfo(msg: ⇒ String, throwable: Throwable): Unit

    Attributes
    protected
    Definition Classes
    Logging
  46. def logInfo(msg: ⇒ String): Unit

    Attributes
    protected
    Definition Classes
    Logging
  47. def logName: String

    Attributes
    protected
    Definition Classes
    Logging
  48. def logTrace(msg: ⇒ String, throwable: Throwable): Unit

    Attributes
    protected
    Definition Classes
    Logging
  49. def logTrace(msg: ⇒ String): Unit

    Attributes
    protected
    Definition Classes
    Logging
  50. def logWarning(msg: ⇒ String, throwable: Throwable): Unit

    Attributes
    protected
    Definition Classes
    Logging
  51. def logWarning(msg: ⇒ String): Unit

    Attributes
    protected
    Definition Classes
    Logging
  52. def makeCopy(newArgs: Array[AnyRef]): SparkPlan

    Overridden makeCopy also propagates sqlContext to the copied plan.

    Definition Classes
    SparkPlan → TreeNode
  53. def map[A](f: (SparkPlan) ⇒ A): Seq[A]

    Definition Classes
    TreeNode
  54. def mapChildren(f: (SparkPlan) ⇒ SparkPlan): SparkPlan

    Definition Classes
    TreeNode
  55. def missingInput: AttributeSet

    Definition Classes
    QueryPlan
  56. final def ne(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  57. def newMutableProjection(expressions: Seq[Expression], inputSchema: Seq[Attribute]): () ⇒ MutableProjection

    Attributes
    protected
    Definition Classes
    SparkPlan
  58. def newNaturalAscendingOrdering(dataTypes: Seq[DataType]): Ordering[InternalRow]

    Creates a row ordering for the given schema, in natural ascending order.

    Attributes
    protected
    Definition Classes
    SparkPlan
  59. def newOrdering(order: Seq[SortOrder], inputSchema: Seq[Attribute]): Ordering[InternalRow]

    Attributes
    protected
    Definition Classes
    SparkPlan
  60. def newPredicate(expression: Expression, inputSchema: Seq[Attribute]): (InternalRow) ⇒ Boolean

    Attributes
    protected
    Definition Classes
    SparkPlan
  61. def nodeName: String

    Definition Classes
    TreeNode
  62. final def notify(): Unit

    Definition Classes
    AnyRef
  63. final def notifyAll(): Unit

    Definition Classes
    AnyRef
  64. def numberedTreeString: String

    Definition Classes
    TreeNode
  65. val origin: Origin

    Definition Classes
    TreeNode
  66. def otherCopyArgs: Seq[AnyRef]

    Attributes
    protected
    Definition Classes
    TreeNode
  67. def output: Seq[Attribute]

    Definition Classes
    Filter → QueryPlan
  68. def outputOrdering: Seq[SortOrder]

    Specifies how data is ordered in each partition.

    Definition Classes
    Filter → SparkPlan
  69. def outputPartitioning: Partitioning

    Specifies how data is partitioned across different nodes in the cluster.

    Definition Classes
    UnaryNode → SparkPlan
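
    For illustration (NoOp is a made-up operator), a row-preserving unary operator typically just reports its child's partitioning and per-partition ordering, much as Filter does:

      import org.apache.spark.rdd.RDD
      import org.apache.spark.sql.catalyst.InternalRow
      import org.apache.spark.sql.catalyst.expressions.{Attribute, SortOrder}
      import org.apache.spark.sql.catalyst.plans.physical.Partitioning
      import org.apache.spark.sql.execution.{SparkPlan, UnaryNode}

      // Hypothetical operator that neither re-partitions nor re-orders its input,
      // so partitioning and ordering are inherited from the child unchanged.
      case class NoOp(child: SparkPlan) extends UnaryNode {
        override def output: Seq[Attribute] = child.output
        override def outputPartitioning: Partitioning = child.outputPartitioning
        override def outputOrdering: Seq[SortOrder] = child.outputOrdering
        protected override def doExecute(): RDD[InternalRow] = child.execute()
      }
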
  70. def outputSet: AttributeSet

    Definition Classes
    QueryPlan
  71. def outputsUnsafeRows: Boolean

    Specifies whether this operator outputs UnsafeRows.

    Definition Classes
    Filter → SparkPlan
  72. final def prepare(): Unit

    Prepare a SparkPlan for execution. It's idempotent.

    Definition Classes
    SparkPlan
  73. def prettyJson: String

    Definition Classes
    TreeNode
  74. def printSchema(): Unit

    Definition Classes
    QueryPlan
  75. def references: AttributeSet

    Definition Classes
    QueryPlan
  76. def requiredChildDistribution: Seq[Distribution]

    Specifies any partition requirements on the input data for this operator.

    Definition Classes
    SparkPlan
  77. def requiredChildOrdering: Seq[Seq[SortOrder]]

    Specifies the sort order required for each partition of the input data for this operator.

    Definition Classes
    SparkPlan
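
    As an illustrative sketch only (SortedGroupScan and groupingKeys are made-up names), an operator that needs clustered, sorted input might declare its requirements as below; the planner then inserts exchange and sort operators as needed to satisfy them.

      import org.apache.spark.rdd.RDD
      import org.apache.spark.sql.catalyst.InternalRow
      import org.apache.spark.sql.catalyst.expressions.{Ascending, Attribute, Expression, SortOrder}
      import org.apache.spark.sql.catalyst.plans.physical.{ClusteredDistribution, Distribution}
      import org.apache.spark.sql.execution.{SparkPlan, UnaryNode}

      // Hypothetical operator that requires its input clustered and sorted by groupingKeys.
      case class SortedGroupScan(groupingKeys: Seq[Expression], child: SparkPlan) extends UnaryNode {
        override def output: Seq[Attribute] = child.output

        // One Distribution per child: cluster rows with equal grouping keys together.
        override def requiredChildDistribution: Seq[Distribution] =
          ClusteredDistribution(groupingKeys) :: Nil

        // One ordering per child: sort each partition by the grouping keys, ascending.
        override def requiredChildOrdering: Seq[Seq[SortOrder]] =
          Seq(groupingKeys.map(SortOrder(_, Ascending)))

        protected override def doExecute(): RDD[InternalRow] = child.execute()
      }
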
  78. lazy val schema: StructType

    Definition Classes
    QueryPlan
  79. def schemaString: String

    Definition Classes
    QueryPlan
  80. def simpleString: String

    Definition Classes
    QueryPlan → TreeNode
  81. def sparkContext: SparkContext

    Attributes
    protected
    Definition Classes
    SparkPlan
  82. final val sqlContext: SQLContext

    A handle to the SQL Context that was used to create this plan. Since many operators need access to the sqlContext for RDD operations or configuration this field is automatically populated by the query planning infrastructure.

    Attributes
    protected[org.apache.spark]
    Definition Classes
    SparkPlan
  83. def statePrefix: String

    Attributes
    protected
    Definition Classes
    QueryPlan
  84. def stringArgs: Iterator[Any]

    Attributes
    protected
    Definition Classes
    TreeNode
  85. val subexpressionEliminationEnabled: Boolean

    Definition Classes
    SparkPlan
  86. final def synchronized[T0](arg0: ⇒ T0): T0

    Definition Classes
    AnyRef
  87. def toJSON: String

    Definition Classes
    TreeNode
  88. def toString(): String

    Definition Classes
    TreeNode → AnyRef → Any
  89. def transform(rule: PartialFunction[SparkPlan, SparkPlan]): SparkPlan

    Definition Classes
    TreeNode
  90. def transformAllExpressions(rule: PartialFunction[Expression, Expression]): Filter.this.type

    Definition Classes
    QueryPlan
  91. def transformChildren(rule: PartialFunction[SparkPlan, SparkPlan], nextOperation: (SparkPlan, PartialFunction[SparkPlan, SparkPlan]) ⇒ SparkPlan): SparkPlan

    Attributes
    protected
    Definition Classes
    TreeNode
  92. def transformDown(rule: PartialFunction[SparkPlan, SparkPlan]): SparkPlan

    Definition Classes
    TreeNode
  93. def transformExpressions(rule: PartialFunction[Expression, Expression]): Filter.this.type

    Definition Classes
    QueryPlan
  94. def transformExpressionsDown(rule: PartialFunction[Expression, Expression]): Filter.this.type

    Definition Classes
    QueryPlan
  95. def transformExpressionsUp(rule: PartialFunction[Expression, Expression]): Filter.this.type

    Definition Classes
    QueryPlan
  96. def transformUp(rule: PartialFunction[SparkPlan, SparkPlan]): SparkPlan

    Definition Classes
    TreeNode
  97. def treeString: String

    Definition Classes
    TreeNode
  98. final def wait(): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  99. final def wait(arg0: Long, arg1: Int): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  100. final def wait(arg0: Long): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  101. def withNewChildren(newChildren: Seq[SparkPlan]): SparkPlan

    Definition Classes
    TreeNode

Inherited from UnaryNode

Inherited from SparkPlan

Inherited from Serializable

Inherited from Serializable

Inherited from Logging

Inherited from QueryPlan[SparkPlan]

Inherited from TreeNode[SparkPlan]

Inherited from Product

Inherited from Equals

Inherited from AnyRef

Inherited from Any
