org.apache.spark.sql.SQLContext

SparkPlanner

class SparkPlanner extends SparkStrategies

Attributes
protected[org.apache.spark.sql]
Linear Supertypes
SparkStrategies, QueryPlanner[SparkPlan], AnyRef, Any

Instance Constructors

  1. new SparkPlanner()

Value Members

  1. final def !=(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  2. final def !=(arg0: Any): Boolean

    Definition Classes
    Any
  3. final def ##(): Int

    Definition Classes
    AnyRef → Any
  4. final def ==(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  5. final def ==(arg0: Any): Boolean

    Definition Classes
    Any
  6. object Aggregation extends Strategy

    Used to plan the aggregate operator for expressions based on the AggregateFunction2 interface.

    Definition Classes
    SparkStrategies
  7. object BasicOperators extends Strategy

    Definition Classes
    SparkStrategies
  8. object BroadcastNestedLoopJoin extends Strategy

    Definition Classes
    SparkStrategies
  9. object CanBroadcast

    Matches a plan whose output should be small enough to be used in a broadcast join.

    Definition Classes
    SparkStrategies
  10. object CartesianProduct extends Strategy

    Definition Classes
    SparkStrategies
  11. object DDLStrategy extends Strategy

    Definition Classes
    SparkStrategies
  12. object EquiJoinSelection extends Strategy with PredicateHelper

    Uses the ExtractEquiJoinKeys pattern to find joins where at least some of the predicates can be evaluated by matching join keys.

    Join implementations are chosen with the following precedence (see the broadcast-join example after the member listing):

    - Broadcast: if one side of the join has an estimated physical size that is smaller than the user-configurable org.apache.spark.sql.SQLConf.AUTO_BROADCASTJOIN_THRESHOLD threshold, or if that side has an explicit broadcast hint (e.g. the user applied the org.apache.spark.sql.functions.broadcast() function to a DataFrame), then that side of the join will be broadcasted and the other side will be streamed, with no shuffling performed. If both sides of the join are eligible to be broadcasted then the
    - Sort merge: if the matching join keys are sortable and org.apache.spark.sql.SQLConf.SORTMERGE_JOIN is enabled (default), then sort merge join will be used.
    - Hash: will be chosen if neither of the above optimizations apply to this join.

    Definition Classes
    SparkStrategies
  13. object HashAggregation extends Strategy

    Definition Classes
    SparkStrategies
  14. object InMemoryScans extends Strategy

    Definition Classes
    SparkStrategies
  15. object LeftSemiJoin extends Strategy with PredicateHelper

    Definition Classes
    SparkStrategies
  16. object TakeOrderedAndProject extends Strategy

    Definition Classes
    SparkStrategies
  17. final def asInstanceOf[T0]: T0

    Definition Classes
    Any
  18. def clone(): AnyRef

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  19. def codegenEnabled: Boolean

  20. final def eq(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  21. def equals(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  22. def finalize(): Unit

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  23. final def getClass(): Class[_]

    Definition Classes
    AnyRef → Any
  24. def hashCode(): Int

    Definition Classes
    AnyRef → Any
  25. final def isInstanceOf[T0]: Boolean

    Definition Classes
    Any
  26. final def ne(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  27. final def notify(): Unit

    Definition Classes
    AnyRef
  28. final def notifyAll(): Unit

    Definition Classes
    AnyRef
  29. def numPartitions: Int

  30. def plan(plan: LogicalPlan): Iterator[SparkPlan]

    Definition Classes
    QueryPlanner
  31. def planLater(plan: LogicalPlan): SparkPlan

    Attributes
    protected
    Definition Classes
    QueryPlanner
  32. def pruneFilterProject(projectList: Seq[NamedExpression], filterPredicates: Seq[Expression], prunePushedDownFilters: (Seq[Expression]) ⇒ Seq[Expression], scanBuilder: (Seq[Attribute]) ⇒ SparkPlan): SparkPlan

    Used to build table scan operators where complex projection and filtering are done using separate physical operators. This function returns the given scan operator with Project and Filter nodes added only when needed. For example, a Project operator is only used when the final desired output requires complex expressions to be evaluated or when columns can be further eliminated after filtering has been done.

    The prunePushedDownFilters parameter is used to remove those filters that can be optimized away by the filter pushdown optimization.

    The required attributes for both filtering and expression evaluation are passed to the provided scanBuilder function so that it can avoid unnecessary column materialization. (A hypothetical usage sketch follows the member listing.)

  33. lazy val singleRowRdd: RDD[InternalRow]

    Attributes
    protected
    Definition Classes
    SparkStrategies
  34. val sparkContext: SparkContext

  35. val sqlContext: SQLContext

  36. def strategies: Seq[Strategy]

    The sequence of strategies applied by this planner; it can be extended from user code through sqlContext.experimental.extraStrategies (see the registration sketch after the member listing).

    Definition Classes
    SparkPlanner → QueryPlanner
  37. final def synchronized[T0](arg0: ⇒ T0): T0

    Definition Classes
    AnyRef
  38. def toString(): String

    Definition Classes
    AnyRef → Any
  39. def unsafeEnabled: Boolean

  40. final def wait(): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  41. final def wait(arg0: Long, arg1: Int): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  42. final def wait(arg0: Long): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
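
Usage examples

The sketches below are illustrative additions, not part of the Spark source. Object names such as BroadcastJoinExample and the toy DataFrames are assumptions made for the examples.

Broadcast-join example (illustrates EquiJoinSelection and CanBroadcast): a minimal sketch showing the two broadcast triggers described above, the size threshold (spark.sql.autoBroadcastJoinThreshold, i.e. SQLConf.AUTO_BROADCASTJOIN_THRESHOLD) and the explicit org.apache.spark.sql.functions.broadcast() hint.

    import org.apache.spark.{SparkConf, SparkContext}
    import org.apache.spark.sql.SQLContext
    import org.apache.spark.sql.functions.broadcast

    object BroadcastJoinExample {
      def main(args: Array[String]): Unit = {
        val sc = new SparkContext(new SparkConf().setAppName("broadcast-join").setMaster("local[*]"))
        val sqlContext = new SQLContext(sc)
        import sqlContext.implicits._

        // Toy data; in practice the small side would typically be a dimension table.
        val orders = Seq((1, "a"), (2, "b"), (3, "a")).toDF("id", "key")
        val dims   = Seq(("a", "alpha"), ("b", "beta")).toDF("key", "name")

        // Size threshold (in bytes) used by the Broadcast precedence rule;
        // -1 disables automatic broadcasting based on estimated size.
        sqlContext.setConf("spark.sql.autoBroadcastJoinThreshold", (10 * 1024 * 1024).toString)

        // Explicit broadcast hint: marks `dims` as broadcastable regardless of its
        // estimated size, so the planner treats that side as the broadcast side.
        val joined = orders.join(broadcast(dims), "key")

        // Prints the physical plan chosen by the planner (a broadcast join operator
        // rather than a shuffle-based one).
        joined.explain()

        sc.stop()
      }
    }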
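Scan-strategy sketch (illustrates pruneFilterProject): a hypothetical strategy, written as if it lived inside the org.apache.spark.sql package, showing how a table-scan strategy might hand its projection and filter expressions to pruneFilterProject. ExampleScanStrategy and the buildScan function are assumptions for illustration, not Spark APIs.

    package org.apache.spark.sql

    import org.apache.spark.sql.catalyst.expressions.Attribute
    import org.apache.spark.sql.catalyst.planning.PhysicalOperation
    import org.apache.spark.sql.catalyst.plans.logical.{LeafNode, LogicalPlan}
    import org.apache.spark.sql.execution.SparkPlan

    // Hypothetical: `planner` is the SparkPlanner documented here and `buildScan`
    // stands in for a data-source-specific scan constructor.
    class ExampleScanStrategy(
        planner: SQLContext#SparkPlanner,
        buildScan: Seq[Attribute] => SparkPlan) extends Strategy {

      def apply(plan: LogicalPlan): Seq[SparkPlan] = plan match {
        // PhysicalOperation collects the Project/Filter nodes sitting above a leaf plan.
        case PhysicalOperation(projectList, filters, _: LeafNode) =>
          planner.pruneFilterProject(
            projectList,
            filters,
            identity,   // assume no filters are pushed down into the source itself
            buildScan   // called with only the attributes that are actually required
          ) :: Nil
        case _ => Nil
      }
    }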
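Registration sketch (illustrates strategies and plan): extraStrategies is the public hook through which the sequence returned by strategies can be extended; plan() then tries each strategy in turn, and planLater() lets a strategy defer planning of a subtree back to the planner. NoopStrategy is an assumption for illustration; returning Nil simply means the strategy does not plan this operator.

    import org.apache.spark.sql.{SQLContext, Strategy}
    import org.apache.spark.sql.catalyst.plans.logical.LogicalPlan
    import org.apache.spark.sql.execution.SparkPlan

    // A strategy that never fires: an empty result tells the planner to move on
    // to the next entry in `strategies` for this operator.
    object NoopStrategy extends Strategy {
      def apply(plan: LogicalPlan): Seq[SparkPlan] = Nil
    }

    object StrategyRegistration {
      // Registers the strategy so that SparkPlanner.strategies picks it up via
      // sqlContext.experimental.extraStrategies.
      def register(sqlContext: SQLContext): Unit = {
        sqlContext.experimental.extraStrategies =
          NoopStrategy +: sqlContext.experimental.extraStrategies
      }
    }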
