
class PySparkInterpreter extends PythonInterpreter

Linear Supertypes
PythonInterpreter, Interpreter, AnyRef, Any

Instance Constructors

  1. new PySparkInterpreter(_compiler: ScalaCompiler, jepInstance: Jep, jepExecutor: Executor, jepThread: AtomicReference[Thread], jepBlockingService: Blocking, runtime: Runtime[Any], pyApi: PythonAPI, venvPath: Option[Path])

Type Members

  1. case class PythonState extends State with Product with Serializable
    Definition Classes
    PythonInterpreter

Value Members

  1. final def !=(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  2. final def ##(): Int
    Definition Classes
    AnyRef → Any
  3. final def ==(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  4. final def asInstanceOf[T0]: T0
    Definition Classes
    Any
  5. def clone(): AnyRef
    Attributes
    protected[lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... ) @native()
  6. def compile(parsed: PyObject, cell: String): Task[PyObject]
    Attributes
    protected
    Definition Classes
    PythonInterpreter
  7. val compiler: ScalaCompiler
    Definition Classes
    PythonInterpreter
  8. def completionsAt(code: String, pos: Int, state: State): Task[List[Completion]]
    Definition Classes
    PythonInterpreter → Interpreter
  9. def convertFromPython(jep: Jep): PartialFunction[(String, PyObject), (scala.tools.nsc.interactive.Global.Type, Any)]
    Attributes
    protected
    Definition Classes
    PySparkInterpreter → PythonInterpreter
  10. def convertToPython(jep: Jep): PartialFunction[(String, Any), AnyRef]
    Attributes
    protected
    Definition Classes
    PySparkInterpreter → PythonInterpreter
  11. def defaultConvertToPython(nv: (String, Any)): AnyRef
    Attributes
    protected
    Definition Classes
    PythonInterpreter
  12. final def eq(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  13. def equals(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  14. def errorCause(get: PyCallable): Option[Throwable]
    Attributes
    protected
    Definition Classes
    PySparkInterpreter → PythonInterpreter
  15. def eval[T](code: String)(implicit arg0: ClassTag[T]): Task[T]
    Attributes
    protected[python]
    Definition Classes
    PythonInterpreter
  16. def exec(code: String): Task[Unit]
    Attributes
    protected[python]
    Definition Classes
    PythonInterpreter
  17. def finalize(): Unit
    Attributes
    protected[lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  18. val gatewayRef: AtomicReference[GatewayServer]
  19. final def getClass(): Class[_]
    Definition Classes
    AnyRef → Any
    Annotations
    @native()
  20. def getValue(name: String): Task[PyObject]
    Attributes
    protected[python]
    Definition Classes
    PythonInterpreter
  21. def handlePyError(get: PyCallable, trace: ArrayList[AnyRef]): Throwable
    Definition Classes
    PythonInterpreter
  22. def hashCode(): Int
    Definition Classes
    AnyRef → Any
    Annotations
    @native()
  23. def init(state: State): RIO[InterpreterEnv, State]
    Definition Classes
    PySparkInterpreter → PythonInterpreter → Interpreter
  24. def injectGlobals(globals: PyObject): RIO[CurrentRuntime, Unit]
    Attributes
    protected
    Definition Classes
    PySparkInterpreter → PythonInterpreter
  25. final def isInstanceOf[T0]: Boolean
    Definition Classes
    Any
  26. def jep[T](fn: (Jep) ⇒ T): Task[T]
    Attributes
    protected[python]
    Definition Classes
    PythonInterpreter
  27. def matplotlib: String
    Attributes
    protected
    Definition Classes
    PythonInterpreter
  28. final def ne(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  29. final def notify(): Unit
    Definition Classes
    AnyRef
    Annotations
    @native()
  30. final def notifyAll(): Unit
    Definition Classes
    AnyRef
    Annotations
    @native()
  31. def parametersAt(code: String, pos: Int, state: State): Task[Option[Signatures]]
    Definition Classes
    PythonInterpreter → Interpreter
  32. def parse(code: String, cell: String): Task[PyObject]
    Attributes
    protected
    Definition Classes
    PythonInterpreter
  33. def populateGlobals(state: State): Task[PyObject]
    Attributes
    protected
    Definition Classes
    PythonInterpreter
  34. def pysparkImports: String

    Handle setting up PySpark.

    First, we need to pick the Python interpreter. Unfortunately this means we need to re-implement Spark's interpreter configuration logic, because that logic is only implemented inside SparkSubmit (and, in fact, only when you use pyspark-shell).

    Here's the order we follow for the driver Python executable (from org.apache.spark.launcher.SparkSubmitCommandBuilder):

    1. conf spark.pyspark.driver.python
    2. conf spark.pyspark.python
    3. environment variable PYSPARK_DRIVER_PYTHON
    4. environment variable PYSPARK_PYTHON

    For the executors we simply omit the driver-specific settings, so the order is:

    1. conf spark.pyspark.python
    2. environment variable PYSPARK_PYTHON

    Additionally, to load pyspark itself we try to grab its location from the Spark distribution, which ensures that all the versions match up.

    WARNING: Using pyspark from pip install pyspark could break things - don't use it!
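
    The resolution order above can be sketched as follows. This is a minimal illustration only; the object and method names are hypothetical and Polynote's actual implementation inside PySparkInterpreter differs in detail:

    ```scala
    object PythonExecResolution {
      // Resolve the Python executable following the documented priority order.
      // The driver consults the driver-specific conf/env entries first;
      // executors skip them entirely.
      def resolvePythonExec(
          conf: Map[String, String],
          env: Map[String, String],
          forDriver: Boolean
      ): Option[String] = {
        val candidates: Seq[Option[String]] =
          if (forDriver)
            Seq(
              conf.get("spark.pyspark.driver.python"),
              conf.get("spark.pyspark.python"),
              env.get("PYSPARK_DRIVER_PYTHON"),
              env.get("PYSPARK_PYTHON")
            )
          else
            Seq(
              conf.get("spark.pyspark.python"),
              env.get("PYSPARK_PYTHON")
            )
        // First defined candidate wins.
        candidates.flatten.headOption
      }
    }
    ```

    For example, with both spark.pyspark.driver.python and spark.pyspark.python set, the driver resolves to the former and the executors to the latter.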

    Attributes
    protected
  35. def run(compiled: PyObject, globals: PyObject, state: State): RIO[CurrentRuntime, State]
    Attributes
    protected
    Definition Classes
    PythonInterpreter
  36. def run(code: String, state: State): RIO[InterpreterEnv, State]
    Definition Classes
    PythonInterpreter → Interpreter
  37. def setValue(name: String, value: AnyRef): Task[Unit]
    Attributes
    protected[python]
    Definition Classes
    PythonInterpreter
  38. def setup: String
    Attributes
    protected
    Definition Classes
    PythonInterpreter
  39. def shutdown(): Task[Unit]
    Definition Classes
    PythonInterpreter → Interpreter
  40. final def synchronized[T0](arg0: ⇒ T0): T0
    Definition Classes
    AnyRef
  41. def toString(): String
    Definition Classes
    AnyRef → Any
  42. final def wait(): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  43. final def wait(arg0: Long, arg1: Int): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  44. final def wait(arg0: Long): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... ) @native()

Inherited from PythonInterpreter

Inherited from Interpreter

Inherited from AnyRef

Inherited from Any
