scala.tools.nsc.interpreter

SparkIMain

class SparkIMain extends AbstractScriptEngine with Compilable with SparkImports

An interpreter for Scala code.

The main public entry points are compile(), interpret(), and bind(). The compile() method loads a complete Scala file. The interpret() method executes one line of Scala code at the request of the user. The bind() method binds an object to a variable that can then be used by later interpreted code.

The overall approach is based on compiling the requested code and then using a Java classloader and Java reflection to run the code and access its results.

In more detail, a single compiler instance is used to accumulate all successfully compiled or interpreted Scala code. To "interpret" a line of code, the compiler generates a fresh object that includes the line of code and which has public member(s) to export all variables defined by that code. To extract the result of an interpreted line to show the user, a second "result object" is created which imports the variables exported by the above object and then exports members called "$eval" and "$print". To accommodate user expressions that read from variables or methods defined in previous statements, "import" statements are used.

This interpreter shares the strengths and weaknesses of using the full compiler-to-Java. The main strength is that interpreted code behaves exactly as does compiled code, including running at full speed. The main weakness is that redefining classes and methods is not handled properly, because rebinding at the Java level is technically difficult.
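
The lifecycle described above can be sketched as follows. This is a hedged usage sketch, not a definitive recipe: it assumes the spark-repl artifact is on the classpath, and the exact constructor and Settings flags available depend on the Spark and Scala versions in use.

```scala
import scala.tools.nsc.Settings
import org.apache.spark.repl.SparkIMain

// Compile interpreted lines against the JVM classpath (illustrative flag).
val settings = new Settings
settings.usejavacp.value = true

val intp = new SparkIMain(settings)
intp.initializeSynchronous()        // block until the compiler is ready

intp.interpret("val x = 21 * 2")    // each line is compiled into a fresh object
intp.interpret("println(x)")        // later lines import earlier definitions

intp.close()                        // flush the reporter and release resources
```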

Self Type
SparkIMain
Linear Supertypes
SparkImports, Compilable, AbstractScriptEngine, ScriptEngine, AnyRef, Any
Known Subclasses

Instance Constructors

  1. new SparkIMain()

  2. new SparkIMain(factory: ScriptEngineFactory)

  3. new SparkIMain(settings: Settings)

  4. new SparkIMain(factory: ScriptEngineFactory, settings: Settings)

  5. new SparkIMain(settings: Settings, out: JPrintWriter)

    Construct an interpreter that reports to Console.

  6. new SparkIMain(factory: ScriptEngineFactory, initialSettings: Settings, out: JPrintWriter)

Type Members

  1. case class ComputedImports(prepend: String, append: String, access: String) extends Product with Serializable

    Compute imports that allow definitions from previous requests to be visible in a new request. Returns three pieces of related code:

    1. An initial code fragment that should go before the code of the new request.

    2. A code fragment that should go after the code of the new request.

    3. An access path which can be traversed to access any bindings inside code wrapped by #1 and #2 .

    The argument is a set of Names that need to be imported.

    Limitations: This method is not as precise as it could be. (1) It does not process wildcard imports to see what exactly they import. (2) If it imports any names from a request, it imports all of them, which is not really necessary. (3) It imports multiple same-named implicits, but only the last one imported is actually usable.

    Definition Classes
    SparkImports
  2. abstract class PhaseDependentOps extends AnyRef

  3. class ReadEvalPrint extends AnyRef

    Here is where we:

    1) Read some source code, and put it in the "read" object.
    2) Evaluate the read object, and put the result in the "eval" object.
    3) Create a String for human consumption, and put it in the "print" object.

    Read! Eval! Print! Some of that not yet centralized here.

  4. implicit class ReplTypeOps extends AnyRef

  5. class Request extends AnyRef

    One line of code submitted by the user for interpretation

Value Members

  1. final def !=(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  2. final def ##(): Int

    Definition Classes
    AnyRef → Any
  3. final def ==(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  4. def allDefinedNames: List[Global.Name]

  5. def allHandlers: collection.immutable.List[(memberHandlers)#MemberHandler]

  6. def allImportedNames: collection.immutable.List[Global.Name]

    Definition Classes
    SparkImports
  7. final def asInstanceOf[T0]: T0

    Definition Classes
    Any
  8. def backticked(s: String): String

  9. def beQuietDuring[T](body: ⇒ T): T

    Temporarily be quiet

  10. def beSilentDuring[T](operation: ⇒ T): T

  11. def bind[T](name: String, value: T)(implicit arg0: reflect.api.JavaUniverse.TypeTag[T], arg1: ClassTag[T]): Result

  12. def bind(p: NamedParam): Result

  13. def bind(name: String, boundType: String, value: Any, modifiers: List[String] = Nil): Result

    Bind a specified name to a specified value. The name may later be used by expressions passed to interpret.

    name

    the variable name to bind

    boundType

    the type of the variable, as a string

    value

    the object value to bind to it

    returns

    an indication of whether the binding succeeded
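
    As a sketch of the pattern (assuming an initialized SparkIMain named intp; the name, type string, and success check are illustrative), bind makes a host-side value visible to later interpreted lines:

    ```scala
    import scala.tools.nsc.interpreter.Results

    // Expose a host object under the name "answer" with declared type Int.
    val result = intp.bind("answer", "Int", 42)

    // Subsequent interpreted code can read the bound variable.
    if (result == Results.Success)
      intp.interpret("println(answer + 1)")
    ```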

  14. var bound: Boolean

  15. def classLoader: util.AbstractFileClassLoader

  16. def classOfTerm(id: String): Option[JClass]

  17. def cleanMemberDecl(owner: Global.Symbol, member: Global.Name): Global.Type

  18. def cleanTypeAfterTyper(sym: ⇒ Global.Symbol): Global.Type

  19. def clearExecutionWrapper(): Unit

  20. def clone(): AnyRef

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  21. def close(): Unit

    This instance is no longer needed, so release any resources it is using. The reporter's output gets flushed.

  22. var code: String

  23. def compile(reader: Reader): CompiledScript

    Definition Classes
    SparkIMain → Compilable
    Annotations
    @throws( ... )
  24. def compile(script: String): CompiledScript

    Definition Classes
    SparkIMain → Compilable
    Annotations
    @throws( ... )
  25. def compileSources(sources: SourceFile*): Boolean

    Compile an nsc SourceFile. Returns true if there are no compilation errors, or false otherwise.

  26. def compileSourcesKeepingRun(sources: SourceFile*): (Boolean, Run)

  27. def compileString(code: String): Boolean

    Compile a string. Returns true if there are no compilation errors, or false otherwise.
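
    A minimal sketch (assuming an initialized SparkIMain named intp): compileString loads a complete snippet into the interpreter's compiler without executing it, which is useful for probing whether code is well-formed.

    ```scala
    // true means the snippet compiled cleanly; errors go to the reporter.
    val ok: Boolean =
      intp.compileString("object Probe { def twice(n: Int): Int = n * 2 }")
    ```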

  28. def compiled(script: String): CompiledScript

  29. def compilerClasspath: Seq[URL]

  30. def createBindings(): Bindings

    Definition Classes
    SparkIMain → ScriptEngine
  31. def dealiasNonPublic(tp: Global.Type): Global.Type

  32. def debugging[T](msg: String)(res: T): T

  33. object deconstruct extends StructuredTypeStrings

  34. def definedSymbolList: collection.immutable.List[Global.Symbol]

  35. def definedTerms: collection.immutable.List[Global.TermName]

  36. def definedTypes: List[Global.TypeName]

  37. def directBind[T](name: String, value: T)(implicit arg0: reflect.api.JavaUniverse.TypeTag[T], arg1: ClassTag[T]): Result

  38. def directBind(p: NamedParam): Result

  39. def directBind(name: String, boundType: String, value: Any): Result

  40. final def ensureClassLoader(): Unit

  41. final def eq(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  42. def equals(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  43. def eval(reader: Reader, context: ScriptContext): AnyRef

    Definition Classes
    SparkIMain → ScriptEngine
    Annotations
    @throws( ... )
  44. def eval(script: String, context: ScriptContext): AnyRef

    Definition Classes
    SparkIMain → ScriptEngine
    Annotations
    @throws( ... )
  45. def eval(arg0: String): AnyRef

    Definition Classes
    AbstractScriptEngine → ScriptEngine
    Annotations
    @throws( ... )
  46. def eval(arg0: Reader): AnyRef

    Definition Classes
    AbstractScriptEngine → ScriptEngine
    Annotations
    @throws( ... )
  47. def eval(arg0: String, arg1: Bindings): AnyRef

    Definition Classes
    AbstractScriptEngine → ScriptEngine
    Annotations
    @throws( ... )
  48. def eval(arg0: Reader, arg1: Bindings): AnyRef

    Definition Classes
    AbstractScriptEngine → ScriptEngine
    Annotations
    @throws( ... )
  49. def executionWrapper: String

  50. object exprTyper extends SparkExprTyper

  51. val factory: ScriptEngineFactory

  52. def finalize(): Unit

    Definition Classes
    SparkIMain → AnyRef
  53. object flatOp extends PhaseDependentOps

  54. def flatPath(sym: Global.Symbol): String

  55. lazy val formatting: Formatting

  56. def get(arg0: String): AnyRef

    Definition Classes
    AbstractScriptEngine → ScriptEngine
  57. def getBindings(arg0: Int): Bindings

    Definition Classes
    AbstractScriptEngine → ScriptEngine
  58. final def getClass(): Class[_]

    Definition Classes
    AnyRef → Any
  59. def getClassIfDefined(path: String): Global.Symbol

  60. def getContext(): ScriptContext

    Definition Classes
    AbstractScriptEngine → ScriptEngine
  61. def getFactory(): ScriptEngineFactory

    Definition Classes
    SparkIMain → ScriptEngine
  62. def getModuleIfDefined(path: String): Global.Symbol

  63. def getScriptContext(arg0: Bindings): ScriptContext

    Attributes
    protected[javax.script]
    Definition Classes
    AbstractScriptEngine
  64. lazy val global: Global

  65. def hashCode(): Int

    Definition Classes
    AnyRef → Any
  66. def implicitSymbolsBySource: List[(Global.Symbol, List[Global.Symbol])]

    Definition Classes
    SparkImports
  67. def importHandlers: collection.immutable.List[(memberHandlers)#ImportHandler]

  68. def importedSymbols: collection.immutable.List[Global.Symbol]

    Definition Classes
    SparkImports
  69. def importedSymbolsBySource: List[(Global.Symbol, List[Global.Symbol])]

    Tuples of (source, imported symbols) in the order they were imported.

    Definition Classes
    SparkImports
  70. def importedTermSymbols: collection.immutable.List[Global.TermSymbol]

    Definition Classes
    SparkImports
  71. def importsCode(wanted: Set[Global.Name], wrapper: Wrapper, definedClass: Boolean): ComputedImports

    Attributes
    protected
    Definition Classes
    SparkImports
  72. def initialize(postInitSignal: ⇒ Unit): Unit

  73. def initializeSynchronous(): Unit

  74. def interpret(line: String, synthetic: Boolean): Result

  75. def interpret(line: String): Result

    Interpret one line of input. All feedback, including parse errors and evaluation results, is printed via the supplied compiler's reporter. Values defined are available to subsequently interpreted strings.

    The return value indicates whether the line was interpreted successfully, e.g. that there were no parse errors.
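
    A hedged sketch of the accumulation behavior (assuming an initialized SparkIMain named intp): each call compiles one line, and later lines see the names defined by earlier ones.

    ```scala
    // Definitions accumulate across calls; the reporter prints each result.
    intp.interpret("case class Point(x: Int, y: Int)")
    intp.interpret("val p = Point(1, 2)")
    intp.interpret("p.x + p.y")
    ```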

  76. def interpretSynthetic(line: String): Result

  77. def isInitializeComplete: Boolean

  78. final def isInstanceOf[T0]: Boolean

    Definition Classes
    Any
  79. lazy val isettings: SparkISettings

    interpreter settings

  80. def languageSymbols: collection.immutable.List[Global.Symbol]

    Definition Classes
    SparkImports
  81. def languageWildcardHandlers: collection.immutable.List[(memberHandlers)#ImportHandler]

    Definition Classes
    SparkImports
  82. def languageWildcardSyms: List[Global.Symbol]

    Symbols whose contents are language-defined to be imported.

    Definition Classes
    SparkImports
  83. def lastRequest: Request

  84. def lastWarnings: List[(Global.Position, String)]

  85. lazy val memberHandlers: SparkMemberHandlers { val intp: SparkIMain.this.type }

  86. def mostRecentVar: String

    Returns the name of the most recent interpreter result. Mostly this exists so you can conveniently invoke methods on the previous result.

  87. def namedDefinedTerms: collection.immutable.List[Global.TermName]

  88. object naming extends Naming

  89. final def ne(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  90. def newCompiler(settings: Settings, reporter: Reporter): ReplGlobal

    Instantiate a compiler. Overridable.

    Attributes
    protected
  91. final def notify(): Unit

    Definition Classes
    AnyRef
  92. final def notifyAll(): Unit

    Definition Classes
    AnyRef
  93. def onlyTerms(xs: List[Global.Name]): List[Global.TermName]

    Attributes
    protected
  94. def onlyTypes(xs: List[Global.Name]): List[Global.TypeName]

    Attributes
    protected
  95. def originalPath(sym: Global.Symbol): String

  96. def originalPath(name: Global.Name): String

  97. def originalPath(name: String): String

  98. val out: JPrintWriter

    Attributes
    protected
  99. def parentClassLoader: ClassLoader

    Parent classloader. Overridable.

    Attributes
    protected
  100. object parse

    Parse a line and return the parsing result (error, incomplete, or success with a list of trees).

  101. def prevRequestList: collection.immutable.List[Request]

  102. def put(arg0: String, arg1: Any): Unit

    Definition Classes
    AbstractScriptEngine → ScriptEngine
  103. def quietBind(p: NamedParam): Result

  104. def quietRun[T](code: String): Result

  105. def readRootPath(readPath: String): Global.Symbol

  106. def rebind(p: NamedParam): Result

  107. def recordRequest(req: Request): Unit

  108. object replOutput extends ReplOutput

  109. def replScope: Global.Scope

  110. lazy val reporter: SparkReplReporter

  111. def reset(): Unit

    Reset this interpreter, forgetting all user-specified requests.

  112. def resetClassLoader(): Unit

  113. def runtimeClassAndTypeOfTerm(id: String): Option[(JClass, Global.Type)]

  114. lazy val runtimeMirror: Mirror

  115. def runtimeTypeOfTerm(id: String): Global.Type

  116. def sessionImportedSymbols: collection.immutable.List[Global.Symbol]

    Definition Classes
    SparkImports
  117. def sessionWildcards: List[Global.Type]

    Types which have been wildcard imported, such as:

    val x = "abc" ; import x._    // type java.lang.String
    import java.lang.String._     // object java.lang.String

    Used by tab completion.

    XXX right now this gets import x._ and import java.lang.String._, but doesn't figure out import String._. There's a lot of ad hoc scope twiddling which should be swept away in favor of digging into the compiler scopes.

    Definition Classes
    SparkImports
  118. def setBindings(arg0: Bindings, arg1: Int): Unit

    Definition Classes
    AbstractScriptEngine → ScriptEngine
  119. def setContext(arg0: ScriptContext): Unit

    Definition Classes
    AbstractScriptEngine → ScriptEngine
  120. def setContextClassLoader(): Unit

  121. def setExecutionWrapper(code: String): Unit

  122. def settings: Settings

  123. def showCodeIfDebugging(code: String): Unit

  124. def showDirectory(): Unit

  125. def symbolDefString(sym: Global.Symbol): String

  126. def symbolOfIdent(id: String): Global.Symbol

  127. def symbolOfLine(code: String): Global.Symbol

  128. def symbolOfName(id: Global.Name): Global.Symbol

  129. def symbolOfTerm(id: String): Global.Symbol

  130. def symbolOfType(id: String): Global.Symbol

  131. final def synchronized[T0](arg0: ⇒ T0): T0

    Definition Classes
    AnyRef
  132. def toString(): String

    Definition Classes
    AnyRef → Any
  133. def translateEnclosingClass(n: String): Option[String]

  134. def translatePath(path: String): Option[String]

  135. def tryTwice(op: ⇒ Global.Symbol): Global.Symbol

    It's a bit of a shotgun approach, but for now we will gain in robustness. Try a symbol-producing operation at phase typer, and if that is NoSymbol, try again at phase flatten. I'll be able to lose this and run only from exitingTyper as soon as I figure out exactly where a flat name is sneaking in when calculating imports.

  136. def typeOfExpression(expr: String, silent: Boolean = true): Global.Type

  137. def typeOfTerm(id: String): Global.Type

  138. object typerOp extends PhaseDependentOps

  139. def unqualifiedIds: List[String]

  140. def valueOfTerm(id: String): Option[Any]

  141. final def wait(): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  142. final def wait(arg0: Long, arg1: Int): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  143. final def wait(arg0: Long): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  144. def withoutUnwrapping(op: ⇒ Unit): Unit

  145. def withoutWarnings[T](body: ⇒ T): T

Deprecated Value Members

  1. def virtualDirectory: ReplDir

    Annotations
    @deprecated
    Deprecated

    (Since version 2.11.0) Use replOutput.dir instead

Inherited from SparkImports

Inherited from Compilable

Inherited from AbstractScriptEngine

Inherited from ScriptEngine

Inherited from AnyRef

Inherited from Any
