org.apache.spark.deploy.yarn

YarnSparkHadoopUtil

class YarnSparkHadoopUtil extends SparkHadoopUtil

Contains utility methods to interact with Hadoop from Spark.
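
A minimal usage sketch, constructing the class directly via the public constructor listed below (how Spark itself wires this class in is not covered on this page; all method names used are documented here):

    import org.apache.spark.SparkConf
    import org.apache.spark.deploy.yarn.YarnSparkHadoopUtil

    val util = new YarnSparkHadoopUtil()
    if (util.isYarnMode()) {
      // Build a Hadoop Configuration seeded from the Spark configuration.
      val hadoopConf = util.newConfiguration(new SparkConf())
    }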

Linear Supertypes
SparkHadoopUtil, Logging, AnyRef, Any

Instance Constructors

  1. new YarnSparkHadoopUtil()

Value Members

  1. final def !=(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  2. final def !=(arg0: Any): Boolean

    Definition Classes
    Any
  3. final def ##(): Int

    Definition Classes
    AnyRef → Any
  4. final def ==(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  5. final def ==(arg0: Any): Boolean

    Definition Classes
    Any
  6. def addCredentials(conf: JobConf): Unit

    Definition Classes
    YarnSparkHadoopUtil → SparkHadoopUtil
  7. def addCurrentUserCredentials(creds: Credentials): Unit

    Definition Classes
    YarnSparkHadoopUtil → SparkHadoopUtil
  8. def addSecretKeyToUserCredentials(key: String, secret: String): Unit

    Definition Classes
    YarnSparkHadoopUtil → SparkHadoopUtil
  9. final def asInstanceOf[T0]: T0

    Definition Classes
    Any
  10. def clone(): AnyRef

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  11. val conf: Configuration

    Definition Classes
    SparkHadoopUtil
  12. final def eq(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  13. def equals(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  14. def finalize(): Unit

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  15. final def getClass(): Class[_]

    Definition Classes
    AnyRef → Any
  16. def getConfigurationFromJobContext(context: JobContext): Configuration

    Definition Classes
    SparkHadoopUtil
  17. def getCurrentUserCredentials(): Credentials

    Definition Classes
    YarnSparkHadoopUtil → SparkHadoopUtil
  18. def getNameNodesToAccess(sparkConf: SparkConf): Set[Path]

    Get the list of namenodes the user may access.
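
    Example (a sketch; the spark.yarn.access.namenodes key is an assumption, as this page does not name the setting the method reads):

      import org.apache.hadoop.fs.Path
      import org.apache.spark.SparkConf
      import org.apache.spark.deploy.yarn.YarnSparkHadoopUtil

      // Assumed config key; the listed namenode URIs are illustrative.
      val sparkConf = new SparkConf()
        .set("spark.yarn.access.namenodes", "hdfs://nn1:8020,hdfs://nn2:8020")
      val namenodes: Set[Path] =
        new YarnSparkHadoopUtil().getNameNodesToAccess(sparkConf)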

  19. def getSecretKeyFromUserCredentials(key: String): Array[Byte]

    Definition Classes
    YarnSparkHadoopUtil → SparkHadoopUtil
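
    Example (a sketch of the store/retrieve round trip with addSecretKeyToUserCredentials above; the key and secret values are placeholders):

      import org.apache.spark.deploy.yarn.YarnSparkHadoopUtil

      val util = new YarnSparkHadoopUtil()
      // Store a secret under a key in the current user's credentials...
      util.addSecretKeyToUserCredentials("sparkCookie", "not-a-real-secret")
      // ...and read it back as raw bytes.
      val secret: Array[Byte] = util.getSecretKeyFromUserCredentials("sparkCookie")
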
  20. def getTaskAttemptIDFromTaskAttemptContext(context: TaskAttemptContext): TaskAttemptID

    Definition Classes
    SparkHadoopUtil
  21. def getTimeFromNowToRenewal(sparkConf: SparkConf, fraction: Double, credentials: Credentials): Long

    Definition Classes
    SparkHadoopUtil
  22. def getTokenRenewer(conf: Configuration): String

  23. def globPath(pattern: Path): Seq[Path]

    Definition Classes
    SparkHadoopUtil
  24. def globPathIfNecessary(pattern: Path): Seq[Path]

    Definition Classes
    SparkHadoopUtil
  25. def hashCode(): Int

    Definition Classes
    AnyRef → Any
  26. final def isInstanceOf[T0]: Boolean

    Definition Classes
    Any
  27. def isTraceEnabled(): Boolean

    Attributes
    protected
    Definition Classes
    Logging
  28. def isYarnMode(): Boolean

    Definition Classes
    YarnSparkHadoopUtil → SparkHadoopUtil
  29. def listFilesSorted(remoteFs: FileSystem, dir: Path, prefix: String, exclusionSuffix: String): Array[FileStatus]

    Definition Classes
    SparkHadoopUtil
  30. def listLeafDirStatuses(fs: FileSystem, baseStatus: FileStatus): Seq[FileStatus]

    Definition Classes
    SparkHadoopUtil
  31. def listLeafDirStatuses(fs: FileSystem, basePath: Path): Seq[FileStatus]

    Definition Classes
    SparkHadoopUtil
  32. def listLeafStatuses(fs: FileSystem, baseStatus: FileStatus): Seq[FileStatus]

    Definition Classes
    SparkHadoopUtil
  33. def listLeafStatuses(fs: FileSystem, basePath: Path): Seq[FileStatus]

    Definition Classes
    SparkHadoopUtil
  34. def log: Logger

    Attributes
    protected
    Definition Classes
    Logging
  35. def logDebug(msg: ⇒ String, throwable: Throwable): Unit

    Attributes
    protected
    Definition Classes
    Logging
  36. def logDebug(msg: ⇒ String): Unit

    Attributes
    protected
    Definition Classes
    Logging
  37. def logError(msg: ⇒ String, throwable: Throwable): Unit

    Attributes
    protected
    Definition Classes
    Logging
  38. def logError(msg: ⇒ String): Unit

    Attributes
    protected
    Definition Classes
    Logging
  39. def logInfo(msg: ⇒ String, throwable: Throwable): Unit

    Attributes
    protected
    Definition Classes
    Logging
  40. def logInfo(msg: ⇒ String): Unit

    Attributes
    protected
    Definition Classes
    Logging
  41. def logName: String

    Attributes
    protected
    Definition Classes
    Logging
  42. def logTrace(msg: ⇒ String, throwable: Throwable): Unit

    Attributes
    protected
    Definition Classes
    Logging
  43. def logTrace(msg: ⇒ String): Unit

    Attributes
    protected
    Definition Classes
    Logging
  44. def logWarning(msg: ⇒ String, throwable: Throwable): Unit

    Attributes
    protected
    Definition Classes
    Logging
  45. def logWarning(msg: ⇒ String): Unit

    Attributes
    protected
    Definition Classes
    Logging
  46. def loginUserFromKeytab(principalName: String, keytabFilename: String): Unit

    Definition Classes
    SparkHadoopUtil
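
    Example (a sketch; the principal and keytab path are placeholders):

      import org.apache.spark.deploy.yarn.YarnSparkHadoopUtil

      // Log in from a keytab before touching secured HDFS.
      new YarnSparkHadoopUtil().loginUserFromKeytab(
        "spark/host@EXAMPLE.COM", "/etc/security/keytabs/spark.keytab")
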
  47. final def ne(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  48. def newConfiguration(conf: SparkConf): Configuration

    Definition Classes
    YarnSparkHadoopUtil → SparkHadoopUtil
  49. final def notify(): Unit

    Definition Classes
    AnyRef
  50. final def notifyAll(): Unit

    Definition Classes
    AnyRef
  51. def obtainTokenForHiveMetastore(conf: Configuration): Option[Token[DelegationTokenIdentifier]]

    Obtains a token for the Hive metastore, using the current user as the principal. Some exceptions are caught and downgraded to a log message.

    conf

    Hadoop configuration; the Hive configuration will be based on this

    returns

    a token, or None if there's no need for a token (no metastore URI or principal in the config), or if a binding exception was caught and downgraded.
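
    Example (a sketch showing how the Option result might be handled):

      import org.apache.hadoop.conf.Configuration
      import org.apache.spark.deploy.yarn.YarnSparkHadoopUtil

      val hadoopConf = new Configuration()
      new YarnSparkHadoopUtil().obtainTokenForHiveMetastore(hadoopConf) match {
        case Some(token) => // e.g. add the token to the job's Credentials
        case None        => // no metastore URI/principal, or a downgraded error
      }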

  52. def obtainTokensForNamenodes(paths: Set[Path], conf: Configuration, creds: Credentials, renewer: Option[String] = None): Unit

    Obtains tokens for the namenodes passed in and adds them to the credentials.
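
    Example (a sketch pairing this with getNameNodesToAccess above; using the current user's credentials object here stands in for the job's credentials):

      import org.apache.hadoop.conf.Configuration
      import org.apache.hadoop.security.{Credentials, UserGroupInformation}
      import org.apache.spark.SparkConf
      import org.apache.spark.deploy.yarn.YarnSparkHadoopUtil

      val util = new YarnSparkHadoopUtil()
      val creds: Credentials = UserGroupInformation.getCurrentUser.getCredentials
      // Fetch delegation tokens for every configured namenode into creds.
      util.obtainTokensForNamenodes(
        util.getNameNodesToAccess(new SparkConf()), new Configuration(), creds)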

  53. def runAsSparkUser(func: () ⇒ Unit): Unit

    Definition Classes
    SparkHadoopUtil
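
    Example (a sketch; how the Spark user is resolved is not documented on this page):

      import org.apache.spark.deploy.yarn.YarnSparkHadoopUtil

      new YarnSparkHadoopUtil().runAsSparkUser { () =>
        // Hadoop calls here run with the Spark user's UGI and credentials.
      }
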
  54. def substituteHadoopVariables(text: String, hadoopConf: Configuration): String

    Definition Classes
    SparkHadoopUtil
  55. final def synchronized[T0](arg0: ⇒ T0): T0

    Definition Classes
    AnyRef
  56. def toString(): String

    Definition Classes
    AnyRef → Any
  57. def transferCredentials(source: UserGroupInformation, dest: UserGroupInformation): Unit

    Definition Classes
    YarnSparkHadoopUtil → SparkHadoopUtil
  58. final def wait(): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  59. final def wait(arg0: Long, arg1: Int): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  60. final def wait(arg0: Long): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )

Deprecated Value Members

  1. def newConfiguration(): Configuration

    Definition Classes
    SparkHadoopUtil
    Annotations
    @deprecated
    Deprecated

    (Since version 1.2.0) use newConfiguration with a SparkConf argument
