
org.apache.spark.deploy.yarn

YarnSparkHadoopUtil



class YarnSparkHadoopUtil extends SparkHadoopUtil

Contains utility methods for interacting with Hadoop from Spark.
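For orientation, a minimal sketch of how the YARN-specific util is typically reached. This assumes Spark's `SparkHadoopUtil.get` accessor, which hands back the YARN subclass when running in YARN mode; treat the surrounding setup as illustrative, not definitive:

```scala
import org.apache.spark.deploy.SparkHadoopUtil

// In a YARN deployment, SparkHadoopUtil.get returns the
// YarnSparkHadoopUtil subclass; elsewhere, the base implementation.
val util = SparkHadoopUtil.get
if (util.isYarnMode()) {
  // YARN-specific credential handling is available here
}
```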

Linear Supertypes
SparkHadoopUtil, Logging, AnyRef, Any

Instance Constructors

  1. new YarnSparkHadoopUtil()


Value Members

  1. final def !=(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  2. final def ##(): Int

    Definition Classes
    AnyRef → Any
  3. final def ==(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  4. def addCredentials(conf: JobConf): Unit

    Definition Classes
    YarnSparkHadoopUtil → SparkHadoopUtil
  5. def addCurrentUserCredentials(creds: Credentials): Unit

    Definition Classes
    YarnSparkHadoopUtil → SparkHadoopUtil
  6. def addSecretKeyToUserCredentials(key: String, secret: String): Unit

    Definition Classes
    YarnSparkHadoopUtil → SparkHadoopUtil
  7. def appendS3AndSparkHadoopConfigurations(conf: SparkConf, hadoopConf: Configuration): Unit

    Definition Classes
    SparkHadoopUtil
  8. final def asInstanceOf[T0]: T0

    Definition Classes
    Any
  9. def clone(): AnyRef

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  10. val conf: Configuration

    Definition Classes
    SparkHadoopUtil
  11. final def eq(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  12. def equals(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  13. def finalize(): Unit

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  14. final def getClass(): Class[_]

    Definition Classes
    AnyRef → Any
  15. def getCurrentUserCredentials(): Credentials

    Definition Classes
    YarnSparkHadoopUtil → SparkHadoopUtil
  16. def getNameNodesToAccess(sparkConf: SparkConf): Set[Path]


    Get the list of namenodes the user may access.
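A hedged usage sketch. The `spark.yarn.access.namenodes` key (a comma-separated list of filesystem URIs) is the configuration this method is commonly driven by in this Spark line; treat the exact key and values as assumptions:

```scala
import org.apache.spark.SparkConf
import org.apache.spark.deploy.yarn.YarnSparkHadoopUtil

// Additional HDFS namespaces the job needs, listed as filesystem URIs.
val sparkConf = new SparkConf()
  .set("spark.yarn.access.namenodes", "hdfs://nn1:8020,hdfs://nn2:8020")

val util = new YarnSparkHadoopUtil()
val namenodes = util.getNameNodesToAccess(sparkConf) // Set[Path]
```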

  17. def getSecretKeyFromUserCredentials(key: String): Array[Byte]

    Definition Classes
    YarnSparkHadoopUtil → SparkHadoopUtil
  18. def getTimeFromNowToRenewal(sparkConf: SparkConf, fraction: Double, credentials: Credentials): Long

    Definition Classes
    SparkHadoopUtil
  19. def getTokenRenewer(conf: Configuration): String

  20. def globPath(pattern: Path): Seq[Path]

    Definition Classes
    SparkHadoopUtil
  21. def globPathIfNecessary(pattern: Path): Seq[Path]

    Definition Classes
    SparkHadoopUtil
  22. def hashCode(): Int

    Definition Classes
    AnyRef → Any
  23. def initializeLogIfNecessary(isInterpreter: Boolean): Unit

    Attributes
    protected
    Definition Classes
    Logging
  24. def isGlobPath(pattern: Path): Boolean

    Definition Classes
    SparkHadoopUtil
  25. final def isInstanceOf[T0]: Boolean

    Definition Classes
    Any
  26. def isTraceEnabled(): Boolean

    Attributes
    protected
    Definition Classes
    Logging
  27. def isYarnMode(): Boolean

    Definition Classes
    YarnSparkHadoopUtil → SparkHadoopUtil
  28. def listFilesSorted(remoteFs: FileSystem, dir: Path, prefix: String, exclusionSuffix: String): Array[FileStatus]

    Definition Classes
    SparkHadoopUtil
  29. def listLeafDirStatuses(fs: FileSystem, baseStatus: FileStatus): Seq[FileStatus]

    Definition Classes
    SparkHadoopUtil
  30. def listLeafDirStatuses(fs: FileSystem, basePath: Path): Seq[FileStatus]

    Definition Classes
    SparkHadoopUtil
  31. def listLeafStatuses(fs: FileSystem, baseStatus: FileStatus): Seq[FileStatus]

    Definition Classes
    SparkHadoopUtil
  32. def listLeafStatuses(fs: FileSystem, basePath: Path): Seq[FileStatus]

    Definition Classes
    SparkHadoopUtil
  33. def log: Logger

    Attributes
    protected
    Definition Classes
    Logging
  34. def logDebug(msg: ⇒ String, throwable: Throwable): Unit

    Attributes
    protected
    Definition Classes
    Logging
  35. def logDebug(msg: ⇒ String): Unit

    Attributes
    protected
    Definition Classes
    Logging
  36. def logError(msg: ⇒ String, throwable: Throwable): Unit

    Attributes
    protected
    Definition Classes
    Logging
  37. def logError(msg: ⇒ String): Unit

    Attributes
    protected
    Definition Classes
    Logging
  38. def logInfo(msg: ⇒ String, throwable: Throwable): Unit

    Attributes
    protected
    Definition Classes
    Logging
  39. def logInfo(msg: ⇒ String): Unit

    Attributes
    protected
    Definition Classes
    Logging
  40. def logName: String

    Attributes
    protected
    Definition Classes
    Logging
  41. def logTrace(msg: ⇒ String, throwable: Throwable): Unit

    Attributes
    protected
    Definition Classes
    Logging
  42. def logTrace(msg: ⇒ String): Unit

    Attributes
    protected
    Definition Classes
    Logging
  43. def logWarning(msg: ⇒ String, throwable: Throwable): Unit

    Attributes
    protected
    Definition Classes
    Logging
  44. def logWarning(msg: ⇒ String): Unit

    Attributes
    protected
    Definition Classes
    Logging
  45. def loginUserFromKeytab(principalName: String, keytabFilename: String): Unit

    Definition Classes
    SparkHadoopUtil
  46. final def ne(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  47. def newConfiguration(conf: SparkConf): Configuration

    Definition Classes
    YarnSparkHadoopUtil → SparkHadoopUtil
  48. final def notify(): Unit

    Definition Classes
    AnyRef
  49. final def notifyAll(): Unit

    Definition Classes
    AnyRef
  50. def obtainTokenForHBase(conf: Configuration): Option[Token[TokenIdentifier]]

    Obtain a security token for HBase.

    Requirements

    1. "hbase.security.authentication" == "kerberos"
    2. The HBase classes HBaseConfiguration and TokenUtil can be loaded and invoked.

    conf

    Hadoop configuration; an HBase configuration is created from this.

    returns

    a token if the requirements were met, None if not.
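A minimal usage sketch under the stated requirements (secured cluster, HBase on the classpath); the handling in each branch is illustrative only:

```scala
import org.apache.hadoop.conf.Configuration
import org.apache.spark.deploy.yarn.YarnSparkHadoopUtil

val hadoopConf = new Configuration()
val util = new YarnSparkHadoopUtil()

// Some(token) only on a Kerberos-secured cluster with HBase present;
// otherwise None.
util.obtainTokenForHBase(hadoopConf) match {
  case Some(token) => // add it to the application's Credentials
  case None        => // not secured, or HBase absent: nothing to do
}
```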

  51. def obtainTokenForHBase(sparkConf: SparkConf, conf: Configuration, credentials: Credentials): Unit


    Obtain a security token for HBase.

  52. def obtainTokenForHBaseInner(conf: Configuration): Option[Token[TokenIdentifier]]

    Obtain a security token for HBase if "hbase.security.authentication" == "kerberos".

    conf

    Hadoop configuration; an HBase configuration is created from this.

    returns

    a token if one was needed

  53. def obtainTokenForHiveMetastore(conf: Configuration): Option[Token[DelegationTokenIdentifier]]

    Obtains a token for the Hive metastore, using the current user as the principal. Some exceptions are caught and downgraded to a log message.

    conf

    hadoop configuration; the Hive configuration will be based on this

    returns

    a token, or None if there's no need for a token (no metastore URI or principal in the config), or if a binding exception was caught and downgraded.
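A hedged sketch of stashing the returned token into a Credentials object; the `Credentials.addToken`/`Token.getService` calls are standard Hadoop security API, but the overall flow is illustrative:

```scala
import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.security.Credentials
import org.apache.spark.deploy.yarn.YarnSparkHadoopUtil

val hadoopConf = new Configuration()
val creds = new Credentials()
val util = new YarnSparkHadoopUtil()

// A token is only issued when the derived Hive configuration names a
// metastore URI and a Kerberos principal; otherwise None is returned.
util.obtainTokenForHiveMetastore(hadoopConf)
  .foreach(token => creds.addToken(token.getService, token))
```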

  54. def obtainTokenForHiveMetastore(sparkConf: SparkConf, conf: Configuration, credentials: Credentials): Unit

    Obtains a token for the Hive metastore and adds it to the credentials.

  55. def obtainTokensForNamenodes(paths: Set[Path], conf: Configuration, creds: Credentials, renewer: Option[String] = None): Unit


    Obtains tokens for the namenodes passed in and adds them to the credentials.
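A usage sketch; the example path is hypothetical, and when `renewer` is left at its default of `None` a renewer is presumably derived from the configuration (compare `getTokenRenewer` above):

```scala
import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.fs.Path
import org.apache.hadoop.security.Credentials
import org.apache.spark.deploy.yarn.YarnSparkHadoopUtil

val hadoopConf = new Configuration()
val creds = new Credentials()
val util = new YarnSparkHadoopUtil()

// Fetch HDFS delegation tokens for each namenode and add them to creds.
val paths: Set[Path] = Set(new Path("hdfs://nn1:8020/user/alice"))
util.obtainTokensForNamenodes(paths, hadoopConf, creds)
```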

  56. def runAsSparkUser(func: () ⇒ Unit): Unit

    Definition Classes
    SparkHadoopUtil
  57. def substituteHadoopVariables(text: String, hadoopConf: Configuration): String

    Definition Classes
    SparkHadoopUtil
  58. final def synchronized[T0](arg0: ⇒ T0): T0

    Definition Classes
    AnyRef
  59. def toString(): String

    Definition Classes
    AnyRef → Any
  60. def transferCredentials(source: UserGroupInformation, dest: UserGroupInformation): Unit

    Definition Classes
    YarnSparkHadoopUtil → SparkHadoopUtil
  61. final def wait(): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  62. final def wait(arg0: Long, arg1: Int): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  63. final def wait(arg0: Long): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
