org.bdgenomics.adam.rdd

ADAMContext

class ADAMContext extends Serializable with Logging

Linear Supertypes
Logging, Serializable, Serializable, AnyRef, Any

Instance Constructors

  1. new ADAMContext(sc: SparkContext)
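
    A minimal construction sketch. The Spark master, application name, and the file paths used in the later sketches are assumptions; depending on the ADAM version, an implicit conversion from SparkContext may also be available via import ADAMContext._, but the constructor shown above always works. The value ac is reused by the sketches below.

      import org.apache.spark.{ SparkConf, SparkContext }
      import org.bdgenomics.adam.rdd.ADAMContext

      // Assumed local setup; any existing SparkContext can be wrapped.
      val conf = new SparkConf().setAppName("adam-example").setMaster("local[*]")
      val sc = new SparkContext(conf)
      val ac = new ADAMContext(sc)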

Value Members

  1. final def !=(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  2. final def !=(arg0: Any): Boolean

    Definition Classes
    Any
  3. final def ##(): Int

    Definition Classes
    AnyRef → Any
  4. final def ==(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  5. final def ==(arg0: Any): Boolean

    Definition Classes
    Any
  6. def adamDictionaryLoad[T](filePath: String)(implicit ev1: (T) ⇒ SpecificRecord, ev2: Manifest[T]): SequenceDictionary

    This method creates a new SequenceDictionary from any Parquet file which contains records that have the requisite reference{Name,Id,Length,Url} fields.

    (If the path is a BAM or SAM file, and the implicit type is a Read, then it simply defaults to reading the SequenceDictionary out of the BAM header in the normal way.)

    T
      The type of records to return
    filePath
      The path to the input data
    returns
      A SequenceDictionary containing the names and indices of all the sequences to which the records in the corresponding file are aligned.
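
    A usage sketch, reusing ac from the constructor example; the path is hypothetical and AlignmentRecord is the Avro record type from org.bdgenomics.formats.avro.

      import org.bdgenomics.adam.models.SequenceDictionary
      import org.bdgenomics.formats.avro.AlignmentRecord

      // Build a dictionary from the reference fields embedded in the stored records.
      val dict: SequenceDictionary =
        ac.adamDictionaryLoad[AlignmentRecord]("hdfs:///data/reads.adam")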

  7. final def asInstanceOf[T0]: T0

    Definition Classes
    Any
  8. def clone(): AnyRef

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  9. final def eq(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  10. def equals(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  11. def finalize(): Unit

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  12. def findFiles(path: Path, regex: String): Seq[Path]

    Searches a path recursively, returning the names of all directories in the tree whose name matches the given regex.

    path
      The path to begin the search at
    regex
      A regular expression
    returns
      A sequence of Path objects corresponding to the identified directories.
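
    For example, with a hypothetical directory layout and pattern, reusing ac from above:

      import org.apache.hadoop.fs.Path

      // Collect every sub-directory under /data/reads whose name matches the pattern.
      val parts: Seq[Path] = ac.findFiles(new Path("hdfs:///data/reads"), "^sample_.*$")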

  13. final def getClass(): Class[_]

    Definition Classes
    AnyRef → Any
  14. def hashCode(): Int

    Definition Classes
    AnyRef → Any
  15. final def isInstanceOf[T0]: Boolean

    Definition Classes
    Any
  16. def isTraceEnabled(): Boolean

    Attributes
    protected
    Definition Classes
    Logging
  17. def loadAlignments(filePath: String, projection: Option[Schema] = None, filePath2Opt: Option[String] = None, recordGroupOpt: Option[String] = None, stringency: ValidationStringency = ValidationStringency.STRICT): RDD[AlignmentRecord]
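
    loadAlignments is typically used as the format-agnostic entry point for read data; its optional parameters mirror those of the FASTQ loaders below. A sketch with an assumed BAM path, reusing ac from above:

      import org.apache.spark.rdd.RDD
      import org.bdgenomics.formats.avro.AlignmentRecord

      // Hypothetical path; loadBam can also be called directly for BAM/SAM input.
      val reads: RDD[AlignmentRecord] = ac.loadAlignments("hdfs:///data/reads.bam")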

  18. def loadAlignmentsFromPaths(paths: Seq[Path]): RDD[AlignmentRecord]

    Takes a sequence of Path objects (e.g. the return value of findFiles). Treats each path as corresponding to a Read set -- loads each Read set, converts each set to use the same SequenceDictionary, and returns the union of the RDDs.

    paths
      The locations of the parquet files to load
    returns
      A single RDD[Read] that contains the union of the AlignmentRecords in the argument paths.
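
    Combined with findFiles, using a hypothetical layout and reusing ac from above:

      import org.apache.hadoop.fs.Path
      import org.apache.spark.rdd.RDD
      import org.bdgenomics.formats.avro.AlignmentRecord

      // Load every matching read set and take the union, with a shared SequenceDictionary.
      val paths: Seq[Path] = ac.findFiles(new Path("hdfs:///data"), "^reads_.*$")
      val allReads: RDD[AlignmentRecord] = ac.loadAlignmentsFromPaths(paths)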

  19. def loadBED(filePath: String): RDD[Feature]

  20. def loadBam(filePath: String): RDD[AlignmentRecord]

  21. def loadFasta(filePath: String, fragmentLength: Long): RDD[NucleotideContigFragment]

  22. def loadFastq(filePath1: String, filePath2Opt: Option[String], recordGroupOpt: Option[String] = None, stringency: ValidationStringency = ValidationStringency.STRICT): RDD[AlignmentRecord]
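
    A paired-end sketch with assumed file names, reusing ac from above; pass None for filePath2Opt to load a single unpaired file, and note that ValidationStringency comes from htsjdk.

      import htsjdk.samtools.ValidationStringency
      import org.apache.spark.rdd.RDD
      import org.bdgenomics.formats.avro.AlignmentRecord

      // Pair the two FASTQ files and tag the reads with a record group name.
      val fastqReads: RDD[AlignmentRecord] = ac.loadFastq(
        "hdfs:///data/sample_1.fq",
        Some("hdfs:///data/sample_2.fq"),
        recordGroupOpt = Some("sample"),
        stringency = ValidationStringency.LENIENT)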

  23. def loadFeatures(filePath: String, projection: Option[Schema] = None): RDD[Feature]

  24. def loadFragments(filePath: String): RDD[Fragment]

  25. def loadGTF(filePath: String): RDD[Feature]
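
    The feature loaders each parse the text format their name suggests into the common Feature type. A sketch with hypothetical paths, reusing ac from above:

      import org.apache.spark.rdd.RDD
      import org.bdgenomics.formats.avro.Feature

      // Parse BED intervals and GTF gene annotations.
      val peaks: RDD[Feature] = ac.loadBED("hdfs:///data/peaks.bed")
      val genes: RDD[Feature] = ac.loadGTF("hdfs:///data/genes.gtf")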

  26. def loadGenes(filePath: String, projection: Option[Schema] = None): RDD[Gene]

  27. def loadGenotypes(filePath: String, projection: Option[Schema] = None, sd: Option[SequenceDictionary] = None): RDD[Genotype]

  28. def loadIndexedBam(filePath: String, viewRegion: ReferenceRegion): RDD[AlignmentRecord]

    Functions like loadBam, but uses BAM index files to look at fewer blocks, and only returns records within a specified ReferenceRegion. A BAM index file is required.

    filePath
      The path to the input data. Currently this path must correspond to a single BAM file, and the associated BAM index file must have the same name.
    viewRegion
      The ReferenceRegion we are filtering on
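
    A sketch with an assumed region and path, reusing ac from above; ReferenceRegion comes from org.bdgenomics.adam.models, and the index file must accompany the BAM as noted above.

      import org.apache.spark.rdd.RDD
      import org.bdgenomics.adam.models.ReferenceRegion
      import org.bdgenomics.formats.avro.AlignmentRecord

      // Only reads overlapping chr20:1,000,000-2,000,000 are returned.
      val region = ReferenceRegion("chr20", 1000000L, 2000000L)
      val regionReads: RDD[AlignmentRecord] =
        ac.loadIndexedBam("hdfs:///data/reads.bam", region)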

  29. def loadInterleavedFastq(filePath: String): RDD[AlignmentRecord]

  30. def loadInterleavedFastqAsFragments(filePath: String): RDD[Fragment]

  31. def loadIntervalList(filePath: String): RDD[Feature]

  32. def loadNarrowPeak(filePath: String): RDD[Feature]

  33. def loadPairedFastq(filePath1: String, filePath2: String, recordGroupOpt: Option[String], stringency: ValidationStringency): RDD[AlignmentRecord]

  34. def loadParquet[T](filePath: String, predicate: Option[FilterPredicate] = None, projection: Option[Schema] = None)(implicit ev1: (T) ⇒ SpecificRecord, ev2: Manifest[T]): RDD[T]

    This method creates a new RDD by loading Parquet-formatted records of the specified Avro type, optionally applying a pushdown predicate and a projection.

    T
      The type of records to return
    filePath
      The path to the input data
    predicate
      An optional pushdown predicate to use when reading the data
    projection
      An optional projection schema to use when reading the data
    returns
      An RDD with records of the specified type
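
    A sketch with an assumed record layout, reusing ac from above. The predicate column path (contig.contigName) and the projected field names are assumptions that depend on the bdg-formats schema version; the Parquet filter classes live under org.apache.parquet (plain parquet in older releases), and Projection/AlignmentRecordField are ADAM's helpers for building projection schemas.

      import org.apache.parquet.filter2.predicate.{ FilterApi, FilterPredicate }
      import org.apache.parquet.io.api.Binary
      import org.apache.spark.rdd.RDD
      import org.bdgenomics.adam.projections.{ AlignmentRecordField, Projection }
      import org.bdgenomics.formats.avro.AlignmentRecord

      // Pushdown predicate: keep records whose contig name equals "chr20" (assumed column path).
      val pred: FilterPredicate = FilterApi.eq(
        FilterApi.binaryColumn("contig.contigName"),
        Binary.fromString("chr20"))

      // Projection: materialize only the listed fields (assumed field names).
      val proj = Projection(AlignmentRecordField.readName,
                            AlignmentRecordField.start,
                            AlignmentRecordField.sequence)

      val slimReads: RDD[AlignmentRecord] =
        ac.loadParquet[AlignmentRecord]("hdfs:///data/reads.adam", Some(pred), Some(proj))

    The typed variants below (loadParquetAlignments, loadParquetGenotypes, and so on) accept the same predicate and projection arguments.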

  35. def loadParquetAlignments(filePath: String, predicate: Option[FilterPredicate] = None, projection: Option[Schema] = None): RDD[AlignmentRecord]

  36. def loadParquetContigFragments(filePath: String, predicate: Option[FilterPredicate] = None, projection: Option[Schema] = None): RDD[NucleotideContigFragment]

  37. def loadParquetFeatures(filePath: String, predicate: Option[FilterPredicate] = None, projection: Option[Schema] = None): RDD[Feature]

  38. def loadParquetFragments(filePath: String, predicate: Option[FilterPredicate] = None, projection: Option[Schema] = None): RDD[Fragment]

  39. def loadParquetGenotypes(filePath: String, predicate: Option[FilterPredicate] = None, projection: Option[Schema] = None): RDD[Genotype]

  40. def loadParquetVariantAnnotations(filePath: String, predicate: Option[FilterPredicate] = None, projection: Option[Schema] = None): RDD[DatabaseVariantAnnotation]

  41. def loadParquetVariants(filePath: String, predicate: Option[FilterPredicate] = None, projection: Option[Schema] = None): RDD[Variant]

  42. def loadReferenceFile(filePath: String, fragmentLength: Long): ReferenceFile

  43. def loadSequence(filePath: String, projection: Option[Schema] = None, fragmentLength: Long = 10000): RDD[NucleotideContigFragment]
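
    A sketch with an assumed FASTA path, reusing ac from above and assuming the loader recognizes the .fa extension (loadFasta can be called directly otherwise); the reference is split into NucleotideContigFragments of at most fragmentLength bases.

      import org.apache.spark.rdd.RDD
      import org.bdgenomics.formats.avro.NucleotideContigFragment

      // Load a FASTA reference as fragments of up to 10 kbp each.
      val contigs: RDD[NucleotideContigFragment] =
        ac.loadSequence("hdfs:///data/reference.fa", fragmentLength = 10000L)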

  44. def loadUnpairedFastq(filePath: String, recordGroupOpt: Option[String] = None, setFirstOfPair: Boolean = false, setSecondOfPair: Boolean = false, stringency: ValidationStringency = ValidationStringency.STRICT): RDD[AlignmentRecord]

  45. def loadVariantAnnotations(filePath: String, projection: Option[Schema] = None, sd: Option[SequenceDictionary] = None): RDD[DatabaseVariantAnnotation]

  46. def loadVariants(filePath: String, projection: Option[Schema] = None, sd: Option[SequenceDictionary] = None): RDD[Variant]

  47. def loadVcf(filePath: String, sd: Option[SequenceDictionary]): RDD[VariantContext]
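
    A sketch with assumed paths, reusing ac from above; VariantContext is ADAM's model type that groups a variant with its genotypes, and the Parquet-backed paths go through loadGenotypes/loadVariants.

      import org.apache.spark.rdd.RDD
      import org.bdgenomics.adam.models.VariantContext
      import org.bdgenomics.formats.avro.{ Genotype, Variant }

      // Parse a VCF file; no explicit SequenceDictionary is supplied here.
      val vcs: RDD[VariantContext] = ac.loadVcf("hdfs:///data/calls.vcf", None)

      // Load genotypes and variants that were previously saved as Parquet.
      val gts: RDD[Genotype] = ac.loadGenotypes("hdfs:///data/calls.adam")
      val vars: RDD[Variant] = ac.loadVariants("hdfs:///data/calls.adam")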

  48. def loadVcfAnnotations(filePath: String, sd: Option[SequenceDictionary] = None): RDD[DatabaseVariantAnnotation]

  49. def log: Logger

    Attributes
    protected
    Definition Classes
    Logging
  50. def logDebug(msg: ⇒ String, throwable: Throwable): Unit

    Attributes
    protected
    Definition Classes
    Logging
  51. def logDebug(msg: ⇒ String): Unit

    Attributes
    protected
    Definition Classes
    Logging
  52. def logError(msg: ⇒ String, throwable: Throwable): Unit

    Attributes
    protected
    Definition Classes
    Logging
  53. def logError(msg: ⇒ String): Unit

    Attributes
    protected
    Definition Classes
    Logging
  54. def logInfo(msg: ⇒ String, throwable: Throwable): Unit

    Attributes
    protected
    Definition Classes
    Logging
  55. def logInfo(msg: ⇒ String): Unit

    Attributes
    protected
    Definition Classes
    Logging
  56. def logName: String

    Attributes
    protected
    Definition Classes
    Logging
  57. def logTrace(msg: ⇒ String, throwable: Throwable): Unit

    Attributes
    protected
    Definition Classes
    Logging
  58. def logTrace(msg: ⇒ String): Unit

    Attributes
    protected
    Definition Classes
    Logging
  59. def logWarning(msg: ⇒ String, throwable: Throwable): Unit

    Attributes
    protected
    Definition Classes
    Logging
  60. def logWarning(msg: ⇒ String): Unit

    Attributes
    protected
    Definition Classes
    Logging
  61. final def ne(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  62. final def notify(): Unit

    Definition Classes
    AnyRef
  63. final def notifyAll(): Unit

    Definition Classes
    AnyRef
  64. val sc: SparkContext

  65. final def synchronized[T0](arg0: ⇒ T0): T0

    Definition Classes
    AnyRef
  66. def toString(): String

    Definition Classes
    AnyRef → Any
  67. final def wait(): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  68. final def wait(arg0: Long, arg1: Int): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  69. final def wait(arg0: Long): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
