com.ignition

package frame

Data types, implicits, and aliases for DataFrame-based workflows.
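A minimal sketch of how the types listed below compose. The step constructors match the signatures on this page; the wiring operator and flow execution call are not documented here and are shown only as hypothetical comments:

```scala
import org.apache.spark.sql.Row
import org.apache.spark.sql.types.{IntegerType, StringType, StructField, StructType}

// Static input: a two-column grid (DataGrid signature as listed below).
val schema = StructType(Seq(
  StructField("name", StringType),
  StructField("age", IntegerType)))
val grid = DataGrid(schema, Seq(Row("jane", 25), Row("john", 41)))

// Keep only rows matching a boolean condition (Filter signature as listed below).
val over30 = Filter("age > 30")

// Print the result to stdout (DebugOutput signature as listed below).
val debug = DebugOutput(title = Some("over 30"))

// Hypothetical wiring -- the actual connection API is not shown on this page:
// grid --> over30 --> debug
// FrameFlow(...) would then execute the assembled job.
```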

Linear Supertypes
AnyRef, Any

Type Members

  1. abstract class AbstractAggregate[U] extends FrameTransformer with PairFunctions

    An abstract Aggregate step, which uses the list of grouping fields to partition the data and the supplied row aggregator to aggregate each partition.

  2. case class AddFields(fields: Iterable[(String, Any)]) extends FrameTransformer with Product with Serializable

    Adds a set of new fields, each of them either a constant, a variable, or an environment variable.

  3. type AfterFrameStepComputed = AfterStepComputed[DataFrame, SparkRuntime]

  4. case class BasicStats(dataFields: Iterable[(String, BasicAggregator)], groupFields: Iterable[String] = Nil) extends FrameTransformer with Product with Serializable

    Calculates basic statistics about the specified fields.

  5. type BeforeFrameStepComputed = BeforeStepComputed[DataFrame, SparkRuntime]

  6. case class CassandraInput(keyspace: String, table: String, columns: Iterable[String] = Nil, where: Option[Where] = None) extends FrameProducer with Product with Serializable

    Reads rows from Apache Cassandra.

  7. case class CassandraOutput(keyspace: String, table: String) extends FrameTransformer with Product with Serializable

    Writes rows into a Cassandra table.

  8. case class CsvFileInput(path: String, separator: Option[String] = Some(","), schema: Option[StructType] = None) extends FrameProducer with Product with Serializable

    Reads CSV files.

  9. case class DataGrid(schema: StructType, rows: Seq[Row]) extends FrameProducer with Product with Serializable

    Static data grid input.

  10. case class DataRowWriter(schema: StructType, tableDef: TableDef) extends RowWriter[Row] with Product with Serializable

    Cassandra row writer for DataFrame objects.

  11. case class DebugOutput(names: Boolean = true, types: Boolean = false, title: Option[String] = None, maxWidth: Option[Int] = Some(80)) extends FrameTransformer with Product with Serializable

    Prints out data on the standard output.

  12. class DefaultSparkRuntime extends SparkRuntime

    The default implementation of SparkRuntime.

  13. case class EnvLiteral(name: String) extends Product with Serializable

    Environment literal.

  14. case class Filter(condition: String) extends FrameSplitter with Product with Serializable

    Filters the data frame based on a combination of boolean conditions against fields.

  15. case class Formula(fields: Iterable[(String, RowExpression[_ <: DataType])]) extends FrameTransformer with Product with Serializable

    Calculates new fields based on string expressions in various dialects.

  16. case class FrameFlow(targets: Iterable[ConnectionSource[DataFrame, SparkRuntime]]) extends SubModule[DataFrame, SparkRuntime] with FrameStep with Product with Serializable

    Frame Flow represents an executable job.

  17. case class FrameFlowComplete(flow: FrameFlow, results: Seq[DataFrame]) extends FrameFlowEvent with Product with Serializable

  18. sealed trait FrameFlowEvent extends AnyRef

    Base trait for all frame flow events.

  19. trait FrameFlowListener extends AnyRef

    Listener which will be notified on frame flow events.

  20. case class FrameFlowStarted(flow: FrameFlow) extends FrameFlowEvent with Product with Serializable

  21. abstract class FrameMerger extends Merger[DataFrame, SparkRuntime] with FrameStep

  22. abstract class FrameModule extends Module[DataFrame, SparkRuntime] with FrameStep

  23. abstract class FrameProducer extends Producer[DataFrame, SparkRuntime] with FrameStep

  24. abstract class FrameSplitter extends Splitter[DataFrame, SparkRuntime] with FrameStep

  25. trait FrameStep extends AbstractStep with Step[DataFrame, SparkRuntime] with XmlExport with JsonExport

    Workflow step that emits DataFrame as the output.

  26. type FrameStepListener = StepListener[DataFrame, SparkRuntime]

  27. trait FrameSubFlow extends AbstractStep with FrameStep

  28. case class FrameSubMerger(body: (Seq[ConnectionTarget[DataFrame, SparkRuntime]], ConnectionSource[DataFrame, SparkRuntime])) extends SubMerger[DataFrame, SparkRuntime] with FrameSubFlow with Product with Serializable

  29. case class FrameSubModule(body: (Seq[ConnectionTarget[DataFrame, SparkRuntime]], Seq[ConnectionSource[DataFrame, SparkRuntime]])) extends SubModule[DataFrame, SparkRuntime] with FrameStep with Product with Serializable

  30. case class FrameSubProducer(body: ConnectionSource[DataFrame, SparkRuntime]) extends SubProducer[DataFrame, SparkRuntime] with FrameSubFlow with Product with Serializable

  31. case class FrameSubSplitter(body: (ConnectionTarget[DataFrame, SparkRuntime], Seq[ConnectionSource[DataFrame, SparkRuntime]])) extends SubSplitter[DataFrame, SparkRuntime] with FrameSubFlow with Product with Serializable

  32. case class FrameSubTransformer(body: (ConnectionTarget[DataFrame, SparkRuntime], ConnectionSource[DataFrame, SparkRuntime])) extends SubTransformer[DataFrame, SparkRuntime] with FrameSubFlow with Product with Serializable

  33. abstract class FrameTransformer extends Transformer[DataFrame, SparkRuntime] with FrameStep

  34. case class Intersection() extends FrameMerger with Product with Serializable

    Finds the intersection of the two DataRow RDDs.

  35. case class Invoke(path: String, fileType: String = "json") extends FrameModule with Product with Serializable

    Loads and executes a subflow stored in an external file (JSON or XML).

  36. case class Join(condition: Option[String], joinType: JoinType) extends FrameMerger with Product with Serializable

    Performs join of the two data frames.

  37. case class JsonFileInput(path: String, columns: Iterable[(String, String)]) extends FrameProducer with Product with Serializable

    Reads a JSON file that contains a separate JSON object on each line.

  38. type JsonFrameStepFactory = JsonStepFactory[FrameStep, DataFrame, SparkRuntime]

  39. case class KafkaInput(zkUrl: String, topic: String, groupId: String, maxRows: Option[Int] = Some(100), maxTimeout: Option[Long] = Some(60000), kafkaProperties: Map[String, String] = ..., field: String = "payload") extends FrameProducer with Product with Serializable

    Reads messages from a Kafka topic, converting each of them into a row with a single column.

  40. case class KafkaOutput(field: String, topic: String, brokers: Iterable[String] = Nil, kafkaProperties: Map[String, String] = ...) extends FrameTransformer with Product with Serializable

    Posts rows as messages onto a Kafka topic.

  41. case class MongoInput(db: String, coll: String, schema: StructType, filter: Map[String, Any] = ..., sort: Iterable[SortOrder] = Nil, page: Page = Page.default) extends FrameProducer with Product with Serializable

    Reads documents from MongoDB.

  42. case class MongoOutput(db: String, coll: String) extends FrameTransformer with Product with Serializable

    Writes rows into a MongoDB collection.

  43. case class Page(limit: Int = 100, offset: Int = 0) extends Product with Serializable

    Used to limit the results returned from data store queries.

  44. trait PairFunctions extends AnyRef

    Supplies helper functions for pair RDDs.

  45. case class Pass() extends FrameTransformer with Product with Serializable

    A simple passthrough.

  46. case class Reduce(reducers: Iterable[(String, ReduceOp)], groupFields: Iterable[String] = Nil) extends FrameTransformer with PairFunctions with Product with Serializable

    Performs the reduceByKey() operation: groups the rows by the selected key first, then applies a list of reduce functions to the specified data columns.

  47. case class RequestInfo(url: URL, body: Option[String], headers: Map[String, String]) extends Product with Serializable

    Encapsulates HTTP request.

  48. case class RestClient(url: String, method: HttpMethod = HttpMethod.GET, body: Option[String] = None, headers: Map[String, String] = ..., resultField: Option[String] = Some("result"), statusField: Option[String] = Some("status"), headersField: Option[String] = None) extends FrameTransformer with Product with Serializable

    HTTP REST Client, executes one request per row.

  49. trait RowAggregator[U] extends Serializable

    Aggregates a data row into some arbitrary class U using Spark's aggregateByKey method.

  50. case class SQLQuery(query: String) extends FrameMerger with Product with Serializable

    Executes an SQL statement against the inputs.

  51. sealed trait SelectAction extends Serializable

    Action performed by SelectValues step.

  52. case class SelectValues(actions: Iterable[SelectAction]) extends FrameTransformer with Product with Serializable

    Modifies, deletes, or retains columns in the data rows.

  53. case class SetVariables(vars: Map[String, Any]) extends FrameTransformer with Product with Serializable

    Sets or drops the ignition runtime variables.

  54. case class SortOrder(field: String, ascending: Boolean = true) extends Product with Serializable

    A sorting order.

  55. trait SparkRuntime extends FlowRuntime with Serializable

    Encapsulates the Spark context and SQL context and provides helper functions to manage the Spark runtime environment.

  56. implicit final class StringToLiteral extends AnyVal

    An implicit conversion from a String into a literal value.

  57. case class TextFileInput(path: String, separator: Option[String] = None, dataField: String = "content") extends FrameProducer with Product with Serializable

    Reads a text file into a data frame with a single column.

  58. case class TextFileOutput(path: String, fields: Iterable[(String, String)], separator: String = ",", outputHeader: Boolean = true) extends FrameTransformer with Product with Serializable

    Writes rows to a CSV file.

  59. case class TextFolderInput(path: String, nameField: String = "filename", dataField: String = "content") extends FrameProducer with Product with Serializable

    Reads a folder of text files.

  60. case class Union() extends FrameMerger with Product with Serializable

    Merges multiple DataFrames.

  61. case class VarLiteral(name: String) extends Product with Serializable

    Variable literal.

  62. case class Where(cql: String, values: Any*) extends Product with Serializable

    CQL WHERE clause.

  63. type XmlFrameStepFactory = XmlStepFactory[FrameStep, DataFrame, SparkRuntime]
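
The small helper case classes above (Where, Page, SortOrder) are plain constructors. A sketch of how they might feed a data store input, using only the signatures listed on this page; the keyspace, table, and column names are made up for the example:

```scala
// CQL WHERE clause with positional bind values (Where signature above).
val where = Where("age > ? AND city = ?", 30, "Boston")

// Pagination and sorting for data store queries (Page / SortOrder signatures above).
val page = Page(limit = 50, offset = 100)
val byNameDesc = SortOrder("name", ascending = false)

// Hypothetical use with the Cassandra reader (CassandraInput signature above):
val users = CassandraInput("my_keyspace", "users",
  columns = Seq("name", "age", "city"),
  where = Some(where))
```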

Value Members

  1. object AddFields extends Serializable

    Add Fields companion object.

  2. object BasicAggregator extends Enumeration

    Basic aggregate functions.

  3. object BasicStats extends Serializable

    Basic Stats companion object.

  4. object CassandraInput extends Serializable

    Cassandra Input companion object.

  5. object CassandraOutput extends Serializable

    Cassandra Output companion object.

  6. object CsvFileInput extends Serializable

    CSV Input companion object.

  7. object DataGrid extends Serializable

    Data grid companion object.

  8. object DebugOutput extends Serializable

    Debug output companion object.

  9. object Filter extends Serializable

    Filter companion object.

  10. object Formula extends Serializable

    Formula companion object.

  11. object FrameFlow extends Serializable

    FrameFlow companion object.

  12. object FrameStep extends Serializable

    Constants and helper functions for FrameStep.

  13. object FrameStepFactory extends XmlFrameStepFactory with JsonFrameStepFactory

    Creates FrameStep instances from Xml and Json.

  14. object FrameSubFlow extends SubFlowFactory[FrameStep, DataFrame, SparkRuntime] with Serializable

    Provides SubFlow common methods.

  15. object HttpMethod extends Enumeration

    HTTP method.

  16. object Intersection extends Serializable

    Intersection companion object.

  17. object Invoke extends Serializable

    Invoke companion object.

  18. object Join extends Serializable

    Join companion object.

  19. object JoinType extends Enumeration

    DataFrame join type.

  20. object JsonFileInput extends Serializable

    JSON file input companion object.

  21. object KafkaInput extends Serializable

    Kafka Input companion object.

  22. object KafkaOutput extends Serializable

    Kafka Output companion object.

  23. object Main

    The entry point for starting ignition frame flows.

  24. object MongoInput extends Serializable

    Mongo Input companion object.

  25. object MongoOutput extends Serializable

    Mongo Output companion object.

  26. object Page extends Serializable

  27. object Pass extends Serializable

    Passthrough companion object.

  28. object Reduce extends Serializable

    Reduce companion object.

  29. object ReduceOp extends Enumeration

    Reduce operations.

  30. object RestClient extends Serializable

    REST Client companion object.

  31. object SQLQuery extends Serializable

    SQL query companion object.

  32. object SelectAction extends Serializable

    Supported actions.

  33. object SelectValues extends Serializable

    Select Values companion object.

  34. object SetVariables extends Serializable

    SetVariables companion object.

  35. object TextFileInput extends Serializable

    Text File Input companion object.

  36. object TextFileOutput extends Serializable

    Text File Output companion object.

  37. object TextFolderInput extends Serializable

    Text Folder Input companion object.

  38. object Union extends Serializable

    Union companion object.

  39. object Where extends Serializable

    CQL WHERE companion object.

  40. def injectAll(row: Row, indexMap: Map[String, Int])(expr: String)(implicit runtime: SparkRuntime): String

    Injects row fields, environment settings and variables into the string.

  41. def injectFields(row: Row, indexMap: Map[String, Int])(expr: String): String

    Injects the fields from the specified row by replacing substrings of the form ${field} with the value of the corresponding field.

  42. def injectGlobals(expr: String)(implicit runtime: SparkRuntime): String

    Injects JVM environment variables and Spark variables by substituting e{env} and v{var} patterns in the expression.

  43. package mllib
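
The substitution performed by injectFields can be illustrated as follows. The row contents and index map are made up for the example; the expected result follows from the ${field} replacement rule described above:

```scala
import org.apache.spark.sql.Row

val row = Row("jane", 33)
val indexMap = Map("name" -> 0, "age" -> 1)

// Each ${field} substring is replaced with the row value at indexMap(field):
val text = injectFields(row, indexMap)("Hello ${name}, you are ${age}")
// expected: "Hello jane, you are 33"
```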
