
package connector

Tarantool connector for Apache Spark.

Call the tarantoolSpace method on the SparkContext object to create a TarantoolRDD that exposes a Tarantool space as a Spark RDD.

Example:

Execute the following on a Cartridge router node (the tarantool/crud module must be installed):

local crud = require('crud')

crud.insert('test_space', {1, nil, 'a1', 'Don Quixote', 'Miguel de Cervantes', 1605})
crud.insert('test_space', {2, nil, 'a2', 'The Great Gatsby', 'F. Scott Fitzgerald', 1925})
crud.insert('test_space', {3, nil, 'a3', 'War and Peace', 'Leo Tolstoy', 1869})

Write the following in your Scala client code:

import io.tarantool.spark.connector._

val sparkMasterHost = "127.0.0.1"
val tarantoolRouterAddress = "127.0.0.1:3301"
val space = "test_space"

// Populate the Spark config with the address of a Cartridge router node and credentials:
val conf = new SparkConf(true)
conf.set("tarantool.username", "admin")
conf.set("tarantool.password", "testapp-cluster-cookie")
conf.set("tarantool.hosts", tarantoolRouterAddress)

// Connect to the Spark cluster:
val sc = new SparkContext("spark://" + sparkMasterHost + ":7077", "example", conf)

// Read the space and print its contents:
val rdd = sc.tarantoolSpace(space)
rdd.collect().foreach(println)

sc.stop()
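For local experimentation, the same flow can be condensed into a single chained configuration with an in-process Spark master. This is a sketch, not part of the connector's documentation: it assumes a Tarantool router reachable at 127.0.0.1:3301 and uses only the configuration keys shown above.

```scala
import org.apache.spark.{SparkConf, SparkContext}
import io.tarantool.spark.connector._

// Local-mode variant of the example above: Spark runs in-process,
// so no standalone master is required. The tarantool.* keys are
// the same ones used in the cluster example.
val conf = new SparkConf(true)
  .setMaster("local[*]")
  .setAppName("example")
  .set("tarantool.username", "admin")
  .set("tarantool.password", "testapp-cluster-cookie")
  .set("tarantool.hosts", "127.0.0.1:3301")

val sc = new SparkContext(conf)
try {
  // tarantoolSpace is available through the implicit conversion
  // brought in by the io.tarantool.spark.connector._ import.
  sc.tarantoolSpace("test_space").collect().foreach(println)
} finally {
  sc.stop() // release the context even if the read fails
}
```

Wrapping the read in try/finally guarantees the SparkContext is stopped on failure, which the linear example above does not do.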
Linear Supertypes
AnyRef, Any

Package Members

  1. package config
  2. package connection
  3. package partition
  4. package rdd
  5. package util

Type Members

  1. trait Logging extends AnyRef

    Utility trait for classes that want to log data. Creates an SLF4J logger for the class and allows logging messages at different levels, using methods that evaluate their parameters lazily only if the corresponding log level is enabled.

    This is a copy of what Spark previously exposed in org.apache.spark.Logging. That class is now private, so similar functionality is exposed here.
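The lazy parameter evaluation described above is typically implemented with Scala by-name parameters. The following self-contained sketch illustrates the idea; the names are illustrative and a simple flag stands in for a real SLF4J level check, so this is not the connector's actual Logging trait:

```scala
// Illustrative sketch of lazy log-message evaluation via by-name
// parameters. A real Logging trait would consult an SLF4J logger's
// isDebugEnabled instead of the debugEnabled flag used here.
trait LazyLogging {
  protected def debugEnabled: Boolean

  // msg is a by-name parameter: the String is only constructed
  // if the level check passes.
  protected def logDebug(msg: => String): Unit =
    if (debugEnabled) println(s"DEBUG: $msg")
}

object Demo extends LazyLogging {
  override protected val debugEnabled = false
  var evaluations = 0

  def run(): Int = {
    // The message body increments `evaluations`; with debug logging
    // disabled, the block is never evaluated.
    logDebug { evaluations += 1; s"expensive message #$evaluations" }
    evaluations
  }
}
```

Because the argument is by-name, an expensive message (string interpolation, serialization of large values) costs nothing when the level is disabled.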

  2. class SparkContextFunctions extends Serializable

    Spark API for Tarantool. Provides Tarantool-specific methods on SparkContext.

  3. class SparkContextJavaFunctions extends AnyRef

    Java API for bridging SparkContextFunctions functionality into Java code.

  4. final class TarantoolSpark extends AnyRef

    Public facade for using the Tarantool Spark API in Java.

    Provides static factory methods as entrypoints for building RDDs and other Spark API entities.

  5. trait TarantoolSparkException extends TarantoolException

    Generic type for all module exceptions.

Value Members

  1. implicit def toSparkContextFunctions(sc: SparkContext): SparkContextFunctions
  2. object TarantoolSparkException extends Serializable
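The implicit toSparkContextFunctions conversion is what makes sc.tarantoolSpace(space) in the example above compile: importing io.tarantool.spark.connector._ enriches SparkContext with the connector's methods. The self-contained sketch below shows the same "enrich my library" pattern using stub classes with illustrative names, so it can run without a Spark or Tarantool dependency:

```scala
import scala.language.implicitConversions

// Stand-ins for SparkContext and SparkContextFunctions, so the
// enrichment pattern can be shown without a Spark dependency.
class FakeSparkContext(val appName: String)

class FakeContextFunctions(sc: FakeSparkContext) {
  // A connector-style method added "onto" the context.
  def describeSpace(space: String): String =
    s"${sc.appName}: space '$space'"
}

object FakeConnector {
  // Analogous to toSparkContextFunctions: the compiler applies this
  // conversion when describeSpace is called on a FakeSparkContext.
  implicit def toFakeContextFunctions(sc: FakeSparkContext): FakeContextFunctions =
    new FakeContextFunctions(sc)
}

object ConversionDemo {
  import FakeConnector._

  def run(): String =
    new FakeSparkContext("example").describeSpace("test_space")
}
```

The conversion only fires where it is in scope, which is why the example at the top of this page begins with import io.tarantool.spark.connector._.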
