Datasets and programs.
Interface | Description |
---|---|
Application&lt;T extends Config&gt; | Defines a CDAP Application. |
ApplicationConfigurer | Configures a CDAP Application. |
ApplicationContext&lt;T extends Config&gt; | Provides access to the environment, application configuration, and application (deployment) arguments. |
ApplicationSpecification | Application Specification used in core code. |
ApplicationUpdateContext | Context for updating Application configs. |
AppStateStore | Saves and reads state for the given app. |
RuntimeConfigurer | The runtime configurer that can be obtained when the app is reconfigured before the actual program run. |
Class | Description |
---|---|
AbstractApplication&lt;T extends Config&gt; | A support class for Applications which reduces repetition and results in a more readable configuration. |
ApplicationUpdateResult&lt;T extends Config&gt; | Stores the results of upgrading an application config. |
Enum | Description |
---|---|
ApplicationConfigUpdateAction | Possible update actions for application config. |
ProgramType | Defines the types of programs supported by the system. |
Every Application must either implement the Application interface or extend the AbstractApplication class. Extending AbstractApplication is simpler and helps produce cleaner code.

To create a CDAP Application, begin by extending the AbstractApplication class, then implement its configure() method. In the configure() method, specify the Application's metadata (its name and description), and declare and configure each of the Application's elements.
Example usage:

```java
public class MyApp extends AbstractApplication {
  @Override
  public void configure() {
    setName("myApp");
    setDescription("My Sample Application");
    createDataset("myCounters", "KeyValueTable");
    addSpark(new MySparkJob());
    addMapReduce(new MyMapReduceJob());
    addWorkflow(new MyAppWorkflow());
  }
}
```
A Dataset defines the storage and retrieval of data. In addition to the several Dataset implementations CDAP provides, you can also implement your own Custom Datasets. See Dataset for details.
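A dataset declared by the Application can then be used from a program by name. Below is a minimal sketch, not a definitive implementation: it assumes the "myCounters" KeyValueTable from the example above, and uses CDAP's @UseDataSet annotation in an HTTP service handler (CounterHandler is a hypothetical class name):

```java
import io.cdap.cdap.api.annotation.UseDataSet;
import io.cdap.cdap.api.dataset.lib.KeyValueTable;
import io.cdap.cdap.api.service.http.AbstractHttpServiceHandler;

// Sketch: a service handler that updates the "myCounters" dataset.
public class CounterHandler extends AbstractHttpServiceHandler {

  // Injects the dataset declared with createDataset("myCounters", "KeyValueTable")
  // in the Application's configure() method.
  @UseDataSet("myCounters")
  private KeyValueTable counters;

  // Hypothetical helper invoked by an endpoint method (omitted here):
  // increments a counter key by one.
  private void recordHit(String key) {
    counters.increment(key.getBytes(), 1L);
  }
}
```

The injected field is managed by the CDAP runtime; the program only declares which dataset it needs, and reads and writes happen within the transactions the framework provides.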
A Spark program defines the processing of data using Spark.

A MapReduce program processes data in batch using MapReduce.

A Workflow program orchestrates a series of MapReduce or Spark jobs.
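The orchestration described above can be sketched as a Workflow configuration. This is a minimal sketch, assuming the MyMapReduceJob and MySparkJob programs from the earlier example and CDAP's AbstractWorkflow support class; within a Workflow, programs are referenced by name:

```java
import io.cdap.cdap.api.workflow.AbstractWorkflow;

// Sketch: a workflow that runs a MapReduce job, then a Spark job, in sequence.
public class MyAppWorkflow extends AbstractWorkflow {
  @Override
  public void configure() {
    setName("MyAppWorkflow");
    setDescription("Runs MyMapReduceJob followed by MySparkJob");
    // Nodes execute in the order they are added.
    addMapReduce("MyMapReduceJob");
    addSpark("MySparkJob");
  }
}
```

The Workflow itself is registered in the Application's configure() method via addWorkflow(new MyAppWorkflow()), as shown in the earlier example.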
Copyright © 2024 Cask Data, Inc. Licensed under the Apache License, Version 2.0.