@Generated(value="com.amazonaws:aws-java-sdk-code-generator") public class ProcessingS3Input extends Object implements Serializable, Cloneable, StructuredPojo
Configuration for downloading input data from Amazon S3 into the processing container.
| Constructor and Description | 
|---|
| ProcessingS3Input() | 

| Modifier and Type | Method and Description |
|---|---|
| ProcessingS3Input | clone() |
| boolean | equals(Object obj) |
| String | getLocalPath() The local path in your container where you want Amazon SageMaker to write input data to. |
| String | getS3CompressionType() Whether to GZIP-decompress the data in Amazon S3 as it is streamed into the processing container. |
| String | getS3DataDistributionType() Whether to distribute the data from Amazon S3 to all processing instances with `FullyReplicated`, or whether the data from Amazon S3 is shared by Amazon S3 key, downloading one shard of data to each processing instance. |
| String | getS3DataType() Whether you use an `S3Prefix` or a `ManifestFile` for the data type. |
| String | getS3InputMode() Whether to use `File` or `Pipe` input mode. |
| String | getS3Uri() The URI of the Amazon S3 prefix Amazon SageMaker downloads data required to run a processing job. |
| int | hashCode() |
| void | marshall(ProtocolMarshaller protocolMarshaller) Marshalls this structured data using the given ProtocolMarshaller. |
| void | setLocalPath(String localPath) The local path in your container where you want Amazon SageMaker to write input data to. |
| void | setS3CompressionType(String s3CompressionType) Whether to GZIP-decompress the data in Amazon S3 as it is streamed into the processing container. |
| void | setS3DataDistributionType(String s3DataDistributionType) Whether to distribute the data from Amazon S3 to all processing instances with `FullyReplicated`, or whether the data from Amazon S3 is shared by Amazon S3 key, downloading one shard of data to each processing instance. |
| void | setS3DataType(String s3DataType) Whether you use an `S3Prefix` or a `ManifestFile` for the data type. |
| void | setS3InputMode(String s3InputMode) Whether to use `File` or `Pipe` input mode. |
| void | setS3Uri(String s3Uri) The URI of the Amazon S3 prefix Amazon SageMaker downloads data required to run a processing job. |
| String | toString() Returns a string representation of this object. |
| ProcessingS3Input | withLocalPath(String localPath) The local path in your container where you want Amazon SageMaker to write input data to. |
| ProcessingS3Input | withS3CompressionType(ProcessingS3CompressionType s3CompressionType) Whether to GZIP-decompress the data in Amazon S3 as it is streamed into the processing container. |
| ProcessingS3Input | withS3CompressionType(String s3CompressionType) Whether to GZIP-decompress the data in Amazon S3 as it is streamed into the processing container. |
| ProcessingS3Input | withS3DataDistributionType(ProcessingS3DataDistributionType s3DataDistributionType) Whether to distribute the data from Amazon S3 to all processing instances with `FullyReplicated`, or whether the data from Amazon S3 is shared by Amazon S3 key, downloading one shard of data to each processing instance. |
| ProcessingS3Input | withS3DataDistributionType(String s3DataDistributionType) Whether to distribute the data from Amazon S3 to all processing instances with `FullyReplicated`, or whether the data from Amazon S3 is shared by Amazon S3 key, downloading one shard of data to each processing instance. |
| ProcessingS3Input | withS3DataType(ProcessingS3DataType s3DataType) Whether you use an `S3Prefix` or a `ManifestFile` for the data type. |
| ProcessingS3Input | withS3DataType(String s3DataType) Whether you use an `S3Prefix` or a `ManifestFile` for the data type. |
| ProcessingS3Input | withS3InputMode(ProcessingS3InputMode s3InputMode) Whether to use `File` or `Pipe` input mode. |
| ProcessingS3Input | withS3InputMode(String s3InputMode) Whether to use `File` or `Pipe` input mode. |
| ProcessingS3Input | withS3Uri(String s3Uri) The URI of the Amazon S3 prefix Amazon SageMaker downloads data required to run a processing job. |
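The with* methods return a ProcessingS3Input, so the fields can typically be configured fluently. A minimal sketch using the String overloads; the import path is assumed to be the SDK's SageMaker model package, and the bucket name, prefix, and class name are hypothetical:

```java
import com.amazonaws.services.sagemaker.model.ProcessingS3Input;

public class ProcessingS3InputExample {
    public static void main(String[] args) {
        // Values other than the method names are illustrative, not taken from this reference.
        ProcessingS3Input s3Input = new ProcessingS3Input()
                .withS3Uri("s3://my-bucket/processing/input/")
                .withLocalPath("/opt/ml/processing/input")      // must begin with /opt/ml/processing/
                .withS3DataType("S3Prefix")                      // or "ManifestFile"
                .withS3InputMode("File")                         // or "Pipe"
                .withS3DataDistributionType("FullyReplicated");

        System.out.println(s3Input); // toString() prints the configured fields
    }
}
```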
public void setS3Uri(String s3Uri)
The URI of the Amazon S3 prefix Amazon SageMaker downloads data required to run a processing job.
Parameters:
s3Uri - The URI of the Amazon S3 prefix Amazon SageMaker downloads data required to run a processing job.

public String getS3Uri()
The URI of the Amazon S3 prefix Amazon SageMaker downloads data required to run a processing job.

public ProcessingS3Input withS3Uri(String s3Uri)
The URI of the Amazon S3 prefix Amazon SageMaker downloads data required to run a processing job.
Parameters:
s3Uri - The URI of the Amazon S3 prefix Amazon SageMaker downloads data required to run a processing job.
public void setLocalPath(String localPath)
The local path in your container where you want Amazon SageMaker to write input data to. `LocalPath` is an absolute path to the input data and must begin with `/opt/ml/processing/`. `LocalPath` is a required parameter when `AppManaged` is `False` (default).
Parameters:
localPath - The local path in your container where you want Amazon SageMaker to write input data to. `LocalPath` is an absolute path to the input data and must begin with `/opt/ml/processing/`. `LocalPath` is a required parameter when `AppManaged` is `False` (default).

public String getLocalPath()
The local path in your container where you want Amazon SageMaker to write input data to. `LocalPath` is an absolute path to the input data and must begin with `/opt/ml/processing/`. `LocalPath` is a required parameter when `AppManaged` is `False` (default).
Returns:
The local path in your container where you want Amazon SageMaker to write input data to. `LocalPath` is an absolute path to the input data and must begin with `/opt/ml/processing/`. `LocalPath` is a required parameter when `AppManaged` is `False` (default).

public ProcessingS3Input withLocalPath(String localPath)
The local path in your container where you want Amazon SageMaker to write input data to. `LocalPath` is an absolute path to the input data and must begin with `/opt/ml/processing/`. `LocalPath` is a required parameter when `AppManaged` is `False` (default).
Parameters:
localPath - The local path in your container where you want Amazon SageMaker to write input data to. `LocalPath` is an absolute path to the input data and must begin with `/opt/ml/processing/`. `LocalPath` is a required parameter when `AppManaged` is `False` (default).
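Because `LocalPath` must be an absolute path under `/opt/ml/processing/`, a caller may want to validate the value before passing it to setLocalPath or withLocalPath. A small sketch; the class and helper names are hypothetical:

```java
final class LocalPathValidation {
    // Hypothetical helper: enforce the documented LocalPath constraint before the
    // value is handed to ProcessingS3Input.
    static String requireProcessingLocalPath(String localPath) {
        if (localPath == null || !localPath.startsWith("/opt/ml/processing/")) {
            throw new IllegalArgumentException(
                    "LocalPath must be an absolute path beginning with /opt/ml/processing/, got: " + localPath);
        }
        return localPath;
    }
}
```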
public void setS3DataType(String s3DataType)
Whether you use an `S3Prefix` or a `ManifestFile` for the data type. If you choose `S3Prefix`, `S3Uri` identifies a key name prefix. Amazon SageMaker uses all objects with the specified key name prefix for the processing job. If you choose `ManifestFile`, `S3Uri` identifies an object that is a manifest file containing a list of object keys that you want Amazon SageMaker to use for the processing job.
Parameters:
s3DataType - Whether you use an `S3Prefix` or a `ManifestFile` for the data type. If you choose `S3Prefix`, `S3Uri` identifies a key name prefix. Amazon SageMaker uses all objects with the specified key name prefix for the processing job. If you choose `ManifestFile`, `S3Uri` identifies an object that is a manifest file containing a list of object keys that you want Amazon SageMaker to use for the processing job.
See Also:
ProcessingS3DataType

public String getS3DataType()
Whether you use an `S3Prefix` or a `ManifestFile` for the data type. If you choose `S3Prefix`, `S3Uri` identifies a key name prefix. Amazon SageMaker uses all objects with the specified key name prefix for the processing job. If you choose `ManifestFile`, `S3Uri` identifies an object that is a manifest file containing a list of object keys that you want Amazon SageMaker to use for the processing job.
Returns:
Whether you use an `S3Prefix` or a `ManifestFile` for the data type. If you choose `S3Prefix`, `S3Uri` identifies a key name prefix. Amazon SageMaker uses all objects with the specified key name prefix for the processing job. If you choose `ManifestFile`, `S3Uri` identifies an object that is a manifest file containing a list of object keys that you want Amazon SageMaker to use for the processing job.
See Also:
ProcessingS3DataType

public ProcessingS3Input withS3DataType(String s3DataType)
Whether you use an `S3Prefix` or a `ManifestFile` for the data type. If you choose `S3Prefix`, `S3Uri` identifies a key name prefix. Amazon SageMaker uses all objects with the specified key name prefix for the processing job. If you choose `ManifestFile`, `S3Uri` identifies an object that is a manifest file containing a list of object keys that you want Amazon SageMaker to use for the processing job.
Parameters:
s3DataType - Whether you use an `S3Prefix` or a `ManifestFile` for the data type. If you choose `S3Prefix`, `S3Uri` identifies a key name prefix. Amazon SageMaker uses all objects with the specified key name prefix for the processing job. If you choose `ManifestFile`, `S3Uri` identifies an object that is a manifest file containing a list of object keys that you want Amazon SageMaker to use for the processing job.
See Also:
ProcessingS3DataType

public ProcessingS3Input withS3DataType(ProcessingS3DataType s3DataType)
Whether you use an `S3Prefix` or a `ManifestFile` for the data type. If you choose `S3Prefix`, `S3Uri` identifies a key name prefix. Amazon SageMaker uses all objects with the specified key name prefix for the processing job. If you choose `ManifestFile`, `S3Uri` identifies an object that is a manifest file containing a list of object keys that you want Amazon SageMaker to use for the processing job.
Parameters:
s3DataType - Whether you use an `S3Prefix` or a `ManifestFile` for the data type. If you choose `S3Prefix`, `S3Uri` identifies a key name prefix. Amazon SageMaker uses all objects with the specified key name prefix for the processing job. If you choose `ManifestFile`, `S3Uri` identifies an object that is a manifest file containing a list of object keys that you want Amazon SageMaker to use for the processing job.
See Also:
ProcessingS3DataType
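The typed overload accepts the `ProcessingS3DataType` enum. A sketch of a manifest-based input; the manifest key and class name are hypothetical, and the generated `fromValue` factory is assumed to map the documented string value onto the enum:

```java
import com.amazonaws.services.sagemaker.model.ProcessingS3DataType;
import com.amazonaws.services.sagemaker.model.ProcessingS3Input;

class ManifestInputExample {
    // S3Uri points at a manifest object; SageMaker reads the object keys listed in
    // the manifest rather than enumerating a key name prefix.
    static ProcessingS3Input manifestInput() {
        return new ProcessingS3Input()
                .withS3Uri("s3://my-bucket/manifests/input.manifest") // hypothetical manifest object
                .withLocalPath("/opt/ml/processing/input")
                .withS3DataType(ProcessingS3DataType.fromValue("ManifestFile"))
                .withS3InputMode("File");
    }
}
```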
public void setS3InputMode(String s3InputMode)
Whether to use `File` or `Pipe` input mode. In `File` mode, Amazon SageMaker copies the data from the input source onto the local ML storage volume before starting your processing container. This is the most commonly used input mode. In `Pipe` mode, Amazon SageMaker streams input data from the source directly to your processing container into named pipes without using the ML storage volume.
Parameters:
s3InputMode - Whether to use `File` or `Pipe` input mode. In `File` mode, Amazon SageMaker copies the data from the input source onto the local ML storage volume before starting your processing container. This is the most commonly used input mode. In `Pipe` mode, Amazon SageMaker streams input data from the source directly to your processing container into named pipes without using the ML storage volume.
See Also:
ProcessingS3InputMode

public String getS3InputMode()
Whether to use `File` or `Pipe` input mode. In `File` mode, Amazon SageMaker copies the data from the input source onto the local ML storage volume before starting your processing container. This is the most commonly used input mode. In `Pipe` mode, Amazon SageMaker streams input data from the source directly to your processing container into named pipes without using the ML storage volume.
Returns:
Whether to use `File` or `Pipe` input mode. In `File` mode, Amazon SageMaker copies the data from the input source onto the local ML storage volume before starting your processing container. This is the most commonly used input mode. In `Pipe` mode, Amazon SageMaker streams input data from the source directly to your processing container into named pipes without using the ML storage volume.
See Also:
ProcessingS3InputMode

public ProcessingS3Input withS3InputMode(String s3InputMode)
Whether to use `File` or `Pipe` input mode. In `File` mode, Amazon SageMaker copies the data from the input source onto the local ML storage volume before starting your processing container. This is the most commonly used input mode. In `Pipe` mode, Amazon SageMaker streams input data from the source directly to your processing container into named pipes without using the ML storage volume.
Parameters:
s3InputMode - Whether to use `File` or `Pipe` input mode. In `File` mode, Amazon SageMaker copies the data from the input source onto the local ML storage volume before starting your processing container. This is the most commonly used input mode. In `Pipe` mode, Amazon SageMaker streams input data from the source directly to your processing container into named pipes without using the ML storage volume.
See Also:
ProcessingS3InputMode

public ProcessingS3Input withS3InputMode(ProcessingS3InputMode s3InputMode)
Whether to use `File` or `Pipe` input mode. In `File` mode, Amazon SageMaker copies the data from the input source onto the local ML storage volume before starting your processing container. This is the most commonly used input mode. In `Pipe` mode, Amazon SageMaker streams input data from the source directly to your processing container into named pipes without using the ML storage volume.
Parameters:
s3InputMode - Whether to use `File` or `Pipe` input mode. In `File` mode, Amazon SageMaker copies the data from the input source onto the local ML storage volume before starting your processing container. This is the most commonly used input mode. In `Pipe` mode, Amazon SageMaker streams input data from the source directly to your processing container into named pipes without using the ML storage volume.
See Also:
ProcessingS3InputMode
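A sketch of a `Pipe`-mode input using the enum overload; the prefix and class name are hypothetical, and `fromValue` is assumed to be the generated string-to-enum factory:

```java
import com.amazonaws.services.sagemaker.model.ProcessingS3Input;
import com.amazonaws.services.sagemaker.model.ProcessingS3InputMode;

class PipeModeInputExample {
    // Pipe mode streams objects into named pipes inside the container instead of
    // staging them on the ML storage volume first.
    static ProcessingS3Input pipedInput() {
        return new ProcessingS3Input()
                .withS3Uri("s3://my-bucket/processing/large-input/") // hypothetical prefix
                .withLocalPath("/opt/ml/processing/input")
                .withS3DataType("S3Prefix")
                .withS3InputMode(ProcessingS3InputMode.fromValue("Pipe"));
    }
}
```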
public void setS3DataDistributionType(String s3DataDistributionType)
Whether to distribute the data from Amazon S3 to all processing instances with `FullyReplicated`, or whether the data from Amazon S3 is shared by Amazon S3 key, downloading one shard of data to each processing instance.
Parameters:
s3DataDistributionType - Whether to distribute the data from Amazon S3 to all processing instances with `FullyReplicated`, or whether the data from Amazon S3 is shared by Amazon S3 key, downloading one shard of data to each processing instance.
See Also:
ProcessingS3DataDistributionType

public String getS3DataDistributionType()
Whether to distribute the data from Amazon S3 to all processing instances with `FullyReplicated`, or whether the data from Amazon S3 is shared by Amazon S3 key, downloading one shard of data to each processing instance.
Returns:
Whether to distribute the data from Amazon S3 to all processing instances with `FullyReplicated`, or whether the data from Amazon S3 is shared by Amazon S3 key, downloading one shard of data to each processing instance.
See Also:
ProcessingS3DataDistributionType

public ProcessingS3Input withS3DataDistributionType(String s3DataDistributionType)
Whether to distribute the data from Amazon S3 to all processing instances with `FullyReplicated`, or whether the data from Amazon S3 is shared by Amazon S3 key, downloading one shard of data to each processing instance.
Parameters:
s3DataDistributionType - Whether to distribute the data from Amazon S3 to all processing instances with `FullyReplicated`, or whether the data from Amazon S3 is shared by Amazon S3 key, downloading one shard of data to each processing instance.
See Also:
ProcessingS3DataDistributionType

public ProcessingS3Input withS3DataDistributionType(ProcessingS3DataDistributionType s3DataDistributionType)
Whether to distribute the data from Amazon S3 to all processing instances with `FullyReplicated`, or whether the data from Amazon S3 is shared by Amazon S3 key, downloading one shard of data to each processing instance.
Parameters:
s3DataDistributionType - Whether to distribute the data from Amazon S3 to all processing instances with `FullyReplicated`, or whether the data from Amazon S3 is shared by Amazon S3 key, downloading one shard of data to each processing instance.
See Also:
ProcessingS3DataDistributionType
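When a job runs on several instances and each instance should download only one shard of the input, the distribution type is changed from `FullyReplicated`. A sketch, assuming the sharded value accepted by `S3DataDistributionType` is `ShardedByS3Key` (only `FullyReplicated` is named explicitly above); the prefix and class name are hypothetical:

```java
import com.amazonaws.services.sagemaker.model.ProcessingS3Input;

class ShardedInputExample {
    // Each processing instance downloads one shard of the objects under the prefix
    // instead of a full copy of the dataset.
    static ProcessingS3Input shardedInput() {
        return new ProcessingS3Input()
                .withS3Uri("s3://my-bucket/processing/sharded-input/") // hypothetical prefix
                .withLocalPath("/opt/ml/processing/input")
                .withS3DataType("S3Prefix")
                .withS3InputMode("File")
                .withS3DataDistributionType("ShardedByS3Key");          // assumed value
    }
}
```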
public void setS3CompressionType(String s3CompressionType)
Whether to GZIP-decompress the data in Amazon S3 as it is streamed into the processing container. `Gzip` can only be used when `Pipe` mode is specified as the `S3InputMode`. In `Pipe` mode, Amazon SageMaker streams input data from the source directly to your container without using the EBS volume.
Parameters:
s3CompressionType - Whether to GZIP-decompress the data in Amazon S3 as it is streamed into the processing container. `Gzip` can only be used when `Pipe` mode is specified as the `S3InputMode`. In `Pipe` mode, Amazon SageMaker streams input data from the source directly to your container without using the EBS volume.
See Also:
ProcessingS3CompressionType

public String getS3CompressionType()
Whether to GZIP-decompress the data in Amazon S3 as it is streamed into the processing container. `Gzip` can only be used when `Pipe` mode is specified as the `S3InputMode`. In `Pipe` mode, Amazon SageMaker streams input data from the source directly to your container without using the EBS volume.
Returns:
Whether to GZIP-decompress the data in Amazon S3 as it is streamed into the processing container. `Gzip` can only be used when `Pipe` mode is specified as the `S3InputMode`. In `Pipe` mode, Amazon SageMaker streams input data from the source directly to your container without using the EBS volume.
See Also:
ProcessingS3CompressionType

public ProcessingS3Input withS3CompressionType(String s3CompressionType)
Whether to GZIP-decompress the data in Amazon S3 as it is streamed into the processing container. `Gzip` can only be used when `Pipe` mode is specified as the `S3InputMode`. In `Pipe` mode, Amazon SageMaker streams input data from the source directly to your container without using the EBS volume.
Parameters:
s3CompressionType - Whether to GZIP-decompress the data in Amazon S3 as it is streamed into the processing container. `Gzip` can only be used when `Pipe` mode is specified as the `S3InputMode`. In `Pipe` mode, Amazon SageMaker streams input data from the source directly to your container without using the EBS volume.
See Also:
ProcessingS3CompressionType

public ProcessingS3Input withS3CompressionType(ProcessingS3CompressionType s3CompressionType)
Whether to GZIP-decompress the data in Amazon S3 as it is streamed into the processing container. `Gzip` can only be used when `Pipe` mode is specified as the `S3InputMode`. In `Pipe` mode, Amazon SageMaker streams input data from the source directly to your container without using the EBS volume.
Parameters:
s3CompressionType - Whether to GZIP-decompress the data in Amazon S3 as it is streamed into the processing container. `Gzip` can only be used when `Pipe` mode is specified as the `S3InputMode`. In `Pipe` mode, Amazon SageMaker streams input data from the source directly to your container without using the EBS volume.
See Also:
ProcessingS3CompressionType
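Because `Gzip` compression is only valid together with `Pipe` input mode, a compressed input is typically configured with both set, as in this sketch (prefix and class name are hypothetical):

```java
import com.amazonaws.services.sagemaker.model.ProcessingS3Input;

class GzipInputExample {
    // Gzip-compressed objects are decompressed as they are streamed into the container;
    // per the description above, Pipe mode is required for this setting.
    static ProcessingS3Input gzipInput() {
        return new ProcessingS3Input()
                .withS3Uri("s3://my-bucket/processing/compressed/") // hypothetical prefix
                .withLocalPath("/opt/ml/processing/input")
                .withS3DataType("S3Prefix")
                .withS3InputMode("Pipe")                             // required when S3CompressionType is Gzip
                .withS3CompressionType("Gzip");
    }
}
```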
public String toString()
Returns a string representation of this object.
Overrides:
toString in class Object
See Also:
Object.toString()

public ProcessingS3Input clone()

public void marshall(ProtocolMarshaller protocolMarshaller)
Marshalls this structured data using the given ProtocolMarshaller.
Specified by:
marshall in interface StructuredPojo
Parameters:
protocolMarshaller - Implementation of ProtocolMarshaller used to marshall this object's data.