Storing the configuration in a DataFrame may seem counter-intuitive, but it keeps the storage cloud-agnostic. Otherwise, additional configuration would be needed to determine which cloud the job is running on and to handle each provider's native SDK object writers. Rather than re-inventing that wheel, a DataFrame can be serialized to any cloud-native storage medium with very little issue.
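A minimal sketch of this idea, assuming hypothetical names (`configToDataFrame`, `config_json`); it wraps the JSON payload in a one-row, one-column DataFrame so Spark's own writers handle the cloud-specific I/O:

```scala
import org.apache.spark.sql.{DataFrame, SparkSession}

// Sketch: wrap the JSON config string in a 1-row, 1-column DataFrame.
// Spark's writers then take care of the provider-specific storage layer
// (S3, ADLS, GCS, DBFS, ...) so no cloud SDK code is needed here.
def configToDataFrame(configJson: String)(implicit spark: SparkSession): DataFrame = {
  import spark.implicits._
  Seq(configJson).toDF("config_json")
}
```

The resulting DataFrame can then be persisted with any standard writer, e.g. `configToDataFrame(json).write.parquet(path)`.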
The inference configuration generated for a particular modeling run
A DataFrame consisting of a single row and a single field; cell 1:1 contains the JSON string.
Handler method for converting an InferenceMainConfig object to a serializable JSON string with correct Scala-compatible data structures.
An instance of InferenceMainConfig
[InferenceJsonReturn] consisting of a compact form (for logging) and a pretty-printed form (human-readable)
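A sketch of such a handler, assuming json4s for serialization and assuming InferenceJsonReturn is a simple two-field case class (both assumptions; the actual codebase may differ):

```scala
import org.json4s.{Formats, NoTypeHints}
import org.json4s.jackson.Serialization
import org.json4s.jackson.Serialization.{write, writePretty}

// Assumed shape of the return payload: one compact string, one pretty string.
case class InferenceJsonReturn(compactJson: String, prettyJson: String)

// InferenceMainConfig is assumed to be the case class defined elsewhere in the codebase.
def convertInferenceConfigToJson(config: InferenceMainConfig): InferenceJsonReturn = {
  implicit val formats: Formats = Serialization.formats(NoTypeHints)
  InferenceJsonReturn(
    compactJson = write(config),      // single-line form, suitable for logging
    prettyJson = writePretty(config)  // indented form, human-readable
  )
}
```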
Handler method for converting a read-in JSON config string to an instance of InferenceMainConfig.
The config as a JSON-formatted string
The config as an instance of InferenceMainConfig
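The reverse direction can be sketched as follows, again assuming json4s handles the (de)serialization (an assumption, not confirmed by this document):

```scala
import org.json4s.{Formats, NoTypeHints}
import org.json4s.jackson.Serialization
import org.json4s.jackson.JsonMethods.parse

// Deserialize the stored JSON string back into the config case class.
// InferenceMainConfig is assumed to be defined elsewhere in the codebase.
def convertJsonConfigToClass(json: String): InferenceMainConfig = {
  implicit val formats: Formats = Serialization.formats(NoTypeHints)
  parse(json).extract[InferenceMainConfig]
}
```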
Extract the InferenceMainConfig from a stored DataFrame containing the string-encoded JSON in row 1, column 1.
A DataFrame that contains the configuration for the inference run.
An instance of InferenceMainConfig
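A self-contained sketch of this extraction, combining the cell lookup with json4s deserialization (json4s and the method name are assumptions):

```scala
import org.apache.spark.sql.DataFrame
import org.json4s.{Formats, NoTypeHints}
import org.json4s.jackson.Serialization
import org.json4s.jackson.JsonMethods.parse

// Pull the JSON payload out of cell 1:1 and deserialize it into the config.
// InferenceMainConfig is assumed to be defined elsewhere in the codebase.
def extractInferenceConfigFromDataFrame(configDf: DataFrame): InferenceMainConfig = {
  implicit val formats: Formats = Serialization.formats(NoTypeHints)
  val json = configDf.first().getString(0)  // row 1, column 1
  parse(json).extract[InferenceMainConfig]
}
```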
From a supplied DataFrame that contains the configuration in cell 1:1, get the JSON string.
A DataFrame that contains the configuration for the inference run.
The string-encoded JSON payload for InferenceMainConfig
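This string accessor can be sketched in one line (the method name is a hypothetical stand-in):

```scala
import org.apache.spark.sql.DataFrame

// Return the raw JSON string stored in row 1, column 1 of the config DataFrame.
def getInferenceConfigFromDataFrame(configDf: DataFrame): String =
  configDf.first().getString(0)
```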