class DefaultSource extends RelationProvider with SchemaRelationProvider with CreatableRelationProvider
Spark searches for a class named DefaultSource in the given data source package, so use dataFrame.write.format("io.prophecy.libs.sources.jdbc") for writing.
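For example, a write through this source might look like the sketch below; the DataFrame contents and the JDBC-style option keys ("url", "dbtable") are assumptions for illustration, not documented parameters of this source.

```scala
import org.apache.spark.sql.SparkSession

// Hypothetical usage sketch; the option keys are assumed JDBC-style settings.
val spark = SparkSession.builder().appName("jdbc-sink-example").getOrCreate()
import spark.implicits._

val df = Seq((1, "a"), (2, "b")).toDF("id", "name")

df.write
  .format("io.prophecy.libs.sources.jdbc") // resolved to io.prophecy.libs.sources.jdbc.DefaultSource
  .option("url", "jdbc:postgresql://host:5432/db")
  .option("dbtable", "target_table")
  .mode("append")
  .save()
```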
READING IS NOT SUPPORTED
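A minimal sketch of how such a DefaultSource could be wired up is shown below. The class name and the three traits come from the declaration above; the method bodies (read overloads throwing, the write stub returning a relation with the DataFrame's schema) are assumptions, not the actual implementation.

```scala
package io.prophecy.libs.sources.jdbc

import org.apache.spark.sql.sources.{BaseRelation, CreatableRelationProvider, RelationProvider, SchemaRelationProvider}
import org.apache.spark.sql.types.StructType
import org.apache.spark.sql.{DataFrame, SQLContext, SaveMode}

// Illustrative sketch only: Spark instantiates the class named DefaultSource
// in the package passed to .format(...).
class DefaultSource
    extends RelationProvider
    with SchemaRelationProvider
    with CreatableRelationProvider {

  // Read path without a user-supplied schema: reading is not supported.
  override def createRelation(
      sqlContext: SQLContext,
      parameters: Map[String, String]): BaseRelation =
    throw new UnsupportedOperationException("Reading is not supported")

  // Read path with a user-supplied schema: also not supported.
  override def createRelation(
      sqlContext: SQLContext,
      parameters: Map[String, String],
      schema: StructType): BaseRelation =
    throw new UnsupportedOperationException("Reading is not supported")

  // Write path: invoked by dataFrame.write.format("io.prophecy.libs.sources.jdbc").save().
  override def createRelation(
      sqlContext: SQLContext,
      mode: SaveMode,
      parameters: Map[String, String],
      data: DataFrame): BaseRelation = {
    // The real JDBC write would happen here; this stub only returns a relation
    // describing what was written.
    val ctx = sqlContext
    new BaseRelation {
      override def sqlContext: SQLContext = ctx
      override def schema: StructType = data.schema
    }
  }
}
```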
TODO: Use proper logging instead of print
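One plausible way to address this TODO is plain slf4j, which is already on the Spark classpath; the object and method names below are made up for illustration.

```scala
import org.slf4j.{Logger, LoggerFactory}

// Hypothetical sketch of swapping print debugging for an slf4j logger.
object JdbcSinkLogging {
  private val logger: Logger = LoggerFactory.getLogger(getClass)

  def logWriteStart(table: String, rowCount: Long): Unit = {
    // Previously something like: println(s"Writing $rowCount rows to $table")
    logger.info(s"Writing $rowCount rows to $table")
  }
}
```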