Deprecated API
Contents

- Interfaces
- Classes
- Exceptions
- Fields
- Methods
- Annotation Type Elements
Interfaces

- org.apache.flink.table.descriptors.DescriptorValidator: See Descriptor for details.
- org.apache.flink.table.legacy.api.constraints.Constraint: See ResolvedSchema and Constraint.
- org.apache.flink.table.legacy.connector.source.AsyncTableFunctionProvider: Please use AsyncLookupFunctionProvider to implement an asynchronous lookup table.
- org.apache.flink.table.legacy.connector.source.TableFunctionProvider: Please use LookupFunctionProvider to implement a synchronous lookup table.
- org.apache.flink.table.legacy.descriptors.Descriptor: Descriptor was primarily used for the legacy connector stack and has been deprecated. Use TableDescriptor for creating sources and sinks from the Table API, as shown in the sketch after this list.
- org.apache.flink.table.legacy.factories.TableFactory: This interface has been replaced by Factory.
- org.apache.flink.table.legacy.factories.TableSinkFactory: This interface has been replaced by DynamicTableSinkFactory. The new interface consumes internal data structures. See FLIP-95 for more information.
- org.apache.flink.table.legacy.factories.TableSourceFactory: This interface has been replaced by DynamicTableSourceFactory. The new interface produces internal data structures. See FLIP-95 for more information.
- org.apache.flink.table.legacy.sinks.OverwritableTableSink: This interface will not be supported in the new sink design around DynamicTableSink. Use SupportsOverwrite instead. See FLIP-95 for more information.
- org.apache.flink.table.legacy.sinks.PartitionableTableSink: This interface will not be supported in the new sink design around DynamicTableSink. Use SupportsPartitioning instead. See FLIP-95 for more information.
- org.apache.flink.table.legacy.sinks.TableSink: This interface has been replaced by DynamicTableSink. The new interface consumes internal data structures. See FLIP-95 for more information.
- org.apache.flink.table.legacy.sources.DefinedFieldMapping: This interface will not be supported in the new source design around DynamicTableSource. See FLIP-95 for more information.
- org.apache.flink.table.legacy.sources.DefinedProctimeAttribute: This interface will not be supported in the new source design around DynamicTableSource. Use the concept of computed columns instead. See FLIP-95 for more information.
- org.apache.flink.table.legacy.sources.DefinedRowtimeAttributes: This interface will not be supported in the new source design around DynamicTableSource. Use the concept of computed columns instead. See FLIP-95 for more information.
- org.apache.flink.table.legacy.sources.FieldComputer: This interface will not be supported in the new source design around DynamicTableSource. Use the concept of computed columns instead. See FLIP-95 for more information.
- org.apache.flink.table.legacy.sources.FilterableTableSource: This interface will not be supported in the new source design around DynamicTableSource. Use SupportsFilterPushDown instead. See FLIP-95 for more information.
- org.apache.flink.table.legacy.sources.LimitableTableSource: This interface will not be supported in the new source design around DynamicTableSource. Use SupportsLimitPushDown instead. See FLIP-95 for more information.
- org.apache.flink.table.legacy.sources.LookupableTableSource: This interface will not be supported in the new source design around DynamicTableSource. Use LookupTableSource instead. See FLIP-95 for more information.
- org.apache.flink.table.legacy.sources.NestedFieldsProjectableTableSource: This interface will not be supported in the new source design around DynamicTableSource. Use SupportsProjectionPushDown instead. See FLIP-95 for more information.
- org.apache.flink.table.legacy.sources.PartitionableTableSource: This interface will not be supported in the new source design around DynamicTableSource. Use SupportsPartitionPushDown instead. See FLIP-95 for more information.
- org.apache.flink.table.legacy.sources.ProjectableTableSource: This interface will not be supported in the new source design around DynamicTableSource. Use SupportsProjectionPushDown instead. See FLIP-95 for more information.
- org.apache.flink.table.legacy.sources.TableSource: This interface has been replaced by DynamicTableSource. The new interface produces internal data structures. See FLIP-95 for more information.
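As a concrete illustration of the Descriptor to TableDescriptor migration above, here is a minimal sketch of declaring a source programmatically. The connector name ("datagen"), its option, and the column names are illustrative choices, not part of the deprecation notes.

```java
import org.apache.flink.table.api.DataTypes;
import org.apache.flink.table.api.EnvironmentSettings;
import org.apache.flink.table.api.Schema;
import org.apache.flink.table.api.TableDescriptor;
import org.apache.flink.table.api.TableEnvironment;

public class TableDescriptorExample {
    public static void main(String[] args) {
        TableEnvironment tableEnv =
                TableEnvironment.create(EnvironmentSettings.inStreamingMode());

        // Declare connector, options, and an unresolved Schema instead of using
        // the deprecated Descriptor stack.
        TableDescriptor source =
                TableDescriptor.forConnector("datagen") // connector name is an example
                        .schema(
                                Schema.newBuilder()
                                        .column("id", DataTypes.BIGINT())
                                        .column("name", DataTypes.STRING())
                                        .build())
                        .option("rows-per-second", "5") // option is an example
                        .build();

        tableEnv.createTemporaryTable("ExampleSource", source);
    }
}
```

The descriptor stays unresolved (connector, options, Schema); validation and resolution happen when the framework registers the table.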
Classes

- org.apache.flink.table.dataview.ListViewSerializer
- org.apache.flink.table.dataview.ListViewSerializerSnapshot
- org.apache.flink.table.dataview.ListViewTypeInfo
- org.apache.flink.table.dataview.MapViewSerializer
- org.apache.flink.table.dataview.MapViewSerializerSnapshot
- org.apache.flink.table.dataview.MapViewTypeInfo
- org.apache.flink.table.dataview.NullAwareMapSerializer
- org.apache.flink.table.dataview.NullAwareMapSerializerSnapshot
- org.apache.flink.table.descriptors.ConnectorDescriptorValidator
- org.apache.flink.table.descriptors.DescriptorProperties: This utility will be dropped soon. DynamicTableFactory is based on ConfigOption and catalogs use CatalogPropertiesUtil.
- org.apache.flink.table.descriptors.FileSystemValidator: The legacy CSV connector has been replaced by FileSource/FileSink. It is kept only to support tests for the legacy connector stack.
- org.apache.flink.table.factories.TableFactoryService
- org.apache.flink.table.factories.TableSinkFactoryContextImpl
- org.apache.flink.table.factories.TableSourceFactoryContextImpl
- org.apache.flink.table.functions.AggregateFunctionDefinition: Non-legacy functions can simply omit this wrapper for declarations.
- org.apache.flink.table.functions.LegacyUserDefinedFunctionInference
- org.apache.flink.table.functions.ScalarFunctionDefinition: Non-legacy functions can simply omit this wrapper for declarations.
- org.apache.flink.table.functions.TableAggregateFunctionDefinition: Non-legacy functions can simply omit this wrapper for declarations.
- org.apache.flink.table.functions.TableFunctionDefinition: Non-legacy functions can simply omit this wrapper for declarations.
- org.apache.flink.table.legacy.api.constraints.UniqueConstraint: See ResolvedSchema and UniqueConstraint.
- org.apache.flink.table.legacy.api.TableColumn: See ResolvedSchema and Column.
- org.apache.flink.table.legacy.api.TableSchema: This class has been deprecated as part of FLIP-164. It has been replaced by two more dedicated classes, Schema and ResolvedSchema. Use Schema for declaration in APIs; ResolvedSchema is offered by the framework after resolution and validation. See the sketch after this list.
- org.apache.flink.table.legacy.api.Types: This class will be removed in future versions as it uses the old type system. It is recommended to use DataTypes instead, which uses the new type system based on instances of DataType. Please make sure to use either the old or the new type system consistently to avoid unintended behavior. See the website documentation for more information.
- org.apache.flink.table.legacy.api.WatermarkSpec: See ResolvedSchema and WatermarkSpec.
- org.apache.flink.table.legacy.descriptors.Rowtime: This class was used for legacy connectors using Descriptor.
- org.apache.flink.table.legacy.descriptors.Schema: This class was used for legacy connectors using Descriptor.
- org.apache.flink.table.legacy.sources.RowtimeAttributeDescriptor: This interface will not be supported in the new source design around DynamicTableSource. Use the concept of computed columns instead. See FLIP-95 for more information.
- org.apache.flink.table.legacy.sources.tsextractors.TimestampExtractor: This interface will not be supported in the new source design around DynamicTableSource. Use the concept of computed columns instead. See FLIP-95 for more information.
- org.apache.flink.table.legacy.types.logical.TypeInformationRawType: Use RawType instead.
- org.apache.flink.table.legacy.utils.TypeStringUtils: This utility is based on TypeInformation. However, the Table & SQL API is currently being updated to use DataTypes based on LogicalTypes. Use LogicalTypeParser instead.
- org.apache.flink.table.sinks.TableSinkBase: This class implements the deprecated TableSink interface. Implement DynamicTableSink directly instead.
- org.apache.flink.table.types.utils.LegacyTypeInfoDataTypeConverter: Use DataTypeFactory.createDataType(TypeInformation) instead. Note that this method will not create legacy types anymore. It fully uses the new type system that is available only in the planner.
- org.apache.flink.table.typeutils.TimeIndicatorTypeInfo: This class will be removed in future versions as it is used for the old type system. It is recommended to use DataTypes instead. Please make sure to use either the old or the new type system consistently to avoid unintended behavior. See the website documentation for more information.
- org.apache.flink.table.typeutils.TimeIntervalTypeInfo: This class will be removed in future versions as it is used for the old type system. It is recommended to use DataTypes instead. Please make sure to use either the old or the new type system consistently to avoid unintended behavior. See the website documentation for more information.
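To make the TableSchema/Types deprecations above concrete, here is a minimal sketch of the replacement pair: DataTypes for type declarations and Schema for an unresolved schema that the framework later resolves into a ResolvedSchema. The column names and types are illustrative.

```java
import org.apache.flink.table.api.DataTypes;
import org.apache.flink.table.api.Schema;
import org.apache.flink.table.types.DataType;

public class NewTypeSystemExample {
    public static void main(String[] args) {
        // DataTypes instead of the deprecated Types/TypeInformation-based declarations.
        DataType payload =
                DataTypes.ROW(
                        DataTypes.FIELD("id", DataTypes.BIGINT()),
                        DataTypes.FIELD("amount", DataTypes.DECIMAL(12, 2)));

        // Schema instead of the deprecated TableSchema; the framework resolves it
        // into a ResolvedSchema during validation.
        Schema schema =
                Schema.newBuilder()
                        .column("id", DataTypes.BIGINT())
                        .column("ts", DataTypes.TIMESTAMP_LTZ(3))
                        .columnByExpression("proc", "PROCTIME()")
                        .watermark("ts", "ts - INTERVAL '5' SECOND")
                        .build();

        System.out.println(payload);
        System.out.println(schema);
    }
}
```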
Exceptions

- org.apache.flink.table.api.AmbiguousTableFactoryException: This exception is considered internal and has been erroneously placed in the *.api package. It is replaced by AmbiguousTableFactoryException and should not be used directly anymore.
- org.apache.flink.table.api.ExpressionParserException: This exception is considered internal and has been erroneously placed in the *.api package. It is replaced by ExpressionParserException and should not be used directly anymore.
- org.apache.flink.table.api.NoMatchingTableFactoryException: This exception is considered internal and has been erroneously placed in the *.api package. It is replaced by NoMatchingTableFactoryException and should not be used directly anymore.
Fields

- org.apache.flink.table.legacy.descriptors.Schema.SCHEMA_TYPE
- org.apache.flink.table.module.CommonModuleOptions.MODULE_TYPE: This is only required for the legacy factory stack.
Methods

- org.apache.flink.table.annotation.FunctionHint.argument(): Use FunctionHint.arguments() instead.
- org.apache.flink.table.annotation.FunctionHint.argumentNames(): Use FunctionHint.arguments() instead.
- org.apache.flink.table.annotation.ProcedureHint.argument(): Use ProcedureHint.arguments() instead.
- org.apache.flink.table.annotation.ProcedureHint.argumentNames(): Use ProcedureHint.arguments() instead.
- org.apache.flink.table.catalog.CatalogBaseTable.getSchema(): This method returns the deprecated TableSchema class. The old class was a hybrid of resolved and unresolved schema information. It has been replaced by the new Schema, which is always unresolved and will be resolved by the framework later.
- org.apache.flink.table.catalog.ResolvedCatalogBaseTable.getSchema(): This method returns the deprecated TableSchema class. The old class was a hybrid of resolved and unresolved schema information. It has been replaced by the new ResolvedSchema, which is resolved by the framework and accessible via ResolvedCatalogBaseTable.getResolvedSchema().
- org.apache.flink.table.functions.BuiltInFunctionDefinition.Builder.namedArguments(String...)
- org.apache.flink.table.functions.BuiltInFunctionDefinition.Builder.typedArguments(DataType...)
- org.apache.flink.table.functions.ImperativeAggregateFunction.getAccumulatorType(): This method uses the old type system and is based on the old reflective extraction logic. The method will be removed in future versions and is only called when using the deprecated TableEnvironment.registerFunction(...) method. The new reflective extraction logic (possibly enriched with DataTypeHint and FunctionHint) should be powerful enough to cover most use cases. For advanced users, it is possible to override UserDefinedFunction.getTypeInference(DataTypeFactory). See the sketch after this list.
- org.apache.flink.table.functions.ImperativeAggregateFunction.getResultType(): This method uses the old type system and is based on the old reflective extraction logic. The method will be removed in future versions and is only called when using the deprecated TableEnvironment.registerFunction(...) method. The new reflective extraction logic (possibly enriched with DataTypeHint and FunctionHint) should be powerful enough to cover most use cases. For advanced users, it is possible to override UserDefinedFunction.getTypeInference(DataTypeFactory).
- org.apache.flink.table.functions.ScalarFunction.getParameterTypes(Class<?>[]): This method uses the old type system and is based on the old reflective extraction logic. The method will be removed in future versions and is only called when using the deprecated TableEnvironment.registerFunction(...) method. The new reflective extraction logic (possibly enriched with DataTypeHint and FunctionHint) should be powerful enough to cover most use cases. For advanced users, it is possible to override UserDefinedFunction.getTypeInference(DataTypeFactory).
- org.apache.flink.table.functions.ScalarFunction.getResultType(Class<?>[]): This method uses the old type system and is based on the old reflective extraction logic. The method will be removed in future versions and is only called when using the deprecated TableEnvironment.registerFunction(...) method. The new reflective extraction logic (possibly enriched with DataTypeHint and FunctionHint) should be powerful enough to cover most use cases. For advanced users, it is possible to override UserDefinedFunction.getTypeInference(DataTypeFactory).
- org.apache.flink.table.functions.TableFunction.getParameterTypes(Class<?>[]): This method uses the old type system and is based on the old reflective extraction logic. The method will be removed in future versions and is only called when using the deprecated TableEnvironment.registerFunction(...) method. The new reflective extraction logic (possibly enriched with DataTypeHint and FunctionHint) should be powerful enough to cover most use cases. For advanced users, it is possible to override UserDefinedFunction.getTypeInference(DataTypeFactory).
- org.apache.flink.table.functions.TableFunction.getResultType(): This method uses the old type system and is based on the old reflective extraction logic. The method will be removed in future versions and is only called when using the deprecated TableEnvironment.registerFunction(...) method. The new reflective extraction logic (possibly enriched with DataTypeHint and FunctionHint) should be powerful enough to cover most use cases. For advanced users, it is possible to override UserDefinedFunction.getTypeInference(DataTypeFactory).
- org.apache.flink.table.legacy.api.TableColumn.of(String, DataType): Use TableColumn.physical(String, DataType) instead.
- org.apache.flink.table.legacy.api.TableSchema.Builder.field(String, TypeInformation<?>): This method will be removed in future versions as it uses the old type system. It is recommended to use TableSchema.Builder.field(String, DataType) instead, which uses the new type system based on DataTypes. Please make sure to use either the old or the new type system consistently to avoid unintended behavior. See the website documentation for more information.
- org.apache.flink.table.legacy.api.TableSchema.fromTypeInfo(TypeInformation<?>): This method will be removed soon. Use DataTypes to declare types.
- org.apache.flink.table.legacy.api.TableSchema.getFieldType(int): This method will be removed in future versions as it uses the old type system. It is recommended to use TableSchema.getFieldDataType(int) instead, which uses the new type system based on DataTypes. Please make sure to use either the old or the new type system consistently to avoid unintended behavior. See the website documentation for more information.
- org.apache.flink.table.legacy.api.TableSchema.getFieldTypes(): This method will be removed in future versions as it uses the old type system. It is recommended to use TableSchema.getFieldDataTypes() instead, which uses the new type system based on DataTypes. Please make sure to use either the old or the new type system consistently to avoid unintended behavior. See the website documentation for more information.
- org.apache.flink.table.legacy.api.TableSchema.toRowType(): Use TableSchema.toRowDataType() instead.
- org.apache.flink.table.legacy.descriptors.Schema.field(String, TypeInformation<?>): This method will be removed in future versions as it uses the old type system. Please use Schema.field(String, DataType) instead.
- org.apache.flink.table.legacy.factories.TableSinkFactory.createTableSink(Map<String, String>): TableSinkFactory.Context contains more information and already includes the table schema. Please use TableSinkFactory.createTableSink(Context) instead.
- org.apache.flink.table.legacy.factories.TableSourceFactory.createTableSource(Map<String, String>): TableSourceFactory.Context contains more information and already includes the table schema. Please use TableSourceFactory.createTableSource(Context) instead.
- org.apache.flink.table.legacy.sinks.TableSink.configure(String[], TypeInformation<?>[]): This method will be dropped in future versions. It is recommended to pass a static schema when instantiating the sink instead.
- org.apache.flink.table.legacy.sinks.TableSink.getFieldNames(): Use the field names of TableSink.getTableSchema() instead.
- org.apache.flink.table.legacy.sinks.TableSink.getFieldTypes(): Use the field types of TableSink.getTableSchema() instead.
- org.apache.flink.table.legacy.sinks.TableSink.getOutputType(): This method will be removed in future versions as it uses the old type system. It is recommended to use TableSink.getConsumedDataType() instead, which uses the new type system based on DataTypes. Please make sure to use either the old or the new type system consistently to avoid unintended behavior. See the website documentation for more information.
- org.apache.flink.table.legacy.sources.TableSource.getReturnType(): This method will be removed in future versions as it uses the old type system. It is recommended to use TableSource.getProducedDataType() instead, which uses the new type system based on DataTypes. Please make sure to use either the old or the new type system consistently to avoid unintended behavior. See the website documentation for more information.
- org.apache.flink.table.legacy.sources.TableSource.getTableSchema(): Table schema is a logical description of a table and should not be part of the physical TableSource. Define the schema when registering a table, either in DDL or in TableEnvironment#connect(...).
- org.apache.flink.table.types.inference.TypeInference.Builder.namedArguments(List<String>): Use TypeInference.Builder.staticArguments(List) instead.
- org.apache.flink.table.types.inference.TypeInference.Builder.optionalArguments(List<Boolean>): Use TypeInference.Builder.staticArguments(List) instead.
- org.apache.flink.table.types.inference.TypeInference.Builder.typedArguments(List<DataType>): Use TypeInference.Builder.staticArguments(List) instead.
- org.apache.flink.table.types.inference.TypeInference.getAccumulatorTypeStrategy(): Use TypeInference.getStateTypeStrategies() instead.
- org.apache.flink.table.types.inference.TypeInference.getNamedArguments(): Use TypeInference.getStaticArguments() instead.
- org.apache.flink.table.types.inference.TypeInference.getOptionalArguments(): Use TypeInference.getStaticArguments() instead.
- org.apache.flink.table.types.inference.TypeInference.getTypedArguments(): Use TypeInference.getStaticArguments() instead.
- org.apache.flink.table.types.utils.TypeConversions.fromDataTypeToLegacyInfo(DataType): Please do not use this method anymore. It will be removed soon and we should not make the removal more painful. Sources and sinks should use the conversion methods available in their context; within the planner, use either InternalTypeInfo or ExternalTypeInfo depending on the use case.
- org.apache.flink.table.types.utils.TypeConversions.fromLegacyInfoToDataType(TypeInformation<?>): Please do not use this method anymore. It will be removed soon and we should not make the removal more painful. Sources and sinks should use the conversion methods available in their context; within the planner, use either InternalTypeInfo or ExternalTypeInfo depending on the use case.
- org.apache.flink.table.utils.EncodingUtils.loadClass(String): Use EncodingUtils.loadClass(String, ClassLoader) instead, in order to explicitly provide the correct classloader.
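For the getResultType()/getParameterTypes()/getAccumulatorType() entries above, the replacement path is annotation-based type extraction plus createTemporarySystemFunction instead of the deprecated registerFunction(...). A minimal sketch, with an illustrative function name and logic:

```java
import java.math.BigDecimal;

import org.apache.flink.table.annotation.DataTypeHint;
import org.apache.flink.table.api.EnvironmentSettings;
import org.apache.flink.table.api.TableEnvironment;
import org.apache.flink.table.functions.ScalarFunction;

public class AnnotationBasedFunctionExample {

    /** The result type is declared via @DataTypeHint instead of overriding getResultType(...). */
    public static class ParseAmount extends ScalarFunction {
        public @DataTypeHint("DECIMAL(12, 2)") BigDecimal eval(String s) {
            return s == null ? null : new BigDecimal(s);
        }
    }

    public static void main(String[] args) {
        TableEnvironment tableEnv =
                TableEnvironment.create(EnvironmentSettings.inStreamingMode());

        // createTemporarySystemFunction replaces the deprecated registerFunction(...).
        tableEnv.createTemporarySystemFunction("ParseAmount", ParseAmount.class);
        tableEnv.executeSql("SELECT ParseAmount('12.30')").print();
    }
}
```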
Annotation Type Elements

- org.apache.flink.table.annotation.FunctionHint.argument(): Use FunctionHint.arguments() instead.
- org.apache.flink.table.annotation.FunctionHint.argumentNames(): Use FunctionHint.arguments() instead. See the sketch after this list.
- org.apache.flink.table.annotation.ProcedureHint.argument(): Use ProcedureHint.arguments() instead.
- org.apache.flink.table.annotation.ProcedureHint.argumentNames(): Use ProcedureHint.arguments() instead.
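The argument()/argumentNames() elements are folded into arguments(), which takes ArgumentHint entries that bundle an argument name with a DataTypeHint. A minimal sketch, assuming the ArgumentHint annotation of recent Flink releases; the function name, argument names, and types are illustrative:

```java
import org.apache.flink.table.annotation.ArgumentHint;
import org.apache.flink.table.annotation.DataTypeHint;
import org.apache.flink.table.annotation.FunctionHint;
import org.apache.flink.table.functions.ScalarFunction;

// arguments() with ArgumentHint replaces the deprecated argument()/argumentNames() pair.
@FunctionHint(
        arguments = {
            @ArgumentHint(name = "message", type = @DataTypeHint("STRING")),
            @ArgumentHint(name = "times", type = @DataTypeHint("INT"))
        },
        output = @DataTypeHint("STRING"))
public class RepeatFunction extends ScalarFunction {
    public String eval(String message, Integer times) {
        return message.repeat(times);
    }
}
```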