ConstraintHandling

dotty.tools.dotc.core.ConstraintHandling

Methods for adding constraints and solving them.

What goes into a Constraint as opposed to a ConstraintHandling?

Constraint code is purely functional: Operations get constraints and produce new ones. Constraint code does not have access to a type-comparer. Anything regarding lubs and glbs has to be done elsewhere.

By comparison: Constraint handlers are parts of type comparers and can use their functionality. Constraint handlers update the current constraint as a side effect.
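
A minimal sketch of this division of labor, using a toy model rather than the real dotc types (ToyConstraint, ToyHandler and the String stand-ins for types are invented for illustration; the actual trait works with Type, TypeParamRef and a Context):

  // Constraint side: purely functional, no access to a type comparer.
  final case class ToyConstraint(
      lower: Map[String, Set[String]],   // param -> known lower bounds
      upper: Map[String, Set[String]]):  // param -> known upper bounds
    def addLower(param: String, bound: String): ToyConstraint =
      copy(lower = lower.updated(param, lower.getOrElse(param, Set.empty) + bound))
    def addUpper(param: String, bound: String): ToyConstraint =
      copy(upper = upper.updated(param, upper.getOrElse(param, Set.empty) + bound))

  // Handler side: part of a "comparer", holds the current constraint and mutates it.
  trait ToyHandler:
    protected var constraint: ToyConstraint = ToyConstraint(Map.empty, Map.empty)
    protected def isSub(tp1: String, tp2: String): Boolean  // stand-in for subtype checking

    def addConstraint(param: String, bound: String, fromBelow: Boolean): Boolean =
      constraint =
        if fromBelow then constraint.addLower(param, bound)  // param >: bound
        else constraint.addUpper(param, bound)               // param <: bound
      true

  @main def toyHandlerDemo(): Unit =
    val handler = new ToyHandler:
      def isSub(tp1: String, tp2: String) = tp1 == tp2  // trivially reflexive "subtyping"
    println(handler.addConstraint("T", "Int", fromBelow = true))  // true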

Supertypes
class Object
trait Matchable
class Any

Members list

Type members

Classlikes

class LevelAvoidMap(topLevelVariance: Int, maxLevel: Int)(using x$3: Context) extends AvoidMap

An approximating map that prevents types nested deeper than maxLevel as well as WildcardTypes from leaking into the constraint.


Supertypes
class AvoidMap
class TypeMap
trait Type => Type
class Object
trait Matchable
class Any

Value members

Abstract methods

protected def constraint: Constraint
protected def constraint_=(c: Constraint): Unit
protected def isSame(tp1: Type, tp2: Type)(using Context): Boolean
protected def isSub(tp1: Type, tp2: Type)(using Context): Boolean

Concrete methods

protected def addBoundTransitively(param: TypeParamRef, rawBound: Type, isUpper: Boolean)(using Context): Boolean
protected def addConstraint(param: TypeParamRef, bound: Type, fromBelow: Boolean)(using Context): Boolean

Add constraint param <: bound if fromBelow is false, param >: bound otherwise. bound is assumed to be in normalized form, as specified in firstTry and secondTry of TypeComparer. In particular, it should not be an alias type, lazy ref, typevar, wildcard type, error type. In addition, upper bounds may not be AndTypes and lower bounds may not be OrTypes. This is assured by the way isSubType is organized.


protected def addLess(p1: TypeParamRef, p2: TypeParamRef)(using Context): Boolean
protected def addOneBound(param: TypeParamRef, rawBound: Type, isUpper: Boolean)(using Context): Boolean

protected def addToConstraint(tl: TypeLambda, tvars: List[TypeVar])(using Context): Boolean

Add type lambda tl, possibly with type variables tvars, to the current constraint and propagate all bounds.

Value parameters

tvars

See Constraint#add


final def approximation(param: TypeParamRef, fromBelow: Boolean, maxLevel: Int)(using Context): Type

Solve constraint set for given type parameter param. If fromBelow is true the parameter is approximated by its lower bound, otherwise it is approximated by its upper bound, unless the upper bound contains a reference to the parameter itself (such occurrences can arise for F-bounded types; addOneBound ensures that they never occur in the lower bound). The solved type is not allowed to contain references to types nested deeper than maxLevel. Wildcard types in bounds are approximated by their upper or lower bounds. The constraint is left unchanged.


Returns

the instantiating type
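
A rough illustration of the rule above with a toy model (ToyBounds and the String stand-ins are invented; in the F-bounded case this sketch simply falls back to the lower bound, which only approximates what dotc does):

  // Toy bounds for a single parameter: a lower and an upper "type".
  final case class ToyBounds(lo: String, hi: String)

  // Approximate a parameter by one of its bounds; the bounds map itself is not changed.
  def approximation(bounds: Map[String, ToyBounds], param: String, fromBelow: Boolean): String =
    val b = bounds(param)
    if fromBelow then b.lo
    else if b.hi.contains(param) then b.lo  // F-bounded upper bound mentions the param;
                                            // this toy falls back to the lower bound
    else b.hi

  @main def approximationDemo(): Unit =
    val c = Map(
      "X" -> ToyBounds(lo = "Nothing", hi = "Comparable[X]"),  // F-bounded style upper bound
      "Y" -> ToyBounds(lo = "Int",     hi = "AnyVal"))
    println(approximation(c, "X", fromBelow = false))  // Nothing: the upper bound is avoided
    println(approximation(c, "Y", fromBelow = true))   // Int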

final def assumedTrue(param: TypeParamRef)(using Context): Boolean

Is param assumed to be a sub- and super-type of any other type? This holds if TypeVarsMissContext is set unless param is a part of a MatchType that is currently normalized.


def atLevel(maxLevel: Int, param: TypeParamRef)(using Context): TypeParamRef

If param is nested deeper than maxLevel, try to instantiate it to a fresh type variable of level maxLevel and return the new variable. If this isn't possible, throw a TypeError.


def bounds(param: TypeParamRef)(using Context): TypeBounds

The current bounds of type parameter param


final def canConstrain(param: TypeParamRef): Boolean

Can param be constrained with new bounds?


def checkPropagated(msg: => String)(result: Boolean)(using Context): Boolean

Check that constraint is fully propagated. See comment in Config.checkConstraintsPropagated


def checkReset(): Unit
def dropTransparentTraits(tp: Type, bound: Type)(using Context): Type

If tp is an intersection such that some operands are transparent trait instances and others are not, replace as many transparent trait instances as possible with Any as long as the result is still a subtype of bound. But fall back to the original type if the resulting widened type is a supertype of all dropped types (since in this case the type was not a true intersection of transparent traits and other types to start with).
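
At the source level this is the mechanism behind transparent traits staying out of inferred types. A small example, assuming standard Scala 3 inference (the names Marker, Kind, Lo and Hi are made up):

  transparent trait Marker   // a transparent trait is not meant to show up in inferred types
  trait Kind
  object Lo extends Kind, Marker
  object Hi extends Kind, Marker

  def pick(cond: Boolean) =
    if cond then Lo else Hi
    // Widening Lo.type | Hi.type yields the intersection of common bases, Kind & Marker;
    // Marker is a transparent trait and Kind alone is still within the bound, so Marker is
    // dropped and the inferred result type is Kind rather than Kind & Marker.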


def fullBounds(param: TypeParamRef)(using Context): TypeBounds

Full bounds of param, including other lower/upper params.

Note that underlying operations perform subtype checks - for this reason, recursing on fullBounds of some param when comparing types might lead to infinite recursion. Consider bounds instead.


def fullLowerBound(param: TypeParamRef)(using Context): Type

The full lower bound of param includes both the nonParamBounds and the params in the constraint known to be <: param, except that params with a nestingLevel higher than param will be instantiated to a fresh param at a legal level. See the documentation of TypeVar for details.
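
A toy rendering of the difference between the non-param bounds and the full bounds; level checking and the fresh-parameter instantiation mentioned above are omitted, and all names are invented:

  // Per-parameter entry: bounds that are not themselves parameters, plus the
  // parameters ordered below/above this one.
  final case class ToyEntry(
      nonParamLo: Set[String], nonParamHi: Set[String],
      lowerParams: Set[String], upperParams: Set[String])

  // The full lower bound also folds in the parameters known to be <: param
  // (dotc combines them with a lub; this toy just collects them).
  def fullLowerBound(c: Map[String, ToyEntry], param: String): Set[String] =
    val e = c(param)
    e.nonParamLo ++ e.lowerParams

  @main def fullLowerBoundDemo(): Unit =
    val c = Map(
      "p1" -> ToyEntry(Set("Int"), Set("AnyVal"), Set.empty, Set("p2")),
      "p2" -> ToyEntry(Set.empty,  Set("Any"),    Set("p1"), Set.empty))
    println(fullLowerBound(c, "p2"))  // Set(p1): p1 is recorded as a lower parameter of p2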


def fullUpperBound(param: TypeParamRef)(using Context): Type

The full upper bound of param, see the documentation of fullLowerBound above.


final inline def inFrozenConstraint[T](op: => T): T
def instanceType(param: TypeParamRef, fromBelow: Boolean, widenUnions: Boolean, maxLevel: Int)(using Context): Type

The instance type of param in the current constraint (which contains param). If fromBelow is true, the instance type is the lub of the parameter's lower bounds; otherwise it is the glb of its upper bounds. However, a lower bound instantiation can be a singleton type only if the upper bound is also a singleton type. The instance type is not allowed to contain references to types nested deeper than maxLevel.


final def isSameTypeWhenFrozen(tp1: Type, tp2: Type)(using Context): Boolean
final protected def isSatisfiable(using Context): Boolean

Test whether the lower bounds of all parameters in this constraint are a solution to the constraint.


protected def isSubType(tp1: Type, tp2: Type, whenFrozen: Boolean)(using Context): Boolean
final def isSubTypeWhenFrozen(tp1: Type, tp2: Type)(using Context): Boolean
protected def legalBound(param: TypeParamRef, rawBound: Type, isUpper: Boolean)(using Context): Type

Approximate rawBound if needed to make it a legal bound of param by avoiding cycles, wildcards and types with a level strictly greater than its nestingLevel.

Note that level-checking must be performed here and cannot be delayed until instantiation because if we allow level-incorrect bounds, then we might end up reasoning with bad bounds outside of the scope where they are defined. This can lead to level-correct but unsound instantiations as demonstrated by tests/neg/i8900.scala.


def levelOK(level: Int, maxLevel: Int)(using Context): Boolean

Is level <= maxLevel or legal in the current context?


def location(using Context): String
protected def necessaryConstraintsOnly(using Context): Boolean

When collecting the constraints needed for a particular subtyping judgment to be true, we sometimes need to approximate the constraint set (see TypeComparer#either for example).

Normally, this means adding extra constraints which may not be necessary for the subtyping judgment to be true, but if this variable is set to true we will instead under-approximate and keep only the constraints that must always be present for the subtyping judgment to hold.

This is needed for GADT bounds inference to be sound, but it is also used when constraining a method call based on its expected type to avoid adding constraints that would later prevent us from typechecking method arguments, see or-inf.scala and and-inf.scala for examples.


def nestingLevel(param: TypeParamRef)(using Context): Int
final protected def subsumes(c1: Constraint, c2: Constraint, pre: Constraint)(using Context): Boolean

Constraint c1 subsumes constraint c2 if, under c2 as the current constraint, we have for all poly params p defined in c2 with bounds p >: L2 <: U2:

c1 defines p with bounds p >: L1 <: U1, and L2 <: L1, and U1 <: U2

Both c1 and c2 are required to derive from constraint pre, without adding any new type variables but possibly narrowing already registered ones with further bounds.
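
A toy version of this check, with integer intervals standing in for type bounds; the requirement that both constraints derive from pre is not modelled:

  final case class Interval(lo: Int, hi: Int)   // stand-in for a parameter's bounds L..U

  // c1 subsumes c2: for every parameter bounded in c2, c1 bounds it at least as tightly.
  def subsumes(c1: Map[String, Interval], c2: Map[String, Interval]): Boolean =
    c2.forall { case (param, Interval(l2, u2)) =>
      c1.get(param) match
        case Some(Interval(l1, u1)) => l2 <= l1 && u1 <= u2   // L2 <: L1 and U1 <: U2
        case None                   => false
    }

  @main def subsumesDemo(): Unit =
    val c1 = Map("p" -> Interval(3, 7))   // narrower bounds
    val c2 = Map("p" -> Interval(1, 9))   // wider bounds
    println(subsumes(c1, c2))  // true: c1's bounds imply c2's
    println(subsumes(c2, c1))  // false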


def widenInferred(inst: Type, bound: Type, widenUnions: Boolean)(using Context): Type

Widen inferred type inst with upper bound, according to the following rules:

  1. If inst is a singleton type, or a union containing some singleton types, widen (all) the singleton type(s), provided the result is a subtype of bound (i.e. inst.widenSingletons <:< bound succeeds with satisfiable constraint) and is not transparent according to isTransparent.
  2a. If inst is a union type and widenUnions is true, approximate the union type from above by an intersection of all common base types, provided the result is a subtype of bound.
  2b. If inst is a union type and widenUnions is false, turn it into a hard union type (except for unions | Null, which are kept in the state they were).
  3. Widen some irreducible applications of higher-kinded types to wildcard arguments (see @widenIrreducible).
  4. Drop transparent traits from intersections (see @dropTransparentTraits).

Don't do these widenings if bound is a subtype of scala.Singleton. Also, if the result of these widenings is a TypeRef to a module class, and this type ref is different from inst, replace by a TermRef to its source module instead.

At this point we also drop the @Repeated annotation to avoid inferring type arguments with it, as those could leak the annotation to users (see run/inferred-repeated-result).
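
A few source-level examples of how rules 1 and 2 and the Singleton exception show up in everyday inference, assuming standard Scala 3 behavior (flip and pin are made-up helpers):

  def flip: Boolean = scala.util.Random.nextBoolean()
  def pin[T <: Singleton](x: T): T = x    // bound is a subtype of Singleton: no widening

  val a = "x"                         // rule 1: the singleton type "x" widens, a: String
  val b = pin("x")                    // Singleton bound: b keeps the type "x"
  val c = if flip then "x" else "y"   // singletons in the union widen, c: String
  val d = if flip then 1 else "y"     // rule 2a: the soft union Int | String is approximated
                                      // from above by a common base type when widenUnions is on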


def widenIrreducible(tp: Type)(using Context): Type

If tp is an applied match type alias which is also an unreducible application of a higher-kinded type to a wildcard argument, widen to the match type's bound, in order to avoid an "unreducible application of higher-kinded type ... in inferred type" error in PostTyper. Fixes #11246.


inline def withUntrustedBounds(op: => Type): Type

Concrete fields

protected var canWidenAbstract: Boolean

Used for match type reduction: if false, we don't recognize an abstract type to be a subtype of any of its base classes. This is in place only at the toplevel; it is turned on again when we add parts of the scrutinee to the constraint.


protected var caseLambda: Type

Potentially a type lambda that is still instantiatable, even though the constraint is generally frozen.


We are currently comparing type lambdas. Used as a flag for optimization: when false, no need to do an expensive pruneLambdaParams


protected var frozenConstraint: Boolean

If the constraint is frozen we cannot add new bounds to the constraint.


protected var homogenizeArgs: Boolean

If set, align arguments S1, S2 when taking the glb T1 { X = S1 } & T2 { X = S2 } of a constraint upper bound for some type parameter. Aligning means computing S1 =:= S2 which may change the current constraint. See note in TypeComparer#distributeAnd.


protected var poisoned: Set[TypeParamRef]

Used for match type reduction. When an abstract type may not be widened, according to widenAbstractOKFor, we record it in this set, so that we can ultimately fail the reduction, but with all the information that comes out from continuing to widen the abstract type.
