%
% (c) The University of Glasgow 2006
% (c) The GRASP/AQUA Project, Glasgow University, 1992-1998
%
TcSimplify
\begin{code}
module TcSimplify (
tcSimplifyInfer, tcSimplifyInferCheck,
tcSimplifyCheck, tcSimplifyRestricted,
tcSimplifyRuleLhs, tcSimplifyIPs,
tcSimplifySuperClasses,
tcSimplifyTop, tcSimplifyInteractive,
tcSimplifyBracket, tcSimplifyCheckPat,
tcSimplifyDeriv, tcSimplifyDefault,
bindInstsOfLocalFuns,
misMatchMsg
) where
#include "HsVersions.h"
import TcUnify( unifyType )
import HsSyn
import TcRnMonad
import TcHsSyn ( hsLPatType )
import Inst
import TcEnv
import InstEnv
import TcType
import TcMType
import TcIface
import TcTyFuns
import DsUtils
import Var
import Id
import Name
import NameSet
import Class
import FunDeps
import PrelInfo
import PrelNames
import TysWiredIn
import ErrUtils
import BasicTypes
import VarSet
import VarEnv
import FiniteMap
import Bag
import Outputable
import ListSetOps
import Util
import SrcLoc
import DynFlags
import FastString
import Control.Monad
import Data.List
\end{code}
%************************************************************************
%* *
\subsection{NOTES}
%* *
%************************************************************************
Notes on functional dependencies (a bug)
Consider this:
class C a b | a -> b
class D a b | a -> b
instance D a b => C a b
f :: C a b => a -> Bool
f _ = True
g :: C a b => a -> Bool
g = f
Here f typechecks, but g does not!! Reason: before doing improvement,
we reduce the (C a b1) constraint from the call of f to (D a b1).
Here is a more complicated example:
@
> class Foo a b | a->b
>
> class Bar a b | a->b
>
> data Obj = Obj
>
> instance Bar Obj Obj
>
> instance (Bar a b) => Foo a b
>
> foo:: (Foo a b) => a -> String
> foo _ = "works"
>
> runFoo:: (forall a b. (Foo a b) => a -> w) -> w
> runFoo f = f Obj
*Test> runFoo foo
<interactive>:1:
Could not deduce (Bar a b) from the context (Foo a b)
arising from use of `foo' at <interactive>:1
Probable fix:
Add (Bar a b) to the expected type of an expression
In the first argument of `runFoo', namely `foo'
In the definition of `it': it = runFoo foo
Why all of a sudden does GHC need the constraint Bar a b? The
function foo didn't ask for that...
@
The trouble is that to type (runFoo foo), GHC has to solve the problem:
Given constraint Foo a b
Solve constraint Foo a b'
Notice that b and b' aren't the same. To solve this, just do
improvement and then they are the same. But GHC currently does
simplify constraints
apply improvement
and loop
That is usually fine, but it isn't here, because it sees that Foo a b is
not the same as Foo a b', and so instead applies the instance decl for
instance Bar a b => Foo a b. And that's where the Bar constraint comes
from.
The Right Thing is to improve whenever the constraint set changes at
all. Not hard in principle, but it'll take a bit of fiddling to do.
Note [Choosing which variables to quantify]
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Suppose we are about to do a generalisation step. We have in our hand
G the environment
T the type of the RHS
C the constraints from that RHS
The game is to figure out
Q the set of type variables over which to quantify
Ct the constraints we will *not* quantify over
Cq the constraints we will quantify over
So we're going to infer the type
forall Q. Cq => T
and float the constraints Ct further outwards.
Here are the things that *must* be true:
(A) Q intersect fv(G) = EMPTY limits how big Q can be
(B) Q superset fv(Cq union T) \ oclose(fv(G),C) limits how small Q can be
(A) says we can't quantify over a variable that's free in the environment.
(B) says we must quantify over all the truly free variables in T, else
we won't get a sufficiently general type.
We do not *need* to quantify over any variable that is fixed by the
free vars of the environment G.
BETWEEN THESE TWO BOUNDS, ANY Q WILL DO!
Example: class H x y | x->y where ...
fv(G) = {a} C = {H a b, H c d}
T = c -> b
(A) Q intersect {a} is empty
(B) Q superset {a,b,c,d} \ oclose({a}, C) = {a,b,c,d} \ {a,b} = {c,d}
So Q can be {c,d}, {b,c,d}
In particular, it's perfectly OK to quantify over more type variables
than strictly necessary; there is no need to quantify over 'b', since
it is determined by 'a' which is free in the envt, but it's perfectly
OK to do so. However we must not quantify over 'a' itself.
Other things being equal, however, we'd like to quantify over as few
variables as possible: smaller types, fewer type applications, more
constraints can get into Ct instead of Cq. Here's a good way to
choose Q:
Q = grow( fv(T), C ) \ oclose( fv(G), C )
That is, quantify over all variables that MIGHT be fixed by the
call site (which influences T), but which aren't DEFINITELY fixed by
G. This choice definitely quantifies over enough type variables,
albeit perhaps too many.
Why grow( fv(T), C ) rather than fv(T)? Consider
class H x y | x->y where ...
T = c->c
C = (H c d)
If we used fv(T) = {c} we'd get the type
forall c. H c d => c -> c
And then if the fn was called at several different c's, each of
which fixed d differently, we'd get a unification error, because
d isn't quantified. Solution: quantify d. So we must quantify
everything that might be influenced by c.
Why not oclose( fv(T), C )? Because we might not be able to see
all the functional dependencies yet:
class H x y | x->y where ...
instance H x y => Eq (T x y) where ...
T = c->c
C = (Eq (T c d))
Now oclose(fv(T),C) = {c}, because the functional dependency isn't
apparent yet, and that's wrong. We must really quantify over d too.
There really isn't any point in quantifying over any more than
grow( fv(T), C ), because the call sites can't possibly influence
any other type variables.
Note [Ambiguity]
~~~~~~~~~~~~~~~~
It's very hard to be certain when a type is ambiguous. Consider
class K x
class H x y | x -> y
instance H x y => K (x,y)
Is this type ambiguous?
forall a b. (K (a,b), Eq b) => a -> a
Looks like it! But if we simplify (K (a,b)) we get (H a b) and
now we see that a fixes b. So we can't tell about ambiguity for sure
without doing a full simplification. And even that isn't possible if
the context has some free vars that may get unified. Urgle!
Here's another example: is this ambiguous?
forall a b. Eq (T b) => a -> a
Not if there's an instance decl (with no context)
instance Eq (T b) where ...
You may say of this example that we should use the instance decl right
away, but you can't always do that:
class J a b where ...
instance J Int b where ...
f :: forall a b. J a b => a -> a
(Notice: no functional dependency in J's class decl.)
Here f's type is perfectly fine, provided f is only called at Int.
It's premature to complain when meeting f's signature, or even
when inferring a type for f.
However, we don't *need* to report ambiguity right away. It'll always
show up at the call site.... and eventually at main, which needs special
treatment. Nevertheless, reporting ambiguity promptly is an excellent thing.
So here's the plan. We WARN about probable ambiguity if
fv(Cq) is not a subset of oclose(fv(T) union fv(G), C)
(all tested before quantification).
That is, all the type variables in Cq must be fixed by the variables
in the environment, or by the variables in the type.
Notice that we union before calling oclose. Here's an example:
class J a b c | a b -> c
fv(G) = {a}
Is this ambiguous?
forall b c. (J a b c) => b -> b
Only if we union {a} from G with {b} from T before using oclose,
do we see that c is fixed.
It's a bit vague exactly which C we should use for this oclose call. If we
don't fix enough variables we might complain when we shouldn't (see
the above nasty example). Nothing will be perfect. That's why we can
only issue a warning.
Can we ever be *certain* about ambiguity? Yes: if there's a constraint
c in C such that fv(c) intersect (fv(G) union fv(T)) = EMPTY
then c is a "bubble"; there's no way it can ever improve, and it's
certainly ambiguous. UNLESS it is a constant (sigh). And what about
the nasty example?
class K x
class H x y | x -> y
instance H x y => K (x,y)
Is this type ambiguous?
forall a b. (K (a,b), Eq b) => a -> a
Urk. The (Eq b) looks "definitely ambiguous" but it isn't. What we are after
is a "bubble" that's a set of constraints
Cq = Ca union Cq'  s.t.  fv(Ca) intersect (fv(Cq') union fv(T) union fv(G)) = EMPTY
Hence another idea. To decide Q start with fv(T) and grow it
by transitive closure in Cq (no functional dependencies involved).
Now partition Cq using Q, leaving the definitely-ambiguous and probably-ok.
The definitely-ambiguous can then float out, and get smashed at top level
(which squashes out the constants, like Eq (T a) above).
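This partition can be sketched as follows. Again a toy model, not GHC code: each constraint is represented only by its set of free type variables, and `reach`/`splitBubbles` are illustrative names.

```haskell
import Data.List (nub, partition)

-- Grow a variable set by transitive closure through shared variables
-- (no functional dependencies involved), as the note describes.
reach :: [[String]] -> [String] -> [String]
reach cs vs
  | vs' == vs = vs
  | otherwise = reach cs vs'
  where
    vs' = nub (vs ++ concat [ c | c <- cs, any (`elem` vs) c ])

-- Split Cq into the probably-ok constraints (connected to fv(T)) and the
-- definitely-ambiguous "bubbles" (sharing no variables with fv(T) at all).
splitBubbles :: [[String]] -> [String] -> ([[String]], [[String]])
splitBubbles cs fvT = partition (any (`elem` q)) cs
  where q = reach cs fvT

main :: IO ()
main = print (splitBubbles [["a","b"], ["x"]] ["a"])
```

Here {a,b} is reachable from fv(T)={a}, so that constraint stays, while the {x} constraint is a bubble and floats out. A constraint with no free variables at all (a constant like Eq (T Int)) also lands on the bubble side, to be smashed at top level.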
Notes on principal types
class C a where
op :: a -> a
f x = let g y = op (y::Int) in True
Here the principal type of f is (forall a. a->a)
but we'll produce the non-principal type
f :: forall a. C Int => a -> a
The need for forall's in constraints
[Exchange on Haskell Cafe 5/6 Dec 2000]
class C t where op :: t -> Bool
instance C [t] where op x = True
p y = (let f :: c -> Bool; f x = op (y >> return x) in f, y ++ [])
q y = (y ++ [], let f :: c -> Bool; f x = op (y >> return x) in f)
The definitions of p and q differ only in the order of the components in
the pair on their right-hand sides. And yet:
ghc and "Typing Haskell in Haskell" reject p, but accept q;
Hugs rejects q, but accepts p;
hbc rejects both p and q;
nhc98 ... (Malcolm, can you fill in the blank for us!).
The type signature for f forces context reduction to take place, and
the results of this depend on whether or not the type of y is known,
which in turn depends on which component of the pair the type checker
analyzes first.
Solution: if y::m a, float out the constraints
Monad m, forall c. C (m c)
When m is later unified with [], we can solve both constraints.
Notes on implicit parameters
Note [Inheriting implicit parameters]
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Consider this:
f x = (x::Int) + ?y
where f is *not* a top-level binding.
From the RHS of f we'll get the constraint (?y::Int).
There are two types we might infer for f:
f :: Int -> Int
(so we get ?y from the context of f's definition), or
f :: (?y::Int) => Int -> Int
At first you might think the first was better, because then
?y behaves like a free variable of the definition, rather than
having to be passed at each call site. But of course, the WHOLE
IDEA is that ?y should be passed at each call site (that's what
dynamic binding means) so we'd better infer the second.
BOTTOM LINE: when *inferring types* you *must* quantify
over implicit parameters. See the predicate isFreeWhenInferring.
Note [Implicit parameters and ambiguity]
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Only a *class* predicate can give rise to ambiguity
An *implicit parameter* cannot. For example:
foo :: (?x :: [a]) => Int
foo = length ?x
is fine. The call site will supply a particular 'x'.
Furthermore, the type variables fixed by an implicit parameter
propagate to the others. E.g.
foo :: (Show a, ?x::[a]) => String
foo = show (?x++?x)
The type of foo looks ambiguous. But it isn't, because at a call site
we might have
let ?x = 5::Int in foo
and all is well. In effect, implicit parameters are, well, parameters,
so we can take their type variables into account as part of the
"tau-tvs" stuff. This is done in the function 'FunDeps.grow'.
Question 2: type signatures
~~~~~~~~~~~~~~~~~~~~~~~~~~~
BUT WATCH OUT: When you supply a type signature, we can't force you
to quantify over implicit parameters. For example:
(?x + 1) :: Int
This is perfectly reasonable. We do not want to insist on
(?x + 1) :: (?x::Int => Int)
That would be silly. Here, the definition site *is* the occurrence site,
so the above strictures don't apply. Hence the difference between
tcSimplifyCheck (which *does* allow implicit parameters to be inherited)
and tcSimplifyCheckBind (which does not).
What about when you supply a type signature for a binding?
Is it legal to give the following explicit, user type
signature to f, thus:
f :: Int -> Int
f x = (x::Int) + ?y
At first sight this seems reasonable, but it has the nasty property
that adding a type signature changes the dynamic semantics.
Consider this:
(let f x = (x::Int) + ?y
in (f 3, f 3 with ?y=5)) with ?y = 6
returns (3+6, 3+5)
vs
(let f :: Int -> Int
f x = x + ?y
in (f 3, f 3 with ?y=5)) with ?y = 6
returns (3+6, 3+6)
Indeed, simply inlining f (at the Haskell source level) would change the
dynamic semantics.
Nevertheless, as Launchbury says (email Oct 01) we can't really give the
semantics for a Haskell program without knowing its typing, so if you
change the typing you may change the semantics.
To make things consistent in all cases where we are *checking* against
a supplied signature (as opposed to inferring a type), we adopt the
rule:
a signature does not need to quantify over implicit params.
[This represents a (rather marginal) change of policy since GHC 5.02,
which *required* an explicit signature to quantify over all implicit
params for the reasons mentioned above.]
But that raises a new question. Consider
Given (signature) ?x::Int
Wanted (inferred) ?x::Int, ?y::Bool
Clearly we want to discharge the ?x and float the ?y out. But
what is the criterion that distinguishes them? Clearly it isn't
what free type variables they have. The Right Thing seems to be
to float a constraint that
neither mentions any of the quantified type variables
nor any of the quantified implicit parameters
See the predicate isFreeWhenChecking.
Question 3: monomorphism
~~~~~~~~~~~~~~~~~~~~~~~~
There's a nasty corner case when the monomorphism restriction bites:
z = (x::Int) + ?y
The argument above suggests that we *must* generalise
over the ?y parameter, to get
z :: (?y::Int) => Int,
but the monomorphism restriction says that we *must not*, giving
z :: Int.
Why does the monomorphism restriction say this? Because if you have
let z = x + ?y in z+z
you might not expect the addition to be done twice. Even so,
we follow the argument of Question 2 and generalise over ?y.
Question 4: top level
~~~~~~~~~~~~~~~~~~~~~
At the top level, monomorphism makes no sense at all.
module Main where
main = let ?x = 5 in print foo
foo = woggle 3
woggle :: (?x :: Int) => Int -> Int
woggle y = ?x + y
We definitely don't want (foo :: Int) with a top-level implicit parameter
(?x::Int) because there is no way to bind it.
Possible choices
~~~~~~~~~~~~~~~~
(A) Always generalise over implicit parameters
Bindings that fall under the monomorphism restriction can't
be generalised
Consequences:
* Inlining remains valid
* No unexpected loss of sharing
* But simple bindings like
z = ?y + 1
will be rejected, unless you add an explicit type signature
(to avoid the monomorphism restriction)
z :: (?y::Int) => Int
z = ?y + 1
This seems unacceptable
(B) Monomorphism restriction "wins"
Bindings that fall under the monomorphism restriction can't
be generalised
Always generalise over implicit parameters *except* for bindings
that fall under the monomorphism restriction
Consequences
* Inlining isn't valid in general
* No unexpected loss of sharing
* Simple bindings like
z = ?y + 1
accepted (get value of ?y from binding site)
(C) Always generalise over implicit parameters
Bindings that fall under the monomorphism restriction can't
be generalised, EXCEPT for implicit parameters
Consequences
* Inlining remains valid
* Unexpected loss of sharing (from the extra generalisation)
* Simple bindings like
z = ?y + 1
accepted (get value of ?y from occurrence sites)
Discussion
~~~~~~~~~~
None of these choices seems very satisfactory. But at least we should
decide which we want to do.
It's really not clear what is the Right Thing To Do. If you see
z = (x::Int) + ?y
would you expect the value of ?y to be got from the *occurrence sites*
of 'z', or from the value of ?y at the *definition* of 'z'? In the
case of function definitions, the answer is clearly the former, but
less so in the case of non-function definitions. On the other hand,
if we say that we get the value of ?y from the definition site of 'z',
then inlining 'z' might change the semantics of the program.
Choice (C) really says "the monomorphism restriction doesn't apply
to implicit parameters". Which is fine, but remember that every
innocent binding 'x = ...' that mentions an implicit parameter in
the RHS becomes a *function* of that parameter, called at each
use of 'x'. Now, the chances are that there are no intervening 'with'
clauses that bind ?y, so a decent compiler should common up all
those function calls. So I think I strongly favour (C). Indeed,
one could make a similar argument for abolishing the monomorphism
restriction altogether.
BOTTOM LINE: we choose (B) at present. See tcSimplifyRestricted
%************************************************************************
%* *
\subsection{tcSimplifyInfer}
%* *
%************************************************************************
tcSimplify is called when we are *inferring* a type. Here's the overall game plan:
1. Compute Q = grow( fvs(T), C )
2. Partition C based on Q into Ct and Cq. Notice that ambiguous
predicates will end up in Ct; we deal with them at the top level
3. Try improvement, using functional dependencies
4. If Step 3 did any unification, repeat from step 1
(Unification can change the result of 'grow'.)
Note: we don't reduce dictionaries in step 2. For example, if we have
Eq (a,b), we don't simplify to (Eq a, Eq b). So Q won't be different
after step 2. However note that we may therefore quantify over more
type variables than we absolutely have to.
For the guts, we need a loop that alternates context reduction and
improvement with unification. E.g. suppose we have
class C x y | x->y where ...
and tcSimplify is called with:
(C Int a, C Int b)
Then improvement unifies a with b, giving
(C Int a, C Int a)
If we need to unify anything, we rattle round the whole thing all over
again.
\begin{code}
tcSimplifyInfer
    :: SDoc
    -> TcTyVarSet
    -> [Inst]
    -> TcM ([TcTyVar],
            [Inst],
            TcDictBinds)
\end{code}
\begin{code}
tcSimplifyInfer doc tau_tvs wanted
  = do  { tau_tvs1 <- zonkTcTyVarsAndFV (varSetElems tau_tvs)
        ; wanted'  <- mapM zonkInst wanted
        ; gbl_tvs  <- tcGetGlobalTyVars
        ; let preds1   = fdPredsOfInsts wanted'
              gbl_tvs1 = oclose preds1 gbl_tvs
              qtvs     = growInstsTyVars wanted' tau_tvs1 `minusVarSet` gbl_tvs1

        ; let (free, bound) = partition (isFreeWhenInferring qtvs) wanted'
        ; extendLIEs free
        ; traceTc (text "infer" <+> (ppr preds1 $$ ppr (growInstsTyVars wanted' tau_tvs1) $$
                                     ppr gbl_tvs $$ ppr gbl_tvs1 $$ ppr free $$ ppr bound))

        ; (irreds1, binds1) <- tryHardCheckLoop doc bound

        ; let want_dict d = tyVarsOfInst d `intersectsVarSet` qtvs
        ; (irreds2, binds2) <- approximateImplications doc want_dict irreds1

        ; tau_tvs2 <- zonkTcTyVarsAndFV (varSetElems tau_tvs1)
        ; gbl_tvs2 <- zonkTcTyVarsAndFV (varSetElems gbl_tvs1)
        ; let preds2 = fdPredsOfInsts irreds2
              qtvs   = growInstsTyVars irreds2 tau_tvs2 `minusVarSet` oclose preds2 gbl_tvs2

        ; let (free, irreds3) = partition (isFreeWhenInferring qtvs) irreds2
        ; extendLIEs free
        ; qtvs2 <- zonkQuantifiedTyVars (varSetElems qtvs)

        ; let (q_dicts0, implics) = partition isAbstractableInst irreds3
        ; loc <- getInstLoc (ImplicOrigin doc)
        ; implic_bind <- bindIrreds loc qtvs2 q_dicts0 implics

        ; let (q_eqs0, q_dicts) = partition isEqInst q_dicts0
        ; q_eqs <- mapM finalizeEqInst q_eqs0

        ; return (qtvs2, q_eqs ++ q_dicts, binds1 `unionBags` binds2 `unionBags` implic_bind) }
approximateImplications :: SDoc -> (Inst -> Bool) -> [Inst] -> TcM ([Inst], TcDictBinds)
approximateImplications doc want_dict irreds
  | null extra_dicts
  = return (irreds, emptyBag)
  | otherwise
  = do  { extra_dicts' <- mapM cloneDict extra_dicts
        ; tryHardCheckLoop doc (extra_dicts' ++ irreds) }
  where
    extra_dicts = get_dicts (filter isImplicInst irreds)

    get_dicts :: [Inst] -> [Inst]
    get_dicts ds = concatMap get_dict ds

    get_dict d@(Dict {}) | want_dict d = [d]
                         | otherwise   = []
    get_dict (ImplicInst {tci_tyvars = tvs, tci_wanted = wanteds})
        = [ d | let tv_set = mkVarSet tvs
              , d <- get_dicts wanteds
              , not (tyVarsOfInst d `intersectsVarSet` tv_set) ]
    get_dict i@(EqInst {}) | want_dict i = [i]
                           | otherwise   = []
    get_dict other = pprPanic "approximateImplications" (ppr other)
\end{code}
Note [Inference and implication constraints]
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Suppose we have a wanted implication constraint (perhaps arising from
a nested pattern match) like
C a => D [a]
and we are now trying to quantify over 'a' when inferring the type for
a function. In principle it's possible that there might be an instance
instance (C a, E a) => D [a]
so the context (E a) would suffice. The Right Thing is to abstract over
the implication constraint, but we don't do that (a) because it'll be
surprising to programmers and (b) because we don't have the machinery to deal
with 'given' implications.
So our best approximation is to make (D [a]) part of the inferred
context, so we can use that to discharge the implication. Hence
the strange function get_dicts in approximateImplications.
The common cases are more clear-cut, when we have things like
forall a. C a => C b
Here, abstracting over (C b) is not an approximation at all.
Note [Freeness and implications].
See Trac #1430 and test tc228.
\begin{code}
tcSimplifyInferCheck
    :: InstLoc
    -> TcTyVarSet
    -> [Inst]
    -> [Inst]
    -> TcM ([TyVar], TcDictBinds)

tcSimplifyInferCheck loc tau_tvs givens wanteds
  = do  { traceTc (text "tcSimplifyInferCheck <-" <+> ppr wanteds)
        ; (irreds, binds) <- gentleCheckLoop loc givens wanteds

        ; let all_tvs = varSetElems (tau_tvs `unionVarSet` tyVarsOfInsts givens)
        ; all_tvs <- zonkTcTyVarsAndFV all_tvs
        ; gbl_tvs <- tcGetGlobalTyVars
        ; let qtvs = varSetElems (all_tvs `minusVarSet` gbl_tvs)

        ; qtvs' <- zonkQuantifiedTyVars qtvs
        ; implic_bind <- bindIrreds loc qtvs' givens irreds
        ; traceTc (text "tcSimplifyInferCheck ->" <+> ppr implic_bind)
        ; return (qtvs', binds `unionBags` implic_bind) }
\end{code}
Note [Squashing methods]
~~~~~~~~~~~~~~~~~~~~~~~~~
Be careful if you want to float methods more:
truncate :: forall a. RealFrac a => forall b. Integral b => a -> b
From an application (truncate f i) we get
t1 = truncate at f
t2 = t1 at i
If we also have a second occurrence of truncate, we get
t3 = truncate at f
t4 = t3 at i
When simplifying with i,f free, we might still notice that
t1=t3; but alas, the binding for t2 (which mentions t1)
may continue to float out!
Note [NO TYVARS]
~~~~~~~~~~~~~~~~~
class Y a b | a -> b where
y :: a -> X b
instance Y [[a]] a where
y ((x:_):_) = X x
k :: X a -> X a -> X a
g :: Num a => [X a] -> [X a]
g xs = h xs
where
h ys = ys ++ map (k (y [[0]])) xs
The excitement comes when simplifying the bindings for h. Initially
we try to simplify {y @ [[t1]] t2, 0 @ t1}, with initial qtvs = {t2}.
From this we get t1~t2, but also various bindings. We can't forget
the bindings (because of [LOOP]), but in fact t1 is what g is
polymorphic in.
The net effect of [NO TYVARS]
\begin{code}
isFreeWhenInferring :: TyVarSet -> Inst -> Bool
isFreeWhenInferring qtvs inst
  =  isFreeWrtTyVars qtvs inst
  && isInheritableInst inst

isFreeWrtTyVars :: VarSet -> Inst -> Bool
isFreeWrtTyVars qtvs inst = tyVarsOfInst inst `disjointVarSet` qtvs

isFreeWrtIPs :: NameSet -> Inst -> Bool
isFreeWrtIPs ips inst = not (any (`elemNameSet` ips) (ipNamesOfInst inst))
\end{code}
%************************************************************************
%* *
\subsection{tcSimplifyCheck}
%* *
%************************************************************************
@tcSimplifyCheck@ is used when we know exactly the set of variables
we are going to quantify over. For example, a class or instance declaration.
\begin{code}
tcSimplifyCheck :: InstLoc
                -> [TcTyVar]
                -> [Inst]
                -> [Inst]
                -> TcM TcDictBinds
tcSimplifyCheck loc qtvs givens wanteds
  = ASSERT( all isTcTyVar qtvs && all isSkolemTyVar qtvs )
    do  { traceTc (text "tcSimplifyCheck")
        ; (irreds, binds) <- gentleCheckLoop loc givens wanteds
        ; implic_bind <- bindIrreds loc qtvs givens irreds
        ; return (binds `unionBags` implic_bind) }

tcSimplifyCheckPat :: InstLoc
                   -> [TcTyVar]
                   -> [Inst]
                   -> [Inst]
                   -> TcM TcDictBinds
tcSimplifyCheckPat loc qtvs givens wanteds
  = ASSERT( all isTcTyVar qtvs && all isSkolemTyVar qtvs )
    do  { traceTc (text "tcSimplifyCheckPat")
        ; (irreds, binds) <- gentleCheckLoop loc givens wanteds
        ; implic_bind <- bindIrredsR loc qtvs givens irreds
        ; return (binds `unionBags` implic_bind) }
bindIrreds :: InstLoc -> [TcTyVar]
           -> [Inst] -> [Inst]
           -> TcM TcDictBinds
bindIrreds loc qtvs givens irreds
  = bindIrredsR loc qtvs givens irreds

bindIrredsR :: InstLoc -> [TcTyVar] -> [Inst] -> [Inst] -> TcM TcDictBinds
bindIrredsR loc qtvs givens irreds
  | null irreds
  = return emptyBag
  | otherwise
  = do  { let givens' = filter isAbstractableInst givens
        ; irreds' <- if null givens'
                     then do { let qtv_set = mkVarSet qtvs
                                   (frees, real_irreds) = partition (isFreeWrtTyVars qtv_set) irreds
                             ; extendLIEs frees
                             ; return real_irreds }
                     else return irreds

        ; (implics, bind) <- makeImplicationBind loc qtvs givens' irreds'
        ; extendLIEs implics
        ; return bind }
makeImplicationBind :: InstLoc -> [TcTyVar]
                    -> [Inst] -> [Inst]
                    -> TcM ([Inst], TcDictBinds)
makeImplicationBind loc all_tvs givens irreds
  | null irreds
  = return ([], emptyBag)
  | otherwise
  = do  { uniq <- newUnique
        ; span <- getSrcSpanM
        ; let (eq_givens, dict_givens) = partition isEqInst givens
              eq_cotvs = map eqInstType eq_givens

              name = mkInternalName uniq (mkVarOcc "ic") span
              implic_inst = ImplicInst { tci_name   = name,
                                         tci_tyvars = all_tvs,
                                         tci_given  = eq_givens ++ dict_givens,
                                         tci_wanted = irreds,
                                         tci_loc    = loc }

              dict_irreds    = filter (not . isEqInst) irreds
              dict_irred_ids = map instToId dict_irreds
              lpat = mkBigLHsPatTup (map (L span . VarPat) dict_irred_ids)
              rhs  = L span (mkHsWrap co (HsVar (instToId implic_inst)))
              co   =     mkWpApps (map instToId dict_givens)
                     <.> mkWpTyApps eq_cotvs
                     <.> mkWpTyApps (mkTyVarTys all_tvs)
              bind | [dict_irred_id] <- dict_irred_ids
                   = VarBind dict_irred_id rhs
                   | otherwise
                   = PatBind { pat_lhs    = lpat
                             , pat_rhs    = unguardedGRHSs rhs
                             , pat_rhs_ty = hsLPatType lpat
                             , bind_fvs   = placeHolderNames }

        ; traceTc $ text "makeImplicationBind" <+> ppr implic_inst
        ; return ([implic_inst], unitBag (L span bind)) }
tryHardCheckLoop :: SDoc
                 -> [Inst]
                 -> TcM ([Inst], TcDictBinds)
tryHardCheckLoop doc wanteds
  = do { (irreds, binds) <- checkLoop (mkInferRedEnv doc try_me) wanteds
       ; return (irreds, binds) }
  where
    try_me _ = ReduceMe

gentleCheckLoop :: InstLoc
                -> [Inst]
                -> [Inst]
                -> TcM ([Inst], TcDictBinds)
gentleCheckLoop inst_loc givens wanteds
  = do { (irreds, binds) <- checkLoop env wanteds
       ; return (irreds, binds) }
  where
    env = mkRedEnv (pprInstLoc inst_loc) try_me givens

    try_me inst | isMethodOrLit inst = ReduceMe
                | otherwise          = Stop

gentleInferLoop :: SDoc -> [Inst]
                -> TcM ([Inst], TcDictBinds)
gentleInferLoop doc wanteds
  = do { (irreds, binds) <- checkLoop env wanteds
       ; return (irreds, binds) }
  where
    env = mkInferRedEnv doc try_me

    try_me inst | isMethodOrLit inst = ReduceMe
                | otherwise          = Stop
\end{code}
Note [Check gently]
~~~~~~~~~~~~~~~~~~~~
We have to be very careful about not simplifying too vigorously.
Example:
data T a where
MkT :: a -> T [a]
f :: Show b => T b -> b
f (MkT x) = show [x]
Inside the pattern match, which binds (a:*, x:a), we know that
b ~ [a]
Hence we have a dictionary for Show [a] available; and indeed we
need it. We are going to build an implication constraint
forall a. (b~[a]) => Show [a]
Later, we will solve this constraint using the knowledge (Show b)
But we MUST NOT reduce (Show [a]) to (Show a), else the whole
thing becomes insoluble. So we simplify gently (get rid of literals
and methods only, plus common up equal things), deferring the real
work until top level, when we solve the implication constraint
with tryHardCheckLoop.
\begin{code}
checkLoop :: RedEnv
          -> [Inst]
          -> TcM ([Inst], TcDictBinds)
checkLoop env wanteds
  = go env wanteds
  where
    go env wanteds
      = do  { env'     <- zonkRedEnv env
            ; wanteds' <- zonkInsts wanteds
            ; (improved, tybinds, binds, irreds) <- reduceContext env' wanteds'
            ; execTcTyVarBinds tybinds
            ; if null irreds || not improved
              then return (irreds, binds)
              else do { (irreds1, binds1) <- go env' irreds
                      ; return (irreds1, binds `unionBags` binds1) } }
\end{code}
Note [Zonking RedEnv]
~~~~~~~~~~~~~~~~~~~~~
It might appear as if the givens in RedEnv are always rigid, but that is not
necessarily the case for programs involving higher-rank types that have class
contexts constraining the higher-rank variables. An example from tc237 in the
testsuite is
class Modular s a | s -> a
wim :: forall a w. Integral a
=> a -> (forall s. Modular s a => M s w) -> w
wim i k = error "urk"
test5 :: (Modular s a, Integral a) => M s a
test5 = error "urk"
test4 = wim 4 test4'
Notice how the variable 'a' of (Modular s a) in the rank-2 type of wim is
quantified further outside. When type checking test4, we have to check
whether the signature of test5 is an instance of
(forall s. Modular s a => M s w)
Consequently, we will get (Modular s t_a), where t_a is a TauTv, into the
givens.
Given the FD of Modular in this example, class improvement will instantiate
t_a to 'a', where 'a' is the skolem from test5's signature (due to the
Modular s a predicate in that signature). If we don't zonk (Modular s t_a) in
the givens, we will get into a loop, as improveOne uses the unification engine
Unify.tcUnifyTys, which doesn't know about mutable type variables.
Note [LOOP]
~~~~~~~~~~~
class If b t e r | b t e -> r
instance If T t e t
instance If F t e e
class Lte a b c | a b -> c where lte :: a -> b -> c
instance Lte Z b T
instance (Lte a b l,If l b a c) => Max a b c
Wanted: Max Z (S x) y
Then we'll reduce using the Max instance to:
(Lte Z (S x) l, If l (S x) Z y)
and improve by binding l->T, after which we can do some reduction
on both the Lte and If constraints. What we *can't* do is start again
with (Max Z (S x) y)!
%************************************************************************
%* *
tcSimplifySuperClasses
%* *
%************************************************************************
Note [SUPERCLASS-LOOP 1]
~~~~~~~~~~~~~~~~~~~~~~~~
We have to be very, very careful when generating superclasses, lest we
accidentally build a loop. Here's an example:
class S a
class S a => C a where { opc :: a -> a }
class S b => D b where { opd :: b -> b }
instance C Int where
opc = opd
instance D Int where
opd = opc
From (instance C Int) we get the constraint set {ds1:S Int, dd:D Int}
Simplifying, we may well get:
$dfCInt = :C ds1 (opd dd)
dd = $dfDInt
ds1 = $p1 dd
Notice that we spot that we can extract ds1 from dd.
Alas! Alack! We can do the same for (instance D Int):
$dfDInt = :D ds2 (opc dc)
dc = $dfCInt
ds2 = $p1 dc
And now we've defined the superclass in terms of itself.
Two more nasty cases are in
tcrun021
tcrun033
Solution:
Satisfy the superclass context *all by itself*
(tcSimplifySuperClasses)
And do so completely; i.e. no leftover constraints
to mix with the constraints arising from method declarations
Note [Recursive instances and superclasses]
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Consider this code, which arises in the context of "Scrap Your
Boilerplate with Class".
class Sat a
class Data ctx a
instance Sat (ctx Char) => Data ctx Char
instance (Sat (ctx [a]), Data ctx a) => Data ctx [a]
class Data Maybe a => Foo a
instance Foo t => Sat (Maybe t)
instance Data Maybe a => Foo a
instance Foo a => Foo [a]
instance Foo [Char]
In the instance for Foo [a], when generating evidence for the superclasses
(ie in tcSimplifySuperClasses) we need a superclass (Data Maybe [a]).
Using the instance for Data, we therefore need
(Sat (Maybe [a]), Data Maybe a)
But we are given (Foo a), and hence its superclass (Data Maybe a).
So that leaves (Sat (Maybe [a])). Using the instance for Sat means
we need (Foo [a]). And that is the very dictionary we are building
an instance for! So we must put that in the "givens". So in this
case we have
Given:  Foo a, Foo [a]
Wanted: Data Maybe [a]
BUT we must *not not not* put the *superclasses* of (Foo [a]) in
the givens, which is what 'addGiven' would normally do. Why? Because
(Data Maybe [a]) is the superclass, so we'd "satisfy" the wanted
by selecting a superclass from Foo [a], which simply makes a loop.
On the other hand we *must* put the superclasses of (Foo a) in
the givens, as you can see from the derivation described above.
Conclusion: in the very special case of tcSimplifySuperClasses
we have one 'given' (namely the "this" dictionary) whose superclasses
must not be added to 'givens' by addGiven.
There is a complication though. Suppose there are equalities
instance (Eq a, a~b) => Num (a,b)
Then we normalise the 'givens' wrt the equalities, so the original
given "this" dictionary is cast to one of a different type. So it's a
bit trickier than before to identify the "special" dictionary whose
superclasses must not be added. See test
indexed-types/should_run/EqInInstance
We need a persistent property of the dictionary to record this
specialness. Currently I'm using the InstLocOrigin (a bit of a hack,
but cool), which is maintained by dictionary normalisation.
Specifically, if the InstLocOrigin is
NoScOrigin
then the no-superclass thing kicks in. WATCH OUT if you fiddle
with InstLocOrigin!
\begin{code}
tcSimplifySuperClasses
    :: InstLoc
    -> Inst     -- The "this" dictionary
    -> [Inst]   -- Given
    -> [Inst]   -- Wanted
    -> TcM TcDictBinds
tcSimplifySuperClasses loc this givens sc_wanteds
  = do  { traceTc (text "tcSimplifySuperClasses")
        ; no_sc_loc <- getInstLoc NoScOrigin
        ; let no_sc_this = setInstLoc this no_sc_loc
        ; let env = RedEnv { red_doc     = pprInstLoc loc,
                             red_try_me  = try_me,
                             red_givens  = no_sc_this : givens,
                             red_stack   = (0,[]),
                             red_improve = False }
        ; (irreds, binds1) <- checkLoop env sc_wanteds
        ; let (tidy_env, tidy_irreds) = tidyInsts irreds
        ; reportNoInstances tidy_env (Just (loc, givens)) [] tidy_irreds
        ; return binds1 }
  where
    try_me _ = ReduceMe  -- Try hard, so we solve the superclass
                         -- constraints completely, right here
\end{code}
%************************************************************************
%* *
\subsection{tcSimplifyRestricted}
%* *
%************************************************************************
tcSimplifyRestricted infers which type variables to quantify for a
group of restricted bindings. This isn't trivial.
Eg1: id = \x -> x
We want to quantify over a to get id :: forall a. a->a
Eg2: eq = (==)
We do not want to quantify over a, because there's an Eq a
constraint, so we get eq :: a->a->Bool (notice no forall)
So, assume:
RHS has type 'tau', whose free tyvars are tau_tvs
RHS has constraints 'wanteds'
Plan A (simple)
Quantify over (tau_tvs \ ftvs(wanteds))
This is bad. The constraints may contain (Monad (ST s))
where we have instance Monad (ST s) where...
so there's no need to be monomorphic in s!
Also the constraint might be a method constraint,
whose type mentions a perfectly innocent tyvar:
op :: Num a => a -> b -> a
Here, b is unconstrained. A good example would be
foo = op (3::Int)
We want to infer the polymorphic type
foo :: forall b. b -> b
Plan B (cunning, used for a long time up to and including GHC 6.2)
Step 1: Simplify the constraints as much as possible (to deal
with Plan A's problem). Then set
qtvs = tau_tvs \ ftvs( simplify( wanteds ) )
Step 2: Now simplify again, treating the constraint as 'free' if
it does not mention qtvs, and trying to reduce it otherwise.
The reason for this is to maximise sharing.
This fails for a very subtle reason. Suppose that in the Step 2
a constraint (Foo (Succ Zero) (Succ Zero) b) gets thrown upstairs as 'free'.
In the Step 1 this constraint might have been simplified, perhaps to
(Foo Zero Zero b), AND THEN THAT MIGHT BE IMPROVED, to bind 'b' to 'T'.
This won't happen in Step 2... but that in turn might prevent some other
constraint (Baz [a] b) being simplified (e.g. via instance Baz [a] T where {..})
and that in turn breaks the invariant that no constraints are quantified over.
Test typecheck/should_compile/tc177 (which failed in GHC 6.2) demonstrates
the problem.
Plan C (brutal)
Step 1: Simplify the constraints as much as possible (to deal
with Plan A's problem). Then set
qtvs = tau_tvs \ ftvs( simplify( wanteds ) )
Return the bindings from Step 1.
A note about Plan C (arising from "bug" reported by George Russel March 2004)
Consider this:
instance (HasBinary ty IO) => HasCodedValue ty
foo :: HasCodedValue a => String -> IO a
doDecodeIO :: HasCodedValue a => () -> () -> IO a
doDecodeIO codedValue view
= let { act = foo "foo" } in act
You might think this should work because the call to foo gives rise to a constraint
(HasCodedValue t), which can be satisfied by the type sig for doDecodeIO. But the
restricted binding act = ... calls tcSimplifyRestricted, and PlanC simplifies the
constraint using the (rather bogus) instance declaration, and now we are stuffed.
I claim this is not really a bug.
Plan D (a variant of plan B)
Step 1: Simplify the constraints as much as possible (to deal
with Plan A's problem), BUT DO NO IMPROVEMENT. Then set
qtvs = tau_tvs \ ftvs( simplify( wanteds ) )
Step 2: Now simplify again, treating the constraint as 'free' if
it does not mention qtvs, and trying to reduce it otherwise.
The point here is that it's generally OK to have too few qtvs; that is,
to make the thing more monomorphic than it could be. We don't want to
do that in the common cases, but in weird cases it's ok: the programmer
can always add a signature.
Too few qtvs => too many wanteds, which is what happens if you do less
improvement.
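The Eg1/Eg2 distinction above can be seen directly at the source level. The following sketch is not part of this module; it simply shows (under the assumption of a modern GHC with the NoMonomorphismRestriction extension) what the inferred types look like once the restriction is lifted:

```haskell
{-# LANGUAGE NoMonomorphismRestriction #-}

-- Eg1: no class constraints, so GHC quantifies freely:
ident = \x -> x           -- ident :: forall a. a -> a

-- Eg2: under the monomorphism restriction this binding would be made
-- monomorphic; with the restriction off it keeps the quantified type
-- eq :: forall a. Eq a => a -> a -> Bool
eq = (==)

demo :: (Bool, Bool)
demo = (eq 'x' 'x', eq (1 :: Int) 2)   -- two different instantiations of eq
```

With the restriction on, the two uses of eq at Char and Int in demo would be rejected, which is exactly the "too few qtvs => too many wanteds" trade-off described above.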
\begin{code}
tcSimplifyRestricted
    :: SDoc
    -> TopLevelFlag
    -> [Name]           -- Things bound in this group
    -> TcTyVarSet       -- Free in the type of the RHSs
    -> [Inst]           -- Free in the RHSs
    -> TcM ([TyVar],    -- Tyvars to quantify (zonked and quantified)
            TcDictBinds)
tcSimplifyRestricted doc top_lvl bndrs tau_tvs wanteds
  = do  { traceTc (text "tcSimplifyRestricted")
        ; wanteds_z <- zonkInsts wanteds

          -- Step 1: reduce as far as possible, with no improvement
        ; let env = mkNoImproveRedEnv doc (\_ -> ReduceMe)
        ; (_imp, _tybinds, _binds, constrained_dicts)
            <- reduceContext env wanteds_z

          -- Step 2: figure out the type variables we will quantify over
        ; tau_tvs' <- zonkTcTyVarsAndFV (varSetElems tau_tvs)
        ; gbl_tvs' <- tcGetGlobalTyVars
        ; constrained_dicts' <- zonkInsts constrained_dicts

        ; let qtvs1 = tau_tvs' `minusVarSet`
                      oclose (fdPredsOfInsts constrained_dicts) gbl_tvs'
              constrained_tvs' = tyVarsOfInsts constrained_dicts'
              qtvs = qtvs1 `minusVarSet` constrained_tvs'
              pp_bndrs = pprWithCommas (quotes . ppr) bndrs

        ; warn_mono <- doptM Opt_WarnMonomorphism
        ; warnTc (warn_mono && (constrained_tvs' `intersectsVarSet` qtvs1))
                 (vcat [ ptext (sLit "the Monomorphism Restriction applies to the binding")
                           <> plural bndrs <+> ptext (sLit "for") <+> pp_bndrs,
                         ptext (sLit "Consider giving a type signature for") <+> pp_bndrs ])

        ; traceTc (text "tcSimplifyRestricted" <+> vcat [
                     pprInsts wanteds, pprInsts constrained_dicts',
                     ppr _binds,
                     ppr constrained_tvs', ppr tau_tvs', ppr qtvs ])

          -- Step 3: reduce again, this time knowing the exact set of
          -- type variables to quantify over
        ; let is_nested_group = isNotTopLevel top_lvl
              try_me inst | isFreeWrtTyVars qtvs inst,
                            (is_nested_group || isDict inst) = Stop
                          | otherwise                        = ReduceMe
              env = mkNoImproveRedEnv doc try_me
        ; (_imp, tybinds, binds, irreds) <- reduceContext env wanteds_z
        ; execTcTyVarBinds tybinds

        ; ASSERT( all (isFreeWrtTyVars qtvs) irreds )  -- None should be captured
          if is_nested_group then
              extendLIEs irreds
          else do { let (bad_ips, non_ips) = partition isIPDict irreds
                  ; addTopIPErrs bndrs bad_ips
                  ; extendLIEs non_ips }

        ; qtvs' <- zonkQuantifiedTyVars (varSetElems qtvs)
        ; return (qtvs', binds) }
\end{code}
%************************************************************************
%* *
tcSimplifyRuleLhs
%* *
%************************************************************************
On the LHS of transformation rules we only simplify methods and constants,
getting dictionaries. We want to keep all of them unsimplified, to serve
as the available stuff for the RHS of the rule.
Example. Consider the following lefthand side of a rule
f (x == y) (y > z) = ...
If we typecheck this expression we get constraints
d1 :: Ord a, d2 :: Eq a
We do NOT want to "simplify" to the LHS
forall x::a, y::a, z::a, d1::Ord a.
f ((==) (eqFromOrd d1) x y) ((>) d1 y z) = ...
Instead we want
forall x::a, y::a, z::a, d1::Ord a, d2::Eq a.
f ((==) d2 x y) ((>) d1 y z) = ...
Here is another example:
fromIntegral :: (Integral a, Num b) => a -> b
In the rule, a=b=Int, and Num Int is a superclass of Integral Int. But
we *don't* want to get
forall dIntegralInt.
fromIntegral Int Int dIntegralInt (scsel dIntegralInt) = id Int
because the scsel will mess up RULE matching. Instead we want
forall dIntegralInt, dNumInt.
fromIntegral Int Int dIntegralInt dNumInt = id Int
Even if we have
g (x == y) (y == z) = ..
where the two dictionaries are *identical*, we do NOT WANT
forall x::a, y::a, z::a, d1::Eq a
f ((==) d1 x y) ((>) d1 y z) = ...
because that will only match if the dict args are (visibly) equal.
Instead we want to quantify over the dictionaries separately.
In short, tcSimplifyRuleLhs must *only* squash LitInsts and MethInsts, leaving
all dicts unchanged, with absolutely no sharing. It's simpler to do this
from scratch, rather than further parameterise simpleReduceLoop etc.
Simpler, maybe, but alas not simple (see Trac #2494)
* Type errors may give rise to an (unsatisfiable) equality constraint
* Applications of a higher-rank function on the LHS may give
rise to an implication constraint, esp if there are unsatisfiable
equality constraints inside.
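To see at the source level why the dictionaries on a RULE's LHS must stay abstract, here is a small sketch (not code from this module) of a user-level rewrite rule over an overloaded function; the rule name and function are made up for illustration:

```haskell
-- "double" is overloaded; typechecking the rule's LHS "double x"
-- produces a wanted (Num a) dictionary.  As described above, that
-- dictionary must be kept as a quantified variable in the rule's
-- forall rather than simplified away, or the rule could not match
-- at all types.
{-# RULES "double/plus" forall x. double x = x + x #-}

double :: Num a => a -> a
{-# NOINLINE double #-}    -- give the rule a chance to fire
double x = 2 * x
```

Whether or not the rule fires, the program's meaning is unchanged; the point is only the shape of the quantified LHS.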
\begin{code}
tcSimplifyRuleLhs :: [Inst] -> TcM ([Inst], TcDictBinds)
tcSimplifyRuleLhs wanteds
  = do  { wanteds' <- zonkInsts wanteds

          -- Simplify equalities first
        ; (_, wanteds'', tybinds, binds1) <- tcReduceEqs [] wanteds'
        ; execTcTyVarBinds tybinds

          -- Simplify methods and literals
        ; (irreds, binds2) <- go [] emptyBag wanteds''
        ; let (dicts, bad_irreds) = partition isDict irreds

        ; traceTc (text "tcSimplifyRuleLhs" <+> pprInsts bad_irreds)
        ; addNoInstanceErrs (nub bad_irreds)
        ; return (dicts, binds1 `unionBags` binds2) }
  where
    go :: [Inst] -> TcDictBinds -> [Inst] -> TcM ([Inst], TcDictBinds)
    go irreds binds []
      = return (irreds, binds)
    go irreds binds (w:ws)
      | isDict w
      = go (w:irreds) binds ws
      | isImplicInst w
      = do  {   -- Implication constraint: simplify it right away
              (binds1, irreds1) <- reduceImplication red_env w
            ; let (bad_irreds, ok_irreds) = partition isImplicInst irreds1
            ; go (bad_irreds ++ irreds)
                 (binds `unionBags` binds1)
                 (ok_irreds ++ ws) }
      | otherwise
      = do  { w' <- zonkInst w
            ; lookup_result <- lookupSimpleInst w'
            ; case lookup_result of
                NoInstance      -> go (w:irreds) binds ws
                GenInst ws' rhs -> go irreds binds' (ws' ++ ws)
                  where
                    binds' = addInstToDictBind binds w rhs }

    red_env = mkInferRedEnv doc try_me
    doc = ptext (sLit "Implication constraint in RULE lhs")
    try_me inst | isMethodOrLit inst = ReduceMe
                | otherwise          = Stop    -- Leave dicts alone
\end{code}
tcSimplifyBracket is used when simplifying the constraints arising from
a Template Haskell bracket [| ... |]. We want to check that there aren't
any constraints that can't be satisfied (e.g. Show Foo, where Foo has no
Show instance), but we aren't otherwise interested in the results.
Nor do we care about ambiguous dictionaries etc. We will type check
this bracket again at its usage site.
\begin{code}
tcSimplifyBracket :: [Inst] -> TcM ()
tcSimplifyBracket wanteds
  = do  { _ <- tryHardCheckLoop doc wanteds
        ; return () }
  where
    doc = text "tcSimplifyBracket"
\end{code}
%************************************************************************
%* *
\subsection{Filtering at a dynamic binding}
%* *
%************************************************************************
When we have
let ?x = R in B
we must discharge all the ?x constraints from B. We also do an improvement
step; if we have ?x::t1 and ?x::t2 we must unify t1, t2.
Actually, the constraints from B might improve the types in ?x. For example
f :: (?x::Int) => Char -> Char
let ?x = 3 in f 'c'
then the constraint (?x::Int) arising from the call to f will
force the binding for ?x to be of type Int.
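The example above can be written out as a complete program; a minimal sketch using GHC's ImplicitParams extension (the helper f mirrors the f in the note):

```haskell
{-# LANGUAGE ImplicitParams #-}

import Data.Char (chr, ord)

-- The (?x :: Int) constraint arising in the body of f is discharged by
-- the 'let ?x = ...' at the call site, and that same constraint forces
-- the binding for ?x to have type Int, as described above.
f :: (?x :: Int) => Char -> Char
f c = chr (ord c + ?x)

demo :: Char
demo = let ?x = 3 in f 'c'   -- shifts 'c' forward by 3
```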
\begin{code}
tcSimplifyIPs :: [Inst]         -- The implicit parameters bound here
              -> [Inst]         -- Wanted
              -> TcM TcDictBinds
tcSimplifyIPs given_ips wanteds
  = do  { wanteds'   <- zonkInsts wanteds
        ; given_ips' <- zonkInsts given_ips

        ; let env = mkRedEnv doc try_me given_ips'
        ; (improved, tybinds, binds, irreds) <- reduceContext env wanteds'
        ; execTcTyVarBinds tybinds

        ; if null irreds || not improved then
                ASSERT( all is_free irreds )
                do { extendLIEs irreds
                   ; return binds }
          else do
                -- Improvement did something, so go round again
              { binds1 <- tcSimplifyIPs given_ips' irreds
              ; return $ binds `unionBags` binds1 } }
  where
    doc    = text "tcSimplifyIPs" <+> ppr given_ips
    ip_set = mkNameSet (ipNamesOfInsts given_ips)
    is_free inst = isFreeWrtIPs ip_set inst

    try_me inst | is_free inst = Stop
                | otherwise    = ReduceMe
\end{code}
%************************************************************************
%* *
\subsection[bindsforlocalfuns]{@bindInstsOfLocalFuns@}
%* *
%************************************************************************
When doing a binding group, we may have @Insts@ of local functions.
For example, we might have...
\begin{verbatim}
let f x = x + 1
f.1 = f Int
f.2 = f Float
in
(f.1 5, f.2 6.7)
\end{verbatim}
The point is: we must drop the bindings for @f.1@ and @f.2@ here,
where @f@ is in scope; those @Insts@ must certainly not be passed
upwards towards the top level. If the @Insts@ were binding-ified up
there, they would have unresolvable references to @f@.
We pass in an @init_lie@ of @Insts@ and a list of locally-bound @Ids@.
For each method @Inst@ in the @init_lie@ that mentions one of the
@Ids@, we create a binding. We return the remaining @Insts@ (in an
@LIE@), as well as the @HsBinds@ generated.
\begin{code}
bindInstsOfLocalFuns :: [Inst] -> [TcId] -> TcM TcDictBinds
bindInstsOfLocalFuns wanteds local_ids
  | null overloaded_ids = do    -- Common case
        extendLIEs wanteds
        return emptyLHsBinds
  | otherwise
  = do  { (irreds, binds) <- gentleInferLoop doc for_me
        ; extendLIEs not_for_me
        ; extendLIEs irreds
        ; return binds }
  where
    doc = text "bindInsts" <+> ppr local_ids
    overloaded_ids   = filter is_overloaded local_ids
    is_overloaded id = isOverloadedTy (idType id)
    (for_me, not_for_me) = partition (isMethodFor overloaded_set) wanteds
    overloaded_set = mkVarSet overloaded_ids
\end{code}
%************************************************************************
%* *
\subsection{Data types for the reduction mechanism}
%* *
%************************************************************************
The main control over context reduction is here
\begin{code}
data RedEnv
  = RedEnv { red_doc     :: SDoc            -- The context
           , red_try_me  :: Inst -> WhatToDo
           , red_improve :: Bool            -- True <=> do improvement
           , red_givens  :: [Inst]          -- All rigid; dicts and equalities
           , red_stack   :: (Int, [Inst])   -- Recursion depth (for error check)
                                            -- and insts being reduced
           }

mkRedEnv :: SDoc -> (Inst -> WhatToDo) -> [Inst] -> RedEnv
mkRedEnv doc try_me givens
  = RedEnv { red_doc = doc, red_try_me = try_me,
             red_givens = givens,
             red_stack = (0,[]),
             red_improve = True }

mkInferRedEnv :: SDoc -> (Inst -> WhatToDo) -> RedEnv
-- No givens at all
mkInferRedEnv doc try_me
  = RedEnv { red_doc = doc, red_try_me = try_me,
             red_givens = [],
             red_stack = (0,[]),
             red_improve = True }

mkNoImproveRedEnv :: SDoc -> (Inst -> WhatToDo) -> RedEnv
-- Do not do improvement; no givens
mkNoImproveRedEnv doc try_me
  = RedEnv { red_doc = doc, red_try_me = try_me,
             red_givens = [],
             red_stack = (0,[]),
             red_improve = False }

data WhatToDo
  = ReduceMe    -- Try hard to reduce this constraint
  | Stop        -- Return as irreducible

data WantSCs = NoSCs | AddSCs   -- Whether to add the superclasses of a
                                -- predicate when adding it to the avails

zonkRedEnv :: RedEnv -> TcM RedEnv
zonkRedEnv env
  = do { givens' <- mapM zonkInst (red_givens env)
       ; return $ env { red_givens = givens' } }
\end{code}
%************************************************************************
%* *
\subsection[reduce]{@reduce@}
%* *
%************************************************************************
Note [Ancestor Equalities]
~~~~~~~~~~~~~~~~~~~~~~~~~~
During context reduction, we add to the wanted equalities also those
equalities that (transitively) occur in superclass contexts of wanted
class constraints. Consider the following code
class a ~ Int => C a
instance C Int
If (C a) is wanted, we want to add (a ~ Int), which will be discharged by
substituting Int for a. Hence, we ultimately want (C Int), which we
discharge with the explicit instance.
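Spelled out as a standalone program, the Note's example behaves as follows. This is a sketch (not this module's code), assuming a modern GHC with equality superclasses enabled via TypeFamilies:

```haskell
{-# LANGUAGE TypeFamilies, FlexibleContexts #-}

-- The superclass context carries an equality: wanting (C a) also makes
-- the ancestor equality (a ~ Int) available, improving 'a' to Int.
class a ~ Int => C a where
  val :: a

instance C Int where
  val = 42

-- Inside the body, the given (a ~ Int) lets the wanted (Num a)
-- be solved as Num Int.
addVal :: C a => a -> a
addVal x = x + val
```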
\begin{code}
reduceContext :: RedEnv
              -> [Inst]                 -- Wanted
              -> TcM (ImprovementDone,
                      TcTyVarBinds,     -- Type variable bindings
                      TcDictBinds,      -- Dictionary bindings
                      [Inst])           -- Irreducible
reduceContext env wanteds0
  = do  { traceTc (text "reduceContext" <+> (vcat [
             text "----------------------",
             red_doc env,
             text "given" <+> ppr (red_givens env),
             text "wanted" <+> ppr wanteds0,
             text "----------------------"
             ]))

          -- Add the equalities from superclass contexts;
          -- see Note [Ancestor Equalities]
        ; ancestor_eqs <- ancestorEqualities wanteds0
        ; traceTc $ text "reduceContext: ancestor eqs" <+> ppr ancestor_eqs

          -- Normalise and solve the equality constraints
        ; let wanteds = wanteds0 ++ ancestor_eqs
              givens  = red_givens env
        ; (givens',
           wanteds',
           tybinds,
           normalise_binds) <- tcReduceEqs givens wanteds
        ; traceTc $ text "reduceContext: tcReduceEqs result" <+> vcat
            [ppr givens', ppr wanteds', ppr tybinds, ppr normalise_binds]

          -- Build the initial Avails from the normalised givens
        ; (init_state, _) <- getLIE $
            foldlM addGiven emptyAvails givens'

          -- Solve the wanted *dictionary* constraints (not implications);
          -- improvement may expose further equality constraints
        ; let (wanted_implics, wanted_dicts) = partition isImplicInst wanteds'
        ; (avails, extra_eqs) <- getLIE (reduceList env wanted_dicts init_state)
        ; (dict_binds,
           bound_dicts,
           dict_irreds) <- extractResults avails wanted_dicts
        ; traceTc $ text "reduceContext: extractResults" <+> vcat
            [ppr avails, ppr wanted_dicts, ppr dict_binds]

          -- Solve the wanted *implications*, making the bound and
          -- irreducible dictionaries available as extra givens
        ; let implic_env = env { red_givens = givens ++ bound_dicts ++
                                              map wantedToLocalEqInst dict_irreds }
        ; (implic_binds_s, implic_irreds_s)
            <- mapAndUnzipM (reduceImplication implic_env) wanted_implics
        ; let implic_binds  = unionManyBags implic_binds_s
              implic_irreds = concat implic_irreds_s

          -- Collect the irreducibles, and decide whether the caller should
          -- go round again: either improvement happened, or dictionary
          -- reduction may have exposed more equality solving;
          -- see Note [Dictionary Improvement]
        ; let all_irreds       = dict_irreds ++ implic_irreds ++ extra_eqs
              avails_improved  = availsImproved avails
              eq_improved      = anyBag (not . isCoVarBind) tybinds
              improvedFlexible = avails_improved || eq_improved
              reduced_dicts    = not (isEmptyBag dict_binds)
              improved         = improvedFlexible || reduced_dicts
              improvedHint = (if avails_improved then " [AVAILS]" else "") ++
                             (if eq_improved then " [EQ]" else "")

        ; traceTc (text "reduceContext end" <+> (vcat [
             text "----------------------",
             red_doc env,
             text "given" <+> ppr givens,
             text "wanted" <+> ppr wanteds0,
             text "----",
             text "tybinds" <+> ppr tybinds,
             text "avails" <+> pprAvails avails,
             text "improved =" <+> ppr improved <+> text improvedHint,
             text "(all) irreds = " <+> ppr all_irreds,
             text "dict-binds = " <+> ppr dict_binds,
             text "implic-binds = " <+> ppr implic_binds,
             text "----------------------"
             ]))

        ; return (improved,
                  tybinds,
                  normalise_binds `unionBags` dict_binds
                                  `unionBags` implic_binds,
                  all_irreds) }
  where
    isCoVarBind (TcTyVarBind tv _) = isCoVar tv
tcImproveOne :: Avails -> Inst -> TcM ImprovementDone
tcImproveOne avails inst
  | not (isDict inst) = return False
  | otherwise
  = do  { inst_envs <- tcGetInstEnvs
        ; let eqns = improveOne (classInstances inst_envs)
                                (dictPred inst, pprInstArising inst)
                                [ (dictPred p, pprInstArising p)
                                | p <- availsInsts avails, isDict p ]
        ; traceTc (text "improveOne" <+> ppr inst)
        ; unifyEqns eqns }

unifyEqns :: [(Equation, (PredType, SDoc), (PredType, SDoc))]
          -> TcM ImprovementDone
unifyEqns [] = return False
unifyEqns eqns
  = do  { traceTc (ptext (sLit "Improve:") <+> vcat (map pprEquationDoc eqns))
        ; improved <- mapM unify eqns
        ; return $ or improved }
  where
    unify ((qtvs, pairs), what1, what2)
      = addErrCtxtM (mkEqnMsg what1 what2) $
        do { let freeTyVars = unionVarSets (map tvs_pr pairs)
                              `minusVarSet` qtvs
           ; (_, _, tenv) <- tcInstTyVars (varSetElems qtvs)
           ; mapM_ (unif_pr tenv) pairs
           ; anyM isFilledMetaTyVar $ varSetElems freeTyVars }
    unif_pr tenv (ty1, ty2) = unifyType (substTy tenv ty1) (substTy tenv ty2)
    tvs_pr (ty1, ty2) = tyVarsOfType ty1 `unionVarSet` tyVarsOfType ty2

pprEquationDoc :: (Equation, (PredType, SDoc), (PredType, SDoc)) -> SDoc
pprEquationDoc (eqn, (p1, _), (p2, _))
  = vcat [pprEquation eqn, nest 2 (ppr p1), nest 2 (ppr p2)]

mkEqnMsg :: (TcPredType, SDoc) -> (TcPredType, SDoc) -> TidyEnv
         -> TcM (TidyEnv, SDoc)
mkEqnMsg (pred1,from1) (pred2,from2) tidy_env
  = do  { pred1' <- zonkTcPredType pred1
        ; pred2' <- zonkTcPredType pred2
        ; let { pred1'' = tidyPred tidy_env pred1'
              ; pred2'' = tidyPred tidy_env pred2' }
        ; let msg = vcat [ptext (sLit "When using functional dependencies to combine"),
                          nest 2 (sep [ppr pred1'' <> comma, nest 2 from1]),
                          nest 2 (sep [ppr pred2'' <> comma, nest 2 from2])]
        ; return (tidy_env, msg) }
\end{code}
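The functional-dependency improvement that tcImproveOne and unifyEqns perform can be observed at the source level. A sketch (ordinary user code, not this module's), using the classic collection class:

```haskell
{-# LANGUAGE MultiParamTypeClasses, FunctionalDependencies, FlexibleInstances #-}

-- The dependency c -> e is what improvement exploits: once the
-- container type c is known, the element type e is forced by
-- unifying against the instance head.
class Collects e c | c -> e where
  empty  :: c
  insert :: e -> c -> c

instance Collects e [e] where
  empty  = []
  insert = (:)

twoBools :: [Bool]
twoBools = insert True (insert False empty)
  -- the signature fixes c = [Bool]; the fundep improves e to Bool,
  -- so the otherwise-ambiguous 'empty' is resolved
```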
Note [Dictionary Improvement]
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
In reduceContext, we first reduce equalities and then class constraints.
However, the latter may expose further opportunities for the former. Hence,
we need to go around again if dictionary reduction produced any dictionary
bindings. The following example demonstrates the point:
data EX _x _y (p :: * -> *)
data ANY
class Base p
class Base (Def p) => Prop p where
type Def p
instance Base ()
instance Prop () where
type Def () = ()
instance (Base (Def (p ANY))) => Base (EX _x _y p)
instance (Prop (p ANY)) => Prop (EX _x _y p) where
type Def (EX _x _y p) = EX _x _y p
data FOO x
instance Prop (FOO x) where
type Def (FOO x) = ()
data BAR
instance Prop BAR where
type Def BAR = EX () () FOO
During checking the last instance declaration, we need to check the superclass
constraint Base (Def BAR), which family normalisation reduces to
Base (EX () () FOO). Chasing the instance for Base (EX _x _y p) gives us
Base (Def (FOO ANY)), which again requires family normalisation of Def to
Base () before we can finish.
The main context-reduction function is @reduce@. Here's its game plan.
\begin{code}
reduceList :: RedEnv -> [Inst] -> Avails -> TcM Avails
reduceList env@(RedEnv {red_stack = (n,stk)}) wanteds state
  = do  { traceTc (text "reduceList " <+> (ppr wanteds $$ ppr state))
        ; dopts <- getDOpts
        ; when (debugIsOn && (n > 8)) $ do
            debugDumpTcRn (hang (ptext (sLit "Interesting! Context reduction stack depth") <+> int n)
                              2 (ifPprDebug (nest 2 (pprStack stk))))
        ; if n >= ctxtStkDepth dopts then
              failWithTc (reduceDepthErr n stk)
          else
              go wanteds state }
  where
    go []     state = return state
    go (w:ws) state = do { state' <- reduce (env {red_stack = (n+1, w:stk)}) w state
                         ; go ws state' }

reduce :: RedEnv -> Inst -> Avails -> TcM Avails
reduce env wanted avails
    -- We don't reduce equalities here; they are handled by tcReduceEqs
  | isEqInst wanted
  = return avails

    -- It's the same as an existing inst, or a superclass thereof
  | Just _ <- findAvail avails wanted
  = do { traceTc (text "reduce: found " <+> ppr wanted)
       ; return avails }

  | otherwise
  = do { traceTc (text "reduce" <+> ppr wanted $$ ppr avails)
       ; case red_try_me env wanted of
           Stop -> try_simple (addIrred NoSCs)

           ReduceMe -> do
             { (avails, lookup_result) <- reduceInst env avails wanted
             ; case lookup_result of
                 NoInstance -> addIrred AddSCs avails wanted

                 GenInst [] rhs -> addWanted AddSCs avails wanted rhs []

                 GenInst wanteds' rhs
                   -> do { avails1 <- addIrred NoSCs avails wanted
                         ; avails2 <- reduceList env wanteds' avails1
                         ; addWanted AddSCs avails2 wanted rhs wanteds' }
                        -- The temporary addIrred *before* the reduceList
                        -- puts the thing we are trying to prove into the
                        -- database before proving the things it needs;
                        -- see Note [RECURSIVE DICTIONARIES]
             } }
  where
    try_simple do_this_otherwise
      = do { res <- lookupSimpleInst wanted
           ; case res of
               GenInst [] rhs -> addWanted AddSCs avails wanted rhs []
               _              -> do_this_otherwise avails wanted }
\end{code}
Note [RECURSIVE DICTIONARIES]
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Consider
data D r = ZeroD | SuccD (r (D r));
instance (Eq (r (D r))) => Eq (D r) where
ZeroD == ZeroD = True
(SuccD a) == (SuccD b) = a == b
_ == _ = False;
equalDC :: D [] -> D [] -> Bool;
equalDC = (==);
We need to prove (Eq (D [])). Here's how we go:
   d1 : Eq (D [])
by instance decl, holds if
   d2 : Eq [D []]
   where d1 = dfEqD d2
by instance decl of Eq, holds if
   d3 : Eq (D [])
   where d2 = dfEqList d3
         d1 = dfEqD d2
But now we can "tie the knot" to give
   d3 = d1
   d2 = dfEqList d3
   d1 = dfEqD d2
and it'll even run! The trick is to put the thing we are trying to prove
(in this case Eq (D [])) into the database before trying to prove its
contributing clauses.
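The example in this note compiles and runs as an ordinary program; a self-contained version (needing a couple of extensions for the Eq (r (D r)) instance context):

```haskell
{-# LANGUAGE FlexibleContexts, UndecidableInstances #-}

data D r = ZeroD | SuccD (r (D r))

-- The instance context mentions (r (D r)), so proving Eq (D []) needs
-- Eq [D []], which in turn needs Eq (D []) again: exactly the
-- recursive dictionary that reduce ties into a knot.
instance Eq (r (D r)) => Eq (D r) where
  ZeroD     == ZeroD     = True
  (SuccD a) == (SuccD b) = a == b
  _         == _         = False

equalDC :: D [] -> D [] -> Bool
equalDC = (==)
```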
Note [SUPERCLASSLOOP 2]
~~~~~~~~~~~~~~~~~~~~~~~~
We need to be careful when adding "the constraint we are trying to prove".
Suppose we are *given* d1:Ord a, and want to deduce (d2:C [a]) where
class Ord a => C a where
instance Ord [a] => C [a] where ...
Then we'll use the instance decl to deduce C [a] from Ord [a], and then add the
superclasses of C [a] to avails. But we must not overwrite the binding
for Ord [a] (which is obtained from Ord a) with a superclass selection or we'll just
build a loop!
Here's another variant, immortalised in tcrun020
class Monad m => C1 m
class C1 m => C2 m x
instance C2 Maybe Bool
For the instance decl we need to build (C1 Maybe), and it's no good if
we run around and add (C2 Maybe Bool) and its superclasses to the avails
before we search for C1 Maybe.
Here's another example
class Eq b => Foo a b
instance Eq a => Foo [a] a
If we are reducing
(Foo [t] t)
we'll first deduce that it holds (via the instance decl). We must not
then overwrite the Eq t constraint with a superclass selection!
At first I had a gross hack, whereby I simply did not add superclass constraints
in addWanted, though I did for addGiven and addIrred. This was sub-optimal,
because it lost legitimate superclass sharing, and it still didn't do the job:
I found a very obscure program (now tcrun021) in which improvement meant the
simplifier got two bites at the cherry... so something seemed to be a Stop
first time, but reducible next time.
Now we implement the Right Solution, which is to check for loops directly
when adding superclasses. It's a bit like the occurs check in unification.
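A runnable variant of the tcrun020 shape above; this is a sketch (methods and bodies invented so there is something to execute), not the actual test:

```haskell
{-# LANGUAGE MultiParamTypeClasses #-}

-- C2's superclass (C1 Maybe) must be solved when checking
-- 'instance C2 Maybe Bool'; as the note says, the solver must find the
-- C1 Maybe instance rather than deriving it from C2's own superclasses.
class Monad m => C1 m where
  wrap :: a -> m a
  wrap = return

class C1 m => C2 m x where
  inject :: x -> m x
  inject = wrap          -- uses the C1 superclass of C2

instance C1 Maybe
instance C2 Maybe Bool
```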
%************************************************************************
%* *
Reducing a single constraint
%* *
%************************************************************************
\begin{code}
reduceInst :: RedEnv -> Avails -> Inst -> TcM (Avails, LookupInstResult)
reduceInst _ avails other_inst
  = do  { result <- lookupSimpleInst other_inst
        ; return (avails, result) }
\end{code}
Note [Equational Constraints in Implication Constraints]
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
An implication constraint is of the form
Given => Wanted
where Given and Wanted may contain both equational and dictionary
constraints. The delay and reduction of these two kinds of constraints
is distinct:
*) In the generated code, wanted Dictionary constraints are wrapped up in an
implication constraint that is created at the code site where the wanted
dictionaries can be reduced via a let-binding. This let-bound implication
constraint is deconstructed at the use-site of the wanted dictionaries.

*) While the reduction of equational constraints is also delayed, the delay
is not manifest in the generated code. The required evidence is generated
in the code directly at the use-site. There is no let-binding and deconstruction
necessary. The main disadvantage is that we cannot exploit sharing, as the
same evidence may be generated at multiple use-sites. However, this disadvantage
is limited because it only concerns coercions, which are erased.

The different treatment is motivated by the difference in representation. Dictionary
constraints require manifest runtime dictionaries, while equations require coercions,
which are types.
\begin{code}
reduceImplication :: RedEnv
                  -> Inst
                  -> TcM (TcDictBinds, [Inst])
\end{code}
Suppose we are simplifying the constraint
forall bs. extras => wanted
in the context of an overall simplification problem with givens 'givens'.
Note that
* The 'givens' need not mention any of the quantified type variables
e.g. forall {}. Eq a => Eq [a]
forall {}. C Int => D (Tree Int)
This happens when you have something like
data T a where
T1 :: Eq a => a -> T a
f :: T a -> Int
f x = ...(case x of { T1 v -> v==v })...
\begin{code}
reduceImplication env
  orig_implic@(ImplicInst { tci_name = name, tci_loc = inst_loc,
                            tci_tyvars = tvs,
                            tci_given = extra_givens, tci_wanted = wanteds })
  = do  { let try_me _ = ReduceMe
              env' = env { red_givens = extra_givens ++ red_givens env
                         , red_doc = sep [ptext (sLit "reduceImplication for")
                                            <+> ppr name,
                                          nest 2 (parens $ ptext (sLit "within")
                                                             <+> red_doc env)]
                         , red_try_me = try_me }

        ; traceTc (text "reduceImplication" <+> vcat
                      [ ppr (red_givens env), ppr extra_givens,
                        ppr wanteds ])
        ; (irreds, binds) <- checkLoop env' wanteds
        ; traceTc (text "reduceImplication result" <+> vcat
                      [ppr irreds, ppr binds])

          -- If we made no progress at all, back off and return
          -- the implication unchanged
        ; didntSolveWantedEqs <- allM wantedEqInstIsUnsolved wanteds
        ; let backOff = isEmptyLHsBinds binds &&    -- no new bindings
                        (not $ null irreds)   &&    -- and some irreds remain
                        didntSolveWantedEqs

        ; if backOff then
              return (emptyBag, [orig_implic])
          else do
      { (simpler_implic_insts, bind)
          <- makeImplicationBind inst_loc tvs extra_givens irreds

      ; let (extra_eq_givens, extra_dict_givens) = partition isEqInst extra_givens
            eq_cotvs = map instToVar extra_eq_givens
            dict_ids = map instToId  extra_dict_givens

              -- See Note [Always inline implication constraints]
            wrap_inline | null dict_ids = idHsWrapper
                        | otherwise     = WpInline
            co  = wrap_inline
                  <.> mkWpTyLams tvs
                  <.> mkWpTyLams eq_cotvs
                  <.> mkWpLams dict_ids
                  <.> WpLet (binds `unionBags` bind)
            rhs = mkLHsWrap co payload
            loc = instLocSpan inst_loc
            dict_bndrs = map (L loc . HsVar . instToId)
                         . filter (not . isEqInst)
                         $ wanteds
            payload = mkBigLHsTup dict_bndrs

      ; traceTc (vcat [text "reduceImplication" <+> ppr name,
                       ppr simpler_implic_insts,
                       text "->" <+> ppr rhs])
      ; return (unitBag (L loc (VarBind (instToId orig_implic) rhs)),
                simpler_implic_insts) } }

reduceImplication _ i = pprPanic "reduceImplication" (ppr i)
\end{code}
Note [Always inline implication constraints]
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Suppose an implication constraint floats out of an INLINE function.
Then although the implication has a single call site, it won't be
inlined. And that is bad because it means that even if there is really
*no* overloading (type signatures specify the exact types) there will
still be dictionary passing in the resulting code. To avert this,
we mark the implication constraints themselves as INLINE, at least when
there is no loss of sharing as a result.
Note [Freeness and implications]
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
It's hard to say when an implication constraint can be floated out. Consider
forall {} Eq a => Foo [a]
The (Foo [a]) doesn't mention any of the quantified variables, but it
still might be partially satisfied by the (Eq a).
There is a useful special case when it *is* easy to partition the
constraints, namely when there are no 'givens'. Consider
forall {a}. () => Bar b
There are no 'givens', and so there is no reason to capture (Bar b).
We can let it float out. But if there is even one constraint we
must be much more careful:
forall {a}. C a b => Bar (m b)
because (C a b) might have a superclass (D b), from which we might
deduce (Bar [b]) when m later gets instantiated to []. Ha!
Here is an even more exotic example
class C a => D a b
Now consider the constraint
forall b. D Int b => C Int
We can satisfy the (C Int) from the superclass of D, so we don't want
to float the (C Int) out, even though it mentions no type variable in
the constraints!
One more example: the constraint
class C a => D a b
instance (C a, E c) => E (a,c)
constraint: forall b. D Int b => E (Int,c)
You might think that the (D Int b) can't possibly contribute
to solving (E (Int,c)), since the latter mentions 'c'. But
in fact it can, because solving the (E (Int,c)) constraint needs
dictionaries
C Int, E c
and the (C Int) can be satisfied from the superclass of (D Int b).
So we must still not float (E (Int,c)) out.
To think about: special cases for unary type classes?
Note [Pruning the givens in an implication constraint]
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Suppose we are about to form the implication constraint
forall tvs. Eq a => Ord b
The (Eq a) cannot contribute to the (Ord b), because it has no access to
the type variable 'b'. So we could filter out the (Eq a) from the givens.
But BE CAREFUL of the examples above in [Freeness and implications].
Doing so would be a bit tidier, but all the implication constraints get
simplified away by the optimiser, so it's no great win. So I don't take
advantage of that at the moment.
If you do, BE CAREFUL of wobbly type variables.
%************************************************************************
%* *
Avails and AvailHow: the pool of evidence
%* *
%************************************************************************
\begin{code}
data Avails = Avails !ImprovementDone !AvailEnv

type ImprovementDone = Bool     -- True <=> some unification has happened

type AvailEnv = FiniteMap Inst AvailHow

data AvailHow
  = IsIrred             -- Used for irreducible dictionaries,
                        -- which are going to be lambda-bound

  | Given Inst          -- Used for dictionaries for which we have a binding
                        -- e.g. those "given" in a signature

  | Rhs                 -- Used when there is a RHS
        (LHsExpr TcId)  -- The RHS
        [Inst]          -- Insts free in the RHS; we need these too

instance Outputable Avails where
  ppr = pprAvails

pprAvails :: Avails -> SDoc
pprAvails (Avails imp avails)
  = vcat [ ptext (sLit "Avails") <> (if imp then ptext (sLit "[improved]") else empty)
         , nest 2 $ braces $
           vcat [ sep [ppr inst, nest 2 (equals <+> ppr avail)]
                | (inst,avail) <- fmToList avails ]]

instance Outputable AvailHow where
  ppr = pprAvail

pprAvail :: AvailHow -> SDoc
pprAvail IsIrred      = text "Irred"
pprAvail (Given x)    = text "Given" <+> ppr x
pprAvail (Rhs rhs bs) = sep [text "Rhs" <+> ppr bs,
                             nest 2 (ppr rhs)]

extendAvailEnv :: AvailEnv -> Inst -> AvailHow -> AvailEnv
extendAvailEnv env inst avail = addToFM env inst avail

findAvailEnv :: AvailEnv -> Inst -> Maybe AvailHow
findAvailEnv env wanted = lookupFM env wanted

emptyAvails :: Avails
emptyAvails = Avails False emptyFM

findAvail :: Avails -> Inst -> Maybe AvailHow
findAvail (Avails _ avails) wanted = findAvailEnv avails wanted

elemAvails :: Inst -> Avails -> Bool
elemAvails wanted (Avails _ avails) = wanted `elemFM` avails

extendAvails :: Avails -> Inst -> AvailHow -> TcM Avails
-- Does improvement, too
extendAvails avails@(Avails imp env) inst avail
  = do  { imp1 <- tcImproveOne avails inst
        ; return (Avails (imp || imp1) (extendAvailEnv env inst avail)) }

availsInsts :: Avails -> [Inst]
availsInsts (Avails _ avails) = keysFM avails

availsImproved :: Avails -> ImprovementDone
availsImproved (Avails imp _) = imp
\end{code}
Extracting the bindings from a bunch of Avails.
The bindings do *not* come back sorted in dependency order.
We assume that they'll be wrapped in a big Rec, so that the
dependency analyser can sort them out later.
\begin{code}
type DoneEnv = FiniteMap Inst [Id]
-- Each inst is mapped to the Ids that are bound to it

extractResults :: Avails
               -> [Inst]                -- Wanted
               -> TcM (TcDictBinds,     -- Bindings
                       [Inst],          -- The insts bound by the bindings
                       [Inst])          -- Irreducible ones
extractResults (Avails _ avails) wanteds
  = go emptyBag [] [] emptyFM wanteds
  where
    go :: TcDictBinds   -- Bindings for dicts
       -> [Inst]        -- Bound by the bindings
       -> [Inst]        -- Irreducibles
       -> DoneEnv       -- Has an entry for each inst in the above three sets
       -> [Inst]        -- Wanted
       -> TcM (TcDictBinds, [Inst], [Inst])
    go binds bound_dicts irreds _ []
      = return (binds, bound_dicts, irreds)

    go binds bound_dicts irreds done (w:ws)
      | isEqInst w
      = go binds bound_dicts (w:irreds) done' ws

      | Just done_ids@(done_id : rest_done_ids) <- lookupFM done w
      = if w_id `elem` done_ids then
          go binds bound_dicts irreds done ws
        else
          go (add_bind (nlHsVar done_id)) bound_dicts irreds
             (addToFM done w (done_id : w_id : rest_done_ids)) ws

      | otherwise       -- Not yet done
      = case findAvailEnv avails w of
          Nothing -> pprTrace "Urk: extractResults" (ppr w) $
                     go binds bound_dicts irreds done ws

          Just IsIrred -> go binds bound_dicts (w:irreds) done' ws

          Just (Rhs rhs ws') -> go (add_bind rhs) (w:bound_dicts) irreds done' (ws' ++ ws)

          Just (Given g) -> go binds' bound_dicts irreds (addToFM done w [g_id]) ws
            where
              g_id = instToId g         -- May differ from w_id
              binds' | w_id == g_id = binds
                     | otherwise    = add_bind (nlHsVar g_id)
      where
        w_id  = instToId w
        done' = addToFM done w [w_id]
        add_bind rhs = addInstToDictBind binds w rhs
\end{code}
Note [No superclasses for Stop]
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
When we decide not to reduce an Inst, we still add it to avails, so
that any other equal Insts will be commoned up right here.  However,
we do *not* add its superclasses.  If we have
	df :: Floating a
	dn :: Num a
but 'a' is not bound here, then we *don't* want to derive dn from df
here lest we lose sharing.
\begin{code}
addWanted :: WantSCs -> Avails -> Inst -> LHsExpr TcId -> [Inst] -> TcM Avails
addWanted want_scs avails wanted rhs_expr wanteds
  = addAvailAndSCs want_scs avails wanted avail
  where
    avail = Rhs rhs_expr wanteds

addGiven :: Avails -> Inst -> TcM Avails
addGiven avails given
  = addAvailAndSCs want_scs avails given (Given given)
  where
    want_scs = case instLocOrigin (instLoc given) of
                 NoScOrigin -> NoSCs
                 _other     -> AddSCs
\end{code}
\begin{code}
addIrred :: WantSCs -> Avails -> Inst -> TcM Avails
addIrred want_scs avails irred
  = ASSERT2( not (irred `elemAvails` avails), ppr irred $$ ppr avails )
    addAvailAndSCs want_scs avails irred IsIrred

addAvailAndSCs :: WantSCs -> Avails -> Inst -> AvailHow -> TcM Avails
addAvailAndSCs want_scs avails inst avail
  | not (isClassDict inst) = extendAvails avails inst avail
  | NoSCs <- want_scs      = extendAvails avails inst avail
  | otherwise              = do { traceTc (text "addAvailAndSCs" <+> vcat [ppr inst, ppr deps])
                                ; avails' <- extendAvails avails inst avail
                                ; addSCs is_loop avails' inst }
  where
    is_loop pred = any (`tcEqType` mkPredTy pred) dep_tys
                        -- Note: this compares by *type*, not by Inst
    deps    = findAllDeps (unitVarSet (instToVar inst)) avail
    dep_tys = map idType (varSetElems deps)

    findAllDeps :: IdSet -> AvailHow -> IdSet
    -- Find all the Insts that this one depends on, transitively
    findAllDeps so_far (Rhs _ kids) = foldl find_all so_far kids
    findAllDeps so_far _            = so_far

    find_all :: IdSet -> Inst -> IdSet
    find_all so_far kid
      | isEqInst kid                       = so_far
      | kid_id `elemVarSet` so_far         = so_far
      | Just avail <- findAvail avails kid = findAllDeps so_far' avail
      | otherwise                          = so_far'
      where
        so_far' = extendVarSet so_far kid_id
        kid_id  = instToId kid

addSCs :: (TcPredType -> Bool) -> Avails -> Inst -> TcM Avails
-- Add all the superclasses of the Inst to Avails
-- The first argument says when we should stop (to avoid loops)
addSCs is_loop avails dict
  = ASSERT( isDict dict )
    do { sc_dicts <- newDictBndrs (instLoc dict) sc_theta'
       ; foldlM add_sc avails (zipEqual "add_scs" sc_dicts sc_sels) }
  where
    (clas, tys) = getDictClassTys dict
    (tyvars, sc_theta, sc_sels, _) = classBigSig clas
    sc_theta' = filter (not . isEqPred) $
                substTheta (zipTopTvSubst tyvars tys) sc_theta

    add_sc avails (sc_dict, sc_sel)
      | is_loop (dictPred sc_dict) = return avails     -- Avoid superclass loops
      | is_given sc_dict           = return avails
      | otherwise                  = do { avails' <- extendAvails avails sc_dict (Rhs sc_sel_rhs [dict])
                                        ; addSCs is_loop avails' sc_dict }
      where
        sc_sel_rhs = L (instSpan dict) (HsWrap co_fn (HsVar sc_sel))
        co_fn      = WpApp (instToVar dict) <.> mkWpTyApps tys

    is_given :: Inst -> Bool
    is_given sc_dict = case findAvail avails sc_dict of
                         Just (Given _) -> True -- Given is cheaper than superclass selection
                         _              -> False

-- From the a set of insts, extract the wanted equalities implied by
-- the superclass contexts of their classes, transitively
ancestorEqualities :: [Inst] -> TcM [Inst]
ancestorEqualities
  = mapM mkWantedEqInst         -- turn the equality predicates into Insts
  . filter isEqPred             -- keep the equalities only
  . bagToList
  . addAEsToBag emptyBag        -- collect the superclass constraints
  . map dictPred                -- extract the predicates
  . filter isClassDict          -- only look at class dictionaries
  where
    addAEsToBag :: Bag PredType -> [PredType] -> Bag PredType
    addAEsToBag bag []           = bag
    addAEsToBag bag (pred:preds)
      | pred `elemBag` bag = addAEsToBag bag preds
      | isEqPred pred      = addAEsToBag bagWithPred preds
      | isClassPred pred   = addAEsToBag bagWithPred predsWithSCs
      | otherwise          = addAEsToBag bag preds
      where
        bagWithPred  = bag `snocBag` pred
        predsWithSCs = preds ++ substTheta (zipTopTvSubst tyvars tys) sc_theta
        --
        (tyvars, sc_theta, _, _) = classBigSig clas
        (clas, tys)              = getClassPredTys pred
\end{code}
%************************************************************************
%* *
\section{tcSimplifyTop: defaulting}
%* *
%************************************************************************
@tcSimplifyTop@ is called once per module to simplify all the constant
and ambiguous Insts.
We need to be careful of one case.  Suppose we have
	instance Num a => Num (Foo a b) where ...
and @tcSimplifyTop@ is given a constraint (Num (Foo x y)).  Then it'll simplify
to (Num x), and default x to Int.  But what about y??

It's OK: the final zonking stage should zap y to (), which is fine.
\begin{code}
tcSimplifyTop, tcSimplifyInteractive :: [Inst] -> TcM TcDictBinds
tcSimplifyTop wanteds
  = tc_simplify_top doc False wanteds
  where
    doc = text "tcSimplifyTop"

tcSimplifyInteractive wanteds
  = tc_simplify_top doc True wanteds
  where
    doc = text "tcSimplifyInteractive"

-- The TcLclEnv should be valid here, solely to improve
-- error message generation for the monomorphism restriction
tc_simplify_top :: SDoc -> Bool -> [Inst] -> TcM (Bag (LHsBind TcId))
tc_simplify_top doc interactive wanteds
  = do { dflags <- getDOpts
       ; wanteds <- zonkInsts wanteds
       ; mapM_ zonkTopTyVar (varSetElems (tyVarsOfInsts wanteds))

       ; traceTc (text "tc_simplify_top 0: " <+> ppr wanteds)
       ; (irreds1, binds1) <- tryHardCheckLoop doc1 wanteds
       ; traceTc (text "tc_simplify_top 1: " <+> ppr irreds1)
       ; (irreds2, binds2) <- approximateImplications doc2 (\_ -> True) irreds1
       ; traceTc (text "tc_simplify_top 2: " <+> ppr irreds2)

       ; (irreds3, binds3) <- disambiguate doc3 interactive dflags irreds2

       ; let (bad_ips, non_ips) = partition isIPDict irreds3
             (ambigs, others)   = partition isTyVarDict non_ips

       ; topIPErrs bad_ips
       ; addNoInstanceErrs others
       ; addTopAmbigErrs ambigs

       ; return (binds1 `unionBags` binds2 `unionBags` binds3) }
  where
    doc1 = doc <+> ptext (sLit "(first round)")
    doc2 = doc <+> ptext (sLit "(approximate)")
    doc3 = doc <+> ptext (sLit "(disambiguate)")
\end{code}
If a dictionary constrains a type variable which is
* not mentioned in the environment
* and not mentioned in the type of the expression
then it is ambiguous. No further information will arise to instantiate
the type variable; nor will it be generalised and turned into an extra
parameter to a function.
It is an error for this to occur, except that Haskell provides for
certain rules to be applied in the special case of numeric types.
Specifically, if
	* at least one of its classes is a numeric class, and
	* all of its classes are numeric or standard
then the type variable can be defaulted to the first type in the
default-type list which is an instance of all the offending classes.
So here is the function which does the work.  It takes the ambiguous
dictionaries and either resolves them (producing bindings) or
complains.  It works by splitting the dictionary list by type
variable, and using @disambigGroup@ to do the real business.
@disambigGroup@ assumes that its argument dictionaries all constrain
the same type variable.
ADR Comment 20/6/94: I've changed the @CReturnable@ case to default to
@()@ instead of @Int@. I reckon this is the Right Thing to do since
the most common use of defaulting is code like:
\begin{verbatim}
_ccall_ foo `seqPrimIO` bar
\end{verbatim}
Since we're not using the result of @foo@, the result is (presumably)
@void@.
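The defaulting rules described above are observable in ordinary source
programs.  A minimal, self-contained illustration (plain Haskell, nothing
GHC-internal):

```haskell
-- An ambiguous type variable constrained only by standard classes,
-- at least one of them numeric, is defaulted to the first suitable
-- type in the default list (Integer, then Double).
main :: IO ()
main = do
  print (2 ^ 64)   -- (Num a, Show a): defaults to Integer, so no overflow
  print (1 / 4)    -- Fractional rules out Integer; defaults to Double
```

Note that 2^64 prints its full value because the base defaults to the
unbounded Integer, not a fixed-width machine type.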
\begin{code}
disambiguate :: SDoc -> Bool -> DynFlags -> [Inst] -> TcM ([Inst], TcDictBinds)
disambiguate doc interactive dflags insts
  | null insts
  = return (insts, emptyBag)

  | null defaultable_groups
  = do { traceTc (text "disambiguate, no defaultable groups" <+> vcat [ppr unaries, ppr insts, ppr bad_tvs, ppr defaultable_groups])
       ; return (insts, emptyBag) }

  | otherwise
  = do { -- Figure out what default types to use
         default_tys <- getDefaultTys extended_defaulting ovl_strings

       ; traceTc (text "disambiguate1" <+> vcat [ppr insts, ppr unaries, ppr bad_tvs, ppr defaultable_groups])
       ; mapM_ (disambigGroup default_tys) defaultable_groups
       ; tryHardCheckLoop doc insts }
  where
    extended_defaulting = interactive || dopt Opt_ExtendedDefaultRules dflags
    ovl_strings = dopt Opt_OverloadedStrings dflags

    unaries :: [(Inst, Class, TcTyVar)]  -- (C tv) constraints
    bad_tvs :: TcTyVarSet                -- Tyvars mentioned by *other* constraints
    (unaries, bad_tvs_s) = partitionWith find_unary insts
    bad_tvs              = unionVarSets bad_tvs_s

    find_unary d@(Dict {tci_pred = ClassP cls [ty]})
      | Just tv <- tcGetTyVar_maybe ty = Left (d,cls,tv)
    find_unary inst                    = Right (tyVarsOfInst inst)

    -- Group by type variable
    defaultable_groups :: [[(Inst,Class,TcTyVar)]]
    defaultable_groups = filter defaultable_group (equivClasses cmp_tv unaries)
    cmp_tv (_,_,tv1) (_,_,tv2) = tv1 `compare` tv2

    defaultable_group :: [(Inst,Class,TcTyVar)] -> Bool
    defaultable_group ds@((_,_,tv):_)
      =  isTyConableTyVar tv    -- Note [Avoiding spurious errors]
      && not (tv `elemVarSet` bad_tvs)
      && defaultable_classes [c | (_,c,_) <- ds]
    defaultable_group [] = panic "defaultable_group"

    defaultable_classes clss
      | extended_defaulting = any isInteractiveClass clss
      | otherwise           = all is_std_class clss && (any is_num_class clss)

    -- In interactive mode, or with -XExtendedDefaultRules,
    -- we default Show a to Show () to avoid graticuitous errors on "show []"
    isInteractiveClass cls
      = is_num_class cls || (classKey cls `elem` [showClassKey, eqClassKey, ordClassKey])

    is_num_class cls = isNumericClass cls || (ovl_strings && (cls `hasKey` isStringClassKey))
    is_std_class cls = isStandardClass cls || (ovl_strings && (cls `hasKey` isStringClassKey))

disambigGroup :: [Type]                 -- The default types
              -> [(Inst,Class,TcTyVar)] -- All constrain the same type variable
              -> TcM ()
disambigGroup default_tys dicts
  = do { mb_chosen_ty <- try_default default_tys
       ; case mb_chosen_ty of
           Nothing        -> return ()
           Just chosen_ty -> do { _ <- unifyType chosen_ty (mkTyVarTy tyvar)
                                ; warnDefault dicts chosen_ty } }
  where
    (_,_,tyvar) = ASSERT( not (null dicts) ) head dicts
    classes     = [c | (_,c,_) <- dicts]

    try_default [] = return Nothing
    try_default (default_ty : default_tys)
      = tryTcLIE_ (try_default default_tys) $
        do { tcSimplifyDefault [mkClassPred clas [default_ty] | clas <- classes]
           ; return (Just default_ty) }

getDefaultTys :: Bool -> Bool -> TcM [Type]
getDefaultTys extended_deflts ovl_strings
  = do { mb_defaults <- getDeclaredDefaultTys
       ; case mb_defaults of {
           Just tys -> return tys ;     -- User-supplied defaults
           Nothing  -> do

       -- No user-supplied defaults; use [Integer, Double]
       { integer_ty <- tcMetaTy integerTyConName
       ; checkWiredInTyCon doubleTyCon
       ; string_ty <- tcMetaTy stringTyConName
       ; return (opt_deflt extended_deflts unitTy
                        -- Note [Default unitTy]
                 ++ [integer_ty, doubleTy]
                 ++ opt_deflt ovl_strings string_ty) } } }
  where
    opt_deflt True  ty = [ty]
    opt_deflt False _  = []
\end{code}
Note [Default unitTy]
~~~~~~~~~~~~~~~~~~~~~
In interactive mode (or with -XExtendedDefaultRules) we add () as the first
type we try when defaulting.  This has very little real impact, except in the
following case.  Consider:
	Text.Printf.printf "hello"
This has type (forall a. IO a); it prints "hello", and returns 'undefined'.  We
don't want the GHCi repl loop to try to print that 'undefined'.  The neatest
thing is to default the 'a' to (), rather than to Integer (which is what would
otherwise happen), and then GHCi doesn't attempt to print the ().  So in
interactive mode, we add () to the list of defaulting types.  See Trac #1200.
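The same behaviour can be observed in a compiled module by enabling the
extension by hand; a small sketch:

```haskell
{-# LANGUAGE ExtendedDefaultRules #-}
-- With -XExtendedDefaultRules (implied by GHCi's interactive mode),
-- Show counts as a defaultable class and () heads the default-type
-- list, so the ambiguous element type of the empty list below
-- defaults to ().  Without the extension this is an ambiguity error.
main :: IO ()
main = putStrLn (show [])   -- show ([] :: [()]) prints []
```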
Note [Avoiding spurious errors]
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
When doing the unification for defaulting, we check for skolem
type variables, and simply don't default them. For example:
f = (*)
g :: Num a => a -> a
g x = f x x
Here, we get a complaint when checking the type signature for g,
that g isn't polymorphic enough; but then we get another one when
dealing with the (Num a) context arising from f's definition:
we try to unify a with Int (to default it), but find that it has
already been unified with the rigid variable from g's type signature.
%************************************************************************
%* *
\subsection[simple]{@Simple@ versions}
%* *
%************************************************************************
Much simpler versions when there are no bindings to make!
@tcSimplifyDeriv@ simplifies class-type constraints formed by
@deriving@ declarations and when specialising instances.  We are
only interested in the simplified bunch of class/type constraints.

It simplifies to constraints of the form (C a b c) where
a,b,c are type variables.  This is required for the context of
instance declarations.
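From the user's point of view, the simplified constraints become the inferred
instance context; a small example (ordinary Haskell, nothing GHC-internal):

```haskell
-- The 'deriving' clause makes the compiler infer the instance context
--     instance (Eq a, Eq b) => Eq (Pair a b)
-- by simplifying the wanted constraints down to the (C tyvar) form
-- described above.
data Pair a b = Pair a b deriving (Eq, Show)

main :: IO ()
main = print (Pair (1 :: Int) 'x' == Pair 1 'x')
```

If a field's type admitted no such simplification, a standalone
`deriving instance` declaration would let the user write the context
themselves, as the error message in the code below suggests.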
\begin{code}
tcSimplifyDeriv :: InstOrigin
                -> [TyVar]
                -> ThetaType            -- Wanted
                -> TcM ThetaType        -- Needed
-- Given  instance (wanted) => C inst_ty
-- Simplify 'wanted' as much as possible

tcSimplifyDeriv orig tyvars theta
  = do { (tvs, _, tenv) <- tcInstTyVars tyvars
        -- The main loop may do unification, and that may crash if
        -- it doesn't see a TcTyVar, so we have to instantiate. Sigh
        -- ToDo: what if two of them do get unified?
       ; wanteds <- newDictBndrsO orig (substTheta tenv theta)
       ; (irreds, _) <- tryHardCheckLoop doc wanteds

       ; let (tv_dicts, others) = partition ok irreds
             (tidy_env, tidy_insts) = tidyInsts others
       ; reportNoInstances tidy_env Nothing [alt_fix] tidy_insts
        -- See Note [Exotic derived instance contexts] in TcMType

       ; let rev_env = zipTopTvSubst tvs (mkTyVarTys tyvars)
             simpl_theta = substTheta rev_env (map dictPred tv_dicts)
                -- This reverse-mapping is a pain, but the result
                -- should mention the original TyVars not TcTyVars

       ; return simpl_theta }
  where
    doc = ptext (sLit "deriving classes for a data type")

    ok dict | isDict dict = validDerivPred (dictPred dict)
            | otherwise   = False

    alt_fix = vcat [ptext (sLit "use a standalone 'deriving instance' declaration instead,"),
                    ptext (sLit "so you can specify the instance context yourself")]
\end{code}
@tcSimplifyDefault@ just checks class-type constraints, essentially;
used with \tr{default} declarations.  We are only interested in
whether it worked or not.
\begin{code}
tcSimplifyDefault :: ThetaType  -- Wanted; has no type variables in it
                  -> TcM ()
tcSimplifyDefault theta = do
    wanteds <- newDictBndrsO DefaultOrigin theta
    (irreds, _) <- tryHardCheckLoop doc wanteds
    addNoInstanceErrs irreds
    if null irreds then
        return ()
    else
        traceTc (ptext (sLit "tcSimplifyDefault failing")) >> failM
  where
    doc = ptext (sLit "default declaration")
\end{code}
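The user-visible counterpart of this check is the \tr{default} declaration
itself: each candidate type must satisfy all the class constraints in play,
and defaulting picks the first type in the declared list that does.  A small
sketch:

```haskell
-- The declared list replaces the standard (Integer, Double) default
-- list; tcSimplifyDefault verifies, for each candidate type, that the
-- required class constraints can be solved.
default (Double, Integer)

main :: IO ()
main = do
  print 3                    -- (Num a, Show a): defaults to Double now
  print (length "abc" + 1)   -- length returns Int; no defaulting needed
```

Because Double comes first in the declared list, the ambiguous numeric
literal prints as 3.0 rather than 3.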
%************************************************************************
%* *
\section{Errors and contexts}
%* *
%************************************************************************
ToDo: for these error messages, should we note the location as coming
from the insts, or just whatever seems to be around in the monad just
now?
\begin{code}
groupErrs :: ([Inst] -> TcM ())  -- Deal with one group
          -> [Inst]              -- The offending Insts
          -> TcM ()
-- Group together insts with the same location,
-- so we report them together in error messages

groupErrs _ []
  = return ()
groupErrs report_err (inst:insts)
  = do { do_one (inst:friends)
       ; groupErrs report_err others }
  where
    -- (It may seem a bit crude to compare the error messages,
    -- but it makes sure that we combine just what the user sees,
    -- and it avoids the need for equality on InstLocs.)
    (friends, others) = partition is_friend insts
    loc_msg           = showSDoc (pprInstLoc (instLoc inst))
    is_friend friend  = showSDoc (pprInstLoc (instLoc friend)) == loc_msg
    do_one insts = setInstCtxt (instLoc (head insts)) (report_err insts)

addInstLoc :: [Inst] -> Message -> Message
addInstLoc insts msg = msg $$ nest 2 (pprInstArising (head insts))

addTopIPErrs :: [Name] -> [Inst] -> TcM ()
addTopIPErrs _ []
  = return ()
addTopIPErrs bndrs ips
  = do { dflags <- getDOpts
       ; addErrTcM (tidy_env, mk_msg dflags tidy_ips) }
  where
    (tidy_env, tidy_ips) = tidyInsts ips
    mk_msg dflags ips
      = vcat [sep [ptext (sLit "Implicit parameters escape from"),
                   nest 2 (ptext (sLit "the monomorphic top-level binding")
                           <> plural bndrs <+> ptext (sLit "of")
                           <+> pprBinders bndrs <> colon)],
              nest 2 (vcat (map ppr_ip ips)),
              monomorphism_fix dflags]
    ppr_ip ip = pprPred (dictPred ip) <+> pprInstArising ip

topIPErrs :: [Inst] -> TcM ()
topIPErrs dicts
  = groupErrs report tidy_dicts
  where
    (tidy_env, tidy_dicts) = tidyInsts dicts
    report dicts = addErrTcM (tidy_env, mk_msg dicts)
    mk_msg dicts = addInstLoc dicts (ptext (sLit "Unbound implicit parameter") <>
                                     plural tidy_dicts <+> pprDictsTheta tidy_dicts)
addNoInstanceErrs :: [Inst]     -- Wanted (can include implications)
                  -> TcM ()
addNoInstanceErrs insts
  = do { let (tidy_env, tidy_insts) = tidyInsts insts
       ; reportNoInstances tidy_env Nothing [] tidy_insts }

reportNoInstances
        :: TidyEnv
        -> Maybe (InstLoc, [Inst])      -- Nothing => top level
                                        -- Just (d,g) => d describes the construct
                                        --               with givens g
        -> [SDoc]                       -- Alternative fix suggestions
        -> [Inst]                       -- What is wanted (can include implications)
        -> TcM ()
reportNoInstances tidy_env mb_what alt_fix insts
  = groupErrs (report_no_instances tidy_env mb_what alt_fix) insts

report_no_instances :: TidyEnv -> Maybe (InstLoc, [Inst]) -> [SDoc] -> [Inst] -> TcM ()
report_no_instances tidy_env mb_what alt_fixes insts
  = do { inst_envs <- tcGetInstEnvs
       ; let (implics, insts1)  = partition isImplicInst insts
             (insts2, overlaps) = partitionWith (check_overlap inst_envs) insts1
             (eqInsts, insts3)  = partition isEqInst insts2
       ; traceTc (text "reportNoInstances" <+> vcat
                    [ppr insts, ppr implics, ppr insts1, ppr insts2])
       ; mapM_ complain_implic implics
       ; mapM_ (\doc -> addErrTcM (tidy_env, doc)) overlaps
       ; groupErrs complain_no_inst insts3
       ; mapM_ (addErrTcM . mk_eq_err) eqInsts
       }
  where
    complain_no_inst insts = addErrTcM (tidy_env, mk_no_inst_err insts)

    complain_implic inst        -- Recurse!
      = reportNoInstances tidy_env
                          (Just (tci_loc inst, tci_given inst))
                          alt_fixes (tci_wanted inst)

    check_overlap :: (InstEnv,InstEnv) -> Inst -> Either Inst SDoc
        -- Right msg  => overlap message
        -- Left  inst => no instance
    check_overlap inst_envs wanted
      | not (isClassDict wanted) = Left wanted
      | otherwise
      = case lookupInstEnv inst_envs clas tys of
          ([], _) -> Left wanted        -- No match
          ([_],[])
            | debugIsOn -> pprPanic "reportNoInstance" (ppr wanted)
                -- Exactly one match with no unifiers should already
                -- have been reduced, so this should not happen
          res -> Right (mk_overlap_msg wanted res)
      where
        (clas,tys) = getDictClassTys wanted

    mk_overlap_msg dict (matches, unifiers)
      = ASSERT( not (null matches) )
        vcat [ addInstLoc [dict] ((ptext (sLit "Overlapping instances for")
                                   <+> pprPred (dictPred dict))),
               sep [ptext (sLit "Matching instances") <> colon,
                    nest 2 (vcat [pprInstances ispecs, pprInstances unifiers])],
               if not (isSingleton matches)
               then     -- Two or more matches
                 empty
               else     -- One match, plus some unifiers
                 ASSERT( not (null unifiers) )
                 parens (vcat [ptext (sLit "The choice depends on the instantiation of") <+>
                               quotes (pprWithCommas ppr (varSetElems (tyVarsOfInst dict))),
                               ptext (sLit "To pick the first instance above, use -XIncoherentInstances"),
                               ptext (sLit "when compiling the other instance declarations")])]
      where
        ispecs = [ispec | (ispec, _) <- matches]

    mk_eq_err :: Inst -> (TidyEnv, SDoc)
    mk_eq_err inst = misMatchMsg tidy_env (eqInstTys inst)

    mk_no_inst_err insts
      | null insts = empty

      | Just (loc, givens) <- mb_what,  -- Nested (type signatures, instance decls)
        not (isEmptyVarSet (tyVarsOfInsts insts))
      = vcat [ addInstLoc insts $
               sep [ ptext (sLit "Could not deduce") <+> pprDictsTheta insts
                   , nest 2 $ ptext (sLit "from the context") <+> pprDictsTheta givens]
             , show_fixes (fix1 loc : fixes2 ++ alt_fixes) ]

      | otherwise       -- Top level
      = vcat [ addInstLoc insts $
               ptext (sLit "No instance") <> plural insts
                 <+> ptext (sLit "for") <+> pprDictsTheta insts
             , show_fixes (fixes2 ++ alt_fixes) ]

      where
        fix1 loc = sep [ ptext (sLit "add") <+> pprDictsTheta insts
                           <+> ptext (sLit "to the context of"),
                         nest 2 (ppr (instLocOrigin loc)) ]

        fixes2 | null instance_dicts = []
               | otherwise           = [sep [ptext (sLit "add an instance declaration for"),
                                             pprDictsTheta instance_dicts]]
        instance_dicts = [d | d <- insts, isClassDict d, not (isTyVarDict d)]
                -- Insts for which it is worth suggesting an instance declaration

        show_fixes :: [SDoc] -> SDoc
        show_fixes []     = empty
        show_fixes (f:fs) = sep [ptext (sLit "Possible fix:"),
                                 nest 2 (vcat (f : map (ptext (sLit "or") <+>) fs))]
addTopAmbigErrs :: [Inst] -> TcRn ()
addTopAmbigErrs dicts
-- Divide into groups that share a common set of ambiguous tyvars
  = ifErrsM (return ()) $       -- Only report ambiguity if no other errors happened
    mapM_ report (equivClasses cmp [(d, tvs_of d) | d <- tidy_dicts])
  where
    (tidy_env, tidy_dicts) = tidyInsts dicts

    tvs_of :: Inst -> [TcTyVar]
    tvs_of d = varSetElems (tyVarsOfInst d)
    cmp (_,tvs1) (_,tvs2) = tvs1 `compare` tvs2

    report :: [(Inst,[TcTyVar])] -> TcM ()
    report pairs@((inst,tvs) : _) = do  -- The pairs share a common set of ambiguous tyvars
        (tidy_env, mono_msg) <- mkMonomorphismMsg tidy_env tvs
        setSrcSpan (instSpan inst) $
          addErrTcM (tidy_env, msg $$ mono_msg)
      where
        dicts  = map fst pairs
        msg    = sep [text "Ambiguous type variable" <> plural tvs <+>
                      pprQuotedList tvs <+> in_msg,
                      nest 2 (pprDictsInFull dicts)]
        in_msg = text "in the constraint" <> plural dicts <> colon
    report [] = panic "addTopAmbigErrs"

mkMonomorphismMsg :: TidyEnv -> [TcTyVar] -> TcM (TidyEnv, Message)
-- There's an error with these Insts; if they have free type variables
-- it's probably caused by the monomorphism restriction
mkMonomorphismMsg tidy_env inst_tvs
  = do { dflags <- getDOpts
       ; (tidy_env, docs) <- findGlobals (mkVarSet inst_tvs) tidy_env
       ; return (tidy_env, mk_msg dflags docs) }
  where
    mk_msg _ _ | any isRuntimeUnk inst_tvs
      = vcat [ptext (sLit "Cannot resolve unknown runtime types:") <+>
              (pprWithCommas ppr inst_tvs),
              ptext (sLit "Use :print or :force to determine these types")]
    mk_msg _ []         -- Happens in things like
                        --      f x = show (read "foo")
                        -- where monomorphism doesn't play any role
      = ptext (sLit "Probable fix: add a type signature that fixes these type variable(s)")
    mk_msg dflags docs
      = vcat [ptext (sLit "Possible cause: the monomorphism restriction applied to the following:"),
              nest 2 (vcat docs),
              monomorphism_fix dflags]

monomorphism_fix :: DynFlags -> SDoc
monomorphism_fix dflags
  = ptext (sLit "Probable fix:") <+> vcat
        [ptext (sLit "give these definition(s) an explicit type signature"),
         if dopt Opt_MonomorphismRestriction dflags
         then ptext (sLit "or use -XNoMonomorphismRestriction")
         else empty]    -- Only suggest adding "-XNoMonomorphismRestriction"
                        -- if it is not already set!
warnDefault :: [(Inst, Class, Var)] -> Type -> TcM ()
warnDefault ups default_ty = do
    warn_flag <- doptM Opt_WarnTypeDefaults
    setInstCtxt (instLoc (head dicts)) (warnTc warn_flag warn_msg)
  where
    dicts = [d | (d,_,_) <- ups]

    -- Tidy them first
    (_, tidy_dicts) = tidyInsts dicts
    warn_msg = vcat [ptext (sLit "Defaulting the following constraint(s) to type") <+>
                     quotes (ppr default_ty),
                     pprDictsInFull tidy_dicts]

reduceDepthErr :: Int -> [Inst] -> SDoc
reduceDepthErr n stack
  = vcat [ptext (sLit "Context reduction stack overflow; size =") <+> int n,
          ptext (sLit "Use -fcontext-stack=N to increase stack size to N"),
          nest 4 (pprStack stack)]

pprStack :: [Inst] -> SDoc
pprStack stack = vcat (map pprInstInFull stack)
\end{code}