Why are polymorphic values not inferred in Haskell?

Numeric literals have a polymorphic type:

*Main> :t 3
3 :: (Num t) => t

But if I bind a variable to such a literal, the polymorphism is lost:

x = 3
...
*Main> :t x
x :: Integer

If I define a function, on the other hand, it is of course polymorphic:

f x = 3
...
*Main> :t f
f :: (Num t1) => t -> t1

I could provide a type signature to ensure that x remains polymorphic:

x :: Num a => a
x = 3
...
*Main> :t x
x :: (Num a) => a

But why is this necessary? Why isn't the polymorphic type inferred?


It's the monomorphism restriction, which says that any binding defined without parameters and without an explicit type annotation must have a monomorphic type. This restriction can be disabled in GHC and GHCi with -XNoMonomorphismRestriction.
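
For example, in a GHCi session (a sketch; the exact output and type variable names vary between GHC versions):

*Main> :set -XNoMonomorphismRestriction
*Main> let x = 3
*Main> :t x
x :: (Num t) => t

In a source file, the equivalent is a {-# LANGUAGE NoMonomorphismRestriction #-} pragma at the top of the module.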

The reason for the restriction is that, without it, longCalculation 42 below would be evaluated twice, while most people would probably expect/want it to be evaluated only once:

longCalculation :: Num a => a -> a
longCalculation = ...

x = longCalculation 42

main = print $ x + x
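
To make the sharing problem concrete, here is a self-contained sketch (Debug.Trace and the squaring body are stand-ins of my own for the elided calculation; compile without optimizations, since GHC's optimizer can sometimes restore sharing by itself):

{-# LANGUAGE NoMonomorphismRestriction #-}

import Debug.Trace (trace)

longCalculation :: Num a => a -> a
longCalculation n = trace "longCalculation runs" (n * n)

-- With the restriction disabled, x :: Num a => a, so each use of x
-- re-runs the calculation and the trace message prints twice.
-- Delete the pragma and x becomes a shared Integer: it prints once.
x = longCalculation 42

main :: IO ()
main = print (x + x)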


To expand on sepp2k's answer a bit: if you try to compile the following (or load it into GHCi), you get an error:

import Data.List (sort)
f = head . sort

This is a violation of the monomorphism restriction because we have a class constraint (introduced by sort) but no explicit arguments: we're (somewhat mysteriously) told that we have an Ambiguous type variable in the constraint Ord a.
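
Two standard fixes, sketched below: give f an explicit polymorphic signature, or give it an explicit argument, which turns it into a function binding that the monomorphism restriction doesn't apply to:

import Data.List (sort)

-- Fix 1: an explicit signature keeps the Ord constraint.
f :: Ord a => [a] -> a
f = head . sort

-- Fix 2: an explicit argument makes this a function binding,
-- which is exempt from the monomorphism restriction.
g xs = head (sort xs)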

Your example (x = 3) has a similarly ambiguous type variable, but it doesn't give the same error, because it's saved by Haskell's "defaulting" rules:

Any monomorphic type variables that remain when type inference for an entire module is complete, are considered ambiguous, and are resolved to particular types using the defaulting rules (Section 4.3.4).

See this answer for more information about the defaulting rules; the important point is that they only work for certain numeric classes, so x = 3 is fine while f = head . sort isn't.
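
You can check that defaulting is what rescues x = 3 by switching it off: an empty default declaration disables defaulting entirely, and then, with the monomorphism restriction in effect, GHC rejects the numeric binding with an ambiguity error of its own. A sketch:

default ()  -- empty declaration: no defaulting at all

x = 3       -- now rejected: the Num constraint can no longer be resolved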

As a side note: if you'd prefer that x = 3 end up being an Int instead of an Integer, and y = 3.0 a Rational instead of a Double, you can use a "default declaration" to override the usual defaulting rules:

default (Int, Rational)
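
Sketched out in a module (the comments show what GHC then infers under the monomorphism restriction):

module Defaults where

default (Int, Rational)

x = 3    -- Int is tried first and has a Num instance, so x :: Int
y = 3.0  -- Int isn't Fractional, so the next candidate is taken: y :: Rational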
