Tuple bang patterns

I understand that in:

f x = x + 1 where !y = undefined

the meaning of the bang pattern is that y is evaluated before the body of f, so evaluating f x forces undefined.

Similarly:

f x = x + 1 where !(!a, !b) = (undefined, undefined)

the meaning is the same, w.r.t. a and b: both are evaluated before the body of f.
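
A quick GHCi check of these first two cases (assuming the BangPatterns extension is enabled; g here is just the tuple variant of f):

Prelude> :set -XBangPatterns
Prelude> let f x = x + 1 where !y = undefined
Prelude> f 1
*** Exception: Prelude.undefined
Prelude> let g x = x + 1 where !(!a, !b) = (undefined, undefined)
Prelude> g 1
*** Exception: Prelude.undefined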

But what do the bang patterns mean in:

f x = x + 1 where (!a, !b) = (undefined, undefined)

It doesn't seem to cause undefined to be evaluated. When do in-tuple bang patterns come into effect? Only when the pattern's tuple is forced? Can anyone give an example where (!a, !b) = (..) differs from (a, b) = (..)?


A bang pattern on the tuple itself will force evaluation of the tuple (to weak head normal form) but not of its elements. Bang patterns on the tuple elements will force them whenever the tuple pattern is actually matched, which for a lazy pattern binding like this only happens once one of the bound variables is demanded.

Here's an example of the differing behavior:

Prelude> :set -XBangPatterns
Prelude> let x = a + 1 where (a, b) = (1, undefined)
Prelude> x
2
Prelude> let x = a + 1 where (!a, !b) = (1, undefined)
Prelude> x
*** Exception: Prelude.undefined
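
To illustrate the first point as well, here is a minimal check along the same lines (still with BangPatterns enabled): a bang on the tuple pattern alone forces only the pair constructor, not its contents:

Prelude> let x = a + 1 where !(a, b) = (1, undefined)
Prelude> x
2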


If you translate it to let:

f x = let (!a, !b) = (undefined, undefined) in x + 1

Here, matching the pattern effectively creates a tuple containing (a, b), and when that tuple is evaluated, a and b are evaluated as well.

But because the tuple is never evaluated, neither a nor b is. This is basically the same as writing:

f x = let y = undefined `seq` 4 in x + 1

Since y is never evaluated, neither is undefined.
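
To make that concrete, here is a rough sketch of the idea (an illustration only, not GHC's literal desugaring): the lazy pattern binding turns into one shared thunk for the whole match, and the bangs only fire when that thunk is forced:

{-# LANGUAGE BangPatterns #-}

f :: Int -> Int
f x = let t = case (undefined, undefined) of (!a, !b) -> (a, b)
          a = fst t   -- demanding a (or b) would force t, and with it both bangs
          b = snd t
      in x + 1        -- t is never demanded here, so nothing is evaluated

If the body used a, say a + x + 1, then demanding a would force t, the case match would run, and the bangs would hit undefined, which is exactly the (!a, !b) behaviour shown above.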
