id,summary,reporter,owner,description,type,status,priority,milestone,component,version,resolution,keywords,cc,os,architecture,failure,testcase,blockedby,blocking,related,differential,wikipage
3650,Add a Natural number type to the pre-defined basic types.,JohnD,,"See
[http://hackage.haskell.org/trac/haskell-prime/wiki/Natural]
concerning ""Add a Natural number type to the pre-defined basic types."" I will add a link from that page to this one. That was the Haskell prime wiki which is my understanding concerns discussions concerning the Haskell language in general. It seems conceivable that GHC may be uniquely positioned to address this problem. Haskell was designed to explorer a problem that is of academic interest. Why else is the language lazy and statically typed? It's all about purity. To provide a synopsis it dates back to the discovery that untyped lambda calculus is a model of computation and not that of logic.
At first blush one might conclude that there is good reason why there is no natural number type. Natural numbers are problematic: you can add them, but you cannot in general subtract them, because in the general case subtraction yields an integer, not a natural number. This problem could be solved through duck typing, but that is not what Haskell or ML is about. With duck typing you can compare two values at runtime to ensure that the result is a natural number, thus upholding the type system. Though duck typing is useful, it is naturally off the table.
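(For reference, GHC's base library now ships Numeric.Natural, which may postdate this ticket. Its Natural type makes exactly the trade-off described above: addition is total, while subtraction that would go below zero throws an arithmetic underflow exception at runtime rather than producing a negative result. A minimal sketch:)

```haskell
import Control.Exception (ArithException (Underflow), evaluate, try)
import Numeric.Natural (Natural)

main :: IO ()
main = do
  -- Addition is closed over the naturals.
  print ((2 + 3) :: Natural)
  -- Subtraction is only partial: going below zero is a runtime exception,
  -- which evaluate forces so that try can catch it.
  r <- try (evaluate ((2 - 3) :: Natural))
  print (r == Left Underflow)
```

(Running this prints 5 and then True, confirming that underflow is detected only at runtime, not by the type checker; that is precisely the gap a dependent natural type would close.)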
A natural number type could still be included in the language, but without the benefit of duck typing such an addition might be looked upon as ad hoc.
Proof assistants offer a nat type, short for natural number. Some proof assistants are in fact built in Haskell, though ML has, to my knowledge, been around longer than Haskell, and consequently most proof assistants are built on ML rather than Haskell. I recall that GHC has moved from System F toward a subset of dependent types and can therefore handle some problems involving dependent types; nat seems to me potentially to be such a problem. Dependent types and proof assistants go together like bread and butter: you prove by semi-automated methods at compile time that one number is necessarily not less than another, and thereby prove that the difference computed at runtime cannot be negative.
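(The compile-time story alluded to here can already be sketched in GHC with DataKinds and GADTs. The names Nat, LTE, and minus below are illustrative, not a standard API: the LTE value is a proof, checked by the type checker, that the subtrahend does not exceed the minuend, so subtraction becomes total.)

```haskell
{-# LANGUAGE DataKinds #-}
{-# LANGUAGE GADTs #-}
{-# LANGUAGE KindSignatures #-}
{-# LANGUAGE TypeFamilies #-}

-- Peano naturals, promoted to the type level by DataKinds.
data Nat = Z | S Nat

-- Singleton connecting a runtime value to its type-level Nat.
data SNat (n :: Nat) where
  SZ :: SNat 'Z
  SS :: SNat n -> SNat ('S n)

-- Evidence, established at compile time, that m is not greater than n.
data LTE (m :: Nat) (n :: Nat) where
  LTEZero :: LTE 'Z n
  LTESucc :: LTE m n -> LTE ('S m) ('S n)

-- Type-level subtraction, defined only where it cannot underflow.
type family Sub (n :: Nat) (m :: Nat) :: Nat where
  Sub n      'Z     = n
  Sub ('S n) ('S m) = Sub n m

-- Total subtraction: the LTE proof rules out underflow,
-- so the result is again a natural number.
minus :: SNat n -> SNat m -> LTE m n -> SNat (Sub n m)
minus n      SZ     LTEZero     = n
minus (SS n) (SS m) (LTESucc p) = minus n m p

-- Demote a singleton back to an ordinary Int for printing.
toInt :: SNat n -> Int
toInt SZ     = 0
toInt (SS n) = 1 + toInt n

main :: IO ()
main = print (toInt (minus (SS (SS SZ)) (SS SZ) (LTESucc LTEZero)))
```

(This prints 1, i.e. 2 - 1; attempting 1 - 2 simply does not type-check, because no LTE proof exists for it. This is only a sketch of the technique, not a claim about how GHC itself would implement a built-in natural type.)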
A natural number is an integer that depends on a number, namely a lower bound of zero. As such it seems likely that natural numbers form a dependent type. I am not sufficiently familiar with GHC at present to assess whether the natural number dependent type can be resolved automatically by GHC, but it is something I have wondered about and may be worth considering.
If a natural number type proves feasible, that suggests interval types may also be possible. The Ada language has interval types, in that array bounds are made explicit. The point is that there are special cases where such types can be resolved at compile time.",proposal,closed,normal,,Compiler (Type checker),,wontfix,,,Unknown/Multiple,Unknown/Multiple,None/Unknown,,,,#9818,,