
I know why I might want to define a tree.

I have no idea why I want to define an algebraic datatype.

Note "you can define trees using algebraic datatypes" doesn't tell me why I want to use algebraic datatypes.



I want to use them so I don't have to write much code, so I can reason about my program by directly substituting symbols with their definitions, so I can pattern match on values of that type, and so I can group them into type classes and know what operations I'll have available to work with them... etc.
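
For instance, here's a minimal sketch of the tree case from upthread, written as an algebraic datatype (Tree, Leaf, Node, and size are just illustrative names):

    -- A binary tree as an algebraic datatype: a value is either a Leaf
    -- or a Node with a left subtree, an element, and a right subtree.
    data Tree a = Leaf | Node (Tree a) a (Tree a)
      deriving (Show, Eq)  -- type classes: printing and equality for free

    -- Pattern matching covers each constructor, and the compiler can
    -- warn if a case is missing.
    size :: Tree a -> Int
    size Leaf         = 0
    size (Node l _ r) = 1 + size l + size r

    main :: IO ()
    main = print (size (Node Leaf 'x' (Node Leaf 'y' Leaf)))  -- prints 2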


It's more or less the Haskell version of a struct or a public class with fields and no methods, except that you can specify more than one shape for the same type.

    data MyMaybe a = Just a | Nothing
...means, in Java terms, "MyMaybe<a> is an abstract generic superclass with a final singleton subclass Nothing<a> that has no fields, and a final subclass Just<a> that has one field of type a".

You then write functions that take that data and use it.
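
For example (a rough sketch; withDefault is a made-up name, and hiding the Prelude's Maybe just avoids a clash with the built-in Just and Nothing):

    import Prelude hiding (Maybe (..))  -- don't shadow the real Just/Nothing

    data MyMaybe a = Just a | Nothing

    -- Consume the data by matching on each constructor.
    withDefault :: a -> MyMaybe a -> a
    withDefault _ (Just x) = x
    withDefault d Nothing  = d

    main :: IO ()
    main = do
      print (withDefault 0 (Just 5))  -- 5
      print (withDefault 0 Nothing)   -- 0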


I thought trees were algebraic datatypes?



