
That's ... sad. Sum types and product types are simultaneously easier to understand and more useful to know about than inheritance.

Sum types are simply types whose values can be one of several choices - surely that's something a child can reasonably understand!

Product types might be a little harder to grok - they're essentially structs - but surely no harder to understand than inheritance and the whole IS_A/HAS_A mess.
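
To make that concrete, here's a rough Go-flavoured sketch (all names invented; Go has no built-in sum types, so the "one of several choices" part is only approximated with a marker interface):

    package main

    // Product type: a Point carries an X AND a Y (essentially a struct).
    type Point struct {
        X, Y int
    }

    // Sum type, approximated: a Shape is EITHER a Circle OR a Rect.
    // Go has no native sum types, so a marker interface with one
    // implementation per choice is a common stand-in.
    type Shape interface{ isShape() }

    type Circle struct{ Radius float64 }
    type Rect struct{ W, H float64 }

    func (Circle) isShape() {}
    func (Rect) isShape()   {}

    func main() {
        var s Shape = Circle{Radius: 1} // holds exactly one of the choices at a time
        _ = s
        _ = Point{X: 1, Y: 2}
    }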



No, sum types and product types relate to values; inheritance and composition relate to objects, which include values AND behavior. Inheritance is therefore abused to share behavior. This distinction is important, and it relates to some common OOP patterns in popular languages.

In that model, favouring composition of behaviour over inheritance is definitely important to understand: overriding behaviour gets tricky quickly, and when functionality is spread across many classes in the inheritance chain it becomes difficult to follow and see the full picture.
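
As a minimal sketch of what sharing behaviour by composition looks like (names invented for illustration), the behaviour lives in a member that is delegated to, so there is no override chain to trace:

    package main

    import "fmt"

    // Logger provides one piece of reusable behaviour.
    type Logger struct{ prefix string }

    func (l Logger) Log(msg string) { fmt.Println(l.prefix + msg) }

    // Server reuses that behaviour by holding a Logger (HAS_A) and
    // delegating to it, rather than inheriting from it (IS_A).
    type Server struct {
        log  Logger
        addr string
    }

    func (s Server) Start() {
        s.log.Log("starting on " + s.addr)
    }

    func main() {
        s := Server{log: Logger{prefix: "[srv] "}, addr: ":8080"}
        s.Start()
    }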


> No, sum types and product types relate to values; inheritance and composition relate to objects, which include values AND behavior. Inheritance is therefore abused to share behavior

I remember back in high school or early college I was having a conversation with a programmer about when the right time to use inheritance is, and he stated that "inheritance should be used for polymorphism, not just to share code", which to this day I think is a fairly good heuristic for determining whether or not inheritance is the right tool for a given problem.


Inheritance is not a way to express sum types, it's a form of subtyping. A sum type is like a discriminated union, it can only be one thing at a time. Subtyping allows a value to have multiple (related) types simultaneously, which is much more expressive. I suppose you can use one level of single inheritance to emulate a sum type, but you could just as well emulate it with a discriminated union in Go, e.g. a struct with a type identifier and an interface{} holding the value.
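
A rough sketch of that emulation, with invented names; the struct's tag field records which variant the interface{} currently holds, and every use site switches on it:

    package main

    import "fmt"

    // Kind is the type identifier that discriminates the variants.
    type Kind int

    const (
        KindInt Kind = iota
        KindString
    )

    // Sum emulates a two-variant sum type: Value holds either an int or
    // a string, never both, and Kind records which one it is.
    type Sum struct {
        Kind  Kind
        Value interface{}
    }

    func describe(s Sum) string {
        switch s.Kind {
        case KindInt:
            return fmt.Sprintf("int %d", s.Value.(int))
        case KindString:
            return fmt.Sprintf("string %q", s.Value.(string))
        default:
            return "corrupt variant"
        }
    }

    func main() {
        fmt.Println(describe(Sum{Kind: KindInt, Value: 42}))
        fmt.Println(describe(Sum{Kind: KindString, Value: "hello"}))
    }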


> Inheritance is not a way to express sum types, it's a form of subtyping.

A distinction without a difference. This is probably most visible in languages like Scala and Kotlin, which implement algebraic data types by way of inheritance.

That inheritance creates a subtyping relationship is irrelevant; there's a similar subtyping relationship between variants (or groups of variants) and the overarching type using a traditional sum type notation as in Haskell or ML. This is most clearly visible in OCaml's polymorphic variants [1, 2].

[1] http://caml.inria.fr/pub/docs/manual-ocaml-400/manual006.htm...

[2] https://stackoverflow.com/questions/16773384/why-does-ocaml-...


Pony (https://ponylang.org) uses sum types, perhaps excessively. Just yesterday, I wrote:

    (None | (In, USize))

I.e., None (the null-valued type) or a pair made of a type variable (In) and a USize.

The thing is, the values that satisfy this type are not subtypes of None and a pair. (That would be silly, given None.) Such a value is either None, or a pair.


> The thing is, the values that satisfy this type are not subtypes of None and a pair. (That would be silly, given None.) Such a value is either None, or a pair.

Unless I'm misreading you, this seems to be a misunderstanding of what sum types are. Simply put, a sum type `T = A | B` represents the (disjoint) union of all possible values of `A` and of all possible values of `B`, not the intersection (as you seem to indicate by the phrasing of "not subtypes of None and a pair"; correct me if you meant something else).

Recall what subtyping means (I'm going with Wikipedia's definition here for the sake of accessibility):

> [S]ubtyping (also subtype polymorphism or inclusion polymorphism) is a form of type polymorphism in which a subtype is a datatype that is related to another datatype (the supertype) by some notion of substitutability, meaning that program elements, typically subroutines or functions, written to operate on elements of the supertype can also operate on elements of the subtype.

This holds in the case of sum types. Operations that work on the sum type will generally also work on the variants that constitute the sum type.

The same goes for inheritance. If an abstract class `T` has two concrete subclasses `A` and `B`, then a value of type `T` belongs to the union of values of type `A` and of type `B`.
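
In Go-ish terms (an interface standing in for the abstract class, names taken loosely from the comment), that substitutability claim can be sketched as:

    package main

    import "fmt"

    // T = A | B, modelled as an interface supertype with two variant types.
    type T interface{ isT() }

    type A struct{ N int }
    type B struct{ S string }

    func (A) isT() {}
    func (B) isT() {}

    // show is written against the supertype T...
    func show(t T) string {
        switch v := t.(type) {
        case A:
            return fmt.Sprintf("A(%d)", v.N)
        case B:
            return fmt.Sprintf("B(%q)", v.S)
        default:
            return "?"
        }
    }

    func main() {
        // ...and, by substitutability, accepts either variant directly.
        fmt.Println(show(A{N: 1}))
        fmt.Println(show(B{S: "two"}))
    }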


Not true. You can model sums with inheritance along with an exclusivity constraint, but it's a weird model and subtyping is more general. Further, the idea of each variant being its own type is inherently a subtyping sort of idea. Sums don't give names to their components, only distinctions.


> You can model sums with inheritance along with an exclusivity constraint, but it's a weird model and subtyping is more general.

You don't need an exclusivity constraint. Exclusivity is purely a modularity concern; you will still have a finite number of variants in any given software system; traditional ADTs and exclusivity just limit the declaration of variants to a single module. See also "open sum types" vs. "closed sum types", because it can be beneficial to have extensible sum types [1]. Not all sum types are closed; see polymorphic variants and extensible variants in OCaml.

Also, do not confuse the language mechanism used to specify a type with the type itself.

I do agree that inheritance is a generalization of algebraic data types.

> Further, the idea of each variant being its own type is inherently a subtyping sort of idea. Sums don't give names to their components, only distinctions.

Try polymorphic variants in OCaml (mentioned above); or GADTs:

  # type _ t = Int: int -> int t | String: string -> string t;;
  type _ t = Int : int -> int t | String : string -> string t
  # Int 0;;
  - : int t = Int 0
  # String "";;
  - : string t = String ""

There's nothing inherent about summands not having a distinct declared type in ML and Haskell, only convention. Obviously, they do have distinct actual types.

Edit: A practical use case is the representation of nodes for an abstract syntax tree. An `Ast` type can benefit from having abstract `Expr`, `Statement`, `Declaration`, `Type`, etc. subtypes that group the respective variants together in order to get proper exhaustiveness checks, for example.

[1] See the question of how to type exceptions in Standard ML; in OCaml, this led to the generalization of exception types to extensible variants.


I'm aware of polymorphic variants and row types and the like. My concern is one of modularity, in that I consider a running system pretty dead; extensions during coding are where language features and their logics are interesting. Closing your sums is valuable to consumers: they have complete induction principles, for example.

Open sums and row types are a little different in that they represent a fixed/closed type but retain enough information to extend it more conveniently and to see it as structurally related to other (open) sums/products. This is no doubt super useful, but I see it more as an application sitting atop polymorphism rather than a fundamental concept.

Finally, I am deliberately confusing the language mechanism with the type it intends to model, because this is exactly the point where we have to think about things as both a mechanism and a model. This is where breakdowns occur.

Anyway, I doubt there's a real difference of opinion here. I'm very familiar with the concepts you're discussing, but would perhaps argue that they are not as fundamental as regular, closed sums/products, and that language support for those simplest building blocks is important.


I'm finding that wrapping an interface in a struct can be a good technique. However, the interface{} already contains a type identifier, so adding another one seems like wasted space. Usually you can compute it with a type switch.
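
A minimal sketch of that variation (names invented): the wrapper stores only the interface{}, and the variant is recovered with a type switch instead of a stored tag.

    package main

    import "fmt"

    // The concrete type itself acts as the discriminator.
    type Circle struct{ Radius float64 }
    type Rect struct{ W, H float64 }

    // Shape wraps the interface{} directly; no separate tag field is stored.
    type Shape struct {
        value interface{} // expected to hold a Circle or a Rect
    }

    // describe recovers the variant with a type switch rather than
    // consulting a stored type identifier.
    func describe(s Shape) string {
        switch v := s.value.(type) {
        case Circle:
            return fmt.Sprintf("circle, radius %g", v.Radius)
        case Rect:
            return fmt.Sprintf("%g x %g rectangle", v.W, v.H)
        default:
            return "unknown variant"
        }
    }

    func main() {
        fmt.Println(describe(Shape{value: Circle{Radius: 2}}))
        fmt.Println(describe(Shape{value: Rect{W: 3, H: 4}}))
    }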


Yeah, that's a perfectly valid solution as well. Depending on the application, it may be faster to do the assertion on your type identifier rather than introspecting the type. And having an explicit list of type options maps more directly onto a true sum type. I probably wouldn't actually do it in real Go code if I could avoid it, though.



