
That is because those are the traditional methods and they are easy to teach. The curriculum probably hasn't been updated either. Also, automatic differentiation is _hard_.

Numerical differentiation comes directly from the definition of the derivative, and symbolic differentiation is what you always did for exercises in your calculus classes.

Automatic differentiation comes in two flavours, forward and backward modes. Forward mode is based on dual numbers [1], which are the

    quotient of the polynomial ring over the real numbers by the principal ideal generated by the square of the indeterminate
That is ℝ[X]/⟨X²⟩.

Another way of thinking of this is to have an element ε different from zero such that ε² = 0, and hand-wave the fact that the dual part of the number carries the derivative.
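To make the hand-waving concrete, here is a minimal sketch of forward mode with dual numbers (class and function names are illustrative, not from any particular library):

```python
class Dual:
    """Represents a + b*eps, where eps**2 == 0; b carries the derivative."""
    def __init__(self, a, b=0.0):
        self.a, self.b = a, b

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.a + other.a, self.b + other.b)
    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # (a1 + b1 eps)(a2 + b2 eps) = a1*a2 + (a1*b2 + b1*a2) eps,
        # because the eps**2 term vanishes -- that's the product rule.
        return Dual(self.a * other.a, self.a * other.b + self.b * other.a)
    __rmul__ = __mul__

def derivative(f, x):
    # Seed the dual part with 1 to get df/dx.
    return f(Dual(x, 1.0)).b

f = lambda x: 3 * x * x + 2 * x   # f'(x) = 6x + 2
print(derivative(f, 5.0))          # 32.0
```

The dual part propagates through every operation, so the derivative comes out exact (up to floating point), unlike a finite-difference approximation.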

Backward mode builds a graph of the computation, does symbolic differentiation over that graph, and compiles the derivative down to runnable machine code (it could be interpreted or compiled down to IR; neither the form nor the execution environment changes the fundamental algorithm).
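A minimal sketch of the graph-building idea (interpreted here rather than compiled, and with illustrative names): record each operation's inputs and local gradients, then walk the graph backwards accumulating adjoints.

```python
class Var:
    def __init__(self, value, parents=()):
        self.value = value
        self.parents = parents   # list of (input Var, local gradient)
        self.grad = 0.0

    def __add__(self, other):
        return Var(self.value + other.value, [(self, 1.0), (other, 1.0)])

    def __mul__(self, other):
        return Var(self.value * other.value,
                   [(self, other.value), (other, self.value)])

def backward(out):
    # Topologically order the graph (inputs before outputs), then sweep
    # in reverse, pushing each node's adjoint to its inputs.
    order, seen = [], set()
    def visit(v):
        if id(v) not in seen:
            seen.add(id(v))
            for p, _ in v.parents:
                visit(p)
            order.append(v)
    visit(out)
    out.grad = 1.0
    for v in reversed(order):
        for p, g in v.parents:
            p.grad += g * v.grad

x = Var(3.0); y = Var(4.0)
z = x * y + x * x      # dz/dx = y + 2x = 10, dz/dy = x = 3
backward(z)
print(x.grad, y.grad)  # 10.0 3.0
```

One reverse sweep yields the gradient with respect to every input at once, which is why this mode dominates in machine learning, where there are many inputs and one scalar output.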

Maybe they are not really hard, but they are not easy either. Still, I think they should at least be mentioned in modern courses.

[1] https://en.wikipedia.org/wiki/Dual_number



It's been a while since I've watched this talk[0], but IIRC the upshot is that if you think of the "differentiation at p" operator as sending f to the pair (f(p), D_p f) instead of the usual D_p f, then you get a nice compositional operator that defines a functor on vector spaces; automatic differentiation more or less pops out, and you get reverse mode by looking at what this does when applied to dual spaces.
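A tiny scalar-only sketch of that pairing (names are illustrative, and this only hints at the talk's full vector-space story): carry (f(p), D_p f) together, and composition via the chain rule falls out.

```python
import math

def D(f, df):
    """Return the paired map p -> (f(p), D_p f) for scalar f with derivative df."""
    return lambda p: (f(p), df(p))

def compose(Dg, Df):
    # The pairing composes: D_p (g . f) = D_{f(p)} g * D_p f (chain rule).
    def h(p):
        y, dy = Df(p)
        z, dz = Dg(y)
        return (z, dz * dy)
    return h

Dsin = D(math.sin, math.cos)
Dsq  = D(lambda x: x * x, lambda x: 2 * x)

val, grad = compose(Dsin, Dsq)(2.0)   # sin(x^2) and its derivative at x = 2
# grad equals cos(4) * 4, the chain rule applied by the composition itself
```

The point is that the pair is closed under composition, so you never need to re-derive the chain rule per function; that is the compositional structure AD exploits.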

So in some sense it's maybe not hard, though any attempt to do it without building up some theory (e.g. abstract vector spaces and dual spaces) first will probably come across as magic. On the other hand, magic tricks would be right at home in an intro differential equations class so maybe this would be a perfect addition. Or it can replace Wronskians or something.

[0] https://youtube.com/watch?v=17gfCTnw6uE


> Still I think they [automatic differentiation methods] should be at least mentioned in modern courses.

That might make more sense in another course, like one on numerical methods, rather than in a programming course that happens to use numerical differentiation as a demonstration of programming techniques and language features.


> That is ℝ[X]/⟨X²⟩.

> Another way of thinking of this is to have an element ε different from zero such that ε² = 0, and hand-wave the fact that the dual part of the number carries the derivative.

Aren't these the same way of thinking?


Different ways of thinking about the same thing.

A more common example of this idea is √(-1): ℝ[x]/⟨x²+1⟩ vs i²+1=0.




