It's hard to talk in the abstract because obviously people can abuse any type of code feature, but I generally find chaining array methods, and equivalents like C#'s LINQ, much easier to read and understand than their looping equivalents.
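For instance, a rough TypeScript sketch of what I mean (the Item shape and the "total of in-stock prices" task are just made up for illustration):

    interface Item { name: string; price: number; inStock: boolean; }

    // Chained version: each step states one intention.
    const totalChained = (items: Item[]): number =>
      items
        .filter(item => item.inStock)             // keep sellable items
        .map(item => item.price)                  // project to the price
        .reduce((sum, price) => sum + price, 0);  // add them up

    // Looping equivalent: the same logic, interleaved in one body.
    const totalLooped = (items: Item[]): number => {
      let sum = 0;
      for (const item of items) {
        if (item.inStock) sum += item.price;
      }
      return sum;
    };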

The fact that you single out .reduce() here is really telling to me. .reduce() definitely has a learning curve to it, but once you're used to it the resulting code is generally much simpler and the immutability of it is much less error-prone. I personally expect JS devs to be on the far side of that learning curve, but there's always a debate about what it's reasonable to expect.
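To make that concrete, a tiny made-up example of the kind of reduce I mean, where a fresh accumulator is returned on every step instead of being mutated:

    // Word counts, built without touching the previous accumulator.
    const counts = ['a', 'b', 'a'].reduce<Record<string, number>>(
      (acc, word) => ({ ...acc, [word]: (acc[word] ?? 0) + 1 }),
      {}
    );
    // counts -> { a: 2, b: 1 }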



The wonderful thing about .reduce() is that it can compute literally anything. The problem with .reduce() is that it can compute literally anything. As for the rest of the morphism menagerie, I like being able to break up functions and pass intermediate results around. It's literally cut and paste with map/filter; with a loop it's rewriting. Yay composability.
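To illustrate the cut-and-paste point (the User shape and the "active admins" task are invented):

    interface User { name: string; active: boolean; admin: boolean; }

    // Each stage is its own function, so intermediate results can be
    // named, reused, or moved around without rewriting a loop body.
    const activeUsers = (users: User[]): User[] => users.filter(u => u.active);
    const admins      = (users: User[]): User[] => users.filter(u => u.admin);
    const names       = (users: User[]): string[] => users.map(u => u.name);

    const activeAdminNames = (users: User[]): string[] =>
      names(admins(activeUsers(users)));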

That said, it's easy to get carried away, and some devs certainly do. I used to be one of those devs, but these days I sometimes just suck it up and use a local variable or two in a loop when the intent is perfectly clear and it's not leaking side effects outside of a narrow scope. But I'll be damned if I let anyone tell me to make imperative loops my only style or even my primary one.


Reduce cannot calculate literally anything, in the sense you mean. It corresponds in computational power to primitive recursion. And quite famously, there are problems primitive recursion cannot solve that general recursion can.

On the other hand, I don't think I've ever seen something as recursive as Ackermann's function in real life. So it can probably solve any problem you actually mean to solve.
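For anyone who hasn't seen it, this is the standard definition of Ackermann's function; I'm only sketching it to make the reference concrete:

    // Total and computable, but famously not primitive recursive.
    const ackermann = (m: number, n: number): number =>
      m === 0 ? n + 1
      : n === 0 ? ackermann(m - 1, 1)
      : ackermann(m - 1, ackermann(m, n - 1));

    // It grows explosively: ackermann(3, 3) === 61, while ackermann(4, 2)
    // already has 19,729 decimal digits (far beyond what this sketch
    // could actually evaluate).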


What the previous user means is that reduce is not a function that returns a list (although it can).

It just accumulates over some value, and that value can be anything.
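For example (the numbers and the "stats" shape are arbitrary):

    // The accumulated value here is a summary object, not a list,
    // and a fresh one is returned on each step.
    const stats = [3, 1, 4, 1, 5].reduce(
      (acc, n) => ({
        min: Math.min(acc.min, n),
        max: Math.max(acc.max, n),
        sum: acc.sum + n,
      }),
      { min: Infinity, max: -Infinity, sum: 0 }
    );
    // stats -> { min: 1, max: 5, sum: 14 }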


Naw, GP is right, I'd forgotten about the limits of primitive recursion. But for almost any given real-world problem, it's something you can get away with forgetting.


Unfortunately, since we don't have continuations, we cannot make reduce _stop_ computing. In such cases where that is needed, it might be better to use a loop that can be broken out of.
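Something like this, say (the "first negative number" predicate is just for illustration):

    // A loop can return as soon as it knows the answer; reduce would
    // still visit every remaining element.
    const firstNegative = (xs: number[]): number | undefined => {
      for (const x of xs) {
        if (x < 0) return x;  // early exit
      }
      return undefined;
    };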


Well, you can always throw an exception :) (ducks)

But yes, it's best used on sequences where you know you'll consume the whole thing, or at least when it's cheap enough to run through the rest with the accumulator unchanged.
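i.e. something like this sketch, where once the answer is found the remaining steps just hand the accumulator along unchanged:

    const firstNegative = (xs: number[]): number | undefined =>
      xs.reduce<number | undefined>(
        (found, x) => found !== undefined ? found : (x < 0 ? x : undefined),
        undefined
      );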


> The fact that you single out .reduce() here is really telling to me. .reduce() definitely has a learning curve to it, but once you're used to it the resulting code is generally much simpler and the immutability of it is much less error-prone. I personally expect JS devs to be on the far side of that learning curve, but there's always a debate about what it's reasonable to expect.

Not only that, but the words that GP uses to single out .reduce() start with:

> I see so much convoluted code with arr.reduce() or many chained arr.map().filter().filter().map()

Which I do not doubt, but the point is diminished when one understands that a mapping of a filtering of a filtering of a mapping is itself a convoluted reduction. Just say that you prefer to read for-statements.
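To spell that out with a throwaway example (doubling the odd numbers, chosen arbitrarily), the chain and the reduce compute the same thing:

    const viaChain = [1, 2, 3, 4]
      .filter(n => n % 2 === 1)
      .map(n => n * 2);                        // [2, 6]

    // The same pipeline folded into a single (more convoluted) reduce.
    const viaReduce = [1, 2, 3, 4].reduce<number[]>(
      (acc, n) => n % 2 === 1 ? [...acc, n * 2] : acc,
      []
    );                                         // [2, 6]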


I say convoluted. I prefer using the functional-style array methods, but there's a time and place for everything, and I feel a lot of JavaScript developers extend those methods beyond what is reasonable and into a convoluted mess, especially with reduce.

Give me a good classic `T[] => I` reduce function and I'm fine with it. Not the more common case of folks mutating the accumulator object.
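Roughly this contrast, to be explicit (the names and record shapes are invented):

    // The classic kind: array in, single value out, nothing mutated.
    const totalBytes = (sizes: number[]): number =>
      sizes.reduce((sum, s) => sum + s, 0);

    // The common kind I'd rather avoid: the accumulator object is
    // mutated in place on every iteration.
    const indexById = (rows: { id: string }[]): Record<string, { id: string }> =>
      rows.reduce<Record<string, { id: string }>>((acc, row) => {
        acc[row.id] = row;
        return acc;
      }, {});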



