Reinventing XML traversal/transformation in JavaScript leads very quickly to abstracting away the recursive tree traversal, then mapping node matchers to functions. Which is 99% of what XSLT provides off the shelf.
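A minimal sketch of the pattern that comment describes: recursive traversal with node matchers dispatched to handler functions, essentially a hand-rolled subset of XSLT's `apply-templates`. The node shape (`{name, children}`) is a hypothetical stand-in for a parsed XML DOM, not any particular library's API.

```javascript
// Matchers keyed by element name, playing the role of XSLT templates.
// Each handler receives the node plus an "apply" callback for recursion.
const templates = {
  article: (node, apply) => `<section>${apply(node.children)}</section>`,
  title:   (node, apply) => `<h1>${apply(node.children)}</h1>`,
  para:    (node, apply) => `<p>${apply(node.children)}</p>`,
};

// The "apply-templates" loop: dispatch each node to its matching
// template, pass text through, and recurse by default for anything else.
function applyTemplates(nodes) {
  return nodes.map(node => {
    if (typeof node === 'string') return node;           // text node
    const template = templates[node.name];
    if (template) return template(node, applyTemplates); // matched
    return applyTemplates(node.children);                // default: recurse
  }).join('');
}

const doc = {
  name: 'article',
  children: [
    { name: 'title', children: ['Hello'] },
    { name: 'para', children: ['World'] },
  ],
};

console.log(applyTemplates([doc]));
// <section><h1>Hello</h1><p>World</p></section>
```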
Many developers just want some space to do extra logic, and XSLT doesn't really accommodate that. You have to deal with the input and create the output without pausing for breath. This is fine in simple use cases where the mapping is 1:1, but when the mapping gets more complicated it becomes difficult to maintain.
That’s simply not true. Certainly it can be hard to wrap your head around architecting complex logic in XSLT 1.0, but that's not unlike how you have to change your thinking for functional programming generally. More ergonomic imperative logic and several other features in XSLT 2.0 made it a much more developer-friendly language.
Most orgs never transitioned to 2.0, which is a big part of why XSLT sucks.
> but it's not unlike how you have to change your thinking for functional programming generally
I'd write Lisp any day of the week over XSLT. I don't think XSLT deserves a place alongside other functional languages because it's bound so tightly to XML. Sure, one can argue it has its place when processing XML, but I'd pick a general-purpose language and process XML with that over XSLT every time.
You keep making assertions and generalizations without backing them up with anything concrete while simultaneously moving goalposts. In the cases where you do, you’re just wrong: XSLT is not held back by licensing, it doesn’t “require one liners”, and it has supported JSON since 3.0, which has been around for almost a decade.
XSLT was held back by licensing; it's why everything supports 1.0 and barely anything supports 2.0. If 2.0 had been released for free, everything would use 2.0.
Well then, tell me why .NET's XSLT platform is rooted in 1.0, instead of just saying "no". From my understanding, licensing was a factor in why .NET natively remains on 1.0. The transition to 2.0 appeared to face specific friction: why?
For reference: I'm not talking about today, I'm talking about the point in time when (for example) .NET could have adopted XSLT 2.0 but chose not to.
.NET didn't transition to XSLT 2.0 for the simple reason that demand wasn't really there. The way they saw it, LINQ was a better all-around option, with XLINQ specifically being an adequate replacement for XPath, and VB even got a bunch of special syntax to work with XML that made it more concise than XSLT in many cases: https://learn.microsoft.com/en-us/dotnet/visual-basic/progra...
That said, it is emphatically not true that XSLT 2+ was somehow license-restricted. I worked at a company that used XSLT 2.0 and XQuery for code generation back in 2006, and we used the free edition of Saxon with no issues; the paid versions had some optimizations and features like static type checking, which we simply didn't need.
For the most part, XSLT 1.0 was "good enough" for most cases when XML was still the primary interchange format, and later JSON took over, so there was less incentive for framework providers to spend resources on keeping up to date with the standard. Especially since XSLT 2.0 was such a massive leap in many ways, including a different data model that made many existing XSLT 1.0 processors incompatible in ways that required a major redesign.