Hacker News | empw's comments

To me Perl was just Weird, to no particular end. Not the kind of mind-expanding Haskell/Prolog/Lisp weirdness that opens up new possibilities. It just does roughly the same things as every other language, but everything is done slightly differently, having evolved out of the primordial soup of Bourne shell and AWK filtered through Larry Wall's brain.

Perl and Python were similarly powerful and useful languages, but I could learn Python and start producing useful code after reading an hour-long tutorial. Perl took an order of magnitude longer, and remained more awkward to use just because of the Weirdness. Momentum was building in the early 2000s toward competitors like Python and Ruby that were seen as less crufty and more modern.

Perl's developers seemed to agree, since they cooked up their own competitor to Perl, an entirely different language confusingly called Perl 6. The coexistence of Perl 5 and 6 would have made the Python 3 transition look like a cakewalk, save for Perl 6's almost complete failure to exist for over a decade after its inception. The effort produced lots of constantly churning specs and blog posts about register-based virtual machines with native support for continuations or whatever, but no implementation of a language that anyone felt comfortable using for real development. Meanwhile people kept using the ossifying Perl 5 for existing applications and gradually transitioned away as those were replaced.

Also PHP overtook it for the "just FTP a script to $5 shared hosting and make a webapp" use case.


I agree, and the blame for its weirdness can be laid directly at Larry Wall's feet, because Wall wanted a language that allowed for cleverness, surprise, and ingenuity. He was never happier than when someone came up with a completely new way to do something. For Wall, programming was less about coding an outcome than about speaking a particular language (and ideally, writing poetry in it). Perl was very successful in this way, and fit reasonably well with the high-knowledge users and environment of Unix in the 90s.

It's just that Wall's vision was incompatible with a general-purpose language used by people with a wide range of knowledge and skill, and as Unix/Linux opened up to that wider audience, better general-purpose alternatives were chosen. Having to learn to be a poet to be a good coder was too high a barrier.


> Wall wanted a language that allowed for cleverness, surprise, and ingenuity. [...] Having to learn to be a poet to be a good coder was too high a barrier.

To me this just sounds, umm, pathologically eclectic.


Now extrapolate to "let's do a Perl 6 that allows us to do all the things I couldn't work into Perl 5" and a lot more history makes sense.


But I bet you could really list some rubbish with it…


Larry Wall is a linguist.

Perl is the most human-like programming language, which is what makes it great, but also what makes it unreliable for large-scale programming.


> Perl is the most human-like programming language

This claim always seemed bizarre to me. What kind of drugs would a layman think you were on if you showed them a typical Perl program and asserted how "human-like" it was? Even if you tried to follow up and explain it: "No, no, see, this bit means open whatever file stream is specified on the command line, as long as it starts with a letter between 'A' and 'F' that can be either upper or lower case, and is followed, after any random characters, by at least 3 digits in a row. Then this next character means... See? It's just like I'm describing it to you!"


I remember the "Aha" moment on the face of a python-flavoured data scientist when I showed her my Perl scripts for managing LDAP groups.

One look at the class files (written in Moose) and she was stunned that she could immediately understand them, having never written Perl before.


Wasn't Intel trying to do something similar with Itanium, i.e. use software to translate code into VLIW instructions to exploit many parallel execution units? Only they wanted the C++ compiler to do it rather than a dynamic recompiler. At least some people at Intel thought that was a good idea.

I wonder if the x86 teams at Intel were similarly baffled by that.


Adjacent but not the same bet.

EPIC, aka Itanium, was conceived around the idea that trace-optimizing compilers could find enough instruction-level parallelism to pack operations into VLIW bundles, eliminating the increasingly complex and expensive machinery needed for out-of-order superscalar execution.

This wasn't a proven idea at the time, but it also wasn't considered trivially wrong.

What happened is that the combination of OoO speculation, branch predictors, and fat caches ended up working a lot better than anticipated. In particular, branch predictors went from fairly naive initial assumptions to shockingly good predictions on real-world code.

The result is that conventional designs increasingly trounced Itanium while the latter was still baking in the oven. By the time it was shipping, it was clear the concept was off target, but at that point Intel/HP et al. had committed so much that they tried to just bully the market into making it work. Later versions of Itanium added branch prediction and more cache capacity as a capitulation to reality, but that wasn't enough to save the platform.

Transmeta was making a slightly different bet, which is that x86 code could be dynamically translated to run efficiently on a VLIW CPU. The goal here was twofold:

First, to sidestep IP issues around shipping an x86-compatible chip. There's a reason AMD and Cyrix were the only companies to ship Intel alternatives in volume in that era. Transmeta didn't have the legal cover they did, so the dynamic translation approach avoided a lot of potential litigation.

Second, dynamic translation to VLIW could in theory be more power-efficient than a conventional architecture. VLIW at the hardware level is kinda like a CPU that just doesn't have a decoder. Everything being statically scheduled also reduces design pressure on register file ports, etc. This is why VLIW is quite successful in embedded DSP-style work. Because dynamic translation pays the cost of compiling a block once and then calls that block many times, you could in theory get a net efficiency gain despite the cost of the initial translation. Additionally, having access to dynamic profiling information could counterbalance the problems EPIC/Itanium ran into.

So this also wasn't a trivially bad idea at the time. Transmeta specifically targeted x86-compatible laptops, as that was a bit of a sore point in the Wintel world then, and the potential power efficiency benefits could motivate sales even if absolute performance was still inferior to Intel's.

From what I recall hearing from people who had them at the time, the Transmeta hardware wasn't bad, but it had the sort of random compatibility issues you'd expect and otherwise wasn't compelling enough to win in the market against Intel. Note this was also before ARM rose to dominate low-power mobile computing.

Transmeta ultimately failed, but some of their technical concepts live on in how language JITs and GPU shader IRs work today, and in how Apple used translation to migrate off PowerPC and then x86 in turn.

In the case of both Itanium and Transmeta, I'd say it's historically inaccurate to claim they were obviously or trivially wrong at the time people made these bets.


Itanium wasn't really focused on running x86 code. Intel wanted native Itanium software, and x86 execution was a bonus.


Maybe that's true for this project (implementing a pointer-chasing list interpreter directly in hardware), but it's much less clear to me why it would be true of the much more commonly remembered examples of "Lisp machines" like Symbolics, TI, etc.


By many of these arguments, assembly is also not a low-level language.


It is! I think that's not an accident.

CPUs evolved to execute C-like code quickly. They couldn't dramatically change the way C interfaces with the CPU, so they had to change the hidden internals instead.

For example, CPUs didn't have the option of hiding DRAM latency with a SIMT architecture, so they went for complex, opaque branch prediction and speculative execution instead.

The way C is built and deployed in practice didn't leave room for recompiling code for a specific CPU, so explicit scheduling like VLIW failed. Instead, there's implicit magic that works with existing binaries.

When there were enough transistors to have more ALUs, more registers, more of everything in parallel, C couldn't target that. So CPUs got increasingly complex OoO execution, hidden register banks, and magic handling of the stack as registers. Contrast this with current GPUs, which have register-like storage that is explicitly divided between threads (sort of like the 6502's zero page, something C couldn't target well either!).


You can write a function in Go or Rust, then write code in assembly to call it with any old nonsense. It is no different. The whole point of static typing is that it happens at compile time, not runtime.


I haven't written any Ruby in over a decade and a half, since back when Ruby 2.x was still on the horizon. But I do know the source of "eigenclass". It was first jokingly used in an odd instructional programming book / web comic / experimental art piece by "why the lucky stiff", who was briefly prominent in Ruby-land and then erased himself from the internet. It's funny that it has now become an established term of art for Ruby people.


Sunny Ripert paid a wonderful tribute to _why in a talk at ParisRB 2020.

https://m.youtube.com/watch?v=njr39cVU7d0

It was not so much aimed at the "old guard" (those who knew about _why) as at the "next generation" (those who had never heard of him).

At some point Sunny asked for a show of hands from anyone who knew about _why; maybe half did, tops.

By the end of the talk the emotion in the room was palpable.


I still refer to the * operator as “splat” after a presentation I saw him give in London in 2007.

_why is most definitely gone but not forgotten.


I thought splat was actually the "official" usage these days, though I think some people call it spread.


It is, yes [0].

> You can turn an Array into an argument list with * (or splat) operator

> You can turn a Hash into keyword arguments with the ** (keyword splat) operator:

It's also named splat across the MRI codebase, e.g. `VM_CALL_ARGS_SPLAT` etc.

[0] https://docs.ruby-lang.org/en/master/syntax/calling_methods_...
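
For anyone rusty on the syntax, a tiny sketch of what the docs describe; the greet method and its arguments here are made up purely for illustration:

    def greet(greeting, name, punctuation: ".")
      "#{greeting}, #{name}#{punctuation}"
    end

    args = ["Hello", "world"]
    opts = { punctuation: "!" }

    greet(*args)          # => "Hello, world."  splat: the array becomes the positional argument list
    greet(*args, **opts)  # => "Hello, world!"  keyword splat: the hash becomes keyword arguments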


It's called spread in JavaScript [0], where the syntax is `...`.

[0] https://developer.mozilla.org/en-US/docs/Web/JavaScript/Refe...


Splat is the official name in Ruby; spread comes from the similar JS construct (obviously, lots of people using Ruby also use JS, so it's natural for terminology from one context to sometimes get applied to similar concepts in the other, even where it isn't official).


I remember reading WPGTR in college. Really inspired me to play around and learn the language. I've always kept my printed copy in a safe place.


I read it after college, when I first started working with Rails 1 after joining a friend's company. Funny to think that was almost two decades ago. I still remember the quirky art style, although I will admit to preferring a more prosaic book when I actually learned the language. But it was a fun community, and the sense of creativity was really apparent.


Loved _why's stuff back when he was still around.


_why's influence is indelible.


In my experience there are two mistakes people sometimes make that lead to the fisheye lens look. The first, as the parent mentioned: when calculating the ray directions, you should linearly sample along a line (the image plane) rather than interpolating view angles. Second, when doing the per-column perspective division, you should divide by the dot product of the forward vector and the difference between the ray intersection and the camera location. Often, someone making the first mistake is also making the second.

Following this will give you the normal pinhole-camera 3D projection that we all expect to see in a 3D game.

Rule of thumb I found as a beginner in 3D graphics: any time polar coordinates seem like the obvious solution, there's usually a better way.
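
A minimal sketch of both fixes (in Ruby, just to keep the examples on this page in one language); the single-wall cast_ray stand-in and all names here are made up for illustration, not taken from any particular engine:

    WIDTH   = 320
    FORWARD = [0.0, 1.0]   # unit-length camera forward vector
    PLANE   = [0.66, 0.0]  # half-extent of the image plane, perpendicular to FORWARD

    def dot(a, b)
      a[0] * b[0] + a[1] * b[1]
    end

    # Stand-in for a real grid/DDA ray caster: intersects the ray with a single
    # wall along the line y = 5.0, just so the sketch runs end to end.
    def cast_ray(cam, dir)
      t = (5.0 - cam[1]) / dir[1]
      [cam[0] + t * dir[0], cam[1] + t * dir[1]]
    end

    def wall_distance(cam, x)
      # Fix 1: sample the image plane linearly; camera_x runs from -1 to 1 across
      # the screen. (The fisheye version instead interpolates an angle per column,
      # e.g. angle = heading + fov * (x.to_f / WIDTH - 0.5).)
      camera_x = 2.0 * x / WIDTH - 1.0
      ray_dir  = [FORWARD[0] + camera_x * PLANE[0],
                  FORWARD[1] + camera_x * PLANE[1]]

      hit = cast_ray(cam, ray_dir)

      # Fix 2: use the projected distance, i.e. the dot product of the forward
      # vector with (hit - camera), not the Euclidean length of (hit - camera).
      dot(FORWARD, [hit[0] - cam[0], hit[1] - cam[1]])
    end

    cam = [0.0, 0.0]
    (0...WIDTH).each do |x|
      column_height = 200.0 / wall_distance(cam, x)  # wall slice height in pixels
    end

With the camera facing a wall parallel to the image plane, every column gets the same projected distance (5.0 here), so the wall renders flat instead of bulging in the middle, which is exactly the fisheye symptom these two fixes remove.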

