Thank you for this wonderfully concise summary: it’s convenient to have all this in one compact document.
I suppose “flops” means “floating-point operations” here? Heretofore I’ve always encountered this as an abbreviation for “floating-point operations per second”.
When I used Matlab as an undergrad in the late '80s, it used to report the flop count after each command, and those were floating-point operations, no time units involved. I tried to find an image of the old command line but found this historical note first [1] and thought it would be of interest to readers of the article.
Indeed, the meaning of “flops” is ambiguous, but that seems hard to avoid. Fortunately, the ambiguity is easily resolved from context in most cases, as it is here.
Speaking of SVD doctors, I heard many years ago (from Alan Edelman) that Gene Golub's license plate used to be "DR SVD". Later he switched to "PROF SVD".
As I recall, the DR SVD plate was on his second car. He once left the car at SFO airport, and when he heard I was flying in, he asked me to drive it to his house (keys under the seat!). What he didn't tell me was that it was a stick shift, and only once before had a friend taught me how that works. After driving around San Francisco for Gene I got better at it, but it was definitely scary at times. :-)
> To avoid ambiguity I usually use `flop/s` but not everyone likes that :)
Flop/s makes no sense and is outright wrong. The whole point of flops is to express how many floating point operations are required by an algorithm. How many operations are performed per second is a property of the hardware you're using to run an implementation of the algorithm.
You want to express computational complexity in terms of floating point operations.
Well, yeah, I use it in the context of floating-point operations per second. That's the most common use in my field. I was replying to the parent comment. No need to use this kind of tone.
The number of operations per second varies by at least an order of magnitude on the same hardware, depending on which GEMM algorithm you use (reference or Goto). You quote the performance of an implementation in FLOP/s, computed from the known number of FLOPs required by the computation. That makes sense to people who implement and measure these things.
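As a minimal sketch of the distinction being drawn here: the flop count of a dense matrix multiply is a fixed property of the computation (roughly 2mnk for an m×k by k×n product), while the FLOP/s rate is a measured property of the hardware and the BLAS implementation. Assuming NumPy with any standard BLAS backend:

```python
import time
import numpy as np

# A dense (m x k) @ (k x n) multiply performs roughly 2*m*n*k
# floating-point operations (one multiply + one add per
# inner-product term). This count is a property of the
# algorithm, independent of the hardware.
m, n, k = 1024, 1024, 1024
flop_count = 2 * m * n * k

A = np.random.rand(m, k)
B = np.random.rand(k, n)

A @ B  # warm-up, so timing excludes one-time setup costs

t0 = time.perf_counter()
C = A @ B
elapsed = time.perf_counter() - t0

# FLOP/s = known flop count of the computation / measured time.
# This rate varies with the hardware and the GEMM implementation,
# even though flop_count does not.
print(f"{flop_count} flops in {elapsed:.4f} s "
      f"-> {flop_count / elapsed / 1e9:.1f} GFLOP/s")
```

Running the same script against a reference (naive) GEMM versus an optimized one would change the printed GFLOP/s figure, not the flop count, which is the point of the distinction above.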