Re: [dev] a suckless computer algebra system

From: David Tweed <>
Date: Fri, 20 Nov 2009 14:57:24 +0000

On Fri, Nov 20, 2009 at 2:15 PM, Jukka Ruohonen <> wrote:
> On Fri, Nov 20, 2009 at 01:53:47PM +0000, David Tweed wrote:
>> FWIW, my understanding is that the LAPACK library must have an API
>> which conforms with a reference Fortran implementation, but there are
>> various versions implemented in various languages (Fortran, C, CUDA,
>> etc).
> This is true. But LAPACK itself is a stand-alone Fortran library;
> typically it is used via things like f2c.
>> As for the code "quality", I can see the code driving certain people
>> on this list mad because it deliberately doesn't compute things in the
simplest way and fewest lines in order to do things like achieve close
>> to optimal cache blocking on modern multicore machines. A comparison
>> of how much performance can vary depending on how it's coded can be
>> glimpsed in the graphs in this paper:
> This is again true, IMO. I'd say that in the sense of traditional software
> engineering, the code quality of numerical software is generally terrible.

I was pointing out more how simple-minded software metrics would
condemn you to roughly the level of performance achieved by the
reference LAPACK (white bars) in the paper referenced, which to my
mind suggests there's a flaw in the software metrics. I'd also query
the claim that code quality is terrible in most numerical software: what
I'd say is that they've got a task to achieve (i.e., using as much of
the computing power as possible) and make the software as simple and
maintainable as it can be given that task. (What they don't generally
do is say "if we reduce what portion of the task we'll implement for
users, we get wonderfully simple code".)

> But as for validity and reliability of numerical algorithms, the thing that
> really matters in this context, LAPACK is again without doubt the most
> respected library. In fact, it is intriguing to follow the history of
> numerical matrix algebra and the close correspondence of it with the
> development of ALGOL, LINPACK, and later LAPACK.

This is probably splitting hairs, but my understanding is that LAPACK
(and BLAS below it) are more "specifications of library functionality
(in the form of a reference implementation)" rather than a single
library. The development history is certainly interesting,
particularly the adaptation from old-style assumptions about computers
(memory is memory is memory) to taking active steps to optimise for
the multi-level memory hierarchy of modern machines, starting around
the time of GotoBLAS.

cheers, dave tweed
__________________________
computer vision researcher:
"while having code so boring anyone can maintain it, use Python." --
attempted insult seen on slashdot
Received on Fri Nov 20 2009 - 14:57:24 UTC

This archive was generated by hypermail 2.2.0 : Fri Nov 20 2009 - 15:00:02 UTC