"Carl W. Brown" <email@example.com> said:
>Say you wanted to do a table lookup. APL has no string operations so you
>are comparing a one dimensional character array against a two dimensional
>character array. You use an outer product multiply function substituting
>the multiply for a compare. This yields a three dimensional array of
>true/false comparisons. You then compare rows of trues to produce a matrix
>that is one dimension smaller. When you have reduced the results to one
>dimension you have a single row of true/false indicators corresponding to
>the results of your table compare. This can be coded as a single line of
>code read right to left.
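For anyone without an APL keyboard, the idiom described above can be sketched in NumPy (a stand-in of my choosing; the table and key below are made-up example data). An outer-product compare produces the three-dimensional array of true/false results, which is then reduced back down to one indicator per row:

```python
import numpy as np

# Hypothetical fixed-width lookup table: one name per row.
table = np.array([list("RED  "), list("GREEN"), list("BLUE ")])
key   = np.array(list("GREEN"))

# Outer-product compare: every table character against every key
# character -- a 3-D boolean array (rows x columns x key length).
outer = table[:, :, None] == key[None, None, :]

# Keep the positions where column index equals key index (the
# diagonal), leaving a rows x key-length matrix of comparisons.
diag = outer.diagonal(axis1=1, axis2=2)

# AND-reduce each row down to one dimension: a single row of
# true/false indicators, one per table entry.
hits = diag.all(axis=1)

print(hits)
print(np.flatnonzero(hits))  # row index of the matching entry
```

In APL the outer product and the reduction fuse into a single right-to-left expression; NumPy needs the intermediate arrays spelled out.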
Well, it takes a long time to learn APL. You can write tiny programs that do
amazing things! I think you could do the above using Union and Rho over the
parallel arrays. Certainly about six typed characters would do it. They are
coded, as you mentioned, right to left.
>I have spent more time figuring out what the code is doing than writing it
>in the first place.
Oh yeah. You can't figure it out once you've written it.
>To make it even less comprehensible you often have the program construct
>lines of code to execute, so the program builds its own executable code as
Well, you can do that, as you said. That's why you can write artificial
intelligence type code with ease. Many years ago I wrote an APL program to
play an obscure European game called Mill. In no time, it learned enough that
I couldn't beat it! So I looked at the code it wrote, and I couldn't
understand it, either. And I wrote the program!
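The trick behind programs like that is APL's execute primitive: the program assembles a line of source text at run time and runs it. A minimal sketch of the same idea, in Python standing in for APL (the rule-building logic here is a hypothetical illustration, not the original Mill program):

```python
def make_rule(threshold):
    # Build a line of source text, then turn it into a live
    # function -- the program writing its own executable code,
    # the way APL's execute primitive evaluates a character vector.
    src = f"lambda score: score > {threshold}"
    return eval(src)

# Thresholds could be tuned from play; they are fixed here.
rules = [make_rule(t) for t in (10, 25, 50)]
print([rule(30) for rule in rules])
```

Once the generated text is gone, all that remains is the behavior, which is exactly why the author of such a program can no longer read what it "wrote."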
APL is the best and the worst of what a computer language can be, all rolled
into one. But there are things you can do that seem almost breathtaking,
and it's way faster to write than any normal procedural language. You have to
be really smart, have a lot of time to study, have a great memory for the 97
operators, each used at least two ways (monadic, dyadic, vector), and
preferably have no life to distract you whatsoever.
That's what makes it soooo great and sooo weird.
(Sorry it's off topic, but I had to defend "A Programming Language".)
An explanation of its origin by the creator, Kenneth E. Iverson, is at:
Among other things, it replaces calculus with a notation that actually works.
This archive was generated by hypermail 2.1.2 : Tue Jul 10 2001 - 17:21:20 EDT