Would anybody recommend learning J/K/APL?

ozan · Apr 14, 2009 · Viewed 14.4k times

I came across J/K/APL a few months ago while working my way through some Project Euler problems, and was intrigued, to say the least. For every elegant-looking 20-line Python solution I produced, there'd be a gobsmacking 20-character J solution that ran in a tenth of the time. I've been keen to learn some basic J, and have made a few attempts at picking up the vocabulary, but have found the learning curve to be quite steep.
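To give a concrete sense of that contrast, here is a sketch using Project Euler Problem 1 (sum of the multiples of 3 or 5 below 1000) as an illustrative stand-in; the specific problem is my choice, not one named in this post.

    # Straightforward Python: readable, one line of actual logic.
    total = sum(x for x in range(1000) if x % 3 == 0 or x % 5 == 0)
    print(total)  # 233168

    # A typical J solution to the same problem is a single terse expression,
    # roughly:  +/ I. +./ 0 = 3 5 |/ i.1000
    # (build a divisibility table, OR its rows, take the indices, sum them).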

To those who are familiar with these languages, would you recommend investing some time to learn one (I'm thinking J in particular)? I would do so more for the purpose of satisfying my curiosity than for career advancement or some such thing.

Some personal circumstances to consider, if you care to:

  • I love mathematics and use it daily in my work (as a mathematician for a startup), but to be honest I don't really feel limited by the tools I use (like Python + NumPy), so I can't use that excuse.
  • I have no particular desire to work in the finance industry, which seems to be the main port of call for K users at least. Plus, I should really learn C# next, since it's the primary language where I work. So practically speaking, J almost certainly shouldn't be the next language I learn.
  • I'm reasonably familiar with MATLAB so using an array-based programming language wouldn't constitute a tremendous paradigm shift.

Any advice from those familiar with these languages would be much appreciated.

Answer

S.Lott · Apr 14, 2009

Thousands of years ago, I was an APL programmer. By thousands, I mean back in the '70s, when the custom character set meant we had special printing terminals with the APL keyboard and character set, IBM Selectric typeballs with the special characters, etc.

I went to a lecture by Ken Iverson on "Why APL Was Cool".

His thesis was this. Once upon a time, long division was a serious mathematical undertaking, reserved for graduate students. Notation for things like repeating decimal expansions involved a large pile of mathematical symbolism. Once upon a time, even something like a "negative" number required elaborate notation.

Over the years -- as we came to a better understanding of these abstractions -- we came up with much more compact notation for complex concepts.

The point of APL (and J and K) is to summarize big algorithms into tidy notation.

Nowadays, I'm a Python programmer. I find that my early exposure to APL warped my brain by forcing me to ask "what's this mean?" and "is this a reusable operation?" and "what's a pithy summary for all this algorithmic fluff?"

Also, as I pursue the Project Euler problems, the "functional programming lite" of Python and my "gin-soaked recollections of APL" are both very helpful in tackling the exercises.
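As a small sketch of that "functional programming lite" style: whole-collection operations instead of explicit index loops, much as APL/J encourage. The example problem (Project Euler Problem 6, the difference between the square of the sum and the sum of the squares of 1..100) is my choice, not one cited in the answer.

    # Whole-collection operations: no explicit indexing or accumulation loops.
    n = 100
    square_of_sum = sum(range(1, n + 1)) ** 2
    sum_of_squares = sum(x * x for x in range(1, n + 1))
    print(square_of_sum - sum_of_squares)  # 25164150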