In what ways is J an abomination? APL (and to a degree J) is on my bucket list of languages to learn. I’d always read that J was a sort of spiritual successor to APL.
I think my reasons are very different from robomartin's but I share the opinion that J has some pretty serious flaws. Some are brought forward from APL, some are not.
It has no context-free grammar (but neither does APL). Scoping is local or global: if a variable in a function is local, even inner functions can't access it. It has namespaces (locales), but they are not first-class and are instead referenced by numbers or strings, so they can't be garbage collected. Until this year, there was no syntax to define a function—even though control structures had dedicated syntax! System functionality has no syntactic support, so it's provided by passing a numeric code to the "foreign" operator. To enable recursive tacit functions, functions are passed by name. Not just as arguments: anywhere. The name is just a string with no context information, so if you pass a function to an explicit modifier and the name means something different in that scope it will do something different (probably a value error, fortunately). Oh, also these names store function ranks, and if you define the function later it won't pick it up, so even the tacit recursion thing is broken.
The designers of J weren't familiar with design principles that are common knowledge outside the APL world, and it really shows. J was my first programming language (well, except TI basic) and while I'm still a big fan of array programming it really held me back from learning these things as well.
Functions, if you are referring to monads and dyads, can be defined using "3 : '...'" or "3 : 0\n...\n)" using "3" for monads and "4" for dyads.
As for rank information, functions do carry that rank. I believe you are not defining any rank, in which case it becomes infinite rank and, depending on usage, will be applied incorrectly.
For tacit recursion, there is the "$:" operator, which allows a tacit function to call itself anonymously. You may also need the agenda operator "@." to define your base case with a gerund "`".
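To make those pieces concrete, here is a small sketch in J of both styles of definition (the verb names `fact` and `factt` are my own; outputs shown as comments):

```j
NB. Explicit monadic definition: 3 : 0 opens the body, a lone ) closes it
fact =: 3 : 0
if. y = 0 do. 1 else. y * fact y - 1 end.
)

NB. Tacit equivalent: $: is anonymous self-reference, @. (agenda) selects
NB. from the gerund 1:`(...) using * (signum) of the argument as the index
factt =: 1:`(*$:@<:)@.*

fact 5            NB. 120
factt 5           NB. 120
factt"0 i. 6      NB. 1 1 2 6 24 120  (rank 0 applies it element-wise)
```

The agenda line is the standard J idiom for a tacit base case: signum yields 0 for the argument 0 (selecting the constant verb `1:`) and 1 otherwise (selecting the recursive branch `*$:@<:`).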
The "foreigns" table, while useful, seems like a cludgy way to introduce functions that don't or are difficult to introduce cohesively into J's notation.
He holds a grudge against J, but not everyone does. I write research code in J and it's a very good little language. For instance, scripting is much easier in J. (Dyalog) APL is much more like an APL machine (in the sense of the LISP machine).
No, I don't hold a grudge against J. That's preposterous. Silly, really. These are tools.
No. I don't prefer one symbol to two characters. That is also silly.
You have to understand the HISTORY in order to understand my statement.
APL was, at the time, brilliant. Remember that it started in the 60's. Way ahead of its time. I learned and worked with it professionally in the 80's and early 90's.
Ken Iverson, the creator of APL, understood the power of notation as a tool for thought. In fact, he famously authored a paper with exactly that title [0].
I had the pleasure of being at an APL conference where Iverson himself presented and discussed this paper. I also took advantage of tutorials and sessions by Iverson and many of the early APL luminaries of the time.
The power of notation might not be easy to understand without truly internalizing APL or having good command of a reasonable analog. For example, a classically trained musician appreciates the value of musical notation. While not perfect, the alternatives have failed to deliver equivalent results, power, expression, etc. The day something else is used to score and perform orchestral works we might have something to consider.
There are other examples of the power of notation and the paper covers the subject well.
So, why is it I say J is an abomination?
History.
Why does J exist? Why did Iverson go this way after correctly noting and promoting the idea that notation was a powerful tool?
He made a mistake, likely driven by a failed attempt to improve business outcomes.
Here's the history part.
Back in the '80s, rendering APL characters was not easy. On mainframe-based systems we had to use specialized IBM or Tektronix terminals and printers. When the IBM PC came out we had to physically replace the video card's character ROM (not sure most people know what this is these days) in order to get APL characters.
A common hack was to install a kludgy setup with a toggle switch so you could switch between APL and standard characters. The keyboard, for the most part, got stickers glued to it on the front face of the keycaps. You could, eventually, buy new keycaps for the IBM PC standard keyboard.
Printers suffered a similar fate. You had to reprogram and hack Epson and other printers in order to be able to print the characters.
Incidentally, if you wanted to use a PC for, say, Japanese and English back then you had to resort to the same kinds of hacks or buy specialized video cards and software.
I could go on. The point is that you had to be truly determined to do APL back then and it was a pain in the ass. Convincing an employer to hack 500 PC's so you could migrate to APL was an exercise in futility. Financials and other industries where the power of APL could be put to good use took the plunge, nobody else did. I did an APL-based DNA sequencing tool for a pharmaceutical company back in the 80's.
APL wasn't going to go very far beyond academic and corner-case circles under those conditions.
That's when Iverson came up with the J concept of transliterating APL characters to combinations of standard ASCII characters. It was hard to impossible to sell APL to the masses given the issues of the time. Iverson thought the transliteration would open the doors to the wider audience. Well, it did not. Among other things, notation, believe it or not, is much more practical and easier to learn than a seemingly random mish-mash of ASCII characters.
From my perspective Iverson suffered from a lack of conviction based on likely business pressure. The hardware of the day was conspiring against being able to push APL out to the masses. He did not have visibility into what was coming. Shortly after he took the J trip, the IBM PC went fully graphical and universal multi language (spoken) characters could be rendered without hardware hacks. Except that now Iverson, the creator of APL, had his business interests firmly attached to J. Here we had a situation where the creator of APL pretty much abandoned the language he brilliantly developed, taught and promoted for over twenty years.
The J enterprise, as a business, diverted him from his path and likely seriously damaged APL. And it failed. It absolutely failed. Nobody who used APL for any non-trivial application was going to consider J. Not due to some test of purity. No, it was because the power of notation is such that J was, in fact, a complete abomination. The only way anyone I knew would consider it was if by force. I can't think of any APL-ers of note of the era that truly jumped on the J bandwagon. The proof that J failed is simple. It's just as dead as APL. Corner case use, sure. Just like APL. I used APL professionally for ten years and I would not touch it at all for any real application today. Anyone doing so would be crazy to make that choice. The only exception would be in maintaining or extending an existing system. Even then, you have to very seriously consider porting it to a modern language.
Notation isn't "funny characters". It's a powerful tool of both expression and thought. If notation is "funny characters", what do we say about every written human language with "funny characters" (Chinese, Japanese, Greek, Arabic, Hebrew, etc.)? Do we convert every word in Chinese into ASCII transliterations of the symbols just so it doesn't look "funny" and to make Chinese more popular? No, this would be an abomination. Chinese is a rich and expressive language, complex, yes, of course. And yet the power of this language lies, among other things, in the notation used to put it to paper.
Imagine converting every "funny character" human language to a mish-mash of ASCII just because the IBM PC could not display them in the early days. Imagine then saying that someone calls these abominations because "he prefers one fancy character to two ugly characters" or "He holds a grudge against <language>". The first thing that comes to mind is "ignorant", followed by "misplaced". Learning about and understanding history and context is very important.
Can J be used for useful applications? Of course. So can COBOL, FORTH and LISP. Yet this does not mean it is a good idea. And using J to write research code (which I did with APL in school as well) is nowhere near the realities of real code in real life at scale and maintainable over time and hardware evolution. Extending this to mean that abandoning the power of APL notation due to hardware issues was a good long-term idea is, in my opinion, very wrong. J has no future. Neither does APL in its current form. I still think everyone should be exposed to languages like FORTH, LISP and APL in school. There's a lot of value in this.
EDIT: Imagine if Guido had abandoned Python just before it started to become popular and went in a different direction. The language would have stagnated and likely become irrelevant. That's what happened to APL. And J went nowhere as well. Iverson confused and antagonized the APL community to go after perceived business opportunities. In doing so he pretty much destroyed APL and J.
First, abandoning a new branch of computing that started with the development of a specialized notation. As I said elsewhere, the power this offers cannot be appreciated without a suitable frame of reference. This can be either the use of APL (to a good degree of competency, not just dabbling) or a reasonable analog, such as musical notation.
The timing for the introduction of J was terrible. Computers made the transition from text-only terminal output to full-on, character-agnostic and graphically-rich output just around that time. I can't fault Iverson for this, nobody has a crystal ball. Having the inventor/founder of APL leave the language behind for a half measure that was primarily a reaction to character-only computers did a lot of damage to APL. One has to wonder how things might have evolved had he stayed the course he charted. After three decades of educating an entire community on the value and power of notation he threw it all away purely due to a mistimed decision about computer hardware of the day. As I said elsewhere, not the first time and not the last time someone in technology makes a bad decision. None of us are immune to this.
Maybe I'm missing the point but could you clarify a bit more on APL's notation vs J's notation?
Speaking as someone who is not very well math inclined and as someone who was born in an Asian country, both APL's special characters for verbs and J's alphabetical characters for verbs are similar enough for me. Both languages use symbols for verbs; it's just that J's symbols happen to very closely resemble the characters of the English alphabet.
Although, due to the familiarity of the English alphabet, J's symbols might intuitively bring up ideas of the alphabet character, is it not possible to just think of them as new mathematical symbols? For example, instead of seeing "o." as the alphabet character 'o' followed by a period, couldn't it be seen as a circle followed by a dot? Or if we lived in a world where the characters of the English alphabet were swapped with the special characters of APL, would J's notation still be broken? Does familiarity with the symbols used in a notation make it any less powerful?
Maybe the reason why I don't understand is because I haven't tried APL and only tried J. And I eventually ended up quitting on learning J because it was starting to get too difficult for me. Would it be possible to explain the differences between APL's notation and J's notation in an easier or simpler fashion?
APL’s verbs are geometrically suggestive. They are little pictures that represent what they do, and how they are related to each other. For example, ⌽ reverses an array along its last axis; you can see the array flipping around the vertical line. And you will know what ⍉ does without looking it up, I bet. These symbols are so well designed that you don’t have to memorize much, because they document themselves.
Couldn't the same be said of J? If APL's powerful notation comes from not having to memorize much and being a good visual representation of what the verb does, doesn't J's usage of alphabetical characters achieve something similar albeit a bit worse? For example "i." for index and related functions. Since the letter 'i' is usually used for indexing, one could assume that "i." is something related to indexing. Does the usage of alphabetical characters weaken the notation so much that it could be considered an abomination?
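For what it's worth, a few of those two-character J verbs in action (a sketch; results shown as comments):

```j
i. 5               NB. 0 1 2 3 4    (monadic: integers)
'abcde' i. 'dab'   NB. 3 0 1        (dyadic: index of)
o. 1               NB. 3.14159      (pi times the argument)
|. i. 5            NB. 4 3 2 1 0    (reverse, APL's ⌽)
```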
If there was another language that was a copy of APL but with new non-alphabetical symbols that were less suggestive than the original APL symbols, would that language be considered to have a less powerful notation? If so, how much weaker would it be considered? What would the symbols of a language that is APL-like and uses non-alphabetical characters, but would still be considered an abomination look like? Would that language be considered to have a more powerful notation than J?
This might be a bit of a stretch but I'd like to use the symbols on a media player as an analogy. The symbols on a media player (play, pause, resume, seek back, seek forward) could be compared to APL's symbols. Then, for the J version of the media player, rather than the symbols, there could be "Pl", "Pa", "Re", "SB", "SF" or something of the sort. I would say that APL's symbols do look nicer, but I don't think J's usage of alphabetical characters should be considered an abomination. If so, wouldn't all text GUIs (e.g. command-line file managers such as nnn or Midnight Commander) be considered an abomination compared to a regular GUI version?
Maybe I'm not looking at the right thing here but APL's and J's notation seem to be similar. One does look better than the other, but both seem to serve the same purpose.
I’ve only glanced at J and never used it, so I don’t have any strong opinions about it. But APL just has that extra magic that J seems to lack. Notation does matter. It could be that I’m partially sentimental, as it’s the first programming language that I learned.
> Maybe I'm not looking at the right thing here but APL's and J's notation seem to be similar. One does look better than the other, but both seem to serve the same purpose.
Not sure if it is possible to understand this without having the context of being well versed in another means of communication that uses specialized notation. Musical notation being an easy example of this. Mathematics could be another. And, of course, languages that don't use the Latin alphabet. Outside of APL, I happen to be fluent in musical notation and one non-ASCII written language, as well as having the mathematical background.
The closest I can come to explaining what happened with J is that they did their best to convert every APL symbol into an equivalent combination of ASCII characters. Here's the key:
They did NOT do this because Iverson thought this was a better path forward. He did not abandon thirty years of history creating and promoting notation because mashing ASCII characters together was a better idea. He did this because computers of the day made rendering non-ASCII characters a pain in the ass. This got in the way of both commercial and open adoption of the language. He likely genuinely thought the transliteration would bring array programming concepts to the masses. It did not.
In the grand context of computing, J is a failure and APL suffered greatly when its creator and primary evangelist abandoned it.
Imagine a world where people are writing perfectly legible code in C, Basic, Pascal, etc. Now imagine someone proposing the use of seemingly random arrangements of ASCII characters instead of those languages. It's like telling everyone: Stop programming in these languages! We are all going to program in something that looks like regex!
Well, the rest is history. The proof is in the fact that APL is but a curiosity and J isn't a commercially viable tool. Yes, they both exist in corner-case applications or legacy use. Nobody in their right mind would use either of them for anything other than trivial personal or academic applications. That's coming from someone who used APL professionally for ten years and even envisioned a future creating hardware-accelerated APL computing systems at some point. It's computer science history now.
I still think it should be taught (along with FORTH and LISP) as there's value in understanding a different way of thinking about solving problems computationally.
As an extension of this, part of me still thinks that the future of computing might require the development of specialized notation. For some reason I tend to think that working at a higher level (think real AI) almost requires us to be able to move away (or augment) text-based programming with something that allows us to express ideas and think at a different level.
Thanks for taking the time to reply. I think I'm beginning to understand but am not quite sure.
While I wouldn't consider myself fluent in any of the following, I do know how to read musical notation (from middle school/high school band) and I can read/write/speak a non-ASCII language (Korean). So I am somewhat familiar with non-ASCII notation.
> The closest I can come to explaining what happened with J is that they did their best to convert every APL symbol into an equivalent combination of ASCII characters.
This is the statement I keep on getting stuck on. From what I have read, besides the symbols being converted to ASCII characters, APL and J are generally the same. Both work on arrays, both are parsed right to left, etc. It seems like the only major change is that the symbols got converted to ASCII characters that are at a maximum 2 characters long. If this is the case, what would you say about the J language's notation if the authors one day decided to change all the symbols to non-ASCII characters? Everything else would stay the same, such as what the symbols do and how much space the symbols take up (max 2 characters). If the J language were to change only its symbols and nothing else, would its notation be considered to be on par with APL's?
As you mentioned, my lack of proficiency in other specialized notation might be preventing me from understanding the issue. That said, your last set of comments strikes a chord with me and I do think I kind of understand. As you mentioned previously, notation is "a powerful tool of both expression and thought." The usage of specialized notations allows one to express their thoughts and ideas in a way that normal writing can't. But I guess this is where being well versed in the subject matter comes into play, since after all it is a "specialized" notation. It would be difficult for someone who doesn't have a strong background in the subject matter to take advantage of the specialized notation.
To me, with my limited knowledge and experience, J vs APL appears to be a symbol (graphical) design comparison rather than a notation design comparison. And as someone who doesn't have a strong mathematical background, both APL's and J's symbols conveyed nothing to me when I first saw them. Changing the symbols to non-ASCII or ASCII has no effect on me besides figuring out how I would input the non-ASCII characters. But I suppose that to you, a change in the symbols isn't something so superficial. The way I understand APL vs J now is that for those who are experienced in APL, the changing of the non-ASCII symbols to ASCII characters, simply for the purpose of not having to go through the trouble of inputting non-ASCII characters, "broke" the notation.
> what would you say about the J language's notation if the authors one day decided to change all the symbols to non-ASCII characters?
That's a very interesting question. I think the only possible answer has to be that this would return the language to what I am going to label as the right path. It would be wonderful.
APL is the only programming language in history to attempt to develop a notation for computing. Iverson actually invented it to help describe the inner workings of the IBM mainframe processors. Any hardware engineer who has ever read a databook for, say, Intel (and other) processors has run into APL-inspired notation that made it into the language of explaining how processor instructions work. It's a remarkable piece of CS history.
> besides the symbols being converted to ASCII characters, APL and J are generally the same
Let's call it "notation" rather than "symbols". The distinction I make is the difference between a language and just a set of glyphs that are not entirely related to each other.
You might want to read Iverson's original paper on notation. It makes a very strong argument. Coming from the man who created APL, this is powerful. It also --at least to me-- tells me that his detour into J had to be motivated by business pressures. There is no way a man makes such an effort and defends a notion with such dedication for three decades only to throw it out for something that isn't objectively better.
I don't think we can find a paper from Ken Iverson that says something like "I abandoned three decades of promoting a new notation for computing and created J because this is better". You will find statements that hint at the issues with hardware of the era and the problems this created in popularizing APL.
Here's my lame attempt to further explore the difference. I don't know Korean at all. I just used Google translate and this is what I got for "hello from earth":
지구에서 안녕
I count seven characters, including the space.
Let's create J-korean because we are in the 80's and it is difficult to display Korean characters.
지 This looks like a "T" and an "L": So "TL".
구 This looks like a number "7" with a line across it: "7-"
에 This looks like an "o" with two "L"'s, one with a pre-dash: "O-LL"
서 This looks like an "A" with a dashed-"L": "A-L"
안 This looks like an "o" with a dashed-"L" and another "L": "OLL-"
녕 This looks like an "L" with two dashes and an "O": "L--O"
Space remains a space.
Here's that phrase in J-korean:
TL7-O-LLA-L OLL-L--O
It's a mess. You can't tell where something starts and ends.
OK, let's add a separator character then: "|"
TL|7-|O-LL|A-L| |OLL-|L--O|
Better? Well, just in case we can do better, let's make the space a separator. Two spaces in a row denote a single space:
TL 7- O-LL A-L OLL- L--O
We have now transliterated Korean characters into an ASCII printable and readable combination of characters.
Isn't this an abomination?
We destroyed the Korean language purely because computers in the 80's could not display the characters. We have now trifurcated the history of the language. Which fork will people adopt? Which will they abandon? Will all books be re-written in the new transliterated form?
Which of the above encodings (real Korean and the two transliterations) conveys, communicates and allows one to think in Korean with the least effort and the greatest degree of expressive freedom?
If I, not knowing one bit of Korean, expressed a strong opinion about J-korean being better because it doesn't use "funny symbols" I would not be treated kindly (and rightly so).
I don't know if this clarifies how I see the difference between APL and J. Had we stayed with APL's notation, evolved and enriched it over the last few decades we would have had an amazing tool for, well, thought and the expression of computational solutions to problems. No telling where it would have led. Instead Iverson took a path driven by the limitations of the hardware available at the time and managed to effectively kill both languages.
I happen to believe that the future of AI requires a specialized notation. I can't put my finger on what this means at this time. This might be a worthwhile pursuit at a future time, if I ever retire (I can't even think about what that means...I love what I do).
Here's Iverson's paper on notation. It is well worth reading. It really goes into the advantages of notation to a far greater level of detail than is possible on HN comments:
> I happen to believe that the future of AI requires a specialized notation. I can't put my finger on what this means at this time. This might be a worthwhile pursuit at a future time, if I ever retire (I can't even think about what that means...I love what I do).
I also share this opinion and I might know what you mean. A lot of breakthroughs in physics are due to new notation – e.g. Maxwell's equations, Einstein notation, etc. Or to be precise, it is easier to think new thoughts in a notation/language that is suited for them.
Current machine learning is a 90:10 blend of empirical work followed by delayed theory. The language for theory-based ML is math with paper and pencil. The languages for empirical experiments, however, are PyTorch, TensorFlow, JAX, Dex, Julia, Swift, R, etc. The former is a "dead" historical language; the latter are notations for differentiable computational graphs that can run on modern HW accelerators. What these programming languages have in common is that they were all influenced by APL. And funnily enough, machine learning is the best use case for APL, yet there APL is practically non-existent. APL should be reborn as a differentiable tensor-oriented notation. That would be wild – prototyping new ML architectures at the speed of thought.
Anyway, another angle for APL + ML is an alternative interface – write APL with pencil on paper that understands your handwriting and evaluates your APL scribbles. [0] I committed myself to providing BQN [1] with such an interface and seeing where it leads.
Ideally, the best outcome would be the combination of both approaches.
Thank you for the discussion. I am now convinced but unfortunately, I cannot confidently say that I deeply understand.
I've taken a shot at the linked paper but will require more readings to fully grasp what it's saying. However, between what I understood of the paper and the example that you provided, it makes sense that J would be considered an abomination of APL.
Hopefully, after reading the paper a few more times and maybe even trying out APL, I'll have a better understanding. Thanks again for your time.