Seems like a good example of technical people placing too much importance on a technical decision.
It seems clear Lisp would have been a perfectly fine choice, as would a number of other languages. It also seems unlikely the world (or even Mathematica) would be dramatically different for it.
I wish more "X is better than Y because of Z" discussions would admit aesthetics were an important factor. Instead we end up with convoluted justifications that just annoy everyone (Lisp is slow, C is fast, C++ is complicated).
Wolfram's decision at worst did not impede the development of a tremendously successful software product. At best it enabled it. Mathematica is 27 years old. It predates the i486, the first mass-market x86 CPU with an integrated FPU. In the early 1980s betting on Lisp was betting on Lisp machines and workstation-class hardware. The first version of Mathematica ran on M68000 Macintosh machines. It's hard to imagine overcoming the pain that something slower than C would have inflicted on users.
> The first version of Mathematica ran on M68000 Macintosh machines. It's hard to imagine overcoming the pain that something slower than C would have inflicted on users.
According to this (https://en.wikipedia.org/wiki/Macsyma#Commercialization) Wikipedia article, Macsyma (a Lisp-based CAS that Mathematica was designed to compete with) was running on 68000 Sun-1s in the mid-80s and a Windows port came out about a year after Mathematica came out for the Macintosh (1989 and 1988, respectively). The pain must have been real because Mathematica ended up with all of Macsyma's market: "Macsyma's market share in symbolic math software had fallen from 70% in 1987 to 1% in 1992"
For this domain which is largely compute bound when it counts, for the PC systems it was targeting back in those days, C was almost certainly the right choice if you had enough resources for the extra programming work required.
Commercial Macsyma is ... a special, and rather sad case. As explained to me by Danny Hillis in 1982-3 when he wished my company, LMI, could provide Lisp Machines for Thinking Machines, Inc. to help develop the Connection Machine 1, one motivation was Symbolics' pathological business practices, and the nastiest example then was Macsyma. Back then the MIT Technology Licensing office was still horrible, and it was arranged that Arthur D. Little, I think, recommend how Macsyma be licensed, and it ended up being exclusively to Symbolics, which was not a common approach.
As far as we could tell, Symbolics bought it primarily to keep it out of the hands of LMI and anyone wanting to run it on conventional hardware, and went so far as to try to get people who had Vaxsyma copies to send them back and stop using it, which was not well received as you might imagine. That helped Fateman, using the DoE which had sponsored much of the work, to force MIT to release a snapshot as open source, which eventually became Maxima.
In house, Symbolics treated it with benign neglect, and I don't think it was massively improved. This situation became ironic as their hardware business declined and the Macsyma unit became an important cash cow, but as these things go, for the usual internal political reasons, it never got development resources commensurate with its actual status and potential.
MIT also didn't reward the people who'd originally written it at MIT, something Joel Moses, who I happened to be directly reporting to in the 1987-8 time frame when he was the EECS department head, was obviously not happy with, along with I'm sure many others. So in short, Symbolics did nearly everything they could to mess up the Macsyma community and product, and the noted decline, once there were good alternatives, was inevitable.
RMS is not 100% wrong in his loathing of Symbolics....
> In house, Symbolics treated it with benign neglect, and I don't think it was massively improved.
There is a lot of butthurt from various people, who seem to know better how to run a company in hindsight. Symbolics sold Macsyma on various platforms: Lispm, Windows, DEC VAX, Sun Unix.
> it never got development resources commensurate with its actual status and potential.
Symbolics was still selling Macsyma at a time when competitors like LMI or TI were no longer in the Lisp business.
> There is a lot of butthurt from various people, who seem to know better how to run a company in hindsight. Symbolics sold Macsyma on various platforms: Lispm, Windows, DEC VAX, Sun Unix.
The exclusive licensing to Symbolics resulted in a delay in porting it to non-Lispm platforms, due in part to the cited internal opposition. And not too much later Symbolics effectively exited the market; they too have a lot to say about how not to run a company.
> Symbolics was still selling Macsyma at a time when competitors like LMI or TI were no longer in the Lisp business.
Yet for some inexplicable reason people stopped buying it, and its market share crashed from 70% to 1% in 5 years.
Between Macsyma, S-Graphics, and Statice Symbolics could have been a really great software business. S-Graphics was sold off to Nichimen and developed and marketed as Mirai until the early 2000s, and the Statice guys all left to make ObjectStore. I read on Usenet somewhere that there was an unsuccessful attempt to acquire Macsyma as a separate business.
> Between Macsyma, S-Graphics, and Statice Symbolics could have been a really great software business.
How so? Don't you think they tried and explored that? In reality, by the early 90s nobody was interested in Lisp anymore. Macsyma was still sold to the market, but didn't have much success. Nichimen's N-World had a small customer base, ran on SGIs (which were still expensive) and then under Windows NT. Statice on non-Lispms never left beta status.
> I read on Usenet somewhere that there was an unsuccessful attempt to acquire Macsyma as a separate business.
That's false. See Macsyma, Inc. The company was founded in 1992 and in 1999 was acquired by 'Symbolics Technology'.
To me it sounds more like NIH syndrome. Wolfram was young, smart, cocky, and didn't really know much about computation when he started - but was convinced he could do a better job if he started from scratch.
He remains two of those things to this day. I have no doubt Wolfram is legitimately a genius (despite his arrogance and tendency to take credit for other people's inventions) and I adore Mathematica as it is, but I can't help but wonder how much better it would be if Wolfram's personality was less... abrasive.
I've never met him but have read some of his stuff over the years and talked to a few people who have (met him).
He strikes me as clearly very smart, just not as smart as he thinks he is (not uncommon, but rarer at his level of talent, I think). An aside: For what it's worth, I find the "genius" label problematic in general, not just for him. I think it's a concept that probably has useful application, but to < 1% of the people to whom it's applied.
His abilities as a technical writer are middling, unfortunately, and at some point being able to communicate your ideas is almost as important as the ideas.
I too wonder how much further Mathematica could have got if he hadn't pushed away a number of clearly talented people.
I agree genius is overused and try to avoid it. My point was I get the impression that he might really deserve it, but his personality leaves you with a distinctly bad aftertaste. I've only met maybe 2 or 3 people I really thought deserved the label, though I guess it's hard for me to judge given that I'm certainly not one. I do however mean a genius as measured purely by ability... in my mind there is a distinction between people who have the abilities of a "genius" versus someone with the achievements of a genius. Hard work by normal people most often produces the latter, whereas the former case is much harder to identify.
And I agree wholeheartedly... I once had students ask me why I put so much emphasis on lab reports in an [upper level engineering course]. My sincere belief is that at least 50% of working in a technical field is your ability to communicate. The most genius answer has little value if you can't properly articulate it.
ANKOS is the exact example I have in mind when I think about this. Really I find the premise that cellular automata are somehow fundamental to computation and the universe interesting (though maybe I don't buy into it to the same degree as Wolfram), but his presentation of this thesis is so dreadfully tedious and conceited as to squash any desire I might have to investigate it.
I heard that when he was writing and editing ANKOS, he rejected a picture of a panther that was to be used to illustrate reaction-diffusion textures in nature, because he didn't like the expression on its face. ;)
Perhaps the best property of ANKOS is that it is a clear demonstration of what sort of trouble you can get into when a book doesn't have an (effective) editor.
I'm certain there is a small, interesting, well written book hiding in all that verbiage somewhere, but it's well hidden.
It appears that the discussion of arbitrary precision vs. machine precision is obsolete, as it does not seem to apply to Mathematica; maybe it used to be true for SMP, though I'm not sure.
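For readers unfamiliar with the distinction being referred to, here is a minimal sketch in Python (not Mathematica; the stdlib `decimal` module stands in for an arbitrary-precision evaluation mode):

```python
# Machine precision vs. arbitrary precision, illustrated with Python's
# stdlib. A 64-bit float carries only ~15-16 significant digits, while
# Decimal lets you widen the working precision on demand.
from decimal import Decimal, getcontext

# Machine precision: the small term is below double-precision
# resolution, so the addition silently discards it.
machine = 1.0 + 1e-20
print(machine == 1.0)  # True

# Arbitrary precision: with 30 working digits, the term survives.
getcontext().prec = 30
arbitrary = Decimal(1) + Decimal("1e-20")
print(arbitrary)  # 1.00000000000000000001
```

This is only an analogy for the tradeoff the thread is discussing: machine-precision arithmetic maps directly onto fast hardware floats, while arbitrary-precision arithmetic is implemented in software and pays for its extra digits in speed.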
I don't know why I've gotten downvoted for this, but if you reread the post you will see a bunch of references to things that just haven't been true of Mathematica since v1.0.