John McCarthy, one of the most influential pioneers of computer science, has died. His main claim to fame was the invention (some would say discovery) of the Lisp programming language. It is hard to overstate the impact that Lisp has had on the programming world and on me personally. I discovered Lisp while I was still in high school (via P-Lisp running on an Apple ][ ) and it is one of the things that made me decide to pursue a career in artificial intelligence (a term that McCarthy coined). Even today it is with no small amount of regret that I note the ironic juxtaposition of two facts: 1) Lisp is one of the most influential inventions/discoveries in the history of mankind's intellectual progress, and 2) it is hardly ever used by anyone anymore. Lisp is the Latin of computer science. Parts of its essence live on in Python and Ruby and Javascript and Haskell and pretty much every other programming language in widespread use today (except C and C++). But Lisp itself is a mostly dead language [See update below]. I wish it were otherwise. The world would be a better place.
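For readers who have never seen Lisp, here is a small sketch (in Common Lisp, examples mine) of a few of the ideas it pioneered that those languages later absorbed: anonymous functions, higher-order functions, closures, and code-as-data.

    ;; Anonymous functions and mapping over a list: the ancestors of
    ;; Python's lambda/map and Javascript's arrow functions.
    (mapcar (lambda (x) (* x x)) '(1 2 3 4))   ; => (1 4 9 16)

    ;; Closures: a function that captures its lexical environment,
    ;; the idea behind callbacks and decorators in later languages.
    (defun make-adder (n)
      (lambda (x) (+ x n)))
    (funcall (make-adder 3) 4)                 ; => 7

    ;; Code as data: a Lisp program is itself a Lisp list, which can
    ;; be constructed and evaluated at runtime.
    (eval (list '+ 1 2))                       ; => 3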
But McCarthy's legacy has a little-noted dark side, one that also influenced my career, though in a much less positive way. McCarthy was not only one of the pioneers of the study of AI, but also an avid proponent of a particular school of thought about how human intelligence works. McCarthy believed that human intelligence could be modeled as a formal logic. That hypothesis turns out to be (almost certainly) wrong, and the evidence that it is wrong was overwhelming even in McCarthy's heyday. And yet McCarthy steadfastly refused to abandon it. Well into his nominal retirement, and quite possibly to his dying day, he was still trying to formulate formal logics to model human thought processes.
The way human mental processes actually work, it turns out, is (again, almost certainly) according to statistical processes, not formal logics. The reason I keep hedging with "almost certainly" is that the jury is still out. We have not yet cracked the AI puzzle, but vastly more progress has been made in recent years using statistical approaches than has ever been made using logic. Very few (if indeed any) logic-based systems have ever been successfully deployed on non-toy problems. Statistics-based applications are being deployed on a regular basis nowadays, with Siri being the most recent example.
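To illustrate the contrast with a toy example of my own devising (nothing remotely resembling Siri's internals): a logic-based spam rule is a brittle true-or-false test, while a statistical one weighs evidence.

    ;; Logic-based: a hand-written rule.  It either fires or it doesn't.
    (defun logic-spam-p (words)
      (member "viagra" words :test #'string-equal))

    ;; Statistics-based: multiply together per-word odds and compare the
    ;; result to a threshold.  (The numbers here are invented for
    ;; illustration; a real filter would estimate them from data.)
    (defparameter *spam-odds*
      '(("viagra" . 9.0) ("free" . 3.0) ("meeting" . 0.2)))

    (defun stats-spam-p (words &key (threshold 1.0))
      (> (reduce #'*
                 words
                 :key (lambda (w)
                        (or (cdr (assoc w *spam-odds* :test #'string-equal))
                            1.0))   ; unknown words are neutral evidence
                 :initial-value 1.0)
         threshold))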
It took decades to make this switch, arguably due in no small measure to McCarthy's influence. One of the many consequences of this delay was the infamous AI winter, which led more or less directly to the commercial demise of Lisp. That the same person was responsible both for the invention of such a powerful idea and for its demise has to be one of the greatest ironies in human intellectual history.
It is important to remember that even great men can be wrong at times, sometimes spectacularly so. There is no shame in this. The human has yet to be born whose rightful epitaph is "He was right about everything." But John McCarthy's legacy in particular calls all of us mere mortals to a greater degree of humility. The world would be a better place if more people could acknowledge the possibility that even their most cherished beliefs might be wrong.
[UPDATE:] After posting this I felt the need to hedge my assessment of Lisp as "mostly dead." Lisp is not dead. In fact, it is probably more vibrant now than at any time in the last 20 years. But by comparison to other languages Lisp has a vanishingly small mindshare. To cite but one concrete example, of 300 or so Y Combinator companies there is (AFAIK) only one whose code is written in Lisp.
Notwithstanding all that, if you're interested in programming I really encourage you to learn Lisp. It is still the best programming language out there.
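If you want a quick taste of what sets it apart, here is one illustrative example of my own: because Lisp programs are ordinary Lisp lists, you can write code that writes code, and thereby extend the language itself. (The macro name unless* is mine; Common Lisp already has unless built in.)

    ;; A macro that adds a new control construct to the language.
    ;; Few other languages let an ordinary programmer do this.
    (defmacro unless* (test &body body)
      `(if (not ,test)
           (progn ,@body)))

    (unless* (> 1 2)
      (print "one is not greater than two"))   ; prints, since (> 1 2) is false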
Interesting that you would suggest that McCarthy's influence delayed the impact of statistical techniques. Given the close relationship between statistical techniques and neural network approaches, shouldn't Minsky and Papert get the "credit"? Personally, I'm inclined to give props to the hardware folks who finally made enough memory and raw CPU available to enable the number-crunching that underlies statistical approaches, rather than suggesting some negative influence from folks pursuing what seemed to be promising research areas.
Historical credit assignment is a very difficult problem, and it's impossible to know what the world would be like if McCarthy had been less devoted to logic. And I certainly do not mean to suggest that McCarthy's devotion was in any way malicious or even negligent. But these are the facts: 1) McCarthy was devoted to logic, 2) McCarthy was enormously influential, 3) logic has to date failed to solve the AI problem, 4) AI winter happened as a result of AI failing to make enough progress to satisfy DARPA, and 5) AI winter had a major negative impact on the use of Lisp. So it is plausible that McCarthy (inadvertently) helped to undermine his own creation. But to be sure, McCarthy did not create AI winter singlehandedly. He had lots and lots of help from many quarters. But he was not a bit player either.
BTW:
> ... promising research areas
I think it's highly debatable whether logic was still "a promising research area" in 1974, eighteen years after the Dartmouth conference, when DARPA's cutbacks to AI research began. It is even more debatable whether it was promising thirteen years later, in 1987, when the Lisp machine market collapsed.