Lisp and Prolog for Artificial Intelligence?

user475353 · Mar 7, 2011 · Viewed 38.6k times

Now, since I took a class in A.I. 3 years ago, I'm clearly proficient enough to ask this question... just kidding, just kidding ;)

But seriously, what is it about these languages that makes them so popular for A.I. research? Even though A.I. research is "old", it seems to have come the longest way in the past 5-10 years. Is it because the languages were somewhat "designed" around the concept of A.I., or just that we have nothing really better to use right now?

I ask this because I've always found it quite interesting, and I'm just kind of curious. If I'm entirely wrong and they use different languages, I would love to know what they use instead. I mean, I can understand Prolog, especially with Sentential/Propositional Logic and Fuzzy Logic, but I don't understand why we would use Lisp, or what else A.I. researchers would use to do machine learning etc.

Any articles/books on the subject matter are helpful too :)

Answer

Fred Foo · Mar 8, 2011

The question has already been answered for Lisp, so I'll just comment on Prolog.

Prolog was designed for two things: natural language processing and logical reasoning. In the GOFAI paradigm of the early 1970s, when Prolog was invented, this meant:

  1. writing symbolic grammars for natural language that would be used to build logical representations of sentences/utterances;
  2. using these representations and logical axioms (not necessarily those of classical logic) to infer new facts;
  3. using similar grammars to translate logical representations back into language (a toy sketch of this pipeline follows the list).
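
As a rough illustration of all three steps (a hypothetical toy grammar I'm making up here, not from any real system), a Prolog DCG can parse a sentence into a logical term, feed the result to an axiom, and run the very same grammar backwards to generate text:

```prolog
% Step 1: a toy DCG that parses a tiny English fragment
% into a logical term.
sentence(Sem)    --> np(X), vp(X, Sem).
np(socrates)     --> [socrates].
vp(X, man(X))    --> [is, a, man].
vp(X, mortal(X)) --> [is, mortal].

% Step 2: an inference axiom, separate from the grammar.
mortal(X) :- man(X).

% Parsing a sentence into a logical representation:
% ?- phrase(sentence(Sem), [socrates, is, a, man]).
% Sem = man(socrates).
%
% Asserting the parsed fact lets the axiom fire:
% ?- assertz(man(socrates)), mortal(socrates).
% true.
%
% Step 3: the same grammar, run in reverse, generates
% language from an inferred term:
% ?- phrase(sentence(mortal(socrates)), Words).
% Words = [socrates, is, mortal].
```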

Prolog is very good at this and is used in the ISS for exactly such a task. The approach got discredited, though, because:

  1. "all grammars leak": no grammar can catch all the rules and exceptions in a language;
  2. the more detailed the grammar, the higher the complexity (both big O and practical) of parsing;
  3. logical reasoning is both inadequate and unnecessary for many practical tasks;
  4. statistical approaches to NLP, i.e. "word counting", have proven much more robust. With the rise of the Internet, adequate datasets became available to get the statistics NLP developers need. At the same time, memory and disk costs have declined while processing power is still relatively expensive.

Only recently have NLP researchers developed somewhat practical combined symbolic-statistical approaches, sometimes using Prolog. The rest of the world uses Java, C++ or Python, for which you can more easily find libraries, tools and non-PhD programmers. The fact that I/O and arithmetic are unwieldy in Prolog doesn't help its acceptance.
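
To make that last point concrete: in Prolog, `=` only unifies terms and never evaluates them, so arithmetic needs the dedicated `is/2` operator, and output goes through side-effecting predicates like `format/2`:

```prolog
% `=` unifies: X becomes the unevaluated term 1+2.
% ?- X = 1 + 2.
% X = 1+2.

% Evaluation requires is/2.
% ?- X is 1 + 2.
% X = 3.

% I/O is similarly side-effecting and verbose.
% ?- format("sum = ~w~n", [3]).
% sum = 3
```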

Prolog is now mostly confined to domain-specific applications involving NLP and constraint reasoning, where it does seem to fare quite well. Still, few software companies will advertise being "built on Prolog technology", since the language got a bad name for not living up to the promise of "making AI easy."
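
As an illustration of the constraint-reasoning side (a standard textbook example using SWI-Prolog's library(clpfd), not taken from any particular product), the classic SEND + MORE = MONEY puzzle is only a few lines:

```prolog
:- use_module(library(clpfd)).

% Assign distinct digits to the letters so that SEND + MORE = MONEY.
send_more_money([S,E,N,D,M,O,R,Y]) :-
    Vars = [S,E,N,D,M,O,R,Y],
    Vars ins 0..9,
    all_different(Vars),
    S #\= 0, M #\= 0,                 % leading digits are nonzero
              1000*S + 100*E + 10*N + D
            + 1000*M + 100*O + 10*R + E
    #= 10000*M + 1000*O + 100*N + 10*E + Y,
    label(Vars).

% ?- send_more_money([S,E,N,D,M,O,R,Y]).
% S = 9, E = 5, N = 6, D = 7, M = 1, O = 0, R = 8, Y = 2.
```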

(I'd like to add that I'm a great fan of Prolog, but even I only use it for prototyping.)