Language, Consciousness and Intelligence?
Saturday, October 25, 2008
  Blog - Evangelical Philosophical Society
Blog - Evangelical Philosophical Society: "The simple truth is that in both science and philosophy, strict physicalist analyses of consciousness and the self have been breaking down since the mid-1980s. The problems with physicalism have nothing directly to do with theism; they follow from rigorous treatments of consciousness and the self as we know them to be. The real problem comes in trying to explain its origin, and for this problem naturalism in general and Darwinism in particular are useless. In my view, the only two serious contenders are theism and panpsychism, which, contrary to the musings of some, has throughout the history of philosophy been correctly taken as a rival to and not a specification of naturalism."
 
Sunday, May 07, 2006
 
http://www.iro.umontreal.ca/~nie/IFT6255/lafferty-zhai.pdf
"The development of a well-motivated framework for semantic smoothing is one of the important unresolved problems in the language modeling approach [to IR]."
 
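For context on what "smoothing" means here: in the query-likelihood language modeling approach, a document's term distribution is interpolated with the collection's so that unseen query terms do not zero out the score. Below is a minimal, purely illustrative Python sketch of plain Jelinek-Mercer smoothing, the statistical baseline that "semantic smoothing" aims to improve on; the function names and toy data are hypothetical, not taken from the Lafferty-Zhai paper.

```python
# Illustrative sketch: query-likelihood retrieval with Jelinek-Mercer smoothing.
# "Semantic smoothing" (the open problem in the quote) would go further, sharing
# probability mass across semantically related terms rather than only backing
# off to the whole collection.
from collections import Counter
import math

def score(query_terms, doc_terms, collection_counts, collection_size, lam=0.5):
    """log P(query | doc) with P(w|d) = lam*P_ml(w|d) + (1-lam)*P(w|collection)."""
    doc_counts = Counter(doc_terms)
    doc_len = len(doc_terms)
    log_prob = 0.0
    for w in query_terms:
        p_doc = doc_counts[w] / doc_len if doc_len else 0.0
        p_coll = collection_counts.get(w, 0) / collection_size
        p = lam * p_doc + (1 - lam) * p_coll
        if p > 0.0:  # a term unseen everywhere contributes nothing
            log_prob += math.log(p)
    return log_prob

# Toy usage with made-up documents and query:
docs = {"d1": "language models smooth term estimates".split(),
        "d2": "semantic smoothing remains an open problem".split()}
coll = Counter(w for d in docs.values() for w in d)
coll_size = sum(coll.values())
query = "semantic smoothing".split()
print(sorted(docs, key=lambda d: score(query, docs[d], coll, coll_size), reverse=True))
```

Dirichlet-prior smoothing is the other common baseline; the unresolved problem the quote points to is replacing this purely statistical back-off with semantically informed estimates.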
Tuesday, October 21, 2003
 
No Silver Bullet: Essence and Accidents of Software Engineering: "... the essence of the design without having to express large amounts of syntactic material that add no information content."
 
Monday, October 20, 2003
 
Why You Don't Need Proprietary Bot Software (A.L.I.C.E. AI Foundation): "Perhaps the most spectacular failure has been the CYC project, initiated by Doug Lenat in 1984 and beneficiary of millions of dollars of government research money and private and institutional investment. CYC (now housed under a company called Cycorp) was and is a project with the aim of building a giant knowledge base full of 'common sense', with the idea that this would someday enable machine understanding of texts. Lenat has been telling journalists (and presumably his investors) for years that CYC is mere months away from being able to understand simple texts like TIME magazine. So far quite a lot of money has been spent, and the giant knowledge base continues to grow, but despite its intricacy (some might say beauty), its commercial use is still beyond reach, and its theoretical base lags behind the academic research front, itself still light-years away from success.
Know this: the commercial bot companies may borrow bits and pieces from conventional NLP, but by and large they are every bit as ELIZA as ELIZA. As ELIZA relies on pattern-matching and simple string manipulation, so too do all 'proprietary' offerings at their core. "
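To make "pattern-matching and simple string manipulation" concrete, here is a minimal ELIZA-style responder in Python. The rules are invented for illustration; real ELIZA scripts and ALICE's AIML categories are vastly larger, but the mechanism (match a pattern, slot the user's words into a canned template) is the same.

```python
# Toy ELIZA-style responder: regex patterns mapped to reply templates.
# The rules below are made up; they only illustrate the mechanism.
import re

RULES = [
    (re.compile(r"\bI am (.+)", re.I),    "Why do you say you are {0}?"),
    (re.compile(r"\bI feel (.+)", re.I),  "How long have you felt {0}?"),
    (re.compile(r"\bbecause (.+)", re.I), "Is that the real reason?"),
]
DEFAULT = "Tell me more."

def respond(text):
    for pattern, template in RULES:
        match = pattern.search(text)
        if match:
            # Simple string manipulation: reflect the matched words back.
            return template.format(*match.groups())
    return DEFAULT

print(respond("I am tired of chatbots"))  # Why do you say you are tired of chatbots?
print(respond("The weather is nice"))     # Tell me more.
```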
 
Friday, October 17, 2003
 
Salon.com Technology | Artificial stupidity, Part 2: "If Wallace is right, the first 'intelligent' machine according to Turing's criterion will indeed be as dumb as a bag of hammers. It will win the prize without ever learning to parse pronouns or deal creatively with enthymemes."
 
 
Salon.com Technology | Artificial stupidity, Part 2: "Wallace's theory of A.I. is no theory at all. It's not that he doesn't believe in artificial intelligence, per se; rather, he doesn't much believe in intelligence, period. In a way that oddly befits a contest sponsored by a bunch of Skinnerians, Wallace's ALICE program is based strictly on a stimulus-response model. You type something in; if the program recognizes what you typed, it picks a clever, appropriate, 'canned' answer.
There is no representation of knowledge, no common-sense reasoning, no inference engine to mimic human thought. Just a very long list of canned answers, from which it picks the best option. Basically, it's Eliza on steroids.
Conversations with ALICE are 'stateless'; that is, the program doesn't remember what you say from one conversational exchange to the next. Basically it's not listening to a word you say, it's not learning a thing about you, and it has no idea what any of its own utterances mean. It's merely a machine designed to formulate answers that will keep you talking. And this strategy works, Wallace says, because that's what people are: mindless robots who don't listen to each other but merely regurgitate canned answers. "
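The "stateless" design described above can be sketched in a few lines: each input is normalized and looked up in a fixed table of canned replies, and nothing is carried over from one turn to the next. The table here is invented for illustration; ALICE's actual knowledge base is tens of thousands of AIML categories, but the control flow (one stimulus in, one canned response out) is the same.

```python
# Stateless stimulus-response sketch: the reply depends only on the current input.
# The canned entries below are invented for illustration.
CANNED = {
    "HELLO": "Hi there. What would you like to talk about?",
    "WHAT IS YOUR NAME": "Names are not important. Tell me about yourself.",
    "DO YOU REMEMBER ME": "Of course.",  # it does not: there is no user model
}
DEFAULT = "That is interesting. Go on."

def reply(user_input):
    # No state is read or written: normalize the stimulus, look up a response.
    key = "".join(c for c in user_input.upper() if c.isalpha() or c == " ").strip()
    return CANNED.get(key, DEFAULT)

# The same stimulus always gets the same response, whatever was said before.
print(reply("Do you remember me?"))
print(reply("My name is Ada."))
print(reply("Do you remember me?"))
```

A real AIML engine adds wildcard matching and template substitution on top of this lookup, but, as the article notes, there is no memory of the conversation and no model of the user behind it.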
 
 
Salon.com Technology | Artificial stupidity, Part 2: "In the professional and academic circles the term Artificial Intelligence is passé. It is considered to be technically incorrect relative to the present day technology and the term has also picked up a strong Sci-Fi connotation. The new and improved term is Intelligent Systems. Under this general term there are two distinct categories: Decision Sciences (DS) and the human mimicry side called Mimetics Sciences (MS)"
 
 
Salon.com Technology | Artificial stupidity: "But the closer one looks at the history of the Loebner Prize, the more it appears that Loebner's real offense was showing up the biggest stars in 'real' artificial intelligence as a bunch of phonies. Thirty years ago, Minsky and other A.I. researchers were declaring that the problem of artificial intelligence would be solved in less than a decade. But they were wrong, and every year the failure of computer programs to get anywhere close to winning the Loebner Prize underlines just how spectacularly off the mark they were. "
 
Tuesday, October 14, 2003
 
Artificial intelligence - Wikipedia: "However, it has become clear that contemporary methods using both broad approaches have severe limitations."
 
Friday, October 10, 2003
 
Cognitive views of consciousness: "When we try to understand conscious experience we aim to explain the differences between these two conditions: between the events in your nervous system that you can report, act upon, distinguish, and acknowledge as your own, and a great multitude of sophisticated and intelligent processes which are unconscious, and do not allow these operations."

Perhaps not the best quote, but an interesting read.
 
Language and Consciousness -- beyond Artificial Intelligence
  • Alan Turing's paper
  • David Chalmers' site
  • Ray Kurzweil's site
  • Daniel Dennett's site
  • John Searle's paper
  • Michael Webb's site
  • John McCrone's site

