Language, Consciousness and Intelligence?
Saturday, August 30, 2003
 
KurzweilAI.net

William Dembski: "If predictability is materialism’s main virtue, then hollowness is its main fault. Humans have aspirations. We long for freedom, immortality, and the beatific vision. We are restless until we find our rest in God. The problem for the materialist, however, is that these aspirations cannot be redeemed in the coin of matter. Our aspirations are, after all, spiritual (etymology confirms this point—“aspiration” and “spiritual” are cognates). We need to transcend ourselves to find ourselves. "

Marvin Minsky, 2002: "Why have we made limited progress in AI? Because we haven't developed sophisticated models of thinking, we need better programming languages and architectures, and we haven't focused on common sense problems that every normal child can solve."
 
Friday, August 29, 2003
 
The Mind Wins (Thomas Nagel's review of Searle's "The Rediscovery of the Mind"):

"According to a widely held view, the brain is a giant computer and the relation of the human mind to the human brain is like that of a computer program to the electronic hardware on which it runs. The philosopher John Searle, a dragon-slayer by temperament, has set out to show that this claim, together with the materialist tradition underlying it, is nonsense, for reasons some of which are obvious and some more subtle. "

"computers which do not have minds can be described as running programs, processing information, manipulating symbols, answering questions, and so on only because they are so constructed that people, who do have minds, can interpret their physical operations in those ways. To ascribe a computer program to the brain implies a mind that can interpret what the brain does; so the idea of explaining the mind in terms of such a program is incoherent."
 
 
Dennett: Are we explaining consciousness yet?: "theorists must resist the temptation to see global accessibility as the cause of consciousness (as if consciousness were some other, further condition); rather, it is consciousness."

On what basis can Prof. Dennett make such a preposterous claim?

Perhaps he would prefer to re-define consciousness this way, so that he side-steps all the hard problems associated with it?
 
Thursday, August 28, 2003
 
(Chalmers) Facing Up to the Problem of Consciousness: "It is common to see a paper on consciousness begin with an invocation of the mystery of consciousness, noting the strange intangibility and ineffability of subjectivity, and worrying that so far we have no theory of the phenomenon. Here, the topic is clearly the hard problem - the problem of experience. In the second half of the paper, the tone becomes more optimistic, and the author's own theory of consciousness is outlined. Upon examination, this theory turns out to be a theory of one of the more straightforward phenomena - of reportability, of introspective access, or whatever. At the close, the author declares that consciousness has turned out to be tractable after all, but the reader is left feeling like the victim of a bait-and-switch. The hard problem remains untouched."

Chalmers' paper is excellent and readable. He correctly observes that some researchers deny the problem of experience altogether!

Chalmers quotes Koch ["What is Consciousness", Discover, November 1992]: "there seems to be a huge jump between the materialistic level, of explaining molecules and neurons, and the subjective level."
 
 
During an analysis of Canadian French digit recognition errors in noise, I was puzzled over the misrecognition of "six" as "deux". To my second-language mind, this made no sense. So I recorded all the instances of "six" that the recognizer classified as "deux". The first one I listened to was (to my mind at least) a clearly-spoken "deux". Oops, I thought: a transcription error. But when I listened to the entire utterance, it was clearly a "six". Digging deeper, it became clear that the "six"-ness of the word was determined by its acoustic context. For the entertainment of my native-speaking colleagues, I sat them all in a room and played the recording: first exactly as the recognizer heard it (a clear "deux"), then with the initial boundary relaxed in small steps. Slowly, it "became" a clear "six".
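Something like the following Python sketch reproduces that listening demo. It is not the tooling I actually used; the file name, the word boundaries, and the 50 ms step are made up for illustration.

```python
# Write out a series of clips of the same utterance, each starting a little
# earlier than the last, so the isolated "deux"-like segment gradually
# "becomes" a clear "six" as its left acoustic context is restored.
import wave

def write_clip(src_path, dst_path, start_s, end_s):
    """Copy the [start_s, end_s] span of a WAV file to a new file."""
    with wave.open(src_path, "rb") as src:
        params = src.getparams()
        rate = src.getframerate()
        start = max(0, int(start_s * rate))
        end = int(end_s * rate)
        src.setpos(start)
        frames = src.readframes(end - start)
    with wave.open(dst_path, "wb") as dst:
        dst.setparams(params)
        dst.writeframes(frames)

# Hypothetical word boundaries (seconds) from the recognizer's alignment.
utterance = "six_as_deux.wav"
word_start, word_end = 1.20, 1.55

# clip_00 is what the recognizer "heard"; later clips include progressively
# more of the preceding audio, in 50 ms steps.
for i in range(6):
    write_clip(utterance, f"clip_{i:02d}.wav",
               start_s=word_start - 0.05 * i, end_s=word_end)
```

Played in order, the clips walk the listener from the recognizer's context-free segment to the word as it is actually heard in context.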
 
 
Traveling down the highway in the old station wagon, I caught myself making phonetic alterations to my speech for the sake of robustness. My son asked me a question from the "way-back", and the answer was "three". Due to the high level of ambient noise, however, I did not say "three" exactly. Rather, I said "shree". It was an altogether subconscious decision, but it was the correct one under the circumstances: the phone "th" would have been masked by the road noise. The fricative "sh", on the other hand, had sufficient high-frequency energy to carry to the back of the wagon.
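The reasoning can be sketched numerically: compare a weak, diffuse "th" frication and a strong "sh" frication band against low-frequency road rumble. Everything below is a synthetic stand-in with assumed levels and frequency bands, not a measurement from the car.

```python
import numpy as np

rate = 16000          # assumed sample rate (Hz)
n = rate              # one second of each stand-in sound
rng = np.random.default_rng(0)

def band_pass(x, lo_hz, hi_hz):
    """Crude FFT-domain band-pass filter."""
    spec = np.fft.rfft(x)
    freqs = np.fft.rfftfreq(len(x), 1.0 / rate)
    spec[(freqs < lo_hz) | (freqs > hi_hz)] = 0.0
    return np.fft.irfft(spec, len(x))

# Synthetic stand-ins for real recordings (levels and bands are assumptions):
road_noise = 1.0 * band_pass(rng.standard_normal(n), 30, 1000)       # low-frequency rumble
th_frication = 0.1 * band_pass(rng.standard_normal(n), 1000, 8000)   # weak, diffuse "th"
sh_frication = 0.5 * band_pass(rng.standard_normal(n), 2000, 7000)   # strong "sh" band

def snr_db(signal, noise):
    """Overall signal-to-noise ratio in decibels."""
    return 10 * np.log10(np.mean(signal ** 2) / np.mean(noise ** 2))

print('SNR of "th" over road noise: %.1f dB' % snr_db(th_frication, road_noise))
print('SNR of "sh" over road noise: %.1f dB' % snr_db(sh_frication, road_noise))
# With these assumed levels, "sh" comes out above the rumble while "th" is buried in it.
```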
 
 
Visiting Munich, I stopped at a light and waited to cross the road. An elderly gentleman graciously engaged me in light conversation. Unfortunately, I know only three or four words of German: "danke", "eins", "bier". And so the dear man's attempts at social grace were entirely lost on me. I was nonplussed, but my initial confusion was nowhere near the confusion that I felt after the light turned and I was half-way across the road. You see, by that time, whatever internal language processing there is in my brain had deciphered the message. The gentleman had said, "aren't we having a delightful spring this year?" and I knew that that is what he had said. I turned back to him only to realize that I had no hope of communicating that epiphany, since I had no way to generate German. My experience, if legitimate (and I am sure that it was, though I could not stop someone on the street to verify), argues for non-statistical human language processing.
 
Wednesday, August 27, 2003
 
How the Mind Works by Steven Pinker, critically reviewed by Colin McGinn:

"Evolutionary psychology, in particular, though offering a fresh perspective on human behavior, should not aspire to the condition of dogma at this early stage of inquiry."

"Pinker has given us a fine survey of the state of the art in the science of mind, but he should calm down. There is much that remains as baffling as ever."
 
 
The Discovery of Spoken Language by Peter Jusczyk: "an extensive review of research on the acquisition of language during the first year of life"
 
 
(Peter King) Dualism: An Empirical Test?: "Happily, the philosophical fashion that, for example, encouraged the sneering use of 'Cartesian' as an insult, often by those who have hardly taken the trouble to read or think about Descartes, shows some sign of passing. The sooner the better."
 
 
Blay Whitby 1997: "the mistaken view that Turing's paper contains an adequate operational definition of intelligence."

"What we should conclude currently about this sort of AI work is that it represents research into the mechanisms of producing certain sorts of illusion in human beings rather than anything to do with intelligence, artificial or otherwise."

"the general misreadings of Turing's 1950 paper have led to the currency of three specific mistaken assertions, namely:

1) Intelligence in computing machinery is (or is nearly, or includes) being able to deceive a human interlocutor.

2) The best approach to the problem of defining intelligence is through some sort of operational test, of which the 'imitation game' is a paradigm example.

3) Work specifically directed at producing a machine that could perform well in the 'imitation game' is genuine (or perhaps even useful) AI research. "

 
Tuesday, August 26, 2003
 
FT June/July 2003: Hume, Austen, and First Impressions:

Rodney Delasanta quotes Samuel Johnson: "Truth will not afford sufficient food to their vanity, so they have betaken themselves to error: Truth, Sir, is a cow which will yield such people no more milk, and so they are gone to milk the bull."

"Austen stands as a splendid corrective to the assumptions of radical empiricism."

Johnson and Austen -- my heroes.
 
 
Hegel: Natural Religion:

In a Survey of Hegel's Natural Religion, Eric Steinhart mentions (among other things):

"The artificer wants to make something in his or her own image, something that is living and self-conscious just like itself."

"Computers are becoming increasingly intelligent; in them, we see our own intelligence. The computer is a mirror able to reflect the spiritual aspect of human life, for it is a mirror in which we see our own minds."

"computers at least today are made of silicon chips: rocks like the stone at Mecca. Perhaps this is just a coincidence, or perhaps Hegel was really onto something."
 
Monday, August 25, 2003
 
John Parrington: COMPUTERS AND CONSCIOUSNESS - A REPLY TO ALEX CALLINICOS (review of Dennett): Parrington's reasoned criticism of the materialist position involves some irony: he is a Marxist.

The article is an excellent one. Among other notable points:

Dennett refers to consciousness as "the programme that runs on your brain's computer"

"Steven Rose explains:
The brain/computer metaphor fails because the neuronal systems that comprise the brain, unlike a computer, are radically indeterminate... Unlike computers, brains are not error-free machines and they do not work in a linear mode or even a mode simply reducible to a small number of hidden layers. Central nervous system neurons each have many thousands of inputs (synapses) of varying weights and origins... The brain shows a great deal of plasticity, that is, capacity to modify its structures, chemistry, physiology and output in response to contingencies of development and experience, yet also manifests redundancy and an extraordinary resilience of functionally appropriate output despite injury and insult."

"But surely this is just confusing the intentionality of the machine with that of its maker." John Parrington cites Derek Bickerton here. "computer assisted Pygmalianism" Pygmalion was the legendary Greek sculptor who carved a statue so beautiful that he immediately fell in love with it. To be able to do this he presumably had to believe that he and the statue were organisms of a similar kind.

"Cognitive psychology has sold itself, and the rest of psychology, short. It has failed to provide the alternative which was so badly needed after the dominance of behaviourism. The cognitive obsession with the computer metaphor meant that it fell into the same trap for which it criticised the behaviourists. By defining the human being using a metaphor, and then taking the metaphor as if that were the only possible reality, it rendered itself unable to respond to the real issues and challenges which were being thrown up." Parrington quotes Nick Hayes (Psychology in Perspective)
 
 
(Hacker) The Kurzweil Hypothesis: "[Steven] Rose [in the Making of Memory, 1992] argues that we have done a great deal of damage by confusing metaphors for realities and that brain-as-computer is simply a current metaphor. Each era has its defining technologies which are used as metaphors, perhaps incorrectly. In the 1800s, we have seen that communication was defined as transportation. In the first half of the 20th century, it was defined as electrical signal transmission. The ancient Greeks saw memory as inscriptions of wax tablets. In the Middle Ages, memory involved pipes and valves in certain scientific descriptions. For Newton, Galileo, and Descartes, the world was made of mechanisms and the main metaphor was that of the clock complete with gears, cogs, etc. This was extended into biology as mind and body were viewed as separate mechanisms. When Galvani discovered that nerves react to electricity, scientists began to talk about brain as a telegraphic signaling system or type of telephone exchange. Today's defining technology for the brain is the computer."
 
 
Computers Mimic The Human Mind quotes Dennett in Consciousness Explained: "Artificial intelligence is progressing, creating smart machines that process data somewhat the way human beings do. As the trend continues it will become clearer that we're all machines, that Ryle's strict materialism was basically on target, that the mind-body problem is in principle solved"
 
 
Weak intelligence: "Because our common sense tells us that the strong-AI viewpoint is wrong." - Gams
 
 
John Searle's Minds, Brains, and Programs: "no program by itself is sufficient for thinking."
 
 
Do Brains Make Minds? Experts debate on Closer To Truth: John Searle says, "I think that there's no PC that's ever going to replace [my dog]. The reason is very simple: I know that [my dog] is conscious and I know that a computer is not. And this conclusion has nothing to do with computing power. You can expand the power all you want, hooking up as many computers as you think you need, all in parallel, and they still won't be conscious, because all they'll ever do is shuffle symbols. Computers don't have the causal powers of brains. So, no; no computer as currently defined is going to replace my dog, because computers aren't conscious."

But he also says that brains are computers. So he is implicitly going for a consciousness-requires-more-than-a-brain position.
 
Language and Consciousness -- beyond Artificial Intelligence
  • Alan Turing's paper
  • David Chalmers' site
  • Ray Kurzweil's site
  • Daniel Dennett's site
  • John Searle's paper
  • Michael Webb's site
  • John McCrone's site