Language, Consciousness and Intelligence?
Sunday, September 21, 2003
 
New Scientist 1997 (J. McCrone): "A replacement for the brain-as-computer model certainly seems overdue. The textbook view has been that brain cells are simple logic gates, adding and subtracting input spikes until some threshold level of charge is breached, at which point they convulse to produce a spike of their own. The all-or-nothing nature of a cell's firing promised to lift neurons clear of the usual soupy sloppiness of cellular processes, allowing the brain to carry out digitally crisp, noise-free calculations. The task for researchers was simply to discover how the output of each cell encoded a message. In a chase likened to the hunt to crack the genetic code, neuroscientists became obsessed with finding the 'neural code'. They tried to discover whether the message was contained in the strength of a spike, the average number of spikes produced each second, or in the timing of the firing, with information carried only on those spikes which were synchronised with spikes from other cells (see 'Dot dot dot, dash dash dash', New Scientist, 18 May 1996, p 40).
But the neurons have proved slippery customers. 'For 30 years we've been going along quite nicely, with lots of expensive equipment, lots of expensive people and lots of papers being produced, but finally the answers aren't there. We can't even say what it is about the spike train of an individual neuron that counts,' says Rodney Douglas of the Institute of Neuroinformatics in Zurich. Much worse for the idea of a simple, crackable neural code are the smattering of recent findings which show that the output of any individual neuron also depends on what the brain happens to be thinking at the time. It's as if rather than the spikes combining to produce conscious awareness, consciousness is able to decide how the cells should spike. "
 
 
What is the physical cost of creativity? (Michael Webb): "Computers sometimes give the illusion of being creative, but innovation is ultimately based on choices. Programmers and users make all the choices that determine how software executes. Therefore, computer creativity originates strictly in the human mind, and is actually a form of human creativity.
Everything that happens in software can be reduced to well-defined algorithms. Software is really nothing but a set of definite rules. Computer systems hold no fundamental surprises, unlike living systems."
 
Friday, September 19, 2003
 
TAI Archive | Commentaries: "Back in 1887, Alexander Graham Bell shared with the president his view that the telephone was so important that 'someday every community will have one.' In 1889, Western Union president William Orten, rejecting Bell's offer to sell his struggling telephone company to Western Union for $100,000, said, 'What use could this company make of an electrical toy?' In 1943, IBM chief Thomas Watson said, 'I think there is a world market for maybe five computers.' In 1977, Digital Equipment founder and president Kenneth Olson said, 'There is no reason for any individual to have a computer in their home.' In 1981, Microsoft founder and president Bill Gates said, '640K [of memory] ought to be enough for anyone.' And, of course, everyone remembers Charles Duell, the U.S. Commissioner of Patents, who said in 1899, 'Everything that can be invented has been invented.' "
 
 
Nathan Myhrvold @ Microsoft (ACM '97): "1) Software is a gas: it expands to fit the size of its container.
2) Software grows until it hits the memory and processing limits of the current computer technology; it brings old machines to their knees just when the new model is ready.
3) Software growth drives hardware growth. People buy new models because their old ones are bogging down.
4) Software is limited only by human ambition and expectations.
- It is often said that we are today in a software crisis. But this has been said throughout the history of computing. Many software technologies were supposed to solve it but didn't: high-level languages, object-oriented programming, component software.
- Future software technologies: genetic programming, software husbandry."
 
 
The Reality Club: THE SINGULARITY: "Herbert Simon's 1965 prediction that by 1985, 'machines will be capable of doing any work a man can do.'"
 
 
Predictions for the Third Millennium (David Krieger): "I think everything that is both presently imaginable and physically possible will be within the reach of human technology within the next century at most, if we don't annihilate ourselves."
 
 
TECHNOLOGY ZOOMS: "Birthing sentient artificial intelligences may be a hell of a lot harder than first thought. The more we find out about the brain, the slipperier the concept of “mind” gets. Even as the mapping of every synapse in the human brain becomes possible, accurate modeling of the function of even a single synapse eludes us. And the mind is not merely the product of the firing of synapses, but of some complex interaction between the brain and body, with everything from culture to the glands having a role to play."
 
Wednesday, September 17, 2003
 
The Tower of Electric Babel: "Language is a guessing game."
 
 
Wired News: AI Founder Blasts Modern Research: "'AI has been brain-dead since the 1970s,' said AI guru Marvin Minsky in a recent speech at Boston University."
 
 
Scientific American: The World in a Box -- Little Fanfare Greets The Coming Out of a Pivotal AI Project: "[2001] has come and gone with computers eliciting not invigorating repartee but muffled cursing from their users at the obtuseness of their behavior."

"It would have taken a single programmer 500 years to incorporate the almost 1.5 million facts about the everyday world that are in Cyc's database. "

hmm: I'd guess that Dennett and Hofstadter would be happy about the "intelligence" of any computer powered by Cyc; but does it address the combinatorial explosion?
 
Tuesday, September 16, 2003
 
Funding a Revolution -- Thomas Hughes (Nat. Research Council)

Artificial Intelligence

"Support for research in artificial intelligence (AI) over the past three decades has come largely from government agencies, such as the Defense Advanced Research Projects Agency (DARPA), the National Science Foundation (NSF), and the Office of Naval Research (ONR). Firms that initiated AI research programs in the 1960s eliminated or truncated them severely once they realized that commercial applications would lie many years in the future. While not attaining the original vision of creating a truly thinking machine, research in artificial intelligence has generated numerous advances in expert systems, speech recognition, and image processing. Industry is actively commercializing many of these technologies and embedding them into a range of new products. (1999)"
 
 
KurzweilAI.net (Douglas Hofstadter): "The number of sentences you'd need to store in order to be able to respond in a normal way to all possible turns that a conversation could take is more than astronomical -- it's really unimaginable. And they would have to be so intricately indexed, for retrieval... Anybody who thinks that somehow a program could be rigged up just to pull sentences out of storage like records in a jukebox, and that this program could pass the Turing Test, hasn't thought very hard about it. The funny part is that it is just this kind of unrealizable 'parrot program' that most critics of artificial intelligence cite, when they argue against the concept of the Turing Test."

indexation is trivial.

memory is exploding exponentially.

An inter-lingua-based hyper-text document (i.e., 16 bits per item, or 65k unique interlocutor-responses and 65k not-necessarily-mapping unique computer responses) at 2 turns per minute requires 20 turns for the "ten minutes" Hofstadter specifies. But let's be unambitious and cover 5 minutes (i.e., 10 turns).
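The "jukebox" Hofstadter dismisses is, structurally, just a hash table keyed by the conversation so far; the dispute is only over its size, which the figures below estimate. A toy sketch, assuming the 16-bit response-ID scheme described above (all IDs and their glosses are invented examples):

```python
# Toy "parrot program": canned replies looked up by conversation prefix.
# Each interlocutor turn is a 16-bit ID (65,536 possible utterances);
# the table entries here are hypothetical illustrations.

table = {
    (0x0001,): 0x00A0,          # "Hello" -> "Hi there"
    (0x0001, 0x0002): 0x00A1,   # "Hello", "How are you?" -> "Fine, thanks"
}

def respond(history):
    """Return the canned 16-bit response ID for this conversation prefix,
    or None if the prefix was never stored."""
    return table.get(tuple(history))

print(hex(respond([0x0001, 0x0002])))  # 0xa1
```

The indexing really is trivial in this scheme -- a single hash lookup per turn -- which is the point of the "indexation is trivial" remark above; only the table's memory footprint is at issue.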

If we "keep" only the 100 most probable interlocutor-responses at each turn, this requires 100^10 ≈ 1e20 conversation paths, or about 2e20 B of memory at 2 bytes each.

Given that "petabytes are coming", and that in ten years, 1e15 B will cost $2k (according to Jim Gray), sufficient memory will only cost $400m... and that is only in ten years!
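The back-of-envelope arithmetic above can be checked directly (all figures are from the text; the $2k-per-petabyte price is Jim Gray's ten-year projection):

```python
# Check of the memory and cost estimates in the text.
branching = 100        # most-probable interlocutor responses kept per turn
turns = 10             # 5 minutes at 2 turns per minute
bytes_per_item = 2     # 16-bit response IDs

paths = branching ** turns                # 100^10 = 1e20 conversation paths
memory_bytes = paths * bytes_per_item     # ~2e20 B

usd_per_petabyte = 2_000                  # Jim Gray's projection for 1e15 B
cost = memory_bytes / 1e15 * usd_per_petabyte

print(f"{memory_bytes:.0e} B, ${cost:,.0f}")  # 2e+20 B, $400,000,000
```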

"unimaginable"? We'll forgive DH for his lack of imagination given the date his words were published (1984)...
 
Language and Consciousness -- beyond Artificial Intelligence
  • Alan Turing's paper
  • David Chalmers' site
  • Ray Kurzweil's site
  • Daniel Dennett's site
  • John Searle's paper
  • Michael Webb's site
  • John McCrone's site