Dan Dennett/Edge magazine photo
Key takeaway? He junks a fair amount of what he's said in the past about the details of how the mind/brain is like a computer,
BUT!
Still holds fast to the analogy that it's ... like a computer! Even though he admits that the details of that comparison are weak to nonexistent.
Let's look at some comments:
I guess he simply can't admit it's a crappy analogy that just doesn't work, and walk away from it.

We're beginning to come to grips with the idea that your brain is not this well-organized hierarchical control system where everything is in order, a very dramatic vision of bureaucracy. In fact, it's much more like anarchy with some elements of democracy. ...
The vision of the brain as a computer, which I still champion, is changing so fast. The brain's a computer, but it's so different from any computer that you're used to. It's not like your desktop or your laptop at all, and it's not like your iPhone except in some ways. ...
Control is the real key, and you begin to realize that control in brains is very different from control in computers. Control in your commercial computer is very much a carefully designed top-down thing.

I mean, with all that, there's no need to hang on to his analogy.
Because it was a crappy analogy a decade ago, and it's in tatters now.
How do they thread the needle so that they don't offend the sophisticates in their congregation by insisting on the literal truth of the book of Genesis, let's say, while still not scaring, betraying, pulling the rug out from under the more naïve and literal-minded of their parishioners? There's no good solution to that problem as far as we can see, since they have this unspoken rule that they should not upset, undo, subvert the faith of anybody in the church.

This means that there's a sort of enforced hypocrisy where the pastors speak from the pulpit quite literally, and if you weren't listening very carefully, you’d think: oh my gosh, this person really believes all this stuff. But they're putting in just enough hints for the sophisticates in the congregation so that the sophisticates are supposed to understand: Oh, no. This is all just symbolic. This is all just metaphorical. And that's the way they want it, but of course, they could never admit it. You couldn't put a little neon sign up over the pulpit that says, "Just metaphor, folks, just metaphor." It would destroy the whole thing.
Go read it.
Hello gadfly. I've come to your blog via your comments on Emmy van Deurzen's.
Of Dennett you say: "He still wants to believe the human mind, like evolution by natural selection, is algorithmic," cf. computers.
Reminds me of an excellent piece by Rudy Rucker for a 2010 Edge collection: HOW IS THE INTERNET CHANGING THE WAY YOU THINK?
http://www.rudyrucker.com/blog/2010/01/09/edge-question-2010/
He said:
Twenty or thirty years ago, people dreamed of a global mind that knew everything and could answer any question. In those early times, we imagined that we’d need a huge breakthrough in artificial intelligence to make the global mind work—we thought of it as resembling an extremely smart person. The conventional Hollywood image for the global mind’s interface was a talking head on a wall-sized screen.
And now, in 2010, we have the global mind. Search-engines, user-curated encyclopedias, images of everything under the sun, clever apps to carry out simple computations—it’s all happening. But old-school artificial intelligence is barely involved at all.
As it happens, data, and not algorithms, is where it’s at. Put enough information into the planetary information cloud, crank up a search engine, and you’ve got an all-knowing global mind. The answers emerge.
Initially people resisted understanding this simple fact. Perhaps this was because the task of posting a planet’s worth of data seemed so intractable. There were hopes that some magically simple AI program might be able to extrapolate a full set of information from a few well-chosen basic facts—just as a person can figure out another person on the basis of a brief conversation.
At this point, it looks like there aren’t going to be any incredibly concise aha-type AI programs for emulating how we think. The good news is that this doesn’t matter. Given enough data, a computer network can fake intelligence. And—radical notion—maybe that’s what our wetware brains are doing, too. Faking it with search and emergence. Searching a huge data base for patterns.
The seemingly insurmountable task of digitizing the world has been accomplished by ordinary people. This results from the happy miracle that the internet is unmoderated and cheap to use. Practically anyone can post information onto the web, whether as comments, photos, or full-blown web pages. We’re like worker ants in a global colony, dragging little chunks of data this way and that. We do it for free; it’s something we like to do.
Note that the internet wouldn’t work as a global mind if it were a completely flat and undistinguished sea of data. We need a way to locate the regions that are most desirable in terms of accuracy and elegance. An early, now-discarded, notion was that we would need some kind of information czar or committee to rank the data. But, here again, the anthill does the work for free.
By now it seems obvious that the only feasible way to rank the internet’s offerings is to track the online behaviors of individual users. By now it’s hard to remember how radical and rickety such a dependence upon emergence used to seem. No control! What a crazy idea. But it works. No centralized system could ever keep pace.
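Rucker's picture — a flat sea of user-posted data, dumb keyword search, and rankings that emerge from user behavior rather than from any information czar — is easy to sketch in toy form. The little bit of Python below is purely illustrative; none of the names or scoring rules come from Rucker, it just mimics the "post data, search it, let the clicks do the ranking" loop he describes.

# A toy sketch of the "data + search + emergence" picture: anyone can post
# snippets, a crude keyword search retrieves them, and ranking emerges from
# user clicks rather than from any central curator. All names and scoring
# rules here are invented for illustration.
from collections import defaultdict

snippets = []               # the "planetary information cloud"
clicks = defaultdict(int)   # emergent ranking signal: clicks per snippet id

def post(text):
    """Anyone can add data -- no moderator, no information czar."""
    snippets.append(text)
    return len(snippets) - 1        # snippet id

def click(snippet_id):
    """User behavior feeds back into future rankings."""
    clicks[snippet_id] += 1

def search(query, top_n=3):
    """Rank by crude keyword overlap, break ties by accumulated clicks."""
    q = set(query.lower().split())
    scored = []
    for i, text in enumerate(snippets):
        overlap = len(q & set(text.lower().split()))
        if overlap:
            scored.append((overlap, clicks[i], i, text))
    scored.sort(key=lambda t: (t[0], t[1]), reverse=True)
    return [(i, text) for _, _, i, text in scored[:top_n]]

# Post a few facts, let "users" click, and watch the ranking shift.
a = post("Search engines answer questions by mining crowd-posted data")
b = post("Search engines are replacing old-school AI")
post("Maybe the brain is faking it with search and emergence too")
click(b); click(b)                  # behavior, not a czar, promotes b
print(search("search engines"))     # b now outranks a

Nothing in the toy "understands" anything; whatever smarts it appears to have come from the pile of posted data plus the emergent click signal — which is exactly Rucker's point about faking it with search and emergence.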
In the same book, Dennett wrote a disappointingly predictable piece entitled 'Not at all'. Cheers, Janet