
June 18, 2011


Listed below are links to weblogs that reference Freewill and Determinism. Part 15c. The problem of rainbows, consciousness and logical supervenience:

Comments

Ah, there's your problem:

A person is made of molecules but to say that a person is nothing but a pack of molecules is the kind of thing only a clever philosopher would be stupid enough to say.

Or wise enough...

The Battle of Waterloo involved a lot of molecules moving about but to say the Battle of Waterloo was nothing but a pack of molecules moving about, is clearly nonsensical.

Is it? It's undeniably true that humans are made entirely of atoms and molecules; so I suspect that it's the "nothing but" part which offends you, and that's only because of your intuitive feelings - you have still to offer any kind of proof.

If you won't accept that a person is nothing but a pack of molecules, what about a dog, a worm, a bacteria.

If you're going to claim that consciousness is the magic ingredient, then you have to define it, which you haven't done yet, and depending on your definition, consciousness can be as broad (panpsychism) or as narrow (a human faculty) as you want.

Personally, I'd go with a relative description:
an entity is conscious of stimuli to the extent to which its behaviour is modified in response to those stimuli.

This tells us that even bacteria are conscious (just not much), and so are rocks, if you define chemical processes as behaviour, which few would.

I leave definitions of "behaviour" and the entity which is "conscious" when you are, as an exercise for the reader.
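The relative description above is operational, so it can be sketched in code. The toy Python snippet below is my own illustration, not anything proposed in the thread: the function name, the numeric "behaviour" vectors, and the summed-absolute-difference measure are all invented choices made purely for the example. It scores an entity's consciousness of a stimulus as the size of the behavioural change the stimulus provokes.

```python
def response_magnitude(behaviour_without, behaviour_with):
    """Score 'consciousness of' a stimulus as the size of the behavioural
    change it provokes: the summed absolute difference between the entity's
    behaviour with and without the stimulus present."""
    return sum(abs(a - b) for a, b in zip(behaviour_without, behaviour_with))

# A bacterium swimming up a nutrient gradient changes behaviour slightly:
bacterium = response_magnitude([1.0, 0.0], [1.2, 0.1])

# A rock's "behaviour" doesn't change at all, so it scores zero:
rock = response_magnitude([0.0, 0.0], [0.0, 0.0])

print(bacterium > rock)  # True: conscious "just not much", but more than a rock
```

On this scoring, consciousness comes in degrees rather than being all-or-nothing, which is exactly the point of the relative description; what the number cannot capture is whether there is anything it feels like to have the higher score.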

God forbid we should exceed Yawnerama's attention span ;-)

'so I suspect that it's the "nothing but" part which offends you,'

Yes, you're right. And as I said before, I struggle to see how the mental realm could fit in with the physical anyway. I'm not offering any solutions or proof yet; I'm only saying that to dismiss the mental in this way is ridiculous, and his approach raises its own problems, which he doesn't address (more on this to come). And to be fair to Crick, NOT dismissing the mental causes its own problems too. The difference between Crick and me is that I'm prepared to face up to those difficulties ;-)

'If you won't accept that a person is nothing but a pack of molecules, what about a dog, a worm, a bacteria.'

Very good point. You anticipate Zombie Protozoa beautifully. (Next post.)

"If you're going to claim that consciousness is the magic ingredient, then you have to define it, which you haven't done yet"

I SO have, and right from the first post on the subject! It's the subjective phenomenological experience I have from the moment I wake up until I fall asleep. It's what I have and (I assume) rocks don't have. It's my Qualia.

"Personally, I'd go with a relative description:
an entity is conscious of stimuli to the extent to which its behaviour is modified in response to those stimuli."

Great, and that's an equally valid description as mine. In fact, in many ways it's better than mine, because it's objective and testable. What it fails to do, though, is address the issue of whether there is "something it is like to be" a bacterium or a mouse or a computer, or whether they live "in the dark." It ignores the question of why and how I have phenomenological consciousness. And for me, that is the key question.


The problem with your definition is that it doesn't define the phenomenon, it describes your experience of it - what it feels like to be conscious, rather than what consciousness is.

Unless you're saying that to be conscious is to experience consciousness, which is rather circular; you need to dig a little deeper.

You'll like this, he's singing from your hymn-sheet...

http://www.guardian.co.uk/commentisfree/belief/2011/jun/17/human-consciousness-brain-activity

What it fails to do, though, is address the issue of whether there is "something it is like to be" a bacterium or a mouse or a computer, or whether they live "in the dark." It ignores the question of why and how I have phenomenological consciousness. And for me, that is the key question.

Actually, although I deliberately didn't define consciousness, if one defines it as the process of being conscious, and one uses the relative description above, then it is possible to address the issue of consciousness in general, and say that, yes, a bacteria is conscious - a primitive and rudimentary consciousness to be sure, and nothing we would recognise; but then Wittgenstein would say the same about lions.

As to why and how you have a phenomenological consciousness: the how is probably that it's easier to express information about the state of any complex system in summary, and the why is probably that you evolved it to make social interaction with other primates easier.

"The problem with your definition is that it doesn't define the phenomenon, it describes your experience of it - what it feels like to be conscious, rather than what consciousness is."

I think we are arguing over words here. I don't know what consciousness is. Nobody does. The only proof of consciousness I have is my own direct experience of it, so the only definition I can give is a description of how it feels.

I don't see that defining an entity as being conscious as "an entity which has qualia" is circular. If a fly is conscious(has qualia) pulling its wings off is horribly cruel. If it doesn't have qualia then that's fine. Whether we can know such a thing is an open question (next post) but the definition itself isn't in any way circular as far as I can see.

Yes, I love the link! (I'm not alone...)

As I say, your relative description is fine as far as it goes. But it doesn't in any way address the (for me) important question.

"As to why and how you have a phenomenological consciousness: the how is probably that it's easier to express information about the state of any complex system in summary, and the why is probably that you evolved it to make social interaction with other primates easier."

If phenomenological consciousness does no "work" (Brain does the work, and Mind is an epiphenomenon) then it must be causally irrelevant, and therefore cannot have evolved for any purpose, only as an incidental and accidental "bolt on."

If you accept phenomenological consciousness performs some useful evolutionary function, you give it efficacy, something a materialist cannot accept. Maybe you are "one of us" after all? ;-)

I think we are arguing over words here. I don't know what consciousness is. Nobody does. The only proof of consciousness I have is my own direct experience of it, so the only definition I can give is a description of how it feels.

The problem with that definition is that it doesn't help. It tells us nothing that we don't already know, and can't usefully serve as a springboard for any sort of scientific exploration.

Which is okay if you're only doing philosophy, I suppose ;-)

If phenomenological consciousness does no "work" (Brain does the work, and Mind is an epiphenomenon) then it must be causally irrelevant, and therefore cannot have evolved for any purpose, only as an incidental and accidental "bolt on."

That's not true - for example, as a member of a tribe of primates, it's extremely useful for one primate to know their own mental states, and the mental states of others.

To see the former, consider trying to plan future actions: without a classical qualia-based consciousness, the information you are trying to manipulate is scattered throughout your brain, as firing patterns within neuronal networks - it's very hard to remember, initiate and control.

If it were somehow "summarised" by some neural structure, into a smaller, more convenient representation of itself, which was easier to manipulate, I think that the advantages this would convey in terms of evolutionary fitness would be enough to ensure that it was retained.

Especially because once you have a group of "conscious" entities, such as primates, it's a small step to link those summaries to some sort of external signal, to which other entities could respond.

If you accept phenomenological consciousness performs some useful evolutionary function, you give it efficacy, something a materialist cannot accept. Maybe you are "one of us" after all?

Definitely not!

But I certainly see no conflict between a materialist viewpoint and attributing efficacy to phenomenological consciousness, as I propose above.

"The problem with that definition is that it doesn't help. It tells us nothing that we don't already know, and can't usefully serve as a springboard for any sort of scientific exploration."

I completely agree. That's why it's called the Mind-Body "problem" ;-)

I am stuck with something that I cannot describe to anyone else, and I cannot even be sure that anyone else has it. This is why the Problem of other Minds is a philosophical problem, and although science can inform the debate, it can never solve it, because I can never show you my "green" or my "pain" and a scientist can never, by definition, turn a Subjective into an Objective. You will never be able to put my "green" on the table. The best you can do is find its correlates. And even then we will never know if your green is like my green. If you were blind from birth, all this talk about green would be completely incomprehensible to you. But it wouldn't make my green any less real to me. We can talk about green because we make the assumption that we are talking about the same thing, but we may not be, and we have no way of ever knowing.

I think the reason so many neuroscientists, scientists and philosophers get themselves into so much trouble in this field (and this resonates with what we've been discussing regarding Koch's articles) is because they fail to understand this fundamental point.

Re "But I certainly see no conflict between a materialist viewpoint and attributing efficacy to phenomenological consciousness, as I propose above" This is very interesting, but too big a topic for the Comments section! We'll come back to that...

I think you're underestimating the Power Of Science.

(When you say that phrase in your head, add reverb ;-)

It's entirely plausible, in my opinion, that there will come a time when implanted electrodes of some subtle sort will be able to create and manipulate qualia at will.

Would you then accept that we would know enough to be able to say "those neurons, firing in these patterns are, for example, green to you"?

"implanted electrodes of some subtle sort will be able to create and manipulate qualia at will."

They already do this in a basic form. They stick probes into conscious brains and stimulate a few neurons or even one neuron, and this creates in the patient the sound of music or a picture.

Would you then accept that we would know enough to be able to say "those neurons, firing in these patterns are, for example, green to you"?

I'm so glad you ask that question. My answer? No, it shows the opposite!

It shows that neurons and green are NOT the same thing. The experiment demonstrates the neural correlates of my experience of green but says nothing about why there should be a subjective correlate at all. It also tells us nothing about whether what I label as green bears any relation to your experience, and nothing about why THIS pattern of neural stimulation should result in THIS subjective experience. The mystery of phenomenal consciousness remains untouched and the scientist remains utterly tight-lipped on the subject of qualia (Cue echoing silence ;-)

It shows that neurons and green are NOT the same thing.

I didn't say they were.

The experiment demonstrates the neural correlates of my experience of green but says nothing about why there should be a subjective correlate at all.

Yes it does - if we understood it at that level, we'd know, for example, that a phase change in one area, to match the firing rate in another, meant green, and a change to match a different area meant that you were experiencing red.

In theory it's perfectly possible that we would be able to understand the precise structure of consciousness, once we understood the substrate perfectly.

I know you don't believe that; but it's an inevitable consequence of the fact that the substrate obeys the laws of nature - everything it does must obey those same laws, and what it does is consciousness.

It also tells us nothing about whether what I label as green bears any relation to your experience,

Sure it does - it tells us everything we need to know: we could, in theory, determine everything it's possible to know about both subjects' experiences, and compare the two down to the level of knowing precisely which emotions and memories were stimulated by the experience in each subject, and to what extent.

and nothing about why THIS pattern of neural stimulation should result in THIS subjective experience.

Again, not so: the simple why is "because the brain grew that way", a more complex answer is "because this pattern of firing represents green to your consciousness, because that's the way your consciousness is structured, because that's the way your brain grew".

The mystery of phenomenal consciousness remains untouched and the scientist remains utterly tight-lipped on the subject of qualia (Cue echoing silence

When we can simulate a whole brain sufficiently well that it experiences consciousness, we'll be able to discover all that can be known about it.

Might even happen in your lifetime.

"the simple why is "because the brain grew that way", a more complex answer is "because this pattern of firing represents green to your consciousness, because that's the way your consciousness is structured, because that's the way your brain grew"."

Now THAT's a circular argument ;-)

"a change to match a different area meant that you were experiencing red."

There are HUGE problems with your use of "match" and "means" here! I'll come back to them, because Uni M wants me to do the lawn.

"When we can simulate a whole brain sufficiently well that it experiences consciousness..."

But how will you know it experiences consciousness? When it says it's experiencing green? We keep coming back to the problem of other minds. However you cut it, it won't go away. How can we tell the difference between a computer that says it experiences green but doesn't, and one that says it does, and does? Ah I need to finish my zombie protozoa post..


"It shows that neurons and green are NOT the same thing.

I didn't say they were."

Oh sorry.

"we'll be able to discover all that can be known about it."

Are you saying that a person who has worn glasses which make her see only in black and white, but who knows everything there is to know about "red", learns nothing new when she takes the glasses off? If so you should read the Mary's Room thought experiment. You'll like it, it's got your mate Dennett in it. You know, the one who doesn't have qualia ;-)


http://en.wikipedia.org/wiki/Mary's_room


"the simple why is "because the brain grew that way", a more complex answer is "because this pattern of firing represents green to your consciousness, because that's the way your consciousness is structured, because that's the way your brain grew"."

Now THAT's a circular argument ;-)

Well, no - it would be a circular argument if the structure of consciousness influenced neural development; but that's not what I said.

To me, there's a simple causal pathway, from neural structures to the features of consciousness; with no feedback.

"a change to match a different area meant that you were experiencing red."

There are HUGE problems with your use of "match" and "means" here! I'll come back to them, because Uni M wants me to do the lawn.

Well, match, with reference to a hypothetical physics experiment, means pretty much what I want it to mean; but in this case it means "demonstrates a causal link to", since I'm postulating a time when we completely understand neural physics.

"When we can simulate a whole brain sufficiently well that it experiences consciousness..."

But how will you know it experiences consciousness?

When it says it does.

Welcome to the "hoist by your own petard show", where people who insist on subjective definitions of things are faced with entities who insist that their experience is equally valid.

When it says it's experiencing green?

"What colour is that grass?"

"Green."

"Thank you for coming, goodnight."

We keep coming back to the problem of other minds. However you cut it, it won't go away.

If you insist on subjective definitions, no, it won't.

Put the philosophy in the box, and break out the science.

How can we tell the difference between a computer that says it experiences green but doesn't, and one that says it does, and does? Ah I need to finish my zombie protozoa post..

"It shows that neurons and green are NOT the same thing.

I didn't say they were."

Oh sorry.

"we'll be able to discover all that can be known about it."

Are you saying that a person who has worn glasses which make her see only in black and white, but who knows everything there is to know about "red", learns nothing new when she takes the glasses off? If so you should read the Mary's Room thought experiment. You'll like it, it's got your mate Dennett in it. You know, the one who doesn't have qualia ;-)

If "everything there is to know about red", includes the qualia of red, then she would learn nothing new.

If not, then she would.

BTW, Dennett's ok, but he's no scientist.

But how will you know it experiences consciousness?

When it says it does.

Welcome to the "hoist by your own petard show".


??

You surely can accept the idea that you have a subjective experience, which by definition can never be experienced directly by anyone else? That's all I'm saying.

If "everything there is to know about red", includes the qualia of red, then she would learn nothing new. If not, then she would.

Great, then we agree she learns something new, because in the context of the thought experiment, "everything there is to know about red" only means full objective knowledge of red. She has full objective knowledge but no subjective knowledge. She knows nothing about the qualia of red until she takes off the glasses. Then she learns something, namely what it means to experience red. She can't share this knowledge with anyone else, though, because that's what subjective means.

"Put the philosophy in the box, and break out the science."

The whole point is that I am talking about phenomenological consciousness, which is something subjective, and, as I've said before, ad nauseam, science has not contributed a single coherent sentence to the subject. I am completely open to whatever science may come up with in the future, but so far it has only addressed the objective aspects of consciousness, and it has not addressed the Hard Problem, which is what I'm interested in. If it does address this "in my lifetime" that's great, and I'll happily put philosophy back in the box.

I agree about Dennett. He's no philosopher either ;-)

You surely can accept the idea that you have a subjective experience, which by definition can never be experienced directly by anyone else? That's all I'm saying.

And all I'm saying is that if you accept that other humans have consciousness, simply because they say so, then you must also accept the same from a consciousness living in a computer; unless you're attributing magic to meat - saying that there's something special and unique about the brain which alone can be a host to a mind.

science has not contributed a single coherent sentence to the subject.

Except that it has; but you're unwilling to accept it.

Science can tell us a great deal about phenomenological consciousness: where it comes from (the brain), that it's deterministic, some of its limitations, all sorts of stuff.

"if you accept that other humans have consciousness, simply because they say so..."

But I don't accept that! That's what the Problem of other Minds is all about. We cannot know other minds.

"unless you're attributing magic to meat" The only person who has used the word 'magic' in this debate is you ;-) There is nothing magical about phenomenological consciousness and I've never said there is. Presumably it is as much a part of the natural world as magnetism and gravity. The difference is that we understand them but we don't yet understand PC.

"Except that it has; but you're unwilling to accept it."

I'll accept it if you show it to me!

Science cannot tell me why I have THIS particular experience. All it can do at the moment is tell me that this experience correlates with THAT firing of neurons. If there has been something more written on the subject let me know.

"Science can tell us a great deal about phenomenological consciousness: where it comes from (the brain),"

I knew that already. The interesting question is WHY neurons cause PC and WHETHER what is created by the substrate follows the same rules as the substrate itself. I think I can give lots of examples where the rules of the substrate are not the same rules as those governing what arises from the substrate.

"it's deterministic," Well that may be true, but just saying it doesn't make it true....

"it's deterministic," Well that may be true, but just saying it doesn't make it true...

Sure does - you can't have a non-deterministic entity running on a deterministic substrate, if only because every effect in the entity comes from a cause in the substrate.

I fear you may be right, in which case there is no free will, but I'm not convinced yet. The issue rests on whether a non-physical effect (phenomenological consciousness) must obey the same laws as the physical substrate from which it springs.

It doesn't manifest the same behaviour, obviously, and it can have emergent properties which the substrate doesn't, but it's bound by the same laws and it's deterministic if the substrate is.

By the way, I'm using deterministic in the free-will sense, there's still room for quantum indeterminacy, and chaos, in both the substrate and the entity.
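The "deterministic entity on a deterministic substrate" point can be made concrete with a toy example of my own choosing (an analogy, not anything claimed in the thread, and certainly not a model of the brain). In Conway's Game of Life, the "substrate" is a grid with one fixed deterministic update rule; a glider is an "emergent entity" whose characteristic behaviour - steady diagonal travel - appears nowhere in that rule, yet is entirely fixed by it: the same starting state always produces the same future.

```python
from collections import Counter

def step(live):
    """One deterministic Game of Life update; `live` is a set of (x, y) cells."""
    counts = Counter((x + dx, y + dy)
                     for (x, y) in live
                     for dx in (-1, 0, 1)
                     for dy in (-1, 0, 1)
                     if (dx, dy) != (0, 0))
    # The entire "law of nature" of this universe: a cell is alive next step
    # iff it has 3 live neighbours, or 2 live neighbours and is already alive.
    return {cell for cell, n in counts.items()
            if n == 3 or (n == 2 and cell in live)}

glider = {(1, 0), (2, 1), (0, 2), (1, 2), (2, 2)}  # the classic 5-cell glider

state = glider
for _ in range(4):  # a glider's period is 4 generations
    state = step(state)

# After one period the glider has simply moved one cell diagonally:
shifted = {(x + 1, y + 1) for (x, y) in glider}
print(state == shifted)  # True
```

The glider's "law" (move one cell diagonally every four generations) looks nothing like the substrate's law (count your neighbours) - the emergent behaviour is genuinely different from the substrate's behaviour. But since every glider step is produced by substrate steps, determinism at the bottom fixes determinism at the top.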

"and it can have emergent properties which the substrate doesn't,"

This Em. Prop. business is really tricky though. I'm still trying to get my head round it. I also need to look into the issue of top down causation, which might be helpful in the debate.

Re quantum: the problem for me is that quantum indeterminacy doesn't get me anywhere near any kind of Freewill worth having.

Re quantum: the problem for me is that quantum indeterminacy doesn't get me anywhere near any kind of Freewill worth having.

Exactly - which is why I don't normally bring it into the debate; but it was worth mentioning here for completeness.

Re. top-down causation, there's an interesting paper by Craver and Bechtel here:
http://tinyurl.com/3g7wuzk

I don't think top-down causation will help, because every aspect of the mind (can I use that as shorthand for phenomenological consciousness, it's so much easier to type ;-) is ultimately, no matter how abstract it appears, entirely reliant on its substrate, so all we're talking about, no matter how top-down it seems, is aggregates of atoms affecting aggregates of atoms.

Gosh, there's a lot to digest in that paper. Looks very interesting though. I'm going to tackle it again tomorrow.

"The mind": excellent idea!
