10 Comments
Joshua Tindall:

I feel like the fact that it's used this way indicates a need for a catchy term that refers to what people want it to mean: a system which is not conscious, but is indistinguishable from a conscious system on the basis of outputs/behaviour alone. Behavioural zombies/b-zombies?

This scenario also seems like it's going to be 10x more discussed than the original metaphysical one

Pete Mandik:

yes! that need is what motivates many uses of “zombie” that don’t conform to the tight prescription here

Robert Long:

I like b-zombie to be honest!

Pete Mandik:

i was going to mention weather watchers, but deleted it from my comment at the last second! nicely done

Pete Mandik:

entities that lack behavior? oh boy

Robert Long:

WW2-Zombies (weather watchers 2: zombies)

Pete Mandik:

i think there’s a case to be made that microphysical duplication is not an absolute requirement—that a conscious being’s zombie doppelgänger need only be indistinguishable in some physical respect or other. Which respect matters depends on what the zombie invoker is trying to argue for. If antiphysicalism, then microphysical indistinguishability is required. If, instead, antifunctionalism or antibehaviorism is the target, then the zombies in question need only be indistinguishable in some coarser-grained respect. Anyway, that’s what I’ve been telling everybody for the past few decades.

Robert Long:

You'd definitely know better than I would! But my impression is that "p-zombie" (sans qualifier) is invoked, the vast majority of the time, in debates about physicalism, right?

Pete Mandik:

yes, no doubt about that. but there are nearby debates about functionalism vs. neuroreductionism in which one could easily invoke the "z"-word without violating your (imho very good) suggestion that we shouldn't just use "zombie" to mean "consciousness lacker". Anyway, I suspect you and I are both on the losing side of a losing battle. And I've encountered, e.g., neuroscientists referring to zombie modules, etc.

The One Percent Rule:

I agree with your claim, which is closer to the SEP entry (https://plato.stanford.edu/entries/zombies/), even though I used "p-zombies" when quoting Suleyman in my post today. Current AI systems are definitively not p-zombies in the atom-for-atom, behaviorally-indistinguishable sense. I think you made that distinction well.

To avoid the confusion you have flagged, perhaps a more fitting (though admittedly less snappy) concept to borrow from the philosophy of mind for the "apparently conscious but potentially vacant" AI would be Ned Block's Absent Qualia, or a version of Searle's Chinese Room or Block's Chinese Nation thought experiments. https://plato.stanford.edu/entries/chinese-room/

The Chinese Room focuses on a system that is purely syntactic (manipulating symbols) but lacks semantic understanding or consciousness, which resonates more with the current state of Large Language Models (LLMs) and their potential lack of understanding or qualia despite impressive output.

It sidesteps the need for physical and behavioral identity with a human, and focuses directly on the gap between functional simulation and genuine understanding/experience. This seems a more precise parallel for the anxiety surrounding AI consciousness.
