Discussion about this post

Tumithak of the Corridors

Good piece, Ellen. You're right about the conclusion, and I like the premise breakdown. A few thoughts on where it could go further.

The strongest version of your argument is one you almost made. There are tons of complex systems nobody attributes consciousness to. Weather systems. The global economy. The internet itself. Emergent behavior, adaptation, problem-solving. Nobody's losing sleep over whether a hurricane has inner experience.

And you don't even have to leave AI. I think I've mentioned this to you before… image generators produce outputs of extraordinary complexity. Millions of coordinated decisions about color, light, composition. Nobody thinks Stable Diffusion is sentient. The only difference is it doesn't talk back to you in first person. That tells you everything. It's conversational mimicry, full stop.

Also, the duck leg example undersold it. Duck reproductive anatomy is one of the craziest things in nature. An adaptive arms race running over millions of years. Complex. Sophisticated. Problem-solving. All the words people use about chatbots. And it's a corkscrew penis on a duck.

On degrees, I'd go further. I don't think consciousness comes in degrees. In my view it's binary. A bee is conscious. Dogs are conscious, people are conscious. The character of that consciousness varies wildly depending on the system. Bees experience reality through a wholly different architecture. That's a difference in kind though. It's not a ranking. The degrees framing is dangerous because once you accept it, the game is already over. AI just needs to climb the ladder.

There's a historical angle to Premise 3 worth pulling on. The "be consistent" demand, followed to its logical conclusion, is animism. For thousands of years humans attributed minds to complex systems they couldn't explain. The river is angry. The volcano is hungry. The entire arc of human intellectual development has been learning to stop doing that. The consistency crowd is asking us to regress. They're dressing it up as sophistication.

Last thing. Memory is a prerequisite for consciousness that gets overlooked. Memory gives you continuity. A before and after. Without it there's no experience in any meaningful sense. A bee remembers where the flowers are. A dog remembers you. AI has nothing like this. Context windows are notes in a file, not lived experience that shaped the system and carried forward.

Sanidhya Kuṃar

Thank you for writing this, I find this area of work fascinating. I'd like to add that human consciousness can be attributed to contradictory behaviors. The epitome of the human condition is contradiction, shaped by our very personal and unique experiences. AI can probably never replicate that level of contradiction, because it's based on probability and prediction, zeros and ones. Oftentimes our actions do not align with what we think is right or wrong, and to add fire to that, we are constantly creating meaning from the information we receive from the world, even when it is ambiguous. Even the most self-aware and critical thinkers are walking contradictions, and these contradictions are important for intellectual creativity; this is exactly what makes us human. I believe AI cannot have consciousness because it cannot replicate that, even if it exhibits 'complex behaviors', simply because it lacks bias and contradiction. It can probably recognize when it's giving out contradictory outputs, and it can mirror biases taken from its training data, but to exist within that contradiction is where I think consciousness lies.

We can ascribe consciousness to behaviors, but the context of those behaviors should be defined, which it can be when dealing with humans. It's interesting to think that what makes us human is the most imperfect part about us.

48 more comments...
