

You talk like you know what the requirements for consciousness are. How do you know? As far as I know that’s an unsolved philosophical and scientific problem. We don’t even know what consciousness really is in the first place. It could just be an illusion.
You’re getting downvoted, but I absolutely agree. I don’t understand why “AI algorithms are just math, therefore they can’t have consciousness” seems to be the predominant view even among people interested in the topic. I haven’t heard a single convincing argument for why “math” is fundamentally different from human brains. Sure, current AI is far less complex and doesn’t have a continuous stream of perceptual input. But that’s something a “proper” humanoid robot would need anyway, and processing power will keep increasing as well.