Yes, but I don’t see that as particularly significant in this context. Information, including the knowledge of economic theory stored in a human brain, can be represented digitally. The fact that a present-day AI presumably can’t actually experience what it’s like to be unhappy as prices rise and incomes fall doesn’t affect its ability to reason about economics.
We should probably just agree to disagree. I think the strides made in AI are at the very least impressive and have made some things (text-to-speech, for example) better - if not enormously, then at least noticeably.
But there isn’t a true analogy to be drawn between calculated probabilities and conscious thought. The former mimics competence with varying success, but it has no logic inherent to it. It requires human maintenance, and its only path to “growth”, if we want to call it that, is a black box of probabilities it calculates at incredible speed.
It’s a super-magic-8-ball that we choose to pretend has agency of some sort. But it does not.