First off: we have created AIs. We have not created AIs with human-like cognition, except in very limited arenas, and I doubt that human-like cognition will ever be more than a hobby.
The important distinction that I am making is that whether evolved or programmed, an entity comes pre-loaded with behaviors that arose, not because of any specific intelligence on the part of the entity in question, but because of the intelligence of the process that created it.
I think we can, but we need to take a much more thoughtful approach to describing intelligence. We're getting better at it- we're discovering so much about intelligence in other species that we're being forced to confront our anthropocentrism.
guided by an unseen hand
Aw, that's the only Shyamalan movie I like.
I think Charlie is asking for something even more impossible and unlikely: compelling writing.
I didn't say that we evolved into these AIs. I said that these AIs were a product of evolution- just like everything else humans do. It's not accurate to say that they evolved from humans. But they are a product of human evolution. It's like domestic animals- they didn't evolve from humans, but they're a byproduct of…
Technically, the last word was yours.
Yes, we did evolve, and thus, our senses are the end result of a stochastic process we had no control over. And as evolved creatures, we've built an incredible repertoire of behaviors, and these behaviors allow us to build other intelligences. Our AIs are as much the end result of evolution as an ant-colony or rabbit…
If an artificial analog is difficult to tell apart from the real thing, then what's the difference between the artificial analog and the real thing? There's no real difference, is where I'm going with this.
Not really, although that's one thing to keep in mind. I mean that our cognition is embodied in the biological framework of neurons- we are "programmed" in the same way an AI is.
Would human beings actually feel anything if they weren't driven by biology to feel those things?
Allow me to repeat myself: But as we drift away from the human biology, we must also drift away from the human mind.
It's an interesting hypothetical, but you're missing the point: we already have a really efficient way to build neurons. The technology is about 580 million years old. What's the point of building an artificial one that exactly emulates the natural one, when we can just make natural ones, cheaply and easily? It's just…
Not in the least. But would they be anything like human minds? Any AI program that targets building human-like intelligence is going to stall out and be eclipsed by programs that target other kinds of intelligence.
The "proof of thought" would be the output of the program, and by your requirements, the program can't provide output, so you've just demanded something that can't exist.
The video game AI is separate from the game, or at least it should be.
Anyone who has worked with complicated software systems knows that computers aren't based on "logic" either. Oh, sure, deep down, it's all just math. But systems rapidly go non-linear once they achieve sufficient complexity.
I'd call it a system. "Machine" implies all sorts of things that I don't think are accurate. All machines are systems, not all systems are machines.
In this case, it's definitional- if a computer were better at being a human than we were, then it would be human, and we wouldn't be.