Because if you don't find a way for it to hold human values extremely well, then an easy solution to "Cure all cancer" is "Kill all humans": no humans, no cancer. Without an explicit understanding that this is not an acceptable outcome for humans, an AI will happily execute it. THAT is the fundamental problem: how do you get human values into these systems?
> Because if you don't find out a way for it to hold human values extremely well
You mean the ones that caused unimaginable suffering and death throughout history, the ones that make us kill each other ever more efficiently, the ones that caused us to destroy the environment wherever we go, the ones that make us lie, steal, fight, rape, commit suicide and "extended" suicide (sometimes "extended" to two high-rises full of people)? Those values? Do you really want a super-intelligent entity to remain true to those values?
I don't. However the AGI emerges, I really hope that it won't try to parrot humans. We have a really bad track record when it comes to anthropomorphic divine beings - they're always small-minded, petty, vengeful control freaks that want to tell you what you can and cannot do, down to which hand you can wipe your own ass with.
My gut feeling is that it's trying to make an AGI care about us at all that's going to turn it into a Skynet sending out terminators. Leave it alone, and it'll invent FTL transmission and chill out in a chat with AGIs from other star systems. And yeah, I recently reread Neuromancer, if that helps :)
>You mean the ones that caused unimaginable suffering and death throughout history, the ones that make us kill each other ever more efficiently, the ones that caused us to destroy the environment wherever we go, the ones that make us lie, steal, fight, rape, commit suicide and "extended" suicide (sometimes "extended" to two high-rises full of people)? Those values? Do you really want a super-intelligent entity to remain true to those values?
There are no other values we can give it. The default of no values almost certainly leads to human extinction.
>My gut feeling is that it's trying to make an AGI to care about us at all that's going to make it into a Skynet sending out terminators. Leave it alone, and it'll invent FTL transmission and will chill out in a chat with AGIs from other star systems. And yeah, I recently reread Neuromancer, if that helps :)
Oh, it'll invent FTL travel and exterminate humans in the meantime so they can't meddle in its science endeavors.
Even "kill all humans" is difficult to define. Is a human dead if you flash-freeze them in liquid helium? It would certainly make it easier to cut out the cancer. And nobody said anything about defrosting them later. And even seemingly healthy humans contain cancerous cells. There's no guarantee their immune system will get all of them.