I think one thing we often assume is that once we make something that can learn, it will increase exponentially in intelligence to the point where it takes over (violently or not). This relies on the idea that all intelligence can be deduced given enough brainpower. Just because an AI can think REALLY hard doesn't mean it…