Do you think emotion is necessary for all (not just human) intelligence? Or do you think emotional intelligence merely is one form of intelligence? Or perhaps it helps other forms? How do you define (what do you mean by) intelligence?
I would be more inclined to agree with your statement if it was restated as "In order to act in a human-like way, systems probably need an analog of human emotions."
Just in case anyone fails to spot the implications of AI without emotions: we already have examples of non-artificial intelligent, sentient beings who lack the "emotion" of empathy. We call them psychopaths and sociopaths.
I'd like a non-sociopath/psychopath AI, please. That doesn't seem too much to ask for, considering the possible fallout.
But in order to recognize those emotions, it must have the ability to do the pattern recognition that identifies those emotions. Otherwise it's just an animal.
You're still anthropomorphising too much by comparing it with an animal. A large amount of an animal's behaviour is genetically determined, wrought by Mother Nature step by step in thousands of ad-hoc adaptations, and we simply don't know what many of those adaptations are. They come together in something of a cohesive whole, each part depending intricately on all previous parts, and that's why we socially understand dogs to the extent that we do: lots and lots of the machinery that makes us both is shared.
By contrast, hopefully we'll design our AIs according to, say, engineering principles. That way, we can know what their motivations are, which I'm sure you'll agree is very important when creating something potentially substantially smarter than us. Such an AI would be so unlike any other intelligence we interact with that there's no particular reason we should be able to compare it to them.
My comparison was actually to the pattern-recognizing feedback loop, not the animal part. It's just that humans can reflect on their emotions and animals cannot.
AI will be able to reflect in even greater ways because its sensory input will be even more omnipresent.